Layered Textures Design Feedback

Assuming the datablocks for Geometry Nodes, Materials and Textures were the same, I have tried to organize things so that editing them is accessible, comfortable and visually unified from the Attribute Editor.
As can be seen in the first screenshot, in the current state it is almost impossible to understand even a simple material node tree in the Attribute Editor; it also has no relation to the Geometry Nodes, and I do not see any relation to the proposed Texture datablock.

In the second image, the proposal is that Geometry Nodes, Constraints, Physics, Hair, Particles, Materials and Textures would share the same datablock and use the same nodes; the Output node would mark the difference between one category and another.
Although the mockup shows a material, and there are surely a thousand gaps, I believe it can be extrapolated to any node tree, and in the case of textures it could show the Layer Stack as a starting point.
NODE EXPLORER, located below the list of components in the Attribute Editor, would show the nodes computed by the Output in a format similar to the Outliner, and would only show the attributes of the selected node. From the displayed node attributes it should also be possible to navigate to its inputs and outputs using arrows to the left and right of these, respectively. As an output can be connected to more than one input, clicking on an output socket should bring up a floating menu with all the nodes connected to it.

I hope this is not nonsense.

@brecht with Unreal 5 and Nanite (virtualized geometry) we now have the possibility of using channel-packed baked maps in vertex colors.

It’s little different from showing how a baked texture could be used in an end product, which is useful to know since we support different use cases.

Apologies if this has been mentioned and I missed it, but the point is that we need the ability to bake a channel-packed RGB texture into a vertex color, since Unreal vertex painting isn’t available for Nanite (it has to be pre-baked in the incoming asset).
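For what it’s worth, this kind of packing can already be scripted by hand; below is a rough bpy sketch of the idea, not part of the proposal itself. The image names (bake_ao, bake_rough, bake_mask), the vertex color layer name, and the nearest-neighbour sampling are all assumptions for illustration.

```python
# Hypothetical helper: pack three already-baked grayscale maps into one
# per-loop vertex color layer (R, G, B). Image names and layer name are
# made up for this example.
import bpy

def sample_gray(img, u, v):
    """Nearest-neighbour sample of an image's red channel at UV (u, v)."""
    w, h = img.size
    x = max(0, min(int(u * w), w - 1))
    y = max(0, min(int(v * h), h - 1))
    return img.pixels[(y * w + x) * 4]  # pixels is a flat RGBA float list

obj = bpy.context.active_object
mesh = obj.data
uvs = mesh.uv_layers.active.data
layer = mesh.vertex_colors.new(name="PackedMask")  # legacy per-loop color layer

# Assumed baked maps; one grayscale image per channel to pack.
maps = [bpy.data.images[name] for name in ("bake_ao", "bake_rough", "bake_mask")]

for loop_index, loop_color in enumerate(layer.data):
    u, v = uvs[loop_index].uv
    r, g, b = (sample_gray(img, u, v) for img in maps)
    loop_color.color = (r, g, b, 1.0)
```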


There is also a free and open source addon which assists with this process

Please follow these guidelines; I edited the post to remove the embedded video.

I explained in the post in detail why I believed it did not violate the guidelines, but you can disagree of course.

Will viewport painting be improved?


Enough people liked this for me to motivate myself to make another pass at this :grinning_face_with_smiling_eyes:

Here I made/considered several changes to improve usability and ease of use:

  • (Left side) Here I made some adjustments to the “Local/Applied” buttons; with a visual indicator it is now clearer what each button does.
    Another note: if some people find the buttons’ visual cue difficult to see, there is also the idea of adding a dash/line under the Local/Applied text to make it clearer which button is active (as shown on the left, below the preview windows).

  • (Left side) I also mocked up a “what if you could change the preview display size” option, which could live in the N-Panel together with the “Preview on/off” button.

  • (Right side) If “Preview” is turned off in the N-Panel, every node will look like the “Layer” node shown in the picture.

  • (Right side) Here I applied the new Local/Applied buttons to the layer stack; the preview area has an additional “dark” overlay to differentiate it from the node inputs/outputs above it.


Following the same intention, I have made another mockup to show the same concept but with a Texture output; by adding a shader after the LayerStack and changing the output, it would also be possible to have a Material one.

I have added a bake to an attribute to reflect how these could be displayed from the Node and Attribute Editors. In the same way, baking the LayerStack could produce the texture channels that BSDF materials require.

Your Channels output needs an output like a shader or RGB color socket… because you are in a material editor…

If I wanted the node group to act as a material, I would connect the LayerStack directly to a SeparateChannels node, then to the Shader, and then to a Material Output. Since in this case the node group is meant to be a reusable texture, a Channels Output is the right connection; that way it should be reusable in various materials, modifiers, brushes, or in post-production.

My intention is to show how the unified nodes could be displayed in the same datablock, which would allow them to be edited both from the node editor and from the Attribute Editor.

Maybe it should be a LayerOutput instead of a ChannelsOutput?

I’m not sure I get the Applied/Local concept. Is it like before and after, where one displays the input texture and the other displays the output texture? If that’s the case, I’m not sure I like how it works.

These nodes could have multiple input or output textures, so I think what might make more sense is that hovering over one of the wire ports/sockets shows the texture at that port: hovering over an input shows a “before”, and hovering over an output shows an “after”.

Additionally, you could click on a port to make the preview permanently show that particular port. However, I wouldn’t want some nodes showing “before” previews and others showing “after” previews, so I think you should only be able to click on an output port to permanently change the preview; inputs should only be shown temporarily on hover.

I’m not sure, but if I understand your idea correctly, you just create a texture, and with a new node (called, for example, “Texture Call”) we could call this texture and connect it to a Mapping node or any material node?

Basically it would sit in the shader editor; the idea is to create a compiled image that can be output to any material node, if my logic is right…


You have something mixed up; the point of the Local/Applied idea is to show:

  • Local: the modifier itself, how it would look without being applied to anything
  • Applied: the modifier as it looks when applied to the input

There is no such thing as showing the input itself without any modifications, as that is already shown on the previous node.
For example, you have a “Square” node that creates a rectangular shape, and you connect it to a Blur node; on the Blur node you then see either how the “Square” looks with the blur applied, or, if you want, how the blur looks without being applied to anything.

Blur must be applied to something to generate a preview image. So what is it applied to if not the input, some default image?

Blur might not have been the best example :sweat_smile:

I would assume that a “modifier” node with a predefined preview look is quite rare, because, as you pointed out, it needs an input first to display a result, unless it is visualised in a different, perhaps more mathematical way (i.e. a default image, either generated from the settings or a predefined one).

This suggestion came from @DrM so he might know more on this, or has more to say on it.

I’m not sure, but if I understand your idea correctly, you just create a texture, and with a new node (called, for example, “Texture Call”) we could call this texture and connect it to a Mapping node or any material node?

Yes, that’s my idea, and by the way, remove the Shader editor and simply differentiate the type of node by the main output.
In addition to the Geometry output, also add Texture/Multitexture and Material outputs. Although a texture attribute can already be obtained from Geometry Nodes, the appeal would be that it could be catalogued and called from another object.

I’m seeing some problems in my idea, without even going into bakes yet.
1- To apply a textured material without doing it in the same node tree, you would first have to create the texture node tree, call it from a material node, and call the material from a Geometry Node.
2- Do I have to call a FaceSet to assign a material?
3- To visualize the materials I need an assigned material, and if I don’t have one, Paint Mode should lend me one to preview it.
4- When making an Asset of the LayerStack, it would also be convenient to visualize it on an object.
5- Can I call Geometry Nodes or a Material from a Texture node? Who is allowed to call whom?
6- What would it mean to apply the nodes of a texture or a material?
7- How does the order of the nodes affect the materials…?

I make proposals that are not even clear to me, but I suppose it is time to reflect.

I’ll stop you there… geometry nodes and shader nodes can’t be merged… that creates a big confusion in any big software… Shaders and geometry were separated; forget this idea, it’s impossible, sorry, and it’s good like that! Both need to stay distinct.

In my opinion this system is completely unnecessary… just creating a few new nodes for the shader editor would resolve all the problems… I don’t know why everybody is racking their brains so hard over this… these nodes have been wanted for such a long time. Just create them with an output, add them in the three node editors, and bam, the problem is solved… but the datablock needs serious investigation, and eventually a small rewrite…

  1. This is why a Material Output is necessary; you can’t apply a material without an output.

  2. Basically, yes.

  3. Yes and no… a viewer is better… like in any game engine, a preview mode or a direct output on the mesh.

  4. That would need a cache system?

  5. No, because Blender keeps both systems separate, and the only geometry you can edit is via displacement maps! I would request new nodes, like a morphing node, in the shader editor.

  6. Texture nodes can design ONLY a texture! You can’t apply them directly to a mesh; you are obliged to pass them through the shader editor and apply a material using the texture created in the texture node editor. (But basically nobody uses the texture editor; everybody creates their own textures in the shader editor, which is better, with more options and immediate results! That is why the texture editor needs to go.)

  7. In logical order, left to right.

Nobody uses the Texture Node Editor because it currently doesn’t work, not because the idea of one is bad. In Blender’s current Texture Node Editor, the textures it generates can’t be used inside the shader editor to create a material, and as far as I can tell they can’t be exported as an image file. They can only be used in a few places (as a brush alpha, in a modifier, or in the compositor). It is also currently broken: for example, each node has a place for a texture preview, but it doesn’t actually show the texture, and I have trouble getting the texture to update in the places where it should be usable.

If texture nodes are redesigned and reach a more finished state, they will absolutely be useful. Substance Designer is a much more complete texture node system that is used by tons of professional texture artists and has been used in almost every AAA video game in the last few years.

While texture nodes and shader nodes seem like basically the same thing at first, there is a big difference in how they work: each texture node is given a full image to apply its effect to, while each shader node is only given a single pixel to work with. This means that the handful of really useful nodes everybody wants (blur, distort, move, rotate, scale, scatter) are impossible to do with shader nodes. You can’t apply a blur effect in shader nodes because the nodes don’t know about any other pixels to blur with, and you can’t do a move effect for the same reason. You can work around these issues by modifying the coordinates before you generate the effect, and you can even sort of fake a blur by adding a white noise node to the coordinates in the shader and rendering with lots of samples to reduce the noise, but you can’t use a node to move or blur something in the shader editor after it’s already been created.
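As a minimal sketch of that difference (plain Python/NumPy, nothing here is Blender API, and the function names are invented): a blur can only be written when the whole image is in hand, while a shader-style function only ever sees the coordinate it is asked to shade, so the best it can do is warp that coordinate before generating a pattern.

```python
import numpy as np

def box_blur(image, radius=1):
    """Texture-node style: the whole image is available, so each output
    pixel can average its neighbours."""
    h, w = image.shape
    out = np.zeros_like(image)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            out[y, x] = image[y0:y1, x0:x1].mean()
    return out

def shader_eval(u, v):
    """Shader-node style: only the coordinate of the shaded point is known,
    so neighbouring pixels simply don't exist from this function's view."""
    return 1.0 if (int(u * 8) + int(v * 8)) % 2 == 0 else 0.0  # checker pattern

# The texture-style blur post-processes a finished image; the shader function
# could never do this, because it has no access to other pixels.
img = np.array([[shader_eval(x / 16, y / 16) for x in range(16)] for y in range(16)])
blurred = box_blur(img, radius=1)
```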

Because of this, many effects are a lot easier to do with a texture-based system. An example is a material for a pile of leaves. First you create a leaf shape, then you want to place hundreds of copies of that shape around randomly, with some of them overlapping others. Texture node tools like Substance Designer or Material Maker come with a single node that can do that for you. To do that in a shader you would first have to generate a grid of coordinates where each cell is randomly offset and rotated, then plug those coordinates into your leaf-generating node group, and then duplicate this process a few times to get different patterns you can blend together so they overlap nicely. What could be done with one node in a texture-based system would probably take over 100 in a shader-based system.
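For a rough idea of what such a single scatter node does internally (again just NumPy with invented names, skipping rotation, scaling and blending for brevity): it stamps many copies of a shape onto a shared canvas, letting later copies cover earlier ones; this is exactly the whole-image access a per-pixel shader never has.

```python
import numpy as np

def scatter(shape_img, out_size=256, copies=300, seed=0):
    """Stamp `copies` randomly placed instances of a grayscale shape onto a
    canvas, newer instances covering older ones where the shape is present."""
    rng = np.random.default_rng(seed)
    canvas = np.zeros((out_size, out_size))
    sh, sw = shape_img.shape
    for _ in range(copies):
        y = rng.integers(0, out_size - sh)
        x = rng.integers(0, out_size - sw)
        patch = canvas[y:y + sh, x:x + sw]
        canvas[y:y + sh, x:x + sw] = np.where(shape_img > 0, shape_img, patch)
    return canvas

leaf = np.zeros((16, 16))
leaf[4:12, 4:12] = 1.0            # stand-in for a leaf mask
pile = scatter(leaf)              # hundreds of overlapping "leaves" in one call
```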

Whether the texture node system and shader node system should be put in the same place or not is debatable, but having both systems would absolutely be useful.


No, it’s a good example. Getting the results you want with a node and/or layer system requires quick access to the results of each operation. It’s debugging information, and also live feedback as you change parameters; you want to know the effect of an operation in isolation, without the cascading effects further on.

Yes, it usually requires an input; if that doesn’t exist yet, you can just show something like a “no signal” test pattern.

(linked from Wikipedia, so please don’t tell me that’s not OK)

I’m joking of course, but just show some equivalent of “no input signal”. It doesn’t really matter - fully black, cross-hatch… or hey, why not just show a picture of Suzanne? :grin: That might actually be cool, to have a bunch of monkeys looking at you. Seriously, just show the Blender icon or Suzanne, so it’s not overwhelming.

I didn’t even realize that Blender already has a (defunct) texture node editor. There is so much in there I just never noticed. It already has a default (no input) texture, which is just fine. The only odd thing is that the checkerboard is viewport-based, not node-based, so it changes as you move the node around.