Layered Textures Design Feedback

I think it’s important to keep the “bake” concept for baking maps to use in the texture nodes or to export separate from the “bake” used inside Blender in the node tree. Maybe with a concept where you can “freeze” or “consolidate” nodes.

It’s just Blender’s texturing system; it doesn’t really have a name.

The current design only covers the case where you bake textures on the same object. What would the shader and texture setup look like on both objects? Maybe some system where you edit textures on the high poly object, and then on the low poly object you have a node that transfers the same texture channels from the high poly object? It would be interesting to see a proposal for how that works exactly.

I think layers can have an alpha channel, which would already act like a mask by itself. The idea for the Mask inputs was for masks that are not part of the asset, which I think is common? Say, for example, you want to paint a bit of rust onto some specific part of the model.

Of course all this is possible with node setups also, but the expectation was that masks specific to a material are common enough that it’s worth having in the layer stack this way.

The current design is not for texture (or compositor) nodes to output an image, because we want to be able to use the same texture on different meshes. That means we can’t actually produce an image from just the texture nodes, we also need it to adapt to the mesh each time.

“Shading” is a broad term. I think what’s possible should be clear based on the list of overlapping shader nodes that I gave?

The design has baking in the Texture Channels node, which in principle should be able to bake anything except for light. It’s not in the texture nodes themselves, but in the shader nodes, because we want to keep textures reusable across different meshes. And as mentioned in the discussion above, baking results and some settings should probably be per mesh.

So assuming that, what is the argument for putting it inside the texture nodes? What would be the advantages?

Talking about the stack, and what I have understood of it: in the proposed design the concept of the texture stack is still very fuzzy, and I see many flaws in it. Not only do we need procedural textures, we also need

  • procedural modifiers
  • procedural masks

And both of these need to be usable in the stack. Otherwise the stack will be practically useless, because in the end you would have to go to the nodes to configure anything. And any user of procedural textures knows that 90% of the procedural work is in the masks, not so much in the texture itself.

And there is another thing that I think I understood and that seems to me to be a problem, namely the statement that

“Blending, masks and modifiers simultaneously affect all channels in a layer.”

Does this mean that if we set a multiply blend for diffuse, it is a multiply for all channels? Because that would be a very big problem.
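To make the concern concrete, here is a tiny, purely hypothetical Python model of a stack where one blend mode and one mask factor apply to every channel in a layer; none of these names or structures come from the actual proposal:

```python
# Hypothetical model of the proposed layer stack, where a single blend mode
# applies to every channel in a layer (all names invented for illustration).

def blend(base, top, mode, factor):
    """Blend a single channel value; only two modes for brevity."""
    if mode == "MULTIPLY":
        result = base * top
    else:  # "MIX"
        result = top
    return base + (result - base) * factor

def composite(stack):
    """Composite layers bottom-to-top. Each layer has ONE blend mode
    and ONE mask factor shared by all of its channels."""
    channels = dict(stack[0]["channels"])  # bottom layer used as-is
    for layer in stack[1:]:
        for name, value in layer["channels"].items():
            channels[name] = blend(channels.get(name, value), value,
                                   layer["blend"], layer["mask"])
    return channels

stack = [
    {"channels": {"base_color": 0.8, "roughness": 0.5}, "blend": "MIX", "mask": 1.0},
    # A multiply intended only for base_color also multiplies roughness:
    {"channels": {"base_color": 0.5, "roughness": 0.9}, "blend": "MULTIPLY", "mask": 1.0},
]
print(composite(stack))  # → {'base_color': 0.4, 'roughness': 0.45}
```

With a per-channel blend override (or separate layers per channel), the roughness value would pass through unchanged; that is exactly the distinction being asked about.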

1 Like

In the mockups you can see masks and modifiers are just nodes, they could be any kind of built-in node or custom node group.

See my reply here:

2 Likes

I understood that part. To be clear, I simply mean that the current way of working (which only one program allows) is stacking masks (and modifiers of those masks) inside a texture’s mask to achieve the desired effect. All of this inside the stack, not in the nodes (where, as I have said, I understand everything is possible).

Is this raised in the current design?

To give an example, this is my current file with procedural textures. All those masks and modifiers are only for the mask itself, not for the texture (or PBR channels).

If we cannot do this with the proposed design, it will force users to use nodes. Or does the design have a solution for this?

No screenshots like that please:

The plan is to support adding masks and modifiers from the layer stack UI. Such a system would not be limited to a handful of nodes, it should also be possible to add user created node groups from there. Likely there would be some kind of asset tagging to indicate which node groups are suitable as masks or modifiers, to avoid seeing a long list of irrelevant node groups.

There’s no specific design for stacking masks or modifiers on masks, it’s something we thought about but it would be interesting to see ideas for what the corresponding texture nodes would look like.

2 Likes

There should be an easy way to bake all textures in a scene, for users as well as exporters and renderers that need them.

As an occasional Blender user who uses Blender to create assets for other software (mostly “Enscape” assets for Revit), I think this is very important. Ideally the exporters (I mainly use the .gltf workflow) could be updated to include a “bake textures” checkbox, so that we can use procedural textures while setting up assets and still have a very simple workflow to export them together with the asset.

Not sure how feasible that would be? Could it be made foolproof enough that users could download a procedural texture online and include it in their exports by just checking a box in the exporter, without ever having to understand the node setup of the texture?
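As a sketch of what a “bake textures” checkbox could mean internally, here is a hypothetical Python model of such an export pipeline; `bake_channel`, `export_scene` and the material dictionary are all invented for illustration and are not a real Blender or glTF exporter API:

```python
# Hypothetical export pipeline: every material that still has procedural
# texture channels gets baked to plain images before the file is written.

def bake_channel(material, channel, resolution=1024):
    """Stand-in for a real baker: returns an invented image file name."""
    return f"{material}_{channel}_{resolution}.png"

def export_scene(materials, bake_textures=True):
    """materials: {name: {"procedural": bool, "channels": [...]}}.
    Returns the list of image files the exported asset references."""
    images = []
    for name, mat in materials.items():
        if mat["procedural"] and bake_textures:
            images += [bake_channel(name, ch) for ch in mat["channels"]]
        elif not mat["procedural"]:
            images += mat.get("images", [])
        # procedural material with bake_textures=False: its channels are
        # silently dropped, the failure mode the checkbox would avoid.
    return images

mats = {"Wood": {"procedural": True, "channels": ["base_color", "roughness"]}}
print(export_scene(mats))
# → ['Wood_base_color_1024.png', 'Wood_roughness_1024.png']
```

The point of the sketch: if baking is a well-defined per-channel operation, an exporter checkbox is just a loop over materials, with no need for the user to understand the node setup.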

1 Like

I think this thread and the proposal would benefit from a more elaborate mockup; just integrating the current mockup into the native Blender UI would be much better.
Maybe even write a simple manual for basic actions like: painting a PBR material, baking, adding a layer stack to the shader nodes, adding a layer stack to the geometry nodes, etc.

On another note, maybe some type of baking or caching can be included in the nodes that require baked textures. Internally, the Blur and Filter nodes would then be a combination of a Bake node and a Blur/Filter node, with only an explanatory warning visible to the user.

1 Like

I think that we need to differentiate:

  • Bake maps: you bake curvature, AO, … to use in other software/texture nodes
  • Export maps: you export your PBR textures (created with the new node system) to use in other software
  • Freeze maps: you simply indicate that Blender uses the cache from a node in the form of a texture

and maybe

  • Bake: when you bake other things, like lightmaps, atlases, … as in the first example.
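The distinction above can be encoded as a small sketch; the enum and helper below are invented names, just to show that only “freeze” is an internal cache with no user-managed file:

```python
# Hypothetical model of the distinct operations the post lists;
# every name here is made up for illustration only.

from enum import Enum, auto

class MapOperation(Enum):
    BAKE_INPUT = auto()   # bake curvature/AO/... for use in texture nodes
    EXPORT = auto()       # write finished PBR channels for other software
    FREEZE = auto()       # cache a node's result internally as a texture
    BAKE_OTHER = auto()   # lightmaps, atlases, ...

def produces_user_file(op):
    """Baking and exporting produce files the user manages; freezing is
    an internal cache that can be discarded and regenerated at any time."""
    return op in (MapOperation.BAKE_INPUT, MapOperation.EXPORT,
                  MapOperation.BAKE_OTHER)

print(produces_user_file(MapOperation.FREEZE))  # → False
print(produces_user_file(MapOperation.EXPORT))  # → True
```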

1 Like

In my opinion, textures and shaders need to be unified.
I would replace the layer stack with the concept of modifiers. Let me explain: just as every geometry node tree is a modifier, every modifier can be seen as a layer of the geometry. In the same way, every shader node tree is a material, and every material is a layer of the geometry. Only the concept of layers flows from bottom to top, the opposite of modifiers. Masks, blends, effects and bakes are modifiers applied on the underlying layer, as in the Video Sequence Editor.

Overall I’m very happy with this plan. Two questions though:

Do you see texture nodes and shader nodes existing in one editor? I’m asking primarily UI-wise.

  • conflicts like the Blur node operating only on baked data could be solved with user-facing messages and warnings like: “Procedural textures need to be baked before using the Blur node.” This could be underlined with a red dot/outline on the layer/stack/Blur node, for example
  • texture reusability could be solved by sharing layer stacks as separate node groups
  • the problem of sharing materials containing textures exists whether they are in separate editors or not
  • the layer stack in Properties would display all layer stacks for an object (but not its materials)
  • a painting layer in the stack view in Properties would be linked to a specific layer node group in the unified node editor

My argument for this unification is that users have fewer interfaces to manage with one shader/texture editor and its representation in Properties.
And if baking can be made in the form of nodes in the same editor, this design could also simplify the baking workflow and solve the dilemma of where the baking/caching UI should live.


My second question is about the attribute workflow. In the blog post mockup there is an Attribute node present. If I understand this correctly, it implies that attributes made with geometry nodes will also be usable in the texture editor?
My question is how you see the attribute flow direction in the future. Could it be possible to ‘send’ an attribute from the texture/shader editor to geometry nodes too, for example?

4 Likes

If you tab into the Texture Channels shader node, you would be able to edit the texture nodes right there without switching editors.

I think a single shader node graph with BSDF nodes, layer nodes and bake nodes is going to be quite hard to learn and set up correctly. You end up with rules about what you can put in a shader node group for it to be reusable as a texture, when and where you add bake nodes, which channels each layer should consist of, etc. It’s difficult to build a user-friendly and high-level UI when there is little enforced structure in the graph.

With this design we are trying to make it so that, just naturally by the way things are set up, you could bake all the channels with one click. Maybe that type of thing is possible with bake nodes too, but it’s not clear to me what the UI would be.

There could be a Texture Channels geometry node similar to the one in shader nodes. That would reference a texture datablock, and then you could use the channels as fields.

1 Like

Sounds good. This wasn’t clear from reading the blog post.

I’d argue about setting up a node graph: to me it’s way easier than navigating through dozens of sub-panels in texturing apps. But I get the point about setting rules. Tabbing into the Texture Channels node from the shader editor fills the need I had in mind, so no further questions.

The simplest way I can think of is putting bake nodes automatically after or before certain texture/layer/layer stack nodes when the user hits Bake in the stack properties. The content of the node could be identical to what is shown in Properties.
This would allow for separate workflows, whether someone wants to use only nodes or only the Properties panel for their work.

We don’t know what the baking UI will look like, so what I wrote is shooting in the dark.
Do you plan to put baking in separate Properties tab or keep it within Layer Stack one?

That’s great!

Thanks for clarifications.

Maybe RCS’s number one request will finally be fulfilled for this project: Right-Click Select — Blender Community

3 Likes

The Baking panel could be moved from the Render Properties to the Output Properties tab.
And more baking panels for specific user baking requests could be added to this tab or to the View Layers tab (which could be renamed Render Requests).

1 Like

Will texture layers be able to promote things such as float curves and their UI? Currently RGB/float curves and colour ramps cannot be edited or promoted outside of shader node groups; you have to edit them inside. I have wanted the ability to promote curves/colour ramps as group inputs in regular shader nodes for a while… this may be off-topic.

I assume the texture layers panel UI will only have values that you can change, with no curves promoted into the stack, and you would have to jump into the node group to edit them.

I really like the overall layered textures design ideas so far; it seems well thought out.

1 Like

This is very exciting, I like the general idea!

Bake button on texture node
One thing that would seem interesting to me is that, if you have the option to bake textures, it would be a switch on the (master) procedural/baked texture node. Visually, I think it should be like Apple’s on/off button in the node’s header bar; perhaps it should also affect the node’s outline.

When toggled off, the procedural mode is used; when toggled on, you will be prompted to save the baked textures to disk. But if you toggle the switch off afterwards, you return to the procedural workflow state. This does mean that the (master) texture node should remember:

  1. The procedural node tree
  2. The paths to the texture files that were baked to disk (if not baked in the blend file)
    2a. If baked again, the filepaths should update
    2b. Save a backup file prior to baking(!)

Additionally, a single click to bake would imply that all texture nodes are switched over to their baked state. The button for this should then be on the output node or somewhere in the UI, as opposed to on the nodes themselves. Although I can imagine you would be able to hook up a bunch of texture nodes to a bake node, which then bakes the result for the connected nodes and switches them to baked mode.

The idea behind this is that you can easily switch between the baked and procedural modes at all times. This could speed up the workflow through shorter computation times when baked, and it is also useful for letting Blur-like operations show up in the material preview. Furthermore, you can easily get back to the procedural mode to tweak the material later down the road, and easily re-bake. Any operation that would destroy the ability to switch between baked/procedural should then (adjustable via user preferences) display a prompt asking for confirmation to proceed.
Also, by linking in the baked textures, a texture artist can work on the texturing, bake out the result, and have another team member load in just the textures by pointing to a file path that contains all of them, preferably with a “_SocketName”-like suffix.
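A minimal sketch of this toggle idea, with invented names throughout (this is not Blender’s API, just a model of the state such a node would have to remember):

```python
# Hypothetical texture node that can toggle between its procedural tree
# and a baked result, remembering both (all names are made up).

class BakeableTexture:
    def __init__(self, node_tree):
        self.node_tree = node_tree      # the procedural definition is kept
        self.baked_paths = {}           # channel -> file path of last bake
        self.use_baked = False

    def bake(self, channels, directory="//textures"):
        # Re-baking simply overwrites the remembered paths (point 2a);
        # a real implementation would save a backup first (point 2b).
        for ch in channels:
            self.baked_paths[ch] = f"{directory}/{self.node_tree}_{ch}.png"
        self.use_baked = True

    def toggle(self):
        # Switching back to procedural loses nothing: the tree is intact.
        self.use_baked = not self.use_baked

tex = BakeableTexture("RustyMetal")
tex.bake(["base_color", "normal"])
print(tex.use_baked, sorted(tex.baked_paths))  # → True ['base_color', 'normal']
tex.toggle()          # back to the procedural workflow state
print(tex.use_baked)  # → False
```

The key design point the sketch captures: baking is non-destructive because the procedural tree and the baked file paths coexist, so the toggle is free in both directions.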

Texture stack
Also +1 for editing the texture painting through a stack; any Krita or PS user will appreciate that it’s there, and it should preferably work the same way (albeit with procedural layer options, aka ‘Smart Materials’).

Linked layers in the texture stack
An important aspect to consider is linked layers in the texture stack. In a complex case, you may want to create one mask and use it to drive multiple subsequent layers in the stack. For example, you may want to mask out the road from the pavement and later on use the inverse mask to create procedural manholes on the road. Being able to link these nodes on the fly means:

  1. The target linked layer will update when its linked parent is changed.
  2. Linking two layers will override the target layer with the parent layer.
  3. There may be other properties that could be linked between layers, e.g. occlusion masking or transform coordinates (you move/scale/rotate Texture A by distance X and Y and want this to apply to Texture B too).

Conceptually, this is the equivalent of using Krita’s clone layer system, but then also for properties of a layer, rather than only cloning a layer.
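The linked-layer idea can be sketched as shared references; again, all names here are hypothetical:

```python
# Hypothetical sketch of linked (cloned) masks: two layers reference the
# same mask object, so editing it updates both; one layer uses the inverse.

class Mask:
    def __init__(self, values):
        self.values = values            # per-pixel weights, simplified to a list

class Layer:
    def __init__(self, name, mask, invert=False):
        self.name, self.mask, self.invert = name, mask, invert

    def weights(self):
        # The mask is read through the shared reference at evaluation time,
        # so the "target" layer always reflects its "parent" mask (point 1).
        return [1.0 - v if self.invert else v for v in self.mask.values]

road_mask = Mask([0.0, 1.0, 1.0])
pavement = Layer("Pavement", road_mask)               # masked by the road
manholes = Layer("Manholes", road_mask, invert=True)  # inverse of the same mask

road_mask.values[0] = 0.5        # edit the parent mask once...
print(pavement.weights())        # → [0.5, 1.0, 1.0]
print(manholes.weights())        # → [0.5, 0.0, 0.0]
```

This is the clone-layer behaviour described above: one edit to the parent mask propagates to every layer that links it, including inverted uses.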

Stack/ nodes
For showing the nodes as opposed to the stack in the panel, there could be a toggle to switch view modes, much like how you change from solid to wireframe in the 3D viewport.

Also, I think it’s good to consolidate texture nodes and shader nodes. The way I imagine interoperability working, if this makes sense, is to have a top-level header “Principled BSDF” that cannot be altered by the user. Much like plugging the texture node directly into the shader node, but visualised inside the layer stack. Any layer below the shader header would then naturally reside inside a top-level folder; user folders would be subfolders of the shader folder.

If you want to create a more complex material, you would have several of these top level shader folders. The contents of the folder are then equivalent to input parameters, so the things you would want to affect as a user. The way they are wired up, would not be apparent, for that you would switch to the node view (e.g. create a mix shader node and hook up the two top-level ‘folder’ shaders to it and have the mix shader node connected to the output socket).

Actually, the idea of tabbing into the shader folder might be interesting. If you do so, you would only see the shader hierarchy and associated input/output sockets.

Can’t tell you what a difference this will make for our studio.

General thoughts

  • AO needs to be brought out in the baking pipeline, presently it’s hidden away in several places
  • The pipeline should be as streamlined as possible. Making AAA game assets is about speed due to quantity and budgets, so being able to get a texture project set up (say) < 1 minute with minimal clicks and drags is ideal
  • Once a style is settled on (texel density etc.), pretty much the same texturing pattern is repeated many hundreds of times. This plays into the point above and will be driven by the Asset Browser, but having the controls in one nearby location (baking/exporting/etc.) is also important. Asset naming and locations help here. For example, it would be very helpful if Preferences had a way to specify where bakes, textures and models are exported (e.g. in subdirs Bakes/ Textures/ Models/ or an absolute path)

Is making a single texture datablock for many uses a good idea?

Yes; if conceptually that’s what it is, don’t try to paper over it.

Are there better ways to integrate textures into materials than through a node?

Nodes are slow and a bit clumsy; again it comes down to speed. Subtopic:
“Is it worth having a type of material that uses a texture datablock directly and has no nodes at all, and potentially even embeds the texture nodes so there is no separate datablock?”

Yes, in addition to a full node this would be helpful for when you just want a *&!$ texture

With this system, the users can now do the same thing both in texture and shader nodes.

Yes, that aligns with game engines, where you can alter baked textures in the shader system. No problem and again conceptual simplicity

What is a good baking workflow in a scene with potentially many objects with many texture layers? … how can we make a good workflow in Blender where users must know to bake textures before moving on to the next object?

Show the image in a thumbnail on the node. Important for usability and debugging. Since the bakes are just textures themselves, when not baked show them as a named but empty thumbnail.

Otherwise, a single workspace with all of these elements is what is needed. Perhaps generalize the “Image Editor” into a “Texture Editor”: to the modes “View”, “Paint” and “Mask”, add “Nodes” (or something similar). Add buttons in obvious places to perform baking, exporting and other options.

Some textures can remain procedural while others must be baked to work at all.

Not generally a problem in other such applications. Regardless, if you have a thumbnail for the layer, you could simply omit it for purely procedural textures, or indicate it by how the thumbnail is shown (e.g. a bold border for baked textures).

How do we define what a modifier is in the layer stack?

Not sure I follow, but if I understand it correctly, the properties for the layer should make it clear.

Can we improve on the name of the Texture datablock or the new planned nodes?

“Texture datablock” isn’t clear to me yet, but that may just require studying the notes and the thread. Regardless, using the word “Texture” is probably best, as that is fairly standard in this context.

1 Like

“conflicts like blur node operating only on baked data”: is it even possible to add a “blur” gradient thing to procedurals at all?

No. I was theorizing about combining the texture and shader editors into one, and how to solve potential conflicts.