Layered Textures Design Feedback

Maybe I could not explain it well with standard general terms (still a new Blender user :slight_smile:), but before I explain what I meant by “PBR creation features”, I’d like to know to what extent the Shader Nodes and the new Texture Nodes actually overlap. Knowing the similarities between these two editors would make it easier for us to suggest improvements to the newly proposed system.

Texture editing means that currently Texture Nodes only work on textures applied to something, such as brush alpha textures or the textured displacement of an object via the Displace modifier… In simpler terms, Texture Nodes should probably focus on making custom textures. However, I also like having a dedicated layered texture stack, similar to the modifier stack.

The overlap is more or less all existing shader nodes except:

  • Nodes in the Shader and Output category
  • Camera Data, Light Path and similar ones that make no sense outside of rendering

https://docs.blender.org/manual/en/dev/render/shader_nodes/index.html

Will the overlapping nodes also share the internal implementation, or will the implementation have to be duplicated, as is currently the case with, for example, procedural textures in Geometry Nodes vs. the Shader Editor?

Maybe for the Blender-to-Blender case, but for the more general Blender-to-Anything case it could be a huge drawback if there’s no clear distinction between the tools/systems for internal baked data of a mesh and a more general bake system that can bake whatever you want for final textures.

Usually when you bake for internal usage in a filter/fx, the result is considered “data/mask” that some process needs; it’s an intermediate step, and most of those are not generally useful as final images in the PBR workflow (AO is probably the only exception). When you bake “final images”, regardless of whether they are for Eevee, Unity or Unreal, you have “project” requirements for those images that are generally quite different in many ways.

One practical example with the simplest case: when using AO for some effect, it should have a linear gradient with the full ramp so people can apply a “levels” filter and use it as a mask for some effect. Baking it at 4–8K is probably required as well. But the final AO of the model is something totally different, defined by the art style of the project; the device it will run on defines the resolution… maybe it lives inside another channel-packed map… They are not the same, and making those two different usages one single tool in Blender seems impossible without huge compromises in the usability of one or the other.
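To make the “levels” use case concrete, here is a minimal sketch (assumed array names, plain NumPy, not an existing Blender operator) of remapping a full-ramp linear AO bake into a tight dirt mask; this only works if the bake was not already clipped or stylized:

```python
import numpy as np

def levels(ao: np.ndarray, black: float, white: float) -> np.ndarray:
    """Remap the [black, white] range to [0, 1] and clamp, like a 2D levels filter."""
    return np.clip((ao - black) / (white - black), 0.0, 1.0)

# Stand-in for a baked AO map with the full linear ramp preserved.
linear_ao = np.linspace(0.0, 1.0, 256, dtype=np.float32)
# Crevice/dirt mask driving some effect; the final stylized AO of the asset
# would be produced separately, with entirely different requirements.
dirt_mask = levels(linear_ao, black=0.2, white=0.6)
```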

2 Likes

Exactly! It has always bugged me that the procedural textures in the shader editor (and now the GN editor) and the ones in the texture properties editor are separate and not interchangeable. Sometimes, for example, I wanted the same procedural texture to drive both the material and a Displace modifier, and I always felt limited by the impossibility of it. Plus, there are some really nice textures in the properties editor (Clouds, Cell Noise), along with their parameters, that I really miss in the shader nodes. This is by far the biggest reason I’m waiting for the texture nodes to be redesigned.

  • Texture specific nodes like Blur, Filter

Yes! More control over applying effects to textures would be a huge benefit. I can name other important effects like Posterize, Threshold, Distort…

I agree that more nodes/tools for NPR workflows would be very nice to have.

I can’t really give more feedback due to my very limited knowledge of topics like baking, and also because this is a lot of information that partly needs to be imagined; I’d need to try the system itself once it’s ready and see how it interacts with the rest of Blender, especially at the UI level. This is to say that I’m very happy this topic is being tackled and I can’t wait to try it in Blender. Congrats to the devs!

1 Like

The implementation would be shared where possible. For the CPU implementation we can share a lot with geometry nodes. For the GPU implementation we would share the implementation with Eevee. Layer-related nodes would internally generate a bunch of Mix RGB and similar nodes, so that they automatically work in any evaluation context.
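Conceptually, this is similar to what you can already build by hand with today’s shader nodes. A rough sketch with the current Python API (my own illustration of the idea, not how the new system is implemented): mixing one channel of two layers maps onto one Mix RGB node driven by a mask.

```python
import bpy

mat = bpy.data.materials.new("LayeredTextureSketch")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
bsdf = nodes["Principled BSDF"]  # created automatically by use_nodes

base_layer = nodes.new("ShaderNodeTexImage")  # lower layer, e.g. hand-painted base color
rust_layer = nodes.new("ShaderNodeTexNoise")  # upper layer, e.g. procedural rust
layer_mask = nodes.new("ShaderNodeTexNoise")  # mask driving the layer's mix factor

# The node a layer stack would generate for the Base Color channel.
mix = nodes.new("ShaderNodeMixRGB")
mix.blend_type = 'MIX'
links.new(layer_mask.outputs["Fac"], mix.inputs["Fac"])
links.new(base_layer.outputs["Color"], mix.inputs["Color1"])
links.new(rust_layer.outputs["Color"], mix.inputs["Color2"])
links.new(mix.outputs["Color"], bsdf.inputs["Base Color"])
```

Presumably each additional channel (Roughness, Metallic, …) would get its own mix chain in the same way.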

The main work would be on nodes like Geometry, Texture Coordinate, Attribute, Blur, Filter that depend on the domain in which the texture is evaluated.
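For the Blur case in particular, the domain dependence is easy to see: in image space a blur can simply average neighboring texels, while a purely per-point evaluation (as in a shader) has no neighborhood to read from unless the input is first cached to an image. A tiny image-space box blur as an illustration (plain NumPy, float image assumed; just my example, not the planned node):

```python
import numpy as np

def box_blur(image: np.ndarray, radius: int = 1) -> np.ndarray:
    """Average each texel with its (2*radius+1)^2 neighborhood, wrapping at the edges."""
    out = np.zeros_like(image, dtype=np.float32)
    count = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += np.roll(np.roll(image, dy, axis=0), dx, axis=1)
            count += 1
    return out / count
```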

2 Likes

Would this texture layering apply to painting masks in the compositor? Could we layer paint and smudge on a render result with these tools?

Ok, I was misunderstanding this. In other apps there are texture nodes that implicitly use AO and Curvature maps, and those are expected to be cached for best performance.

I’m not sure what the best mechanism is to cache them, and if it’s best to have them explicit or implicit in the texture nodes. But yes the workflow for caching those input texture channels is not the same as baking of the output texture channels.

2 Likes

Unlikely in the initial implementation focused on texturing meshes, but it would be a logical extension.

1 Like

I hope shader nodes will get a texture pointer too, like Geometry Nodes have now:

It would make it easier to sample the same texture several times, and to change it without editing nodes deep inside the shader tree.

Good design.

One of the strengths of SP is that, apart from the materials and the procedural masks, the brush strokes are also procedural. This allows you to paint text at a low resolution and later raise the quality.

I imagine that SP saves all the paths of the brush strokes along with attributes such as the color or radius of the pointer, among others. So at any time you can reproject the colors or stamped textures.

In my opinion it would be something similar to Grease Pencil being able to paint the objects around it. SP does not allow modifications to the mesh, and the strokes themselves cannot be displayed. Blender does allow you to modify the mesh, so I think that, adapted to nodes, this should be done through a utility that takes a stroke path and bakes the color projected onto the mesh surface, converting it into a texture.
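A hypothetical data layout for this (just an illustration of the idea, not an existing Blender or Substance Painter API): if every stroke stores its path samples plus brush attributes, the paint can be replayed and re-baked onto the mesh at any texture resolution later.

```python
from dataclasses import dataclass, field

@dataclass
class StrokeSample:
    position: tuple[float, float, float]      # point on (or near) the mesh surface
    radius: float                             # brush radius at this sample
    color: tuple[float, float, float, float]  # RGBA of the brush at this sample

@dataclass
class Stroke:
    samples: list[StrokeSample] = field(default_factory=list)

def rebake(strokes: list[Stroke], resolution: int) -> None:
    """Sketch: replay every stored stroke into a new texture of the given size."""
    for stroke in strokes:
        for sample in stroke.samples:
            ...  # project sample.position into UV space and stamp the brush there
```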

In any case, seen this way, it could be added in the future so that the process becomes completely parametric.

Another very useful tool is projection boxes, useful for stamping stickers by bringing the boxes close to the surface of the 3D model.

2 Likes

I don’t mean “function” nodes instead of texture nodes, I mean texture nodes on top of function nodes.

:point_down:

Before I dig into the proposal, I must say I’m quite exasperated at the confusing terminology here.
Please consider the following changes so that we’re more in line with the rest of the world:
  • Texture → Texture Set
  • Channel → Texture (a channel is commonly a single grayscale component, not a set of RGB values)
  • Layer → Node (this is a nodal system after all, and the ‘layers’ in the proposal don’t combine to create a single image but a set of them)
  • Layer Stack → Node Graph

Also, I’m not sure about this one, but it seems to me that the verb Bake in the proposal is overloaded? Seems to mean two different things: 1 - creating an input image by sampling geometry, and 2 - rendering an intermediate buffer for further processing in the node graph. Is that correct?

Apologies if this seems like bikeshedding, that’s not my intention.

One more humble plea is to finally decouple the texture node from the texture sampler while this work is going on, so that any setups that manipulate texture coordinates can be made independent of the actual images they work on. While this is only tangentially related, you will still be working on the texture node, so now would seem to be a good time.

1 Like

I’m not sure what you mean. On the implementation level, texture nodes would share code with function nodes.

On a user design level, I don’t see practical use cases for introducing a function nodes group concept, if that’s what you’re proposing. Or how it would simplify making divisions between node systems, as the common nodes are not what makes that difficult.

All of this is looking very promising. So good to see movement.

One thing that I always think about when it comes to textures (at high resolutions and many of them) is:
A texture caching system that would allow us to be much freer while building scenes with characters and environments, without always having to think about texture memory limits and where to save some pixels so the whole frame can still be rendered in memory.

Are there plans for this?

1 Like

Terminology is certainly a challenge and something I’m looking to improve. But if we compare to a certain popular texturing app, my understanding of their terminology is different from yours.

  • Texture: what we call an image texture in Blender.
  • Channel: matches the term in the design proposal. Often there is a one-to-one mapping between a channel and a texture, but you can pack multiple channels into one texture (see the sketch after this list). The conflict with the term for R/G/B/A channels is unfortunate.
  • Texture Set: a set of image textures (to be) baked for a particular mesh. Potentially if we store a set of image textures on a mesh, that’s what we could call it in Blender.
  • Layer: this is not the same as a node in Blender, it’s a set of channels that is created and processed by nodes.
  • Layer Stack: in Blender this corresponds to a specific node in the node graph, not the entire thing.
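
For readers unfamiliar with channel packing, here is a minimal sketch of what it means (made-up array names, plain NumPy): three grayscale channels stored in the R, G and B components of one image, a common game-engine convention.

```python
import numpy as np

height, width = 1024, 1024
occlusion = np.ones((height, width), dtype=np.float32)       # grayscale channel
roughness = np.full((height, width), 0.5, dtype=np.float32)  # grayscale channel
metallic = np.zeros((height, width), dtype=np.float32)       # grayscale channel

# One packed texture of shape (H, W, 3): R = occlusion, G = roughness, B = metallic.
packed = np.stack([occlusion, roughness, metallic], axis=-1)
```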

As for baking, I used that term for baking channels into a set of image textures or mesh attributes, that may then be used for rendering or export. For intermediate buffers I would use the term caching, but the proposal does not go into that.

8 Likes

We want to add this in Cycles; it would be great if we could manage it this year, but it’s not part of this project.

7 Likes

I don’t understand this.
Does that mean that a Layer is always obligated to process all channels? If the user creates a Layer that only supports a certain set of channels, will that layer simply not work in a Texture with a different set of defined channels? Like for example one user prefers to call one channel “AO” and another prefers “occlusion”. Same thing, but won’t fit. Seems to me like users ought to be able to reroute channels as needed between Layers.

Or maybe I don’t understand what a Layer is. Are they not user-created? The pictures show Layers like Rust and scratches, so they seem to be user-created?

This is really good for Blender, and once it’s working it’s going to have a huge impact on Blender’s usability for texturing. I think a layer-based system could be interesting to add, but I don’t think it should be the way to go.

My idea is to have a material mixer node graph where we can add materials from the scene and mix them together, with a dynamic output node that can be changed to match different workflows and export settings, or even add a specific, personalized output directly from one channel (for later multi-channel baking).

The problem is always mixing materials. This way we could pull information from the material nodes and add masks to them in order to stack different layers of detail, just like mixing shaders and channels in the material nodes.

Maybe add some input nodes that hold textures which can be edited and modified for use as “mask layers” with blend modes and filters (editing them not just by clicking on them, like now, but with a checkmark or a button that tells you you’re editing that image, or even multiple images at once). This way we could store information from previous bakes and choose where that information goes, and even tell Blender that an image affects multiple inputs and masks, just like anchor points in other software, but far more powerful and easier to see visually.

I know this is far bigger than just textures, and I think this project should work side by side with the baking overhaul too. I hope we can help make this project as good as it can be; you’re doing a great job!

Layers are created with a Layer node. The inputs of that node are the channels defined by the texture, for example Base Color, Roughness and Metallic.

You can add a Layer node and connect image or procedural textures to its inputs. If you want the layer to have only a subset of the channels and not connect anything to some inputs, that’s possible. When it’s mixed/stacked with other layers that channel will not be affected.

The Car Paint layer in the example is a layer that’s locally created in the texture datablock being edited, with a hand paint texture.

If you make a nice procedural layer, you can put that in its own texture asset, and reuse it. That’s what we imagine the Rust and Scratches layers to be, pre-existing procedural texture assets that are used as a layer in the stack.

If you want to separate a layer into channels to be processed individually, there should be a node for that too. Probably that kind of thing would only be possible in the node editor, not something you would control from the layer stack UI.

7 Likes