Layered Textures Design Feedback

This is indeed something that’s not so clear in the design: where exactly the baked images should live. It would be good to support baking to mesh attributes, so those would naturally be stored on a mesh. Attaching the baked images to a mesh instead of a material would be helpful and consistent in that way.

A challenge would be how to represent this well in the UI, since you’d be editing texture nodes, material nodes and data on a mesh all at once.

This would require some non-trivial changes in rendering to specialize materials to meshes, though potentially it could be implemented in a general way that works for all renderers.

I’m not sure this distinction is needed. I think that it’s useful to see in Blender exactly what the final images look like on a model, and to avoid having to bake once to see things in Blender and another time to export.

I think it would make sense to have native settings to control such channel packing or the combining of multiple objects. These could then be used both by baking and for rendering.

With baking nodes there is potentially more flexibility, but with more fixed settings the conversion can be done in both directions.
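For illustration only (the names and layout here are my own assumption, not something from the proposal), the kind of conversion such fixed settings would describe is a straightforward pack/unpack of grayscale channels into one RGB image, which is why it can work in both directions:

```python
# Hypothetical sketch of channel packing with NumPy, outside Blender:
# three grayscale bakes combined into one RGB image and split back out.
import numpy as np

def pack_orm(ao, roughness, metallic):
    """Stack three single-channel float arrays into one H x W x 3 array."""
    return np.stack([ao, roughness, metallic], axis=-1)

def unpack_orm(packed):
    """Split a packed RGB array back into its three grayscale channels."""
    return packed[..., 0], packed[..., 1], packed[..., 2]
```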


Yeah, in regard to procedural textures, I think some of the tools/features that could be added are:

  • An ability to make non-procedural textures into procedural ones (think BlenderGuru’s video/node group on texture tiling), plus Photoshop-like fixing of texture seams. Currently it’s very hard to convert textures into seamless ones using only Blender.
  • An ability to break procedural and non-procedural textures into blocks that are easy to replicate in Unity/Unreal/Godot.
  • A working stamp tool that can pick its source from a 2D image in the Image Editor, or from a 3D source.
  • A working mirror ‘visualizer’ while texture painting.
  • A gouache-style noise filter for a painterly look!

For a landscape, I agree that instead of using a 4K/8K texture, it’s more efficient to use a 1K texture and some maths. My point is that some people want that manual control.

The new frontier of texture systems is the use of blocks, as in Substance Designer. I make a custom emblem in Photoshop/Krita, then import it into Blender to use as a block in my node tree, but editing it on the fly is a pain: I can’t do that inside Blender currently, so I have to fix it in PS/Krita and then reimport it again.


Texture nodes need more functionality than just the common subset between all node systems, so I don’t think such a system could be used instead of texture nodes.

Adding another mechanism besides texture nodes to share node groups may be useful, though I can’t really think of practical use cases that would justify it. In any case, it seems outside of the scope of the texturing project unless it specifically solves the texturing problem.

  • With this system, users can now do the same thing in both texture and shader nodes. How does a user decide to pick one or the other, and is there anything that can be done to guide this?

Wouldn’t it be good to integrate baking and PBR creation features into shader nodes, and keep features that work directly on textures (like texture editing and texture layers) in Texture Nodes, so as to avoid any sort of confusion?

Baking should indeed be at the material or mesh level, not inside the texture nodes.

However it’s not clear to me what the difference would be between “PBR creation features” and “texture editing and texture layers”.

Improving texture painting performance is a separate project that will be tackled before texture layers. Some details on that are here:


Maybe I couldn’t explain it well in standard general terms (still a new Blender user), but before I explain what I meant by “PBR creation features”, I’d like to know to what extent the Shader Nodes and the new Texture Nodes actually overlap. Knowing the similarities between these two editors would make it easier for us to suggest improvements to the proposed system.

By texture editing I mean that the current Texture Nodes only work on textures applied to something, such as brush alpha textures or the textured displacement of an object using the Displace modifier. In simpler terms, Texture Nodes should probably focus on making custom textures. However, I also like having a dedicated layered texture stack, like the modifier stack.

The overlap is more or less all existing shader nodes except:

  • Nodes in the Shader and Output category
  • Camera Data, Light Path and similar ones that make no sense outside of rendering

https://docs.blender.org/manual/en/dev/render/shader_nodes/index.html

Will the overlapping nodes also share the internal implementation, or will the implementation have to be duplicated, as is the case for example with procedural textures in Geometry Nodes vs. the Shader Editor?

Maybe for the Blender-to-Blender case, but for the more general Blender-to-anything case it could be a huge drawback if there’s no clear distinction between the tools/systems for internally baked mesh data and a more general bake system that can bake whatever you want for final textures.

Usually when you bake for internal usage in a filter/effect, the result is considered “data” or a mask that some process needs; it’s an intermediate step, and most of those are not generally useful as final images in the PBR workflow (AO is probably the only exception). When you bake “final images”, regardless of whether they are for Eevee, Unity or Unreal, you have project requirements for those images that are generally quite different in many ways.

One practical example with the simpler case: when using AO for some effect, it should have a linear gradient with the full ramp so people can apply a “levels” filter and use it as a mask, and baking it at 4K–8K is probably required. But the final AO of the model is something totally different: it’s defined by the art style of the project, the target device defines the resolution, and maybe it ends up inside another channel-packed map. They are not the same, and making those two different usages a single tool in Blender seems impossible without huge compromises in usability for one or the other.
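As a small illustration of the mask case (my own sketch, not part of the proposal): a “levels” remap only has something to work with if the baked AO still contains the full linear ramp; if the bake had already been clamped or art-directed, the black and white points below would do nothing useful.

```python
import numpy as np

def levels(x, in_black=0.2, in_white=0.8, gamma=1.0):
    """Remap values so in_black maps to 0 and in_white to 1, with a gamma curve."""
    t = np.clip((x - in_black) / (in_white - in_black), 0.0, 1.0)
    return t ** (1.0 / gamma)
```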


Exactly! It always bugged me that the procedural textures in the Shader Editor (and now the Geometry Nodes editor) and the ones in the texture properties editor are separate and not interchangeable. Sometimes, for example, I wanted the same procedural texture to drive both the material and a Displace modifier, and I always felt limited by the impossibility of it. Plus, there are some really nice textures in the properties editor (Clouds, Cell Noise), and parameters of theirs, that I really miss in the shader nodes. This is by far the biggest reason I’m waiting for texture nodes to be redesigned.

  • Texture specific nodes like Blur, Filter

Yes! More control for applying effects to textures would be a huge benefit. I can name other important effects like Posterize, Threshold, Distort…

I agree that more nodes/tools for NPR workflows would be very nice to have.

I can’t really give more feedback due to my very limited knowledge of topics like baking, and also because this is a lot of information that partly needs to be imagined; I’d need to try the system itself once it’s ready and see how it interacts with the rest of Blender, especially at the UI level. All this is to say that I’m very happy this topic is being tackled and I can’t wait to try it in Blender. Congrats to the devs!


The implementation would be shared where possible. For the CPU implementation we can share a lot with geometry nodes, and the GPU implementation would be shared with Eevee. Layer-related nodes would internally generate a bunch of Mix RGB and similar nodes, so that they automatically work in any evaluation context.

The main work would be in nodes like Geometry, Texture Coordinate, Attribute, Blur and Filter that depend on the domain where the texture is evaluated.
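To make the “generate Mix RGB nodes” idea concrete, here is a rough sketch against the existing Python node API (illustrative only; the real lowering would happen internally during node-tree evaluation, and the helper name is made up):

```python
import bpy

def expand_layer_stack(node_tree, layer_outputs, blend_type='MIX'):
    """Chain color output sockets bottom-to-top with Mix RGB nodes."""
    result = layer_outputs[0]
    for layer in layer_outputs[1:]:
        mix = node_tree.nodes.new('ShaderNodeMixRGB')
        mix.blend_type = blend_type
        node_tree.links.new(result, mix.inputs['Color1'])
        node_tree.links.new(layer, mix.inputs['Color2'])
        result = mix.outputs['Color']
    return result
```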


Would this texture layering apply to painting masks in the compositor? Could we layer paint and smudge on a render result with these tools?

Ok, I was misunderstanding this. In other apps there are texture nodes that implicitly use AO and Curvature maps, and those are expected to be cached for best performance.

I’m not sure what the best mechanism is to cache them, and whether it’s best to have them explicit or implicit in the texture nodes. But yes, the workflow for caching those input texture channels is not the same as baking the output texture channels.


Unlikely in the initial implementation focused on texturing meshes, but it would be a logical extension.


I hope shader nodes will have a texture pointer too, as Geometry Nodes have now:

That would make it easier to sample the same texture several times, and to change it without editing nodes deep inside the shader tree.
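For reference, this is roughly the Geometry Nodes pattern being asked for (a sketch against the current 3.x Python API; treat the exact identifiers as an assumption): a single Image input on the group feeds every sampler, so swapping the image never means digging into the tree.

```python
import bpy

tree = bpy.data.node_groups.new("SampleSameImage", 'GeometryNodeTree')
tree.inputs.new('NodeSocketImage', "Image")  # the exposed "texture pointer"
group_in = tree.nodes.new('NodeGroupInput')

# Two samplers deep inside the tree read the same group input, so
# replacing the image never requires touching these nodes.
tex_a = tree.nodes.new('GeometryNodeImageTexture')
tex_b = tree.nodes.new('GeometryNodeImageTexture')
tree.links.new(group_in.outputs["Image"], tex_a.inputs["Image"])
tree.links.new(group_in.outputs["Image"], tex_b.inputs["Image"])
```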

Good design.

One of the strengths of Substance Painter is that, apart from the materials and the procedural masks, the brushstrokes are also procedural. This allows you to paint text at a low resolution and later raise the quality.

I imagine that SP saves all the brush stroke paths and attributes such as the color or radius of the pointer, among others, so at any time you can reproject the colors or stamped textures.

In my opinion it would be something similar to Grease Pencil being able to paint the objects around it. SP does not allow modifications to the mesh, and the strokes cannot be displayed. Blender does allow you to modify the mesh, so I think that, adapted to nodes, it should be done through a utility that takes a stroke path and allows baking the color projected onto the mesh surface to convert it into a texture.

In any case, seen this way, it could be added in the future so that the process would be completely parametric.

Another very useful tool is projection boxes, used for stamping stickers by bringing the boxes close to the surface of the 3D model.


I don’t mean “function” nodes instead of texture nodes, I mean texture nodes on top of function nodes.


Before I dig into the proposal, I must say I’m quite exasperated at the confusing terminology here.
Please consider the following changes so that we’re more in line with the rest of the world:
  • Texture → Texture Set
  • Channel → Texture (a channel is commonly a single grayscale component, not a set of RGB values)
  • Layer → Node (this is a nodal system after all, and the ‘layers’ in the proposal don’t combine to create a single image but a set of them)
  • Layer Stack → Node Graph

Also, I’m not sure about this one, but it seems to me that the verb “bake” in the proposal is overloaded? It seems to mean two different things: 1) creating an input image by sampling geometry, and 2) rendering an intermediate buffer for further processing in the node graph. Is that correct?

Apologies if this seems like bikeshedding, that’s not my intention.

One more humble plea is to finally decouple the texture node from the texture sampler while this work is going on, so that any setups that manipulate texture coordinates can be made independent of the actual images they work on. While this is only tangentially related, you will still be working on the texture node, so now would seem to be a good time.


I’m not sure what you mean. On the implementation level, texture nodes would share code with function nodes.

On a user design level, I don’t see practical use cases for introducing a function nodes group concept, if that’s what you’re proposing. Or how it would simplify making divisions between node systems, as the common nodes are not what makes that difficult.