Layered Textures Design Feedback

Hi all,

This is a topic for feedback on the design proposed here:

We plan to start implementing a system like this later this year, so there’s plenty of time for feedback and iteration on the design before that.

67 Likes

I really dig this.

Regarding

What is a good baking workflow in a scene with potentially many objects with many texture layers? In a dedicated texturing application there is a clear export step, but how can we make a good workflow in Blender where users must know to bake textures before moving on to the next object?

If Blender is meant to have some sort of interop node system to replace the static export concept at some point (notably for USD), I would say baking fits into this idea pretty well. Define jobs, bake and export all at the click of a button or at intervals…
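
Purely as an illustration of that "define jobs, bake and export all at the click of a button" idea, a declarative job list might look like this minimal sketch (every name here is hypothetical; nothing is an existing Blender API):

```python
# Hypothetical bake-job definitions; nothing here is an existing Blender API.
BAKE_JOBS = [
    {"object": "Rock",  "channels": ["albedo", "normal"], "size": 2048,
     "export": "//export/rock_{channel}.png"},
    {"object": "Plant", "channels": ["albedo"], "size": 1024,
     "export": "//export/plant_{channel}.png"},
]

def run_jobs(jobs, bake_fn):
    """Run every job in one go; bake_fn is whatever actually bakes a
    channel and writes an image (stubbed out in this sketch)."""
    for job in jobs:
        for channel in job["channels"]:
            bake_fn(job["object"], channel, job["size"],
                    job["export"].format(channel=channel))
```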

In any case, I like the idea of encapsulating node trees into layers, a bit like geonode trees in the modifier list.

10 Likes

Really like this design! :clap: :clap:

  • Is making a single texture datablock for many uses a good idea? Or does it make the workflow too fuzzy? Is there a better alternative?

From a user POV the answer is YES. Please don’t keep different texture datablocks for different parts of the program; right now the only way to use the same texture with modifiers/GN/shaders/particles is through external images. There’s currently no way to use the same procedural texture for all those cases, and setting it up can become very cumbersome.

If not, how would filtering relevant texture assets work exactly?

Tags and catalogs in the asset browser: let users decide what type of texture an image is just by adding a tag to it or by separating them into catalogs. Most people will have different folders for different image “types” (textures, sculpt alphas, masks, etc.), so setting them up as different libraries shouldn’t be much of a problem anyway.

13 Likes

This is great news! And it’s something I’ve been looking forward to for a long while :slight_smile:
Apologies if this is a bit rambly, but the design of this has pretty far-reaching consequences for the whole procedural pipeline, so…

Stack vs Tree

The main thing that I personally don’t like the sound of so much is the idea of having a nodetree that plugs into a “stack” node as output - this doesn’t really make sense, given that a “modifier stack” is conceptually just a nodetree with only a single branch. I feel like this mistake was made once before with the old named-attribute geometry nodes, where the “real” tree of operations was hidden behind the attribute system, and most node layouts were just a very long unbroken string of nodes.

Is there a technical reason for exposing the output only as a linear stack with blending operators between each layer, rather than as a nodetree where the user can blend it themselves? In the latter case, the “Stack” can still be displayed the same way; it just limits you to a single chain of blending nodes, while still allowing the option to branch out if needed.
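
To make that concrete, here’s a minimal sketch (plain Python, hypothetical types, not anything from Blender’s actual design) of how a stack view could just be a restricted reading of a node tree:

```python
from dataclasses import dataclass

@dataclass
class Layer:
    name: str

@dataclass
class Blend:
    base: object   # previous Blend in the chain, or the bottom Layer
    layer: Layer   # the layer blended on top
    mode: str = "MIX"

def as_stack(output):
    """Return (layer, mode) pairs bottom-to-top.

    The stack UI is just this linear view of the tree; as soon as 'base'
    points at anything more complex, the UI can fall back to the full
    node editor.
    """
    rows = []
    node = output
    while isinstance(node, Blend):
        rows.append((node.layer.name, node.mode))
        node = node.base
    rows.append((node.name, "BASE"))  # the bottom layer ends the chain
    return list(reversed(rows))

# Three layers blended in sequence -- a "stack" is exactly this shape:
tree = Blend(Blend(Layer("Base"), Layer("Rust")), Layer("Dirt"), "MULTIPLY")
print(as_stack(tree))  # [('Base', 'BASE'), ('Rust', 'MIX'), ('Dirt', 'MULTIPLY')]
```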

Answers to the proposed questions

  • Is making a single texture datablock for many uses a good idea? Or does it make the workflow too fuzzy? Is there a better alternative? If not, how would filtering relevant texture assets work exactly?

I’m in the camp of splitting datablocks only if it improves the development process. The concept of a datablock is already complicated enough for users, and having to choose between a bunch of different options for which type of datablock to use would be more confusing IMO; but if different texture types have entirely separate implementations, then maybe it makes sense to split them.

  • Are there better ways to integrate textures into materials than through a node? Is it worth having a type of material that uses a texture datablock directly and has no nodes at all, and potentially even embeds the texture nodes so there is no separate datablock?

This is something for the material UI to handle; all it really needs is a way to streamline the process of adding a texture without having to open the node tree. But again, I think it makes more sense for everything to be “backed” by an actual nodetree rather than arbitrarily having different material types.
Embedding nodetrees would be very cool, especially if we could have an “expanded frame” node-group view that shows the contents of a nodegroup or texture tree alongside the shader nodes while still allowing sharing.

Actually, being able to “embed” datablocks rather than having all DBs in a global namespace would probably be a cleaner overall design for Blender as a whole, but that’d be a pretty huge task. The Godot engine has a fairly good design in this regard IMO, with “Local Resources” (scripts, shaders, etc.) which can then be exposed to other (data) users if sharing is needed, but without cluttering the asset tree with single-use resources.

That is quite a bit too ambitious for here, but in an ideal world Blender would already have something like that in place, making the answer to that question pretty simple.

  • With this system, the users can now do the same thing both in texture and shader nodes. How does a user decide to pick one or the other, is there anything that can be done to guide this?

I’d say this depends on whether embedding is possible or not. If it is, especially if the embedded nodes are visible at the same level (the expanded nodegroup thing I mentioned above), then the most intuitive thing for users would be to simply drag the nodes they want to use as a texture from the outer shader into the texture nodetree. Again kind of ambitious, but that’s the best world I can imagine :slight_smile:

  • What is a good baking workflow in a scene with potentially many objects with many texture layers? In a dedicated texturing application there is a clear export step, but how can we make a good workflow in Blender where users must know to bake textures before moving on to the next object?

Ideally? It’d be as automatic and seamless as possible. I’m assuming that all types of texture block are intended to be static (i.e., no drivers expected in texture nodetrees), in which case there’s no reason not to bake the textures. Designing a disk cache for expensive procedural texture setups, to keep blendfile sizes low while still allowing persistent caching, would probably be a necessity as well.
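
As a rough sketch of such a disk cache, assuming the node tree can be serialized and a bake function exists (both stubbed here; the cache location is made up):

```python
import hashlib
import pathlib

CACHE_DIR = pathlib.Path("~/.cache/blender_textures").expanduser()  # made up

def cache_key(tree_bytes: bytes, resolution: tuple) -> str:
    """Hash the serialized node tree plus bake settings; any edit to the
    tree changes the key, so stale results can never be reused."""
    h = hashlib.sha256(tree_bytes)
    h.update(f"{resolution[0]}x{resolution[1]}".encode())
    return h.hexdigest()

def cached_bake(tree_bytes, resolution, bake_fn):
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    path = CACHE_DIR / f"{cache_key(tree_bytes, resolution)}.exr"
    if not path.exists():
        bake_fn(path, resolution)  # expensive: evaluate the nodes, write image
    return path  # the .blend stores only this path, keeping it small
```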

  • How do we define what a modifier is in the layer stack? Conceptually this can just be any texture node that has a layer input and output, but how do we find a set of such nodes and node groups that work well in a higher level UI? Possibly some asset metadata on node groups?

See the first section of my response, although this question seems to suggest a slightly different workflow to the mockup tree screenshots, so I’m not totally sure.

  • Can we improve on the name of the Texture datablock or the new planned nodes?

“Texture Nodes” works perfectly - I assume this is going to fully replace the existing Texture Nodes. Every so often, when introducing other procedural artists to Blender, they get excited that something with that name exists and then disappointed by what it currently is; hopefully this would let them keep that excitement :wink:

5 Likes

I commented this on the blog, but I guess this is a better place for discussion.

  • If there’s going to be a common subset of nodes that works across all tree types, are textures the best interface?
  • Will texture nodes end up being used as a workaround for non-texturing use cases?
  • Could it make more sense to have “function” nodes?
  • If some nodes (like blur) split the workflow and the evaluation into two different paths, do they belong in the same interface?
  • Will there be overlapping functionality between texture and compositing nodes?
  • Will there be non-overlapping functionality between texture and compositing nodes?

1 Like

Resolution is a big one, IMO.

1 Like

I didn’t add it in the design doc, but I think it would be useful to support nesting/folders. In terms of nodes, that could possibly just be a Layer Stack node linked into another Layer Stack node.
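
A toy sketch of that nesting idea (hypothetical structures, not the proposed implementation):

```python
from dataclasses import dataclass, field

@dataclass
class Layer:
    name: str

@dataclass
class LayerStack:
    name: str
    layers: list = field(default_factory=list)  # Layer or nested LayerStack

def flatten(stack: LayerStack, depth=0):
    """Walk nested stacks the way a folder UI would draw them."""
    for item in stack.layers:
        if isinstance(item, LayerStack):
            print("  " * depth + f"[{item.name}]")   # folder row
            flatten(item, depth + 1)
        else:
            print("  " * depth + item.name)          # layer row

rocks = LayerStack("Rocks", [Layer("Base Color"), Layer("Cracks")])
material = LayerStack("Material", [Layer("Dirt"), rocks])
flatten(material)  # prints Dirt, then the Rocks folder with its two layers
```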

There’s also nothing preventing us from supporting editing full node graphs as a tree, similar to what is now in the material properties. Though I think the usefulness of that is limited right now and the UI is not that great; possibly a better UI could be designed for both cases.

It’s not so much about the development process, but about the ability to share texture nodes between different areas of Blender.

I agree this would be nice.

Automatic would be great. In practice for interactive editing you’d likely work at a lower resolution, and then bake at high res only at the end. Depends on what kind of resolution you want I guess, but interactive baking of 8K textures is probably not feasible.

13 Likes

Selecting an image texture or color attribute would enable painting on it in the 3D viewport.

I would like to add a sentence.

Saving the Blend file also automatically saves the texture, just like every other datablock in Blender.

8 Likes

There’s already a common subset of nodes, but no way to put them in a group and share them. I think texture nodes can fit that role.

I guess you mean field nodes?

There are trade-offs here: you can make it closer to geometry nodes, and then you’ll end up with a bunch of nodes that can’t work as shader nodes or are difficult to add GPU acceleration for. I don’t think there’s an ideal design here that is the best of both; as you go one way, the other use case becomes harder.

I think the primary use case is shading, and so that’s what it’s closest to.

The idea is that texture nodes are separate from shader nodes because of this. If you’re proposing to pull out just a few nodes like Blur somehow, it’s not obvious how that would work.

Yes and yes. There is overlap between all node systems; where exactly to make those divisions is a tricky question, but you need some division to keep things from becoming a mess.

Having a node system for texture painting is indeed essential, but it shouldn’t be the first thing people dive into. The go-to model right now is the Photoshop/Krita/Substance/Mixer one, where the layers sit at the top right, with the filters/effects affecting each layer kept separate. That is what a typical user should see first.

The node system may become too intimidating for new users who want to accomplish simple tasks first, I fear. The initial UI could retain what I described above (a Substance-like UI), while an entire node tree (think modifier stack vs. Geometry Nodes graph) for the same data would be the place for finer tuning and more control.

Secondly, I’d beg for better performance as a main focus of the new system. People are happy to put in hours and hours to paint the perfect texture using simple tools, but if it lags as badly as it currently does (compared to Photoshop/Substance/Mixer), it just becomes a pain. Make it buttery smooth, please! On a similar note, sculpting performance is about 20-30% behind where I personally feel it needs to be to beat ZBrush outright.

Thirdly, please do add tools for NPR this time. There are tons of people willing to give good feedback, with high hopes that the new system won’t be friendly only to realism, as the rendering improvements of the last two years have been.

9 Likes

The ability to switch resolutions on each texture node (for testing purposes) would be a great feature. Imagine having a dropdown for each texture, where Blender has generated an upscaled and a downscaled version of your loaded texture.

For example, a texture node has a 2K image loaded from somewhere on my PC. Inside Blender, an upscaled 4K version (using perhaps an open-source algorithm) would be generated, alongside downscaled 1K, 512 and 256 versions (changeable in settings). And this is presented as a drop-down menu.

So while baking or previewing in the viewport, we could switch to a lower or higher resolution with two clicks of that drop-down.
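
For the downscaled variants at least, something like this is already expressible with today’s Python API; the naming scheme and the per-node dropdown are assumptions, and upscaling would need an external algorithm:

```python
import bpy

def make_variants(image_name, sizes=(1024, 512, 256)):
    """Duplicate an image datablock at several lower resolutions."""
    src = bpy.data.images[image_name]
    variants = {max(src.size): src}
    for size in sizes:
        variant = src.copy()              # duplicate the datablock
        variant.name = f"{src.name}_{size}"
        variant.scale(size, size)         # in-place resize
        variants[size] = variant
    return variants                       # a dropdown could swap these per node
```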

3 Likes

It’s true, but also not specific to this design; at least in the current proposal, images in texture nodes are no different from images anywhere else.

What I think is interesting in this regard is thinking about how to avoid having a ton of images embedded in the .blend; for example, new images could automatically get a file path.
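
A sketch of that auto-filepath idea using the existing Python API (the //textures folder is just an example, and this assumes the .blend has been saved so relative paths resolve):

```python
import bpy
import os

def externalize_images(subdir="//textures"):
    """Give path-less images an automatic file on disk, so the .blend
    stores a reference instead of embedding the pixels."""
    os.makedirs(bpy.path.abspath(subdir), exist_ok=True)
    for img in bpy.data.images:
        if img.filepath or img.source == 'VIEWER':
            continue  # already on disk, or not a real image
        img.filepath_raw = f"{subdir}/{img.name}.png"
        img.file_format = 'PNG'
        img.save()    # writes the file; the .blend keeps only the path
```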

4 Likes

Hi. I’m the developer of the Baketool addon and a regular user of other commercial 3D texture painting software. To me it’s really nice to see some progress in this area of Blender. Congrats!

Overall I like the proposal a lot, and I don’t have any problem with some overlapping areas between texture and material nodes. It’s like choosing whether to define the albedo color in the texture or in the material; either is fine.

My only concern with this design is how it will handle baked data of objects/meshes. Let’s say I want to make a semi-procedural rock texture that can be used on many objects; it should account for channels like AO and curvature that are attached to the Mesh datablock, not the texture datablock itself. That’s exactly what other commercial apps do. So I don’t think baked mesh data should be treated as regular/exposed image datablocks in this design (besides, it should be possible to convert to/from those), but rather as more “internal” data that the filters/nodes/FX in the texture nodes can access and that can be generated on demand (it could even be a button on the mesh datablock: “Generate Baked Data”).
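
To illustrate that “internal data on the mesh” idea, here is a minimal sketch using generic mesh attributes; the attribute name is made up and the actual bake is stubbed out:

```python
import bpy

def store_baked_ao(obj, ao_values):
    """Store one AO float per vertex as a generic mesh attribute,
    which texture nodes could then read like any other attribute."""
    mesh = obj.data
    attr = mesh.attributes.get("baked_ao")
    if attr is None:
        attr = mesh.attributes.new(name="baked_ao", type='FLOAT',
                                   domain='POINT')
    attr.data.foreach_set("value", ao_values)  # len == len(mesh.vertices)
```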

I would also like to see a better separation between this and the bake system that produces final images for use in game engines. Baking to exported/final textures has other specific questions to be addressed (like composing multiple bake results into single images, or baking multiple objects into a single atlas), and mixing the two into the same problem will bring no benefit.

My two cents.

5 Likes

And wouldn’t that be a better target?
That’s what I meant by function nodes.

An editor for node groups that can be used as functions across different tree types.

Each group declares which tree types it supports, and the editor only allows the common subset of nodes they share.

That would alleviate the need to cram different use-cases into the same tree type, and would make designing new node systems more forgiving.
It could also allow using specific node group signatures for special use cases, like loops and “foreach” functions.
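
A toy sketch of that declaration model (all node names and tree types here are placeholders):

```python
# Hypothetical declarations: which built-in nodes each tree type supports.
SUPPORTED_NODES = {
    'SHADER':   {"Math", "Mix", "Noise Texture", "Image Texture"},
    'GEOMETRY': {"Math", "Mix", "Noise Texture", "Position"},
    'TEXTURE':  {"Math", "Mix", "Noise Texture", "Image Texture", "Blur"},
}

def allowed_nodes(tree_types):
    """A cross-type 'function' group may only contain nodes that every
    declared tree type supports -- the intersection of the sets."""
    return set.intersection(*(SUPPORTED_NODES[t] for t in tree_types))

print(sorted(allowed_nodes({'SHADER', 'GEOMETRY'})))
# ['Math', 'Mix', 'Noise Texture']
```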

Sure, but my point was that one advantage of procedural textures is that they’re not limited by resolution. You can’t just bake the texture of a whole landscape, for example.

1 Like

This is indeed something that’s not so clear in the design: where exactly the baked images should live. It would be good to support baking to mesh attributes, and those would naturally be stored on the mesh. Attaching baked images to a mesh instead of a material would be helpful and consistent in that way.

A challenge would be how to represent this well in the UI: you’re editing texture nodes, material nodes and also data on a mesh, all together.

This would require some non-trivial changes in rendering to specialize materials to meshes, though potentially it could be implemented in a general way that works for all renderers.

I’m not sure this distinction is needed. I think that it’s useful to see in Blender exactly what the final images look like on a model, and to avoid having to bake once to see things in Blender and another time to export.

I think it would make sense to have native settings to control such channel packing or combining multiple objects. This could then be used both by baking, as well as for rendering.
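
As an illustration of such channel packing, here’s a sketch that combines three grayscale bakes into one RGB image (an ORM-style layout, as used by glTF) via the flat pixel buffer bpy exposes; the image names are just examples:

```python
import bpy

def pack_channels(r_name, g_name, b_name, out_name="packed"):
    """Pack the R channel of three grayscale bakes into one RGB image."""
    r, g, b = (bpy.data.images[n] for n in (r_name, g_name, b_name))
    w, h = r.size
    out = bpy.data.images.new(out_name, width=w, height=h)
    px = [0.0] * (w * h * 4)
    rp, gp, bp = r.pixels[:], g.pixels[:], b.pixels[:]
    for i in range(w * h):
        px[i*4 + 0] = rp[i*4]   # e.g. ambient occlusion
        px[i*4 + 1] = gp[i*4]   # e.g. roughness
        px[i*4 + 2] = bp[i*4]   # e.g. metallic
        px[i*4 + 3] = 1.0
    out.pixels = px
    return out
```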

With baking nodes there is potentially more flexibility, but with more fixed settings you can do the conversion in two directions.

4 Likes

yeah, in regards to procedural textures, I think some of the tools/features that can be added is:

  • An ability to make non-procedural textures into procedural ones (think BlenderGuru’s video/nodegroup on texture tiling), plus Photoshop-like fixing of texture seams; currently it’s very hard to convert textures into seamless ones inside Blender alone (see the offset sketch after this list).
  • An ability to break procedural and non-procedural textures into blocks that are easy to replicate in Unity/Unreal/Godot.
  • A working Stamp tool, that can pick source from 2D image in image-editor! Or 3D source
  • A working mirror ‘visualizer’ while texture painting
  • A gouache noise filter for a painterly look!
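
Regarding the seam-fixing point above, here is a minimal sketch of the Photoshop-style offset step, which wraps the image by half its size so the seams land in the middle where they can be painted out (numpy plus bpy’s pixel buffer; the function name is made up):

```python
import bpy
import numpy as np

def offset_half(image_name):
    """Wrap an image by half its width/height so tiling seams become
    visible in the center, ready to be painted out."""
    img = bpy.data.images[image_name]
    w, h = img.size
    buf = np.empty(w * h * 4, dtype=np.float32)
    img.pixels.foreach_get(buf)                      # read flat RGBA floats
    px = np.roll(buf.reshape(h, w, 4), (h // 2, w // 2), axis=(0, 1))
    img.pixels.foreach_set(px.ravel())               # write back in place
```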

For a landscape, I agree: instead of using a 4K/8K texture, it’s more efficient to use a 1K texture and some maths. My point is that some people want the manual control.

The new frontier of texture systems is the use of blocks, like Substance Designer. I make a custom emblem in Photoshop/Krita, then import it into Blender to use as a block in my node tree, but editing it on the fly is a pain (because I can’t do that inside Blender currently; I have to fix it in PS/Krita and then reimport it again).

2 Likes

Texture nodes need more functionality than just the common subset between all node systems, so I don’t think such a system could be used instead of texture nodes.

Adding another mechanism besides texture nodes to share node groups may be useful, though I can’t really think of practical use cases that would justify it. In any case, it seems outside of the scope of the texturing project unless it specifically solves the texturing problem.

  • With this system, the users can now do the same thing both in texture and shader nodes. How does a user decide to pick one or the other, is there anything that can be done to guide this?

Wouldn’t it be good to integrate baking and PBR creation features into shader nodes, and keep features that work directly on textures (like texture editing and texture layers) in Texture Nodes, so as to avoid any confusion?

Baking should indeed be at the material or mesh level, not inside the texture nodes.

However it’s not clear to me what the difference would be between “PBR creation features” and “texture editing and texture layers”.

Improving texture painting performance is a separate project that will be tackled before texture layers. Some details on that are here:

1 Like