Layered Textures Design Feedback

Thanks for the mockup. My two main questions with this type of design are:

  • Is it possible to create a higher level UI for this? Or does it basically mean this task becomes more technical with manual node setups, and that’s a trade-off you’re willing to make for more flexibility?
  • Would there still be another texture node system for geometry nodes, brushes, compositing, …? And to what extent would it overlap and be combinable with shader nodes?

It seems problematic to put that information in the node graph, because you’re mixing how to do a bake (the graph), and the control mechanics (filepath, resolution, etc). That will limit usability in the asset browser (output paths are usually never the same), and also has sharing difficulties within the same Blender project as you note.

It’s a Model-View-Controller problem. The node graph is the view, the model is the data elements in the graph, and you need elsewhere a controller to orchestrate it all.
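To make the separation-of-concerns argument concrete, here's a rough Python sketch of it. All names are hypothetical, not actual Blender API; the point is only that the output path and resolution live in the controller, not in the graph:

```python
# Minimal sketch of the MVC-style separation being argued for.
# All class names are hypothetical, not real Blender API.

class TextureNodeGraph:
    """The model: reusable graph data, with no bake settings inside it."""
    def __init__(self, nodes):
        self.nodes = nodes

class BakeController:
    """The controller: orchestration lives outside the graph."""
    def __init__(self, graph, filepath, resolution):
        self.graph = graph
        self.filepath = filepath      # output path stays out of the graph,
        self.resolution = resolution  # so the graph stays shareable as an asset

    def bake(self):
        # Evaluating the graph is stubbed out; return a summary of the job.
        return {"path": self.filepath, "size": self.resolution,
                "nodes": len(self.graph.nodes)}

graph = TextureNodeGraph(nodes=["noise", "mix", "output"])
job = BakeController(graph, "//textures/char_base.png", (2048, 2048))
result = job.bake()
```

The same graph could then be reused with a different `BakeController` (different path, different resolution) without touching the graph itself.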

The design does have a separation. Baking settings and file paths would be configured in the material or mesh datablock, while the (potentially reusable) texture nodes would be in the texture datablock.

I’m not sure MVC is the right analogy though.

Hm, still not seeing it. Maybe you have a radical new approach here that is forward thinking, but existing DCCs operate with a separation between model data, materials, and the settings to conduct the bake. If MVC doesn’t appeal to you, the point is that I think there needs to be a separation of concerns, in that baking settings are different from the data they operate on.

If the bake settings and baked image texture are not directly associated with a mesh or material, then where are they? And if they are somewhere else, how will Blender render or export the object with baked image textures, if not using that association?

For the design of the (Layering) node, I had proposed a design. Why not take this one?


Is-a vs Has-a. Are bake settings a node graph (is-a), or do they have a node graph?

I see it as similar to a render - the render settings are in a render tab in properties. They’re not in the shader node graph. A bake is just an offline render, so why should it be different and include the baking settings in the graph? As for where to put them: either a new Properties tab, or a clear location in the Render properties, if you want to highlight the correspondence with rendering. I’d be in favor of a new tab - the render properties tab is too big already. Edit: yeah I can see a new tab right under render - no color, but an icon of a loaf of bread - love it!

To answer your question: Blender would render/export the object with baked image textures via those properties. But it does push the bake settings up to the project level globally, which could make it difficult if you want to bake a scene all in one go. If you want to support that workflow, then the Properties UI needs to accommodate it.


The Layer Stack node in the proposal is similar to this. However each layer is not just an RGB color, it must be a set of channels (Base Color, Roughness, Metallic, …), which is what we propose a layer socket for.

The exact UI design for that node is to be determined, presumably it would have e.g. a button to add a new layer like your mockup.
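A rough sketch of the "each layer is a set of channels" idea, in plain Python. The channel names come from the proposal; the function and data layout are made up for illustration:

```python
# Sketch: a layer is not just an RGB color but a set of channels,
# and blending two layers mixes every channel together.
# The dict-of-channels representation here is hypothetical.

def blend_layers(bottom, top, factor):
    """Mix two layers channel by channel; each layer is a dict of channels."""
    out = dict(bottom)
    for name, value in top.items():
        base = bottom.get(name, 0.0)
        out[name] = base * (1.0 - factor) + value * factor
    return out

rock  = {"Base Color": 0.4, "Roughness": 0.9, "Metallic": 0.0}
paint = {"Base Color": 0.8, "Roughness": 0.3, "Metallic": 0.0}

top_coat = blend_layers(rock, paint, factor=0.5)
# all channels travel together, which is what a "layer socket" would carry
```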


Would there still be another texture node system for geometry nodes, brushes, compositing, …? And to what extent would it overlap and be combinable with shader nodes?

I hadn’t considered the geometry ones.
With the brushes, yes, since the input object when performing the bake could be a plane. Basic geometric primitives would make it easy to create brush tips, although with a bit of skill they can also be created with a gradient and math.
I don’t know if compositing could end up in a loop, and on the other hand the Matte nodes are quite specific, but for the rest I don’t see a problem.
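As an aside, the "gradient and math" route for brush tips is easy to sketch: a radial gradient shaped by a power function gives a round falloff. This is pure Python for illustration, not Blender API:

```python
# Sketch: building a simple round brush falloff from a gradient plus math.
import math

def radial_falloff(size, hardness=2.0):
    """Return a size x size grid: 1.0 at the center fading to 0.0 at the corners."""
    center = (size - 1) / 2.0
    max_r = center * math.sqrt(2)  # distance from the center to a corner
    rows = []
    for y in range(size):
        row = []
        for x in range(size):
            r = math.hypot(x - center, y - center) / max_r  # the gradient, 0..1
            row.append(max(0.0, 1.0 - r) ** hardness)       # the math shaping it
        rows.append(row)
    return rows

tex = radial_falloff(5)
# the center pixel is full strength, the corners fall to zero
```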

Is it possible to create a higher level UI for this? Or does it basically mean this task becomes more technical with manual node setups, and that’s a trade-off you’re willing to make for more flexibility?

I do not understand the question very well, but I think a higher level is not necessary. Right-clicking on any image node could invoke the Bake; doing so would add a couple of inputs with the sources (the object and the camera) to the node, and the paths and formats could be displayed in the sidebar. A button for baking should also be added.
At the end of the render, the original node would be frozen unless the Bake properties are removed, or muted.
I understand the bake as a rendering of the outputs of the node itself, that is, the bake of a Layer or of the LayerStack would generate several textures, one for each Channel.
I hope I have answered your questions.

My question is not exactly where they are in the UI, but in which datablock? I’m guessing you suggest having them in the scene datablock then, if they are at the level of scene properties in the UI?

If so, if you link in say a character into another file, you want to bring along the baked image textures. But you don’t want to bring along the entire scene of course. In that sense you don’t want each scene to have a different “view” of the character (and when you do, that’s what overrides are for).

Can you list all the input sockets and the output sockets (once they are exactly determined)? Then I can make some mockups :slight_smile:

I was also thinking of similar idea as @wevon
Maybe Baking could be expressed in Properties as hierarchy? The way it could work is to show baked data as a set of nested levels similar to Collections in the Outliner.
Every baking level could be expandable with an arrow and inside would be all needed settings for given hierarchy level.
User could then bake entire hierarchy at once or only specific parts / levels.
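The nested-levels idea could be sketched like this. The class and the sample hierarchy are hypothetical, just to show "bake the whole tree, or only a branch":

```python
# Sketch: bake settings arranged as a hierarchy (like Collections in the
# Outliner), where baking a level bakes everything nested inside it.
# Names are hypothetical.

class BakeLevel:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

    def bake(self):
        """Bake this level and every level nested beneath it, in order."""
        baked = [self.name]
        for child in self.children:
            baked.extend(child.bake())
        return baked

character = BakeLevel("Character", [
    BakeLevel("Body", [BakeLevel("Skin"), BakeLevel("Eyes")]),
    BakeLevel("Clothes"),
])

everything = character.bake()             # the entire hierarchy at once
just_body = character.children[0].bake()  # or only a specific part / level
```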


I’d propose a new datablock, equivalent to the scene, which holds the bake settings. It’s like, but also unlike, all of the datablocks discussed here, so it seems a new one is in order.

If so, if you link in say a character into another file, you want to bring along the baked image textures. But you don’t want to bring along the entire scene of course. In that sense you don’t want each scene to have a different “view” of the character (and when you do, that’s what overrides are for).

Yes, so you could bring along the character and its bake datablock, but not the scene in that case.

By the way, I’m also a developer of many decades, and am currently fixing a bug in the animation system (as a new Blender developer). If you have need of volunteers for this project, texturing is more my wheelhouse than animation.


Yes, mainly the ones you mentioned:

  • Pack separate channels in various ways (ORM map for Unreal)

  • Convert a channel and then pack it (Invert roughness to create glossiness and pack it as albedo alpha for Unity Standard shader)

  • Merge height and normals together to export only one of the maps (for a shader that supports only one or the other)

  • A dream come true - being able to bake texture info to a vertex color channel (for example a shader is using red vertex color as a metallic input, to save on memory. Even approximated values would work fine there, as metallic is generally low frequency in details)
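The packing tricks in the first two bullets are simple enough to sketch for a single pixel. Plain Python for illustration; a real bake would apply this per pixel over whole images:

```python
# Sketch of two channel-packing workflows from the list above.

def pack_orm(ao, roughness, metallic):
    """Unreal-style ORM map: Occlusion in R, Roughness in G, Metallic in B."""
    return (ao, roughness, metallic)

def pack_unity_standard(albedo_rgb, roughness):
    """Unity Standard shader: invert roughness to glossiness,
    then pack it into the albedo alpha channel."""
    glossiness = 1.0 - roughness
    return (*albedo_rgb, glossiness)

orm = pack_orm(ao=1.0, roughness=0.8, metallic=0.0)
unity = pack_unity_standard((0.5, 0.4, 0.3), roughness=0.8)
```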

I have a big idea. I’ll make a mockup and post it here, but basically the idea is: create a sublayer inside a layer and give it a specific name like “SPECULAR” for specular… or an item list to choose its function. With this technique we can add many sublayers in the layers, and compile the layers, and the same for the next layers. Basically it would look like this:

  • PARENTS LAYER COMPILED
    – SPECULAR
    – AO
    – ROUGHNESS

  • LAYERS

I don’t know if that is clear, but I’ll make a mockup and post it soon :slight_smile: I have big plans for you :wink:

@Limarest, that makes sense.

@Stimes, I think what you call a “sublayer” is what we call a “channel” in the design proposal.

If there is a Bake datablock (or Image Texture Set datablock as I called it above), I think it would be for the purpose of sharing baked image textures between multiple objects.

Not for having multiple “views” of a specific mesh + material, where you can swap out one Bake datablock for another somehow. If you want that then I think material overrides make more sense.

And it’s not clear to me how that kind of Bake datablock would be equivalent to a scene datablock.

Great. When we start developing this project, it will all be done in public and we can make a list of tasks for people to pick up.


Agree with the above - I don’t mean to imply that the “bake/image texture set” data block is equivalent to the scene, but that it’s of the same ‘level’ (the queen to the king as it were). And agree that there’s no need for the ‘view’ swapping you mention, I wasn’t thinking of that.

An alternative to a Bake or Image Texture Set datablock would be to make it possible to define multiple channels on an Image datablock. That could correspond to a multilayer EXR file, or multiple image files with control over file names, channel packing, etc.

I actually really like that idea, but need to think it over more.

In the image editor you’d have just a single image datablock, and you could easily swap between the channels. It would automatically unpack the channels so you can view each one. You’d also be able to easily import baked image textures from other software this way for rendering.

I feel like this solves the problem of sharing an image texture set between multiple objects really elegantly, just through a single Image datablock.
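A rough data-model sketch of the multi-channel Image idea, with hypothetical names; the real datablock design would of course differ:

```python
# Sketch: an Image datablock holding multiple named channels, which could be
# written out as one multilayer EXR or channel-packed into several files.

class MultiChannelImage:
    def __init__(self, name):
        self.name = name
        self.channels = {}  # channel name -> pixel data

    def set_channel(self, channel, pixels):
        self.channels[channel] = pixels

    def output_files(self, packing):
        """packing maps an output filename to the channels packed into it."""
        files = {}
        for filename, channel_names in packing.items():
            files[filename] = [self.channels[c] for c in channel_names]
        return files

img = MultiChannelImage("character_bake")
img.set_channel("AO", [1.0])         # one-pixel stand-ins for real image data
img.set_channel("Roughness", [0.8])
img.set_channel("Metallic", [0.0])

# one datablock, shareable between objects, with packing decided at output time
packed = img.output_files({"char_orm.png": ["AO", "Roughness", "Metallic"]})
```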


Well, maybe I’m beginning to see the merit of your idea. Present baking DCCs rely on other applications (e.g. Blender, Max) to generate mesh data, so they have to have a separation between the data and the settings to bake. But Blender is something different, maybe the first application where you can create a professional model and bake it in one application. So does there need to be a separation?

This idea you have makes it more clear. A render node (BSDF) or an Image Texture Set datablock just has outputs; baking is offline instead of online rendering. Adding channels to output offline to disk (e.g. a bake), well, why should that be elsewhere? Just put it in your node graph, and it will travel along where it needs to go.

With Blender we’ll have a higher degree of control and customization than is normally available. Even Substance has the node graph off in a different application. Here it’s all one playpen.

Yeah, OK it is radical, but that’s Blender for you. To me this idea - multiple channels - made it click in my brain, as output bake textures are just another output.