Layered Textures Design Feedback

I’m assuming that’s a question that will be followed by a point you want to make, so just make the point immediately?

5 Likes

Just answer the question. I’m not a mind reader, and don’t want to make assumptions.

Just make the point. Even if Brecht may or may not have used it, many of us definitely haven’t; it would be nice to follow along.

6 Likes

What point would I have to make? I don’t know who Brecht is, or what he can or cannot do.

FYI, asking questions is how you get to know someone, or something. It’s also how you find ways to relate to others and build relationships, let alone solve problems and figure out and build solutions.

If I wanted to throw darts at the wall like the rest of you, then I guess that would be the point of this thread, and Brecht would continue to find ways to dispute every idea, casting each into another unsolvable problem that cannot be resolved in accordance with his perspective on the topic at hand. In the end, I asked a question so I can get to know what Brecht already knows.

1 Like

I went off the assumption the conversation would go like “if you used product X, you’ll know how great/awful feature Y is”. You could easily describe feature Y in words for those of us who haven’t used it, rather than relying on Brecht’s personal experience with product X. Since you seemingly did not want to go in that direction, sorry for interfering, carry on!

5 Likes

It is ok to ask questions, but generally speaking, if you don’t have a point for what you are asking, then it’s a bit empty. As an example, if the conversation goes “Do you use Substance Painter and Designer?”, “Yes I do”, “Cool”, it simply doesn’t add anything to the discussion. If you had a goal in asking whether he uses the Substance suite, such as “Do you use the Substance suite because blank, and that is something I would like to be able to do in Blender as well?”, then there is a discussion that can be had.

Brecht and Dodo are saying that if you had a follow-up like that in the first place, then you should simply post it with the question instead of waiting for a response. It keeps posts clean and easy to follow. It’s not that this isn’t a friendly place; it’s just helpful to the discussion, and considered good forum etiquette.

14 Likes

If you think I’m rejecting ideas that should be considered, please go into the arguments I provided for that. For example when I said above that trying to be compatible with a 2D image editing layer stack is too limiting, someone could explain how the missing features I listed could fit into that design, or why these features are not essential, or why those should be part of another system, etc.

I’ve thoroughly looked at what other apps are doing for texturing, but I don’t want to make an argument from authority, the design should stand by itself and work in the context of Blender.

12 Likes

OK, how about presenting a new visual mockup of where you are at with what is being suggested?

I’m saying that because there are 250+ replies to your initial post, with everyone taking a great deal of their time to provide you with visual and detailed solutions. The community at the moment doesn’t have a clear impression of what you will and will not integrate, because you have a vision we haven’t yet been shown, visually speaking. So when you turn down an idea in a reply, there is no way for anyone to compare what your current approach will be vs. our ideal approach.

Do you have a working prototype yet?

Either you have a clear visual representation, today, incorporating the responses given, that communicates what it will be in absolutes, or we are wasting our own time, having been presented a topic that says “Here’s a mockup with no absolutes; present me your best ideas and wishes for the texturing suite that the future Blender can offer”.

I honestly think your proposals will benefit greatly if you start sharing the progress of the implementation you currently have, in action. Otherwise, everyone is shooting blanks in hopes something sticks, while others, like me, will notice a pattern: you have a tendency to often say “that will not work”. How about not replying at all unless it is something that will work, or something you find brilliant and want to know more about? Seems fair. That would certainly give anyone reading an impression of the direction this mode is heading for implementation.

With all of this said, am I right? I don’t know, but I think this would cut down on unnecessary replies and suggestions and rapidly bring the feedback you seek, instead of theorizing the whole thing for another 200+ replies with no updates from your end of the table.

The initial post asking for feedback said they “plan to start implementing a system like this later this year, so there’s plenty of time for feedback and iteration on the design before that.” So it is a bit early to take time away from his work to make mockups and give you a “clear visual representation today”.

19 Likes

I don’t understand the fuss; there’s been a pretty healthy back and forth so far, and whenever there were concerns with contributions/designs, it seems to me they were outlined clearly.

19 Likes

Hi, regarding the design: I have just finished going through all of Brecht’s posts in this thread, and then I made a mockup.

Although it is similar to others in this topic, I think my design adds some additional value to the general discussion.

Nodes and Layers.

There are three things crucial for texture editing that the current shader nodes lack:

  1. Multi-channel data (layers) and workflows (layer stacks, layer combining, etc.).
  2. Support for nodes that are hard to execute at render time (blur nodes, filters, painting on textures, etc.).
  3. A high-level UI for layer painting.

IMHO we can add all three to the shader editor in a way that wouldn’t clutter the UI too much and would be easier UX-wise.

  1. Adding multi-channel data is pretty straightforward: add a new socket color, a channel-combine node, and a layer stack node.
    Even the conversion between data types is relatively simple.
    I imagine the Asset Layer would be a node group.

  2. For the second need, I suggest a new kind of data: cached data.
    It’s similar to the way fields differ from regular data.
    Every type of data (float, vector, color) except shaders could be cached.
    Most nodes would work on both kinds of data, but some nodes would only work on cached data.
    This caching serves painting, special filters (such as blur), and optimization.


  3. For the high-level UI, I suggest a texture output node similar to the material output node.
    The output would be presented in the Properties editor under the texture tab.
    The list of layers would be presented with an Outliner-style UI. Internally, I imagine the stack would behave the way materials are currently presented in the Properties editor, where you can uncollapse any connected input to edit a lower-level layer/node.
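To make point 2 above more concrete, here is a minimal sketch (plain Python, none of it real Blender API, all names mine) of the cached-data idea: a socket value is either procedural or cached, and a node like Blur, which needs neighbouring samples, only accepts cached inputs.

```python
# Hypothetical sketch of the "cached data" proposal: values are either
# procedural (evaluated per sample at render time) or cached (baked to a
# buffer). Nodes like Blur require a cached input. Illustrative only.
from dataclasses import dataclass

@dataclass
class SocketValue:
    data: list            # flattened sample values
    cached: bool = False  # True once the value is baked to a buffer

def cache(value: SocketValue) -> SocketValue:
    """Explicit Cache node: bakes a procedural value into a buffer."""
    return SocketValue(data=list(value.data), cached=True)

def blur(value: SocketValue, radius: int = 1) -> SocketValue:
    """Blur needs neighbouring samples, so it only works on cached data."""
    if not value.cached:
        raise TypeError("Blur requires a cached input; insert a Cache node")
    out, n = [], len(value.data)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(value.data[lo:hi]) / (hi - lo))
    return SocketValue(out, cached=True)

procedural = SocketValue([0.0, 1.0, 0.0, 1.0])
blurred = blur(cache(procedural), radius=1)   # works: input was cached first
```

Feeding `procedural` straight into `blur` would raise a `TypeError`, which is the behaviour the “state of the cached/regular node would be presented in many places” note is meant to surface in the UI.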

Baking:

I’ll use the proposal’s list of use cases for the different kinds of baking:

  1. Exporting PBR textures to a game engine.
  2. For textures with many layers, baking is important for efficient rendering in Cycles and Eevee.
  3. Some nodes like Blur, Filter, and others require baking to work at all, as they can not be implemented efficiently or at all in renderers.
  4. Baking procedural texture layers into an image texture or color attribute to continue hand painting.
  5. Baking multiple materials onto one mesh.
  6. Baking to attributes.
  7. Baking from different objects (Normal/Displacement detail from higher res models).
  8. Baking render data such as shadows, direct/indirect lighting, etc.

I think some of these bakes are fundamentally different from the others.

IMHO there should be 3 or 4 kinds of baking in Blender, each in a different place in the UI.

  • The per-node caching system in my proposal solves use cases 1, 2, 3, and 4.
  • A bake button in the Material/Data properties editor would solve 5 and 6.
  • More complex baking in the Scene properties editor would solve use cases 7 and 8; maybe this kind of bake should be held in a dedicated baking graph.
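The routing above can be summed up in a small table; here it is as a plain Python sketch (the kind names are mine, purely illustrative, not anything in Blender):

```python
# Illustrative mapping of the eight bake use cases listed above to the
# three proposed kinds of baking. Kind names are hypothetical.
BAKE_KINDS = {
    "per-node cache":       [1, 2, 3, 4],  # export, heavy layers, blur/filter, hand painting
    "material/data bake":   [5, 6],        # multi-material merge, bake to attributes
    "scene-level bake":     [7, 8],        # cross-object detail, render data (shadows, etc.)
}

def kind_for(use_case: int) -> str:
    """Return which of the three bake kinds handles a given use case."""
    for kind, cases in BAKE_KINDS.items():
        if use_case in cases:
            return kind
    raise KeyError(use_case)
```

Every use case maps to exactly one kind, which is the point of splitting the baking UI into separate places rather than one monolithic bake panel.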

Additional notes:

  • Painting would only work on a cached layer.
  • The state of each node (cached/regular) would be presented in many places.
  • I am not sure it’s possible to use the same texture stack for brushes/geometry, as the textures are tied closely to the material they originate from, although the same material can easily reside on different meshes.
    Maybe a new Brush Output node which only accepts cached data.
  • A specific Cache node may be useful.

I hope I’ve been clear enough :sweat_smile:

6 Likes

To make things a bit more concrete, here’s what the baking UI could look like. This would be accessible from the Texture Channels node in the proposal, with most of these settings stored on the Image datablock (mycharacter.png in this example) to be shared across multiple materials and meshes.

[ Update Geo Cache ]  [      Bake        ]

[▼] mycharacter.png    [ ][x]

File Path    //textures/mycharacter_<CHANNEL>_<UDIM>.png
Bake Res     2048px
Preview Res  512px

Channel     Type    Bake  Token         Pack
-----------------------------------------------
Base Color  Color   [x]   base_color    RGBA ▼
Roughness   Float   [x]   roughmetalao  R    ▼
Metallic    Float   [x]   roughmetalao  G    ▼
Normal      Vector  [x]   normal_map    XYZ  ▼
-----------------------------------------------
AO          Float   [x]   roughmetalao  B    ▼
Curvature   Float   [ ]
Cache       Color   [ ]
Cache.001   Float   [ ]

This would list all image textures associated with the texture datablock, including input, intermediate and output textures. I think having this type of centralized settings and UI would be easier to understand and manage compared to various bake nodes, as well as being able to do automatic channel unpacking. It works best when everything is inside a Texture datablock, since that gives a clear context to pull together these various textures into a single UI.
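The automatic channel packing in the mockup (Roughness, Metallic, and AO sharing the `roughmetalao` target, one per R/G/B channel) and the `<CHANNEL>`/`<UDIM>` file path tokens can be sketched in a few lines of plain Python. This is my illustration of the idea, not Blender code; the function names are invented.

```python
# Sketch of the mockup's channel packing: three single-channel float maps
# are interleaved into one RGB image, and the <CHANNEL>/<UDIM> tokens in
# the File Path setting are expanded per packed image. Hypothetical code.

def expand_tokens(path: str, channel: str, udim: int = 1001) -> str:
    """Substitute the <CHANNEL> and <UDIM> tokens in a bake file path."""
    return path.replace("<CHANNEL>", channel).replace("<UDIM>", str(udim))

def pack_channels(channels: dict) -> list:
    """Interleave same-length float maps into RGB triples."""
    r, g, b = channels["R"], channels["G"], channels["B"]
    return [(r[i], g[i], b[i]) for i in range(len(r))]

# Two pixels of each map, matching the mockup's Pack column assignments.
roughness = [0.8, 0.5]   # -> R
metallic  = [0.0, 1.0]   # -> G
ao        = [1.0, 0.9]   # -> B

packed = pack_channels({"R": roughness, "G": metallic, "B": ao})
# packed -> [(0.8, 0.0, 1.0), (0.5, 1.0, 0.9)]

path = expand_tokens("//textures/mycharacter_<CHANNEL>_<UDIM>.png",
                     "roughmetalao")
# path -> "//textures/mycharacter_roughmetalao_1001.png"
```

Centralizing this in one UI means the unpack step on import (reading R as roughness, G as metallic, B as AO) can be derived from the same table instead of being wired up manually per material.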

15 Likes

Is this a response to my post or the posts before me?
Because your baking mockup can somewhat coexist within my proposal…

A loosely related question, since nodes are involved quite a bit in the workflow: the node editor can become quite laggy the bigger a scene gets, to the point where even moving or connecting nodes becomes a tedium (even if they’re not supposed to be evaluated).

Is that the type of known issue which realistically won’t be resolved anytime soon?

All things considered, I really like the proposal and am looking forward to using it in action.

5 Likes

Is that still a problem in 3.1? That sort of thing has improved a lot recently in many cases.

Yes, it’s still a problem. In a scene with about 50M faces (subdivided cubes for the sake of simplicity, 1.5M faces each) it looks like this:

nodeLag

This is on an i9-9900K with 64 GB of RAM and a 2080.

Since scenes of this size or bigger are very common in production, I can see this becoming a general issue for a lot of people as Blender goes more node-based.

7 Likes

It’s due to the undo system (item 4 in ⚓ T60695 Optimized per-datablock global undo). But that’s really off topic here.

4 Likes

It’s not a reply to any specific post, just clarifying things.

2 Likes

Well, kind of off topic. When coming up with a design that relies heavily on nodes, its usability is not irrelevant.

If ironing out these issues would take years, for example, I’d vote for alternative solutions in the UI that don’t require visual grouping or moving of nodes, something which currently takes a lot of time that otherwise wouldn’t be needed.

Just mentioning for consideration: in my workflow of having multiple objects in one texture map, I felt ‘Material Groups’ was the right term to use.