2022-02-16 Sculpt/Texture/Paint Module Meeting

This is mostly a consequence of varying levels of knowledge of whoever was making that comparison. They themselves stress that point in the video. They wanted to see what the experience would be for the average user out of the box. It’s quite clear that all of the renderers use different types of color management/tonemapping, which alone causes a lot of difference.

Blender, in its current state, is not indicative of the needs of all users. What I proposed would be an improvement for everyone, regardless of whether layers are ever introduced. Almost no one would fail to benefit from the three nodes I mentioned, whether they use them directly or indirectly as part of a more abstract smart-material nodegroup. But the main point I was trying to make was that a layer workflow alone would not add any new possibilities to PBR texturing in Blender; it would just make some of those possibilities more convenient for some users.

On the other hand, there are some things that would actually increase the number of possibilities. And the lack of those possibilities is currently one of the main reasons Blender is not widely used for PBR texturing.

That being said, rather than being vague, could you just post a practical example of some of your current needs? I don’t mean a bullet-point list of features, but an actual example of a workflow toward some practical result. The features you mentioned have almost nothing to do with a layer-based workflow, but rather with the lack of properly simple baking tools and of more texture-painting tools.


We don’t want to do a first iteration, have people build shader node setups based on it, and then soon after break compatibility. Doing a suboptimal implementation with only the Principled BSDF first may well be more work overall.

It’s not about creating node groups being difficult or not. But rather if that type of node group is standardized and understood as a concept by Blender, it means assets are standardized and we can implement mix nodes, painting tools, baking tools, exporters, scripts that work with them.

If all those work based on implicit assumptions about how the shader nodes are set up, it’s fragile.

Right, which is why the design we are working on is fundamentally node based.

Again what’s conceptually simple may not be so in terms of implementation. But a workflow for baking PBR textures is part of the design.


I don’t understand this, to be honest. AFAIK there’s only one type of shading node group in Blender. I am also confused about the plural “Mix nodes”: why would there be more than one? Assuming that we have Mix RGB and Mix Shader, what else is missing?

There may just be one node to mix layers, but it’s not the same as Mix RGB and Mix Shader nodes. In general:

  • It’s convenient to be able to mix all channels with a single node automatically in a useful way.
  • Mix Shader does not do this; it mixes the BSDF outputs. For example, when mixing two BSDFs with different roughness values you get two specular highlights of different sizes, whereas mixing the roughness parameters gives a single specular highlight of intermediate size.
  • BSDFs only support mix and add; none of the other blend modes make sense in a physically based renderer. Colors, however, can be combined with many blend modes.
  • Some channels like normals or vector displacement require different interpolation rules than colors to give meaningful results.
  • The Alpha channel can be taken into account and affect all other channels for masking.
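
A minimal sketch of what such per-channel mixing could look like, in plain Python rather than node code (the channel names, the normal-renormalization rule, and the alpha-masking behavior are all illustrative assumptions, not the actual design):

```python
def lerp(a, b, t):
    return a + (b - a) * t

def mix_layers(bottom, top, factor):
    """Mix two channel dicts the way a dedicated layer-mix node might:
    colors and scalars lerp, normals are lerped and renormalized (a
    stand-in for proper normal blending), and the top layer's alpha
    scales the mix factor for masking."""
    f = factor * top.get("alpha", 1.0)
    out = {}
    for key in bottom:
        a, b = bottom[key], top[key]
        if key == "normal":
            n = tuple(lerp(x, y, f) for x, y in zip(a, b))
            length = sum(c * c for c in n) ** 0.5
            out[key] = tuple(c / length for c in n)
        elif isinstance(a, tuple):
            out[key] = tuple(lerp(x, y, f) for x, y in zip(a, b))
        else:
            out[key] = lerp(a, b, f)
    return out

base = {"base_color": (0.8, 0.2, 0.2), "roughness": 0.9,
        "normal": (0.0, 0.0, 1.0), "alpha": 1.0}
coat = {"base_color": (0.1, 0.1, 0.1), "roughness": 0.1,
        "normal": (0.0, 1.0, 0.0), "alpha": 1.0}
mixed = mix_layers(base, coat, 0.5)
# roughness becomes a single intermediate value, which yields one
# intermediate-size highlight rather than two overlaid highlights.
```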

Yes, I (originally) assumed that’s exactly what Mix Shader node does.

Ah, yes. I did not take this into consideration. In that case, I agree that there should be a separate type of Mix node which combines the individual channels of the BSDF structure.

Yes. I’d expect that once we have BSDF Attribute Get and BSDF Attribute Set, you could, for example, create a “Multiply Color Mix” nodegroup: it takes a BSDF input and a Color (RGB) input, uses a BSDF Attribute Get node to retrieve the Color attribute, plugs that into a Mix RGB node (set to Multiply) as input A with the Color group input as input B, and plugs the result into a BSDF Attribute Set node to write it back as the new Color channel.
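
Since BSDF Attribute Get/Set are only proposed nodes, here is a hedged sketch of that nodegroup in plain Python, modeling a BSDF as a dict of channels (all names hypothetical, not real Blender API):

```python
# Hypothetical stand-ins for the proposed nodes.
def bsdf_attribute_get(bsdf, name):
    return bsdf[name]

def bsdf_attribute_set(bsdf, name, value):
    result = dict(bsdf)  # leave the input BSDF untouched
    result[name] = value
    return result

def multiply_color_mix(bsdf, color):
    """The 'Multiply Color Mix' nodegroup described above: get the
    Color channel, multiply it with the input color, write it back."""
    base = bsdf_attribute_get(bsdf, "color")
    multiplied = tuple(a * b for a, b in zip(base, color))
    return bsdf_attribute_set(bsdf, "color", multiplied)

shader = {"color": (0.8, 0.5, 0.2), "roughness": 0.4}
tinted = multiply_color_mix(shader, (0.5, 1.0, 1.0))
# Only the color channel changes; roughness passes through untouched.
```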

A more practical example of this would be a “Wet puddles” nodegroup. When you want to take your existing shader, regardless of how many mixed BSDFs it has down the node tree, and make it appear wet, you need to do multiple things:

  • Decrease the diffuse albedo in the wet areas
  • Remove or reduce the normal map detail in the wet areas
  • Replace or decrease the roughness in the wet areas
  • Do not affect any other channels

Right now, this is not possible to do conveniently in Blender.
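
The steps above can be sketched as plain channel operations (illustrative Python, not Blender code; the channel names, the darkening amount, and the wet roughness value are assumptions):

```python
def lerp(a, b, t):
    return a + (b - a) * t

def apply_wetness(channels, mask, wet_roughness=0.05):
    """Apply a wetness mask (0..1) to exactly three channels and
    leave every other channel alone."""
    out = dict(channels)
    # darken the diffuse albedo in wet areas
    out["base_color"] = tuple(lerp(c, c * 0.5, mask)
                              for c in channels["base_color"])
    # fade the normal map back toward flat (0, 0, 1)
    out["normal"] = tuple(lerp(n, flat, mask)
                          for n, flat in zip(channels["normal"],
                                             (0.0, 0.0, 1.0)))
    # drive roughness toward a glossy wet value
    out["roughness"] = lerp(channels["roughness"], wet_roughness, mask)
    return out

dry = {"base_color": (0.6, 0.5, 0.4), "roughness": 0.8,
       "normal": (0.1, 0.0, 0.995), "metallic": 0.0}
wet = apply_wetness(dry, mask=1.0)
# metallic (and any other untouched channel) is unchanged.
```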

Same answer as to the point above. I was just trying to make a point that these specialized Mixes should be just node groups users can make instead of some hardcoded arbitrary blending modes. Having Combine BSDF, Separate BSDF, BSDF Attribute Set and BSDF Attribute Get nodes is a good fundamental set for almost unlimited PBR texturing possibilities, in combination with what Blender already offers.

I’d expect the Alpha/Opacity to be evaluated at the very end, at the Material Output, and everything leading up to it would just be combining the grayscale values from the various opacity slots, such as Alpha slot of the Principled BSDF or Color value of the Transparent BSDF.

But I do agree I did not consider that right now, Cycles actually mixes the evaluated materials instead of mixing just the individual material inputs and then evaluating the result as a single material, as I’d expect from an average game engine. I understand that supporting both ways of evaluating materials is important, but also challenging. Ultimately, though, I believe it’s just challenging, not impossible to solve.

The design is now published here:

For feedback based on that document, this topic can be used:


The first attempt of the design was actually fully in shader nodes using BSDFs as layers, but we found it became conceptually messy to have all these nodes together. Additionally, effects like Blur and Filter require some form of baking to work at all, and can’t be efficiently implemented in renderers.


I quickly went over the design. Do I understand it correctly that the idea is to have another texture editor alongside the shader editor, where material properties can still be treated as convenient, self-contained structures in the form of layers, very similar to the Principled BSDF, but actually mixed and evaluated as sets of channels rather than as actual materials?

Initially that would be a bit worrisome, as it sounds similar to the attribute processor idea in Geometry Nodes, which was luckily averted; we now have the great Fields solution thanks to that. But thinking about it more, it could actually work well.

I am just curious: messy in what exact way? That the BSDFs were mixed as evaluated materials in some cases, while in others only their individual channels were mixed and evaluated as a material afterwards, all within the same node network? Or messy in other ways too, like the complexity of the node tree itself? If the latter, do you have an example/screenshot of that?


I want to clarify: are there any plans to incorporate Specular & Glossiness shading for texturing here? It is used extensively at my studio on our product line. I know Metal/Rough is the baseline standard for entry, but I would be more enthusiastic if Spec/Gloss shading were at least mentioned as a consideration in the design.

Also, what considerations are being made in the design for “Bake by Mesh Name”, as Substance Painter currently has? (This is a major deal-breaker and a waste of time if the whole format is redone with the old-school method of baking complex geometry, as no one wants to put in the extra hour to separate/explode their geo to bake. If geometry could be baked by matching name suffixes, head_l & head_h (or low & high, whichever is preferred) from each collection set, that would be a winner for people outside of Blender!)

Still reading the blog post…

Yes, that’s correct.

There would likely be a texture editor for use cases outside of shaders, but you’d also be able to edit texture nodes as a node group inside the shader editor.

If you wanted to blur a layer, you’d add a Blur node after the BSDF. Even that is not obvious: does it blur the parameters or the BSDF output? In addition, you now need to bake to a texture for the blur to work at all, so you’d probably need some kind of Bake BSDF node, the meaning of which is also unclear. Then imagine you also have a Light Path node mixing shaders to optimize renders.

How to put all those nodes in the right order and what they actually do seems hard to understand.

And then of course, the system we are proposing would be usable outside of shader nodes too. If it’s based around BSDFs that would not be possible.


There is no specific plan for this at the moment, though of course we are just now gathering feedback. In any case, the system is not tied to the Principled BSDF and you could create and layer textures for any BSDF.

There are some advantages to having most of the Blender community agree on one convention, so that assets can be shared more easily. For that reason I wonder at what level it would be most useful to potentially have Specular & Glossiness channels, in the texture datablock or more as a conversion for baking and export.

The specific baking settings have not been worked out, but there should be some convenient way to associate low and high poly geometry.


The first part of this statement screams at me exactly why Blender has many of its current issues and is now in the phase of revisiting them for the better. I think considering what would best attract the entire 3D community, rather than just what the Blender community wants, is a better way to approach the design. Ultimately, all the design decisions the Blender team makes will orbit around attracting all 3D users; otherwise, why make any new additions and revamp everything from pre-2.8 to now, right?

Spec/Gloss is still heavily used in the industry; the Call of Duty product line uses it, for example. I would personally be very disappointed to see Blender’s texture development go in a direction that says “Yeah… well, it’s not what the community wants, and it’s too much extra work.” Eventually that work will come knocking on someone’s doorstep anyway, so why not do it now?

I’ll leave any further feedback in the newly dedicated thread.

Thanks for the answer. I will re-read the design once more and if I have more to say I’ll post it in that dedicated thread.

@Doowah, what’s interesting for this design discussion is how you think both workflows should best be accommodated: distinct assets for each workflow, or a mechanism for automatic conversion between the different types of assets, and how that would work.

If you use a different convention than what the majority of the Blender community is likely to use, it’s interesting to think about how you can still take advantage of those community assets.


There are already several addons that seem to have solved this problem. Please don’t forget to consult those devs for their experience.

What’s the issue with a baking or conversion step, either in a node or in the exporter? The two workflows are entirely compatible; the only differences are in the Albedo channel (where metallic surfaces are black) and an inversion of Roughness into Smoothness, right? I’ve worked with both, but recently I work almost entirely in the Metallic workflow, as its downside (a seam between metallic and non-metallic pixels) is of no consequence to me. A conversion step would seem to fix that issue too, no?

To me your response reads a little dramatic and pessimistic, when Brecht seems open to suggestions.


Maps for the Specular workflow could easily be generated from Metallic textures with some simple channel shuffling, one invert, and a couple of multiplies. So it could be just one node that converts channels from Metallic to Specular. I even think it could be made as a nodegroup by yourself or by somebody in the community.
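
That conversion really is just a few operations per pixel. A hedged sketch in plain Python, assuming the common 0.04 dielectric F0 convention (an illustration of the idea, not an authoritative implementation):

```python
DIELECTRIC_F0 = 0.04  # common default reflectance for non-metals

def lerp(a, b, t):
    return a + (b - a) * t

def metal_rough_to_spec_gloss(base_color, metallic, roughness):
    """Derive Diffuse/Specular/Glossiness from Metallic/Roughness."""
    # diffuse goes black where the surface is metallic
    diffuse = tuple(c * (1.0 - metallic) for c in base_color)
    # specular picks up the base color for metals, F0 for dielectrics
    specular = tuple(lerp(DIELECTRIC_F0, c, metallic) for c in base_color)
    # glossiness is the single invert
    glossiness = 1.0 - roughness
    return diffuse, specular, glossiness

# A gold-ish metal: diffuse collapses to black, specular takes the color.
d, s, g = metal_rough_to_spec_gloss((1.0, 0.77, 0.34),
                                    metallic=1.0, roughness=0.3)
```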


The issue I would state is this: consider a studio with a library of materials built only with the Spec/Gloss method. As I said in my initial post, keeping the design open to Spec/Gloss, without asking the user to do extra conversion work, would be appreciated, rather than pushing users to spend unproductive time on that work. My feedback is simple: I see you’ve considered one approach to texturing production; can the other be considered as well? It would be appreciated!

Asking for the road-less-traveled approach while the concepts are still being discussed is in fact the best time to do so. I can’t speak for everybody in this community, but I am confident artists would appreciate doing less mental gymnastics just to get to the finish line on their work, avoiding the need to manually convert and translate their existing library of work.

I do not know you personally. Please take a break from the computer if you are reading too much into the wording; if this were a video/audio conference, these words would not have been taken that way. I’m deliberately putting emphasis on the things I’d like Brecht to consider. After all, he is asking for our input on the subject, right?

I hope that the displacement baking between arbitrary objects will be part of this design module.

Oh absolutely, but given that Node Wrangler already automatically sets up connections based on file names, I would assume these steps could be automated quite easily. I do see value in selecting the most prevalent process internally and automatically converting based on layer names, don’t you?

Glad to hear it wasn’t the intent, but text is always prone to this, and yours sounds quite forceful and demanding. Brecht says “there are some advantages to x” and “I wonder”, both leaving a lot of wiggle room for discussion. You later suggest the statement could be simplified to “yeah… well, it’s not what the community wants”, which gives little credit to how open-ended it really was.

Your reply opens with “why Blender has many of its current issues” and entirely ignores the question: would a conversion (automatic or otherwise; it hasn’t been stated) be a functional alternative to spending development time on what’s currently the less popular option?

I don’t care to go into this much more; I just hope to clarify why you come across to me the way you do, which makes the discussion harder. I’m in favour of a spec/gloss workflow being supported; I just also wonder if there would be downsides to an automatic conversion step.
