2022-02-16 Sculpt/Texture/Paint Module Meeting

I have to disagree with this, because the Principled Shader’s purpose is to be directable by the artist, not physical realism.

Cases in point: the diffuse roughness parameter is a hack rather than physically correct Oren-Nayar shading; some glossy materials are not energy conserving; and it is not possible to do things like bloom metal, layered glossy paints, advanced fabrics, or materials that require a high degree of scattering control, etc.

In general, your posts so far are coming across as the infamous “I do not need it for my workflow, therefore no one should have it” argument. If the atomic shading components are removed, then not only will many people need to completely redo their materials, but the so-called ‘Cycles’ look will become mandatory, and as a result Cycles will no longer be competitive with commercial engines.

1 Like
  • There’s no “Cycles” look. Cycles is a physically based path tracer like many others out there. Given identical lighting, shading and color management setups, Cycles produces results nearly identical to other physically based path tracers.
  • Pretty much all of the most popular PBR texturing software packages out there don’t have any sort of unique atomic BSDF shaders, yet they are still a much more popular option for PBR texturing than Blender is.
  • No one is saying they should be removed. All that’s being said is that they don’t need to be considered in the first iteration of the 3 new nodes mentioned above, which would tremendously increase shading flexibility, mainly because this shading flexibility is almost always utilized to author output for Principled-like shaders.
2 Likes

Some points notably absent from a lot of ‘everything nodes, layers sucks’ talks, presented here not so much to argue as to make sure they are being considered:
-One thing that’s usually absent in these cases is a fast, destructive merge without setting up baking (which is cumbersome). If this were done behind the scenes, no problem. Why destructive? Because some tools will work differently on one layer vs. multiple. Think of Blur or Smudge.
Merging two layers, merging all visible…
-Copying from one layer to another
-Tools that can work across layers: a smudge tool as in Photoshop that works only on the layers below, and then copies the resulting smudged pixels into the current layer. Something I haven’t yet found in node-based systems.
-Tools that traditionally aren’t available or well thought out in 3D (except for 3DCoat): selecting, transforms, cropping, etc.
-Texel-dependent tools, like 3DCoat’s pixel brush that snaps to the texel in 3D space. Possible, but harder when the way the image is mapped onto the model isn’t 100% reliant on UVs, but could have mapping nodes that alter it.

For what it’s worth, I think @brecht’s idea of containers with layers is a sound one, and could work really well, as long as these have a similar design across Sculpting, Texturing and Vertex/Attribute Painting!

@LudvikKoutny I’ve seen you make these points before, and I’ll tell you: your assets aren’t indicative of the needs of all.

1 Like

I find this interesting from an observational standpoint.

A comparison of Cycles to other renderers, regarding “nearly identical to other physically based path tracers out there”:

By BlenderGuru
from https://youtu.be/myg-VbapLno?t=569

I’m linking it here for anyone who has not seen this but is interested in how they compare looks-wise; I find it quite interesting.

1 Like

This is mostly a consequence of the varying levels of knowledge of whoever was making that comparison. They themselves stress that point in the video; they wanted to see what the experience would be for the average user out of the box. It’s quite clear that all of the renderers use different types of color management/tonemapping, which alone causes a lot of the difference.

Blender, in its current state, is not indicative of the needs of all. What I proposed would be an improvement for everyone, regardless of whether layers are ever introduced or not. There would be almost no one who would not benefit from the 3 nodes I mentioned, be it people who use them directly, or people who use them indirectly in the form of a more abstract smart-material node group. But the main point I was trying to make was that the layer workflow alone would not add any new possibilities to PBR texturing in Blender. It would just make some of those possibilities more convenient for some users.

On the other hand, there are some things that would actually increase the amount of possibilities. And the lack of these possibilities is currently one of the main reasons Blender is not widely used for PBR texturing.

That being said, rather than being vague, could you just post a practical example of some of your current needs? I don’t mean a bullet-point list of features, but an actual example of a workflow toward some practical result. The features you mentioned have almost nothing to do with a layer-based workflow, but rather with the lack of properly simple baking tools and of more texture painting tools.

1 Like

We don’t want to do a first iteration, have people build shader node setups based on it, and then soon after break compatibility. Doing a suboptimal implementation with only the Principled BSDF first may well be more work overall.

It’s not about whether creating node groups is difficult. Rather, if that type of node group is standardized and understood as a concept by Blender, it means assets are standardized and we can implement mix nodes, painting tools, baking tools, exporters, and scripts that work with them.

If all those work based on implicit assumptions about how the shader nodes are set up, it’s fragile.

Right, which is why the design we are working on is fundamentally node based.

Again, what’s conceptually simple may not be so in terms of implementation. But a workflow for baking PBR textures is part of the design.

6 Likes

I don’t understand this, to be honest. AFAIK there’s only one type of shading node group in Blender. I am also confused by “Mix Nodes” in the plural. Why would there be more than one? Assuming that we have RGB Mix and Shader Mix, what else is missing?

There may just be one node to mix layers, but it’s not the same as Mix RGB and Mix Shader nodes. In general:

  • It’s convenient to be able to mix all channels with a single node automatically in a useful way.
  • Mix Shader does not do this; it mixes the BSDF output. For example, when mixing two different BSDFs with different roughness you would get two specular highlights of different sizes, whereas if you mix the roughness parameters you get a single specular highlight of intermediate size (see the numeric sketch below).
  • BSDFs only support mix and add; in a physically based renderer the other blend modes don’t make sense. Colors, however, can be combined with many blend modes.
  • Some channels like normals or vector displacement require different interpolation rules than colors to give meaningful results.
  • The Alpha channel can be taken into account and affect all other channels for masking.
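
To illustrate the roughness point, here is a toy numeric sketch. It is not Cycles code; the Gaussian lobe is just a stand-in for a real microfacet BSDF, but it shows why averaging the evaluated outputs keeps both highlights visible while averaging the parameter yields a single intermediate one:

```python
import math

def highlight(angle, roughness):
    # Toy specular lobe: a Gaussian over the half-angle, standing in
    # for a real microfacet BSDF. Highlight width grows with roughness.
    return math.exp(-(angle / roughness) ** 2)

rough_a, rough_b = 0.05, 0.4

for angle in (0.0, 0.05, 0.1, 0.2, 0.4):
    # Mix Shader behaviour: average the two evaluated BSDF outputs.
    # Both the sharp and the wide highlight remain visible.
    mixed_outputs = 0.5 * (highlight(angle, rough_a) + highlight(angle, rough_b))
    # Channel mixing: average the roughness parameters, evaluate once.
    # This gives a single highlight of intermediate width.
    mixed_params = highlight(angle, 0.5 * (rough_a + rough_b))
    print(f"{angle:4.2f}  outputs: {mixed_outputs:.3f}  params: {mixed_params:.3f}")
```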
3 Likes

Yes, I (originally) assumed that’s exactly what Mix Shader node does.

Ah, yes. I did not take this into consideration. In that case, I agree that there should be a separate type of Mix node which just combines the individual channels of the BSDF structure.

Yes. I’d expect that once we have BSDF Attribute Get and BSDF Attribute Set, you could for example create a “Multiply Color Mix” node group that takes a BSDF input and a Color (RGB) input. It would use a BSDF Attribute Get node to retrieve the Color attribute, plug it into an RGB Mix node (set to Multiply) as input A with the group’s Color input as input B, and feed the result into a BSDF Attribute Set node to write it back as the new Color channel.
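
As a minimal sketch of that pattern: the proposed BSDF Attribute Get/Set nodes are hypothetical and do not exist in Blender today, so the following just models the idea in plain Python, with a dict of channels standing in for the BSDF structure:

```python
def multiply_color_mix(bsdf, color, fac=1.0):
    # Hypothetical "Multiply Color Mix" node group, modeled as a function.
    base = bsdf["color"]                                 # BSDF Attribute Get
    multiplied = tuple(b * c for b, c in zip(base, color))
    # RGB Mix set to Multiply: blend between base and base*color by fac.
    mixed = tuple(b + fac * (m - b) for b, m in zip(base, multiplied))
    out = dict(bsdf)
    out["color"] = mixed                                 # BSDF Attribute Set
    return out

shader = {"color": (0.8, 0.6, 0.4), "roughness": 0.5, "metallic": 0.0}
print(multiply_color_mix(shader, (0.5, 0.5, 1.0)))
# Only the color channel changes; roughness and metallic pass through.
```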

A more practical example of this would be a “Wet puddles” node group. When you want to take your existing shader, regardless of how many mixed BSDFs it has down the node tree, and make it appear wet, you need to do multiple things:

  • Decrease the diffuse albedo on the wet areas
  • Remove or reduce the normal map detail on the wet areas
  • Replace or decrease roughness on the wet areas
  • Do not affect any other channels
Right now, this is not possible to do in Blender conveniently. (A rough sketch of the channel math is below.)
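
A toy sketch of those channel adjustments, using the same hypothetical dict-as-BSDF convention as above, where `wet` is the grayscale puddle mask value at the shaded point (0..1):

```python
def lerp(a, b, t):
    return a + (b - a) * t

def wet_puddles(bsdf, wet):
    out = dict(bsdf)
    # Decrease diffuse albedo on wet areas (the 0.5 darkening is arbitrary).
    out["color"] = tuple(lerp(c, c * 0.5, wet) for c in bsdf["color"])
    # Fade normal map detail back toward the flat normal. A real
    # implementation would need proper normal interpolation, as noted
    # earlier, rather than this naive lerp.
    flat = (0.0, 0.0, 1.0)
    out["normal"] = tuple(lerp(n, f, wet) for n, f in zip(bsdf["normal"], flat))
    # Replace roughness with a near-zero "water" roughness.
    out["roughness"] = lerp(bsdf["roughness"], 0.02, wet)
    # All other channels (metallic, etc.) pass through untouched.
    return out

shader = {"color": (0.6, 0.5, 0.4), "normal": (0.1, 0.0, 0.995), "roughness": 0.7}
print(wet_puddles(shader, wet=1.0))
```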

Same answer as for the point above. I was just trying to make the point that these specialized mixes should just be node groups users can make, instead of hardcoded arbitrary blending modes. Having Combine BSDF, Separate BSDF, BSDF Attribute Set and BSDF Attribute Get nodes is a good fundamental set for almost unlimited PBR texturing possibilities, in combination with what Blender already offers.

I’d expect the Alpha/Opacity to be evaluated at the very end, at the Material Output, and everything leading up to it would just be combining the grayscale values from the various opacity slots, such as the Alpha slot of the Principled BSDF or the Color value of the Transparent BSDF.

But I do agree I did not consider that, right now, Cycles actually mixes the evaluated materials, instead of mixing just the individual material inputs and then evaluating the result as a single material, as I’d expect in an average game engine. And I do understand that supporting both ways of evaluating materials is important, but also challenging. Ultimately, though, I believe it’s just challenging, not impossible to solve.

The design is now published here:

For feedback based on that document, this topic can be used:

9 Likes

The first attempt at the design was actually fully in shader nodes, using BSDFs as layers, but we found it became conceptually messy to have all these nodes together. Additionally, effects like Blur and Filter require some form of baking to work at all, and can’t be efficiently implemented in renderers.

1 Like

I quickly went over the design. Do I understand it correctly that the idea is to have another texture editor alongside the shader editor, where it’s still possible to treat material properties as convenient, self-contained structures in the form of layers, which are very similar to the Principled BSDF, but are actually mixed and evaluated as sets of channels, not as actual materials?

Initially that sounded a bit worrisome, as it is similar to the attribute-processor idea in Geometry Nodes, which was luckily averted; we now have the great Fields solution thanks to that. But thinking about it more, it could actually work well.

I am just curious: messy in what exact way? That the BSDFs were mixed as evaluated materials in some cases, while in other cases only their individual channels were mixed and then evaluated as a material afterwards, all within the same node network? Or messy in other ways too, like the node-tree complexity itself? If the latter is the case, do you have an example/screenshot of that?

1 Like

I want to clarify: are there any plans to incorporate Specular & Glossiness shading for texturing here? It is used extensively at my studio on our product line. I know Metal/Rough is the baseline standard for entry, but I would be more enthusiastic if Spec/Gloss shading for the BSDF were at least mentioned as a consideration.

Also, what considerations are being made in the design for “Bake by Mesh Name”, as Substance Painter currently has it? (This is a major deal breaker and a waste of time if all that’s being done is to redo the whole format with the old-school method of baking complex geometry; no one wants to put in the extra hour to separate/explode their geometry to bake. If geometry could be baked by matching a name suffix, head_l & head_h (or low & high, whichever is preferred) from each collection set, that would be a winner for people outside of Blender! A sketch of that kind of suffix matching follows below.)
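
For clarity, a small illustrative sketch of the suffix matching described above; the object names are hypothetical and this is plain Python, not tied to any actual Blender baking API:

```python
# Pair low/high poly meshes by a shared base name plus _l/_h
# (or _low/_high) suffixes, as a "bake by mesh name" step might.
LOW_SUFFIXES = ("_l", "_low")
HIGH_SUFFIXES = ("_h", "_high")

def strip_suffix(name, suffixes):
    for s in suffixes:
        if name.endswith(s):
            return name[: -len(s)]
    return None

def pair_bake_meshes(names):
    lows = {strip_suffix(n, LOW_SUFFIXES): n for n in names if strip_suffix(n, LOW_SUFFIXES)}
    highs = {strip_suffix(n, HIGH_SUFFIXES): n for n in names if strip_suffix(n, HIGH_SUFFIXES)}
    # Only bases present in both sets form a valid bake pair.
    return {base: (lows[base], highs[base]) for base in lows if base in highs}

print(pair_bake_meshes(["head_l", "head_h", "torso_low", "torso_high", "prop_l"]))
# {'head': ('head_l', 'head_h'), 'torso': ('torso_low', 'torso_high')}
```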

Still reading the blog post…

Yes, that’s correct.

There would likely be a texture editor for use cases outside of shaders, but you’d also be able to edit texture nodes as a node group inside the shader editor.

If you wanted to blur a layer, you’d add a Blur node after the BSDF. I think it’s already not obvious what this does. Does it blur the parameters or the BSDF output? In addition, you now need to bake to a texture for blur to work, so probably you need some kind of Bake BSDF node, the meaning of which is also unclear. Then imagine you also have a Light Path node mixing shaders to optimize renders.

How to put all those nodes in the right order and what they actually do seems hard to understand.

And then of course, the system we are proposing would be usable outside of shader nodes too. If it’s based around BSDFs that would not be possible.

2 Likes

There is no specific plan for this at the moment, though of course we are just now gathering feedback. In any case, the system is not tied to the Principled BSDF and you could create and layer textures for any BSDF.

There are some advantages to having most of the Blender community agree on one convention, so that assets can be shared more easily. For that reason I wonder at what level it would be most useful to potentially have Specular & Glossiness channels: in the texture datablock, or more as a conversion for baking and export.

The specific baking settings have not been worked out, but there should be some convenient way to associate low and high poly geometry.

1 Like

The first part of this statement highlights for me why Blender has many of its current issues and is now in the phase of revisiting them for the better. I think considering what would best attract the entire 3D community is a better way to approach design than just what the Blender community wants. Ultimately, all design decisions the Blender team makes will orbit around attracting all 3D users; otherwise, why make any new additions and revamp everything from pre-2.8 to here, right?

Spec/Gloss is still heavily used in the industry; the Call of Duty product line uses it. I would personally be very disappointed to see Blender’s texture development go in a direction that says “Yeah… well, it’s not what the community wants, and it’s too much extra work”. Eventually that work will come knocking on someone’s doorstep as a task to be worked out. So why not do it now?

I’ll leave any further feedback in the newly dedicated thread.

Thanks for the answer. I will re-read the design once more and if I have more to say I’ll post it in that dedicated thread.

@Doowah what’s interesting for this design discussion is how you think both workflows would best be accommodated: distinct assets for each workflow, or a mechanism for automatic conversion between different types of assets, and how that would work.

If you use a different convention than what the majority of the Blender community is likely to use, it’s interesting to think about how you can still take advantage of those community assets.

2 Likes

There are already several addons that seem to have solved this problem. Please don’t forget to consult those devs for their experience.

What’s the issue with a baking or conversion step, either in a node or in the exporter? Both workflows are entirely compatible; the only differences are in the Albedo channel (where metallic surfaces are black) and an inversion of Roughness to Smoothness, right? I’ve worked with both, but recently I work almost entirely in the Metallic workflow, as the downside (in the Metalness workflow there’s a seam between metallic and non-metallic pixels) is of no consequence to me. But a conversion step would seem to fix that issue too, no?
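
For reference, a minimal per-pixel sketch of that conversion under common conventions; the ~4% dielectric reflectance constant is an assumption, and exact values vary between engines:

```python
def lerp(a, b, t):
    return a + (b - a) * t

def metal_rough_to_spec_gloss(base_color, metallic, roughness):
    # Metals get a black diffuse; dielectrics keep their base color.
    diffuse = tuple(c * (1.0 - metallic) for c in base_color)
    # Specular is ~4% F0 for dielectrics, the base color for metals.
    specular = tuple(lerp(0.04, c, metallic) for c in base_color)
    # Glossiness is a simple inversion of roughness.
    glossiness = 1.0 - roughness
    return diffuse, specular, glossiness

print(metal_rough_to_spec_gloss((0.9, 0.6, 0.2), metallic=1.0, roughness=0.3))
# ((0.0, 0.0, 0.0), (0.9, 0.6, 0.2), 0.7)
```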
To me your response reads a little dramatic and pessimistic, when Brecht seems open to suggestions.

1 Like