2022-02-16 Sculpt/Texture/Paint Module Meeting

Participants: Brecht van Lommel, Daniel Bystedt, Gareth Jensen, Jeroen Bakker, Joe Eagar, Julien Kaspar, Pablo Vazquez, Pablo Dobarro.

This year there were several announcements related to the Sculpt/Texture/Paint module. One of them was a strategic focus on texturing in 2022; the other was the upcoming Heist open movie.

The scope of the texturing target is still WIP and will be fleshed out in the upcoming weeks. One main goal would be to improve the performance of 3D texture painting (currently projection painting). The idea is to replace the current projection painting with a new system based on the PBVH and the Sculpt brush (the 3D texturing brush).

As this is tightly related to the Sculpt/Texture/Paint module, and module members would also participate in the project, we had a kick-off meeting for the 3D texturing brush.


  1. Primary tools (must haves for Heist project)
    • We should aim for 16K textures and 1M-polygon models, and find out how close we can get and where the limitations will be.
    • Draw Brush
    • Projection tool (with backface occlusion)
    • Fill tool (fill full texture with a single color)
    • Blur tool
    • Delays might be acceptable when switching brushes/textures, but not when changing brush settings (e.g. brush size), and not at the end of a stroke. (This is important for developers to know what could be precalculated.)
  2. Secondary tools
    • Clone tool
    • Smear tool
  3. Masking tools
    • Auto masking
    • Manual masking
    • Create a mask by color
  4. Sculpt mode brush settings
    • Radius unit
    • Hardness
    • Flow
  5. Filters
    • Filters are tools that don’t require a brush; they operate directly on the texture.

There are several topics that still need discussion and prioritization

  • Surface occlusion
    • First “front faces only”
    • Based on actual feedback we can decide if more precise occlusion methods are actually needed (e.g. raycasting from vertices or from each fragment).
  • Sample tool - S shortcut acceptable for now, but in long term needs tool with settings.
  • Erase brush/mode
  • Face sets
  • Roll stroke
  • Channel selector for painting game-engine-style packed textures. Currently unclear whether there could be better workflows when integrated with layered textures.
  • Lattice for projection tool
  • Redesign the sculpt and paint modes.
    • Modes organize features that have a common purpose, regardless of the target data type.
  • Store masks for reuse (face sets workflow? ID maps? Saved masks?)
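On the occlusion point, the “front faces only” option above is essentially a per-face dot-product test between the face normal and the view direction. A minimal sketch in plain Python (illustrative names, not the Blender API):

```python
# Minimal "front faces only" occlusion test. A face is paintable when its
# normal points toward the viewer, i.e. dot(normal, view_dir) < 0 when
# view_dir goes from the camera into the scene. Illustrative names only;
# this is not the Blender API.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def is_front_facing(face_normal, view_dir):
    """view_dir points from the camera toward the surface."""
    return dot(face_normal, view_dir) < 0.0

print(is_front_facing((0.0, 0.0, 1.0), (0.0, 0.0, -1.0)))   # → True
print(is_front_facing((0.0, 0.0, -1.0), (0.0, 0.0, -1.0)))  # → False
```

More precise methods (per-vertex or per-fragment raycasts) refine the same decision at a finer granularity, at a higher cost.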

Is a layered textured system being considered (I hope it is), like what is done in Substance Painter and Marmoset?


Sounds extremely awesome, I really like that you guys aim at high-resolution texturing and also think about channel selection for game dev stuff.
What seems missing is the layers system. To be fair, I didn’t see a single addon where it would be usable and not look out of place, but maybe if we think about it as an outliner mode or something, then it could be done? Either way, the texture workflow should be non-destructive where possible.


That or it’s his evil twin brother from the alt universe :rofl:

In Masking, will we see a Cavity Mask interaction, like a custom overlay for the viewport, so that we can ‘dial in’ the masking effect? With the higher ceiling for textures, will we see the failings of the stencil projection addressed, to ensure high-resolution image interaction and results on the target canvas? I’d really like to see some of the suggestions at rightclickselect.com integrated into the improvements where they overlap these targets, especially the ability to toggle between scene space and target space for brush size, similar to sculpt mode.


I agree about the layer system; I do not know why the Blender devs keep wanting to dance around the idea of actually implementing one (a reluctance which, as far as I know, is exclusive to Blender). This has long been the case for sculpt layers and animation layers as well.

The developer behind Material Maker managed to put together a layer-based painting system in a few months in his spare time, surely a well-funded team like what Blender has can put something together.

Also, I would keep things like UDIM in mind as well. Making 16K textures possible would be great as a way to push performance, but being able to add as many UV tiles as possible without bugs (for textures closer to 4K in size) would be very useful for AAA-quality imagery as well.


I don’t understand the need for a layer system in the context of Blender. What Shader Nodes offers is significantly better than what any layer system ever could, in terms of Substance Painter-like PBR texture painting.

The main problem is poor brush settings management, poor performance and extreme instability when using texture painting together with complex shading networks.

Aside from that, the only thing that prevents SP-like PBR texturing in Blender is the lack of the following three nodes:

  • Separate BSDF
  • Get BSDF Attribute
  • Set BSDF Attribute

So that you can, for example, have a node network and overlay only the normal channel, or mix only the roughness, at the very end of the network.
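As a rough illustration of what those nodes would enable, here is a plain-Python sketch with a dictionary standing in for the BSDF “struct”; the three nodes don’t exist in Blender, so every name below is hypothetical:

```python
# Sketch of the proposed Get/Set BSDF Attribute behaviour, with a dict
# standing in for the PBR "struct". Hypothetical names, not Blender API.

def get_attr(bsdf, name):
    return bsdf[name]

def set_attr(bsdf, name, value):
    out = dict(bsdf)  # nodes behave functionally: return a new struct
    out[name] = value
    return out

def mix(a, b, fac):
    return a * (1.0 - fac) + b * fac

metal = {"base_color": 0.8, "roughness": 0.4, "normal": 0.0}
rust = {"base_color": 0.3, "roughness": 0.9, "normal": 1.0}

# Mix *only* the roughness channel at the end of the network,
# leaving base color and normal untouched:
result = set_attr(metal, "roughness",
                  mix(get_attr(metal, "roughness"),
                      get_attr(rust, "roughness"), 0.5))
print(result)  # roughness ≈ 0.65, other channels unchanged
```

Today only whole BSDF structs can be mixed; the point of the sketch is that per-channel access is the missing primitive.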


We are also considering a layered texturing system. The scope of this system is more related to the rendering module and we are waiting for some resources to be freed before we will start this project. More information about this will be shared soon.

On a weekly basis I check RCS proposals, and we get inspiration from them as well. People often don’t see how RCS proposals influence the designs that get implemented.

Although you can do a lot of things using the material editor, it takes experience and time to set up. More artistic workflows should be researched. Next to that, we should make it easier to mix materials and make sure it works for many use cases.


It would be very unfortunate if there were two separate ways of PBR texture painting in Blender, each missing some of the crucial features of the other.


Let me elaborate:

The node-based workflows offer the same means of encapsulation and abstraction that mainstream PBR painting packages use to create concepts like smart materials, smart masks and so on. Unlike Blender, though, they are much more limited in how flexible the interaction between these concepts is.

The mainstream PBR painting applications also take experience and time to set up. Just as much experience, and just as much time, because of how similar the two concepts are.

Here’s a very simple example of proper encapsulation: two simple PBR materials mixed with a painted mask, and a smart-masked grunge layer on top. There is no reason it should not be as simple as this for new users.

Especially now, that we have Asset Browser, so experienced users can very easily make abstract, purpose specific smart materials and smart mask nodegroups for basic users, exactly in the same manner it works in mainstream PBR painting packages, which come with pre-made library of smart materials, masks and filters, which beginner users tend to utilize.

Since Blender has adopted the “everything nodes” mantra, it sounds absolutely crazy to do a 180° and move from node-based shading and texturing to a parallel, layer-based system, when at the same time we are in the process of migrating the modeling system from layer-based to node-based because of how limited the layer-based workflow was.

All it needs is literally 3 new nodes (which are also extremely useful outside of the scope of just PBR texture painting) and improvements to performance so that changes to shader node trees with Eevee enabled do not freeze the entire UI.

What’s really frustrating here is that the current node-based system is 95% there. It’s 95% sufficient to provide the same workflow speed, efficiency and flexibility as the mainstream PBR texture painting tools. But people still use it for that only in a very limited capacity, because those last 5% are missing. And instead of there being talk about finishing those last 5%, we are talking about keeping it insufficient and creating a whole new parallel system from scratch, which will most likely be insufficient in other ways :frowning:


We already have an easy layering system for materials with Principled shader nodes.

What they are talking about is the ability to store multiple masks in multiple channels of the same image.
One mask in R, another one in G, another one in B, another one in A.
As many as wanted, as in PSD, ORA or EXR layers.


How is that not possible with the current system? You can simply paint RGB channels and use a Separate RGB node.

It is not impossible, but it is not easy to set up.
You have to be careful with your brush settings, put a complicated node setup in place, and tweak the display.
It is not as easy as pressing the + button in the Texture Slots panel, entering the name/dimensions of the image, and having everything set up.
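For what it’s worth, the bookkeeping behind that setup is small: packing grayscale masks into one RGBA pixel and splitting them back out (what a Separate RGB node does per pixel) can be sketched in plain Python. The mask names are made up for illustration:

```python
# Packing several grayscale masks into one RGBA pixel and splitting them
# back out, per pixel. A real implementation would operate on whole
# image buffers; the mask names are made up for illustration.

def pack_masks(r, g, b, a=0.0):
    return (r, g, b, a)

def separate_rgba(pixel):
    return {"R": pixel[0], "G": pixel[1], "B": pixel[2], "A": pixel[3]}

paint_mask, metal_mask, accent_mask = 1.0, 0.25, 0.0
pixel = pack_masks(paint_mask, metal_mask, accent_mask)
channels = separate_rgba(pixel)
print(channels["G"])  # → 0.25, the metal mask recovered from G
```

The friction being discussed is not this logic, but the brush settings and node wiring needed to paint into a single channel without disturbing the others.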


Layer-based systems are not really a 180. If we take into account that the whole artist community has more or less interacted with and used the layer system in Photoshop and every other photo editing software, we can come to the conclusion that the system is widely used and adopted.

Understanding the concept and workflow is easier for new people to pick up, as it is quite simple compared to nodes.
If we take a look at, let’s say, Substance Painter, we can see what people are capable of creating with the layer-based system: some incredible work, and it’s accessible to new users.
Now, if we take a look at the node-based approach, I can tell you immediately that a new user will be confused out of their mind as to how it might work, especially if they have never used nodes, even if they have used layers.

Nodes are overall a good idea, BUT every system has drawbacks, and trying to squeeze brush texturing into a node setup just because Blender goes “everything nodes” with a lot of things does not mean it’s the best approach for THAT particular job.
Nodes take more time to set up, and as you add more things they get more complicated, to the point where the whole node tree is just confusing.
While layers MAY be limited in certain fields, the layer-based STACK approach has shown it is capable of achieving stunning results, and all it takes for a new user to understand it is to start stacking layers, which are quite simple as a concept, i.e. easy to learn.

There IS a reason why Substance Painter is a layer-based system while Substance Designer is a node-based one: different tools for different jobs.

Now for my personal experience: having multiple textures/materials and multiple masks on an object in Blender gets super confusing super fast, and within 15 minutes of node spaghetti I just get frustrated; meanwhile in Substance Painter, I just keep stacking textures and painting masks, and after 15 minutes I am able to make something with way fewer frustrations. :upside_down_face:


This is exactly why we are researching how to incorporate both, where a layer-based UI would influence a node-based system in a predictable and maintainable way.


I see what you mean now :blush:.

A layer-based system that’s a node system underneath: friendly for artists and easier to maintain for devs.

To me, it looks similar to the modifier stack, where a modifier is a “texture/layer”, and opening the modifier would take you to the node view, showing you the “guts” of said layer.

If properly implemented, I’m sure this would yield stunning results :smiley:

This is just full of misconceptions.

Node trees only get confusing when no encapsulation is used. It’s the same as not using folders or smart materials in SP and having one stack of dozens of layers with several filters and mask filters on each.

I actually ditched SP partly because the layer-based system often got too messy too fast. It’s easy to make something complicated, but then it’s also often the case that you spend a lot of time finding exactly what you need to tweak to get rid of that one particular smudge you don’t like. Unless, of course, you invest time into making those layers clean, properly encapsulated and properly named. But you can, and should, do the same with nodes.

I can speak from first-hand experience, since I am making PBR texture sets for game engine assets, and I replaced SP with Blender to do that. Here’s an example of the workflow:

I am making a set of units for an RTS game (RTS Prototype - Work in Progress - Unreal Engine Forums):

The shading is a typical complex material comprised of several different material types, masked with various edge wear, grunge and concave area dirt accumulation, combined with hand painted areas.

Thanks to Shader Nodes’ ability to encapsulate, I was able to achieve a workflow as clean, if not cleaner, than what I could with SP. The top-level material, assigned to the tank above, looks like this:

It’s just one material, an abstraction of the “look” of all the vehicles for the given game faction. The top texture is just a Convex/Concave edges map, masking out convex and concave areas for procedural effects, encoded in R and G channels:

It’s automatically baked with a single click script I’ve made.

The next texture is an RGB-packed texture containing masks for the paint, the exposed metal parts, and the accent/faction colors.

The last texture is just an emission RGB texture for glowing components:

The main material does not actually reside in the tank file. It resides in a linked file, which is shared across all the units of a given faction. So if I decide that all units of that faction will use blue metal, I can just change it in the linked file, and once I rebake all the textures, I have changed the materials for all the units at once.

The central material file looks like this:

It has a shaderball for material testing, the main material group for vehicles, and on the left, the more fundamental abstracted materials that the material group is built out of. Inside the vehicle material, there is no confusing noodle network. Just another, clean, lower level of abstraction:

In this level, the more fundamental PBR materials are combined into a final material assembly for the vehicles. The input channels of the RGB mask textures are interpreted as desired material properties.

And then, at an even lower level of abstraction, the basic materials themselves are found, once again in a very clean way, with no confusing spaghetti.
This is metal, for example:

Thanks to this workflow, creating new vehicles using the exact same material style is trivial:

The same applies to buildings. Once I get the model done, all I have to do is bring in the main building material node group, create one image texture for RGB masks to paint on, and paint the areas where I want green paint, black paint, metal, or accent color. At the peak, I was able to make about one building a day, from start to finish, including the modeling:

For baking, I use the BakeWrangler addon, which allows me to pack any texture/shader output I want into any channel of any output texture:

And with just a single click, I can always get a packed texture, in the same way I could in SP:

Obviously, there are several issues:

  • Baking a texture map such as curvature should be easy; right now it’s hard. This is not a problem of lacking layered texture painting, it is a problem of Blender’s very poor texture baking system. That needs to improve much more.
  • Packing and baking the output textures requires a third-party addon to be usable. Once again, getting a working texture baking system is IMO much more important than a layered texturing system.
  • Node group encapsulation should be advertised much more than it is now. Hopefully Geometry Nodes will aid that.
  • Texture Paint mode should work better with the Shader Editor. Image texture nodes should have a button which, when clicked, starts painting directly on that texture in the viewport. This panel in particular:
    Is currently just one giant, poorly designed source of user errors.
  • To have full SP-like PBR texturing capability, Blender needs the nodes I mentioned above, which would allow us to modify only specific channels of the PBR struct anywhere along the shading network. Right now, we can only mix entire PBR structs. We cannot choose to, for example, multiply only the base color or mix only the normal channel.

But all in all, I could never even dream of achieving this degree of flexibility and workflow cleanliness with the SP layer-based workflow. Just the mere fact that I can share live materials between different files, and therefore update materials in an arbitrary number of files at once, is great.

The biggest problem here is that since, unfortunately, SP is currently the best we have in terms of PBR texturing, most of us are thinking inside the constrained box of workflows we know from SP, while with just a small amount of usability improvements, Blender could supersede SP in almost all ways.


I’d say that even better example of PBR Texture painting and node based shading going hand in hand is https://armorpaint.org/

It unfortunately falls short of completing the feature set so that it’s actually usable in production, but it’s a great example of the concept being a valid one.


We’ll publish a design doc soon, but the way I would frame it is that we need layers as a first-class concept in nodes. By a layer I mean a set of textures bundled together that can be mixed, stacked, separated, stored as a reusable asset, baked to image textures, etc. And that also enables a layer stack UI for those who want it.

Nodes like Separate BSDF, Get BSDF Attribute, Set BSDF Attribute would be a step towards that, making the BSDF the layer. But then you still have to figure out how baking, mixing and reusable assets work with that. Especially as BSDFs are not guaranteed to be a single Principled BSDF, they could be any shader node setup.

In order to provide a good workflow, I think we have to go further and give users something where they don’t need to set up shader nodes, shader node groups and baking nodes just the right way to satisfy implicit assumptions. Conceptually it’s still reusable node groups and nodes all the way down. But for building tools, assets, add-ons and pipeline integrations that all work together, it helps to have more shared structure.
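The framing above of a layer as a bundled set of textures that can be mixed and stacked can be sketched as a fold of per-channel mix operations, with a stack UI mapping one-to-one onto the nesting. All names below are hypothetical, not part of any published design:

```python
# Sketch: a layer stack as nested mix operations. Each layer bundles a
# set of channels plus an opacity/mask factor; evaluating the stack
# bottom-to-top is a fold of per-channel mixes. Hypothetical names.

def mix_layers(base, layer, fac):
    return {k: base[k] * (1.0 - fac) + layer[k] * fac for k in base}

def eval_stack(base, stack):
    """stack: list of (channels, opacity) pairs, bottom layer first."""
    result = dict(base)
    for channels, opacity in stack:
        result = mix_layers(result, channels, opacity)
    return result

paint = {"base_color": 0.2, "roughness": 0.5}
dirt = {"base_color": 0.1, "roughness": 0.9}
out = eval_stack(paint, [(dirt, 0.5)])
print(out)  # base_color ≈ 0.15, roughness ≈ 0.7
```

A layer stack UI would then just be a flat view of this nesting, while the underlying representation stays a node graph.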


Initially restricting this system to just the Principled BSDF would be absolutely fine. The atomic BSDF nodes are considered more of a legacy these days, especially in the context of authoring PBR assets, which generally revolve around modern principled shaders.

But even then, I don’t see this being much of an issue, since, a few oddballs aside, most of the atomic BSDF nodes are just a subset of the Principled BSDF anyway. As a practical example, using a Separate BSDF node with just a Diffuse BSDF plugged into it would output the diffuse color in the diffuse channel, and default 0/black values in the rest of the channels.

Most people won’t miss the quirky features such as the Velvet BSDF, diffuse roughness and so on, because they can’t really be properly translated to other principled-shader-based pipelines anyway.

There doesn’t have to be that “if we can’t support absolutely everything, then it’s not worth it” extremism.

Alternatively, the additional channels unique to the atomic BSDF nodes could just sit at the very bottom, below the list of Principled BSDF channels. And in case the node stream contains only Principled BSDFs, these channels would simply return zero/default values.

When you take a look at mainstream tools like Substance Painter, you realize that creating what they call “smart materials” is pretty much the same thing as creating node groups, almost a mirror image of it. So if average users manage to author smart materials, average users will manage to author node groups.

They also had to introduce the quite clumsy concept of “anchors” just to compensate for the lack of the ability to branch off the output of any node along the tree and merge it back anywhere along the node stream, something that’s actually easier and more intuitive with nodes.

Solving the currently really bad baking can actually be easy as well. I think people should simply be able to use something similar to AOV output nodes, a sort of “Bake output” node, to output and channel-pack anything they want anywhere along the tree.
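Such a “Bake output” node would essentially route arbitrary scalar outputs of the tree into chosen channels of a packed texture. A plain-Python sketch of that routing step (this node doesn’t exist; all names are hypothetical):

```python
# Sketch of a hypothetical "Bake output" node: route arbitrary scalar
# outputs from the shading tree into chosen channels of a packed texture.
# Per pixel; a real bake would run this over the whole image.

def bake_pack(outputs, routing):
    """outputs: {output name: value}; routing: {channel: output name}."""
    return tuple(outputs[routing[c]] if c in routing else 0.0
                 for c in ("R", "G", "B", "A"))

tree_outputs = {"curvature": 0.8, "ao": 0.6, "emission_mask": 1.0}
pixel = bake_pack(tree_outputs,
                  {"R": "curvature", "G": "ao", "B": "emission_mask"})
print(pixel)  # → (0.8, 0.6, 1.0, 0.0)
```

This is the same channel-packing that addons like BakeWrangler provide today, expressed as a node a user could drop anywhere along the tree.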