NPR Project & Multi-stage Compositing

The NPR Project was just announced on the blog.

At its core is the concept of Multi-stage Compositing. Put simply, it means that materials and/or objects can also have access to a compositing node tree:

These results then all get combined in the existing scene-level compositing:

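To make the data flow concrete, here is a minimal Python sketch of the idea (all names are illustrative; this is not the Blender API): each asset's rendered result first runs through its own compositing tree, and only then are all results merged at the scene level.

```python
# Conceptual sketch of multi-stage compositing data flow.
# All names are illustrative stand-ins, not the Blender API.

def render_material(material):
    """Stand-in for the renderer producing a color for a material."""
    return material["base_color"]

def apply_material_compositor(material, rendered):
    """Per-material compositing stage: transform this asset's result."""
    tree = material.get("compositor")
    return tree(rendered) if tree else rendered

def scene_compositor(layers):
    """Existing scene-level compositing: combine all results.
    A simple average stands in for a real merge operation."""
    n = len(layers)
    return [sum(c[i] for c in layers) / n for i in range(3)]

materials = [
    {"base_color": [0.8, 0.2, 0.2],
     "compositor": lambda c: [min(1.0, x * 1.5) for x in c]},  # stylized
    {"base_color": [0.1, 0.4, 0.9], "compositor": None},       # untouched
]

layers = [apply_material_compositor(m, render_material(m)) for m in materials]
final = scene_compositor(layers)
print(final)
```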


I would like to invite everyone to read the blog post, and let me know if there is something that is not clear. Once the development on Multi-stage Compositing starts I will share a build and open a thread on this forum for testing/feedback.

The EEVEE internal features (Ray Queries, portal BSDF, …) may be tackled independently directly on main. To follow the project development you can subscribe to #139332.

PS: I’ll soon close the existing NPR prototype thread, so the attention of the project can shift to this new approach.


My first question is an overall general one, which I hope I can phrase for easy reading:

Vintage cel shading (VC): this is the Shader to RGB workflow, relatively fast to use but very limited. All work is done within the object material. No compositing is required to achieve the final material look.

NPR Branch (NPR): Makes use of object base material, and that material is the foundation for shading which is further created in the NPR shader editor. Far more flexible than VC. No use of compositing required to achieve final material look.

Multi-stage Compositing (MC): all materials created in the standard material editor (BSDF workflow). Compositing is required on a per-object/per-material basis to achieve the look formerly accomplished with VC or NPR. Which, I assume, means that realtime preview now requires full-time use of the realtime compositor.

Is that an accurate summary?



The NPR nodes in the prototype allow effects to show up in reflections; will Multi-stage Compositing still allow for that?

Also the new design tasks don’t mention repeat zones anywhere, are those still planned?


I think per-material NPR nodes and per-material compositing nodes are more similar than this suggests. Compositing required / not required is not the distinction I would make. They're both additional nodes that manipulate the output of shader nodes, and both include compositing operations. On a superficial level the workflows would be quite similar.

It’s more about which exact nodes are available, where exactly it integrates into the render pipeline under the hood, and the consequences of that. It’s somewhat difficult to define. Reflections and refractions get more tricky, but there are also ways to improve that. On the other hand, you get extra functionality from the compositor and better compatibility with future EEVEE and Cycles.


Well, that’s what I’m trying to determine.

With Shader to RGB, I have very limited access to the individual diffuse and shadow components of a material; everything is based on the RGB conversion, and the most common way to deal with the split involves using multiple ramps.
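For anyone unfamiliar with that workflow, the "ramp" step quantizes the converted lighting intensity into flat bands. A rough Python sketch of the idea (thresholds and colors are arbitrary examples):

```python
# Toy cel-shading ramp: quantize a Lambert diffuse term into flat bands,
# mimicking the Shader to RGB -> Color Ramp (constant) workflow.

def lambert(normal, light_dir):
    """Clamped dot product of unit vectors (diffuse intensity)."""
    d = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, d)

def ramp(intensity, stops):
    """Constant-interpolation color ramp: pick the last stop <= intensity."""
    color = stops[0][1]
    for pos, col in stops:
        if intensity >= pos:
            color = col
    return color

# Two-band toon ramp: shadow tone below 0.5, lit tone above.
toon = [(0.0, (0.2, 0.1, 0.1)), (0.5, (0.9, 0.6, 0.5))]

print(ramp(lambert((0, 0, 1), (0, 0, 1)), toon))  # facing the light: lit band
print(ramp(lambert((0, 0, 1), (1, 0, 0)), toon))  # grazing: shadow band
```

Splitting diffuse from shadow then means running separate ramps on separately extracted components, which is exactly where Shader to RGB gets limiting.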

With Goo Engine, I have complete access to diffuse, shadow, and self-shadow at the material input. Ramps still play their part, but the base shading components are already present.

The NPR branch is sort of a hybrid between the two; to what extent this would have evolved, I suppose we don’t know. But the “vintage RGB” + ramps workflow worked well, and the NPR system added greatly to what could be done with it.

With this new system, the screenshots in the blog post show a monkey with a checkerboard shader, and compositing is then used to do everything else downstream of the shader. There’s no example of an object with multiple materials, multiple objects with multiple materials, or of organic and hard edges. So the workflow here, and how it impacts the way we will have to approach shading and lighting at the basic object and material level, is something I would like to see defined more clearly.

If I were to want, for example, this standard material look, how is it achieved through this proposed pipeline?

Or, something far more involved - how much of my materials would have to be done using compositing?

If I have a character with 7 different materials applied to the mesh, each material with different colors, ramp slopes, highlights, some with inter-related AOVs, how much of that becomes split between material and compositor?


In this simple example, the part after Shader to RGB could be done in per-material compositing nodes.

Probably “Custom Shading” (like a light loop zone) would also cover it, but that is limited by current and future choices of rendering algorithms. For direct lighting with sharp shadows it should be OK, but for indirect light and soft shadows some things would require compositing nodes.
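As I understand the “Custom Shading” idea (a zone evaluated once per light), the material gets to transform each light's contribution before the contributions are summed. A hedged sketch of such a light loop, purely illustrative and not the proposed node interface:

```python
# Sketch of a per-light "custom shading" loop: each light's direct
# contribution passes through a user-defined function before summing.
# Illustrative only; not the proposed node interface.

def light_loop(lights, normal, shade_fn):
    result = [0.0, 0.0, 0.0]
    for light in lights:
        n_dot_l = max(0.0, sum(n * d for n, d in zip(normal, light["dir"])))
        contrib = shade_fn(n_dot_l)  # custom per-light shading
        for i in range(3):
            result[i] += contrib * light["color"][i]
    return result

def toon_step(n_dot_l, threshold=0.5):
    """Sharp two-tone response, evaluated per light."""
    return 1.0 if n_dot_l >= threshold else 0.1

lights = [
    {"dir": (0.0, 0.0, 1.0), "color": (1.0, 0.9, 0.8)},  # key light
    {"dir": (1.0, 0.0, 0.0), "color": (0.2, 0.2, 0.4)},  # fill light
]
print(light_loop(lights, (0.0, 0.0, 1.0), toon_step))
```

This works cleanly for direct lighting with sharp shadows; indirect light and soft shadows don't decompose into a simple per-light sum like this, which is presumably where compositing nodes come in.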

The shader nodes in this example don’t seem to rely on any NPR specific features? Though the screenshot is a bit low res.

It’s not material vs. compositor, since a material would contain both shader nodes and compositing nodes. The user wouldn’t have to do any manual work to combine the results from multiple materials, that would be handled automatically.


The example image, is related to these (edited for briefness) statements in the blog:

“The current way of doing it inside EEVEE is to use the ShaderToRGB node, which comes with a lot of limitations. … Instead, the parts of the compositing pipeline that are specific to a certain asset should be defined at the asset level. The reasoning is that these compositing nodes define the appearance of this asset and should be shared between scenes.”

Based on that, my inference from the blog is that some nodes (or abilities) of the current (and former NPR system) material chain are to be moved from materials into compositing nodes; i.e., Shader to RGB is specifically mentioned by name.

It isn’t clear to me what we will no longer be able to do at the material level, that we currently can, without getting compositing involved.

I’ll try wording it a different way:

The proposal sounds like it’s heavily leaning towards a pipeline of “turn 3D BSDF scene into NPR, via filters in compositing.” Which is far different than having a scene that is natively NPR shaded and lit.


The proposal sounds like it’s heavily leaning towards a pipeline of “turn 3D BSDF scene into NPR, via filters in compositing.” Which is far different than having a scene that is natively NPR shaded and lit.

The “native NPR shading” is more suited to be implemented at the material level. This is what the “Custom Shading” task refers to. But each workflow (material vs. compositor) will have different limitations. For instance, material nodes won’t be able to blur the result of a BSDF (from the same tree), and compositor nodes won’t have direct access to geometry data.


Multi-Stage Compositing results won’t appear in reflections, as they are not part of the light transport. However, other shading features like Custom Shading will be visible through reflections.

Also the new design tasks don’t mention repeat zones anywhere, are those still planned?

I simply forgot to add it. It is not really an NPR feature, but we want to implement it anyway.


Just from reading the blog post and looking at the example images, it’s not clear to me how this Multi-Stage Compositing is going to work. Does it mean we will have compositing nodes available to use within shaders, or does it mean we will have to create different compositing node trees for each effect we want to create? And is that independent from the node tree you’d usually use for final compositing after render…?
Maybe when we can test a build it’ll be clearer, but right now it sounds very confusing from a UI/UX perspective.


It seems like the major update of the NPR project is adding an object-level compositor, which is similar to the current prototype NPR nodes but compatible with both EEVEE and Cycles.


Thank you, this is perfectly summarized.

The main differences between the Multi-Stage Compositor and the NPR prototype:

  • No light transport interaction: the result of the compositor is not visible in reflections or refractions.
  • Renderer agnostic: it will work with Cycles, EEVEE, and other renderers.
  • Built on compositor nodes: the NPR prototype used material nodes as its basis.

This seems like a pretty major regression. How is this solved? What is the expected workflow to fix this?


The Compositing and Material nodes will still be separated. Just like in the NPR prototype.

Yes it is.

Just like we have World / Object option in the Material node-tree area, we will have a Material/Object/Scene option in the Compositor area.

You can think of it as if the NPR nodetree from the NPR prototype was a compositor nodetree.


Custom shading (the per-light zones) will still appear in reflections. This will already cover a lot of styles.

For the remaining styles that need reflection support, the reflection needs to be manually rendered and composited. We are looking at ways to automate this in the scene compositor (for instance, with a screen space raycast node). The key point is that these reflections should not be handled by the render engine but by the compositor.
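To illustrate what a screen-space raycast could mean here (the node is hypothetical, nothing exists yet), the compositor would march the reflected ray through the rendered depth buffer in image space and return the pixel it hits. A deliberately tiny 1-D sketch of that marching loop:

```python
# Minimal 1-D screen-space raymarch: step along a ray through a depth
# buffer and return the first pixel where the ray falls behind the
# stored depth. Purely illustrative of a "screen space raycast" idea;
# real SSR works in 2-D with view-space depths and refinement steps.

def screen_space_raycast(depth_buffer, start_x, start_depth, dir_x, dir_z,
                         step=1, max_steps=32):
    x, z = float(start_x), float(start_depth)
    for _ in range(max_steps):
        x += dir_x * step
        z += dir_z * step
        ix = int(round(x))
        if not (0 <= ix < len(depth_buffer)):
            return None          # ray left the screen: no hit
        if z >= depth_buffer[ix]:
            return ix            # ray went behind stored geometry: hit
    return None

# Flat floor at depth 5 with closer geometry (a "wall") at pixels 6-7.
depth = [5, 5, 5, 5, 5, 5, 2, 2, 5, 5]
print(screen_space_raycast(depth, start_x=0, start_depth=0,
                           dir_x=1.0, dir_z=0.3))  # hits the wall at pixel 7
```

The usual SSR caveat applies: anything off-screen or occluded can't be hit, which is presumably why this stays an approximation rather than true light transport.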


A compositor linked to an object/material sounds pretty cool! Setting things up in the compositor and manually merging non-renderable features in post is definitely tiresome. And with render AOVs, anything geometry-dependent can be baked for later use in the compositor.

Please don’t forget about OSL support! OSL still lacks AOVs, and without AOVs, without the ability to bake intermediate information for the next stage, the multi-stage approach will be less useful. But still powerful :+1:


Are most materials going to be basically a BSDF input, as they currently are in EEVEE? Or will components of diffuse, shadow, self-shadow, and ambient/world be available?

The NPR engine had different shading algorithms (Lambert, Phong, etc.); will these still exist?

The curvature, edge detection, and similar nodes in NPR - will these still exist for materials?

Will AOVs, math abilities, and transparency exist at the material level, as they do in the NPR branch?


This is incredibly exciting and, in my opinion, a huge step up from the initial design of the NPR branch. Having the same nodes for the full frame compositor as for per material or object is brilliant.

I’m excited to see what comes out of it!


This new design is very exciting, and even though it was designed for NPR stylized rendering, I can think of other use cases, such as 2.5D compositing, where a compositing tree could be created for each plane. In that case, would transparency be supported at the compositing level? That is, if I use a simple black emissive material on a plane and then connect a group of compositing nodes producing, for example, an RGB mountain with an alpha channel, could I use that alpha at the Multi-stage Compositing level to make the object transparent in some places?

As a result, in production pipelines, this is often done through very cumbersome and time-consuming scene-wide compositing.

Are there more complete practical examples where this case is apparent? I know it can be easier for sure, but can we not improve scene-wide compositing to close the gap? I feel like in more practical scenes, styling will not be drastically different across objects/materials, contrary to the examples shown in the post, so scene-wide compositing might not be as bad an option as the examples suggest.

Remove unsupported nodes from the add menu.

What kind of nodes do you intend to remove and why? Will those just be specific nodes like Object info?

This part receives the rendered color as well as its AOVs and render passes as input.

Will this include all passes enabled at the scene level? Or are they manually added, with engines looking at the object compositing trees to see what kind of passes are needed?

Will you use Group Input and Group Output nodes for inputs and outputs or will you use Render Layers and Composite nodes?

When compositing anti-aliased input, results often include hard to resolve fringes.

Can you share a practical file where such issues manifest? Maybe this can be streamlined more generally for the compositor. There were similar complaints about edges when masking and alpha-overing in the compositor and we were investigating how this might be solved.
