EEVEE NPR Prototype - Feedback

Hello everyone!

As you might know, we have an ongoing project in collaboration with Goo Studios to improve NPR support in Blender.
We had a Workshop in July and decided to start by making a prototype to gather feedback before committing to a final design.

The prototype’s goal is to provide something as powerful and flexible as we can, see what the user community can achieve with it, and from there evaluate what features should be implemented in the main branch.
(So don’t expect backward compatibility)

The main things we want you to share with us are:

  • What results you can achieve with these features that were not possible before.
  • What results you still can’t achieve.

Note that we have limited resources, so expect some lack of polish (again, keep in mind this is just a prototype).

Overall NPR Targets

  • Light & Shadow linking
This one has already been merged into 4.3!
  • Custom Shading
    Provide direct access to light and shading data, and allow users to process and combine them in arbitrary ways.
  • Filters and mesh-based compositing
    Provide a per-mesh/material post-processing workflow that interacts automatically with other render features like reflections and refractions.

Implemented Features

NPR Tree

For the prototype, we are using a separate NPR node tree where these features are being added.
NPR trees use the same nodes as Materials, except:

  • New NPR-only nodes are available.
  • BSDF nodes can’t be used here.

The NPR tree is selected in the Material Output node.

You can swap between the Material and the NPR node tree with Ctrl+Tab.

NPR Input

You can think of this as an extension of the ShaderToRGB workflow.

NPR Refraction

The NPR Refraction is similar to the NPR Input, but it reads the layer behind a refraction material.
It needs the Raytraced Transmission option enabled in the material.

Image Sample

These nodes output image sockets, so you can sample them with an offset and make filters.
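
As a rough mental model only (this is a conceptual Python sketch, not the actual node implementation; `sample` and `box_blur` are hypothetical names), a filter built from offset samples looks like this:

```python
import numpy as np

def sample(img, x, y):
    """Read a pixel with clamped coordinates, like an image socket read
    with an out-of-bounds-safe offset."""
    h, w = img.shape
    return img[min(max(y, 0), h - 1), min(max(x, 0), w - 1)]

def box_blur(img, radius=1):
    """Average each pixel with its offset neighbors (a simple box filter)."""
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    n = (2 * radius + 1) ** 2
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    acc += sample(img, x + dx, y + dy)
            out[y, x] = acc / n
    return out
```

Edge detection, dilation, and similar screen-space filters follow the same pattern: combine the current pixel with reads at fixed offsets.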

AOV Input

Similar to the NPR Input and the NPR Refraction, the AOV Input lets you sample AOV images.
Here we can read the AOV from the layer behind because the refraction material doesn’t write to the AOV (otherwise it would overwrite the Suzanne AOV).

Multi-layer Refraction

Refraction layers allow chaining multiple refractions for a “mesh-based” compositing workflow.
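
Conceptually, chaining refraction layers behaves like back-to-front alpha compositing. The sketch below is an illustration of that idea only; the function names and the premultiplied-alpha convention are assumptions, not the prototype's actual internals:

```python
def over(front, back):
    """Porter-Duff 'over': composite a premultiplied RGBA front layer
    over a back layer."""
    fr, fg, fb, fa = front
    br, bg, bb, ba = back
    return (fr + br * (1.0 - fa),
            fg + bg * (1.0 - fa),
            fb + bb * (1.0 - fa),
            fa + ba * (1.0 - fa))

def composite_layers(layers):
    """Chain layers back to front, the way stacked refraction layers
    would resolve into a final pixel."""
    result = (0.0, 0.0, 0.0, 0.0)
    for layer in layers:  # ordered back to front
        result = over(layer, result)
    return result
```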

For Each Light

For Each Light zones allow creating custom shading that works consistently with arbitrary light types and colors.
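
As a hedged analogy of what a For Each Light zone expresses (the light representation here is made up for illustration), the idea is to accumulate a custom shading term, e.g. a banded toon ramp, per light, so the result respects each light's color and direction:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def toon_shade(normal, lights, bands=3):
    """Accumulate a banded (toon) diffuse term per light, in the spirit
    of a 'For Each Light' zone. Each hypothetical light is a dict with
    a unit 'direction' and an RGB 'color'."""
    r = g = b = 0.0
    for light in lights:
        d = max(0.0, dot(normal, light["direction"]))
        # Quantize the diffuse term into discrete bands.
        stepped = min(math.floor(d * bands) / max(bands - 1, 1), 1.0)
        r += light["color"][0] * stepped
        g += light["color"][1] * stepped
        b += light["color"][2] * stepped
    return (r, g, b)
```

Because the per-light term is multiplied by the light's color before accumulation, the same custom shading works consistently with any number of lights of any color.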

To Dos

Main Features

Known Issues

  • While you can read and write AOVs from NPR nodes, you shouldn’t read and write to the same AOV (even from different NPR materials) since the results may vary randomly.

Download

Bug Reports

The NPR-Prototype branch has its own separate bug tracker:

To report a bug:

  • Verify that the bug is not already reported.
  • Verify that the bug can’t be reproduced in the main branch daily builds.
  • Verify that you’re using the latest NPR-Prototype build.
  • Open the NPR-Prototype build, click Help > Report a Bug, and fill in the required info.
50 Likes

Great work so far!

I don’t do NPR (outside of teaching it), so take whatever I say with a grain of salt or disregard it altogether, but I mostly wanted to comment on the UI of the project.

I played with the build a couple of days ago, and switching between node trees is indeed quite annoying. I wouldn’t mind this design as it is, but I really liked the NPR Zone idea and mock-up, which would allow doing everything in the same node tree (and which I think would also make the API much easier for add-on developers).

One pain point of this UI design might be making node groups out of NPR trees. I’ve tested it, and node groups are shareable between material and NPR trees, which is nice (for making utility assets), but I don’t know how making an entire tree an asset would work. Let’s say I make an NPR node tree called “Simple Toon”, and I want to turn it into an asset and reuse it everywhere in my project. But how?

  • I noticed that an NPR tree is internally a node group (at least that’s how it appears in the UI), like geometry nodes, which is nice. You can name it, and even add sockets (which I’m assuming is a mistake), but there’s no way to turn it into an asset except from Outliner > Blend File.
  • But once it’s an asset, how will you add it to your scene? Right now the way to add an NPR tree is by choosing it from the data-block picker on the Material Output node, but that only lists node groups already in the blend file, not assets. That means the only way to make your assets appear there is to first add them as node groups in a material tree, delete them, and then they will appear there. It’s quite an uncomfortable workflow.
  • The Material Output also lets you pick any node group as the NPR output. You can add NPR trees in material nodes, and you can nest NPR trees inside each other, each with its own NPR Output node. Things like that don’t break anything, but they add to the confusion.
  • I tried to push the limits of what you can do in an NPR tree, and it’s pretty much anything; it’s just a material node group. You can make an entire shader without using any of the NPR nodes, you can mix noises and procedural textures and SDFs and all that, and everything works. Which really blurs the line between where the material tree ends and the NPR tree begins. What am I, as a user, supposed to do in one versus the other?

I think all of the above could be avoided with an NPR zone in the material tree. You could just create node groups that wrap zones and add them normally, always available in the Add menu. You could use NPR-specific nodes wherever you want in the node tree and mix them with regular procedural textures and masks at any point, whatever makes sense to you, and there would be no confusion.

I think separating the trees will invite confusion, workflow issues, and bugs that we might not even realize at this stage. While a zone (or simply one node tree) could be more difficult to implement now, it will make the workflow easier to understand and maintain in the long run.

9 Likes

First, I’d like to thank everyone on the team for their work on this. I know you all have a passion for NPR, so I feel like the project has great potential for success!

(I mentioned this in the other related thread, but as this is the official feedback thread I’ll re-mention it here.)

I’m seeing fuzzy terminators and shadow glitching on surfaces. This is not specific to the NPR prototype, as they’re also present in Eevee Next. However, I’m quite hopeful that this can be eliminated in NPR, as the engine (I believe) isn’t necessarily constrained by “what has to be done in Eevee for photoreal shadows and shaders”.

Rendered with a sun lamp, arrow points to an odd shading artifact on the cone. (This is also present in the working view.)

Using a point light, the arrows point to the same surface artifact, as well as the fuzzy aliased terminator on the cone.

The point light artifacting is even more dramatic in the render.

(I can share the file in case it will be helpful, but it looks like I can’t share files here.)

4 Likes

NPR refraction - this is très cool. :wink:

8 Likes

Very nice. Is there a way the NPR tree can feed back into the shader tree for PBR texture data?

A loop around.

A typical NPR workflow I’ve used is Shader to RGB for image (paint) processing with refraction and transparency pre-processing, then feeding that data into PBR materials, like roughness, normals, and albedo, for relighting in a PBR environment (casting paint onto a real canvas).

It looks like this is a one-way trip from PBR to NPR, but not back again?

6 Likes

@nickberckley Thanks for the detailed feedback.

I haven’t thought about NPR Trees as assets, to be honest.

You could link them, or put all the content of the NPR Tree inside a node group, but yes, you made a good point.

The reason for having NPR nodes as a separate tree is that (due to implementation requirements) we need to ensure 2 things:

  • That the node order is always Material Output -> NPR nodes (we can’t allow using NPR features to drive the inputs of BSDFs or Displacement).

  • That each material has only one “NPR Input” (we can’t do what ShaderToRGB does and make multiple BSDF -> NPR Input conversions per material).

The simplest way to ensure those constraints is to put the NPR nodes on a separate tree.

There may be more user friendly ways to achieve the same, but those should be explored after the prototype.

To be clear, the main goal of the prototype is to figure out the feature requirements and see what can and can’t be achieved. It’s “ok” if the workflow is a bit clunky for now; it’s better to make UX decisions once we have a better picture of the final features.

4 Likes

@thorn-neverwake Yes, the shadows are the same as in main, this is still EEVEE.

We haven’t touched the custom-shading part yet, so we might be able to tweak some things, but I wouldn’t expect a completely new shadow map implementation, especially not in the prototype.

That said, I don’t know where that point light artifact is coming from.

Since I assume this is reproducible in the main builds with ShaderToRGB, I would open a bug report (you can tag me directly).

3 Likes

Hmmm… that’s a problem. I very frequently use multiple BSDFs and pipe them into Shader to RGB. Is there a workaround?

Nice work so far, by the way :slight_smile:

3 Likes

Yes, this limitation is by design (see my answer to @nickberckley).
There are more details about why this is needed in my design proposal:

1 Like

I understand that the idea of a separate tree is to avoid recreating the node tree in each material.
But in that case, I don’t get why node tree creation is done in a Material Output.

If a user wants to create a World or a Line Style, they do it as for a material, with a template in the header of the Shader Editor. I would do the same thing here.
I think the link to materials should be made via a list in the NPR Output node: of materials, or object material slot indices, or objects, or collections.
Users would not have to select and tweak each material if the NPR editor provided efficient ways to define a selection of materials to apply the effect to.

It could be made impossible to assign a material to an NPR tree if it was already added to an existing one, with a warning in the status bar on any attempt to do so.

1 Like

I believe the theory is: you don’t need the Shader to RGB conversion workflow.

Sort of like how it works with the goo shader info mode. Diffuse/RGB isn’t part of the equation.

2 Likes

That’s valid. I’ll just throw out a suggestion for the future, for when you start working on this: one way to ensure both points are respected in a regular material tree might be to have the Material Output node output a “Material” socket (as used in geometry nodes), and have the NPR Zone only accept a Material input.

That would guarantee that users always do Output -> NPR, with no other way to do it.


Btw, is the “Image” socket in NPR the same as the image socket in geometry nodes, or do they just share the name?

2 Likes

We are going to have NPR-specific shading nodes (and custom light loops) on the NPR side, so you should be covered (just with a different workflow).

4 Likes

At first we were planning to allow custom inputs in the NPR Tree, that would have been exposed in the Material Output (like with Node Groups). That’s the main reason.

Should I assume that nothing involving multiple compositing passes is a focus/concern at this stage?

1 Like

If there’s something broken, it won’t hurt to make us aware of it and add it to the “known issues”.

2 Likes
  1. The node tree should not be part of the shader output node.
    1.1. You cannot use a material node tree as a read-only asset together with a custom NPR tree.
    1.2. You cannot combine NPR and shader tree assignment in geometry nodes.
    1.3. You cannot share the same material with different NPR trees in the same project.
    New nodes have to either be part of the graph or live in a tree that is not part of the graph. Zones are the way to make a pseudo-graph, so nodes stay in only one tree but can run in another execution context.
    You could put an NPR ID selector into an additional list of IDs, just like materials.
  2. I don’t see the point of NPR, sorry.
    a. Is this just compositing with a by-mask implementation (a node tree with an effect assigned to a certain part of the mesh)?
    If so, is it worth it, rather than just adding a new way to feed a mesh selection into the compositor?
    b. Or does this actually change the way rendering happens (something like custom ray reflection, or global illumination emission with a toon effect)?
    But then why is this EEVEE-only (as I understand it), and why can’t it be used multiple times in combination with shaders?
    c. Or is it just a way to avoid saving some temporary layers of info?
    But then why is that worth a whole new node system, instead of a unified interface for outputting additional layers of render info? And why not do that for Cycles, EEVEE, and custom engines?
  3. Image sampling should only be explicit, with an implicit screen-space UV input. I’m not sure whether the current version is only temporary, but just to make sure: it is possible to make the ID data type compatible with multi-functions and operate on them in function nodes, which would be compatible with all built-in backends. So the final version of the new node tree should not introduce this implicit conversion between the Image and Color data types.

It seems that you think building a scene in NPR is effectively the same thing as rendering a standard image and adding an artsy filter to it in post compositing. This is not the case.

As to (paraphrasing) “but what about it working with standard Eevee or cycles” - the whole point is to bypass things in those renderers that NPR artists are limited by.

NPR artists ask for shader feature X to work a particular way, and we’re told “No, that is not how light and shadow behave.” This has already been a critical flaw with Eevee Next. So, fair enough. This allows us to be more free from the restrictions imposed by code designed to support realism over stylization.

7 Likes

Ah, makes sense. Will the Shader to RGB node still exist after this node tree system lands, for loop-back workflows into PBR?

1 Like

Basic shader fun with camera vectors and refraction…

10 Likes