EEVEE NPR Prototype - Feedback

Hey there!
I've been working around NPR limitations in Blender for ages, and this is awesome!
I've just begun messing with this and I'm already in love. The light loop and repeat zones, the ability to read and write AOVs, man, this is so powerful!
This is a basic test shader that I made.


It only has diffuse and specular (GGX) implemented right now, plus a sample-based (naive) outline node I made, and some other stuff that was really hard to do without light loops and is now trivial. I think I did everything in an hour.
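To give an idea of what I mean by "sample based (naive)": it just compares the depth under the current pixel with a ring of nearby samples and treats big jumps as outline. A rough Python sketch of that logic (the `depth_at` sampler and the parameter names are just illustrative, the actual node is plain NPR nodes):

```python
import math

def naive_outline(depth_at, uv, radius_px, samples, threshold, pixel_size):
    # Depth under the current pixel
    centre = depth_at(uv)
    edge = 0.0
    # Walk a ring of neighbouring samples around the pixel
    for i in range(samples):
        angle = 2.0 * math.pi * i / samples
        offset_uv = (uv[0] + math.cos(angle) * radius_px * pixel_size[0],
                     uv[1] + math.sin(angle) * radius_px * pixel_size[1])
        # A large depth discontinuity counts towards the outline
        if abs(depth_at(offset_uv) - centre) > threshold:
            edge += 1.0 / samples
    return edge  # 0 = flat surface, 1 = strong outline
```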

Also, the possibility to write AOVs and use them in the compositor, oh boy.
This adds so much, like this Ben Day dots node I made for the compositor:
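The node itself boils down to the classic halftone trick - roughly this, as a Python sketch of the general idea (the names and dot density are just for illustration):

```python
import math

def ben_day_dot(uv, luminance, dots_per_unit=40.0):
    # Position inside the current grid cell, centred on (0, 0)
    cx = (uv[0] * dots_per_unit) % 1.0 - 0.5
    cy = (uv[1] * dots_per_unit) % 1.0 - 0.5
    dist = math.hypot(cx, cy)
    # Darker areas get bigger dots (radius 0.5 fills the whole cell)
    radius = 0.5 * (1.0 - luminance)
    return 0.0 if dist < radius else 1.0  # 0 = ink, 1 = paper
```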

That said, there are a few things I wish were added.
The two most important ones I can think of are:
Light influence
Maybe an input for those on the light loop, like we have for color, attenuation, etc.?
And shadow opacity and color per light.

Anyway, amazing work!
If anyone wants to tinker with this file, here it is.


How can we link normal maps to this? I still need that painterly feel of my normals to work, via textures imported from painting software.

Exposing light influence should be trivial, since the property is already there internally.
Adding extra/custom properties to lights would be nice (this has already been requested in the thread), but requires more thought.

You should be able to read the normals from the NPR Input.
The Normal Map node should also work inside the NPR tree.


Hi again, so I noticed while working that when you keyframe values or anything in the NPR tree, it gets highlighted green but is never actually keyframed in the timeline. I assume this isn't intentional? I still haven't gotten the new build because of internet issues.

The (non) behavior is the same in the current 2/18 build. And I assume it’s just not implemented yet.

Any plans on adding a bake feature? I want to bake a toon shader setup I have, but the node setup doesn't work in Cycles and the alternative ways are very tedious.


Hello all you wonderful people!
Got a chance to spend some time with the Prototype and wanted to share some thoughts, node groups and requests.

I used the Surface Curvature and Co-Planar Edge Detection groups, in combination with a GeoNodes Attribute for edge marking, to make a Linework group:


Below is only the edge mark data:

If possible, it would be nice to be able to access edge mark data from within NPR nodes - I’m using the ‘Is Edge Smooth?’ node to detect edges marked sharp, then applying those marks to the shader.

The downside is that it marks your edges sharp - and checking if an edge is marked as a seam could cause similar issues, so ideally it would be Freestyle mark data that would be accessible.
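As a stopgap, the Freestyle marks can at least be copied into a named edge attribute with a small script, so Geometry Nodes (and from there the shader) can read them without touching the sharp or seam flags - a minimal sketch, assuming the standard bpy mesh API (the attribute name is arbitrary):

```python
import bpy

# Copy Freestyle edge marks into a boolean edge attribute called "edge_mark"
obj = bpy.context.active_object
mesh = obj.data
attr = mesh.attributes.get("edge_mark") or mesh.attributes.new(
    name="edge_mark", type='BOOLEAN', domain='EDGE')
for i, edge in enumerate(mesh.edges):
    attr.data[i].value = edge.use_freestyle_mark
```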

There are a couple of other GeoNode features that I think could work as NPR nodes:

  • Screen Space Compensation - a lot of the nodes used in the solution posted above are not available in Shader nodes; I hacked together my own, but it's not perfect (see the sketch after this list)
  • Scene Time - I think it was mentioned above that NPR nodes are not yet linked to the Timeline, but it would be nice to have quick access to Seconds, Frames and Stepped Frames to drive NPR values
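To illustrate the Screen Space Compensation point above: what I'm hacking together is basically "scale a world-space width so it covers a constant number of pixels", which for a perspective camera works out to roughly this (my own names, vertical FOV in radians):

```python
import math

def world_width_for_pixels(width_px, depth, fov_y, resolution_y):
    # World-space size of one pixel at the given view depth, assuming a
    # perspective camera whose frustum spans 2 * depth * tan(fov_y / 2)
    # vertically across resolution_y pixels
    world_per_pixel = 2.0 * depth * math.tan(fov_y / 2.0) / resolution_y
    return width_px * world_per_pixel
```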

UI questions:
NPR Trees have Inputs and Outputs - are they intended to have IO like a node group, or is that something that would be hidden in the final release? I can access the NPR Tree as a node group in the Shader Editor, and any IO will be exposed on the node, but I couldn't find a way to access it inside the NPR tree - they aren't added to the NPR Input or Output nodes (I'm guessing this is by design).

It would be nice to be able to quickly feed data from the Shader tree to the NPR tree (e.g. an animated texture, or linked value) - but I can appreciate that there is some design difficulty there.

That also brought up the question of how to access NPR settings outside of the Node Editor - node groups are great because they will populate the Properties area with all the settings you want to have exposed.

Node Search autocomplete gave inconsistent results between the Shader and NPR editors - Mapping and Color Ramp nodes were available in NPR, but I could not access them through search.

There was also no indication that NPR shaders would not compile - the first time I tried to feed Color data to an Image input I thought that the shader was calculating, only to eventually realize that nothing was happening and I just couldn’t connect those nodes that way. More bright red link highlights for unsupported operations would help with this.

Is there a possibility that NPR nodes might make it into the 4.5 LTS?

Thanks again to everyone contributing to this project, including everyone posting their results in this thread, I would have been a little lost without some of these examples.

I’ll post some of my experiments below, mostly basic stuff, node groups are in the .blend:


In my testing of the NPR branch I tried to come up with a novel way to render cel-shaded character mouths. I’ve decided to abandon this experiment for the time-being, but I wanted to log it somewhere for the sake of posterity.

My main focus was to see if I could grant animators direct control of the lines drawing the shape of the mouth itself, as opposed to allowing contour rendering to draw the lines based on mesh poses. To do this, I simplified down the mouth itself to the most basic structure and set up some AOV masking to keep it clipped to the head mesh.
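In compositing terms the masking step is roughly this (my own shorthand, not the literal node setup):

```python
def clip_mouth_to_head(mouth_rgba, head_mask):
    # head_mask comes from an AOV written by the head material:
    # 1.0 where the head covers the pixel, 0.0 everywhere else
    r, g, b, a = mouth_rgba
    return (r, g, b, a * head_mask)
```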

Viewport (with backface culling on)

(apologies if I spam a bit to show all my screenshots)


Rendered

Looks like nothing special, but the power here lies in the flexibility of having the mouth be a separate mesh. With that, scaling and dynamic posing become much easier, without having to worry about crazy intersections or geometry folding in on itself.


Annotated

The main reason I'm abandoning this is that, while it's promising, I don't like the idea of relying on rendered view for animation preview/playback. As it stands, this looks pretty horrible in Workbench, to the point that I think it would be detrimental to an artist's process when posing a character. Switching to rendered view to preview and pose face shapes shouldn't be a requirement, especially considering the majority of our shots have lots of moving parts and big environments that bog down FPS.

Additionally, this would destroy the usability of a Z-depth pass around the head area, which would mean the mouth would need to be rendered on a separate view layer. Maybe in an ideal world we could have multi-viewlayer rendering in the realtime compositor? That would honestly be the cleanest way to implement this, I think.

If this piques anybody’s interest I can elaborate further on my setup. I do feel as though there’s something promising to this, but as it stands I think it’s not ready for production. Nonetheless, I’m absolutely sold on NPR. I plan to do many more tests with it!


I’ll post some more test renders and blends here, and keep questions and comments to the next post.
Apologies for the state of the blends, I didn’t spend a lot of time making them clean and tidy!

Massive thanks to @Vitalijs_Komasilovs for the Cavity and Kuwahara groups, they are doing heavy lifting in most of these examples. It’s a little shocking how little you have to do to give something a “hand-painted” look!

Kuwahara:
Anime BG Boards - Bevelled Cube with a Principled BSDF and Wave Texture:


Anime BG Pipes - Cylinders in back, Extruded Curve in front, Metallic BSDF:

Golden Bell - Cavity, Rim over Metallic BSDF with Kuwahara reflections:

GeoNodes instanced Mesh Circles, realized, with edge highlight:

Refraction:
Plane with Voronoi, Noise and Gradient textures for cracked lines, stress lines, cracked refraction and masking:


Edge, Cavity highlights, refraction and noise:

Sniper Scope - Zoom with refraction, overlaid scope image with procedural textures:

Speed Effect - AOV for mask, displaces and overlays sampled image:

Speed Reaction - linear gradient for mask, displaces sampled image:

Blurred AOV and Noise Textures for simple toon flames:


@thorn-neverwake - you mentioned some light artifacts in one of your first posts in this thread. I'm not sure if you've already got a solution, but it looked a bit like what happens to me when I “blow out” stylized lighting - it seems to wrap around from white back to black again.
My solution was to clamp the Result or Factor (depending on how you have your nodes set up) of the Mix Node that was adding the highlights, and that usually fixed it.
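In pseudocode the fix is just this (a hedged sketch, not anyone's exact nodes):

```python
def add_highlight(base, highlight, factor):
    # Unclamped, values pushed well past 1.0 can misbehave further down
    # the chain, which is what shows up as white flipping back to black
    mixed = base + highlight * factor
    return min(max(mixed, 0.0), 1.0)  # clamp the Result to the 0..1 range
```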

@Wyatt_Hall - this is a really interesting setup, and it reminds me a bit of the Lightning Boy Studios Boolean Mouth workflow - not sure if anything there is helpful, but it seemed like it could maybe offer a solution to your Workbench issue (or maybe not… I just figured it was worth a mention).

I was wrong about this: it's not that nodes don't show up in NPR trees; the actual issue is that autofill is limited when dragging out a link from an Image socket. This seems like it's probably a Blender issue and not something specific to NPR, so apologies if you spent time trying to figure this out.

@pragma37 I have also run into a workflow issue, and I’m not sure how to get around it:
I use the Light Path node in my World Shaders to separate the Background from my Object lighting.
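For reference, that World setup is the usual Light Path split - roughly this, with illustrative names only:

```python
def world_color(is_camera_ray, background_color, lighting_color):
    # Camera rays see a flat background colour; every other ray type
    # (diffuse, glossy, etc.) sees the colour used to light the objects
    return background_color if is_camera_ray else lighting_color
```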

When I use a Refraction plane I can’t get it to show the Background color if I’m using that setup:

Do I just need to add the Light Path node to my Object Shader?
My NPR Tree?
Is there a whole other way of achieving this effect now?
(The blend file is in the link in the post above for anyone who wants it)

Also, I know that this is not a forum for UI/UX feedback and I can remove this portion and post it elsewhere, but I do want to make one suggestion (that has probably already been considered by the team):

Put NPR Nodes in a traditional node group and add a specialized input to the Material Output node.


This would have a few advantages:

  • Node Groups can set a custom default width; this would allow users to easily read the name of the NPR Tree they are using, making scenes with multiple NPR trees easier to manage at a glance
  • an NPR-only Image input could be added to a panel on the Material Output node, keeping it clean for anyone not using NPR
  • the NPR input could be placed at the bottom of the node to reinforce that Shaders are calculated first, and then NPR afterwards
  • NPR Node group could start with no inputs, but allow users to add them within the NPR Tree for quick data transfer from Shader to NPR without using AOVs (this might be technically impossible, I’m not sure)

These are all just thoughts that I had when I started working with more than one NPR Tree per scene.

And another huge thank you to everyone involved with this project, there is so much great stuff already here, and so much more on the way to look forward to!

@spectralvectors To be honest, booleans would be my first choice and would probably solve all my problems! But in my own testing I've noticed that they're too unstable to rely on, plus they're significantly more intensive to process on high-res meshes. Additionally, since my normal editing relies on data transfer via topology, changing the vertex count of my objects is a no-go. I may give it a second look, though…

General question:
Something I really loved about Goo Engine was node-level light linking. Each Shader Info node in your material could be connected to its own light link group. This allowed several lights to operate independently on a single material. For example: a sun lamp produces toon shading, while an area light is layered on top to add regular diffuse shading.

Is it possible to replicate this in some way with light loops or something else in NPR?

I'm not sure about light groups/linking in the new EEVEE, or this prototype specifically, but I did think of a hacky way to achieve what you described using Influence.

I set up a Sun Light to Influence Glossy only, then a Point Light to Influence Diffuse only, and NPR Nodes as shown below:

Same idea, but using a Light Loop:
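For anyone who prefers to script the Influence part, it's just the per-light factors on the light data - a minimal sketch assuming the standard bpy properties (the light names are whatever yours are called):

```python
import bpy

# Sun light: glossy/specular contribution only
sun = bpy.data.lights["Sun"]      # use your own light's name
sun.diffuse_factor = 0.0
sun.specular_factor = 1.0

# Point light: diffuse contribution only
point = bpy.data.lights["Point"]  # use your own light's name
point.diffuse_factor = 1.0
point.specular_factor = 0.0
```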


Making it a separate category from “Object shaders” is a really weird and unintuitive idea that goes against Blender's UI design.
These are still object shaders…

Personally, I prefer having more control over shading and lighting to having less but staying within the existing UI design.

UI design isn't a holy text or something carved in stone. It is meant to make things easy for the user to work with, not to restrain major improvements for its own sake.