Heh, some kind of blur can be approximated like this, by putting noise in the offset of the image sample - similar to distorting UVs in the shader editor. But this looks very nice with a moving camera. Could be a useful solution for simple stylized fur… I like this NPR version more and more.
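The idea can be sketched outside of Blender as well. This is just a toy CPU model of the technique (jittering the sample position with noise before reading the image), not the node setup itself; the function names are made up for illustration:

```python
import random

def sample(img, x, y):
    """Clamp-to-edge read on a 2D grid (list of rows of floats)."""
    h, w = len(img), len(img[0])
    x = min(max(x, 0), w - 1)
    y = min(max(y, 0), h - 1)
    return img[y][x]

def noisy_blur(img, radius, taps=16, seed=0):
    """Approximate a blur by adding noise to the sample offset,
    like distorting UVs before an image sample."""
    rng = random.Random(seed)
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for _ in range(taps):
                dx = rng.randint(-radius, radius)
                dy = rng.randint(-radius, radius)
                acc += sample(img, x + dx, y + dy)
            out[y][x] = acc / taps
    return out
```

With a single noisy tap per pixel you get the grainy look from the screenshot above; averaging many taps converges towards an ordinary box blur.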
One more example, ‘blurred’ shading with sharp texture…
That sounds great! In the meantime, I have another question/bit of feedback regarding offset workflows:
Large offsets understandably create artifacts at the edges of objects, since some of the background shows through there - I get where the black background comes from. The white part, when there is an object behind, is more mysterious.
This can be mitigated to some degree if the background objects also have NPR nodes.
Would some kind of dilate/expand option be possible for Image Sample? Specifically to avoid those black/white areas, and to be even more specific, for rendering on view layers where there are no visible background objects…
Do you mean reading the UVs of a refracted object?
No, that’s not possible.
You could technically output the UVs to an AOV, but that’s not going to work exactly the same.
This is because Diffuse Direct/Indirect doesn’t have the albedo (Diffuse Color) applied.
If you multiply them by the albedo you would get the same result as the red plane (without speculars).
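The recombination described above can be sketched numerically. This is only a toy model of how the passes combine (per-channel RGB tuples, hypothetical function name), not Blender's actual compositing code:

```python
def recombine_diffuse(diffuse_direct, diffuse_indirect, albedo):
    """The Diffuse Direct/Indirect passes don't have the albedo
    (Diffuse Color) applied; multiplying their sum by it recovers
    the diffuse-only shading (i.e. the result without speculars)."""
    return tuple((d + i) * a
                 for d, i, a in zip(diffuse_direct, diffuse_indirect, albedo))
```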
Not from the object itself. If you want to expand the object contour, you would need to use a refraction material on a separate object.
(See the Image Sample example in the first post)
Hello! Very excited for what is being developed here!
I’ve been working in the past months on some NPR techniques for my short-film.
One nice thing ShaderToRGB permits is mixing it with other shaders, which makes it possible to create toon materials with reflections from a glossy shader:
Since the NPR node-tree doesn’t allow shader inputs, this prevents us from re-creating the same setup in the node-tree. I tried transferring the glossy material as an AOV, but obviously it doesn’t give the same result:
A sort of half-solution I found (which actually shows something that wasn’t possible before with ShaderToRGB) is to keep the 2 mixed shaders; this way everything gets converted by the color ramp while keeping the reflections. Still, the final result is not as “sweet” as the first one:
Now, I’m aware that I could simply keep using ShaderToRGB, but since this whole new system can be considered an improved, extended version of it, with new integrated features coming down the road, I wonder if this particular case is worth developing a solution/workaround for in the future, to “re-integrate” shaders into the NPR tree. Thanks!
This way you not only have way more control over your shading style, it also lets you drop your material into any scene with arbitrary lighting and it should “just work”.
Edit: Similar setup using “max” instead of additive lighting, just for fun:
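The difference between additive and “max” lighting can be sketched like this. Again, just a toy per-light accumulation loop with toy RGB values, not the actual node setup:

```python
def accumulate_lights(contributions, mode="add"):
    """Combine per-light contributions either additively or by
    taking the per-channel maximum ("max" lighting)."""
    result = [0.0, 0.0, 0.0]
    for light in contributions:
        for c in range(3):
            if mode == "add":
                result[c] += light[c]
            else:  # "max"
                result[c] = max(result[c], light[c])
    return result
```

Additive lighting lets overlapping lights blow out past the base color, while “max” keeps the brightest light’s contribution and stays flatter, which is why it can read as more stylized.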
Unfortunately, with the suggested setup, the colors (both using just the Principled and the Principled + Glossy) are just not like that: the first is less “punchy” and the second is quite saturated:
Sorry, what is the look that you are trying to hit? I tried getting the PBR specularity on top of the NPR shading like the photo above and it was exactly the same.
EEVEE render:
Thanks! I figured out that what was making the look different was the Principled IOR. Either changing it to 1 or using a Diffuse BSDF solved it.
BTW, I didn’t immediately realise it, but does introducing light loops mean that color ramp toon shading can take on the color of the lights?? Because if that’s the case, this is huge!
Not exactly, NPR Input > Combined Color includes ray-tracing and SSS, while ShaderToRGB doesn’t support those.
Yes! That’s why I put all those colored lights into the scene.
But not only that, custom light loops open the door for custom “BSDFs”, not just applying a ramp over the built-in ones.
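To illustrate what a custom “BSDF” via a light loop could look like, here is a toy sketch: a hard toon step applied per light to N·L, tinted by each light’s color, instead of ramping one combined built-in BSDF afterwards. All names and values are made up; this is just the shape of the idea, not the prototype’s API:

```python
def toon_step(n_dot_l, threshold=0.5):
    """A hypothetical two-band "ramp": hard step over the diffuse term."""
    return 1.0 if n_dot_l > threshold else 0.0

def custom_light_loop(lights, normal):
    """Per-light loop: ramp each light's N.L individually and tint by
    that light's color, so the toon bands pick up the light colors."""
    out = [0.0, 0.0, 0.0]
    for direction, color in lights:
        ndl = sum(n * d for n, d in zip(normal, direction))
        weight = toon_step(ndl)
        for c in range(3):
            out[c] += weight * color[c]
    return out
```

Because the ramp runs inside the loop, a red light and a green light each produce their own colored band, which ShaderToRGB (ramping the already-combined result) can’t do.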
Thanks!
Do you mean the NPR shading being present on reflection probes?
Not yet. It shouldn’t be hard to implement, but it would need to be rewritten once multi-layer refraction is implemented, so it makes more sense not to add probe support until multi-layer refraction is ready.