I really like this branch!
I saw it on YouTube and thought it was a great example: https://youtu.be/UrzjyrJpFCo
Also, after seeing it, I thought of a feature that would be nice to have.
If multiple options and values could be set per light source and made accessible in the NPR shader (in ‘For Each Light’), it would allow us to do things like rim lights with different values for each light.
How it gets used will vary depending on how users configure the NPR shader!
Oh hey! If I understand right, you're trying to overlay an effect on top of a plane in front of the camera and still see the stuff behind it, right? You can do that by giving the plane in front of the camera a refraction material with an IOR of 1.
In the material settings, make sure raytraced transmission is turned on.
Then, in your NPR tree, plug Combined Color into the NPR Output; anything you do in between will be overlaid on top of the background render.
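In case it helps, here's a minimal bpy sketch of that material setup (the function name is just illustrative, the exact RNA property for "Raytraced Transmission" varies between Blender versions so it's probed with hasattr, and the NPR-tree wiring itself still has to be done in the node editor):

```python
import bpy

def make_passthrough_refraction_material(name="NPR_Passthrough"):
    """Refraction material with IOR 1, so the scene behind the plane shows through."""
    mat = bpy.data.materials.new(name)
    mat.use_nodes = True
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links
    nodes.clear()

    refraction = nodes.new("ShaderNodeBsdfRefraction")
    refraction.inputs["IOR"].default_value = 1.0  # IOR 1 = no bending, background stays visible

    output = nodes.new("ShaderNodeOutputMaterial")
    links.new(refraction.outputs["BSDF"], output.inputs["Surface"])

    # "Raytraced Transmission" also needs to be enabled in the material settings;
    # the property name differs between EEVEE versions, so check what exists.
    for prop in ("use_raytrace_refraction", "use_screen_refraction"):
        if hasattr(mat, prop):
            setattr(mat, prop, True)
            break

    return mat
```

Assign the returned material to the plane in front of the camera, then add the Combined Color → NPR Output link in the NPR tree as described above.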
I was wondering if this sort of setup is possible with the Image Sample node? Maybe I'm doing something wrong, or will this be possible in future iterations? I'd like to reuse the same texture object for multiple UVs/samplers, while keeping this object easily replaceable, without workarounds like wrapping the Image Texture node inside a nodegroup to make it a "global variable" of sorts.
If it wasn't already planned, I'd like to suggest this functionality. I think it would be a huge benefit for everyone. Thank you!
Great work so far!
I have a question regarding feedback loops. Is it possible to have some kind of feedback loop across frames with the NPR branch? Just as it's possible to use simulation nodes in geometry nodes (and attributes to bring the results into shader land), can the same be done directly with shader nodes or NPR nodes? I mean something like the repeat zone, but across frames.
I also tried to start a conversation about this in the Matrix chat here.
Edit: the link doesn't work apparently; it's the Everything Nodes chat.
Sorry for the recent lack of replies, I’ve been too busy with other things (although we’ve had several NPR design meetings recently).
Note that you can still output AOVs from NPR nodes.
Am I missing something?
Yes, this is more of a temporary hack for the prototype. We are internally discussing much better ways to handle this.
It’s not intended to work at the moment (the image node still outputs a color, not an Image, notice the socket colors), but yes, that’s something we have thought about; it’s just not a target for the prototype.
My only worry is that people will abuse it to do image filtering inside the material, instead of having the filtering already baked into the image, which would be much more performant.
But other than that, yes, I agree this would be really useful for things like triplanar mapping or texture bombing.
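To illustrate why sampling one image several times is handy, here's a rough sketch of the triplanar idea in plain Python (not node code; `sample(uv)` stands in for a single Image Sample lookup, and the blend sharpness is just an illustrative parameter):

```python
def triplanar_sample(sample, position, normal, blend_sharpness=4.0):
    """Sample one texture with three planar projections and blend by the normal."""
    x = sample((position[1], position[2]))  # projection along the X axis
    y = sample((position[0], position[2]))  # projection along the Y axis
    z = sample((position[0], position[1]))  # projection along the Z axis

    # Blend weights from the absolute normal, sharpened and normalized.
    w = [abs(n) ** blend_sharpness for n in normal]
    total = sum(w) or 1.0
    w = [c / total for c in w]

    return tuple(w[0] * a + w[1] * b + w[2] * c for a, b, c in zip(x, y, z))
```

The same texture object is sampled three times with different coordinates, which is exactly the case where a reusable Image input would pay off.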
I think this was discussed at some point. It’s not possible and not currently planned, but it would be cool to have.
The main problem is the “cache” would have to be in screen space (which comes with many issues and limitations) or use some kind of uv space baking (which would be a huge development project on its own).
Okay, I was expecting that. But do you know of any other way (using the GPU module, for example) to get some kind of framebuffer feedback or ping-pong buffering for textures?
Maybe my question doesn't really fit here then; that's why I pointed you to the Matrix chat as well. It's the Everything Nodes chat.
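To illustrate what I mean by ping-pong buffering, here's a rough sketch using the gpu module's offscreen buffers (the function name and how you'd hook it into a frame change or draw handler are made up for illustration; the actual draw calls are left out):

```python
import gpu

WIDTH, HEIGHT = 1024, 1024
buffers = [gpu.types.GPUOffScreen(WIDTH, HEIGHT) for _ in range(2)]
frame_parity = 0

def draw_feedback_pass(draw_fn):
    """draw_fn(previous_texture) issues this frame's draw calls into the write buffer."""
    global frame_parity
    read_buf = buffers[frame_parity]       # last frame's result
    write_buf = buffers[1 - frame_parity]  # target for this frame
    with write_buf.bind():
        draw_fn(read_buf.texture_color)    # feed the previous frame back in as a texture
    frame_parity = 1 - frame_parity        # swap roles for the next frame
    return write_buf.texture_color
```

But that lives outside the shader/NPR node tree, so it doesn't really answer whether the nodes themselves could do it.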
Both the plane and Suzanne have raytraced transmission enabled, but it seems like the NPR refraction node can only show objects that don't have raytraced transmission enabled? How can I get the scene color (in this case, Suzanne) back?
Will there be a way to get just the texture of the object itself into the NPR tree, instead of going through the Diffuse Color in the NPR Input? It does the job, but there's this very subtle darkened Fresnel-like effect that's kind of jarring and may interfere with some aesthetic choices.
If there isn't a way to remove this effect, it would be nice to at least have a texture input on the NPR Input node.
I'd love to post screenshots of what I mean, but my Blender account is pretty new. It can be recreated, however, by comparing a texture plugged directly into the render output without an NPR tree against one plugged into a shader and used via the Diffuse Color input within an NPR tree.
I think you might need to share some screenshots, when you can. I've just tried to recreate it as you described at the end, and I don't see any “subtle darkened Fresnel-like effect” that I wouldn't expect to see, depending on which channels/outputs are used.
It would also be nice to have something like a bake node for the Image Sample that would freeze the NPR refraction view at its current state. That could enable some very interesting VFX, like glass breaking: bake the background color, then break the glass, just like an After Effects effect.
I used your Goo Engine for some time in the past. The one feature that was very useful for me was the “in front” option working for all objects in EEVEE. This allowed effects in VR mode that are otherwise impossible to do in Blender.
Do you plan to implement this feature as well? I haven’t seen it in the proposal.
From within Blender, using the VR addon. It works well enough that I can use it productively without having to output to some external game engine.
There was, however, one effect that I couldn't achieve; I'll try to explain it: faking the reflection of a (glowing) display on the surface of a (slightly curved) object. Think of a head-up display that is reflected in the windshield of a vehicle.
We don’t have real-time raytracing.
Screen-space-reflection doesn’t show it, probably since the display that is being reflected isn’t directly visible to the camera.
A planar probe doesn’t work, as the surface is slightly curved.
I have a properly (in CAD) constructed mirror image of the display which “hovers” on the other side of the object. I.e.: a fake reflection. But it is of course obstructed by the object itself and not visible to the camera.
→ The solution would be to use the “in front” option on the mirror image, but that doesn't work in EEVEE (except for wireframe objects like empties, cameras, etc.).
Another solution would be two view layers, one with the scene and one with only the fake reflection, composited on top. This doesn't work in real time / VR, as the real-time compositor doesn't render view layers.
I have also not found a material node setup that could do this.
So my only solution was Goo Engine, as it properly renders objects “in front” in EEVEE.
Rendering inside Reflection Probes lacks several features compared to Camera Renders.
This is more a general EEVEE limitation than something directly related to the NPR branch.
So I wouldn’t say it’s intended in the sense that I would prefer not to have those limitations.
But it’s expected.
No, this is still EEVEE, so the same limitations still apply (no “real” ray-tracing for now).
It’s mentioned in the “EEVEE-side features” (Depth offset).
Depth offset covers more cases than the “In Front” option, but it should still be able to replicate that workflow fine.
It’s not implemented in the prototype, though, since it requires changing some core assumptions about how EEVEE works, and it would be too much work to maintain it on a separate branch.