I just learned that Crytek (the game engine developer) has released its real-time reflection and refraction software to developers. Microsoft has picked it up and Rhino is also on board. It works on NVIDIA and AMD cards, and on Mac, Windows, and Linux. This wouldn't enhance all of Cycles' capabilities, but it would seem to improve it incrementally. Not being a developer, it appears to me comparatively easy to insert into Cycles.
My question is: can this be incorporated into Cycles for real-time photorealistic animation? Not necessarily for a game engine, just presentation and storytelling.
I haven’t seen much specific information about this: what kind of API there would be, or what kind of license it would have. Nor have I seen announcements that Microsoft or Rhino are involved.
Either way, using something like this is a big project and has many implications. It’s not an easy thing to adopt. We are more likely to adopt ray tracing libraries that were designed with production rendering in mind; the requirements for features like hair and motion blur are often quite different from what would be done for games.
Thank you for responding so quickly. I saw the Rhino and Windows mentions on Google when searching for its name, “Neon Noir”.
Yeah, I realize it would require research by a developer to know whether this would work, and I don’t know how your system works behind the scenes. But I thought that if you wanted to know what is out there, maybe this could be considered and then discarded.
Because this (as I understand it) only handles reflections and refractions, I didn’t think about how it relates to non-glossy situations or motion blur.
My vision for EEVEE’s ideal state for reflections and GI would be moving in the direction of deferred path tracing similar to what Enscape did as shown here: https://gpuopen.com/deferred-path-tracing-enscape/
I don’t expect EEVEE to actually use this as its primary engine, though, because with the limited time and development resources Blender has, this may just end up being a slight variation on Cycles and not worth it at all.
However, this technique might be an interesting way to handle better (more bounces, more accurate) screen-space reflections and refractions in the future.
Essentially, it uses rasterization instead of primary rays as a basis for screen-space effects, then for each pixel it uses either screen-space raymarching or BVH traversal, chosen by a binary noise mask (pure black and white: either screen-space or BVH).
It uses temporal caches for both denoising and speed, and only calculates one bounce per frame (each consecutive GI bounce has a 1 frame delay).
This way, it leverages the speed of rasterization and screen-space techniques to some extent, but manages to avoid baking and most of the artifacts of screen-space reflections and refractions. However, it isn’t strictly real-time (and doesn’t need to be), although it gets close on high-end machines, even without RTX (with 1-2 diffuse bounces).
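The per-pixel selection and temporal accumulation described above can be sketched roughly as follows. This is a minimal, heavily simplified illustration of the idea, not Enscape's actual implementation: the function names, the dictionary-based buffers, and the 50/50 mask probability are all stand-ins I made up for clarity.

```python
import random

def screen_space_march(pixel, depth_buffer):
    # Stand-in for screen-space raymarching: succeeds only when the
    # reflected hit is visible in the current frame's depth buffer.
    # Returns None on a miss (ray leaves the screen or is occluded).
    return depth_buffer.get(pixel)

def bvh_trace(pixel, scene_color):
    # Stand-in for full world-space BVH traversal: always finds the
    # true hit, but would be the slower path in a real renderer.
    return scene_color

def hybrid_reflections(pixels, depth_buffer, scene_color, history, blend=0.9):
    """One frame of the hybrid pass: a binary noise mask picks
    screen-space marching or BVH traversal per pixel, and a temporal
    cache (`history`) accumulates the result, so each additional GI
    bounce effectively costs one frame of latency."""
    out = {}
    for p in pixels:
        use_screen_space = random.random() < 0.5   # binary (black/white) mask
        color = screen_space_march(p, depth_buffer) if use_screen_space else None
        if color is None:          # mask chose BVH, or screen-space missed
            color = bvh_trace(p, scene_color)
        prev = history.get(p, color)               # temporal cache lookup
        out[p] = blend * prev + (1 - blend) * color
    return out
```

Called once per frame with the previous frame's output fed back in as `history`, the result converges over a few frames rather than instantly, which matches the "close to real-time, but not strictly real-time" behaviour described above.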
Then I can safely tell you we didn’t jump on board with that. What Rhino 3D does have, though, is Cycles integrated as a viewport mode in Rhino 3D v6. That viewport mode is known as Raytraced. Additionally, in the current WIP that will eventually become Rhino 3D v7, Cycles is also the engine powering Rhino Render.
I’ve worked on Rhino 3D since 2014, specifically on that: integrating Cycles into Rhino 3D.
Perhaps the confusion stems from the fact that there used to be a NEON renderer for Rhino v5.