Blender 2.8: Cycles OptiX on non-RTX card

The OptiX backend doesn’t support the Bevel and AO shaders at the moment.

Ah, thanks. It must be the Bevel node then.

I wonder if it would be possible to have the error message list the culprit materials/nodes?
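
In the meantime, a quick script in the Python console can list them yourself. This is just a sketch; it assumes Bevel and AO are the only unsupported nodes, per the reply above:

```python
import bpy

# Node types the OptiX backend doesn't support at the moment
# (per the reply above).
UNSUPPORTED = {"ShaderNodeBevel", "ShaderNodeAmbientOcclusion"}

for mat in bpy.data.materials:
    if not mat.use_nodes:
        continue
    for node in mat.node_tree.nodes:
        if node.bl_idname in UNSUPPORTED:
            print(f"Material '{mat.name}' uses unsupported node: {node.name}")
```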

Is there a way to do viewport denoising with OptiX without switching to OptiX in the system preferences? I have a 1070, which works now, but only when switching away from CUDA. I’d like to keep using CUDA for final renders but use OptiX denoising in the viewport.

I’m using the method mentioned by Mazay (on 2.83), but even when I select CUDA in the preferences, “AI Denoising” is still available and works fine.
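
For reference, here’s roughly what that setup looks like in Python. A minimal sketch assuming the 2.90+ property names (2.83 exposed viewport denoising slightly differently):

```python
import bpy

# Keep CUDA as the compute device for final renders.
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = 'CUDA'

scene = bpy.context.scene
scene.cycles.device = 'GPU'

# Enable the OptiX AI denoiser for the viewport only.
scene.cycles.use_preview_denoising = True
scene.cycles.preview_denoiser = 'OPTIX'
```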

Any possibility to make it work on AMD OpenCL?

The OptiX denoiser was designed by Nvidia for Nvidia GPUs. I saw a project someone was working on to convert CUDA code into OpenCL, and theoretically the same could happen for OptiX to OpenCL; however, it’ll probably take a while to get done.

Brecht has created a task for making the Intel denoiser available in the viewport. The plan is to have it in Blender by the 2.90 release, and it should work on most x86 CPUs from the last 10 years. https://developer.blender.org/T76259

@JuanGea has the Bone Master build, which, if I remember correctly, has a prototype implementation of the Intel denoiser for the viewport.
Windows: https://blender.community/c/graphicall/Mlbbbc/
Linux: https://blender.community/c/graphicall/djbbbc/

Yep, we have the first implementation of OIDN for the viewport that @StefanW did :slight_smile:

But keep in mind that it only works when you use the CPU as the render device; if you use a GPU (no matter whether it’s Nvidia or AMD), it does not work, I’m afraid.
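
So, as a hedged sketch (again assuming the 2.90+ property names), enabling it only when the CPU is the render device might look like this:

```python
import bpy

scene = bpy.context.scene

# OIDN viewport denoising currently only works with CPU rendering,
# so guard on the render device before enabling it.
if scene.cycles.device == 'CPU':
    scene.cycles.use_preview_denoising = True
    scene.cycles.preview_denoiser = 'OPENIMAGEDENOISE'
```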

OIDN for the viewport will be awesome. I definitely prefer OIDN over OptiX in terms of image quality. I suspect on a modern CPU it won’t be that much slower than OptiX, but I’ll be interested to benchmark it once it’s committed.

Also, it looks like with the change @brecht is doing, it will be part of the render process as well, which is really nice.

Well, I can say that while it’s fast, on a 2990WX it’s not as fast as OptiX on a GTX 1080 (which is not an RTX card). I don’t know whether that can be optimized and improved, I have no idea :slight_smile:

I’ll have to do some benchmarking across my systems to get a sense of how OIDN scales with multiple cores. Maybe it works better with high frequency and fewer cores? It would be nice if the performance could be optimized, but the best solution would be an OpenCL denoiser, or OIDN adapted to work with OpenCL, so that it would be fast on any OpenCL GPU.
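
For a rough benchmark, something like timing full renders at different fixed thread counts might do. Note this measures the whole render rather than the denoise step in isolation, and OIDN’s internal threading may not honor Blender’s thread setting, so treat it as a ballpark sketch only:

```python
import time
import bpy

scene = bpy.context.scene
scene.render.threads_mode = 'FIXED'

# Time a full render at each thread count as a rough proxy
# for how denoising scales with cores.
for threads in (4, 8, 16, 32):
    scene.render.threads = threads
    t0 = time.perf_counter()
    bpy.ops.render.render(write_still=False)
    print(f"{threads} threads: {time.perf_counter() - t0:.1f}s")
```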

Oddly enough, I found a post you made back in January asking a similar thing about RadeonProRender :slight_smile:

Yep, but people who tested it told me that the results were far from optimal compared to the OptiX denoiser and OIDN; I’m not sure about its current status or results.

AMD RDNA 2 looks like it will be very fast and will include hardware ray tracing. What’s not known yet is the software side. I’m hoping they realize the importance of ray tracing beyond video games and provide a solid OpenCL-based denoiser (and render acceleration). However, it might be proprietary like OptiX. I guess we’ll find out in September.

The denoiser doesn’t use the ray tracing part of the hardware, the RT cores, but the AI part of it, the tensor cores. I imagine RDNA 2 will also include some AI/ML-oriented hardware.

Yep, we’ll find out when RDNA2 is released :slight_smile:

Where can I find this file?

Non-RTX cards are already supported in 2.90 and 2.91; no need for you to modify any file :slight_smile:

I use a GTX 1080 + GTX 1070 together as CUDA devices on the same PC. I tried my GPUs as OptiX devices and they rendered OK, with no difference in render time, say 3 minutes for a given image…
And now that we can use GTX GPUs as OptiX devices, what if I buy an RTX 2060 Super and render with GTX + GTX + RTX as OptiX devices? Will the 2060S benefit from its tensor cores in that mix?

The 2060S will use the RT cores, yes. Those are not the same thing as the tensor cores, by the way. Tensor cores are not used in Cycles as far as I know, with the possible exception of the OptiX denoiser.
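
For what it’s worth, enabling all cards as OptiX devices from Python looks roughly like this. A sketch, assuming the device API in recent builds (get_devices() refreshes the detected device list):

```python
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = 'OPTIX'
prefs.get_devices()  # refresh the detected device list

# Enable every OptiX-capable GPU (GTX and RTX alike in 2.90+).
for dev in prefs.devices:
    dev.use = (dev.type == 'OPTIX')
    print(dev.name, dev.type, "enabled" if dev.use else "disabled")
```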

Well, I’ve been trying for the past hour to get my 1080 Ti to work with a 2070 Super; all I get is Blender becoming unresponsive, a crash to desktop, or some error code.

You should use the latest drivers, and right now it’s better if you use Blender 2.92. Are you using those two?

Yeah. I did a clean driver install, restarted, and tried with 2.92 and the experimental builds. Are you saying you have managed to make an RTX and a GTX card work at the same time?