When you use CUDA with an RTX card, you cannot use nodes like AO or Bevel in your shaders; those nodes are not yet supported by OptiX, which is understandable. But for some reason you also cannot enable OptiX denoising when those nodes are present. I imagine this is due to a check that halts the OptiX engine, but that check should probably not apply to the denoiser when CUDA is the render engine.
You can try the good old D-NOISE add-on from Grant Wilk; it worked even with GTX cards.
I think it’s a matter of a “support” check in the code, not a technical limitation.
Patrick Mours is the developer in charge of this, right? (I’m not sure if he’s a forum user.)
I wonder if Patrick is aware of this. Perhaps it’s easy to implement, but since development has been oriented more toward OptiX than CUDA, it may simply have been overlooked.
It would be great to be able to use the OptiX denoiser alongside all the features that CUDA supports.