This is my workaround for getting OptiX to work on my GTX 1060 on Windows 10 64-bit. It might not be the recommended way, but I'm not on Linux and had to figure out how to get it working myself, so you're welcome to try it out.
I keep experimental releases inside "C:\dev\Blender\Blender Portable". Extract your experimental build and rename the folder to something simple, "Blender Portable" in my case.
Create a shortcut to 'cmd.exe': copy 'cmd.exe' from "C:\Windows\system32" and paste it as a shortcut into "C:\dev\Blender".
Go into the shortcut's properties. In the Target field you'll see "C:\Windows\system32\cmd.exe"; extend it so that it sets the CYCLES_OPTIX_TEST environment variable and then launches blender.exe (see the example below).
Rename "cmd.exe - Shortcut" to something like "OPTIX TEST" and simply run it.
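For reference, this is roughly what the full Target line can look like with my folder setup; adjust the Blender path to wherever you extracted your build:

```
C:\Windows\System32\cmd.exe /c "SET CYCLES_OPTIX_TEST=1 && START "" "C:\dev\Blender\Blender Portable\blender.exe""
```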
Blender Preferences -> System -> OptiX (your GPU should now be listed when you select OptiX instead of CUDA).
Remember that OptiX will only work when you run Blender from this shortcut. If you followed my directory setup, you can download the latest experimental builds, extract them, and rename the folder to "Blender Portable", and your "OPTIX TEST" shortcut will keep working without editing the Target properties every single time.
I'm still working out just how much it will speed things up. If I set the tile size equal to the frame size (1500×2000), the frame renders in 4 seconds at 64 samples. However, since the denoising algorithm needs adjacent tiles, it won't denoise and it freezes. If I divide the frame into 4 equal tiles, it renders and denoises in 11 seconds, with the bulk of that spent rendering the tiles. It will be interesting to find out whether the AI denoiser can be adjusted to run on a single tile; going from 11 seconds back down to about 4 would more than double my rendering speed for an animation.
Getting great results with 2 x 1080 Ti using this: a modest rendering speedup vs. CUDA, but IMO the denoising is superior, and viewport denoising is a treat. Would love to see it in the standard options!
Hello, I'd like to use my 2 x 980 Ti with OptiX, but I'm not experienced with changing code. I've tried akshayxw69's "exe" solution, but with no results: Blender doesn't open. :( Could someone please help?
With this commit, 2.90 should now have OptiX enabled for all Nvidia GPUs from the GTX 900 series and higher without any modification to the source code.
I found an even simpler way. Just go to the Blender shortcut's properties and paste this into the Target field: CMD /c "SET CYCLES_OPTIX_TEST=1 && START "" "blender.exe"" (the relative "blender.exe" works because a shortcut's "Start in" field normally points at the folder blender.exe lives in).
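If you'd rather not edit shortcut properties at all, the same trick should also work as a small batch file; this is just a sketch that assumes you save it next to blender.exe:

```
@echo off
rem Make the OptiX device selectable on non-RTX cards (same variable as the shortcut trick)
set CYCLES_OPTIX_TEST=1
rem %~dp0 expands to the folder this .bat file lives in, so blender.exe is expected there
start "" "%~dp0blender.exe"
```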
I don't seem to be able to get the OptiX denoiser working on my 1070; it just says "Cancel" in the top-left corner. Does it require a specific driver? I'm using the latest Studio driver.
Is there a way to do only viewport denoising with OptiX, without switching to OptiX in the system preferences? I have a 1070, which works now, but only when switching away from CUDA. I'd like to keep using CUDA for final renders but OptiX in the viewport.
The OptiX denoiser was designed by Nvidia for Nvidia GPUs. I saw a project someone was working on to convert CUDA code into OpenCL, and in theory the same could be done for OptiX to OpenCL; however, it will probably take a while.
Brecht has created a task for making the Intel denoiser available for the viewport. The plan is to have it implemented in Blender by the 2.90 release, and it should work on most x86 CPUs from the last 10 years: https://developer.blender.org/T76259