Blender 2.8: Cycles OptiX on a non-RTX card

This is my workaround for getting OptiX to work on my GTX 1060 on Windows 10 64-bit. It might not be the recommended way, but I'm not on Linux and had to figure out how to get it working on my own, so you are welcome to try it out.

Get the latest builds from here: https://builder.blender.org/download/

I keep experimental releases inside "C:\dev\Blender\Blender Portable". Extract your experimental build and rename the folder to something simple, "Blender Portable" in my case.

Create a shortcut to 'cmd.exe': copy 'cmd.exe' from "C:\Windows\System32" and paste it as a shortcut in "C:\dev\Blender".

Go into the shortcut's properties; in the Target field you'll see "C:\Windows\System32\cmd.exe"

Replace with "C:\Windows\System32\cmd.exe /c "SET CYCLES_OPTIX_TEST=1 && START /D ^"C:\dev\Blender\Blender Portable^" blender.exe""

Rename "cmd.exe - Shortcut" to something like "OPTIX TEST" and simply run it.

Blender Preferences -> System -> OptiX (your Titan V should now be listed when you select OptiX instead of CUDA).

Remember that OptiX will only work when you run Blender from this shortcut. If you followed my directory setup, you can download the latest experimental builds, extract them, and rename the folder to "Blender Portable"; your "OPTIX TEST" shortcut will then keep working without you editing the Target properties every single time.
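If you prefer a batch file over a shortcut, the same thing can be done in a few lines of cmd. This is a minimal sketch assuming the folder layout above (the file name "optix_test.bat" is just an example; save it anywhere and double-click it):

@echo off
REM Tell the experimental Cycles build to offer OptiX on non-RTX cards.
SET CYCLES_OPTIX_TEST=1
REM Launch the portable build; /D sets blender.exe's working directory.
START "" /D "C:\dev\Blender\Blender Portable" blender.exe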

This WORKED! Thanks so much for taking the time to answer my question. At first I had a "side-by-side" configuration error, which I fixed by reinstalling the latest Visual C++ redistributable package.
https://support.microsoft.com/en-us/help/2977003/the-latest-supported-visual-c-downloads

I'm still working out just how much it will speed things up. If I set the tile size equal to the frame size (1500 x 2000), the frame renders in 4 seconds at 64 samples. However, since the denoising algorithm needs adjacent tiles, it won't denoise and it freezes. If I divide the frame into 4 equal tiles, it renders and denoises in 11 seconds, with the bulk of that spent rendering the tiles. It will be interesting to see whether the AI denoiser can be made to run on a single tile; that would more than double my rendering speed for an animation.
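If anyone wants to repeat this kind of tile-size experiment without clicking through the UI, the tiles can be overridden per run from the command line. A rough sketch, assuming the portable build from above and a hypothetical scene.blend (render.tile_x/tile_y are the 2.8x tile settings; -b renders in the background, -f 1 renders frame 1):

SET CYCLES_OPTIX_TEST=1
"C:\dev\Blender\Blender Portable\blender.exe" -b "scene.blend" --python-expr "import bpy; s = bpy.context.scene; s.render.tile_x = 1500; s.render.tile_y = 2000" -f 1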

Getting great results with 2 x 1080 Ti using this: a modest rendering speedup vs. CUDA, but IMO the denoising is superior, and viewport denoising is a treat. Would love to see it in the standard options!

Thanks, this worked.

Hello, I'd like to use my 2x 980 Ti with OptiX, but I'm not experienced with changing code. I've tried akshayxw69's "exe" solution, but with no results; Blender doesn't open. :( Could someone please help?

You can always use our build:

2.83:
https://blender.community/c/graphicall/kdbbbc/ (still not the release one, I have to update it, but close enough)

2.90:
https://blender.community/c/graphicall/Mlbbbc/ (from yesterday, but be aware that there is a big bug being hunted right now)

With this commit, 2.90 should now have OptiX enabled for all Nvidia GPUs from the GTX 900 series and higher without any modification to the source code.

Awesome news! 🙂

Great! Rendering is almost faster than in Eevee. :)

Thank you

I found an even simpler way. Just go to the Blender shortcut's properties and paste this in the Target field:
CMD /c "SET CYCLES_OPTIX_TEST=1 && START "" "blender.exe""
(The empty "" after START is needed because START treats the first quoted argument as the window title; the shortcut's "Start in" folder has to be the Blender folder so that "blender.exe" is found.)
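If the "Start in" folder points somewhere else, the same one-liner should also work with a full path; a sketch using the folder layout from earlier in the thread:

CMD /c "SET CYCLES_OPTIX_TEST=1 && START "" "C:\dev\Blender\Blender Portable\blender.exe""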

Not needed anymore as of tonight's dailies.

Hi there, is it possible to do this from the terminal on macOS?
Thanks

I don't seem to be able to get the OptiX denoiser working on my 1070. It just says "Cancel" in the top left corner. Does it require a specific driver? I'm using the latest Studio driver.

Oh, perhaps it's related to one of my shaders… the Light Path node?

error:

OptiX implementation does not support shader raytracing yet

The OptiX backend doesn't support the Bevel and AO shader nodes at the moment.

Ah, thanks, it must be the Bevel then.

I wonder if it would be possible to have the error message list the culprit materials/nodes?

Is there a way to do only viewport denoising with OptiX, without switching to OptiX in the system preferences? I have a 1070, which works now, but only when changing away from CUDA. I'd like to keep using CUDA for final renders but OptiX in the viewport.

I'm using the method mentioned by Mazay (on 2.83), but even when I select CUDA in Preferences, "AI Denoising" is still available and works fine.

Any possibility of making it work on AMD OpenCL?

The OptiX denoiser was designed by Nvidia for Nvidia GPUs. I saw a project someone was working on to convert things like CUDA code into OpenCL, and in theory the same could happen for OptiX, but it will probably take a while to get done.

Brecht has created a task for making the Intel denoiser available in the viewport. The plan is to have it implemented in Blender by the 2.90 release, and it should work on most x86 CPUs from the last 10 years. https://developer.blender.org/T76259

@JuanGea has the Bone Master build, which, if I remember correctly, has a prototype implementation of the Intel denoiser for the viewport.
Windows: https://blender.community/c/graphicall/Mlbbbc/
Linux: https://blender.community/c/graphicall/djbbbc/
