I just want to know whether this doesn’t work on my card because it is a GTX card, whether it may be a bug, or whether the viewport denoiser simply hasn’t been implemented yet for viewport resolutions other than 1x.
If you edit the source you can get a GTX card working with OptiX and the viewport denoiser. I tested it with a GTX 970 (use the latest drivers).
This sounds like a bug.
Where is this file ? You need to make a custom build ?
Thank you. I have reported the problem just in case.
@Angelrebirth: you must build Blender with the modifications indicated in the first message.
Or you look for a build that contains those modifications:
There’s no need for a custom build for Optix on GTX cards anymore.
Cycles/Optix: Add CYCLES_OPTIX_TEST override
Tested with an Nvidia 750 Ti: the viewport denoiser and rendering with OptiX both work, and renders are 10 to 20 seconds faster depending on the scene I try.
To clarify this for people: this isn’t a CMake flag, it’s an environment variable (the picture above looks like a cmake-gui window to me, but maybe it’s a GUI environment-variable editor).
So for example, on my Ubuntu machine, I can set the variable with:
export CYCLES_OPTIX_TEST="all"
Then, when I run Blender from that same console, it will see the environment variable and I’ll be able to use OptiX with my GTX card (works beautifully on a GTX 1080). To make this persistent so that I don’t have to set the variable every time, I can add that export line to my ~/.bashrc.
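Putting the Ubuntu steps together, a minimal sketch looks like this (the Blender launch and the ~/.bashrc line are left as comments; the value "all" is the one used above, and "1" also appears later in this thread):

```shell
# Set the override for the current terminal session.
export CYCLES_OPTIX_TEST="all"

# Blender launched from this same shell will inherit it, e.g.:
# ./blender

# To make it persistent across sessions, append the export line
# to your shell profile:
# echo 'export CYCLES_OPTIX_TEST="all"' >> ~/.bashrc

# Verify it is set:
echo "$CYCLES_OPTIX_TEST"
```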
This works on the build-bot builds, ever since the commit was added.
I hope this helps people who were confused, like me. Thanks @LazyDodo for helping me over at Blender.chat, and for making the commit.
edit: am an idiot.
./blender runs blender
This is just the KDE properties editor for a launcher.
If you want to manually edit a Linux launcher, the “Exec=” line should be:
If there are spaces in the folder name, wrap the path in quotes (or escape each space).
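For reference, a hypothetical `.desktop` launcher along these lines could look like the sketch below. The entry name, the use of `env` to set the variable, and the install path (deliberately one with a space in it, quoted) are all my own examples, not taken from the posts above:

```ini
[Desktop Entry]
Type=Application
Name=Blender (OptiX test)
# "env" sets the variable just for this launch;
# the quoted path handles the space in the folder name.
Exec=env CYCLES_OPTIX_TEST=all "/opt/blender portable/blender"
Terminal=false
```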
Does the latest Blender 2.83 build restrict OptiX to RTX cards only?
Can you give more detail on how to do this? I am new to programming but wanted to get into it with Blender. I wanted to start by enabling OptiX on my Titan V, but I have tried a custom build with the suggestions in this thread as well as the links here:
I’m using 64-bit Windows. I tried compiling in Visual Studio 2019.
This is my workaround for getting OptiX to work on my GTX 1060 on Windows 10 64-bit. It might not be the recommended way, but I’m not on Linux and had to figure it out myself. You are welcome to try it.
Get latest builds from here - https://builder.blender.org/download/
I keep experimental releases inside "C:\dev\Blender\Blender Portable". Extract your experimental build and rename the folder to something simple, like “Blender Portable” in my case.
Create a shortcut of ‘cmd.exe’: copy ‘cmd.exe’ and paste it as a shortcut.
Go into the shortcut’s properties and set the Target field to:
"C:\Windows\System32\cmd.exe /c "SET CYCLES_OPTIX_TEST=1 && START /D ^"C:\dev\Blender\Blender Portable^" blender.exe""
Rename “cmd.exe - Shortcut” to something like “OPTIX TEST” and simply run it.
Blender Preferences -> System -> OptiX (your Titan V should now be listed when you select OptiX instead of CUDA).
Remember that OptiX will only work when you run Blender from this shortcut. If you followed my directory setup, you can download the latest experimental builds, extract them, and rename the folder to “Blender Portable”; your “OPTIX TEST” shortcut will then work without editing the Target properties every single time.
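If you would rather not edit shortcut properties at all, the same launch can be captured in a small batch file (a sketch only; the folder path matches the example layout above, so adjust it to your own setup):

```bat
@echo off
rem Set the override for this process only, then launch Blender
rem from the portable folder described above.
set CYCLES_OPTIX_TEST=1
start "" /D "C:\dev\Blender\Blender Portable" blender.exe
```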
This WORKED! Thanks so much for taking the time to answer my question. At first I had a “side-by-side” configuration error, which I fixed by reinstalling the latest Visual Studio C++ redistributable package.
I’m still working out just how much it speeds things up. If I set the tile size equal to the frame size (1500×2000), the frame renders in 4 seconds at 64 passes; however, since the denoising algorithm needs adjacent tiles, it won’t denoise and it freezes. If I divide the frame into 4 equal tiles, it renders and denoises in 11 seconds, with the bulk of that spent rendering the tiles. It will be interesting to figure out whether the AI denoiser can be made to run on a single tile and see what the results are; that would more than double my rendering speed for an animation.
Getting great results with 2x 1080 Ti using this: a modest rendering speedup vs. CUDA, but IMO the denoising is superior, and viewport denoising is a treat. Would love to see it in the standard options!
Thanks, This worked.
Hello, I’d like to use my 2x 980 Ti with OptiX, but I’m not experienced with changing code. I’ve tried akshayxw69’s “exe” solution, but with no results; Blender doesn’t open :( Could someone please help?
You can always use our build:
https://blender.community/c/graphicall/kdbbbc/ (still not the release one, I have to update it, but close enough)
https://blender.community/c/graphicall/Mlbbbc/ (from yesterday, but be aware that there is a big bug being hunted right now)
With this commit, 2.90 should now have OptiX enabled for all Nvidia GPUs from the GTX 900 series and higher without any modification to the source code.
Great! Rendering is almost as fast as in Eevee :)
I found an even simpler way. Just go to the Blender shortcut’s properties and paste this into the Target field:
CMD /c "SET CYCLES_OPTIX_TEST=1 && START "" "blender.exe""