Blender 2.9 CUDA on Mac

Wooooaahhh it’s working now! It was weird, I didn’t change anything in the settings for the Cycles Add-on menu. I guess it just wanted me to look at it, then it worked? Computers… so fickle.

The viewport denoising works really well. I loaded up a big scene I’ve been working on and it’s surprisingly responsive. In fact, loading Cycles in the viewport felt instant, much different from 2.83. Interesting! Although both the viewport render and the image viewer render crash if I throw too much geometry or too many textures at them, even if 2.83 handles the same scene fine. Bound to happen with this sort of setup, I suppose. But nonetheless, this is amazing! Thanks again for sharing. It’ll be interesting to see how long this patch keeps working on future versions of Blender.


Happy to know you got it working! Pass the word on: when I was searching for a solution to GPU acceleration on Mac, I saw many posts from people upset by the drop of Cycles GPU support in recent Blender releases…

As for the longevity of the patch, it doesn’t look like Cycles’ use of CUDA is about to stretch beyond the tools available on High Sierra anytime soon. The issue is more the general decline of support for everything else, as you could see with Homebrew. And with Apple dropping OpenGL and using Metal while others go with Vulkan etc., it’s hard to see what’s coming in the near future. I don’t see myself able to keep working off optimised hackintosh hardware for very long (it’s not that Apple machines aren’t good, but a custom-built machine is just better).


If I meet anyone else in this situation, I’ll definitely tell them about this!

I plan on using my HS system for as long as I can. Working in macOS is so nice, I couldn’t imagine switching to anything else. I guess we’ll see what the next few years bring in terms of Apple’s new hardware & Cycles for Metal/Vulkan/etc… who knows what’s gonna happen there!

I’m a regular user, not a developer, but I really want to use the new version with my GPUs.
Does this procedure work only on a hackintosh, or on native Macs too?

Would you help me? Do I need to install Xcode?

I was unable to complete the DMG installation process following Bruno Wego’s method.

I only managed to download the DMG:
cuda_10.2.89_mac.dmg

At that point I got scared and came to ask here before doing something wrong and ruining everything!

I’m pretty sure you do need Xcode installed. I have version 10.1; you can download it from Apple’s developer website.

Although I don’t think you’d need Xcode to install the CUDA toolkit… do you have wget installed?
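Either wget or curl will do for the download itself. However the DMG arrives, it’s worth checking the file against the checksum NVIDIA publishes on the download page before running the installer; a minimal Python sketch (the expected checksum isn’t filled in here, copy it from the vendor page):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 so a large DMG never loads fully into RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, expected_hex):
    """Compare a download against the checksum published on the vendor page."""
    return sha256_of(path) == expected_hex.lower()
```

If `verify("cuda_10.2.89_mac.dmg", "<checksum from the download page>")` returns False, re-download rather than installing.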

What’s your hardware/OS configuration?
If it’s close enough to my machine I can ship you a binary that may work. Otherwise you’ll need to set up your machine for building Blender… it’s just a few steps, but I’ve got 30 years of compiling experience on the platform (NeXTSTEP 0.8…) so I’ve lost track of the learning curve required at each step.

Hi Hugo, first of all thank you

This is a Mac Pro 5,1 (Mid 2012) - 2x 3.46 GHz
High Sierra 10.13.6
Nvidia Driver 387.10.10.10.40.138

I installed Xcode and cuda_10.2.89, but no CUDA tools show up in Applications

Thank you very much for your help !!

Hi Hugo,
I have the latest Big Sur 11.2 installed. I don’t know whether your compiled binary will work on my machine, but if you can share it I can give it a try?
Thanks

The main components are looking good. If you want to go further you’ll need to get Xcode for the basic dev tools, then build the missing dev tools from source that you’d normally install with Homebrew. You’d do that by fetching each missing dev tool (actually, is there more than cmake? git?), then make + make install or equivalent.
And then you should be good to build Blender.
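Before starting, a quick way to check which of those tools are already on the PATH; the list below is my guess at the usual suspects, adjust it to whatever the build actually complains about:

```python
import shutil

# Tools the Blender build normally expects; on a bare High Sierra box
# these are the ones most likely to be missing (list is an assumption).
REQUIRED_TOOLS = ["git", "cmake", "make"]

def missing_tools(tools=REQUIRED_TOOLS):
    """Return the subset of tools not found on the PATH."""
    return [t for t in tools if shutil.which(t) is None]
```

Anything `missing_tools()` reports, e.g. `["cmake"]`, is what you’d build from source first.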

Alternatively, if you can specify the exact model of GPU you have, I can put a binary of Blender compiled for your GPU somewhere. I’d expect it to work out fine given how similar our basic setups are.

Everything I’ve read about NVIDIA support on Mac says it stopped at 10.13.6. So given how low-level and fragile the GPU support stuff is, I’m guessing there’s a 99.99999% chance it won’t work on 11.2 (you’ll also need the NVIDIA binaries to work beside the Blender binary).
Any way you can move back to 10.13?

I’d probably need to stay up to date with Apple, although the latest updates were breaking several apps :frowning:
Currently I’m not doing a lot of rendering, as most of my work is modeling, but as soon as I start rendering again the current setup won’t work for me.

I don’t really know what I should do. I’ve started looking at eGPU options, but even those are very expensive, and if the problem is in the operating system then that would be a dead end. My other option would be to repartition my Mac and install Windows on another partition. What do you think? Should I go with that option, as it seems to be the cheapest one currently?

Hardware Overview:

Model Name: Mac Pro
Model Identifier: MacPro5,1
Processor Name: 6-Core Intel Xeon
Processor Speed: 3,46 GHz
Number of Processors: 2
Total Number of Cores: 12
L2 Cache (per Core): 256 KB
L3 Cache (per Processor): 12 MB
Memory: 64 GB
Boot ROM Version: 144.0.0.0.0
SMC Version (system): 1.39f11
SMC Version (processor tray): 1.39f11
Serial Number (system): C07K509CF4MH
Serial Number (processor tray): J530400F5BH8C
Hardware UUID: 55920A08-00B8-5BC9-A355-62973637A5FD

System Software Overview:

System Version: macOS 10.13.6 (17G14019)
Kernel Version: Darwin 17.7.0
Boot Volume: MACPRO
Boot Mode: Normal
Computer Name: MACPRO
User Name: Hdalio (hdalio)
Secure Virtual Memory: Enabled
System Integrity Protection: Disabled
Time since boot: 4 days 9:47

If you just want the GPU for renders, then you should consider using a cloud system, e.g. AWS EC2 P2 instances. It’s pretty easy to set up, and also pretty inexpensive. I budget 8-12 USD per minute of animation (24 fps) for something that takes the 1050 Ti in my Mac 10-20 min per frame to render; the actual render time on the cloud depends on how many instances are running in parallel, which is perfect for meeting tight deadlines. Alternatively, if you already bought an expensive GPU, you can put it in a “cheap” Linux box and get a cheaper GPU to support editing / live preview with Eevee. That’s what I did initially, but now for any serious rendering I use AWS, and just use the Linux box for testing new code or setups that will end up on the cloud machines.
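The budgeting above is simple arithmetic; here’s a sketch of it in Python (the per-instance hourly price is a made-up placeholder, check current EC2 pricing):

```python
import math

def render_wall_clock_hours(num_frames, minutes_per_frame, instances):
    """Wall-clock time when frames are split evenly across identical instances."""
    frames_each = math.ceil(num_frames / instances)
    return frames_each * minutes_per_frame / 60.0

def render_cost_usd(num_frames, minutes_per_frame, instances, usd_per_instance_hour):
    """Total spend: every instance is billed for the whole wall-clock duration."""
    hours = render_wall_clock_hours(num_frames, minutes_per_frame, instances)
    return hours * instances * usd_per_instance_hour
```

For example, a 10-second shot at 24 fps (240 frames) taking 15 min/frame comes down from 60 hours on one instance to 15 hours of wall clock on four, for roughly the same total instance-hours billed.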

What is your GPU model?

GPUs:

Nvidia GTX Titan XP (Pascal)

Nvidia GeForce GTX 1080 Ti

OK, I’ve made a binary of 2.93.0 alpha with CUDA compute capability 6.1 support (i.e. matching the Titan XP (wow!) and the 1080).

Note on the compilation: 2.93 uses Python 3.9, which in turn uses “__isPlatformVersionAtLeast”, a post-10.13 symbol, so I had to implement that from scratch. I tried a few scenes with moderate complexity and nothing crashed, so I’m thinking it’s all good; but given that, plus the software being in alpha state, save often just in case.

File blender.2.93.0_alpha.dmg (248 MB) is at: https://drive.google.com/file/d/1JqyP9L9YKXQmbhFxLWdPs8xAQozIzwYo/view?usp=sharing
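“CUDA 6.1” here means compute capability sm_61, which all the Pascal cards in this thread share; that’s why a single build covers both GPUs. A tiny lookup to illustrate (the table only covers the cards mentioned here, it is not exhaustive):

```python
# Compute capability per GPU model. A CUDA kernel built for sm_61 runs on
# any card of that capability, so one Blender binary serves all three.
COMPUTE_CAPABILITY = {
    "TITAN Xp": "sm_61",
    "GeForce GTX 1080 Ti": "sm_61",
    "GeForce GTX 1050 Ti": "sm_61",
}

def cubin_arch(gpu_name):
    """Return the sm target for a known GPU, or None if it isn't in the table."""
    return COMPUTE_CAPABILITY.get(gpu_name)
```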


I am very grateful, Hugo. Thank you very much!!
Do I copy the Blender file to Applications?
And the other files?

[screenshot of the downloaded files]

Pleasure! I think you only need the Blender folder/app, but I put the rest of the stuff just in case…
Let me know if it is running!

Hugo, I copied Blender over, replacing the old file, and everything opened OK, but CUDA doesn’t appear in the System section. Do I need to reset some preferences or files from the old version?


It’s in the Add-ons section, way down in the Cycles options: there’s a CUDA tab, in which you should see (and activate) the graphics cards.
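The same setting can also be flipped from Blender’s Python console; a hedged sketch using the Cycles preferences API as it exists in 2.9x (the pure helper at the top works anywhere, the bpy part only runs inside Blender):

```python
def select_cuda(devices):
    """Mark CUDA devices for use and switch everything else off.
    Works on any objects exposing .type and .use attributes."""
    for d in devices:
        d.use = (d.type == "CUDA")
    return [d for d in devices if d.use]

try:
    import bpy  # only available inside Blender
except ImportError:
    bpy = None

if bpy is not None:
    # Same thing as ticking the cards in the CUDA tab, but scriptable.
    prefs = bpy.context.preferences.addons["cycles"].preferences
    prefs.compute_device_type = "CUDA"
    prefs.get_devices()  # refresh the detected device list
    print(select_cuda(prefs.devices))
```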
