What does Apple Mac switching to custom ARM mean for Blender?

It's wonderful news! At this point it is only a matter of time until we have both Godot and Blender running on high-performance Vulkan!

I strongly doubt that Blender will work at all under emulation (or that running it that way would make sense): I have never seen emulation that runs faster than native code, and I doubt that ARM chips will just be magically faster. If you really believe that, you have bought into too much marketing blabla and synthetic Geekbench benchmarks that mean nothing. Blender is a high-performance tool; it is never fast enough, and emulating it would be insanity.

How the hell do you quote in this strange forum???

quote:

I was wondering: Could this transition to ARM favor better porting of Blender on Raspberry PI 4? The latest version of the PI has 8 GB on board and it would be great to bring a Pocket Blender always ready for use (I think of school students, above all)

That does not make any sense: a RPi 4 is waaaaaaaaaaaay too weak to be of any use for running Blender. Even normal systems can be too slow, and a Raspberry Pi is exactly the wrong low-performance system for that.

2 Likes

With OpenGL the Raspberry Pi 4, as you wrote, is not able to run Blender sufficiently well. But couldn't adopting Vulkan change that scenario? Do you have any recommended single-board computer (SBC) solution for Blender?

The news I have heard is that they will keep OpenCL and OpenGL, even on ARM. I figure that after trying to push every dev to use Metal and port all their software (very expensive), they noticed that on macOS, with its tiny market share of only around 10%, nobody really cares, and many software packages would be dropped and cancelled if that were to happen. They already destroyed a ton of software when they wiped out all 32-bit compatibility (pretty much all the games that existed on macOS and Steam). This is, after all, not open source, where you can just take the source code and recompile it; you have to convince each and every company to make the effort, also for older software.

So now they have realized that they are actually not in a strong position (like on iOS) but a very weak one: the majority does not care about macOS. 90% of all users worldwide chose PCs with Windows or Linux, and they know why; that has been the case for as long as Apple has existed.

Now, I have seen the presentation and was surprised to see the name Blender on a slide, but they did not say a word about it. I'm sure if the Blender Institute saw the slide, they heard Apple say "and Blender will be ported to ARM" and said "we are?", hearing about it for the very first time. :wink:

To me the presentation looked very weak (which is of course natural, given the early stage of development): they just threw a lot of names around, without facts behind them. Showing ARM Linux in a VM was also pretty weak, because Apple has shown a clear track record of not caring about Linux; they showed Debian in the VM because they had nothing else to show. I'm a big Linux user and my specialty is Debian, but I know how Apple works: if it is not making them billions, they really don't care about it. They are right now the greediest company on the face of the earth.

Conclusion:
They know what happens if they kill OpenCL and OpenGL, so they won't, because it would nuke all that is left of the software still available for macOS. Think of it in set-theory terms, as a Venn diagram: take the intersection of all the apps that are available in 64-bit (the 32-bit apps have already been nuked), the apps that will be recompiled for ARM64, the apps that got ported from OpenCL to Metal Compute, PLUS the ones whose OpenGL has been ported to Metal. That intersection won't contain many apps at all.
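The intersection argument above can be sketched with Python sets; the app names and catalogs here are made up purely for illustration, not real data:

```python
# Hypothetical app catalogs, for illustration only: each set is the group
# of apps that survives one of Apple's transitions.
apps_64bit = {"AppA", "AppB", "AppC", "AppD"}   # survived the 32-bit purge
apps_arm64 = {"AppA", "AppB", "AppD"}           # recompiled for ARM64
apps_metal_compute = {"AppA", "AppD"}           # ported OpenCL -> Metal Compute
apps_metal_gl = {"AppA"}                        # ported OpenGL -> Metal

# Only apps that clear EVERY hurdle remain available on the new Macs.
survivors = apps_64bit & apps_arm64 & apps_metal_compute & apps_metal_gl
print(survivors)  # -> {'AppA'}
```

Each extra requirement can only shrink the surviving set, which is the whole point of the Venn-diagram argument.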

With opengl the Raspberry 4, as you wrote, is not able to run Blender sufficiently. But adopting vulkan could not change the scenarios? Do you have any recommended single-board computer (SBC) solution for blender?

Vulkan is not going to make a difference. It is more efficient, yes. But new tires on a vespa do not a 40 ton truck make. :wink:
No, none. SBCs are too weak; even laptops can be too weak for Blender. It of course depends on what you want to do with it: if you just model things, that is fine, but rendering will be a problem, because the CPU will be way too weak and the GPU practically nonexistent (and it won't support the usual number-crunching frameworks like CUDA (Nvidia) or OpenCL (AMD)).
You also have to consider the RAM, or the lack thereof (a GPU needs as much RAM as the scene requires). The ARM chip in the RPi 4 is way too weak for rendering. If Blender Benchmark ran there, I could easily demonstrate that a scene you can render on an Intel i9 in 30 minutes would take 20 hours on a RPi 4. We will get that chance, because I'm sure Blender and Blender Benchmark will get ported to ARM, but it might not happen within the next few years.
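The 30-minutes-versus-20-hours claim above is a back-of-envelope estimate, not a measurement; the implied slowdown factor is easy to make explicit (the 40x figure below is an assumption derived from those two numbers, not a benchmark result):

```python
# Back-of-envelope scaling of render time; the slowdown factor is a guess,
# not a measured benchmark.
i9_render_min = 30    # scene renders in 30 minutes on an Intel i9
rpi4_slowdown = 40    # assumed: RPi 4 roughly 40x slower at rendering
rpi4_render_h = i9_render_min * rpi4_slowdown / 60
print(rpi4_render_h)  # -> 20.0 (hours)
```

The exact factor would vary per scene, but any plausible multiplier puts an SBC render into the many-hours range.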

I wanted to link to Blender Benchmark, but you are only allowed to use 2 links max :wink: LOL
I didn't know links were expensive; you can easily find it by searching for Blender Benchmark on Google or DuckDuckGo.

(Talking about RAM: I tried to load one of the open-movie scenes (Cosmos Laundromat: First Cycle) into my laptop, which has 16 GB of RAM. It overfilled the RAM and crashed the laptop :wink: All that amazing glitz takes a lot of power.)

The Blender Institute has much more important things to do than fix the invented and arbitrary problems du jour that Apple creates in its quest to make its users' lives as miserable as possible, for example wiping out tons of software that still worked in the previous release of macOS.

They actually focus on providing bugfixes and new useful features.
Blender 3D is a special tool, a very powerful and super complex 3D renderer that demands high specs from a machine to work properly. The RPi 4 is not in that category, not even close.

All the SBCs I know are very low-powered; they serve purposes other than the high-performance number crunching that drives 3D rendering. IoT devices need to save power and possibly run on a battery.

1 Like

https://www.amd.com/en/products/embedded-minipc-solutions

Great and very clear explanation! :wink: As an Italian I really liked the comparison with the Vespa! :smile: However, rendering on an SBC is far from my intention. In a school environment it would only be used for basic modeling and texturing, with simple, uncomplicated objects to be completed in a few hours of work. The AMD solution mentioned by @megalomaniak would be too expensive for the kids.

When procured individually, yes. For a larger school procurement in bulk (say, for multiple schools in a state or city) it would likely come with some discounts worked out.

At present, all I know is that Nvidia's TEGRA-series development boards have desktop OpenGL and can run Blender normally.

Interesting. But Tegra works mainly with Android OS. I know that Nvidia has created Linux drivers but, to my knowledge, there is no distro specifically dedicated to Tegra. Weighing pros and cons, to avoid headaches, a classic mini-PC with an Intel or AMD chipset and a Linux OS would be more appropriate for schools.

1 Like

Have you never heard of a project called "Linux for Tegra" (L4T)? It is official from NVIDIA.

I've heard of it, but isn't it a Linux platform for Tegra developers? Is it comparable to a distro like, for example, Linux Mint? I would be able to install it, but I don't think school technical assistants could do it easily, unless they are "nerds". Just to clarify: the technical situation in Italian schools is not the best compared to others in Europe.

Addendum: I found this video about the Nvidia Jetson Nano and Blender

It is just an image file, burned to an SD card; the distro is Ubuntu.

1 Like

Yes, very similar to the Raspberry Pi. I'll be honest: Tegra intrigues me. I will buy a Jetson Nano to test it. Thanks so much for mentioning it.

Apple has already deprecated OpenGL and OpenCL, and IMO is not likely to support them beyond the first version or two of macOS 11 (11.0 and 11.1). From what I gather, Metal has matured quite a bit in the last few years and has the capability to replace both. Yes, it means developers will have to put in some extra work, but many are starting to / have already done that: Adobe, Serif, Blackmagic, RED, now Otoy, and others. Albeit in the 3D space it's not pretty, I grant.

But it's clear Apple has done some serious work on Metal over the last couple of years, and on documenting it; IMO, for now, no one should question their commitment. AFAICT this is happening whether we like it or not. OpenCL, meanwhile, in its latest revision is touting reversion as innovation: OpenCL 2.0 was so poorly adopted that the main feature of OpenCL 3.0 is making it 1.2 again. : )

https://www.anandtech.com/show/15746/opencl-30-announced-hitting-reset-on-compute-frameworks

So I'm not averse to criticizing Apple; they have a history of changing APIs too frequently and sometimes arbitrarily, but this isn't one of those cases. OpenCL, from what I understand, was somewhat lacking as an API even in its earlier versions, and lacking in robust documentation. And what it was originally intended to do (displace the proprietary CUDA) it was never going to do, because its pace of development didn't match NVIDIA's, and on top of that big green shipped a poor implementation of OpenCL-on-NVIDIA, which was supposed to be a key to "everyone using it". Naturally, game developers and others kept using CUDA, because their stuff worked better on NVIDIA hardware when they did. At least that's my limited understanding of the whole years-long snafu. I won't get into the Apple vs. NVIDIA aspects; it's not worth thinking about any longer, except to say you had two titans (no pun intended) that were not going to meet each other halfway. Pure ego-war.

NVIDIA wants to be dominant, IMO. Anything that relies on them being "good open-source players" is doomed to failure, because they want their APIs, not open APIs, to be the ones everyone uses. Not that Metal is any less proprietary, but Apple was coming at it from a position of their GPU and graphics offerings lagging badly. If they didn't do something drastic, they were going to completely lose out in this space, because more creative apps, more games, more of everything is now leveraging the GPU, and the trend is increasing. So how do you make everything competitive with AMD as your hardware platform (short term)? Roll your own APIs and frameworks (Metal, CoreML, etc.) and your own drivers, and (ultimately) remove your dependencies on everyone who is not Apple. Then they just went a step further with Apple Silicon and Apple GPUs, so that will be interesting. Now I'm not even sure what AMD's incentive is to work with Apple beyond the next 2 years. lol Messed up, but never dull.

Who knows maybe in 2 years Apple will have the biggest, baddest GPU on the market but I’ll believe it when I see it. : )

2 Likes

Blender developers need to start working on a touch interface.

I think the question everyone is asking here is whether we are also going to get an official version of Blender for iPad.

Unless something changed recently, this is something Apple prevents by not allowing apps to include their own scripting languages.

1 Like

I didn't make this up; they seem to be holding on to OpenGL and OpenCL, which is wise, because Metal has not been a success on its target system (macOS):

https://developer.apple.com/documentation/xcode/porting_your_macos_apps_to_apple_silicon
You will notice that they mention OpenGL on Apple Silicon.

Yes, they might have done some work on Metal, but that is not the point. Even if it were the most efficient graphics system on the planet, Macs have a TINY TINY market share, only around 10%; 90% of the world runs on PCs, and even with Apple's brilliant marketing and screaming louder than the other 90%, they still don't define tech policy in that space worth a damn. They do on the iOS market, because in the USA (and only that country) Apple has 52% market share. There they can force everybody to use Metal (which they did), and of course in the mobile space everybody will bow down, fall to their knees and obey.
But mobile apps are often jokes with much-reduced functionality compared to their full-fledged, powerful PC/Mac versions. No serious workstation would dream of working with only 4 GB, like some mobile devices do.

The point about Metal adoption is that even though it is more efficient, it is also highly proprietary vendor-lock-in technology that is supported nowhere but on Apple platforms. Take the 10% macOS market share and estimate how many of those users do 3D, maybe 5%; 5% of 10% is 0.5%. Rewriting a piece of software on a completely new graphics layer (and not just OpenGL but also OpenCL, moving it to Metal Compute, which practically nobody in that industry has ever heard of) will likely cost more than such a tiny market will ever yield back. Some companies stopped supporting macOS when Apple threw out the baby with the bathwater and nuked all 32-bit support. Blender, for example, will not work on the Mac anymore once OpenGL gets nuked. The announcement of Metal was many years ago, and OpenGL has been deprecated ever since; I was amazed that they STILL had it, and now hearing Apple announce that Apple Silicon will still have it makes it clear to me: Metal has by far not been the success Apple wanted. Apple, arrogant from its iOS and iPhone success, pretends it is just as successful in the big-boy world of laptops and desktops, but it seems companies aren't migrating to Metal. In the case of Blender, it is open source and gets its money from donations and fundraisers; they won't waste money on something that Apple arbitrarily pulled out of its posterior. The fact is that Blender works very efficiently on OpenGL on all platforms; Metal is not needed and just adds a horrible amount of work. That work might be justified IF Metal were a new international standard, like Vulkan, BUT Metal only exists on 10% of all machines and would be used by far fewer than that. Even now, Blender on macOS has fewer users than on any other OS; even Linux has 10 times more users, and that is thanks to Apple being a complete jerk:

They tried to kill OpenGL years ago just by shipping an old and super crappy version of it. I sound pretty anti-Apple, and as a technologist I find they suck more and more as a company, purposefully ignoring their users and showing a very annoying amount of arrogance. A good company would offer Metal but keep OpenGL and OpenCL, and 32-bit alongside 64-bit too: every Linux machine now has a 64-bit kernel and software, but still supports 32-bit. Of course, why not? Give users the choice to run whatever they want. Yes, the migration to 64-bit has been suuuuuuuuuper slow, like the one to IPv6, but forcing it and kicking people out is not the right way.
Apple is making more and more really bad technology decisions for the whole market; they are becoming ever more proprietary, and as an advocate of open source and freedom I see them raising the bars on that walled "garden" and strengthening the already strong vendor lock-in.
They are also dumbing down the whole IT field, in my view, often pandering to the less (technologically) educated who pride themselves on not knowing what RAM or a CPU is.

Now Blender might get ported to ARM (which is easily done, not much work really, and did I hear rumors that Apple will pay them for it?), and the same for Metal, because the Blender Institute is not against Metal or Apple; they just don't want to spend a lot of money on something that only benefits a tiny, tiny fraction of their user base.

I'm starting to read the OpenCL article; it is very long and detailed, full of information that nobody knew years ago, though I could see it coming: OpenCL got off to a super slow start, lagging behind CUDA for years by not being usable at all.
The last time I checked OpenCL was 3 years ago, and it gave me only 50% of the performance that CUDA provided. CUDA is highly optimized and proprietary: part of a vendor-lock-in ploy by Nvidia. BUT if I want to buy a powerful GPU (which means Nvidia) and want to use CUDA (which ALSO means Nvidia), there is not much choice really. OpenCL was never AMD's invention, and since it is an open standard, any development effort expended automatically benefits everybody else in the field, who save the money and can still use it fully; that has not given any company a great incentive to make it more efficient. BUT it is better than Metal Compute because it is universal: it works on Linux, Windows and, so far, macOS (AMD GPUs need it), rather than just on macOS, which is the weakest part of the market for GPGPU work.
Nvidia rules in high-performance GPUs right now (and has for years). Apple not allowing Macs to use those GPUs is EXACTLY what destroys any growth in Apple's market share, even though they "ARE SUPER POPULAR ™ (R)". Anybody who wants to do rendering or similar work will just come to the conclusion that Apple sucks for not giving them the choice of the most powerful GPUs in the world, and they switch to PC; every year people give up the Mac and return to the PC because it is just too limiting and restricted. I mean, who came up with the super, super dumb idea of only allowing driver updates from Apple itself? If Microsoft did that, they would lag behind or forget to update a driver and the devices would suffer for it. Since these are often devices from other manufacturers, let those manufacturers provide the newest drivers directly. The open letter to Tim Cook mentioned above shows that problem with OpenGL on Macs: sticking passionately to some prehistoric version of OpenGL.
And yes, OpenGL is hardly perfect or efficient, BUT it works without modifying software for some proprietary rendering standard from just one manufacturer that works nowhere else. Vulkan will be the logical next step for Windows and Linux, as well as for mobile devices running Android.

Yeah, two titans that don't even share markets. Nobody buys GPUs from Apple, and nobody expects any usable GPU performance out of a Mac, since you can only get AMD (bad) with OpenCL (worse!). In my view, AMD has simply not focused on developing great GPUs; their focus is on CPUs. ATI, which AMD bought, had amazing GPUs; I owned a few of them back when Nvidia was the underdog with much worse price/performance ratios.
In the years after ATI became part of AMD, the GPUs got progressively worse. I would assume most of the ATI people slowly left AMD, because it was no longer a good company like ATI, just a department inside a large corporation. I have seen that happen a lot.
The fact is that Nvidia doesn't care about any market share Apple could give them: Nvidia cards have not been available in Macs for years, and people who needed a good GPU were either running a Hackintosh OR just bought a PC next to their Mac, a very powerful PC that made the Mac look stupid and slow.
So Nvidia already had those customers; Apple itself was driving them away. Now Hackintoshes will die a fiery death when Apple Silicon comes around, because the OS will be written proprietarily for their CPU alone, which only they sell, meaning macOS will not run on any CPU you can put in a PC; only Apple hardware can run it. That, by the way, is how Apple makes record profits: people want the software, the OS and maybe the shiny computer cases, so they need Apple hardware, which is very overpriced, locked down and not extendable (no replacing RAM or HDDs/SSDs with larger ones at market rates).
For everything you absolutely must buy an Apple device, because they let nobody else play.
Apple, on the other hand, has nothing that Nvidia would fear; all mobile GPUs are a ridiculous joke compared to real full-size GPUs. An RTX 2080 can draw up to 250 watts when fully loaded, while a mobile GPU has to work with passive cooling (not even cooling fins) and only a few watts of power, so as not to wipe out the tiny battery of a mobile device.

Nvidia does not just want to be dominant. They ARE. I have not been able to buy a comparable GPU from another manufacturer in years, even with a huge need for them, buying them by the hundreds.
Even on Linux (as well as Windows), CUDA is incredible at cutting down render times (I use Blender a lot, that is how I know). Look at how tiny the share of OpenCL is in Blender Benchmark; that shows the attractiveness of Nvidia. (I can't include more links; this is a non-scientific discussion where you are limited in how many sources you can reference, but google for Blender Open Data and you will find around 100,000 benchmark runs on a variety of machines. Sadly the site doesn't let you show more than 10 results; I talked to them about getting that changed, because it is silly and useless to show only 10 out of 100,000.) Nvidia: yes, they are also super proprietary, but at least they offer drivers for Linux, an open-source OS. They don't do that for Macs anymore, thanks to Apple.
I have to put up with Nvidia, because they are the best game in town on any OS, even though they are super proprietary and their Linux drivers are nowhere close to open source.
Very stimulating and interesting conversation we are having here :wink:

Yeah, well, Apple is doing the old "empire of ONE company" thing, where one vendor wants to be everything to everybody. It fails. Some things Apple does very well (software more so than hardware; watch the Rossmann videos on YouTube about that). Apple has made A LOT OF GARBAGE in the last years; the keyboard alone is an example of incredible stupidity and incompetence. Nobody in the PC space has made a "bad" keyboard in 20 years. Nobody would stand for it; the competition would destroy that manufacturer overnight, with people returning their defective keyboards and buying the competitor's.

I don't understand in any case why Apple always locked itself to only one CPU producer. I mean, AMD and Intel being compatible has always felt to me like an unlikely fairy tale that shouldn't be true, as normally no two vendors share the same proprietary technology.

1 Like

Another piece of collateral damage from dropping OpenGL on macOS might be no WebGL; idk what they are thinking.