Downloads are slow: why mirrors and not a CDN?

I’ve seen a number of reports lately that downloading Blender is slow (which matches my personal experience: typically 0.2–2 MB/s). Presumably this depends on the individual mirror selected for each user and the traffic on it at the time of download.

Why is Blender still using download mirror partners instead of a reliable CDN?

For Poly Haven we use Cloudflare and see around 1M users and 80 TB of traffic per month, which is roughly half of what Blender handles according to the most recent Blender by the Numbers. The majority of that traffic (~90%) is cached and served by Cloudflare instead of our origin server. This costs us $20/month, and could even be covered by Cloudflare’s FOSS sponsorship program if we bothered to apply for it.

We also use Backblaze B2 as our storage host, meaning we don’t even pay for (or have the hassle of managing) a web server to host our files. Backblaze and Cloudflare have a partnership with zero egress fees between them. In other words, to serve unlimited traffic, all you pay for is the storage cost (pennies) and the $20/month Cloudflare subscription.
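As a rough back-of-the-envelope check of the numbers above (the traffic, cache-hit ratio, and $20/month plan fee are the figures quoted in this post; the stored data size and B2 storage price are illustrative assumptions, not quoted prices):

```python
# Back-of-the-envelope monthly hosting cost for the setup described above.
# Values marked "assumed" are illustrative, not quoted prices.

monthly_traffic_tb = 80        # from the Poly Haven stats above
cache_hit_ratio = 0.90         # ~90% served from Cloudflare's cache
cf_plan_usd = 20               # Cloudflare subscription, per month
stored_data_tb = 1             # assumed size of the hosted files
b2_usd_per_tb_month = 6        # assumed B2 storage price

# Only cache misses ever reach the origin (B2), and B2 -> Cloudflare
# egress is free under their partnership.
origin_traffic_tb = monthly_traffic_tb * (1 - cache_hit_ratio)
egress_usd = 0
total_usd = cf_plan_usd + stored_data_tb * b2_usd_per_tb_month + egress_usd

print(f"Origin handles ~{origin_traffic_tb:.0f} TB/month")
print(f"Total monthly cost: ${total_usd}")
```

The point of the sketch is that the cost is dominated by the flat plan fee plus storage; traffic volume drops out of the bill entirely.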

If Blender moved from download mirrors to Cloudflare, I assume it would save time and effort when publishing new releases, significantly improve download speeds for users around the world, and take load off the mirror servers.

I have a fair amount of experience juggling page rules and the like on Cloudflare (specifying which domains/paths to cache, etc.), and I’d be happy to help set things up if needed.
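For a concrete idea of what such a page rule looks like, here is a sketch of the JSON payload Cloudflare’s v4 API accepts for a “Cache Everything” rule. The domain, URL pattern, and TTL are placeholders of my own, not Blender’s actual configuration:

```python
import json

# Sketch of a Cloudflare "Cache Everything" page rule payload.
# The URL pattern and TTL below are illustrative placeholders.
page_rule = {
    "targets": [{
        "target": "url",
        "constraint": {
            "operator": "matches",
            "value": "download.example.org/release/*",  # placeholder path
        },
    }],
    "actions": [
        {"id": "cache_level", "value": "cache_everything"},
        {"id": "edge_cache_ttl", "value": 31 * 24 * 3600},  # ~one month, in seconds
    ],
    "status": "active",
}

# This payload would be POSTed to
#   https://api.cloudflare.com/client/v4/zones/<zone_id>/pagerules
# with an authorization token; shown here without the network call.
print(json.dumps(page_rule, indent=2))
```

Release archives are immutable once published, which is why a long edge TTL like this is safe for that path.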


The reason is probably more dogmatic than pragmatic: mirrors have the flavor of federation, Linux distributions, and free software, while CDNs carry connotations of centralization, big corporations, and the end of net neutrality.

The 3.0 showreel on YouTube has 100k views; the same video on PeerTube has fewer than 100. Yet one model is arguably more desirable than the other.

So a bit of Gramps Is in the Resistance, and I like it.

That may have been a reason initially, but Blender is now on Steam, the Microsoft Store, etc. I’m sure it’s more a matter of “this is the way we’ve done it for the last 10 years, and no one has done anything to change it”.

Clearly the current method has some serious drawbacks (speed, latency, maintainability, many points of failure), but (so far) it hasn’t been enough of an issue for anyone to do anything about it.

I saw Ton mention that, for the Clarkson University mirror, Blender downloads accounted for 75% of all their traffic, over 11 TB in one day.

I’m not sure that paying for a CDN is practical, especially given that we were already making arrangements for an increased uplink at the datacenter, and that, unlike a cloud CDN that pays for itself via subscriptions, the foundation would have to foot this bill itself.

Now, that isn’t to say that having a CDN isn’t a great idea for more practical uses, such as website page assets, to help things feel snappier. We could simply spin up a global Google Cloud Storage bucket, or S3, to serve such files, with the added benefit of strong authentication already built into the system.
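As a sketch of that bucket approach, here is roughly what publishing static page assets to an S3 bucket with long cache lifetimes could look like. The bucket name, directory, and one-year TTL are my own placeholders, and the boto3 upload call is shown but not exercised here:

```python
import mimetypes
from pathlib import Path

# Sketch: publish static site assets to an S3 bucket with long cache
# lifetimes, so a CDN (or the bucket itself) can serve them cheaply.
# Bucket name, directory, and TTL are placeholders.

def upload_args(filename: str, max_age: int = 31536000) -> dict:
    """Per-object settings: guessed content type plus a one-year cache TTL."""
    content_type, _ = mimetypes.guess_type(filename)
    return {
        "ContentType": content_type or "application/octet-stream",
        "CacheControl": f"public, max-age={max_age}",
    }

def publish(directory: str, bucket: str) -> None:
    """Upload every file under `directory` to `bucket` (needs AWS credentials)."""
    import boto3  # not run here; requires configured AWS credentials
    s3 = boto3.client("s3")
    for path in Path(directory).rglob("*"):
        if path.is_file():
            s3.upload_file(str(path), bucket,
                           str(path.relative_to(directory)),
                           ExtraArgs=upload_args(path.name))

# publish("site-assets", "example-static-assets-bucket")
```

Versioned asset filenames (e.g. hashed CSS/JS bundles) are what make such a long `max-age` safe, since an updated asset gets a new URL rather than invalidating the old one.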

Either way, I think I would wait and see what happens with the uplinks that are in the works, and observe how much they help. But that’s just my opinion, I may be wrong :wink:
