My name is Antonio and I am the CTO at R2U, where we empower e-commerce stores with augmented reality and computer-generated imagery.
We have recently developed a scalable render infrastructure on the cloud to handle our render pipeline using Blender Cycles.
To leverage the full power of GPU-optimized virtual machines, we are trying to establish a communication channel between our servers and our 3D artists’ workstations, so that they can speed up their creative process. If we can take their Blender scenes to the cloud, we can produce still images and animations much faster. If, on top of that, they receive real-time updates, we could effectively replace their workstations with much more powerful cloud-based render machines.
The problem with that approach, as we’ve found out, is that most users’ bandwidth is extremely limited: uploading even a relatively small scene of a few hundred megabytes already makes an iterative workflow impractical. For “final versions” of a render, a full upload makes sense, but if we want to replace the F12 button with a cloud-based real-time feed, another solution must be developed.
This brings me to the question of this topic: does Blender expose a global/internal state that I can track, so that I can make small “commits” of the changes between two points in time?
Being a programmer, I relate this challenge to an SCM tool such as git, where each “diff” is stored in the history database and I can “push” changes to a remote server without having to send the full “repository”.
In the same way, if Blender had that for everything related to the 3D scene, I could send only what has changed between two versions to my virtual machines in an extremely efficient way, enabling things such as real-time remote rendering, real-time collaboration, etc.
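To make the idea concrete, here is a minimal sketch in plain Python (not the Blender API; the scene dictionary and its serialized bytes are hypothetical stand-ins for real datablocks) of the kind of hash-based diffing I have in mind — snapshot each object’s content hash, then ship only the objects whose hashes changed:

```python
import hashlib

def snapshot(scene):
    """Map each object name to a content hash of its serialized data."""
    return {name: hashlib.sha256(data).hexdigest()
            for name, data in scene.items()}

def diff(old_snap, new_snap):
    """Return (added-or-changed names, removed names) between two snapshots."""
    changed = {n for n in new_snap if old_snap.get(n) != new_snap[n]}
    removed = set(old_snap) - set(new_snap)
    return changed, removed

# Hypothetical scene data: object name -> serialized bytes.
scene_v1 = {"Cube": b"mesh-v1", "Lamp": b"light-v1"}
scene_v2 = {"Cube": b"mesh-v2", "Lamp": b"light-v1", "Camera": b"cam-v1"}

changed, removed = diff(snapshot(scene_v1), snapshot(scene_v2))
print(sorted(changed))  # ['Camera', 'Cube'] — only these need uploading
print(sorted(removed))  # []
```

With something like this wired into Blender’s actual state, only the changed datablocks (here, the modified Cube mesh and the new Camera) would cross the wire, instead of the whole scene.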
Sorry for the long post, but this is a bit complex to explain without giving the full context.