Offloading heavy geometry

The creator of the proxy tool add-on got in touch with me and he’s working on a new version. Like the previous one, it generates a point cloud, but this time the hi-res geometry is saved in a separate file and only loaded at render time. It’s amazing. I tested it with my 27M-poly geometry: instead of taking 9 gigs of RAM, my layout file only uses 77 megs! It loads so fast too! The scene is incredibly light and I can move around without any slowdown. He’s working on making it work with collections. There’s also an issue where it doesn’t remember the transformations of the objects being converted into a point cloud. This is what he wrote me: Yes, that’s because the objects in the library file are not linked to the scene by default, and for now I have no idea how to access a library file’s content. Maybe I will look into this if it’s really necessary.
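For what it’s worth, a library file’s content can be reached from Python. Here’s a minimal sketch, assuming a hypothetical `hires_library.blend` (the transform-restore step is hypothetical too); `bpy.data.libraries.load()` is the real API for reading another .blend’s content:

```python
import bpy

# Hypothetical path, relative to the current .blend.
lib_path = "//hires_library.blend"

# Link (not append) the objects, so the heavy mesh data stays external.
with bpy.data.libraries.load(lib_path, link=True) as (data_from, data_to):
    data_to.objects = data_from.objects  # request every object by name

# Linked objects stay invisible until added to a collection in the scene;
# this is also where a saved transform could be re-applied.
for obj in data_to.objects:
    if obj is not None:  # None = a name that failed to load
        bpy.context.scene.collection.objects.link(obj)
        # obj.matrix_world = saved_transform  # hypothetical restore step
```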

So we’re almost there. Not ready for primetime yet but it’s coming.

7 Likes

Something relevant for this discussion?

https://developer.blender.org/rBb64f0fab068169a7c379be728aae8994eb893b18

1 Like

Speaking of Katana etc., do you have an opinion on, or a use for, Gaffer to assemble/render your scenes? It supports the standalone version of Cycles, AFAIK.

https://www.gafferhq.org/

Creating tons of extra external files just for some display optimization of one single .blend file?
What a mess.

Gaffer doesn’t support Cycles (actually, nothing outside Blender supports it, because there’s no fully implemented standalone version), so I have no use for it, but thanks for the suggestion. :slight_smile:

Haven’t checked on the progress of this recently, so it may not be ready yet, but I’ll keep an eye on it.

Well my friend, let me give you a real-life example. Optimus Prime, in the first Transformers, had 10 levels of detail (LODs). It was ridiculously heavy. It was impossible to show it at maximum resolution because no machine could handle it (at the time). You could only see it at render time. Now imagine 5, 10, 15 Transformers fighting, on their own planet, filled with insane detail. The film industry is a totally different game. It’s simply impossible to load all that stuff into memory for display. On Welcome to Marwen, Zemeckis’ latest feature film, we needed 44,000 machines to render it, and some of them needed 380 gigs of RAM. And it’s not like it’s Star Wars or a Marvel movie. So yeah, display optimisation is a must if we want Blender to shine in the VFX world.

11 Likes

You’re talking about big-studio workflows here; most Blender users work alone and will not benefit from such features if they’re designed exclusively for such huge-budget workflows.

Think about all the freelancers out there using Blender.
I just hope there will be an option to keep everything centralized too. I don’t see why it needs to be one way or the other; it can be both, via an option in the interface.

Medium-sized studios too. We’re already reaching the limits of Blender where I work, and we need better solutions. Maybe you don’t need it, but that doesn’t mean it’s not needed. I don’t care about Grease Pencil because I have no use for it, but it’s good that it’s there. Blender is developing all the tools needed for major projects: Alembic, USD, VDBs, etc. What’s the point of developing them all if not to try to gain more ground in the film industry?

9 Likes

Why not both?
It’s up to the user to decide.
All I’m saying is that forcing a display solution to be decentralized for every user is a bad idea.
I’m not saying that a decentralized solution is bad. I’m saying that forcing this choice upon every user is a bad idea, hence the need for a more flexible solution that lets users choose whether or not to keep everything centralized. :slight_smile:

Often they are on external network drives to boot. :stuck_out_tongue:

Wasn’t that clear from the opening post? Edit: never mind, it was post number 8. I misremembered.

There’s no reason at all to force you to do anything. That’s not the point here. If you want to keep everything in your scene, then just do it, but the day you get into something bigger than your computer can handle, you’ll be happy to have the option. :slight_smile:

6 Likes

I think there’s a misunderstanding here between the offloading of the geometry itself and its display method.

In the example above, a single archviz artist is working on a heavy scene: the OpenGL viewport starts to become extremely laggy, so they want to display some objects as point clouds/proxies.

As far as I understand, what you/some are suggesting:
- if users want to use such a display method, they will be forced to offload every object into external files.

What I’m suggesting:
- let users decide whether to offload or keep everything internally stored (loaded into VRAM only on final render).

1 Like

.blend files, IIRC, are basically glorified memory dumps. Or at least they used to be.

I think they are more like a database for anything Blender wants to store. Even Blender settings are stored in .blend files somewhere in the user appdata. There is a difference between a memory dump and a database: a memory dump is loaded as-is and can’t be read partially, while a database can be read piece by piece.
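That’s also why you can peek into a .blend without loading its heavy data. A quick sketch (the file name is made up, but the API is real):

```python
import bpy

# Read only the "table of contents" of another .blend: data_from
# exposes lists of datablock names, no geometry is loaded.
with bpy.data.libraries.load("//assets.blend") as (data_from, data_to):
    print("Meshes: ", data_from.meshes)
    print("Objects:", data_from.objects)
    # Leaving data_to untouched means nothing gets appended or linked.
```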

One thing I haven’t seen discussed here:
I am reasonably sure certain formats like vr*yProxy save a pre-built BVH tree inside the file. This allows loading gigabytes directly into the renderer with only seconds of pre-processing.
To my knowledge the Cycles viewport has a dynamic BVH (so the tech is there); it’s only the main renderer that throws everything into the same bag, which has plagued us for years, with scene build time sometimes being on par with render time… (with only small speed gains from having a unified BVH)

Anyway, if Blender ever gets its own solution for offloading, I really hope they implement this.
One side effect is that you can use different tree depths of the BVH to set the viewport’s display level of detail when showing only bounding boxes.
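A toy illustration of that side effect (not Cycles’ actual BVH structures, just the idea): cutting the tree at a chosen depth yields a coarser or finer set of boxes to draw.

```python
# Toy BVH node: bounds is a (min_xyz, max_xyz) pair, children is
# empty for leaves. Not Cycles' real data layout.
class BVHNode:
    def __init__(self, bounds, children=()):
        self.bounds = bounds
        self.children = children

def boxes_at_depth(node, depth):
    """Collect bounding boxes at a given tree depth.
    depth=0 gives one box per root; deeper cuts give more,
    smaller boxes, i.e. a finer bounding-box preview."""
    if depth == 0 or not node.children:
        return [node.bounds]
    boxes = []
    for child in node.children:
        boxes += boxes_at_depth(child, depth - 1)
    return boxes
```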

Obviously this only helps for static content, but I assume that’s the majority of the heavy stuff.

Edit: I guess the reason we don’t yet have reusable BVH data between frames is that it’s not so easy to figure out what has changed and what hasn’t (depsgraph). For a static asset file format that would at least be simple to figure out…

1 Like

Yeah, but I mean I don’t think Blender loads data from a .blend file partially when you open it. What you say applies to linking or appending a datablock and its hierarchical children.

I don’t see at all how this contradicts what Funnybob wants.
The gist is: if your computer can handle it, have it all in there at once. If it can’t, you get options, which at no point are you forced to actually use, to offload stuff for more efficient viewport display.
The only way you’d be “forced” here is if your computer can’t handle it otherwise. No dev’s gonna force that upon you.

No dev’s gonna force that upon you.

Yes: forcing the user to make extra external files in order to use a handy display solution.
That’s forcing a data-storage workflow upon users who want to use the newly proposed display solutions (point cloud/proxy).

As far as I understand, what you/some are suggesting:

  • if users want to use such a display method, they will be forced to offload every object into external files.

What I’m suggesting:

  • let users decide whether to offload OR keep everything internally stored (loaded into VRAM only on final render).

Packing everything in the .blend is clean and extremely handy for a lot of users.
Again, I’m not suggesting that offloading is bad; I’m saying that users should have the choice of whether or not to offload when using a viewport display method.

Imagine forcing users to create external files for each object they’d like to display as a bounding box in their scene. It doesn’t make any sense to tie a data-storage method to an OpenGL display solution, right? That’s exactly what I’ve been trying to explain for the last 5 posts or more: there should be a distinction between the display method of an object and the offloading itself. Forcing a display method to work only with a data-storage solution isn’t flexible. That doesn’t mean we shouldn’t have such an option; in fact we do, and it’s an excellent feature! It simply means that the offloading needs to be an optional choice.

Is it not clear enough? :sweat_smile:
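For what it’s worth, Blender already treats the drawing mode as a per-object setting that is independent of where the data lives; a couple of lines in the console show it:

```python
import bpy

# display_type controls drawing only; it works identically whether
# the object's data is local, packed, or linked from another file.
for obj in bpy.context.selected_objects:
    obj.display_type = 'BOUNDS'  # draw just the bounding box
```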

A specific option to have a pre-built BVH could be a super good idea, especially because the biggest pre-processing slowdown now comes from instances. It will surely be sped up by the new point cloud implementation that @brecht is working on, but it will probably never be as fast as not having to build the BVH at all.

@BD3D, to be honest, regarding external files: we are a small shop (you know it :wink:) and we miss that workflow a lot. It is widely used in archviz on other platforms, not just in VFX, and not just by companies but also by freelancers, because it helps keep projects organised and reusable, keeps the main files small, and keeps render pre-processing fast. So I think this improvement is something that may benefit everyone, from the smallest solo artist to the biggest studio out there doing hardcore VFX.

I agree there is no need to enforce the creation of an external file; so far a user can use a collection instance or a linked collection instance. But when you are in the phase of optimising a super-heavy scene, having external files with all the possible accelerations included is very helpful. And you know, for video, every second equals money spent; it’s not so important for static renders, but for animation, this is life! :slight_smile:
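For reference, a linked collection instance only takes a few lines; a sketch assuming a hypothetical file and collection name (“set_dressing.blend” / “Trees”):

```python
import bpy

# Link a collection from another .blend (file/collection names are
# hypothetical), keeping the heavy data in the external file.
with bpy.data.libraries.load("//set_dressing.blend", link=True) as (data_from, data_to):
    data_to.collections = ["Trees"]

# Instance it via an empty, the usual lightweight stand-in.
coll = data_to.collections[0]
inst = bpy.data.objects.new(coll.name, None)  # empty object
inst.instance_type = 'COLLECTION'
inst.instance_collection = coll
bpy.context.scene.collection.objects.link(inst)
```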

3 Likes