Python Importer Creating Large Amounts of Data

I’ve got an importer I’ve written for Blender and it’s working great. It basically creates some materials and a lot of mesh objects, plus objects with linked mesh data (instances), depending on the size of the file. When tens of thousands of objects have been imported, Blender can take a long time (tens of minutes) to become responsive again after the import op finishes. I assume it’s doing some internal processing and arranging of the newly created data. To the user, it just looks like Blender is frozen if they try to click the window.
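For reference, the creation pattern looks roughly like this (a simplified sketch, not the real importer code; the geometry, names, and counts are placeholders):

```python
import bpy

def build_instances(mesh_specs, instance_count):
    """Sketch of the pattern: one mesh datablock, many objects linking it (instances)."""
    collection = bpy.context.scene.collection
    for name, verts, edges, faces in mesh_specs:
        mesh = bpy.data.meshes.new(name)
        mesh.from_pydata(verts, edges, faces)
        mesh.update()
        # Many objects share (link) the same mesh data
        for i in range(instance_count):
            obj = bpy.data.objects.new(f"{name}.{i:05d}", mesh)
            collection.objects.link(obj)

# e.g. a single unit triangle, instanced 10,000 times
build_instances([("Tri", [(0, 0, 0), (1, 0, 0), (0, 1, 0)], [], [(0, 1, 2)])], 10_000)
```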

Does anyone know if there’s something I can call from Python to trigger these updates over the course of the import, instead of relying on it all happening at once at the end?

Try using a profiler like https://github.com/benfred/py-spy to see if the Python script is causing the slowdown. If not: on Linux, https://github.com/KDAB/hotspot can tell you where the time goes, and on Mac, Instruments.app can.

Without looking at the code, that’s the best I can suggest.


I have not run into this exact use case before, though. The most extreme case for me was importing a 3D-scanned model with about 1,500,000 vertices, which was a real pain to work with. I had to decimate the model down to about 800,000 vertices to get reasonable responsiveness on my system.
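The decimation can also be scripted; a minimal sketch using the Decimate modifier (the ratio here is just an example):

```python
import bpy

def decimate(obj, ratio=0.5):
    """Add and apply a Decimate modifier to reduce the geometry of obj."""
    mod = obj.modifiers.new(name="Decimate", type='DECIMATE')
    mod.ratio = ratio  # keep roughly this fraction of the geometry
    # modifier_apply works on the active object, so make obj active first
    bpy.context.view_layer.objects.active = obj
    bpy.ops.object.modifier_apply(modifier=mod.name)

decimate(bpy.context.active_object, ratio=0.55)  # e.g. roughly 1.5M -> 0.8M vertices
```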

However, if your model is something like a CAD type (architecture / industrial design), you can’t decimate anything. In that case you will have to import things piece by piece and link them as proxies into one large scene.
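One way to do that piece-by-piece workflow is to save each part to its own .blend and then link (not append) it into the master scene; a rough sketch, with a placeholder file path:

```python
import bpy

def link_collections_from(filepath):
    """Link all collections from another .blend file into the current scene."""
    with bpy.data.libraries.load(filepath, link=True) as (data_from, data_to):
        data_to.collections = data_from.collections
    for coll in data_to.collections:
        if coll is not None:
            bpy.context.scene.collection.children.link(coll)

link_collections_from("/path/to/part_01.blend")  # placeholder path
```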

What is the typical polygon budget in this kind of work, for example for those of you working on such projects? If anything can be done here, it would be background processing; the best-case scenario is preventing UI locks.
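If the importer itself can be restructured, one option for background-style processing inside a running Blender is to feed the work through bpy.app.timers in small batches, so the UI gets a chance to redraw between them (the timer still runs on the main thread, so each batch should be small). A rough sketch, where the work queue, batch size, and per-item step are placeholders:

```python
import bpy

items = list(range(100_000))  # placeholder work queue, e.g. parsed records to turn into objects

def process_chunk():
    """Timer callback: handle one batch, then yield back to the UI for a moment."""
    batch, remaining = items[:500], items[500:]
    items[:] = remaining
    for item in batch:
        pass  # create the mesh/object for this item here
    if items:
        return 0.01  # run again in 10 ms, letting the UI breathe in between
    return None      # all done, unregister the timer

bpy.app.timers.register(process_chunk)
```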

Otherwise, the simplest and most effective solution is to write a command-line script that imports the file with --python-expr and, once you start it, leave it to run by itself for 10 minutes.
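For illustration, something along these lines, here wrapped in a small Python launcher; the operator name and paths are placeholders for your own importer:

```python
import subprocess

# Placeholder operator ID and paths; substitute your importer's own operator.
expr = (
    "import bpy; "
    "bpy.ops.import_scene.my_format(filepath='/path/to/huge_file.ext'); "
    "bpy.ops.wm.save_as_mainfile(filepath='/path/to/imported.blend')"
)

# Run Blender headless so nothing can look frozen; open the saved .blend when it's done.
subprocess.run(["blender", "--background", "--python-expr", expr], check=True)
```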