When I try to add a large mesh (176,230,815 vertices, 528,692,445 triangles) using mesh.from_pydata() in Blender 2.79b, the call fails with the following error and traceback:
Error: Array length mismatch (expected -115552647, got 528692445)
Traceback (most recent call last):
  File "/home/avery/work/maxibone/src/visualization/blender-vessels.py", line 120, in <module>
    add_numpy_quad_object("vessels",vertices,faces,(0.,0.,0.));
  File "/home/avery/work/maxibone/src/visualization/blender-vessels.py", line 45, in add_numpy_quad_object
    mesh.from_pydata(bpy_vertices,[],bpy_faces);
  File "/opt/blender/2.79/scripts/modules/bpy_types.py", line 429, in from_pydata
    self.vertices.foreach_set("co", tuple(chain.from_iterable(vertices)))
RuntimeError: internal error setting the array
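For context, the relevant part of the script is essentially this (slightly paraphrased, vertices and faces are numpy arrays loaded from disk):

import bpy

def add_numpy_quad_object(name, vertices, faces, origin):
    # vertices: numpy array of shape (N, 3); faces: numpy array of shape (M, 3)
    mesh = bpy.data.meshes.new(name)
    obj = bpy.data.objects.new(name, mesh)
    obj.location = origin
    bpy.context.scene.objects.link(obj)             # Blender 2.79 API

    bpy_vertices = vertices.tolist()                # nested Python lists for from_pydata()
    bpy_faces = faces.tolist()
    mesh.from_pydata(bpy_vertices, [], bpy_faces)   # the call at line 45 above
    mesh.update()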
The error looks like an integer overflow, except that 528692445 is well below 2^31 and should be perfectly representable in a 32-bit int. I traced the error message to blender/makesrna/intern/rna_access.c, and all the relevant variables appear to be ints, which (although size_t would perhaps be more sensible) should have no trouble holding 500 million.
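For reference, this is the arithmetic I am looking at (plain Python, just to sanity-check the sizes involved, not a claim about where the overflow actually happens):

n_verts = 176230815
n_tris  = 528692445

print(n_verts * 3)        # 528692445 floats for "co" -- matches the "got" value, well below 2**31
print(n_verts * 3 * 4)    # 2114769780 bytes of vertex float data -- still (just) below 2**31
print(n_tris * 3)         # 1586077335 loop indices -- below 2**31
print(n_tris * 3 * 4)     # 6344309340 bytes -- no longer fits in a signed (or even unsigned) 32-bit byte count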
I would be happy to try to fix this issue, but I am very new to the Blender code base. Where could the overflow be happening, and how could it be fixed?
Another issue is the extreme slowness of from_pydata(): the data takes about 5 seconds to load from disk, but about an hour to pass through from_pydata(). I would be happy to help implement a from_numpydata() that should be orders of magnitude faster; I will raise a separate issue for that, but a rough sketch of what I have in mind follows below.
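The sketch builds on the existing foreach_set() API on mesh.vertices, mesh.loops and mesh.polygons. It is untested at this scale, and for this particular mesh it would presumably still run into the same length check as above until that is fixed:

import bpy
import numpy as np

def mesh_from_numpy(name, verts, tris):
    # Sketch only. verts: float array of shape (N, 3); tris: int array of shape (M, 3).
    mesh = bpy.data.meshes.new(name)

    mesh.vertices.add(len(verts))
    mesh.vertices.foreach_set("co", verts.ravel())

    # Triangles only: every polygon has exactly 3 loops.
    mesh.loops.add(tris.size)
    mesh.loops.foreach_set("vertex_index", tris.ravel())

    mesh.polygons.add(len(tris))
    mesh.polygons.foreach_set("loop_start", np.arange(0, tris.size, 3, dtype=np.int32))
    mesh.polygons.foreach_set("loop_total", np.full(len(tris), 3, dtype=np.int32))

    mesh.update(calc_edges=True)
    mesh.validate()
    return mesh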
Thank you so much for any help you may have!