Hello,
I’m thinking about how to bring instantaneous per-particle point data from outside into Blender via Python, for visualizing particle simulations.
Scenario: Import point coordinates (creating vertices in a Blender mesh is the easy part), various vector data (velocity, orientation) and various scalar data (scale, density etc.) for each point (maybe even tensor data later on), and then use those in Blender: create a glyph object (sphere, arrow or any other mesh object) which would be instanced at the point locations using one of Blender’s object instancing mechanisms (I don’t know which is best here). Scaling, rotation and shading (foremost coloring) of each instance would be driven (somehow) by the imported vector and scalar values. The aim is to avoid creating a mesh at each point, to get the benefits of instancing.
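For the instancing part, this is roughly what I have in mind — a minimal sketch assuming vertex instancing (`instance_type = 'VERTS'` on a parent point-cloud object; the object names and sample coordinates are made up):

```python
import bpy

# Hypothetical imported data: one (x, y, z) coordinate per particle.
points = [(0.0, 0.0, 0.0), (1.0, 0.5, 0.0), (2.0, 1.0, 0.5)]

# Build a vertex-only mesh holding the particle positions.
mesh = bpy.data.meshes.new("ParticlePoints")
mesh.from_pydata(points, [], [])  # vertices only, no edges or faces
holder = bpy.data.objects.new("ParticlePoints", mesh)
bpy.context.collection.objects.link(holder)

# Create a glyph (here a UV sphere) and parent it to the point cloud.
bpy.ops.mesh.primitive_uv_sphere_add(radius=0.1)
glyph = bpy.context.active_object
glyph.parent = holder

# Instance the glyph on every vertex of the holder mesh.
holder.instance_type = 'VERTS'
```

This gets identical copies at each point; the open question is how to vary scale, rotation and color per instance from the imported data.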
-
Where to store the imported point data? It seems that vertex colors require a face to exist at the vertex, so it is not possible to use vertex colors to store per-vertex color data in a mesh that has only vertices (no edges or faces) — is this correct?
-
Vertex group weight range is [0, 1] according to the documentation, so I guess it is not possible to use weights to store the full float value range? Value ranges could maybe be scaled into [0, 1], but that would be somewhat painful.
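If weights turn out to be the only option, the scaling itself is at least mechanical — a sketch (function names are mine; the bounds would have to be stored somewhere to undo the mapping later):

```python
def to_weights(values):
    """Map arbitrary floats linearly into [0, 1] for vertex group weights.
    Returns the weights plus the (min, max) needed to invert the mapping."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    return [(v - lo) / span for v in values], (lo, hi)

def from_weights(weights, bounds):
    """Recover the original values from weights and the stored bounds."""
    lo, hi = bounds
    return [lo + w * (hi - lo) for w in weights]
```

The (min, max) pair would need to live somewhere Blender-side (e.g. a custom property on the object) so a shader or driver could undo the scaling — which is exactly the painful part.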
-
Maybe use custom vertex attributes for storing the point data. Is it possible to access custom attributes, e.g. in material shader node trees? Would that need a custom node, and are such nodes technically possible to add? Or could the Attribute node be used for this?
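From what I can tell, recent Blender versions (2.92+) expose a generic attribute API on meshes, and the Attribute node can read a named attribute in a material — a sketch (the attribute name "density" and the placeholder data are mine; I have not verified that every renderer picks this up):

```python
import bpy

obj = bpy.context.active_object
mesh = obj.data

# Create a per-point float attribute and fill it with imported values.
densities = [0.1 * i for i in range(len(mesh.vertices))]  # placeholder data
attr = mesh.attributes.new(name="density", type='FLOAT', domain='POINT')
attr.data.foreach_set("value", densities)

# In the material, an Attribute node pointed at the same name should
# expose the value in the shader node tree.
mat = obj.active_material
node = mat.node_tree.nodes.new("ShaderNodeAttribute")
node.attribute_name = "density"
```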
-
Alternatively, do not use bpy to store the imported data at all, but inject the values into a custom scene instead. Is it possible to adjust, e.g., the scale and rotation of a single instanced object somehow?
-
Create motion blur for the particles from the imported velocity data. I’m thinking of writing an operator that creates/adjusts animation f-curves from the imported velocities. Comments?
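The per-particle math for this seems simple: place keyframes just before and after the current frame, offset along the velocity, so the renderer’s motion blur sees linear motion. A sketch of just the computation (pure Python; the fps/shutter defaults are assumptions):

```python
def blur_keyframes(position, velocity, fps=24.0, shutter=0.5):
    """Positions for keyframes half a shutter interval before and after
    the current frame, giving linear motion along the velocity vector."""
    dt = shutter / fps  # seconds covered by the shutter
    before = tuple(p - v * dt for p, v in zip(position, velocity))
    after = tuple(p + v * dt for p, v in zip(position, velocity))
    return before, after
```

In the operator, these positions would then go into `obj.keyframe_insert(data_path="location", frame=...)` calls, presumably with linear interpolation set on the resulting f-curves.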