Thank you for your time!
For the solver kettle:
The current node group design should work, in my opinion, if access to more granular building blocks is provided. That means exposing a proper range of solver functionality as low-level steps such as advection, contact generation, constraint solving, etc. (steps that are currently hard-coded into Blender’s simulations), and also providing them properly assembled in high-level template nodes. Let’s say the user just wants a stack of cubes and spheres falling on top of a plane, with an animated gravity property. Using nodes, they could:
- assign the plane, the sphere collection, and the cube collection default mass/friction/shape/etc. values (helper node)
- create a rigid body scene solver node group (this is the high level template)
- plug their collection of spheres + cubes into the dynamic body slot
- plug the ground plane into the static body slot
- add either a reference to the scene gravity attribute or a key-framed value node into the gravity slot (maybe there can be a defaults system)
- take the template output (the input objects with their updated transformation matrices applied after simulating the current frame, or a defined amount of time/steps) and pipe it to the output of the system. (Since we are basically dealing with something that modifies the scene state, and not just a single mesh’s geometry, the scope of such a node system is still something to be addressed in the design.)
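To make the data flow concrete, here is a toy Python sketch of what the high-level solver template might reduce to per frame: dynamic bodies in, animated gravity in, updated transforms out. All names here are hypothetical; nothing below is an existing Blender API, and real contact solving is stubbed out with a flat ground plane.

```python
def solve_frame(bodies, gravity, dt=1.0 / 24.0):
    """Advance each dynamic body one frame; return bodies with updated transforms."""
    updated = []
    for body in bodies:
        vx, vy, vz = body["velocity"]
        vz += gravity * dt  # integrate the (possibly animated) gravity value
        x, y, z = body["location"]
        z += vz * dt
        # crude static ground plane at z = 0, standing in for contact solving
        if z < 0.0:
            z, vz = 0.0, 0.0
        updated.append({**body, "location": (x, y, z), "velocity": (vx, vy, vz)})
    return updated

# a single cube dropped from 2 units up, simulated for 48 frames
bodies = [{"name": "Cube", "location": (0.0, 0.0, 2.0), "velocity": (0.0, 0.0, 0.0)}]
for frame in range(48):
    bodies = solve_frame(bodies, gravity=-9.81)
print(bodies[0]["location"][2])  # the cube has come to rest on the plane
```

The point is only the shape of the interface: the template consumes scene objects plus parameters and emits the same objects with new transforms, which is what makes it pipeable into the rest of a node graph.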
But let’s say the user decides they want sparks and debris to fly whenever an object collides with another object… Currently Blender doesn’t support this, and let’s say the node template wasn’t designed for it either. Today you would need to hack around it with dynamic paint or something, or modify Blender’s code. But with this node group you could simply dive in a level and inspect the (well framed/commented/laid out) steps of the rigid body solver… Ah, there it is! The contact points output of the low-level collision detection node. Inspect it in the spreadsheet and you see a list of positions, object pairs, hit normals, etc. You take that wire and send it into an output socket.

Now that you have this list of points you can work with it: cull out data points that have too little impact intensity, or that don’t have an object with a metal material in the pair. With that filtered list of contact points you can do all sorts of things. You can plug it into the “initial points” socket of that spark particle effect you downloaded from the asset manager, or you can go wild and put it in a cache node to merge it with last frame’s data and use it for instancing ice crystals, rocks, flowers, etc. You could even hop on the forums and suggest that this part of the simulator be exposed. With no C/C++/Python code, users can expand the possibilities of the solver without having to beg programmers for access to certain callbacks.
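As a sketch of the culling step described above, here is what the hypothetical contact-points list and filter could look like in plain Python. The record fields (`position`, `pair`, `impulse`, `materials`) are assumptions about what a collision detection node might expose, not anything Blender exposes today.

```python
# hypothetical contact records, as a spreadsheet-style list of dicts
contacts = [
    {"position": (0.0, 0.0, 1.0), "pair": ("Cube", "Floor"), "impulse": 12.0,
     "materials": ("Metal", "Concrete")},
    {"position": (1.0, 0.0, 1.0), "pair": ("Sphere", "Floor"), "impulse": 0.3,
     "materials": ("Rubber", "Concrete")},
    {"position": (2.0, 0.0, 1.0), "pair": ("Sphere", "Cube"), "impulse": 5.0,
     "materials": ("Rubber", "Wood")},
]

def cull(contacts, min_impulse=1.0, required_material="Metal"):
    """Keep only hard impacts that involve the required material."""
    return [c for c in contacts
            if c["impulse"] >= min_impulse
            and required_material in c["materials"]]

spark_points = cull(contacts)
print([c["pair"] for c in spark_points])  # only the hard metal impact survives
```

In the node graph this would just be a filter/selection node sitting between the solver’s contact output and the spark effect’s “initial points” socket; the Python is only to show that the operation itself is trivial once the data is exposed.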
Going this route, Everything Nodes should be a LONG term goal, and it will introduce a lot more node functionality (possibly hundreds more nodes) into Blender, but I think it will all be worth the growing pains for the flexibility it unlocks.
As for examples: I think the closest thing to what Everything Nodes tries to achieve might be an Entity Component System (ECS). While not always exposed as a node system (although it absolutely can be, if you visualize it as such), it certainly achieves the ability to create complex, customizable simulations. Where it might differ is that some ECS implementations go beyond simulations into general-purpose software architecture.
While I cannot point to any closed/commercial software that tries to do what we are attempting, I think looking at an ECS might give good insight. An excellent open source ECS to look at for inspiration is FLECS. It has already solved many of the challenges in creating a general-purpose simulation system. Although many of its solutions might be overkill (such as notifications on component/entity life-cycle changes, prefabs, snapshots, etc.) or not easily applicable to what Blender needs, its core philosophy and functionality should show how an extendable, data-oriented system could tackle things from particles to other general-purpose applications.
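For anyone unfamiliar with the pattern, here is a bare-bones ECS in Python. To be clear, this is NOT the FLECS API, just the core idea it shares with the node-solver design above: entities are plain ids, components are plain data, and systems are functions that query and update whichever entities happen to have the right components.

```python
class World:
    def __init__(self):
        self.next_id = 0
        self.components = {}  # component name -> {entity id -> data}

    def entity(self, **components):
        """Create an entity id and attach the given components to it."""
        eid = self.next_id
        self.next_id += 1
        for name, data in components.items():
            self.components.setdefault(name, {})[eid] = data
        return eid

    def query(self, *names):
        """Yield (entity id, component list) for entities having all named components."""
        ids = set.intersection(*(set(self.components.get(n, {})) for n in names))
        for eid in sorted(ids):
            yield eid, [self.components[n][eid] for n in names]

def gravity_system(world, dt):
    # a "system" is just a function over a query; entities lacking a
    # velocity component (static bodies) are skipped automatically
    for eid, (pos, vel) in world.query("position", "velocity"):
        vel["z"] -= 9.81 * dt
        pos["z"] += vel["z"] * dt

world = World()
ball = world.entity(position={"z": 10.0}, velocity={"z": 0.0})  # dynamic
wall = world.entity(position={"z": 0.0})                        # static

for _ in range(24):
    gravity_system(world, dt=1.0 / 24.0)
print(round(world.components["position"][ball]["z"], 2))  # → 4.89
```

The parallel to the node design is that adding new behavior (sparks, debris, whatever) means adding another system that queries existing data, rather than editing the solver itself, which is exactly the extensibility argued for above.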
Thanks again!