2020-07-28 - Particle Nodes Status

Present: Dalai Felinto, Jacques Lucke

Meeting to assess the current state of the particles system, and agree on next steps.

The meeting covered the current implementation with three demos, and went over the designs to be explored before coding continues for too long. This will be the scope of the work presented and developed at a workshop in September.

Current status / pending design

  • Multiple inputs for the same socket (needs design solution)
  • How to do specific simulations (scattering, spray, …)
  • What is the particle system's unit of work? (for the VSE it is the shot, for the compositor the frame)

The node tree describes the states you want to simulate and the behaviour.

Type of nodes:

  • Data Processing (blue)
  • Behavior (red)
  • Particle Simulation: contains state

“Not sure about the separation between Emitters / Events / Forces” - Jacques Lucke


Which questions do we still need to answer?

  • Are we building the right product? Is a product still worth building?
  • Is this design compatible with the other node projects? (hair, modifier, …)
  • Can we design the effects required for scattering of objects as well as particle effects? How? What would the node tree look like?
  • How would this system make previous open projects easier (or better) to work on? (e.g., pebbles scattered in Spring, hair spray in Agent 327, the tornado scene in Cosmos Laundromat, the grass field interacting with a moving character in Big Buck Bunny, …)

Use cases for the particle simulation

  • Scattering of objects
  • Particle effects

Design > Mockups > Storyboard > Prototype > Implementation

Before the project advances much further, it is important to get buy-in and feedback on all of the current design iteration stages.

The project had a working prototype early on, but not for its latest designs, especially not since the UI workshop in February. At that workshop some UI ideas were discussed, yet no final solution has been approved.

  • Design: Mental model - node types, the relation between particle simulation and point cloud object, what is a node group, state, behaviours
  • Mockups: wireframes or final screenshots of node trees for existing problems (torch, rain, tornado, PS5, shoes, …)
  • Storyboard: what steps one goes through to create different simulations
  • Prototypes: Functional (limited) working system (Python? C? Mixed?) where users can actually use the system.
  • Implementation

Current System

Demos:

  • Playstation 5 Reveal
  • Nike Air Max 2017
  • BBC Idents - Gallery


Here’s a thought.

I never understood why people discuss point clouds, particles, vertices etc. like they are separate things. They are fundamentally the same thing; it's just the domain in which you process and represent them that differs.

People seem to think that because Blender will have nodes, it will have the flexibility of Houdini. But in reality Houdini's success comes from its complete abstraction of these higher concepts and a very, very strong core philosophy on data processing.

Take the concept of "events", for instance. You have no such thing in Houdini. But you can build and treat the data like it should be an event. Essentially it's just tagging an attribute on a regular point to do something else.
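To make the "event as attribute tag" idea concrete, here is a minimal sketch in Python (all function and attribute names are invented for illustration; this is not Houdini or Blender API):

```python
import numpy as np

# Hypothetical sketch: an "event" is nothing more than a boolean attribute
# tagged onto ordinary points; downstream nodes simply filter on it.
def tag_collision_event(positions, ground_z=0.0):
    """Return a per-point boolean attribute marking points at or below the ground."""
    return positions[:, 2] <= ground_z

positions = np.array([[0.0, 0.0, 1.0],
                      [0.0, 0.0, -0.1],
                      [1.0, 1.0, 0.5]])
collided = tag_collision_event(positions)   # the "event" is just an attribute
bounced = positions[collided]               # any later node can consume it
```

Nothing event-specific needs to live in the data model; the tag is just another point attribute.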

Houdini has its low-level language called VEX, which operates on data. It's completely agnostic to the domain, be it points, primitives, particles, even animation data, volumes, images etc., because fundamentally they are all the same thing. The only difference is the domain they are represented in.
That's why you can easily convert an image into points, the points into animation curves, and back to an image: all you're doing is processing the same data the same way.
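A toy way to see the domain-agnostic point (illustrative Python, not VEX): the same attribute-remapping function applies whether the floats live on image pixels or on points.

```python
import numpy as np

# Sketch: one attribute-processing function, two "domains".
def remap(attr, old_min, old_max, new_min, new_max):
    """Linearly remap a float attribute from one range to another."""
    t = (attr - old_min) / (old_max - old_min)
    return new_min + t * (new_max - new_min)

# "Image" domain: a 2x2 grayscale image.
image = np.array([[0.0, 0.5], [0.75, 1.0]])
# "Point" domain: the same values flattened into per-point attributes.
points_value = image.reshape(-1)

brightened_image = remap(image, 0.0, 1.0, 0.2, 1.0)
brightened_points = remap(points_value, 0.0, 1.0, 0.2, 1.0)
```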

Scattering, for instance, is also not tied to particles, because why should it be? It's just points with attributes that you can feed to other nodes. Why narrow that function down to just "particles"?

If you look at some of the other popular node-based frameworks for particle work, like tyFlow and thinkingParticles, they both suffer from the idea of "rules" and "events" instead of being transparent about what you're actually dealing with. So once things start to get really complex, it gets harder and harder to wrangle the actual data.

On top of that framework, you build easy-to-use user tools.

Speaking as an ex-LightWave and ex-MAX (PFlow/TP) user, and a current 16+ year Houdini user, now in love with Blender :slight_smile:

PS. I'm not saying copy Houdini outright, but you've got to get the fundamentals right before you start asking "are we making the right product".

PPS. I might be ignorant of the development of the "everything nodes" idea, as I'm not reading every discussion, so please ignore this if you feel I've missed some crucial points.

Thoughts ?


The current method of combining node types looks like it might lead to difficult-to-read and overly complex node trees. A potentially better solution would be a top-level node tree where each node is a container for another node tree dedicated to a specific task.

The top-level nodes/containers would have one input socket and one output socket, which transport all point attributes (position, UV, RGB, custom (vector3, vector4, float, text, integer)) between the top-level nodes.

Inside the top-level nodes you could then create "read attribute" nodes, which let the user pick from all available attributes passed in from the top level (a bit like a UV node in the shader editor), and "write attribute" nodes, which the user can use as an end point of a node tree to update existing attributes or create new ones. These would then automatically be passed back out to the top-level tree via the top-level node's output socket (no need for messy internal wiring to an output socket like you have in shader groups).

This is basically a much more elegant node group, and a less cluttered way of passing data around without millions of wires.
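As a rough illustration of the container idea (all names are hypothetical), a top-level node could pass a whole attribute dictionary through one socket, while inner read/write nodes pick attributes by name:

```python
# Sketch: one input socket, one output socket, all attributes travel together.
def read_attribute(attributes, name):
    """Inner node: pick one attribute from everything passed in."""
    return attributes[name]

def write_attribute(attributes, name, values):
    """Inner node: update or create an attribute; result flows back out."""
    out = dict(attributes)      # pass data through without mutating upstream
    out[name] = values
    return out

def container_node(attributes, inner_fn):
    """Top-level node: wraps an inner tree, exposing a single in/out socket."""
    return inner_fn(attributes)

def scale_uv(attributes):
    uv = read_attribute(attributes, "uv")
    return write_attribute(attributes, "uv", [(u * 2, v * 2) for u, v in uv])

state = {"position": [(0, 0, 0)], "uv": [(0.25, 0.5)]}
state = container_node(state, scale_uv)   # untouched attributes pass through
```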

I'm not sure if you already have a solution, but I think it's important to also allow the user to expose ANY parameter to the top-level nodes, so they can build the node tree and then make the important parameters available for tweaking at the top level. Again, this shouldn't require the cumbersome setup and wiring we currently have in shader groups: just right-click the parameter, choose "expose to parent node", and select which parent to expose it to (in the case of multiple levels).

Of course, all of this should be nestable, so at the very top level you can have a single node to which you can expose any of the parameters from any of the contained nodes. Think HDAs in Houdini. This will allow users to create and share complex systems where the end user need only worry about plugging something into the node's input, tweaking the parameters exposed at the very top level, and wiring its result out to the next node in their tree.

Another use case for all this is to mix particles with other physics systems, like rigid bodies, Mantaflow, cloth, etc.
And another is using this system to break things, similarly to the Fracture Modifier but with nodes.

So IMHO the use cases presented here are too limited, since only two situations are identified and I think there are more. But maybe I understood it the wrong way, and these situations are included inside those two use cases :slight_smile:

And regarding the production examples, you have to think not just of the colour tornado, but of the colour tornado with different levels of detail and debris, interacting with constrained rigid bodies breaking the environment and affecting the leaves of the trees. All this can be done with a particle node system, and it involves not just particles but actual rigid bodies driven by the particle node system, plus scattered/instanced elements that can be affected by the system and its physics.

So I think it's much more complex than what is presented here as an example :slight_smile:

Finally, there is a question that I don’t understand here:

> Are we building the right product? Is a product still worth building?

I'm not sure what this question means, but if my answer counts for something: YES, without a doubt. Particle nodes + function nodes (which I imagine could be exposed at some point in 2.91), modifier nodes, constraint nodes and others are IMHO very important, because they enable complex scenes and situations and give way more freedom to the user. Again, I'm sure you have all this in mind, but I wanted to state it here too, because I'm not sure I understood this question or why you are questioning this right now :slight_smile:


You hit the nail on the head. Everything in 3D is basically some sort of container (points, vertices, faces and edges), all of which can hold data (attributes). All a node system needs to do is give the user a way of passing that information around and modifying it. So really, one well-designed node system is all you need to achieve anything in 3D.

The only difference between particle nodes and mesh-editing nodes should be the developer-prebuilt nodes for achieving specific tasks that might be too complicated for the average user. These prebuilt nodes should still be nothing more than a node which exposes parameters of the node trees it contains (rather than being hard-coded into the Blender source in C/C++). But they aren't overly important if the system is designed in a way that enables users to build and share complex node trees, expose their important parameters to a front-end node, and then share that node with the community. With Blender's massive community there would be an abundance of nodes allowing users to perform not only particle-related tasks, but any task (modelling, vertex painting, RBD sims, cloth sims, etc.). They're all just the manipulation of attributes travelling around inside their containers in 3D space.

So I'd say stop concentrating on all of the prebuilt nodes, focus only on the system, and let the community start building the tools for everyone to use to achieve anything (not just particle simulation). The only hard-coded nodes should be those that represent low-level programming operations/principles, such as: logic/conditions, loops, variable creation, attribute creation, caching to file, caching to RAM, writing files, reading files, error handling (output to console etc.), data creation/instancing nodes, nodes for accessing and manipulating any data from within the blend file or from other blend files, and a node for entering a script where the user can create and edit attributes programmatically, as well as utilising modules such as bpy, numpy, tensorflow etc.

It also needs a good way of sharing, maybe an online repository searchable from the Shift+A menu. Search local, search community, search devs, with the Shift+A menu also separated into submenus: local, online (devs, community (dev-approved), community (unapproved)).

PS: hoping for GPU acceleration/parallel processing where possible, such as Houdini's loop nodes, which can perform iterations in parallel provided there are no incompatible nodes between the loop-start and loop-end nodes.


We could do with getting rid of the multiple inputs such as event and emitter. Instead, have "raise event" nodes and "receive event" nodes. The raise event node would be triggered when data flows to it (determined by a conditional node, most likely), and then all receive event nodes set to listen for that specific raise event would activate the node tree they're connected to. The receive event nodes should have access to all data which was flowing into the raise event node, without the need for wiring. The receive event nodes should also be able to live anywhere, not necessarily within the same node group/particle system. This way you could trigger any operation anywhere within Blender, such as playing a sound when a raindrop hits an object. It's a good idea to start thinking of the bigger picture beyond just particle nodes, otherwise the design isn't going to easily accommodate future progressions of the node system's integration with other aspects of Blender.
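The raise/receive idea is essentially a publish/subscribe pattern; a minimal sketch (all names invented here, not Blender API) might look like this:

```python
# Sketch of raise/receive event nodes as a tiny event bus: "raise" nodes
# publish a named event with their data, and "receive" nodes anywhere
# can subscribe to it, with no explicit wiring between them.
class EventBus:
    def __init__(self):
        self._listeners = {}

    def receive(self, name, callback):
        """A 'receive event' node registers interest in a named event."""
        self._listeners.setdefault(name, []).append(callback)

    def raise_event(self, name, data):
        """A 'raise event' node fires; all listeners get the full data."""
        for callback in self._listeners.get(name, []):
            callback(data)

bus = EventBus()
sounds = []
# A receive node living outside the particle system, e.g. an audio trigger:
bus.receive("raindrop_hit", lambda data: sounds.append(data["object"]))

# Inside the simulation, a conditional node decides to raise the event:
particle = {"position": (0.0, 0.0, 0.0), "object": "Roof"}
bus.raise_event("raindrop_hit", particle)
```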

This software doesn't even have groups for this yet, and nobody cares, so abstracting data like Houdini… dream on. Yes, I agree Blender should work only with primitive types, and the same for shading… but it's too much aimed at beginners.

PS: Take statixvfx's feedback very seriously, because he is so freaking right. Reconsider before it's too late.

And apply the same to shading: there is no alpha, only float-type data that can be premultiplied with a color type, for example…


Blender must be more blending!

You got the point! Being flexible is key.


We are working towards unifying this. Attributes are stored in a generic attribute system that is shared between meshes, point clouds and hair.

Where particles come in is the nodes. There is a particle solver and particle nodes, similar to POP nodes in Houdini. The input and output of that solver are generic attributes on point clouds.

Scattering is not tied to simulation nodes in the current design, unless you want to use dynamics for scattering. It is part of the particles project because it is a use case that we must support if we remove the old particle system.

Point clouds can be generated through modifiers, and later on through modifier nodes. Instancing scattered objects will be possible on point clouds, meshes, etc.
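To illustrate what "generic attributes in, generic attributes out" could mean for a solver step, here is a sketch assuming a dictionary of per-point arrays (names and structure are illustrative, not the actual Blender implementation):

```python
import numpy as np

# Sketch: a particle solver step whose input and output are both generic
# attributes on a point cloud. Attributes it doesn't know about ("radius")
# simply pass through untouched.
def particle_solver_step(attributes, dt, gravity=(0.0, 0.0, -9.81)):
    out = dict(attributes)
    out["velocity"] = attributes["velocity"] + np.array(gravity) * dt
    out["position"] = attributes["position"] + out["velocity"] * dt
    return out

cloud = {
    "position": np.zeros((4, 3)),
    "velocity": np.zeros((4, 3)),
    "radius": np.full(4, 0.05),   # any extra per-point attribute
}
cloud = particle_solver_step(cloud, dt=0.1)
```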


For the point clouds: when using them for instancing, I wonder if the points can only be Vector3s, or if it would also be possible to have a matrix cloud instead, so one could somehow generate the matrices inside the node tree and pass them into the point cloud. Or are scaling and rotation already handled for the points in the new point cloud design?

There is no implementation for this yet, but instancing should be able to use geometry attributes for rotation and scale. A matrix per point can fit in the design, but I don’t think that’s the way to go. Separate position, rotation and scale attributes fit better with the rest of the system and are more standard I think.
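For illustration, separate attributes lose no generality: a per-instance matrix can always be composed on demand from position, rotation and scale. A sketch with a single Z-axis rotation (function name hypothetical):

```python
import numpy as np

# Sketch: build a 4x4 instance transform from separate per-point
# position, rotation and scale attributes.
def instance_matrix(position, rot_z, scale):
    c, s = np.cos(rot_z), np.sin(rot_z)
    m = np.identity(4)
    # Rotation about Z, uniformly scaled.
    m[:3, :3] = np.array([[c, -s, 0.0],
                          [s,  c, 0.0],
                          [0.0, 0.0, 1.0]]) * scale
    m[:3, 3] = position   # translation column
    return m

m = instance_matrix(position=(1.0, 2.0, 3.0), rot_z=0.0, scale=2.0)
```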


Could there be node group presets for a smooth transition from the old workflow?

(Quick Hairs)

(Quick Particles)


I think the question to be asked is not whether the system is needed, because it is. From single illustrations to VFX teams, particles are needed to scatter simple objects for advertising or to create complex effects for movies.

That said, I feel that the new system will only show its true potential once particles can behave as and interact with rigid bodies. This should be, in my opinion, the first milestone after the foundation is completed and fully functional. The particles should collide with each other and respect their volume in order to create more complex visuals. Without that, the system will only work for abstract work, or to feed Mantaflow simulations (which is expected, I think).

EDIT: I don't know how much time it would take to implement collisions and rigid bodies within particles, but please don't take this challenge as a reason to leave the system half complete or "unnecessary"; instead, use it as a motivator to create a product that can be used at all levels of production.


After playing around a little with the system, without much knowledge of other systems, I like what you're doing. I like how you can create and stack events and (probably) forces as well. It reminds me of X-Particles, which is the easiest particle system out there (as far as I know).

It’s pretty easy to visualize what you potentially want to do with the Simulation Editor. For instance:

These particles from this emitter will do something…

  • When a certain time is reached.
  • When they collide with another object.
  • When they reach a certain velocity.
  • When they reach a certain location.
  • When…, etc.
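Each of these conditions amounts to a predicate over particle state; a tiny sketch (all names illustrative):

```python
import math

# Sketch: each condition above is a predicate over particle state;
# an event fires when its predicate returns True.
def speed(particle):
    return math.sqrt(sum(v * v for v in particle["velocity"]))

conditions = {
    "time_reached": lambda p, frame: frame >= 50,
    "velocity_reached": lambda p, frame: speed(p) >= 2.0,
}

particle = {"velocity": (3.0, 0.0, 4.0)}   # speed is 5.0
fired = [name for name, cond in conditions.items() if cond(particle, frame=10)]
```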

I like the conditions system. It’s very “noob” friendly.

However, we will probably need more high-level nodes (or groups?).

Also, can forces be triggered by certain conditions? Because right now they look like a general attribute of the entire simulation, which isn't that good.

It's probably too soon to ask, but as I pointed out in my post above, rigid bodies are a must.

Also, working with splines/curves to guide particles, and using the point cloud to generate spline paths, is something very much needed, maybe at a basic level of the new system.

Cheers and keep up the good work!


On references/use cases: a classic one that is hard to do currently is dense condensation droplets.
I'd like to be a know-it-all and suggest what core functionality could solve it (neighbour-aware particle distribution? joining/absorbing other particles' volume below a distance threshold?), but AFAIK it isn't solved gracefully anywhere.
Getting the particle's stuck-to-surface output flag to change the particle object would be nice… or even distance to mesh. …Plus a per-particle shape key state? That would allow animations/deformations for taking off from the surface, etc.