Geometry Nodes

Is it currently not possible to replicate the modifier’s layout with Python? If I try

row.prop(obj.modifiers[0], "Input_5")

for example, I just get an error. I would like to update my Modifier List addon for 2.92 (and 2.93).

If it’s not possible currently, I’ll just show some text instead. If that’s the case (as I suspect), will it become possible in the future?

1 Like

It’s an early work in progress but it is up and running… https://www.youtube.com/watch?v=EdOhC0yGfh4

6 Likes

Will this be able to distribute points in the full volume of the input geo rather than just in clumps around the verts? Can’t wait for this to be a feature; I use the hair particle “emit from volume” setting all the time for foliage.

Yes, this is scattering in the VDB volume, I just used points to volume as it seems to be the only reliable way at the moment to use volumes in geometry nodes. Once you can convert meshes to volumes this will handle that as well.

1 Like

Saw this on Twitter as a way to indicate where you’re using reserved-name attributes. It seems like a good idea to distinguish where you’re going to be writing to position / rotation / scale especially, as these can destroy your geo pretty easily, and people, especially beginners, are going to get stung losing track of attributes.

5 Likes

I haven’t tried it myself yet, but you could try to use row.prop(obj.modifiers[0], '["Input_5"]') because these are custom properties on the modifier.

The identifiers (such as Input_5 here) are the socket identifiers in the node group. We might be able to customize those one day, but it does not have high priority currently.

2 Likes

That works, thanks! Now I just need to loop through the properties to display them all. Hopefully that’ll work. :slight_smile:

Edit: Except the name shown in the UI is also “Input 5”. But I can probably get the correct name, in this case “Level”, from the group input node. Need to investigate…
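
For anyone else wiring this up in an addon, a minimal sketch of that loop might look like this (assuming mod is a Geometry Nodes modifier and layout comes from a panel’s draw method; the Input_5 / “Level” names are just the example from above):

def draw_geometry_nodes_inputs(layout, mod):
    # Only Geometry Nodes modifiers expose their group inputs as ID properties.
    if mod.type != 'NODES' or mod.node_group is None:
        return
    for socket in mod.node_group.inputs:
        identifier = socket.identifier  # e.g. "Input_5"
        # The geometry input itself has no matching property on the modifier.
        if identifier not in mod:
            continue
        # Draw the custom property, but label it with the socket's UI name,
        # e.g. "Level" instead of "Input 5".
        layout.prop(mod, '["%s"]' % identifier, text=socket.name)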

I’m currently trying to create a simple layered object from a plane using a solidify modifier for the thickness and geometry nodes for the individual layers. As you can see in the screenshot, the layers don’t follow the vertices of the plane correctly.

Am I doing something wrong or is this not possible yet?

1 Like

I believe the issue is your Z scale. When you’re compressing your angled solidified section, it’s having the effect of shearing the geometry. You’d get the same result without the geo nodes if you just scaled the object in Z with only the solidify modifier.

I have an Attribute question for the more techy folk here:

Are there any illegal characters in attribute names? Could I call something x'' or p.z>p.y or c*, etc.?
From my testing it appears to work, but is it safe to do this?

Currently we don’t have any restrictions for what characters you include in attribute names. I’d definitely not recommend any of those examples, but we also don’t want to impose arbitrary limitations.
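
For illustration only, the Python API will currently accept names like those without complaint (a quick sketch assuming a mesh object is active; definitely not a recommendation):

import bpy

# Sketch: create point-domain float attributes with unusual names.
# The API accepts these today, but such names are best avoided.
mesh = bpy.context.object.data
for name in ("x''", "p.z>p.y", "c*"):
    if name not in mesh.attributes:
        mesh.attributes.new(name=name, type='FLOAT', domain='POINT')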

1 Like

As long as addons can’t run arbitrary code from attribute fields…

Cloth simulation test using a Voronoi texture to create vertex group pinning with an empty falloff.

44 Likes

I’ve been having a lot of fun with Geometry Nodes, and especially with being able to vary attributes on instances! I’m excited to test out setting attributes on instances using the Python API and accessing them through Cycles for rendering.

A feature that I would love would be functionality to set velocity vectors on mesh vertices and be able to use these for rendering motion blur. Examples where this could be useful are for:

  • Addons that generate procedural meshes with changing topology. This would make it possible to render these meshes with motion blur.
  • Addons that generate custom “particle systems” where the particles are instanced on mesh vertices. This opens up the possibility of rendering the instanced particles with motion blur.

These are two features that we would use in our development of the FLIP Fluids liquid simulation addon, and would also make so many of our users very happy. Even just a workaround of exporting the motion blur attributes of a mesh sequence to an Alembic cache would be useful.
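
To make the request a bit more concrete, per-vertex vector data can already be written with the generic attribute API; having the renderer read it for motion blur is the part being asked for. A rough sketch (the attribute name “velocity” is an assumption here, not something Cycles consumes today):

import bpy

# Sketch: store a per-vertex velocity vector as a generic mesh attribute.
# Whether the renderer uses it for motion blur is the feature being requested.
mesh = bpy.context.object.data
attr = mesh.attributes.get("velocity")
if attr is None:
    attr = mesh.attributes.new(name="velocity", type='FLOAT_VECTOR', domain='POINT')

# Flat [x, y, z, x, y, z, ...] list, one vector per vertex.
velocities = [0.0] * (len(mesh.vertices) * 3)
# ... fill in per-vertex velocities computed by the addon ...
attr.data.foreach_set("vector", velocities)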

6 Likes

Any suggestions about how to access modifiers or shapekeys of the instanced objects?

I feel like I need to put this out there again now that I’ve had time to get into Geo Nodes properly and use it in a few projects. I do really enjoy using it, and it’s been super helpful for some landscape design visualisation I’ve been doing this week, for creating borders, distributing trees and such. For that, it’s been invaluable.

However, I really don’t think that forcing the attribute workflow through geometry sockets is the way forward. We need to be able to break out an attribute, work on it separately and then merge it back in. This hyper-linear workflow doesn’t lend itself to nodes, and especially not to complex transformations. It makes debugging a nightmare when you have 20+ attributes and can’t quite remember whether you’ve overwritten one, or whether you want p.y3 or p.y4. Granted, the spreadsheet will make this better, but it’s not the best / only solution.

I don’t see why we can’t break out a list from an attribute, process it with math / vector math / colour nodes etc., and then feed it back to the geo when we’re ready to use it for position or rotation or something where it will affect the geometry. We already have the attribute nodes able to process lists of data against other lists / floats / vectors.

I imagine there will be decisions needed around how to process lists of different lengths: whether you repeat the last element or repeat the full list to reach the longer list’s length, or cull extras to match the shorter list’s length. Something like a List Match node could be made to manage that manually for specific uses, with repeat-last as the default behaviour otherwise. This would open up a lot of options for working in a more interconnected way, integrating multiple geometries’ data. You could even share object data wholesale for useful effects like a position lerp to imitate shape keys, or translating instances between predefined locations, etc.

I foresee attributes becoming increasingly prohibitive as we move forward into parametric modelling tools, so maintaining a more visual, low-level system would be much easier and more manageable for the user. You could still write to attributes in order to share between different silos within Blender (shaders / particles / collection nodes etc.), but all the processing of data could be done visually, without pulling the bulk of the geometry socket with you and having to manually declare which attribute you’re working on at every single step. The String node was a welcome addition but, again, it doesn’t illustrate the flow of data and doesn’t really help when you need to drop in an extra node halfway through and then manually check everywhere you’ve used a string to potentially update it.

Case in point, here is a point culling method using a pyramid SDF:

Even this fairly simple conic SDF culling method is not very approachable:
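
For comparison, the underlying test is only a few lines of plain code; this is a hypothetical cone-containment check with made-up parameters, not the exact node setup from the screenshot:

import math
from mathutils import Vector

# Hypothetical cone parameters, for illustration only.
apex = Vector((0.0, 0.0, 2.0))
axis = Vector((0.0, 0.0, -1.0))   # unit direction from the apex towards the base
half_angle = math.radians(30.0)
height = 2.0

def inside_cone(point):
    # True if the point lies inside the cone, i.e. it should be culled.
    v = point - apex
    t = v.dot(axis)                  # distance along the axis
    if t < 0.0 or t > height:
        return False
    radial = (v - axis * t).length   # distance from the axis
    return radial <= t * math.tan(half_angle)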

9 Likes

Just curious, have you seen this? T85655: Attribute Processor for UX improvement

6 Likes

I have, yeah. It would definitely be a useful tool and would pick up a lot of the heavy lifting from these math-heavy trees. I still think that being able to split away from the Geometry socket and later merge back into it would be the optimal solution in terms of flexibility.

5 Likes

I’m also interested in that feature.
For now I have found a workaround. I use the particle system to distribute a single-vertex object in the volume of the object I want to fill (grid distribution), then I convert the particle system and join the generated points into a single object. Finally, I apply Geometry Nodes to this object and instance these points, which fills the volume.
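
If anyone wants to script that conversion step, a rough sketch along these lines should work (the object name “Emitter” is just a placeholder for the object carrying the grid/volume particle system):

import bpy

# Sketch: read the evaluated particle positions (world space) and build a
# vertex-only object from them; the Geometry Nodes modifier then goes on it.
depsgraph = bpy.context.evaluated_depsgraph_get()
emitter = bpy.data.objects["Emitter"].evaluated_get(depsgraph)
coords = [p.location.copy() for p in emitter.particle_systems[0].particles]

mesh = bpy.data.meshes.new("VolumePoints")
mesh.from_pydata(coords, [], [])
points_obj = bpy.data.objects.new("VolumePoints", mesh)
bpy.context.collection.objects.link(points_obj)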

4 Likes

+1 !!

FLIP Fluids is such an amazing, well-made and stable addon. It’s in the top 5 of Blender’s best addons, along with Scatter, Retopoflow…

Having a way to get motion blur would be awesome.

2 Likes