Geometry Nodes

I think I might rather have it be one object from the beginning. I know geometry nodes is focusing on immediate use cases right now, which at the moment means scattering, but that won’t be the only use case. For procedural modeling and other workflows I think you want to use “full copies” right away. I don’t mind too much if I have to drop a “make instances real” node before operations that instances can’t support (voxel remeshing, proportional editing, etc.), but it does make me wonder what the most common scenarios for instancing in geometry nodes would be.

To me the areas that would benefit the most from having instances right away are scattering and dynamics, no? Basically the ones that need it purely for performance. Set dressing can get nasty with grass, pebbles, leaves and so on, and physics simulations might also like things to be packed and instanced. Other than that, it mostly seems desirable to have the granular control that full copies provide, especially for procedural modeling.

This is just my opinion of course, but I think I would prefer the opposite of what is planned: have all the copies considered as one object and act as full copies, unless told otherwise with nodes like “make instances”, “pack”, etc. Another option would be a checkbox on the Point Instance node that reads “instance” (though I guess it would need a different name at that point), so you can toggle it depending on whether you want the copies to be real or instanced.
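For reference, Blender already has an object-level version of this conversion: the Object ▸ Apply ▸ Make Instances Real operator. Below is a minimal Python sketch of calling it, shown only to illustrate the kind of instances-to-real-geometry step being discussed; whether it catches every geometry-nodes instancing case in a given build is an assumption.

```python
# Hedged sketch: the existing object-level "Make Instances Real" operator, shown only
# to illustrate the kind of instances-to-real-geometry conversion discussed above.
# It assumes the active object is an instancer; coverage of every geometry-nodes
# instancing case in a given build is not guaranteed.
import bpy

bpy.ops.object.duplicates_make_real(
    use_base_parent=True,  # keep the realized objects parented to the instancer
    use_hierarchy=True,    # preserve the parent/child hierarchy of the instances
)
```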


Considering that random picking from a collection may not make it into the 2.92 release, what would be the workflow for scattering different objects? Having a separate chain of nodes for each object and trying to control everything with vertex groups?

Thanks in advance!

That would require some very robust safeguards to prevent users from converting too many instances into a single object.

Why not? It’s already in the daily builds. Just uncheck “Whole Collection” in the Point Instance node.

Oooh, nice. My bad. Thank you!
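In case it helps, here is a minimal Python sketch of the “one chain per object, masked by vertex groups” workflow discussed above. The node type identifiers (GeometryNodePointDistribute, GeometryNodePointInstance, GeometryNodeJoinGeometry), socket names, and the object and vertex-group names are assumptions based on the 2.92-era alpha and may differ in your build.

```python
# Hedged sketch: one Point Distribute + Point Instance chain per scattered object,
# each masked by its own vertex group, joined into a single output. Node and socket
# identifiers follow the 2.92-era alpha and may have changed since.
import bpy

emitter = bpy.data.objects["Grid"]                       # hypothetical emitter with vertex groups
to_scatter = [(bpy.data.objects["Rock"], "rocks_mask"),  # hypothetical objects and mask names
              (bpy.data.objects["Grass"], "grass_mask")]

tree = bpy.data.node_groups.new("Scatter", "GeometryNodeTree")
tree.inputs.new("NodeSocketGeometry", "Geometry")
tree.outputs.new("NodeSocketGeometry", "Geometry")

group_in = tree.nodes.new("NodeGroupInput")
group_out = tree.nodes.new("NodeGroupOutput")
join = tree.nodes.new("GeometryNodeJoinGeometry")

for obj, vgroup in to_scatter:
    distribute = tree.nodes.new("GeometryNodePointDistribute")
    instance = tree.nodes.new("GeometryNodePointInstance")
    # Pointing the density attribute at a vertex group name lets that group mask
    # the scattering (socket name is an assumption).
    distribute.inputs["Density Attribute"].default_value = vgroup
    # In some builds the instanced object may be a node property rather than a socket.
    instance.inputs["Object"].default_value = obj
    tree.links.new(group_in.outputs["Geometry"], distribute.inputs["Geometry"])
    tree.links.new(distribute.outputs["Geometry"], instance.inputs["Geometry"])
    tree.links.new(instance.outputs["Geometry"], join.inputs["Geometry"])

tree.links.new(join.outputs["Geometry"], group_out.inputs["Geometry"])

modifier = emitter.modifiers.new("Scatter", "NODES")
modifier.node_group = tree
```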

I’m not sure why it would be beneficial to guarantee that a full copy is done for every instancing operation. A copy can always be made implicitly if the data needs to be changed.

But yeah, outputting a single object of “instances” will probably need to be supported eventually, maybe just because having lots of instances can make the viewport pretty slow.


The Point Distribute node now creates a normal attribute for each point, which is great. However, I can’t find a way of creating normal attributes on regular mesh points. Not sure if I am missing something here… Is there any reason why all points don’t have a normal attribute created by default, in the same way that they have a default position attribute?



Every geometry should have a normal attribute by default, and the Point Distribute node would just transfer all the geometry attributes to point attributes. This was already said above (see the image below): the generated point cloud should automatically inherit its emitter’s attributes. It is extremely logical to get the normal from the geometry, as normals are part of the geometry.


It’s a bit weird that the points don’t inherit their emitter’s attributes. Not sure if I’m the only one thinking that? Anyone else?

Currently, the way attributes are generated and passed (or not passed) between nodes is a bit messy.

Edit: it seems that the devs are indeed planning such a feature.

If that were already in place, the distribute node wouldn’t need any normal attribute generation, as the normal attribute could just be inherited from the emitter.
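Until that inheritance exists, the vertex normals can at least be copied into a named attribute from Python. The sketch below assumes the build already exposes the generic attribute API (mesh.attributes); the attribute name vert_normal and the object name Grid are placeholders.

```python
# Hedged sketch: copy the built-in per-vertex normals into an explicit point-domain
# attribute so downstream nodes can address them by name. Assumes mesh.attributes
# (the generic attribute API) is available in your build.
import bpy
import numpy as np

mesh = bpy.data.objects["Grid"].data                 # hypothetical emitter mesh

normals = np.empty(len(mesh.vertices) * 3, dtype=np.float32)
mesh.vertices.foreach_get("normal", normals)         # read all vertex normals at once

attr = mesh.attributes.new(name="vert_normal", type="FLOAT_VECTOR", domain="POINT")
attr.data.foreach_set("vector", normals)             # write them into the new attribute
```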


I can see a generalization from here: the user may want particles or instances to inherit other kinds of attributes from the source geometry. “Which and how” calls for explicit attribute transfer, I think (even though it may sound obvious for this particular use case, I don’t think it should be done silently).
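For what an explicit transfer could amount to in practice, here is a minimal Python sketch that copies a per-vertex value from the emitter onto scattered points by nearest-vertex lookup. It is independent of any node; point_positions and source_values are hypothetical plain lists.

```python
# Hedged sketch: explicit attribute transfer by nearest emitter vertex, using a
# KD-tree from mathutils. This is a conceptual stand-in, not the node itself.
import bpy
from mathutils import kdtree

emitter_mesh = bpy.data.objects["Grid"].data          # hypothetical emitter mesh

kd = kdtree.KDTree(len(emitter_mesh.vertices))
for i, v in enumerate(emitter_mesh.vertices):
    kd.insert(v.co, i)
kd.balance()

def transfer(point_positions, source_values):
    """Return one value per point, copied from the nearest emitter vertex.
    Both arguments are hypothetical lists (point coordinates, per-vertex values)."""
    result = []
    for co in point_positions:
        _location, index, _distance = kd.find(co)
        result.append(source_values[index])
    return result
```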


Is it feasible to have nodes that can create vertex weights to be used as attributes?

Maybe some sort of 0-1 weight influence node.
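Until such a node exists, a 0-1 weight can be generated from Python and written into a vertex group, which can then be referenced by name as an attribute (for example as a density mask). A minimal sketch with an arbitrary distance-based falloff; the object and group names are placeholders.

```python
# Hedged sketch: fill a vertex group with a 0-1 weight so it can be used as an
# attribute (e.g. a density mask). The falloff here is just an arbitrary example.
import bpy

obj = bpy.data.objects["Grid"]                    # hypothetical emitter object
group = obj.vertex_groups.new(name="density")     # hypothetical group used as attribute

falloff_radius = 2.0                              # arbitrary radius for the example
for vert in obj.data.vertices:
    weight = max(0.0, 1.0 - vert.co.length / falloff_radius)  # 1 at origin, 0 beyond radius
    group.add([vert.index], weight, "REPLACE")
```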

Hi devs, currently the Point Distribute and Point Instance nodes hide the original (emitter) object; in the picture below the Grid object gets hidden.

If I want to paint weights I either have to mute both nodes or turn off the Geometry Nodes modifier to see the emitter object in Weight Paint mode, but then of course I lose the preview of the scattered instances, and that’s counterproductive.

I see that if I Tab into Edit mode the emitter object (its mesh) reappears, and it disappears again when I Tab back to Object mode; I guess the same logic should be applied to Weight Paint mode.

Also, could the Point Distribute and Point Instance nodes have a checkbox to unhide the emitter object even in Object mode? Let’s say the Grid object in the picture above is a terrain: these two nodes hide it, so I actually need another copy of the Grid object to keep both the scattered instances and the “terrain” visible in the viewport.

You need to merge the geometry input and the point cloud node.

Ah OK… I thought they had removed the point cloud object from Blender completely, but it’s an experimental feature right now.

I think @BD3D just meant to use a Join Geometry node to combine the node group input (the grid) and the points from the Point Distribute node.


Ahhh now I understand the reason for joining the geoms. Thanks @HooglyBoogly and @BD3D :kissing_heart:

How? You don’t need to explain it; if you have a node tree I would be happy to just see a screenshot. :smiley: Thanks.
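Since there is no screenshot here, a minimal Python sketch of the node tree being described may help: the group input (the grid) is fed both into the scattering chain and into a Join Geometry node, so the emitter stays visible alongside the instances. Node and socket identifiers are assumptions based on the 2.92-era alpha and may differ in your build.

```python
# Hedged sketch of the "join the grid back in" node tree. Assumes an existing
# geometry-nodes modifier group named "Geometry Nodes" with default group
# input/output nodes; identifiers follow the 2.92-era alpha.
import bpy

tree = bpy.data.node_groups["Geometry Nodes"]
nodes, links = tree.nodes, tree.links

group_in = nodes["Group Input"]
group_out = nodes["Group Output"]
distribute = nodes.new("GeometryNodePointDistribute")
instance = nodes.new("GeometryNodePointInstance")
join = nodes.new("GeometryNodeJoinGeometry")

links.new(group_in.outputs["Geometry"], distribute.inputs["Geometry"])
links.new(distribute.outputs["Geometry"], instance.inputs["Geometry"])
# Feed both the instances and the untouched grid into the join, so the output
# contains the scattered objects and the original emitter geometry.
links.new(instance.outputs["Geometry"], join.inputs["Geometry"])
links.new(group_in.outputs["Geometry"], join.inputs["Geometry"])
links.new(join.outputs["Geometry"], group_out.inputs["Geometry"])
```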

Just tested the align-to-normal node and I got some really strange results! Why are my instances constantly rotating? What is happening? The normal attribute of the points seems to be re-evaluated each frame? Why aren’t the points simply using the normals of the mesh?

Is this a bug? Blend file available below.

I tried your blend file, but there’s no “align to normal” node like you mentioned above. Anyway, I updated your file; is this what you expected to get?


Yes indeed, this is what I believe is expected, but when scattering from a new object dedicated to scattering, the result is constantly re-evaluated, and that doesn’t make a lot of sense, as the mesh normals didn’t change during the rotation.

(Sorry for the vocabulary inconsistency: by “point cloud node” I meant Point Distribute (internally it is creating a point cloud object, I think?), and by “normal align node” I meant the new align-to-normal feature.)

When scattering from a new object dedicated to scattering, the result is constantly re-evaluated, and that doesn’t make a lot of sense, as the mesh normals didn’t change during the rotation.

It actually makes sense, since you’re rotating the Emitter Sphere, but you should do the same with the Scatter Object as well. Just make the Scatter Object a child of the Emitter Sphere (see picture below) and it’ll work as you’d expect.

I see your point, but I would agree with @BD3D that it doesn’t make sense, or at least it’s inconsistent. Usually object transforms are applied on top of everything (the source of the famous ‘Why does my bevel look weird?’ confusion most of us have probably faced at some point).

Therefore I’d expect the instances to transform together with the instancer. The old vertex instancing worked like that, as well.

The feature is not there yet.

Jacques added an Align Rotation to Vector node that will be used for that once it becomes possible to get the surface normals of a mesh as an attribute. But that last step has not been done yet.
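For reference, the math that an align-rotation-to-normal step implies can be sketched with mathutils: build the rotation that takes an instance’s +Z axis onto the surface normal. This is only an illustration of the idea, not the node’s implementation; `normal` below is a hypothetical per-point normal value.

```python
# Hedged sketch: rotation aligning the +Z axis to a given surface normal, the kind
# of operation an "Align Rotation to Vector" step performs per point.
from mathutils import Vector

normal = Vector((0.3, 0.2, 0.93)).normalized()              # hypothetical point normal
align = Vector((0.0, 0.0, 1.0)).rotation_difference(normal) # Quaternion: +Z onto normal

# Applied to an instance transform it would look roughly like:
# instance_matrix = align.to_matrix().to_4x4() @ instance_matrix
```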

I think it would be better for the instanced objects to be evaluated in their own local space only; right now, transforming them has an effect on how they’re scattered, and I don’t think it should.