Geometry Nodes

About the Point Instance node's ID attribute

I must say I don't quite understand the logic of this "ID" attribute.

It can be used to vary instances randomly, indeed,
but precisely choosing which object goes where doesn't seem to be possible (AFAIK).

I was expecting the ID to respect the order of the objects in the collection, but it seems completely random, so I'm not sure how to use it at all.

Perhaps there's something I'm missing.

That's because it's random. The guarantee right now is only that elements with the same id will get the same object. Note that the id attribute already exists in master; it's just not exposed.
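A plain-Python sketch of what that guarantee means in practice (illustrative only, not Blender's actual implementation): the pick is stable per id, but has nothing to do with the order of the objects in the collection.

```python
import hashlib

# Hypothetical collection contents, in the order a user would see them.
collection_objects = ["Rock", "Bush", "Tree", "Fern"]

def pick_object(point_id, objects):
    # Deterministic per id -- the same id always returns the same object --
    # but the result is unrelated to the collection order.
    digest = hashlib.sha1(str(point_id).encode()).digest()
    return objects[digest[0] % len(objects)]

for pid in range(6):
    print(pid, "->", pick_object(pid, collection_objects))
```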

Is there a way to make it not random? Like having a certain index always refer to a certain object in the collection, so that we can control the instance order?

No, that is not possible right now. If you want to instance specific objects on specific points, you currently have to split up the points with a Point Separate node first.
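A rough plain-Python sketch of what that workaround amounts to (made-up names, just mirroring the node chain): one Point Separate per object, each keeping only the points whose id matches, then instancing that specific object on them.

```python
# Points with an integer id, as Point Distribute would output them.
points = [
    {"position": (0, 0, 0), "id": 0},
    {"position": (1, 0, 0), "id": 2},
    {"position": (2, 0, 0), "id": 1},
    {"position": (3, 0, 0), "id": 0},
]
objects = ["Rock", "Bush", "Tree"]  # hypothetical collection, in a known order

# One "Point Separate" per object: keep the points whose id matches,
# then instance that specific object on the remaining group.
for index, obj in enumerate(objects):
    group = [p["position"] for p in points if p["id"] == index]
    print(f"instance {obj!r} on {len(group)} point(s): {group}")
```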

This has been discussed in more detail here: ⚓ T84608 Specify per-object distribution ratio for moss use case

The guarantee right now is only that elements with the same id will get the same object

If there is no way for users to know for sure that id N == object X, then we are possibly missing the following opportunities:

control the instance spawn rate (%) by id

control instance spawning with color values

control instance spawning from the point scale attribute (important for realism)

control instance spawning with texture color values (very important for green walls, flower patterns, etc.)

All of the above is possible because I naively implemented "instancing by index" with the node group below (in 2.93), with the help of Point Separate.

(screenshot of the instancing-by-index node group)

But this was only possible because the user knows for sure that id N == asset X.
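To illustrate the kind of remap this enables (hypothetical names, just mirroring the math such a node group does): once the id-to-object mapping is predictable, any 0-1 per-point value, whether from scale, vertex color, or a sampled texture, can be turned into "which object to spawn".

```python
# Six objects in a known order, as in the node group above.
objects = ["Moss", "Grass", "Fern", "Bush", "FlowerA", "FlowerB"]

def value_to_index(value, count):
    # Clamp a 0..1 attribute (e.g. a texture sample or a normalized scale)
    # and spread it evenly over the available object indices.
    value = min(max(value, 0.0), 1.0)
    return min(int(value * count), count - 1)

for v in (0.0, 0.2, 0.55, 0.9, 1.0):
    print(v, "->", objects[value_to_index(v, len(objects))])
```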

If you want to instance specific objects on specific points, you currently have to split up the points with a Point Separate node first.

Note that running multiple Point Separate nodes in a row, as suggested, has a severe impact on performance (this setup is 50% slower than random collection instancing), and it also adds a lot of unnecessary complexity to the node tree.

So making this node not pick instances randomly (for example, if there are 6 elements in the collection, the integer values 0-5 would correspond to the object order, and perhaps everything above that would wrap around or be random) would open up a lot of opportunities, while random picking doesn't really add any advantage to this Point Instance node, AFAIK :face_with_raised_eyebrow:
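A minimal sketch of that proposed behaviour, assuming a six-object collection: ids 0-5 pick objects in collection order, and anything above wraps around.

```python
objects = ["A", "B", "C", "D", "E", "F"]  # collection contents, in order

def instance_for_id(point_id, objects):
    # id 0..5 -> objects in order; 6 wraps back to the first object, and so on.
    return objects[point_id % len(objects)]

for pid in range(8):
    print(pid, "->", instance_for_id(pid, objects))
```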

I hope my feedback was useful

Bounding box display isn't a solution either. When viewport shading is set to Rendered preview, all geometry is still drawn in full for the OpenGL portion of the view as well. Here I have a couple of objects with Viewport Display set to Bounds, but in the rendered-preview viewport you can see the monkey outlines outside the render border.

It's very hard to see, because the shading is set to a flat color that exactly matches the background, so it seems to have been made difficult to see by design for some reason. So if you have a complex scene with thousands of high-res objects scattered around, going into rendered mode will basically freeze Blender. Not because Cycles can't render it, but because your GPU is killing itself trying to show you billions of polygons that you can't really see anyway. The only workaround is to disable overlays before going into rendered mode, which seems to kill any OpenGL rendering. But that has its own drawbacks: a) constantly switching overlays is annoying, b) if you forget to do it, things can end very badly, and c) some overlays are important, like the scene info in the top-left corner.
The initial design goal for Geometry Nodes was object scattering/set dressing, but with broken viewport display it remains just a goal, sadly. I wonder how the artists working on Sprite Fright deal with this issue, since they use GN extensively.

I have created a super simple short script for generating L-systems in Blender and am working on turning it into an add-on.
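Not the author's script, just a minimal sketch of the string-rewriting core most L-system generators are built around; a Blender add-on would then walk the resulting string with a turtle to place curve points.

```python
def rewrite(axiom, rules, iterations):
    """Apply the production rules to every symbol, a number of times."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Classic fractal-plant example: F = draw forward, +/- = turn, [ ] = push/pop.
rules = {"X": "F+[[X]-X]-F[-FX]+X", "F": "FF"}
print(rewrite("X", rules, 3))
```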

All that's stopping me is being able to take the .matrix_world from a list of empties and multiply it (with @) into a group of objects' .matrix_world, over and over, as deep as I want.
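For what it's worth, a minimal sketch of that matrix stacking with the current Python API (runs inside Blender; the object and collection names are placeholders):

```python
import bpy

# Placeholder names -- substitute your own empties and source collection.
empties = [bpy.data.objects["Empty.001"], bpy.data.objects["Empty.002"]]
group = bpy.data.collections["Modules"].objects

for empty in empties:
    for source in group:
        copy = source.copy()  # linked duplicate, shares the mesh data
        # Stack the empty's transform on top of the source object's transform.
        copy.matrix_world = empty.matrix_world @ source.matrix_world
        bpy.context.scene.collection.objects.link(copy)
```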

Hello Everyone

Where can I follow the vertices-to-points node task?
There's a node that converts a mesh to a point cloud that is WIP, I believe.

It would be very handy for every scattering workflow that scatters per vertex, for example; right now the attributes are not transferred correctly (vertex groups/vertex colors/UV attributes, for example).

I'll share here what I've managed to achieve with Geometry Nodes; maybe someone will find the info useful.
The last 4 videos are all done only with Geometry Nodes, and I try to explain everything as clearly as possible.

Shield Effect with Geometry Nodes

I'm also curious whether there is a thread or something tracking Geometry Nodes progress.
I recently found this GN build that probably has all these WIP nodes, and I'm curious when we will see more nodes made available in the main builds.

In the near future we'll get instancing from node groups, not just objects/collections, and that's great,
but in that case I have a question: how can we do loops?
For something like this, I would need to change the seed in the nodes for every instance.

Eh? It's the seed of the randomization of that attribute. It makes little sense to randomize the randomizer that is responsible for randomizing an attribute of the array whose output you want to use to randomize the seed.

Why? Take a brick wall, for example: I create one brick and randomize its parameters, then I instance it across the whole wall. Now all the bricks have the same parameters, but if we could vary them per instance id, we'd get a more natural-looking wall without creating many input objects to instance.

And it's not only about the seed: we have many ways to randomize object scale/rotation, and being able to change them per instance id would create much more interesting results.

Attribute Randomize means that the attribute on every element of the geometry input receives a random value. So if you have 100 points, you can randomize their sizes and then distribute brick meshes on those points. That's already possible.
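A plain-Python sketch of that behaviour (illustrative names, not the node's internals): one seed drives one random value per element, so 100 points get 100 individual sizes.

```python
import random

def randomize_attribute(count, seed, minimum=0.5, maximum=1.5):
    # One random value per element, fully determined by the seed.
    rng = random.Random(seed)
    return [rng.uniform(minimum, maximum) for _ in range(count)]

scales = randomize_attribute(count=100, seed=42)
print(scales[:5])
# Instancing one brick mesh on those 100 points with these per-point scales
# already gives every brick its own size; changing the seed reshuffles them all.
```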

That works only for the size of the instance. But what if we have a more complex object, not just cubes or spheres, with GN-generated damaged edges or other procedural damage, etc.?

AFAIK this issue has not been tackled yet;
it's also a hard wall I've been stuck on for a long time.

Would a Per-Vertex option in Point Distribute be relevant? Converting a mesh with UVs is complicated, since a vertex can be associated with multiple faces (or face corners).

1 Like

It could work indeed

But IMHO the best would be this kind of node, as some users have already requested a point-per-face-center or point-per-edge-center feature (above in this thread; *edit: and right below).

(screenshot)

Point per vertex might be the most common, followed by point per face;
I must say I don't see any use case for point per edge :face_with_raised_eyebrow:

*Note that this node would of course require automatic attribute transfer similar to what Point Distribute does, because that's the main problem with the current per-vertex instancing workflow.

About the attribute problem: it's true that face-corner data can be tricky to convert, but isn't that already done by the Attribute Convert node?

I explored the AttributeConvert node, and it can indeed be used to convert UV coords, but it uses a 'mixer' construct to perform some kind of weighted interpolation, which is great for colors but not so much for material_index or UV coords. It seems possible to introduce an option to change the interpolation method to some discrete selection approach though (e.g. 'first entry').

AttributeTransfer results in discrete selection, but its intended use case does not seem to be geometry with a 1:1 match, so one probably still wants to solve this with a conversion method.

Using the AttributeConvert node, one can convert the vertex position attribute to the face domain and generate interpolated 'face' coordinates, so it seems one can almost achieve the behavior, except for converting the data to a point cloud.
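To make the difference concrete, a small plain-Python sketch (illustrative only, not Blender's code) of the two conversion strategies mentioned above: weighted mixing of the corner values versus a discrete 'first entry' pick.

```python
# Two triangles sharing an edge, with one UV coordinate per vertex.
faces = [(0, 1, 2), (2, 1, 3)]
uv_per_vertex = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (0.0, 1.0), 3: (1.0, 1.0)}

def face_uv_interpolated(face):
    # Weighted/averaged conversion, like the 'mixer' construct: fine for colors,
    # but it produces UV values that exist nowhere on the original mesh.
    us, vs = zip(*(uv_per_vertex[i] for i in face))
    return (sum(us) / len(face), sum(vs) / len(face))

def face_uv_first_entry(face):
    # Discrete "first entry" selection: keeps a real value, which is what
    # you'd want for material_index-style data.
    return uv_per_vertex[face[0]]

for f in faces:
    print(f, face_uv_interpolated(f), face_uv_first_entry(f))
```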

I might have missed something though, and it feels like the fields proposal is definitely haunting :ghost: this issue as well.
