Geometry Nodes

The spreadsheet only shows the top-level instances. Ideally we would implement something like an outliner tree view to see the full tree of nested instances. Understandably that gets a bit complicated though.


My current issue with nested instances is that I need the Realize Instances node to have level controls; otherwise, once you nest them, you cannot come back. I saw somewhere that there was a plan for this, but I don’t know the details. I hope we get it soon.
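To illustrate what a level control would mean, here is a minimal sketch, not Blender API: nested instances are modeled as nested lists, and a hypothetical `realize` helper flattens only as many levels as requested, so nesting can be undone one step at a time.

```python
# Hypothetical sketch of a "Realize Instances" node with a depth control.
# An instance tree is modeled as nested lists; realizing one level
# flattens only the outermost layer instead of everything at once.

def realize(instances, depth=1):
    """Flatten nested instance lists by at most `depth` levels."""
    if depth == 0:
        return instances
    result = []
    for item in instances:
        if isinstance(item, list):
            result.extend(realize(item, depth - 1))
        else:
            result.append(item)
    return result

tree = ["a", ["b", ["c", "d"]], "e"]
print(realize(tree, depth=1))  # one level: ['a', 'b', ['c', 'd'], 'e']
print(realize(tree, depth=2))  # fully flat: ['a', 'b', 'c', 'd', 'e']
```

With `depth=1` only the outer nesting is realized, which is exactly the "come back one level" behavior the node currently lacks.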

Are there already dates for when we could expect mesh edit nodes like extrude, bevel, or recalculate normals? Knowing is better than patiently hoping. Of course I could do the mirrored mesh parts differently than just scaling along one axis by -1, but this way the node tree is much nicer, so… please!


Wow, great news!

I tried a 3.1 build, but I’m still kind of stuck here, because I can see that the data is there, but I can’t figure out how to read and write it in the “Fields” concept.

I suppose this should work, but it doesn’t:

Note that I have some data in the viewer, but it is lost after realizing. Maybe the patch is not in the build yet? I tried the December 16, 02:21:44 build - b265b447b639

I’ve got the same problem. Is this setup wrong?


I need to color the instances.

Oh! Finally got it. It says the Realize Instances node is legacy, kept for backward compatibility. So I deleted it and added it again. It works, cool!


We can’t transfer attributes to instances? What if you want to keep them as instances?


Sorry, I don’t quite get your question. In order to put some data into a vertex color like in my example, I have to convert the instances into real geometry.


My question would be: can we put data on instances? Realizing instances won’t be an effective workflow when there are a lot of them.


It would be better if you said what exactly you’re trying to achieve, because using data on instances is very limited. For performance reasons, you can turn off Realize Instances in the viewport and turn it back on for the render:

This setup is 16 times faster without realizing in my case.

That would be the same; in a complex scene, realizing would still kill the system memory.

The thing is that we may need to store information on instances, like a color attribute per instance for example: not modifying the geometry per se, but storing some data on every instance.
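A small sketch of why per-instance storage matters; this is only an illustrative data model, not Blender API. The geometry is stored once and each instance carries only a transform and its own attributes, whereas realizing first would duplicate the mesh and store the attribute per vertex on every copy.

```python
# Hypothetical data model: keep geometry instanced and store attributes
# per instance instead of on a realized mesh. Illustrative only, not
# the Blender API.

base_mesh = {"verts": 100_000}  # shared geometry, stored once

# Each instance: a transform plus its own attribute values.
instances = [
    {"transform": (i, 0.0, 0.0), "color": (i / 4, 0.0, 0.0)}
    for i in range(5)
]

# A per-instance color costs one value per instance...
per_instance_values = len(instances)
# ...while realizing duplicates the mesh and stores the attribute
# per vertex on every copy.
realized_values = len(instances) * base_mesh["verts"]

print(per_instance_values, realized_values)  # 5 vs 500000
```

The gap only widens with more instances, which is why "just realize first" stops being a workable workflow at scale.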


So that’s not possible yet? In this tutorial Bradley uses some custom nodes to achieve that in 3.0, but I thought the devs added the option to do it in 3.1.

@MichailSoluyanov we’re trying to get the same result without realizing the instances; we should be able to store attributes on them and not on the mesh.


That was my understanding, but for now I only read this for realized instances, not actual instances.

Hi, I used this part of the node tree to generate UVs on procedural wires/ropes. With a closed circle profile I had a wrapping issue in the UVs, so I opened the circle and snapped the last vertex to the first by transferring the position using indices:

I wonder if there is a way to write the normals just like I did with the position, and get rid of the hard edge. (There is no Merge by Distance yet, right?)

I mean, copying the normal of the last edge point to the first.
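The idea can be sketched outside of nodes: after snapping the last vertex onto the first, give both vertices one shared normal (the renormalized average) so shading is continuous across the seam. A minimal NumPy sketch, not a node setup:

```python
import numpy as np

# Sketch of hiding the hard edge on an opened circle profile:
# after snapping the last vertex onto the first, give both the same
# (averaged, renormalized) normal so shading is continuous there.

def weld_seam_normals(normals: np.ndarray) -> np.ndarray:
    """Average the first and last normals and assign the result to both."""
    out = normals.copy()
    shared = normals[0] + normals[-1]
    shared /= np.linalg.norm(shared)  # keep it a unit vector
    out[0] = shared
    out[-1] = shared
    return out

normals = np.array([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0],
                    [0.8, -0.6, 0.0]])
welded = weld_seam_normals(normals)
print(welded[0], welded[-1])  # identical unit vectors
```

In node terms this corresponds to transferring the last point’s normal to the first by index, the same trick already used for position, once writing normals becomes possible.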


For a curve with a closed circle profile, the UV already has a gap; it makes no sense for you to separate it yourself.

  • You can get the normal of the curve and output it to the shader.

With the closed curve profile, it looks like the U value of the last vertex gets interpolated with the first, and it becomes visible as a “seam” on the extruded curve (I’ll post a screenshot ASAP). That’s why I wanted to break it and snap the vertices. The problem now is the hard edge, which I would like to fix by copying the normal; I’m not sure how to output the normal to the shader and use it. My goal is to copy the last vertex’s normal to the first one to hide the cut. But besides this particular use case, is a way to manipulate/write normals planned, just like we do with the position attribute? Or is it even possible right now and I’m unable to figure it out?

I’ve noticed that you have to make a lot of subdivisions to get good results when sampling textures. Is this expected behavior, or have I missed something?


In theory, yes, because the texture is evaluated at each vertex, so you depend on the mesh resolution.
But if you first create points, and then remove points depending on the texture, the quality will depend on the density of the points instead.
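The point-based approach can be sketched like this: scatter points, evaluate a texture function at each one, and keep only the points above a threshold, so the result’s quality depends on point density rather than on mesh resolution. The `texture` function here is a made-up stand-in, not a Blender texture node.

```python
import random

# Hypothetical texture: bright on the right half of the unit square.
def texture(x, y):
    return 1.0 if x > 0.5 else 0.0

random.seed(0)
# Scatter points; their density, not any mesh resolution, now
# controls how well the texture's shape is resolved.
points = [(random.random(), random.random()) for _ in range(1000)]

# Keep only the points where the texture is above the threshold,
# the equivalent of a Delete Geometry step driven by a field.
kept = [p for p in points if texture(*p) > 0.5]

print(len(kept))  # roughly half the points survive
```

Doubling the number of scattered points sharpens the sampled shape without touching any mesh, which is the advantage being described above.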


Yes, I understand what you mean.
After all, without islands of polygons we cannot make a gap.
I forgot that.
I think it will be easier to use the 3rd component of the UV vector as a blend factor to create a sharper transition.

A little Mantaflow experiment done with the help of Geometry Nodes.
