Geometry Nodes

BTW, if you also vary the Density Factor using attribute painting, then Distance Min won’t help you at all, like in this example with the road and vegetation instances:

What setup would you use so that instances are spaced more or less evenly regardless of the Density Factor?


Why is it an inconvenient workaround? If you want an even distribution, it's much less inconvenient to start with an even distribution and then vary it a bit than it is to try to force a node meant for scattering into something it wasn't made for.

If you start with a grid or a mesh line you have perfectly evenly spaced points.

Yes, it looks like I will probably end up using a grid (in my case the ground mesh vertices) to create the points.
I just need to figure out how to give it the functionality of Distribute Points on Faces, specifically how to pair it with a density attribute (so I can paint the distribution) and how to achieve stable IDs, so my collection's children aren't regenerated after each adjustment.
Thanks!

@jendabek Exactly what I was about to say. Start from an even distribution and introduce some chaos instead. You can paint it by simply deleting points that are close to a masked vertex on the ground mesh.
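Roughly, the idea in plain Python (outside of Geometry Nodes) would look like the sketch below; the grid size, spacing, jitter amount and the `density` function are just placeholders for whatever painted attribute and values you actually use:

```python
import random

# Sketch only: even grid -> small jitter -> probabilistic keep based on a
# painted density value. All numbers and the density() callable are placeholders.
def scatter(grid_size=50, spacing=1.0, jitter=0.3, density=lambda x, y: 1.0, seed=0):
    rng = random.Random(seed)  # fixed seed keeps the jitter deterministic
    points = []
    for i in range(grid_size):
        for j in range(grid_size):
            x = i * spacing + rng.uniform(-jitter, jitter)
            y = j * spacing + rng.uniform(-jitter, jitter)
            # gradual density mask: keep the point with probability density(x, y)
            if rng.random() < density(x, y):
                # (i, j) acts as a stable ID: it does not change when other
                # points are dropped by the density test
                points.append(((i, j), (x, y)))
    return points

# example: density falls off with x, so fewer points survive on the far side
print(len(scatter(density=lambda x, y: max(0.0, 1.0 - x / 50.0))))
```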

Thanks, I managed to get attribute density & stable IDs working using this setup.

If I may ask an additional question regarding Geometry Nodes in general:
Since modifiers also make their target objects sluggish to edit, I am used to enabling “hide in viewport” on all objects whose modifiers target the object I am about to edit, which makes editing performance (e.g. color attribute painting) completely smooth again.
So far this has worked with every modifier I used (like Shrinkwrap or Data Transfer), but not with Geometry Nodes - even after hiding the Geometry Nodes object, the objects used in its node tree are not freed, so editing them remains slow.

Is this a bug I should report?

Here you go:
https://drive.google.com/file/d/17L94YC0-OBGkHMIlf0L8GM8IEO59uYOr/view?usp=sharing

I unfortunately did not have time to clean it up, improve UX and test everything, but it should give you a good base:


The terrain is intentionally decimated to show that you don’t need regular topology to use it.

EDIT: Here's a slight modification to make the density mask work gradually instead of with a sharp cutoff:



How would you go about generating something like this in geo nodes?

Is this even possible yet?

So what I want to do is generate splines connecting points based on a distance threshold. For example, if point A is within the threshold of four other points, I want to generate four splines connecting point A with those four points, and then do the same for every point in the point cloud.
The more I think this through, the more I realise I need some sort of loop to do this, unless there's a dedicated node for it.

Any insight on an approach for something like this?
Cheers


I was using this method just recently. As you mentioned, it is not ideal. I'm not talking about UI inconveniences, but about performance. Decimate on very large meshes is horrendously slow, to the point where you just apply the Geometry Nodes and Decimate modifiers at the bottom of the stack just to get your work done. I keep a copy of the object with unapplied modifiers as a backup.

The Merge by Distance node covers most Decimate use cases, but it is very rudimentary at this point. The Unsubdivide and Planar modes are very useful in some scenarios.

There are also cases where Merge by Distance will not maintain the volume after the operation, while Decimate will. I think both behaviours are valuable, and there should be a “preserve volume” checkbox for this on the Merge by Distance node.


AFAIK this kind of thing is usually done with a KD-tree, which you can currently do with Python but not in GN.
It would probably also be possible if GN had loops, but I assume it would be less performant than a KD-tree.

I think there is no good way to compare values within a field with other values in the same field at the moment.
If I am wrong, I'd be very interested in how to do this as well.
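In Blender Python it would look roughly like the sketch below (assuming the point cloud is a mesh object named "Points" and an arbitrary threshold of 0.5, both made up). It connects every point to all neighbours within the threshold as edges of a new mesh, which could then be converted to curves:

```python
import bpy
from mathutils import kdtree

obj = bpy.data.objects["Points"]            # hypothetical point-cloud mesh
verts = [v.co.copy() for v in obj.data.vertices]
threshold = 0.5                             # made-up distance threshold

# build and balance the KD-tree once
kd = kdtree.KDTree(len(verts))
for i, co in enumerate(verts):
    kd.insert(co, i)
kd.balance()

# one edge per pair of points closer than the threshold
edges = set()
for i, co in enumerate(verts):
    for _co, j, _dist in kd.find_range(co, threshold):
        if i != j:
            edges.add((min(i, j), max(i, j)))   # avoid duplicate A-B / B-A pairs

mesh = bpy.data.meshes.new("Connections")
mesh.from_pydata(verts, list(edges), [])
new_obj = bpy.data.objects.new("Connections", mesh)
bpy.context.collection.objects.link(new_obj)
```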


Will sculpt mode be available in Geometry Nodes in any future Blender release?
Geometry Nodes can currently modify an object's vertex positions, but creating a complex shape the way sculpt mode does, like a human, is quite difficult.

When checkpoint nodes arrive, then in fact, yes.

I don't really understand exactly how a KD-tree works, so you're probably right.

Although I was able to replicate the effect in a way.
First I distributed points randomly, then instanced a curve line on the points. The first control point of each curve is at the same position as its point in the cloud. For the second control point, I transferred the original position of each point in the point cloud to the next point in the cloud using Transfer Attribute set to Index (with the index shifted by adding 1, or any other integer, to it). Then I set the position of the second control point of each curve to this shifted position, which more or less gives the effect I was going for.
I can then repeat this and shift the index by a different amount during the transfer to get multiple variations.

Not that elegant and not exactly the same, but it works.
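In plain Python the index-shift trick boils down to something like this (the coordinates and offsets are just example values): each point i gets paired with point (i + offset) % n, and each offset gives another variation of connections.

```python
# Sketch of the index-shift pairing: point i connects to point (i + offset) % n,
# which is what Transfer Attribute (Index mode) with a shifted index does.
def shifted_pairs(positions, offset=1):
    n = len(positions)
    return [(positions[i], positions[(i + offset) % n]) for i in range(n)]

points = [(0, 0), (1, 2), (3, 1), (4, 4)]      # example point positions
print(shifted_pairs(points, offset=1))
print(shifted_pairs(points, offset=2))         # another shift -> another variation
```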


Cool method: creating new geometry from the bounding box mesh and projecting the points back onto the emitter surface. I did something similar for a pixel-based distribution algorithm a few months ago, and another one based on curve areas, etc…

The big problem is that we lose all attributes of the emitter mesh with this kind of data flow :frowning: When we rely on external objects or primitives to create an effect on a mesh, or in this case a new distribution algorithm, all of the user's attributes are lost.

You might say “just use an Attribute Transfer node”, except that it cannot work on an undefined number of attributes, especially from the perspective of a user who won't dig into your node tree (whether from lack of knowledge or because they are restricted by the studio pipeline).

It's one of the biggest limitations I've found with Geometry Nodes since the very beginning :sweat_smile: It was brought up in the Blender chat last week; I hope this limitation will be resolved someday.

https://blender.chat/channel/geometry-nodes/thread/mvc3XeG43rvfXxNNa?jump=MbWJNTH8hNu7ssFDh
https://blender.chat/channel/everything-nodes?msg=J4Dh3aM9q7cSNXL9z


Another post from me about the much-needed incorporation of fast booleans in Geometry Nodes.

Here is an example on a 50×50-vertex grid, with triangles instanced on each point, scaled by an attractor, and removed with a boolean.

The left example is all within geometry nodes with fast booleans. The right example is two separate objects and a boolean modifier, using the Fast option.

The video is in real time and the difference in speed is substantial. PC: Ryzen 7 5800X, RTX 2080, 64 GB RAM.

These kinds of operations are essential, need to run at an adequate speed, and are usually performed with many more instances.

Here are some examples of the kind, with many more instances booleaned out:



Is there an alternative / workaround for this 2.93 node?
https://docs.blender.org/manual/en/2.93/modeling/geometry_nodes/attribute/attribute_sample_texture.html

Huh?

I need to transfer texture colors from one object (the ground) to the vertex colors of another object (placed on that ground).
I can't figure out how to achieve that using this node :-/

That's an image texture, not a texture data-block :slight_smile:

Is there an alternative / workaround for this 2.93 node?

No, it seems that the texture data-block properties panel will die at some point; it just needs to be buried discreetly. It is a bit sad, because a centralized way to manage all procedural textures could be very useful :church:

So, no more texture data-blocks! You will need to reproduce your procedural texture with individual texture nodes.

I can get the colors from the image texture using that node; I just can't figure out how to map them properly onto the other object (in this case from the “ground” to the “treewall” object).


I am sure this is probably very simple (using Transfer Attribute etc.), but I am probably just too tired … so if somebody could give me a hint, I would be very thankful … :slight_smile:

Well, then you do not need the texture data node at all :slight_smile:

You forgot an important step: transfer the UVs of your plane to your treewall object. By the way, Stack Exchange is a better platform for questions like this.
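If you ever want to do the same thing in Python instead of nodes, a rough sketch could look like this; the object names “ground” and “treewall”, the image name “ground_texture” and a triangulated ground mesh are all assumptions:

```python
import bpy
from mathutils import geometry

ground = bpy.data.objects["ground"]              # assumed object names
treewall = bpy.data.objects["treewall"]
image = bpy.data.images["ground_texture"]        # hypothetical image name
uv_layer = ground.data.uv_layers.active.data
w, h = image.size
pixels = list(image.pixels)                      # flat RGBA list, copied once for speed

def sample(u, v):
    # nearest-pixel lookup in the image
    x, y = int(u * w) % w, int(v * h) % h
    i = (y * w + x) * 4
    return pixels[i:i + 4]

vcol = treewall.data.vertex_colors.new(name="FromGround")
for loop in treewall.data.loops:
    # closest point on the ground, in the ground's local space
    world_co = treewall.matrix_world @ treewall.data.vertices[loop.vertex_index].co
    ok, location, normal, poly_index = ground.closest_point_on_mesh(
        ground.matrix_world.inverted() @ world_co)
    poly = ground.data.polygons[poly_index]      # assumed to be a triangle
    tri_co = [ground.data.vertices[v].co for v in poly.vertices]
    tri_uv = [uv_layer[l].uv.to_3d() for l in poly.loop_indices]
    # interpolate the ground UV at the closest point, then sample the image
    uv = geometry.barycentric_transform(location, *tri_co, *tri_uv)
    vcol.data[loop.index].color = sample(uv.x, uv.y)
```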