Geometry Nodes

Instances should automatically be converted to real geometry once you add any node after the Point Instance node, or add a modifier afterwards. As for the bevel: it might be getting clamped because of small geometry; try disabling the Clamp Overlap option.


Oh, my bad! I had some thin walls… Clamp Overlap was the problem.

When a Collection Info node is used as the first geometry input of a Boolean node in Difference mode, with an Object Info node as the “cutter”, do the collection objects lose their UVs?

One of the first things I imagined using Geometry Nodes for was making a node network that generates large bundles of dense wires/cables using a single Bezier curve as input. Does anyone have any tips for how I might go about this? Is this kind of thing possible currently with Geometry Nodes? I tried a few different things, but I’m afraid to say I’m hopelessly lost.

Here’s a quick mockup that looks a bit like what I’m talking about:
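Not an answer from inside Geometry Nodes, but the core idea (sample the guide curve, then run parallel copies offset sideways from it) can be sketched in plain Python. This is a hypothetical 2D illustration of the math only, not Blender API code; `wire_bundle` and its parameters are made-up names:

```python
import math

def bezier_point(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t (points are (x, y) tuples)."""
    u = 1.0 - t
    return tuple(u**3*a + 3*u**2*t*b + 3*u*t**2*c + t**3*d
                 for a, b, c, d in zip(p0, p1, p2, p3))

def bezier_tangent(p0, p1, p2, p3, t):
    """Derivative of the cubic Bezier at t."""
    u = 1.0 - t
    return tuple(3*u**2*(b - a) + 6*u*t*(c - b) + 3*t**2*(d - c)
                 for a, b, c, d in zip(p0, p1, p2, p3))

def wire_bundle(p0, p1, p2, p3, n_wires=5, spacing=0.05, samples=32):
    """Return n_wires polylines running parallel to the guide curve,
    each offset sideways along the curve normal."""
    wires = []
    for k in range(n_wires):
        offset = spacing * (k - (n_wires - 1) / 2)  # centered around the guide
        pts = []
        for i in range(samples + 1):
            t = i / samples
            x, y = bezier_point(p0, p1, p2, p3, t)
            tx, ty = bezier_tangent(p0, p1, p2, p3, t)
            norm = math.hypot(tx, ty) or 1.0
            nx, ny = -ty / norm, tx / norm   # 2D normal: tangent rotated 90 degrees
            pts.append((x + nx * offset, y + ny * offset))
        wires.append(pts)
    return wires
```

In node terms this would roughly correspond to resampling the curve and offsetting instances along it; in 3D you would also need a stable frame along the curve (e.g. parallel transport) to keep the bundle from twisting.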

Ouch, that user interface though. It looks like a DMV form to be filled out :confused: I really hope it can be handled better, otherwise using it will be a real pain.

Yeah, it’s a bit unwieldy. We’ve removed one input already (the “hit index”; it’s just the triangle index on the mesh). There are some ideas to make input lists so you only put in the attributes you actually need. Here’s the documentation revision; it should give you a better idea of the current state: ⚙ D11620 Documentation for Raycast geometry node

Ultimately I hope we can implement something like the socket proposal I made a while ago (shameless plug): Proposal for attribute socket types

Pretty cool. I’m already loving this Raycast node! Is it possible to retrieve other (user-made) attributes from the target geometry as well, or is that a job for the Attribute Transfer node? By the way, aren’t those two (Attribute Transfer and Raycast) a bit overlapping? Attribute Transfer works only by proximity AFAIK (⚙ D11037 Geometry Nodes: Initial Attribute Transfer node.), but adding a raycasting method would basically make it a clone of your Raycast node.

Yes, user attributes can be interpolated as well. “Target Attribute” is the name of the attribute you want to read from the mesh, “Hit Attribute” is the name of the output attribute. Eventually it should be possible to add more than one attribute to interpolate. The code supports that already, but we don’t have a solid UI mechanism for it yet.
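To illustrate what “interpolated” means here: conceptually the raycast finds the hit triangle, computes barycentric coordinates of the hit point, and blends the corner values with those weights. A minimal stand-alone sketch in plain Python (not Blender’s actual implementation; the function names are made up):

```python
def ray_triangle(orig, dirv, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore intersection. Returns (t, u, v) where u, v are
    barycentric coordinates of the hit, or None if the ray misses."""
    sub = lambda a, b: tuple(x - y for x, y in zip(a, b))
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    cross = lambda a, b: (a[1]*b[2] - a[2]*b[1],
                          a[2]*b[0] - a[0]*b[2],
                          a[0]*b[1] - a[1]*b[0])
    e1, e2 = sub(v1, v0), sub(v2, v0)
    pvec = cross(dirv, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:
        return None                      # ray parallel to the triangle plane
    inv = 1.0 / det
    tvec = sub(orig, v0)
    u = dot(tvec, pvec) * inv
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(dirv, qvec) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    return dot(e2, qvec) * inv, u, v

def interpolate_hit(attr0, attr1, attr2, u, v):
    """Blend per-corner attribute values with barycentric weights."""
    return (1.0 - u - v) * attr0 + u * attr1 + v * attr2
```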

You are correct that there is some overlap between the Attribute Transfer and Raycast nodes, in that they both interpolate attributes. But raycasting requires a mesh target (a surface to intersect with), while attribute transfer works with just points (anything where “proximity” can be defined). Personally I prefer to have separate nodes rather than trying to make these large behemoths that do everything and have a dozen mode switches.
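The “just points” case described above boils down to a nearest-neighbour lookup. A brute-force plain-Python sketch (Blender uses an acceleration structure for this; the names here are hypothetical):

```python
def transfer_nearest(src_points, src_values, dst_points):
    """For each destination point, copy the attribute value of the
    closest source point (squared distances, no acceleration structure)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    out = []
    for q in dst_points:
        nearest = min(range(len(src_points)), key=lambda i: dist2(src_points[i], q))
        out.append(src_values[nearest])
    return out
```

Note that nothing here needs a surface: any point set with a distance metric works, which is exactly why proximity transfer is more general than raycasting.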


Indeed, raycasting needs a proper surface, I hadn’t thought of that. Yes, fair enough. I’m all for atomic nodes, I was just thinking out loud.

It does not seem to be possible to use Point Distribute on a mesh that only has vertices and no faces? I cannot use the Point Instance node since it doesn’t have a density attribute… Anyone got any solutions to this?

You could generate a mesh from your point cloud with the convex hull node for instance, or use the existing points for instancing ?


That will not work. I am basically trying to place a single object at the end of a line (made of 3 vertices initially and then subdivided) that has a vertex group on the last vertex named “tip” and align it in the direction of that vertex normal.

I can place the object at the tip by duplicating the “stem” and removing all the vertices except the last one, but then I lose the proper normal direction and it just makes the whole idea more complicated than it needs to be.
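For reference, the direction that gets lost here can always be recomputed from the last two vertices of the line. In plain Python (hypothetical helper, not bpy code):

```python
import math

def tip_frame(polyline):
    """Return (tip_position, unit_direction) of the last segment of a
    polyline, i.e. where to place an instance and which way to aim it."""
    ax, ay, az = polyline[-2]
    bx, by, bz = polyline[-1]
    d = (bx - ax, by - ay, bz - az)
    length = math.sqrt(sum(c * c for c in d)) or 1.0
    return polyline[-1], tuple(c / length for c in d)
```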

I might be misunderstanding your goal but can’t you use a point separate like this?


!!! That works, thank you!! :grinning: :+1:



There’s a huge wall/limitation in Geometry Nodes:

it’s the ability to communicate between multiple geonode modifiers.

Here’s an example.

In this .blend I have two objects with two distinct Geometry Nodes modifiers: one geonode modifier that creates falling Suzannes, another that computes proximity with the Suzanne instances.

The problem is that it seems completely impossible to read the instances’ location/rotation/scale values when working from an external object.
All the Object Info node gives us is the applied mesh of all instances, which is extremely slow to use!

Note that this issue has already been brought up in the past.

It seems that the answer to this problem is “just put everything in one node”, which is a bad solution.

Notes about Geometry Nodes’ bad assumptions (IMO):

In general, most of the limitations of Geometry Nodes appear because a lot of assumptions are made about how users should use the nodes, resulting in a lack of flexibility.

In the limitation above, the design assumed that users will always create one big node tree containing everything. That is far from ideal when working with multiple assets that interact with each other, or with a team; ideally you want to separate into multiple objects where possible (see the workflow .blend above).

There are more bad assumptions; here is another one I keep hitting: the design also assumes that users will always do scattering using the Point Distribute node! But this is a false assumption; we can emit points from mesh vertices, which is a very common usage. These assumptions create limitations and problems down the line, such as attributes not being created (the id attribute is only generated by the Point Distribute node), attributes not being transferred correctly (after a Point Separate node, for example), or simply the fact that vertices cannot be converted to the point cloud domain.

(You will note that this last example is way less impactful than the first one above.)


Hey, so, why does this not result in geometry being output?

The profile in question is a Bezier circle, and it works when turning a Bezier curve I’ve set up manually into a mesh. I feel like I’m missing something here, especially since without the profile, it DOES output geometry. (It won’t let me add more than one image, sorry.)

I think there’s currently a bug/limitation that makes the Curve to Mesh node fail when the curve is pointing perfectly vertical (see: ⚙ D11265 Geometry Nodes: Add Mesh to Curve Node).
That might be happening here. You could try a line pointing to the side and then rotate it to point up afterwards.

I can confirm this as well. I reported a similar bug that has been resolved (mostly) here. This however is something different.

Oh wow, now it works, thanks!

Is there a way to take a point distribution and clone some other geometry onto each point? I.e., I want a line to be generated for every point. Like Point Instance, except that instead of taking an object or collection, it would take geometry as the input.
