Geometry Nodes

It’s not a distribution workflow; I created a tool for curve-based distribution, the one thing we can’t do with Scatter XD (I have a course about how to create and modify it). Basically it’s a loop where we process a given number of points sampled along a curve, and for each sampled point we handle the offset position, the rotation based on the tangent extracted at that point, the scale, the randomisation of those values, and the object or collection to instance (or collections of collections or objects). We can feed it several curves or just one. It has something more I don’t remember right now.

It’s not too complex. The thing was that there was nothing able to efficiently distribute objects along a curve in Blender, which is why I created that tool. But for it to work with GN, I need support for curve sampling and tangent extraction, and we don’t have curve support yet. I tried using curve geometry, but it’s not good enough and delivers some weird results (it’s not supported yet, I think).
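
For anyone curious, here is a very rough Python sketch of that loop (my own minimal approximation of the idea, not the actual tool; the function name and parameters are made up for illustration, and it just reads control points instead of properly sampling the evaluated curve):

```python
import bpy
import random
from mathutils import Vector

def scatter_along_curve(curve_obj, instance_collection, count=50, seed=0,
                        offset=Vector((0.0, 0.0, 0.0)), scale_jitter=0.2):
    """Place `count` collection instances along a curve object."""
    rng = random.Random(seed)

    # Use the control points of the first spline as the sampled points
    # (the real tool samples the curve itself at even spacing).
    spline = curve_obj.data.splines[0]
    if len(spline.bezier_points) > 0:
        points = [p.co.copy() for p in spline.bezier_points]
    else:
        points = [p.co.xyz for p in spline.points]  # drop the W component

    for i in range(count):
        # Parameter along the point list; assumes at least two points.
        f = (i / max(count - 1, 1)) * (len(points) - 1)
        idx = min(int(f), len(points) - 2)
        p0, p1 = points[idx], points[idx + 1]
        position = p0.lerp(p1, f - idx) + offset
        tangent = (p1 - p0).normalized()

        # One regular collection instance per sample: offset position,
        # rotation derived from the tangent, randomized scale.
        inst = bpy.data.objects.new(f"scatter_{i}", None)
        inst.instance_type = 'COLLECTION'
        inst.instance_collection = instance_collection
        inst.location = curve_obj.matrix_world @ position
        inst.rotation_euler = tangent.to_track_quat('X', 'Z').to_euler()
        s = 1.0 + rng.uniform(-scale_jitter, scale_jitter)
        inst.scale = (s, s, s)
        bpy.context.scene.collection.objects.link(inst)
```

You’d call it with something like `scatter_along_curve(bpy.data.objects["Curve"], bpy.data.collections["Rocks"])`, where both names are placeholders.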

One of the interesting things about that tool is that in the end you have a bunch of standard collection instances that you can manually control. It’s not one big object or anything like that, so you can manually fix or adjust some positions or rotations to art-direct individual objects.


What would be truly awesome would be the ability to load a custom node as some kind of cdylib shared library.

Curve support for Geo Nodes
https://developer.blender.org/D11091

Performance boosts
https://developer.blender.org/D11139
https://developer.blender.org/D11125

Should be in builds soon, maybe you can take it for a spin.

Yep, I know, I was a bit of a pain to @HooglyBoogly regarding this hahaha

I think I’ll be able to convert the tool sooner than later :slight_smile:

Thanks for the heads up anyways! :slight_smile:


Super … good luck.

https://developer.blender.org/D10921
also needs some attention from him!! :sweat_smile:

No, not at all, that’s something totally different. Also useful, but different. What I want is something like this. This is how simple it should be to displace a mesh along its normals using a texture with Geometry Nodes:


(Ignore the mistake where two group input nodes have different sets of inputs. It’s a mockup composed of multiple screenshots.)

Do you see that giant snake I posted a few posts above, with all the confusing names for custom attributes I had to come up with just so my intermediate computations don’t mix? Well, this is what could replace it. Something this trivial. In fact it would look even simpler, since I just copy-pasted the shader editor’s Attribute node; the GN attribute node could be simpler and smarter, with just a text field for the attribute name and a single pale blue output socket.

I mean, look at the Attribute Sample Texture node, for example:
[screenshot: the Attribute Sample Texture node]
It inputs geometry and it outputs geometry? Why? Just because of how limited the attribute workflow is. All you need to sample the texture are the mapping coordinates. Look at how simplified it is in my example. Rather than even these most trivial nodes taking a geometry struct as input and outputting this cryptic “everything” geometry struct, why don’t they just take the mapping coords as a vector input and output the RGB data as a vector output?

We could have just a simple Attribute Get node, like we have in the shader editor:
[screenshot: the shader editor’s Attribute node]
It could then easily feed any custom attribute as the mapping for the Attribute Sample Texture node. The texture sampler itself doesn’t need to input geometry just to access the attribute. The whole current workflow is so overcomplicated and clunky.
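
Just to illustrate why this shouldn’t be far-fetched: an attribute is ultimately just a flat array of values per element, which is exactly the kind of data a plain vector socket could carry. Here is a rough Python sketch of that point (my own illustration, nothing to do with how GN is actually implemented internally):

```python
import bpy
import numpy as np

mesh = bpy.context.object.data  # assumes the active object is a mesh

# The "position" attribute is simply one 3D vector per vertex.
positions = np.empty(len(mesh.vertices) * 3, dtype=np.float32)
mesh.vertices.foreach_get("co", positions)
positions = positions.reshape(-1, 3)

# Anything a node like Attribute Vector Math computes is just ordinary
# per-element math on data like this; no geometry struct needed to carry it.
moved_up = positions + np.array([0.0, 0.0, 1.0], dtype=np.float32)
```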


I agree this is much clearer, and obviously needs much less setting up. It seems superior in all aspects. However, haven’t they said multiple times that something like this (splitting off attributes from geometry) is impossible?

What would the same tree look like with the “attribute processor”? Would it solve the linearity problem well?

Why would it be impossible? Sure, in some cases you’d want to have multiple different geometries, each with its own set of attributes. But anyone who took just a few minutes to think about the problem would probably quickly come to some reasonable solution. Here’s mine:
[mockup: the proposed “Attribute Get” node]
A very simple “Attribute Get” node that lists and allows you to access all the attributes from the input geometry structure.

So if you have multiple geometries and want to use, for example, the position of one specific geometry out of those available to generate the mapping for your procedural texture, here’s how you’d specify it:

To me, the attribute processor looks like an overcomplicated approach to this. But I am not the one who requested it.

I personally do not understand why it should exist. If we could simply work directly with nodes that output fundamental data types (floats, vectors) into attribute inputs (the pale blue input sockets next to the attribute text fields), we could wrap, say, a combination of vector math nodes into a node group that inputs and outputs any arbitrary data types, such as geometry, vectors, and floats. Next to that, the attribute processor would feel like a very limited, constrained version of a node group.

The attribute processor just retains all the same limitations. It only inputs geometry and it only outputs geometry. And whatever it actually calculates cannot be output through a node socket, only as a text-field attribute, which you then have to keep using in the same clunky workflow.

Once again, I do not understand why geometry nodes are a node editor when their design clearly goes against how nodes should be used and instead treats the flow as a linear, one-dimensional stack.


Continuing my streak of crappy mockups, here’s how simple it ideally should be to create a basic landscape generator with Geometry Nodes:


Obviously, it would be quite limited, mainly because the Attribute Sample Texture node currently only accepts a texture datablock, which is not exposed as a data type, so the texture’s parameters cannot be driven by vectors, floats, and integers and still have to be modified outside of the GN editor.

But still, the basic gist is there. All you really want to do is add the sampled texture value to the Z position of the existing vertices, and also control the landscape subdivision, the height of the displacement, and the size of the landscape features (the scale of the texture mapping). It’s already possible with the current GN implementation, but the amount of mental labor and time it takes to accomplish is just excessive.
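
For comparison, this is roughly the same idea written out in Python (a hypothetical sketch: it assumes the active object is a subdivided plane and that a procedural texture datablock named "Noise" exists; `height` and `feature_size` are made-up parameters):

```python
import bpy

obj = bpy.context.object                  # assumed: a subdivided plane
noise_tex = bpy.data.textures["Noise"]    # assumed: any procedural texture
height = 2.0                              # displacement strength
feature_size = 4.0                        # size of the landscape features

for v in obj.data.vertices:
    # Sample the texture at the (scaled) vertex position and add the
    # intensity to the Z coordinate.
    sample = noise_tex.evaluate(v.co / feature_size)
    v.co.z += sample[3] * height          # index 3 is the texture intensity
```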


Here is the node setup with geometry nodes:


It is nearly the same setup.


Yes, but it’s more annoying to set up and takes more mental work, having to deal with parameters typed into text fields. It’s harder to edit the tree and make changes, but most importantly, the graph does not represent the data flow at all. So when you work on it, you constantly need to keep the architecture of what you are doing in your head, instead of just seeing it in the tree.

Once again, I repeat: what’s the point of having geometry nodes when the only feasible workflow is a one-dimensional linear stack?

The geometry nodes do not have an issue of whether something is possible or not. They have a workflow issue.


That “attribute get” idea sounds good to me! I think it’s worth brainstorming about a new solution. Your mockups look very familiar and are much simpler to read, which is very important. If this could be made to work, it’d be an immense improvement indeed.

(It’s just that I personally have none of the technical background required to judge whether all that is possible or not)

Another (more flexible) way of managing attributes would be through a specialized language; then you’d just need a short snippet to do what takes ten nodes right now.
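
Purely as a hypothetical illustration (no such snippet language exists in Blender today), the kind of displacement that currently takes a chain of attribute nodes would boil down to roughly one expression per element. Here is the idea in plain Python:

```python
import bpy
from mathutils import noise

obj = bpy.context.object  # assumes the active object is a mesh

# The whole displacement chain as a single expression per vertex:
for v in obj.data.vertices:
    v.co += v.normal * noise.noise(v.co * 0.25) * 2.0
```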


But your setup has the exact same linearity… so I did not get your point here. I agree that your setup has a bit less typing and looks a bit more user-friendly, for example when you combine the heights into a list which is not tied to the geometry…

… but then you have to deal with lists of different lengths (due to the differences between the geometries), and someone has to handle that.
So when you integrate your list (of height vectors) back into the geometry, you have to somehow map the values back onto it. And in your setup the list could also come from another geometry, where it would be unclear how to do that.

So if you mean to decouple values from geometry, I think your proposal would add flexibility, but it would not be simpler to handle.


It’s not only just as linear, but IMO more confusing, and it will get even more chaotic with larger setups.

Also, you now need to keep track of where in the tree you “got” the attribute.


By index, I assume?

It’s not just that. It’s that the larger the node trees grow, the easier my approach will be to understand and edit, and the worse off yours will be :slight_smile:

I will respond to your other question as soon as I finish my work :slight_smile:


Yes, if your values came from the same geometry, no problem… but as @Lloyd_Almeida stated, you then have to keep track of where your values came from and whether they fit.


Wouldn’t the proposed “get attribute” node solve that?

…but then again everything is tied to the same geometry, and there’s no reason to decouple it. And the linearity and extra-typing issues can then be solved by something like the proposed attribute processor.


Ok, so I think I realize what @daydreamer_mosh means. For example, let’s say you have two different geometries and put an Attribute Get node after both, reading their positions. If you then, for whatever reason, wanted to add both position attributes together, what would the resulting attribute be?

The problem with the current GN design is that it doesn’t tell you what’s an array and what is not. So, for example, from my mockup it’s not obvious that at some point the vector node sockets are actually arrays. Unreal Engine, for example, has nice graphics to distinguish that.

Anyway, for mismatched attribute array lengths, where for example Geometry A has just 50 points and Geometry B has 150, I’d just expect some user-friendly fallback: the result would allocate an array of the same size as the largest input array, then add components for each index up to the length of the smaller array (in this case the 50 attribute vectors from Geometry A added to the first 50 attribute vectors from Geometry B), and then add a zero vector to the remaining 100.

Yes, it would probably not be what the user wanted to do, but combining attributes from different topologies is not really an expected workflow in the first place, so this kind of generous fallback would not be an issue imho.
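
A quick sketch of what that fallback would mean in practice (my own illustration in plain NumPy, not anything that exists in GN):

```python
import numpy as np

def add_attributes_padded(a, b):
    """Add two per-point vector attributes, padding the shorter with zeros."""
    out = np.zeros((max(len(a), len(b)), 3), dtype=np.float32)
    out[:len(a)] += a
    out[:len(b)] += b
    return out

pos_a = np.random.rand(50, 3).astype(np.float32)    # Geometry A: 50 points
pos_b = np.random.rand(150, 3).astype(np.float32)   # Geometry B: 150 points
combined = add_attributes_padded(pos_a, pos_b)      # shape (150, 3)
```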