Geometry Nodes

Hi, welcome to the forum.

  1. Rotation is a topic for the next sprint (nicknamed: rotation feast). A glimpse at the plans:

  2. Geometry normals is in the plan, as you can see in this task. As with any other task in the backlog, we welcome other developers to help with them as well.

  3. There are other ways of doing this I believe. Like sampling a texture and setting attributes from it.

Congratulations on the richly illustrated post.

At the end of the year development will slow down a bit, since most of the team is on holiday. That said, a lot of the work is already in master.

I would love to see the results of what people are actively doing with the system. At the end of the day this is the outcome we are looking for. So go delete your default cube and feel encouraged to share your creations here (and eventual feedback based on those solid uses of the existing tools).

Rotation is a topic for the next sprint (nicknamed: rotation feast). A glimpse at the plans:

Geometry normals is in the plan, as you can see in this task. As with any other task in the backlog, we welcome other developers to help with them as well.

Thanks for your response, Dalai.
Great to hear that rotation will have its own sprint.

There are other ways of doing this I believe. Like sampling a texture and setting attributes from it.

A sampling method looks indeed nice for creating procedural patterns from textures.
I believe I took this screenshot in the chat, from @jacqueslucke.

But not so much for hand-painted areas. Is sampling vertex groups/colors also in the plan?
Hope it is! It’s much easier and non-destructive to paint attributes with weights than with a UV-unwrapped bitmap from a texture data setup :slight_smile:
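The texture-sampling idea can be pictured as plain data flow. Here is a minimal sketch in ordinary Python, not Blender’s API; the image grid, the `density` attribute name, and the nearest-neighbor lookup are all illustrative assumptions:

```python
# Hypothetical sketch: sample a grayscale "texture" at each point's UV
# coordinates and store the result as a named attribute on the point.
# Nothing here is Blender API; it only illustrates the data flow.

def sample_texture(image, u, v):
    """Nearest-neighbor lookup of a grayscale image, UVs in [0, 1]."""
    h, w = len(image), len(image[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return image[y][x]

def fill_attribute_from_texture(points, image, name="density"):
    """Write the sampled value into each point's attribute dict."""
    for p in points:
        u, v = p["uv"]
        p[name] = sample_texture(image, u, v)

points = [{"uv": (0.1, 0.1)}, {"uv": (0.9, 0.9)}]
image = [[0.0, 1.0],
         [0.2, 0.8]]
fill_attribute_from_texture(points, image)  # points now carry "density"
```

A hand-painted vertex group would replace `sample_texture` with a per-vertex lookup, which is why painting weights feels more direct than going through a UV-unwrapped bitmap.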

Aside from this cool new feature, I must insist: it’s a bit weird that the points don’t inherit their emitter’s attributes. Not sure if I’m the only one thinking that? Anyone else?

I wish the team a merry Christmas.
Enjoy!

Yes, points should probably inherit attributes from their source; that just requires some development of attribute interpolation. So it’s more of a missing feature.
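The missing interpolation can be sketched outside Blender: a point scattered on a triangle inherits a painted per-vertex value through barycentric weighting. This is a toy illustration of the idea, not the actual implementation:

```python
# Toy sketch of attribute interpolation: a scattered point inside a
# triangle inherits a per-vertex attribute (e.g. a painted weight) by
# barycentric weighting of the three corner values.

def barycentric_weights(p, a, b, c):
    """Barycentric coordinates of 2D point p in triangle (a, b, c)."""
    det = (b[1] - c[1]) * (a[0] - c[0]) + (c[0] - b[0]) * (a[1] - c[1])
    w0 = ((b[1] - c[1]) * (p[0] - c[0]) + (c[0] - b[0]) * (p[1] - c[1])) / det
    w1 = ((c[1] - a[1]) * (p[0] - c[0]) + (a[0] - c[0]) * (p[1] - c[1])) / det
    return w0, w1, 1.0 - w0 - w1

def interpolate_attribute(p, tri, values):
    """Blend the three corner values using the point's barycentric weights."""
    w = barycentric_weights(p, *tri)
    return sum(wi * vi for wi, vi in zip(w, values))

tri = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
weights = [0.0, 1.0, 0.0]          # painted vertex-group weight per corner
center = (1 / 3, 1 / 3)
interpolate_attribute(center, tri, weights)  # ~0.333 at the centroid
```

At the centroid all three corners contribute equally; at a corner the point simply inherits that vertex’s value.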

Point Instance with a collection now instances all collection objects at once. It would be nice if there was a checkbox for instancing a random object from the collection, and/or a random nested collection.

That was just implemented: https://developer.blender.org/D9884

Ah, I finally understood what that design meant. Even if inheriting works, it wouldn’t work as in the image, though, since “scale” is a reserved name. It would need an extra node to copy/rename the painted attribute to populate the “scale” one.

I think that, like this, it would look more understandable and convenient.

These parts don’t look well thought out: other parameters are connected by noodles and can be linked visually, but these parts aren’t visually connected to anything and must be entered manually.

This will likely make more sense once the spreadsheet viewer becomes a thing. Attributes are just buckets of data that can be attached to the geometry set being passed along by the node tree. A getter node from an object won’t help, as their existence is a little more complicated than just vertex groups predefined on an object.

Right now the attributes aren’t displayed anywhere, so it will likely be a little confusing. I think a good idea might be to let you either pick existing attributes from the incoming geometry set with a drop-down or type in the name of a new one. (Although if this approach is taken, some thought might be needed to make it clear to the user that they can create new attributes by typing, or else an explicit attribute-create node should be added.)

(Edit: Now that I think of it, implicit creation of attributes might create a whole lot more problems than it solves. It will be especially difficult to debug if the user assumes the incoming data already exists when they are actually generating it (as with a misspelled attribute, domain differences, or something happening upstream). I think that in this case the node should raise an error, and I would heavily encourage the developers to move to an explicit attribute-creation node.)

Not sure if it is common knowledge that you can put driver expressions into Geometry Nodes parameter fields the same way you can for shaders. This lets you put a #frame expression into a Value node and use it to drive animation.

This is a simple little mograph test I have been playing around with; it uses the frame number with a sine node to drive various parameters. It plays back quite well in Eevee on my mediocre laptop, and gives a nice glimpse of how cool these tools will be in the future.

Really need to get access to individual point ID numbers and attributes (PSR) to drive expressions for the cool mograph stuff, so I’m not really sure what the plan for this is. For instance, will you be able to put sin(#frame + #id) into an Attribute Fill field for scale, so that each ID gets a different sine animation?
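The effect such a per-point expression would give is easy to mock up in plain Python. The #frame/#id syntax itself is exactly the open question here; this only shows the intended math, and the function name and defaults are made up:

```python
import math

def point_scale(frame, point_id, base=1.0, amplitude=0.5):
    """Scale for one point: a sine offset by the point id, so every id
    animates out of phase with its neighbours (the sin(#frame + #id) idea)."""
    return base + amplitude * math.sin(frame + point_id)

frame = 10
scales = [point_scale(frame, i) for i in range(5)]  # one scale per point id
```

Because the id shifts the phase, each point rises and falls at a different moment of the same animation.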

Anyway, I uploaded a little youtube vid with a download link to the file if anyone is interested. https://www.youtube.com/watch?v=QFCddvIlRCY

Well, what can I say, looking at all this.
Presently, this is probably the most difficult node system to understand in the entire industry.
If this is not improved in the future, then this will be like using the Shader Editor for #Nodevember, when only a dozen people on the entire planet have a complete understanding of the tool. And I see that it is only getting more complicated.
Meanwhile, in other programs the node system is very easy to understand, and at the same time they are much more powerful tools.

I don’t think this is the most difficult node system to understand by a long shot. The industry-standard procedural modeling and VFX package has a very steep learning curve (including learning a programming language) but is incredibly powerful. Even addons like Sverchok can be super difficult to wrap your head around without a solid understanding of mathematics.

Geometry Nodes is currently missing some central functionality, so there are certain things you probably cannot achieve right now. I think a lot of Blender users are complaining about attributes because they aren’t used to them as a concept, and a couple of debugging and quality-of-life features are still missing.

I get it about Blender not being as powerful as Houdini etc., but it’s by far the friendliest pro software to learn in the whole industry.

I don’t agree about the mockup design; the old classic one was more readable and functional. Coolness of design is a different topic from being useful. That thin colored strip doesn’t give an idea about the type of node from a distance. The old one was more readable, when the green bar was under the text of the node name, so we could guess the node from a distance when zoomed out. This design is similar to the nodes of other software, which I didn’t find as useful as Blender’s nodes. I think the current node design is great and should stay as it is. That’s one of the reasons I like Blender’s nodes.
But I agree, twice-as-tall sockets are good for solving double inputs, and it’s a good idea to use the sidebar to decide on sequence when sequence is important.

I disagree. I think it’s not good to have the text size stay the same, because it sits outside the node. What if someone has complex nodes placed side by side and then zooms out? The text will be messed up. In the current Blender nodes, the reason we have the thick strip of color is to recognize nodes without reading their names, by color alone; when we zoom out on a complex, closely placed setup, we can roughly tell the node type by color because it is so visible, and nothing sticks out of the node, so it looks tidier from afar. Also, the reason we have a colored top bar on nodes is so that the color fully identifies the node when it is collapsed/unexposed in compact form; if we remove the color from the top bar or make it thin, it no longer serves the purpose of having collapsed nodes.
Also, inputs should be on the left, not the bottom. That’s how input logic works, so the old circular unexposed Principled BSDF node design is still better, I think. New horizontal inputs will cause issues in complex node setups and make it harder to follow the sequence. Node logic means making things from left to right; layers work from top to bottom.
I think the current Blender node design was well thought out and doesn’t need to look cool.

Some more experimenting with geonodes for mograph. A cool little discovery was that the Attribute Color Ramp can remap a position to a scale attribute in a non-linear manner. I actually map from position to scale, and then do a further scale-to-scale pass, to get a unique look.

Not totally sure how this is working under the hood. I am guessing that the length of the position vector to the object center (0, 0, 0) is what is being mapped to the scale. Would be nice to work out a way of animating this center point to move the falloff.
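That guess can be sketched numerically: take each position’s distance to a falloff center and remap it through a piecewise-linear ramp, like a three-stop color ramp. The `falloff_center` parameter stands in for the animatable center the post wishes for; the stop values and radius are arbitrary assumptions, not Blender internals:

```python
import math

def ramp(t, stops=((0.0, 0.0), (0.5, 0.9), (1.0, 1.0))):
    """Piecewise-linear remap of t in [0, 1], like a color ramp with stops."""
    t = min(max(t, stops[0][0]), stops[-1][0])
    for (x0, y0), (x1, y1) in zip(stops, stops[1:]):
        if t <= x1:
            return y0 + (y1 - y0) * (t - x0) / (x1 - x0)
    return stops[-1][1]

def position_to_scale(position, falloff_center=(0.0, 0.0, 0.0), radius=2.0):
    """Map distance-to-center through the ramp to get a non-linear scale."""
    dist = math.dist(position, falloff_center)
    return ramp(min(dist / radius, 1.0))
```

Animating `falloff_center` would then move the falloff without touching the ramp itself.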

There is a file download link in the video description:

Presently, this is probably the most difficult node system to understand in the entire industry.
If this is not improved in the future, then this will be like using the Shader Editor for #Nodevember.

What would your suggestions be to improve Geometry Nodes accessibility?

This system, of course, is not the most complicated; I said that rashly.
I, like the others here, have a lot of ideas. But the problem is that we do not know what idea lies behind this project, what concept and worldview the development team is using. Therefore, everything I can offer may look completely inappropriate for the developers’ vision.
Apparently the developers were inspired by H; the attribute spreadsheet is also similar to that.
I take SD as a basis, which has very clear nodes. Those nodes are mini-presets: generators, tile generators, nodes that let you select or create something. Mini-factories that do specific work. This is very convenient for ordinary users.
Of course, SD also has minimal simple nodes and an editor where you can use mathematical equations to create unique nodes, aimed at creating things that the standard nodes do not provide. That suits very advanced users.

The problem I see with Blender is this:

  1. It will be the same as in the Shader Editor: there will only be simple nodes that need to be assembled into complex groups each time. There will be no mini-preset nodes. (Of course, you can import the created groups into a new file, but they will not be there by default.)
  2. I believe that everything, or almost everything, should be doable inside Geometry Nodes, without resorting to separate editors or abstract elements like hand-typed attributes.
    The point is this: nodes and noodles are visual feedback between the program and the user. The system uses visual techniques to show how elements connect and interact with each other. That is what it is designed for: visual feedback. Using non-visual elements in a visual system makes the visual system non-visual.
  3. Most likely, there will be nodes that are a hand-written script usable as a node.
  4. This is more of a nitpick. The Point Cloud object perhaps should not have been created inside the 3D space, but inside Geometry Nodes. Because if you try to explain the Point Cloud and depict it schematically, it turns out to be an object within which another object is selected, on whose surface points are created, and a third object is selected to replace those points. Three objects inside one. But this is not critical, just my nitpicking. Perhaps this is the simplest solution, or perhaps not; I don’t know.
  5. I don’t know whether Geometry Nodes will have full analogs of all modifiers, and whether it will be possible to create non-linear stacks of modifiers inside GN, or whether we will have to use modifiers and GN in parallel.

Of course I want to thank the developers for their work.

I don’t understand that comparison; both programs that you mention have the same atomic nodes plus hundreds of presets. And the user needs a way to read the values; if somebody knows a better way than a spreadsheet to read thousands or millions of different data points, it would be good to know.

I see the current Geometry Nodes as an alpha/beta version with few features; we cannot judge the whole system by its current state.

Amazing work!

What I’m really waiting for is the possibility of having objects or empties drive/become falloffs for scale, position, etc., like in this mockup.

I tried using a Vertex Weight Proximity modifier and using the resulting vertex group as an attribute to drive the scale, but either it’s still not possible, it’s not the right method, or I’m just missing something. Regardless, amazing work done so far by the devs.
