Geometry Nodes

Personally, at the moment, I don’t understand where these scaling and rotation parameters come from.

Hopefully this will help in understanding attributes.
I hope the view will adapt based on which object/node is active.

https://developer.blender.org/D8637

One question about how nodes work.
For example, a scene with pebbles on a plane. https://developer.blender.org/F9383263
We took our plane,

  1. converted it to a point cloud,
  2. added a random size and rotation to these points,
  3. converted these points to custom geometry,
  4. then merged this custom geometry.
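The steps above can be sketched in plain Python. This is a minimal, self-contained sketch of the pipeline's logic, not the actual Blender API: the point cloud, attributes, and "pebble" geometry here are all hypothetical stand-ins (the pebble is just a small square).

```python
import random
import math

def scatter_pebbles(num_points, seed=0):
    """Sketch of the node pipeline: plane -> point cloud ->
    random scale/rotation -> instanced geometry -> merged result."""
    rng = random.Random(seed)

    # 1. "Convert" the plane to a point cloud: random (x, y) on a unit plane.
    points = [(rng.random(), rng.random()) for _ in range(num_points)]

    # 2. Add a random size and rotation attribute to each point.
    attrs = [
        {"scale": rng.uniform(0.5, 1.5), "rotation": rng.uniform(0.0, 2 * math.pi)}
        for _ in points
    ]

    # 3. Instance custom geometry on each point: a tiny square "pebble"
    #    transformed by the per-point attributes.
    pebble = [(-0.1, -0.1), (0.1, -0.1), (0.1, 0.1), (-0.1, 0.1)]
    instances = []
    for (px, py), a in zip(points, attrs):
        s, r = a["scale"], a["rotation"]
        cos_r, sin_r = math.cos(r), math.sin(r)
        instances.append([
            (px + s * (x * cos_r - y * sin_r), py + s * (x * sin_r + y * cos_r))
            for x, y in pebble
        ])

    # 4. Merge: one flat vertex list covering all instances.
    return [v for inst in instances for v in inst]

verts = scatter_pebbles(100)  # 100 pebbles, 4 vertices each
```

Note that the original plane's vertices never enter this function; it only supplies the surface the points are scattered over, which is exactly the point made in the reply below.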

Why did we get both the custom geometry and the plane at the end? We converted this plane to custom geometry. The plane should not be visible, since it has been converted to custom geometry.

It makes sense if we combine the Modified Geometry with the Original Geometry at the end.

Incorrect. The plane isn’t affected directly by the node tree. These nodes operate on a point cloud object, and the plane is only referenced as a guide for point placement.


Ah, I understood what was happening.
The developers have added a new primitive object called Point Cloud. And Geometry Nodes must be added to this object.

And one more thing.
If I delete the Node Input or Node Output node, is there currently no way to add them back, since they do not exist in the Add menu?

Why are the Geometry and Mesh menus in the Add menu separated from each other? They are very similar. Perhaps it would be better to combine them, or not?

This creates an attribute or uses an existing one; how many are there? I realized that I can type Rotation and Scale there. What about Move or something else, and why should the name be entered manually instead of being chosen from a list?
Maybe it can be improved somehow?

I’m not sure if this is intended for users, rather for developers to test the output of Geometry Nodes


Yes. The workaround for now is to copy-paste those nodes from another node group.

According to the manual, mesh nodes operate on meshes only… geometry nodes operate on all kinds of geometry.

You can create any attribute for any kind of use, and name it what you want. It could be an attribute on certain vertices in an electrical cloud so that you can create lightning rods in-between them, or it could be stains of blood diffusing in powder snow.
Of course there are also the built-in (legacy) ones, such as vertex groups or vertex colors, but I’m sure they’ll be available to choose from a dropdown later on. And I assume some names are reserved, such as “scale” or “velocity” for particles, or “normal” for mesh vertices.
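Conceptually, an attribute is just a named array with one value per geometry element. A minimal Python sketch of that idea (the attribute names and the `get_attribute` helper are hypothetical illustrations, not Blender internals):

```python
# An "attribute layer" is a named per-element array. Any custom name
# works; some names (e.g. "scale", "rotation") may be reserved by the
# system for built-in meanings.
num_points = 4
attributes = {
    "position": [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)],
    "scale":    [1.0, 0.8, 1.2, 0.5],   # per-point size
    "blood":    [0.0, 0.0, 0.3, 0.9],   # arbitrary user-defined attribute
}

def get_attribute(attrs, name, default=0.0, count=num_points):
    """Return an existing attribute, or create one filled with a default."""
    return attrs.setdefault(name, [default] * count)

rotation = get_attribute(attributes, "rotation")  # created on demand
```

Typing a name into the attribute field behaves like `setdefault` here: an existing layer is reused, a missing one is created.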

I understand it. This is not critical, they are just very similar.

This makes sense if there is some attribute that does not belong to the Geometry Nodes, and you can pass it in this way. But any attribute that can be placed in the Geometry Nodes should exist there as a node.
Or, as was suggested here, use it as a teleport: create an attribute, give it a special name, and, without connecting noodles, enter this attribute in another node.
But, in my opinion, any attribute should have some kind of visual form, or a visual connection with something, to make it easier to understand.

I believe this would be the same as setting and getting a variable.

Your “in-portal” assigns a value to a variable.
Your “out-portal” reads this value and feeds it into another node.

You could have several “out-portals” for a single “in-portal” as well.

And if you could name them they would be easier to organize.

And perhaps there could even be portals to other objects.

It would be good to have a list or something in the n-panel to keep track of all the variables/portals.


For me one of the most important features would be a “Display” dropdown at the output.
I should be able to set it to “Geometry”, “Bounding box”, “Bounding box wire” and “Dots”.
Imagine when scattering thousands of complicated objects, such as a tree.

What do you think, is that at all possible?

It would be nice to have loops as soon as possible, to be able to create randomized arrays in nodes.
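For what "loops for randomized arrays" could mean in practice, here is a small Python sketch (the function and its parameters are hypothetical, purely to illustrate the idea): a loop places copies at a regular spacing and offsets each one randomly.

```python
import random

def randomized_array(count, spacing, jitter, seed=0):
    """Loop-based 'array' sketch: place `count` copies at a regular
    spacing along one axis, each nudged by a random jitter."""
    rng = random.Random(seed)
    offsets = []
    for i in range(count):
        offsets.append(spacing * i + rng.uniform(-jitter, jitter))
    return offsets

positions = randomized_array(count=5, spacing=1.0, jitter=0.1)
```

With `jitter=0.0` this degenerates to a plain, regular array; the loop is what lets each copy get its own random value.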

Thanks for your suggestions, they are very good. Until the weekend I will not be able to dedicate more time, but I am collecting all the impressions.

Although the proposal has not been very successful, it has been widely commented on, and I am reflecting on it. For now I plan to rework it into groups in which the inputs and outputs can be separated, and to connect them using a cable manager. In this way, apart from organizing, playing with the visibility of the connections, and showing the input and output attributes, they could be modified within the group, like current groups.
I think that by giving more options to groups it is possible to fit everything in.

The “portal” thing is the same as creating a variable in a list in the node tree, and having “set variable” and “get variable” nodes; no need to go too sci-fi to find this concept being successfully applied in other packages :slight_smile:


Do some of you have similar issues with “Poisson Disk” point distribution?

Blender version I used

version: 2.92.0 Alpha, branch: geometry-nodes-distribute-points, commit date: 2020-12-04 17:25, hash: 0f0ea8b81790, type: Release
build date: 2020-12-04, 18:00:38

Sometimes, after I set Minimum to a tiny value (0.05) and increase Maximum to a high value, the point distribution tiles. And sometimes the Poisson Disk distribution shows a beautiful random distribution without tiling, with the same values for Minimum and Maximum.

@LeonardSiebeneicher it has to do with the minimum distance. Changing it just scales the pattern. Tiling is not the issue; what could solve your case is jittering. Though for now you can do the jittering yourself as a separate node.
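To illustrate the "do the jittering yourself" suggestion, here is a minimal Python sketch (the helper name and parameters are hypothetical): each point of a regular, visibly tiled distribution is nudged by a bounded random offset, which breaks up the repetition without changing the overall density.

```python
import random

def jitter_points(points, amount, seed=0):
    """Post-process jitter: nudge each (x, y) point by a random offset
    in [-amount, amount] so a regular pattern loses visible tiling."""
    rng = random.Random(seed)
    return [
        (x + rng.uniform(-amount, amount), y + rng.uniform(-amount, amount))
        for x, y in points
    ]

# A perfectly regular 10x10 grid stands in for a tiled distribution.
grid = [(x * 0.1, y * 0.1) for x in range(10) for y in range(10)]
jittered = jitter_points(grid, amount=0.03)
```

Keeping `amount` below half the grid spacing preserves the minimum-distance character of the pattern while hiding the tiles.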


I think it’s not only possible, but absolutely necessary for any serious scattering tool.
It would be great if Blender could have an automatic LOD system, where you set a performance target in the viewport settings and the complexity of particles is adjusted accordingly. So basically instances that are closest to camera are shown as fully detailed geometry and at some distance they are switched to some simplified representation, like bounding box or point cloud, to meet the performance target.


Ah… OK.
The tiling distribution puzzled me.

I can hold off on experiments that need jittering.

Please consider using OSL for this task. I know it may sound strange because it is for shaders, not geometry, but the shading part can simply be ignored in this case, or extended. And some features may be useful for generating geometry too: raycasts, queries, etc.

It already has many rendering/surface-related concepts in place, documented, with examples, etc.

It is really fully featured, C-like, compilable, and well optimized (compared to Python), and it is already integrated with nodes… and it is cool :slight_smile: I would like to see it in compositor nodes too, some day, for fast custom processing.

Uh, not really. It’s not even compatible with GPUs.
I think the top priority would be to fix that first before using OSL elsewhere.

I think he is referring to generating scripts that affect geometry, not shading :slight_smile:

So it has no relation to render-time OSL.

not even compatible with GPU’s

Same as Python! Geometry scripts will likely be CPU-only anyway. And this is not really a problem, since they will run BEFORE render time, to prepare meshes for the real render. So CPU vs. GPU is not a problem here.

Blender currently has only two scripting options: Python, and OSL for Cycles. It would be crazy (imho) to add any third option just for geometry here.

Python is OK, but it is generally slow and much more memory-consuming, especially if you need to generate something massive. OSL, on the other hand, is compilable, so it is FAST, and already has tons of optimizations. Since it is a bottleneck in rendering, it is really robust at what it can do.

Compared to Python, OSL is much more suited for the task, imho. It already has displacement stuff (used for rendering, but why not reuse the concepts in a similar context?), for example.