Geometry Nodes

I understand. The way they are separated in the table right now does not seem totally clear: “naming conventions” just sounds like “good practice for users”; it’s not clearly communicated that they are for internal use. Maybe your explanation (the one I’m replying to) could be added as a paragraph before any of the tables? I’m more technical than most and yet I had a little trouble figuring out what to make of it, so I would encourage a little more verbosity.

Additionally, isn’t that paragraph incorrect?

  • Used in the Point Distribute to control the rotation of instanced objects or collections.

That would be the Point Instance node, no?

In any case, I welcome the effort on documentation. It’s a big leap Blender is taking with geometry nodes and it’s important to have good docs.

Yes. I’ll think about how the page could be reworked. Some of this stuff will be changing soon anyway. Apparently I never proofread this… thanks.

We’re the proofreaders

4 Likes

You can already convert instances with CTRL + A, Make Instances Real.

How does it handle thousands of instances?

Hello everyone,

I’ve got a few useful node groups to share here (there’s a quick sketch of the equivalent math after the file link below):

  • Global to Local vector mapping
  • XYZ Euler to Quaternion
  • Quaternion to XYZ Euler
  • Quaternion Multiplication
  • Rotate Location Vector on XYZ

The file contains two demos:

In the first demo, the camera colored-axis object’s transforms are calculated in the emitter’s local space via the node group.

In the second demo, the colored cube on the right has passed through a quaternion conversion funnel; as you can see, the conversion worked properly, as its rotation is exactly the same as the reference cube’s.

File:
https://pasteall.org/blend/52d6d54366494f1bac570dad2e73bc21
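
For anyone who wants to sanity-check the node groups without opening the file, here is a rough equivalent of the conversions they implement, written with Blender’s bundled mathutils module. This is only a sketch of the underlying math, not the node setups themselves, and the example values are made up:

```python
from math import radians
from mathutils import Euler, Quaternion, Vector

# Arbitrary reference rotation for the demo.
eul = Euler((radians(30), radians(45), radians(60)), 'XYZ')

q = eul.to_quaternion()            # XYZ Euler -> Quaternion
back = q.to_euler('XYZ')           # Quaternion -> XYZ Euler (round-trips to eul)

spin = Quaternion((0.0, 0.0, 1.0), radians(90))
combined = spin @ q                # quaternion multiplication = composing rotations

v = Vector((1.0, 0.0, 0.0))
rotated = combined @ v             # rotating a location vector by the result
```

The last two lines are roughly what the rotation/vector groups do in node form; treat this as a reference for checking results, not a drop-in replacement.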

4 Likes

Is there a way to produce a uniform random vector with the Attribute Randomize node?

(screenshot of the Attribute Randomize node setup)

It seems that it randomizes X, Y, and Z separately instead of giving the same random constant for all three components (for example, [2.55, 2.55, 2.55] instead of [6.59, 6.47, 1.28]).

If we want to use a vector type for the scale, adding randomization is tricky.
Does anyone have a solution?
Cheers

If you want a randomised uniform scale, make the Attribute Randomize node output a float, not a vector. Geonodes is smart enough to convert a float to a vector, and will make each component of the vector the same as the float, so a random float of, say, 22.5 will become vec3(22.5, 22.5, 22.5).

1 Like

scalar, scalar, scalar, scalar, scalar, scalar, scalar…

1 Like

Then I lose my XYZ scaling… :kissing:
I want, for example, to change the scale of my instances on Z AND have a random uniform scale.

Impossible?

Can’t you just multiply your existing scale values by a random float after you have scaled on the Z axis?
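
To illustrate the arithmetic being suggested (plain Python, purely for illustration; the values and variable names are made up): one random float per instance, broadcast to all three components, applied on top of the deliberate per-axis scaling.

```python
import random

base_scale = (1.0, 1.0, 1.0)
z_stretch  = (1.0, 1.0, 2.5)     # the deliberate non-uniform scaling on Z

u = random.uniform(0.5, 1.5)     # one random float per instance
# The single float is broadcast to all three axes (like the float output of
# Attribute Randomize), so X/Y/Z stay in proportion to each other.
final_scale = tuple(b * z * u for b, z in zip(base_scale, z_stretch))
print(final_scale)               # e.g. (0.87, 0.87, 2.18)
```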

Not sure how it handles thousands, but it handles tens of thousands very, very poorly. Anyway, what I’d really be for is access to scattered points via the Python API. I need to use it exactly the same way you do (exporting instances into a game engine), and doing it by making instances real objects is a no-go.

3 Likes

We can’t read points and their attributes via Python?

Anything with dividers in it makes things easier to read!

We need more dividers.

1 Like

As far as I know it’s not possible yet.

IMO, this really needs to be a feature before release. 2.93 is an LTS release to be maintained for years. Geometry nodes scattering should have all the necessary features in place, including debugging. Lacking the ability to read instance data via scripting will be a major obstacle in countless production pipelines.

1 Like

As far as I know it’s not possible yet.

Well, I did write some attributes (it was the normal attribute) and it did cause some issues/bugs, so I suppose the API is just not ready yet?

Yeah, completely agree, this is extremely important: tons of studios/game devs will need to directly read and write points and their attributes/instance names to link with their pipelines.
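
For what it’s worth, here is the kind of access I’d expect: a sketch using the evaluated dependency graph, which is how collection and particle instances can already be read from Python. Whether geometry-nodes point instances show up in `object_instances` depends on the build, so treat this as an assumption to verify rather than a confirmed workflow.

```python
import bpy

# Sketch: reading evaluated instances through the dependency graph.
# Collection/particle instances are exposed this way; whether geometry-nodes
# instances appear here too depends on the Blender version.
depsgraph = bpy.context.evaluated_depsgraph_get()

for inst in depsgraph.object_instances:
    if not inst.is_instance:           # skip "real" (non-instanced) objects
        continue
    source = inst.instance_object      # the object being instanced
    matrix = inst.matrix_world.copy()  # world transform of this particular instance
    print(source.name, matrix.translation)
```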

Just having a play with setting up a basic foliage generator and then an LOD system using proximity weight.

I have a question, though. To begin with, I generated my stalk with leaves and flowers, then I generated my bush using the stalk (still with a live node tree on it), and then I generated the ground plane that was scattering those live bushes. Performance was poor: anywhere between 1 fps and crashing.
However, as soon as I made the instances real and basically baked out a bush, even though it was the same poly count as the instanced version, the performance on the ground with a few thousand was back up to real time.

I thought instancing was supposed to be much better for performance. Why then was the performance so much better when I turned the bush into a real object rather than having it point to the live node tree?

6 Likes

I’m really just guessing here, but based on what you described above, it seems that the node tree which generates the stalk (from leaves and flowers) and the node tree which generates the bush (from stalks) have to be evaluated for each and every bush instance placed by Point Instance.
And the reason why scattering is fluid with the “baked” bush is that the Point Instance node is then using an “existing” mesh instead of a “needs to be generated first” mesh.
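
If that guess is right, the difference boils down to something like this (plain Python, purely conceptual, not the Blender API):

```python
# Conceptual stand-in for an expensive node-tree evaluation.
def generate_bush():
    return {"verts": 50_000}

def scatter_live(points):
    # Hypothesised "live" behaviour: the generator runs once per scattered point.
    return [generate_bush() for _ in points]

def scatter_baked(points):
    # "Make Instances Real" / baked mesh: generate once, then every instance
    # just references the same existing mesh data.
    bush = generate_bush()
    return [bush for _ in points]
```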

3 Likes

So, I have a complicated question to ask.
I have a collection composed of a mid-poly car, some low-poly assets, and two low-poly rigged characters, and they are emitting about 10,000 particles each. If I manually duplicate that collection 10 times, the render runs fine. But if I instance the collection using Geometry Nodes the exact same 10 times, everything looks fine in the viewport, yet both the viewport render and the final render simply don’t work. Is this a Geometry Nodes limitation? Because the poly count is exactly the same in both situations (manual duplicates vs. GN instances).

To my limited understanding, GN instances should be more efficient than simple duplicates, or am I wrong?

What exactly doesn’t work?