Geometry Nodes

How do you access GeoNodes custom “inputs” in a Python script?
Just like you can access, say, the “count” of an Array modifier via bpy.data.objects["array_obj"].modifiers["Array"].count, how can one access the inputs of a Geometry Nodes modifier without having to go into the actual nodes? I can’t seem to figure this out. Any help or pointers to what I should be looking for?

Just hover the mouse cursor over an exposed input and the pop-up gives you a hint. For example, an exposed “Ray Length” input might show up as “Input_2”.
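
From a script it then looks something like this. A minimal sketch, assuming made-up object/modifier names; use whatever identifier the tooltip shows you for your own input:

```python
import bpy

# Hypothetical object/modifier names -- replace with your own.
obj = bpy.data.objects["gn_object"]
mod = obj.modifiers["GeometryNodes"]

# Exposed inputs are addressed by the identifier from the tooltip
# (e.g. "Input_2"), not by the label shown in the modifier UI.
print(mod["Input_2"])   # read the current value
mod["Input_2"] = 1.5    # write a new value

# Tag the object so the modifier re-evaluates with the new value.
obj.update_tag()
```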


Thanks man!
It does seem like the inputs are treated just as custom properties of the object? So I should be able to do obj.modifiers['gn_modifier'].keys() to get a list of all inputs? Only one way to find out!

EDIT: Yup! Inputs behave like (or actually are) custom properties.
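
As a rough follow-up sketch (again with made-up object/modifier names; node_group.inputs is the pre-4.0 API, so newer versions expose the group interface differently):

```python
import bpy

# Hypothetical names -- replace with your own object/modifier.
mod = bpy.data.objects["gn_object"].modifiers["GeometryNodes"]

# Since the inputs behave like custom properties, keys() lists them
# (plus a few internal entries such as "Input_2_use_attribute").
print(list(mod.keys()))

# Map the opaque identifiers back to the socket names from the modifier UI.
for socket in mod.node_group.inputs:
    value = mod.get(socket.identifier, "<geometry or unset>")
    print(socket.identifier, "->", socket.name, "=", value)
```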

Is there a way to make sharp corners when using splines to distribute points? Although I’m using vector handles and the curve looks correct, the point normals always get interpolated. This behaviour makes it unusable for building railings.

Proximity from volume would have been nice too



I had to remove a video from proprietary software, see:


I’d like to solve that with the sample volume node: ⚓ T89218 Volume Sample Node


I’m not sure I understand the goal of this node. Does it do a sort of “attribute transfer” from a volume to a point cloud?

One extremely useful node to have would be an “is_rendered_view” input


Similar to the bounding-box display option, it would allow us to see the geometry only when the rendered view is active


“But why?”, you may ask.
Well, because the Cycles rendered view can display a huge polygon count with ease thanks to the nature of the ray-tracing algorithm, so having simpler geometry for the rasterized viewport and the “real” render-ready geometry for the rendered view could heavily streamline workflows

I did a script that does this automatically, but IMHO this should be supported natively


Related to this…
… is there currently a way to do the opposite, i.e. transfer a value sampled via the attribute sample texture volume node into volume density?


I am curious, how does that script work with rendered viewport mode? I mean, when you switch to rendered and swap the proxy instances for the real ones, Blender will still draw them in the rasterized layer for things such as selection outlines. This means that all those high-poly instances will still be drawn on the GPU, and viewport navigation will likely be extremely slow…?

What I mean is that since the monke here got the selection outline drawn, that already means it’s in the GPU buffer and GPU memory too. Or did you find a way to keep a proxy representation of the objects in the rasterized overlay even in rendered mode…? I am really curious about this.

The selection outline is indeed the biggest performance killer in ray-traced rendered mode and should be avoided

Or did you find a way to keep a proxy representation of the objects in the rasterized overlay even in rendered mode…? I am really curious about this.

In my scatter workflow I always do the scattering from an external object, getting the emitter from the Object Info node, then disable selection on this object. Then poof: no more outline, a faster viewport, and an easy-to-manage scattering layer
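
If you want that toggle from a script as well, it’s just the hide_select flag (the emitter name below is made up):

```python
import bpy

# Hypothetical emitter object name; this mirrors the "selectable" toggle
# in the outliner, so the selection outline is never drawn for it.
bpy.data.objects["scatter_emitter"].hide_select = True
```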

Interesting… I’d assume that even when the object is not selected, the scene is still present on the GPU. But it sounds like the meshes aren’t drawn at all.

For example… are you able to render, let’s say, a 1x1 km plane with instanced high-poly grass? (Should be around 25-50 million average-sized grass clumps.) Something like that should crash Blender any time the rasterizer tries to display it in the viewport. So in order for something like that to ever succeed without crashing, when switching to rendered mode the proxy meshes would have to be swapped for the real ones in a way where they never get loaded onto the GPU to be displayed in the viewport. Does that work?

But it sounds like the meshes aren’t drawn at all.

For example… are you able to render, let’s say, a 1x1 km plane with instanced high-poly grass? (Should be around 25-50 million average-sized grass clumps.)

In the Python script as demonstrated above, the implementation is rough (it’s just a timer that switches geometry, so it’s a very crude approach; unfortunately, there’s not much we can do from Python).

What I assumed in the node proposal is that it would use the same kind of implementation as the bounding-box display, guaranteeing that the full mesh only shows up in the final render / rendered view.
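
As for the timer script, it’s something along these lines. A very rough sketch of the general idea rather than the actual script: the object name, modifier name, and the boolean "Input_5" (assumed to feed a Switch node between proxy and full geometry) are all placeholders.

```python
import bpy

# Placeholder names -- adjust for your own scene.
OBJ_NAME = "scatter_object"
MOD_NAME = "GeometryNodes"
SWITCH_ID = "Input_5"  # assumed boolean input switching proxy/full geometry


def is_rendered_view():
    """Return True if any 3D viewport is currently in rendered shading."""
    for window in bpy.context.window_manager.windows:
        for area in window.screen.areas:
            if area.type != 'VIEW_3D':
                continue
            for space in area.spaces:
                if space.type == 'VIEW_3D' and space.shading.type == 'RENDERED':
                    return True
    return False


def sync_geometry():
    obj = bpy.data.objects.get(OBJ_NAME)
    if obj is not None:
        mod = obj.modifiers.get(MOD_NAME)
        if mod is not None:
            rendered = is_rendered_view()
            # Boolean inputs are stored as 0/1 on the modifier.
            if bool(mod.get(SWITCH_ID, 0)) != rendered:
                mod[SWITCH_ID] = 1 if rendered else 0
                obj.update_tag()
    return 0.5  # check again in half a second


bpy.app.timers.register(sync_geometry, persistent=True)
```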

What about boolean difference?
Points - Mesh?

BTW sorry for the video, I thought that this was an exception since it is a very simple example

Yes… that’s why I asked. I was curious if there is already some obscure way to do this that you have figured out, or if we have to wait for something like the node you’ve proposed.

I used other DCCs with plugins like ForestPack before Blender, and when I migrated to Blender it was borderline impossible to create large environments with heavy instancing. Not because Cycles could not render them, but because there was just no reasonable way to generate those instances without killing the viewport: there is currently no proper concept of render-time-only geometry with a lightweight viewport proxy display, which would guarantee that the viewport never knows about the high-poly render-time mesh and the renderer never knows about the lightweight viewport proxy.


Good idea!
But what should be supported natively is ‘display as points’, which could solve all those viewport performance problems. Even a mid-size grass field displayed as bounding boxes can significantly degrade FPS.

This can already be done

(not if we do per-vertex instancing, though)

Hey guys,

Where can we control the instancing index?

I use random sampling for the collection, but it seems there’s absolutely no way to control which instances are spawned where, which is a bit annoying. I believe an instance index attribute is needed.


There actually is an ID attribute on instances, but I’m not sure what exactly it stands for, because it’s -1 for all instances. Maybe @HooglyBoogly will shed some light :slight_smile:


Ah, a new attribute! Interesting.
I also want such an attribute; it would be very good for having more control over how the instances are spawned.

But having it on the instance domain means that it cannot be changed without realizing the instances into a big unoptimized mesh :face_with_raised_eyebrow: