Geometry Nodes

I am unable to keep up with everything new regarding GN, but I am curious: in the 3.0 beta, can GN finally be used to generate/write to UV attributes?

All I want is to, for example, use world position coordinates to generate a UV map to then use with a displace modifier. But this doesn’t work:


You can indeed:

I was trying to create a simple, trivial box mapping, which almost every software other than Blender has in a non-destructive form. In Blender, the only way to achieve some sort of box mapping procedurally is to create a UV Project modifier and create 6 individual helpers, one for each axis direction, which is a crazy ugly workaround.
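
For concreteness, this is roughly what that workaround amounts to in Python; the object names and exact rotations are indicative only, with each empty acting as one projector:

    import bpy
    from math import pi

    obj = bpy.context.object
    mod = obj.modifiers.new("BoxUV", 'UV_PROJECT')
    mod.projector_count = 6
    # One empty per axis direction (+/-X, +/-Y, +/-Z), each used as a projector.
    rotations = [(0, pi / 2, 0), (0, -pi / 2, 0),
                 (pi / 2, 0, 0), (-pi / 2, 0, 0),
                 (0, 0, 0), (pi, 0, 0)]
    for i, rot in enumerate(rotations):
        empty = bpy.data.objects.new(f"BoxUV_Proj{i}", None)
        empty.rotation_euler = rot
        bpy.context.collection.objects.link(empty)
        mod.projectors[i].object = empty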

I tried to do it in GN, but I have no clue how to make the workflow even remotely acceptable. I mean, I got it working:

But there must certainly be a better way to achieve this than this hideous sequence of Attribute Vector Math nodes, where one has to come up with a crazy amount of temporary attribute names just to put something this trivial together:


It was a real headache to put together because it was impossible to visually debug which world normal mask corresponded to which 2D vector world UV projection. But mainly because of how much data one has to keep in their head when constructing this.

Since you seem to be one of the few people who actually know their way around GN, is there a better way?

All I want is to just create World XY, YZ, and XZ based UV coordinates, and then combine them using dot product masks.
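
For illustration, here is that selection logic written out in plain Python rather than nodes; triplanar_uv and its inputs are made-up names, and the hard dominant-axis pick stands in for the dot product masks:

    from mathutils import Vector

    def triplanar_uv(position, normal):
        # Pick the projection plane from the dominant axis of the world normal.
        n = Vector((abs(normal.x), abs(normal.y), abs(normal.z)))
        if n.z >= n.x and n.z >= n.y:
            return (position.x, position.y)  # Z dominant: world XY projection
        elif n.y >= n.x:
            return (position.x, position.z)  # Y dominant: world XZ projection
        else:
            return (position.y, position.z)  # X dominant: world YZ projection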

Edit: this is the same thing done in the shader editor:


It just seems way more intuitive.


Right, that’s the idea with this: D11547: Geometry Nodes: New Attribute Processor.

Yes, I know, but afaik it’s still not available in the 3.0 beta, and I am also a bit worried that it will still be cumbersome, as the processor subnetworks will be quite isolated from the top-level networks. I will see how it turns out, but so far I remain skeptical.

This is the cleanest version I have so far:


But still, for example, I wish I did not have to create an Attribute Separate node followed by 3 Attribute Combine nodes.

I wish I could simply write something like position.xy to automatically cast it into a 2D vector attribute, or position.x to have it automatically cast to a float.

Or when creating those world-aligned normal masks, I wish I could just directly type something along the lines of Abs(MaskZ), instead of having to cumbersomely create an Attribute Vector Math node, set it to Abs, set its input to MaskZ, and set its output to MaskZ.

And the Attribute Color Ramp nodes are another product of cumbersomeness: I had to resort to imprecise compression of the range by typing 0.499 and 0.5 in the ramp slider values, because doing it properly, as in Ceil(MaskZ - 0.5f), would have meant an even uglier node tree.
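
For clarity, the hard mask that the compressed ramp approximates is just a step at 0.5; a one-line Python equivalent, purely for illustration:

    import math

    def hard_mask(x):
        # 0 for x <= 0.5, 1 for 0.5 < x <= 1.5 (the masks here live in [0, 1])
        return math.ceil(x - 0.5)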

Whatever the Attribute Processor will be, I am somewhat worried it won’t make this sort of thing that much easier. :neutral_face:

Not Geometry Nodes related, but if you want box mapping in shaders, you can just change the Image Texture node’s projection from Flat to Box.
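
For reference, the same switch can be made via Python; a minimal sketch assuming a material whose Image Texture node still has its default name:

    import bpy

    mat = bpy.context.object.active_material
    node = mat.node_tree.nodes["Image Texture"]  # default node name (assumption)
    node.projection = 'BOX'        # instead of the default 'FLAT'
    node.projection_blend = 0.25   # blend amount between the box sides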

Yes, I am aware, but:

  1. This one has a serious bug where it breaks when rotated.
  2. It’s shader-only. The reason I need triplanar/box mapping is mainly procedural modeling.

I do lots of environments, and a surprising amount of procedural environment stuff, such as cliffs, castle walls, etc., can be generated by chaining mesh displaces and remeshes. :slight_smile: But for that, it’s important to be able to procedurally displace a mesh, then remesh that result, and then generate UVs on the remeshed result to displace again :slight_smile:
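
To make the chaining concrete, here is a rough sketch of that modifier stack in Python; the modifier names, the GN network, and the UV map name are placeholders:

    import bpy

    obj = bpy.context.object
    # Illustrative stack: displace, remesh, regenerate UVs with GN, displace again.
    d1 = obj.modifiers.new("Displace1", 'DISPLACE')
    rm = obj.modifiers.new("Remesh", 'REMESH')
    gn = obj.modifiers.new("BoxUV", 'NODES')   # the GN box-mapping network
    d2 = obj.modifiers.new("Displace2", 'DISPLACE')
    d2.texture_coords = 'UV'
    d2.uv_layer = "UVMap"  # the UVs regenerated after the remesh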

I wanted to show you an example, but unfortunately it’s bugged out :frowning:


Apparently, if you have 2 GN modifiers using the same GN network writing into the same UVMap channel, only the first one will work, so the second one, which should generate the new box UVs on top of the remesh modifier, doesn’t work anymore. :confused:

To his point, box mapping isn’t UVs. If your goal is to export for a game engine, for example, the only ‘non-destructive’ method is the UV Project modifier with 6 different empties (7 if you use a root for global transformation, which most people do). It is decidedly not intuitive.

Yep, we have a free addon to streamline that process. There is no other way, but with the addon it’s a matter of a click. The “bad” part of this is that it does not have blending; I mean, the seam is sharp, but since it’s UV based, that’s logical.

I made a texture box mapping node group that can do displacement without the need for UV maps and has side blending; it might be useful.

It can also be easily modified to add rotation/location controls


The shader node setup it was based on.
It also has side blending/mixing, so it might be helpful @JuanGea


Blend file


Hi,

yes, I know. I made pretty much the same thing myself quite a while ago: not just a box map but an actual triplanar map like yours, using the GN mapping instead of UV maps and the Displace modifier.

I know it can be done, but like you, I also arrived at an incredibly ugly and confusing node tree. What I did above was just an exploration of whether the workflow can be better with GN in 3.0.

My problem with GN wasn’t whether something can be done, but the speed and efficiency with which it can be done. Since you went through the exact same process, I am sure you know how painful the clunky messing with all the temporary attribute names was, compared to doing the same thing in the shader editor. And that’s my point :slight_smile:


Devs are talking about particle nodes again. So, naturally, the question has to be asked:

Can we do it in geometry nodes?

The answer is: “Well, kind of”. The example here uses some basic particle building blocks to create a bunch of barrels that collide with the ground and then explode after a delay, spawning secondary particles.

blend file:
(Note: You have to run the register_frame_handler.py script to make it work, see explanation below)

Some implementation notes:

  • GN output state is not carried over from one frame to the next. To make iterative simulations work, we have to somehow store the modifier output. I’ve used a frame handler that copies and applies the modifier to the mesh:

    import bpy

    sim_obj = bpy.data.objects["Sim"]  # placeholder name for the sim object

    # Registered as a frame change handler by register_frame_handler.py.
    def cache_handler(scene):
        # Duplicate the GN modifier, move the copy to the front of the stack,
        # then apply it, baking this frame's output into the base mesh.
        # The original modifier stays in place for the next frame.
        override = bpy.context.copy()
        override['object'] = sim_obj
        bpy.ops.object.modifier_copy(override, modifier=sim_obj.modifiers[0].name)
        bpy.ops.object.modifier_move_to_index(override, modifier=sim_obj.modifiers[1].name, index=0)
        bpy.ops.object.modifier_apply(override, modifier=sim_obj.modifiers[0].name)
    

    Alembic caches would be another possibility, but they are a bit finicky, and file read/write could have unwanted overhead.

  • The existing particles are combined with newly spawned particles each frame through a Join Geometry node

  • Physics uses a standard leapfrog method to integrate positions and velocities, which is quite good at conserving energy (see the sketch after this list)

  • Collision detection uses the Raycast node, with rays cast between the old and new particle positions

  • Collision response is also fairly standard, with a restitution and friction component. It only does one iteration atm, so with concave colliders it might not be able to fully resolve. Good enough, though, and the nodes get reaaaally messy already

  • When a barrel explodes, it adds a geometry point to the “Events” output (you might see where I’m trying to go with this).

    The “Events” geometry is then used in a somewhat hacky way to spawn secondary particles, using a Point Distribute node

  • Rendering has to happen in a separate object because all geometry in the sim object is considered to be particles. The ParticleViz object makes instances for points, distinguished by an emitter_index attribute
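
For illustration, here is the leapfrog step and the single-iteration collision response from the bullets above, written as plain Python. It sketches the math only, not the actual node tree; the raycast callable, GRAVITY, and the parameter values are made up:

    from mathutils import Vector

    GRAVITY = Vector((0.0, 0.0, -9.81))

    def step(pos, vel, dt, raycast, restitution=0.5, friction=0.2):
        # Leapfrog (kick-drift-kick): half velocity, full position, half velocity.
        vel = vel + 0.5 * dt * GRAVITY
        new_pos = pos + dt * vel
        # A ray between the old and new positions detects the collision.
        hit, hit_pos, hit_normal = raycast(pos, new_pos)
        if hit:
            # One response iteration: reflect the normal component of the
            # velocity (restitution) and damp the tangential part (friction).
            v_n = hit_normal * vel.dot(hit_normal)
            v_t = vel - v_n
            vel = -restitution * v_n + (1.0 - friction) * v_t
            new_pos = hit_pos
        vel = vel + 0.5 * dt * GRAVITY
        return new_pos, vel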

This was a fun experiment, even though it goes without saying that this way of creating particles is pretty insane and not at all practical. It does highlight some of the areas that need to be addressed though.


This is VERY interesting, and you are putting into practice the idea of being able to create a solver inside a node, with nodes; that’s magnificent :slight_smile:

If we could have that node as a special type of node that can be “compiled” to get C++ performance, that would be awesome; many types of solvers could be created this way.

The question is, can you assemble all this in a user-friendly way, so that any artist can set up a scene with this “simulator”?

Of course this is a proof of concept and not a fully fledged, production-ready simulator, but the concept is very cool :slight_smile:

Which addon are you talking about?

Here it is, BS Easy BoxUV:

https://school.bone-studio.com/cu_addons_free

It’s under our free section.




After using the Join Geometry node, the Group data, although it still exists, cannot seem to get back to the actual vertex group. I think I understand that this is not a bug; the join node makes a new geometry, so the Group data is no longer a vertex group. But it really doesn’t feel right to use. What about having a node similar to the Material Assign node, but for vertex groups? Like a Vertex Group Assign node?

I am thinking about this because assigning vertex groups seems actually doable:


With the Data Transfer modifier you can get the vertex group from the original mesh, but it fails when you have edited the group data in the nodes (like in this case, where both grids should have weights). So I am thinking about this Vertex Group Assign node that you could use in the nodes; it is also similar to the Material Assign node, for a similar purpose and reason.
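
Until a node like that exists, here is a hypothetical workaround sketch in Python: after applying the modifier, it copies a float point attribute back into a real vertex group. The name “Group” and the assumption that the weights survive as a FLOAT point attribute are mine:

    import bpy

    obj = bpy.context.object
    mesh = obj.data
    attr = mesh.attributes.get("Group")  # assumed FLOAT attribute on points
    vg = obj.vertex_groups.get("Group") or obj.vertex_groups.new(name="Group")
    if attr is not None:
        for i, item in enumerate(attr.data):
            # Write each attribute value back as a vertex weight.
            vg.add([i], item.value, 'REPLACE')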

That’s what I was thinking while reading your post, lol. Cool experiment! Are you aware that Jacques made a prototype for particle nodes before starting over with geometry nodes, a few months ago? All the iterative part was solved (!) a while ago already. Not to undermine your work, of course, and I know you had your own working prototype years ago as well.
It does show that a solver can be made directly with nodes though, which is cool. As long as the user can access previous frame info, anything can be made from there, right?
The attribute wrangling in there is pretty insane too; I guess it could be compacted down to a third of that using attribute processors.


I wasn’t aware of Jacques’ particle nodes, I’ll have to check those out. And yes, this really isn’t meant to be a serious particle implementation. I get a feature, I have to test its limits :slight_smile:

If I were to make a list of what is missing/nice-to-have for such simulations in GN, in rough order of impact:

  • Caching and persistent state, to carry particle state from one frame to the next, but also to avoid costly computations of parts of the node tree and to record simulations.
  • Better ways of creating geometry, like “make N points for each input point” (see the sketch after this list). Obviously this is more complicated for surface meshes with faces, but for particles it is fairly straightforward. It wouldn’t even need for-loops, just inherit attributes from the “parent” so the new points can be initialized.
  • Time/Frame inputs: these should be implicit; various parts of a simulation will need consistent time steps.
  • Context variables: Declaring object properties would allow configuring simulations without having to pass everything through an input socket. That’s mostly what custom props/ID props do, but here a node tree could declare a property that gets added to any object which is using the nodes.
  • Structs: Grouping properties into structs would make it much easier to pass parameters around through multiple node group levels.
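
As a plain-Python sketch of the “make N points for each input point” idea above (illustrative names only):

    def spawn_points(parents, n):
        # Each parent is a dict of attribute values; every child starts as a
        # copy of its parent, so all attributes are inherited at creation.
        children = []
        for parent in parents:
            for _ in range(n):
                children.append(dict(parent))
        return children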

Since GN node networks tend to get messy really quickly, it’s now more important than ever to have good node network management tools. This means that this patch, D11209 [WIP] Reroute node improvements, should not be ignored. It currently has no reviewers assigned for some reason. Given how important reroute nodes are when it comes to cleaning up node networks, we should be able to move them with the mouse, like any other node in the node editor.


So, according to @dfelinto, Geometry Nodes are supposed to be production-ready already for the scattering use case, but I am still confused about that. I am trying to cover a trivial use case of scattering some branches on a game-ready tree, and I want to make some modifications to the scattered instances afterwards, mainly a normal transfer from a remeshed hull to create fluffy tree-crown shading:

It appears that literally any node I use after the Point Instance node completely destroys the UVs, and not only in the given geometry stream but in all geometry streams that are joined together at any point.

And a bonus question:


How do I make the Volume to Mesh result smooth shaded?

Thanks in advance.

EDIT: Even doing the workaround of making the data transfer happen outside of GN results in the same thing :frowning:


So apparently any manipulation of the mesh data after Point Instance has been used just removes the UVs :frowning:
