Hi, guys. So, I’m trying to instance a cylinder on another cylinder, but for some reason the instances are incredibly squashed. Why?
Because your emitter cylinder is scaled. Apply its scale (Ctrl+A) and the instances will keep their proportions.
Also, I’m trying to use a remesh modifier to “merge” all the instances on the original cylinder but it doesn’t work. It only affects the original cylinder. What would be the ideal workflow?
A few posts back we were discussing this: distributed points and instanced objects are not influenced at all by modifiers placed after the Geometry Nodes modifier.
Hello, I have been trying Geometry Nodes for a few weeks now. I used it for scattering trees and grass. IMHO, it is very useful for understanding what we are actually doing: each node in the tree serves a purpose, and unlike the particle system, no fields are left empty or unused. You only have to set up what you need. One thing I would love to see is a way to make a vertex group sensitive to the camera view. I tried some math functions, but it didn’t work as expected.
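A camera-view-sensitive vertex group is essentially a per-vertex weight computed from the vertex normal and the direction toward the camera. As a rough sketch of the underlying math (plain Python outside of Blender; the function name and inputs are my own, not part of any node):

```python
import math

def camera_facing_weight(vert_pos, vert_normal, cam_pos):
    """Weight in [0, 1]: 1 when the vertex normal points straight at the camera."""
    # Direction from the vertex to the camera, normalized
    d = [c - v for c, v in zip(cam_pos, vert_pos)]
    length = math.sqrt(sum(x * x for x in d))
    d = [x / length for x in d]
    # Cosine of the angle between the normal and the view direction
    dot = sum(a * b for a, b in zip(d, vert_normal))
    return max(0.0, dot)  # clamp back-facing vertices to 0

# Vertex at the origin, normal +Z, camera directly above: fully camera-facing
print(camera_facing_weight((0, 0, 0), (0, 0, 1), (0, 0, 5)))  # → 1.0
```

Baking a value like this into a vertex group would have to happen per frame as the camera moves, which is exactly why a built-in camera-aware input would be handy.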
For scattering trees, I used planes (luckily for my PC), and orienting them toward the camera is pretty simple: just fix the rotation with an Attribute Fill node. For the grass, I applied a random Z rotation, no problem. So much more fun than with particles, I promise!
Thanks to the developers working on it; it just gets better every day (even on weekends)!!
I’m trying to show what’s possible with this new tool, but I’m definitely not exploring all of its possibilities.
BTW: this sketchy render took about 5 hours of free time.
It looks like the Point Distribute node is becoming unavoidable for working with attributes.
We cannot rotate instances according to the normals of the original mesh without using this node.
Unfortunately, this node only provides random distribution methods.
For the moment, we cannot use a predictable, editable distribution corresponding to the vertices or face centers of a mesh, combined with mesh attributes.
It would be welcome to have other methods available, such as vertices, edge midpoints, and face centers.
That would give the user fine control over the placement of some instances via mesh editing.
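Aligning an instance to the mesh normal, as mentioned above, boils down to building a rotation that maps the instance’s +Z axis onto the normal. A minimal quaternion sketch in plain Python (this is just the underlying math, not the node’s actual internals; both helpers are my own):

```python
import math

def z_to_normal_quaternion(n):
    """Quaternion (w, x, y, z) rotating the +Z axis onto the unit normal n."""
    nx, ny, nz = n
    # cross(Z, n) = (-ny, nx, 0); dot(Z, n) = nz
    if nz < -0.999999:
        return (0.0, 1.0, 0.0, 0.0)  # normal points straight down: 180° about X
    w, x, y, z = 1.0 + nz, -ny, nx, 0.0
    norm = math.sqrt(w * w + x * x + y * y + z * z)
    return (w / norm, x / norm, y / norm, z / norm)

def rotate(q, v):
    """Rotate vector v by quaternion q using v' = v + w*t + u×t, t = 2(u×v)."""
    w, x, y, z = q
    vx, vy, vz = v
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    return (vx + w * tx + (y * tz - z * ty),
            vy + w * ty + (z * tx - x * tz),
            vz + w * tz + (x * ty - y * tx))

# A normal pointing along +X: the rotation should carry +Z onto +X
q = z_to_normal_quaternion((1.0, 0.0, 0.0))
print(rotate(q, (0.0, 0.0, 1.0)))  # ≈ (1.0, 0.0, 0.0)
```

Note the rotation about the normal itself is left free, which is why a random Z rotation is usually layered on top for scattering.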
So, I’m trying to recreate a scene of mine using Geometry Nodes, this is how it’s going.
I have one question: is it possible to select part of the geometry and use a specific material for it? If I need the pillars in the image above to have different colors than the rest, is that possible?
I agree about other instancing methods, which would ideally all be handled by one node. I suggested a while back renaming the Point Instance node to Object Instance, with a dropdown menu for point/face/edge types. Otherwise there will now have to be a separate node for each instancing type: face instance, edge instance, etc.
Point Instance still sounds to me like a node that would instance (or add) a point, so I don’t think it was named very well. At some stage there will have to be some kind of Add Point node, so that points can be added via nodes alone, so naming can get problematic quickly.
Okay, I tested it.
As you may know, instanced mirrors, arrays, and instanced radial arrays are highly desired, and vanilla Blender does not have them out of the box.
I made an advanced instanced mirror, but it still requires another object with a single vertex.
Sadly, there is no way to create a certain quantity of points and fill their positions with a function, and no way to get the length of an attribute. I think such low-level access and operations are a must-have; they would allow creating arrays and clouds of objects, not just surface scattering (which is very nice too, now that Poisson disk scattering is implemented).
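The low-level access described above, generate N points and set each position from a function, is easy to express in code, which is what makes its absence from the nodes noticeable. A hypothetical plain-Python sketch of the three requested patterns (these helpers are illustrations, not Blender API):

```python
import math

def array_points(count, offset):
    """Linear array: point i sits at i * offset."""
    return [tuple(i * o for o in offset) for i in range(count)]

def radial_points(count, radius):
    """Radial array: `count` points evenly spaced on an XY-plane circle."""
    return [(radius * math.cos(2 * math.pi * i / count),
             radius * math.sin(2 * math.pi * i / count), 0.0)
            for i in range(count)]

def mirror_point(p, plane_point, plane_normal):
    """Reflect p across the plane through plane_point with unit normal."""
    d = sum((pi - oi) * ni for pi, oi, ni in zip(p, plane_point, plane_normal))
    return tuple(pi - 2 * d * ni for pi, ni in zip(p, plane_normal))

print(array_points(3, (2.0, 0.0, 0.0)))
# → [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (4.0, 0.0, 0.0)]
print(mirror_point((1.0, 2.0, 3.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)))
# → (-1.0, 2.0, 3.0)
```

Each list of positions would then feed a Point Instance step, so one index-driven function replaces the extra single-vertex helper object.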
I just saw the commit in master.
It looks like Hans is wondering how to present vertex normals, face normals, and custom normals.
But as you said, my question was more about the distinction between vertices and points.
The initial goal was to use a point cloud object that only has points.
In that context, a Point Instance node would not be weird.
But now that we are using meshes directly, and trying to exploit mesh attributes directly too, that implies changes to the original plan.
What I find weird is having one big geometry socket.
There is a need for a big container socket to pass data from one step to another.
But we are facing a problem of characterizing and identifying its content.
The problem I am exposing is that mesh vertices are treated as points without attributes, while points from the Point Distribute node do have attributes. So we have two different kinds of geometry content.
We cannot do the same things with these different contents.
So we need nodes that provide a generic way to reach attributes, or to identify and convert the content of the geometry socket, or maybe both. I am not sure.
The Geometry category of nodes only has two nodes (Join / Transform) for the moment. I don’t know what Jacques’s plans were for that category.
But if a conversion node, a way to output attributes, or a viewer node is needed, it should probably be solved by nodes from that category.
I don’t think you can specify the material itself, yet. But you can give the different parts of your structure a color attribute that you can reference in the shader editor with the Attribute node and use as a mask in your material.
If I understand your node tree correctly, you should be able to easily create the color/mask attribute before joining the different parts together:
Oh, thanks for your help. But it didn’t work on the first try. I used Attribute Fill to set a color on geometries with a specified attribute, and I also created a vertex color layer. I will try again later.