It’s not that an object is a packed version of mesh data. An object consists of two parts: the object and the object-data. The second part is the actual mesh data, while the first part contains the transforms for the whole object (simplified explanation warning). When you instance an object, you only copy the object part and reference (link) the object-data part. This makes instancing very efficient, because you don’t need to copy the (large) mesh data.
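For illustration, here is a minimal bpy sketch of what a linked duplicate boils down to (the object name "Suzanne" is just an example, assumed to exist in the scene):

```python
import bpy

src = bpy.data.objects["Suzanne"]   # example name, assumed to exist

# Object part: a new object with its own name and transforms...
inst = src.copy()

# ...but object.copy() keeps the object-data (the actual mesh) linked,
# so the large mesh data is shared rather than duplicated.
assert inst.data is src.data

inst.location.x += 2.0
bpy.context.collection.objects.link(inst)
```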
A mesh primitive in GN just generates raw mesh data. Turning that into an object (i.e. consisting of an object and an object-data part) doesn’t really fit into the Blender architecture, I’d think. That would lead to ‘hidden’ objects which exist but are not shown in the Outliner.
I’m not saying it’s impossible to generate objects from a GN tree, but it would need to be very thoroughly thought through and designed to prevent an unmaintainable mess.
So I don’t really see that happening anytime soon. If you need the efficiency of instancing, you’ll have to do it explicitly and keep the instanced object outside of your GN tree.
However, just duplicating existing mesh data is, imho, something which would fit GN very well.
I see your point. I’m aware of the difference between the object and the object-data; by “packing” I meant exactly that: creating “hidden objects”, not visible in the Outliner, with an object container and data to be instanced. Of course this process would happen behind the scenes, transparently to the user. Anyway, I’m totally ignorant of Blender’s internal software architecture, so I don’t know whether it even makes sense to talk about how something like that would be implemented internally.
It would still be great to have a workflow that doesn’t necessarily rely on external objects, like the one described by @dfelinto in this post (on-the-fly creation, as opposed to the asset workflow):
It would be even better if instancing objects created directly in the tree didn’t cause a performance loss from copying raw mesh data instead of instancing it.
In Houdini you can just create object primitives, edit them with nodes and then copy instances to points; it would be handy to have the same workflow here. To be honest, I don’t know whether Houdini instances the geometry or just copies it, losing performance as well.
We’ve mentioned this before in a few places. The proper solution is instancing geometry directly, not requiring an object for instances created in geometry nodes. Work on that is actually planned for this week.
Why would we still develop new nodes if there will be a rework of the whole data flow structure in X weeks? Is there somewhere we can read about the planning of geometry nodes for 3.0?
I think it’s completely impossible to get the alpha channel information from a color attribute.
Let me know if I’m wrong.
Right now this information is visible but not accessible, which is quite weird.
As weight/vertex paint is very slow on higher-poly meshes, is there a more responsive method for defining the distribution?
Or can we expect the performance of those two features to improve significantly anytime soon?
Are there any plans in terms of general performance, so we can work smoothly even in projects with hundreds of thousands (or millions) of instances and “emitters” of e.g. 5M tris?
Weight painting is done in “Group” on a lower-density grid. The Attribute Transfer node projects it to a vertex group inside the higher-density grid.
That way, editing is done on lower-resolution meshes.
But I guess one would need to use many more small meshes and patch their weights onto the higher-density mesh with Attribute Transfer.
I am not sure how this would impact performance.
– Edit –
I tried it with a high-density mesh.
If the low-res group is projected onto the high-res mesh, painting on the low-res mesh becomes slow. But if the projection is disabled during weight painting, painting works fluently.
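In case it helps, here is a rough bpy sketch of the same projection idea done outside of GN: the weights of a “Group” vertex group are copied from a low-res proxy to a high-res mesh via a nearest-vertex lookup. The object names are examples, and this simple nearest-vertex transfer is only an approximation of what the Attribute Transfer node does:

```python
import bpy
from mathutils.kdtree import KDTree

lo = bpy.data.objects["GridLow"]    # low-res mesh with a "Group" vertex group (example name)
hi = bpy.data.objects["GridHigh"]   # high-res mesh that receives the weights (example name)

lo_group = lo.vertex_groups["Group"]
hi_group = hi.vertex_groups.get("Group") or hi.vertex_groups.new(name="Group")

# Build a KD-tree over the low-res vertices (in world space, so the two
# objects don't have to share the same transform).
tree = KDTree(len(lo.data.vertices))
for i, v in enumerate(lo.data.vertices):
    tree.insert(lo.matrix_world @ v.co, i)
tree.balance()

# For every high-res vertex, copy the weight of the nearest low-res vertex.
for v in hi.data.vertices:
    _, idx, _ = tree.find(hi.matrix_world @ v.co)
    try:
        w = lo_group.weight(idx)
    except RuntimeError:            # the low-res vertex is not in the group
        w = 0.0
    hi_group.add([v.index], w, 'REPLACE')
```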
I’ve been struggling recently to get colors applied to instances in Geometry Nodes. I’m trying to apply them using Attribute Fill, which works for a mesh but not for instanced geometry. Does anyone have any ideas as to why this might be happening? Each of the geometry instances has a vertex color Col channel, and the Col attribute appears in the Spreadsheet. I’m assuming this just hasn’t been implemented yet.
I’ve been having the same problem with Vertex Colors as attributes. This has become really problematic for me because I can’t export any changes to UVs or Vertex Colors with the current setup.
Attributes on instances don’t work (yet?). Entagma recently did a video showing how you can transfer point attributes to instanced geometry, but you need to collapse the instances into a single mesh, so it’s not usable in most cases.
Thanks for the tip, but this method wouldn’t allow any precise work, as you would be limited by the detail of the proxy (low-poly) mesh. Also, when editing the distribution you need to see immediately (responsively) what you are doing, but even in this case you would still experience a lag after each stroke; working like that for 8 hours or so would drive anyone crazy.
Yes, Blender’s performance issues are not GNodes’ fault, but if they make features like this one barely usable for larger projects, I think it is fair to mention them here, hoping the devs will notice and finally fix them (e.g. sculpting is vastly faster than vertex/weight paint, so it must be doable?).
I have already spammed various Blender forums about this in the past, but maybe thanks to GNodes somebody will realize that this area of Blender needs some care; otherwise tools like this can’t really show their full potential.
Even just thinking about this formula with the current state of GN makes my head hurt. It should be a simple formula for generating box mapping: a few if conditions testing which of the three axes the face normal is most aligned with, then setting the UVs from the world-space vertex positions.
It was trivial to put together, but I am having a hard time finding a reasonable way to construct it in GN and actually output UVs that work with the Displace modifier.
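For reference, the formula itself is only a few lines when written as a bpy script that stamps the result straight into a UV layer (the object and layer names are examples; normal sign and scaling are ignored):

```python
import bpy

obj = bpy.data.objects["Wall"]      # example object
mesh = obj.data
uvs = mesh.uv_layers.get("BoxMap") or mesh.uv_layers.new(name="BoxMap")

for poly in mesh.polygons:
    n = poly.normal
    # Which axis is the face normal most aligned with?
    axis = max(range(3), key=lambda i: abs(n[i]))
    for li in poly.loop_indices:
        co = obj.matrix_world @ mesh.vertices[mesh.loops[li].vertex_index].co
        if axis == 0:      # X-facing: project onto the YZ plane
            uvs.data[li].uv = (co.y, co.z)
        elif axis == 1:    # Y-facing: project onto the XZ plane
            uvs.data[li].uv = (co.x, co.z)
        else:              # Z-facing: project onto the XY plane
            uvs.data[li].uv = (co.x, co.y)
```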
Hello all.
Is there any work being done on adding shape keys as attributes? If yes, can anyone point me to it, please? If not, is it possible to expose an attribute (say, position), do some operations on it (using another, similar geometry’s attribute) and feed it back to the original geometry?
Writing shader code used to demand a lot more replacement of conditional statements with things like mix or lerp. I couldn’t figure out how to get branching logic using geometry nodes out of the box, so I ended up approaching it as if I were writing a shader for a low-spec device.
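For anyone who hasn’t run into the trick: an if/else on a per-element value can be rewritten as a blend between the two branches, which is what Mix / Map Range style nodes give you. A small illustrative sketch in plain Python (not GN syntax):

```python
def lerp(a, b, t):
    return a * (1.0 - t) + b * t

def branchy(x, threshold, low, high):
    # The "readable" version with real branching.
    return high if x > threshold else low

def branchless(x, threshold, low, high):
    # The same logic without an if: the comparison becomes a 0/1 factor
    # that drives a lerp, which is exactly what a Mix node does.
    t = float(x > threshold)
    return lerp(low, high, t)

assert branchy(0.7, 0.5, 10, 20) == branchless(0.7, 0.5, 10, 20) == 20
```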
Haha, yes, you’ve arrived at pretty much the same thing I did after a while. But I imagine it also took you way too much time and patience to transfer such a simple concept to GN.
The way I eventually approached it was to first construct it in Blender’s shader editor, where it was actually intuitive, and then “translate” that to GN.
It just shows how much we need GN to get more intuitive. The amount of mental work required to make even the simplest of things is just too much.
Excellent news. I’ll keep an eye on the daily commit notes and see if I can help with testing as soon as it’s added.
Like @RiccardoBancone, I understand the benefits of instancing objects rather than duplicating portions of a mesh. What I’m looking for is having both options at my disposal and choosing the one that makes sense for each situation.
Something like the Array modifier, but accepting a Geometry input port, would be huge in GN. It would be a bad decision to use it to copy geometry many times, but it would save a lot of work for those situations where you just need a small number of repetitions of simple geometry.
Out of curiosity: are there any plans to give GN modifiers the ability to self-reference the object they are assigned to? I’m trying and failing to visualize how GN can eventually replace the existing modifiers without being able to self-reference.