Fields and Anonymous Attributes [Proposal]

Also, to add to the above post, following the same “logic” (assuming there is one) as my idea, one thing that you should, in my opinion, prioritize is falloffs. Proximity, Less Than, More Than don’t appeal to most users. Most people just want an object falloff, a random falloff, a linear falloff; the more, the better. Make it as plug-and-play as possible so GN gains traction with people, especially people doing motion design, advertising, and VFX work, where speed under tight deadlines is key. Make it easy to use with simpler node groups, and leave the low-level nodes for the hardcore users doing more advanced stuff: more complex node groups, add-on creation, etc.

While I agree that the community will play a big role here, basic stuff should ship with Blender. For instance, when I told some students that in the near future they could download presets for falloff effects, which are pretty basic and common for all kinds of projects, they didn’t look happy that such basic stuff would need an additional (possibly paid) download.

Thanks again! :slight_smile:

8 Likes

You have some great points! I expect the asset browser to play a huge role. The planned built-in asset bundle gives us a really nice opportunity to include procedural effects like the ones you mention. Falloffs, selections, noise, scattering with randomness, and more are all things we’ve talked about including higher-level nodes for. The workflow could be browsing or searching in the asset browser editor or in a “mini asset browser” in the node editor, then a simple drag and drop. Ideally we could have some nice thumbnails too!

I think, due to time constraints, supporting node groups well in the asset browser isn’t a goal for 3.0, but I’m guessing the release after that would be a good time to work on it.

16 Likes

I agree! Especially when it comes to vector math and programming terminology like “float”; to non-programmers it can make no sense at all.

“Wrapper” groups and higher-level groups in general, or even built-in nodes (if it’s worth it performance-wise), would be nice, but I would still keep the low-level nodes with their programming-terminology names, ideally well documented (so non-coders can still learn). After all, nodes are a visual programming tool, and I’ve seen even coders and more IT-oriented people complain about strange terminology.

Blender can be a great tool for both pure artists and pure technical people. I think that just creating a visual/logical separation between low-level nodes and high-level/wrapper nodes by color-coding them could be a nice start! Then, as @HooglyBoogly said, an asset browser/manager with nice thumbnails could be a great improvement for artist-friendly workflows in the future!

6 Likes

Yes, it is much easier for regular artists to operate with possibilities rather than math.
When an artist starts to operate at the math abstraction level, they evolve into a technical artist.

5 Likes

More Shrinkwrap study. I’m trying to reverse engineer the modifier; so far I’ve been able to match the projected behavior, on-surface, with the limit, offset, and face culling options (off/front/back):

It got a little bit messy with switches and boolean math nodes to match the offset behavior under the different culling options:

I believe this is the first time I’ve used “Attribute Freeze”, and it’s an essential node! I didn’t realize its role until I started thinking: “Wait, now I need the normals before the points are displaced. How do I get that value if the field is evaluated on the geometry it’s plugged into? I need a geometry to probe, and to freeze the attribute… wait a minute… :bulb: :bulb: :bulb:” The fact that the node is named “Freeze” helped me figure out that that’s exactly what I needed.
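For anyone curious what I was matching, here is a toy single-point, single-plane sketch of the projection-with-offset-and-culling logic in plain Python. This is not the actual Shrinkwrap code (the real modifier raycasts against a whole mesh), and my reading of how offset interacts with the culling options is an assumption; all the function names are mine.

```python
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def scale(v, s): return tuple(x * s for x in v)

def project_to_plane(point, direction, plane_point, plane_normal,
                     offset=0.0, cull="OFF", limit=0.0):
    """Toy sketch of Shrinkwrap 'Project' mode for one point against an
    infinite plane. cull is 'OFF', 'FRONT' or 'BACK' and discards hits
    by which side of the surface the ray comes from."""
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-8:
        return point  # ray parallel to the surface: no hit
    t = dot(sub(plane_point, point), plane_normal) / denom
    if t < 0.0 or (limit > 0.0 and t > limit):
        return point  # hit behind the ray, or beyond the Limit distance
    # Front faces are those whose normal opposes the ray direction.
    facing_front = denom < 0.0
    if cull == "FRONT" and facing_front:
        return point
    if cull == "BACK" and not facing_front:
        return point
    hit = add(point, scale(direction, t))
    # Offset pushes the result back along the surface normal, with the
    # sign flipped so the point moves toward the side it came from.
    sign = 1.0 if facing_front else -1.0
    return add(hit, scale(plane_normal, sign * offset))
```

With the plane z = 0 and a point at (0, 0, 2) projected straight down, an offset of 0.5 leaves the point at (0, 0, 0.5), and front culling leaves it untouched.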

I think someone already suggested it before, but having an enum input socket type, compatible with a multi-input switch, would allow me to replicate this:

[image]

maybe with a dropdown, instead of having this:

[image]

Or is there a way to do that that I’m not aware of?
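To make the idea concrete, here is what an enum/index-driven switch would buy over a stack of two-way Switch nodes, sketched in plain Python (function names and the mode list are just illustrative, not Blender API):

```python
def chained_switches(index, options):
    """How a stack of two-way Switch nodes picks one of N inputs:
    each switch in the chain compares the index against one constant."""
    result = options[0]
    for i, value in enumerate(options[1:], start=1):
        result = value if index == i else result  # one Switch node each
    return result

def index_switch(index, options):
    """A single enum/index-driven switch: one node, one lookup."""
    return options[index]

modes = ["NEAREST_SURFACE", "PROJECT", "NEAREST_VERTEX", "TARGET_PROJECT"]
assert chained_switches(2, modes) == index_switch(2, modes) == "NEAREST_VERTEX"
```

Both give the same result; the point is that the second form is one node with a dropdown instead of a pile of switches and boolean math.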

10 Likes

Like this? AFAIK there’s no task for it yet; this is just a mockup I did.

2 Likes

TODO:
Edit: Please use your common sense. Of course I don’t want to pipe a viewer node into geometry, but rather the b/w data that runs into the viewer node. If you feel that the resulting Material Output node extension should look different from what I show here, please feel free to use all the imagination force of your inner cerebral holodeck to fill in the things I didn’t show. This is not a mockup, but a design on a meta level.


Thx in advance :wink:

1 Like

Hey @Grinsegold cool texture!

I’m not sure if I get the message, but I guess it’s a proposal for a GN output from shader nodes, to transfer shader data to GN via a material sample node in GN, right?

In your specific example, though, I think you just want to use the color/factor generated by the procedural b/w circuit texture. So isn’t it all about sampling the color (and therefore the texture)? Why use a shader output with an emission shader plugged in? If that’s the case, maybe a GN color attribute output would make more sense?

If using a shader output as you show, what should I expect to happen if I plug in, say, the Principled BSDF output instead of the emission viewer? How is the result supposed to be evaluated, and when? (At render time, I guess, since it’s a shader output, and it’s generally view-dependent.)

Just interested in knowing more about your idea!

The viewer node is plugged into the surface socket for demonstration purposes only. Of course, in production, the Principled would give the color (copper threads and green board), and a direct noodle would feed the GN socket. But then the readers of this post would miss the point that I want to feed in a b/w texture in this case, which doesn’t mean users shouldn’t be able to use the whole shader as information to pass to the geometry: whatever the user wants to pipe out as a mask, for anything you could imagine doing with vertices/points/values.
Maybe I’m naive, but I imagine an output on the Material Output node acting as a pass for GN: a container for 3 (4?) channels with arbitrary values, attributed to the respective positions.
I could of course bake the texture and then use it in GN to drive the density, but what a waste of time, and what a loss of control…

Sampling from a shader is trickier than it looks:
- shaders are only generated at render time
- you can’t mix shader info with “real” information
- you’d have to bake it
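The distinction being argued here can be illustrated with two toy functions (pure Python, nothing Blender-specific; the exact formulas are made up for the example):

```python
def texture(position):
    """A texture is a pure function of position: it can be evaluated
    per point at modifier time, which is what Geometry Nodes needs."""
    x, y, z = position
    return (x * 0.5 + y * 0.25 + z * 0.25) % 1.0

def shader(position, view_dir, light_dir):
    """A shader also depends on render-time inputs (view and light
    directions here), so there is no single value per point to sample.
    Baking fixes those inputs and collapses it into a texture."""
    n_dot_l = max(0.0, sum(a * b for a, b in zip((0, 0, 1), light_dir)))
    return texture(position) * n_dot_l

# Geometry Nodes could call texture(p) directly per point...
value = texture((0.5, 0.0, 0.0))
# ...but shader(p, ?, ?) has no answer until the renderer supplies the
# view/light inputs, which is why "sample the shader" means "bake first".
```

That is the sense in which the shading graph doesn’t “exist” outside of rendering: it is a function waiting for render-time arguments, not a value stored on the geometry.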

Maybe an error message if you plug in green noodles instead of grey, yellow, or purple ones?

No, you don’t get it: the whole shading graph simply doesn’t “exist” outside of rendering.
I might be wrong, though, but I believe that’s how it works.

So Cycles shaders (shaders as materials, not the shader socket type) going to Geometry Nodes is not possible, but the other way around (a texture generated in Geometry Nodes transferred to Cycles) might be a more feasible feature request.

1 Like

I can only hope you’re wrong, since texture generation is so much more flexible on the shader side. I feel so limited in procedural modelling because I can’t drive Voronoi texture coordinates with a Musgrave texture, which is driven by a Noise texture, which is driven by a Brick texture… can we do that in GN?
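The chain described here (Brick warps Noise, Noise warps Musgrave, Musgrave warps the coordinate fed to Voronoi) is just function composition, so any node system with texture nodes could express it. A minimal 1D stand-in with made-up toy texture functions (these are not Blender’s actual texture formulas):

```python
import math

# Toy stand-ins for texture nodes; only the chaining pattern matters.
def brick(p):    return float(int(p * 4.0) % 2)             # square wave
def noise(p):    return math.sin(p * 12.9898) * 0.5 + 0.5   # pseudo-noise
def musgrave(p): return abs(math.sin(p * 3.0)) ** 0.5
def voronoi(p):  return min(abs(p - c) for c in (0.0, 0.5, 1.0))

def chained(p):
    """Each texture drives the next one's input coordinate: this is the
    'Voronoi driven by Musgrave driven by Noise driven by Brick' setup."""
    return voronoi(p + musgrave(p + noise(p + brick(p))))

sample = chained(0.3)
```

In node terms, every `f(p + g(...))` is just the output of one texture node plugged into the Vector input of the next.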

That’s why the texture node editor should be revived from its decaying state.

2 Likes

I thought GN would do that.

I don’t see shaders evaluating to geometry attributes happening, but I may be wrong. On the other hand, textures are meant to be upgraded on the GN side, so what you show here should be possible directly with geometry attributes down the line.

2 Likes

Any info? Official statement about this?
It seems quite strange, because:
What about the ability to share texture data across multiple users?
What about the texture data type itself?

It’s been said a few times over, I think on this thread? Brecht mentioned it, and there’s a task on dbo; I’m not sure of the specifics. The idea was to have texture nodes live inside a special node group, so that it could be re-used in any context (shader, geonodes…).

2 Likes

Texture nodes were going to be part of the attribute processor, IIRC; they can just be added to normal geometry nodes now, since there’s no need for the attribute processor with fields. The Noise Texture is already in the fields branch; the other textures just need to be ported over.

8 Likes

@HooglyBoogly @RiccardoBancone @1D_Inc, thanks for your input, guys! You all made some great points. I think it’s important for the team to be aware of what users think, because sometimes it feels like you’re approaching Geonodes more from a technical artist/programmer POV.