Edit: Please use your common sense. Of course I don’t want to pipe a viewer node into geometry, but the b/w data that runs into the viewer node. If you feel the resulting material output node extension should look different from what I show here, please feel free to use the full imaginative force of your inner cerebral holodeck to fill in the things I didn’t show. This is not a mockup, but a design on a meta level.
I’m not sure if I get the message, but I guess it’s a proposal for a GN output from Shader Nodes to transfer shader data to GN via a GN material sample node, right?
In your specific example though, I think you just want to use the color/factor generated by the procedural B&W circuit texture. So isn’t it all about sampling the color (and therefore the texture)? Why use a shader output with an emission shader plugged in? If that’s the case, maybe a GN color attribute output would make more sense?
If using the shader output as you show, what should I expect to happen if I plug in, say, the Principled BSDF output instead of the emission viewer? How is the result supposed to be evaluated, and when (at render time I guess, since it’s a shader output, and it’s generally view dependent)?
The viewer node is only plugged into the surface socket for demonstration purposes. Of course, in production, the Principled would give the color (copper threads and green board), and a direct noodle would feed the GN socket. But then the readers of this post would miss the point that I want to feed in a b/w texture in this case, which doesn’t mean users shouldn’t be able to use the whole shader as information to pass to the geometry. Whatever the user wants to pipe out can serve as a mask for anything you could imagine doing with vertices/points/values.
Maybe I’m naive, but I imagine having an output in the material output node as a pass for GN, acting as a container for 3 (4?) channels with arbitrary values, attributed to the respective positions.
I could of course bake the texture and then use it in GN to drive the density, but what a waste of time, and what a loss of control…
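For anyone who does want that bake workaround, a rough bpy sketch of it (this assumes Cycles, an active object with a UV map and a node-based material, and the image name is just an example):

```python
import bpy

obj = bpy.context.object
mat = obj.active_material

# Target image the bake will be written into (name/size are arbitrary here).
img = bpy.data.images.new("baked_mask", width=1024, height=1024)

# Cycles bakes into the active Image Texture node of the material.
tex_node = mat.node_tree.nodes.new("ShaderNodeTexImage")
tex_node.image = img
mat.node_tree.nodes.active = tex_node

bpy.context.scene.render.engine = 'CYCLES'
bpy.ops.object.bake(type='EMIT')  # bakes whatever feeds the emission output

# Save next to the .blend so it can be reused later.
img.filepath_raw = "//baked_mask.png"
img.file_format = 'PNG'
img.save()
```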
Sampling from a shader is trickier than it looks:
shaders are only generated at render time
you can’t mix shader info with “real” information
you’ll have to bake it
No, you don’t get it: the whole shading graph simply doesn’t “exist” outside of rendering.
I might be wrong, though; I believe that’s how it works.
So Cycles shaders (shaders as materials, not the shader data type) to geometry nodes is not possible, but the other way around, a texture generated in geometry nodes and transferred to Cycles, might be a more feasible feature request.
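The nearest thing that already works in that direction is writing an attribute out of the GN modifier and reading it back with a shader Attribute node. A rough sketch, assuming the modifier exposes an output attribute named e.g. "gn_mask" and the material has the default Principled BSDF setup:

```python
import bpy

obj = bpy.context.object
mat = obj.active_material

# Attribute node in the shader reads whatever the GN modifier writes out.
attr_node = mat.node_tree.nodes.new("ShaderNodeAttribute")
attr_node.attribute_name = "gn_mask"  # must match the modifier's output attribute name

# Use it e.g. as the base color of the Principled BSDF.
principled = mat.node_tree.nodes["Principled BSDF"]
mat.node_tree.links.new(attr_node.outputs["Color"], principled.inputs["Base Color"])
```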
I can only hope you are wrong, since texture generation is so much more flexible on the shader side. I feel so limited in procedural modelling because I can’t drive Voronoi texture coordinates with a Musgrave texture, which is driven by a Noise texture, which is driven by a Brick texture… can we do that in GN?
I don’t see shaders evaluating to geometry attributes happening, but I may be wrong. On the other hand, textures are meant to be upgraded on the GN side, so what you show here should be possible directly with geometry attributes down the line.
Any info or official statement about this?
It seems quite strange, because:
What about the ability to share texture data across multiple users?
What about the texture data type itself?
It’s been said a few times over, I think on this thread? Brecht mentioned it and there’s a task on dbo, not sure of the specifics. The idea was to have texture nodes live in a special node group, so that it could be reused in any context (shader, geonodes…).
Texture nodes were going to be part of the attribute processor, IIRC; they can just be added to normal geometry nodes now, since there’s no need for the attribute processor with fields. The Noise Texture is already in the fields branch; other textures just need to be ported over.
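For reference, a rough Python sketch of driving point density with that noise texture. It assumes the node identifiers from the released fields-based versions ("ShaderNodeTexNoise" reused inside geometry node trees, "GeometryNodeDistributePointsOnFaces"), which may differ in the experimental fields build:

```python
import bpy

# Build a small geometry node group: geometry in -> distribute points -> points out,
# with the noise texture factor feeding the density field.
gn = bpy.data.node_groups.new("noise_density", "GeometryNodeTree")
gn.inputs.new("NodeSocketGeometry", "Geometry")
gn.outputs.new("NodeSocketGeometry", "Geometry")

nodes, links = gn.nodes, gn.links
group_in = nodes.new("NodeGroupInput")
group_out = nodes.new("NodeGroupOutput")
noise = nodes.new("ShaderNodeTexNoise")  # same texture node as the shader editor
distribute = nodes.new("GeometryNodeDistributePointsOnFaces")

links.new(group_in.outputs["Geometry"], distribute.inputs["Mesh"])
links.new(noise.outputs["Fac"], distribute.inputs["Density"])
links.new(distribute.outputs["Points"], group_out.inputs["Geometry"])
```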
@HooglyBoogly @RiccardoBancone @1D_Inc, thanks for your input, guys! You all made some great points. I think it’s important for the team to be aware of what users think, because sometimes it feels like you’re approaching Geonodes more from a technical artist/programmer POV.
Just to clear up possible confusion, the issue with “shader” data is lexical here. Because shaders are functions of the visible geometry, piping shader info into GN creates a dependency loop. I think you’re talking about porting the color-type and utility nodes from the shader editor into geometry nodes, which is very much possible with this branch (see: built-in noise texture). It’s been mentioned that there’s talk of extending the texture node input workflow to more closely match the shader workflow.
Mixing textures is already possible with fields; the workflow is still somewhat attribute-like, but I think with some tweaking you could do this in the current fields build. When we have more texture nodes instead of just noise, this will be even simpler.
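Something like the texture chain asked about above could then be sketched like this (assuming the Voronoi texture node is also exposed in geometry node trees, which may not yet be the case in the experimental build):

```python
import bpy

gn = bpy.data.node_groups.new("chained_textures", "GeometryNodeTree")
nodes, links = gn.nodes, gn.links

noise = nodes.new("ShaderNodeTexNoise")
voronoi = nodes.new("ShaderNodeTexVoronoi")

# Drive the Voronoi coordinates with the noise color: one texture
# feeding another, the pattern asked about above.
links.new(noise.outputs["Color"], voronoi.inputs["Vector"])
```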
A picture is worth a thousand words! Nice. I’m glad I asked; otherwise I would never have learned what your image teaches. Unfortunately this means GN is still/already so unintuitive that I didn’t come up with your solution by myself.
It is an experimental demo after all, not even an alpha release. Some things won’t work as they should, and some things will be bizarre. Right now they’re working on backwards compatibility, so it will probably take some time until we see the next version of this build.
As for instancing, I haven’t discovered that yet, but I’ll see what I can do.