What module to look in to understand material nodes/shaders in the render pipeline?

I have a pretty complex material node setup and, for reasons beyond the scope of this post, I’m interested in being able to evaluate the resultant RGB value for arbitrary points in space. My understanding so far is that this is currently impossible, because those calculations happen on the GPU and are never communicated back into main memory or to the CPU. I want to learn more about how this process works and the code behind it. It might help me think of an alternate way to do what I want, or (with extreme luck) I could write a snippet that does what I need and build it myself.
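For what it’s worth, one alternate route (if the node tree only uses nodes whose math is known) is to re-implement the graph on the CPU. A minimal sketch below, assuming a hypothetical material that mixes two colors by a procedural checker pattern; the `checker` and `mix_rgb` helpers are my own stand-ins, not Blender code, though `mix_rgb` mirrors what the MixRGB node does in “Mix” mode:

```python
import math

def checker(x, y, z, scale=5.0):
    """Stand-in for a checker texture: returns 0.0 or 1.0 per cell."""
    cell = (int(math.floor(x * scale))
            + int(math.floor(y * scale))
            + int(math.floor(z * scale)))
    return float(cell % 2)

def mix_rgb(fac, col_a, col_b):
    """Linear interpolation between two colors (MixRGB, 'Mix' mode)."""
    return tuple(a + fac * (b - a) for a, b in zip(col_a, col_b))

def evaluate_material(point):
    """Walk the (tiny) node graph on the CPU for one point in space."""
    x, y, z = point
    fac = checker(x, y, z)                       # texture node
    return mix_rgb(fac, (1.0, 0.0, 0.0),         # red input
                        (0.0, 0.0, 1.0))         # blue input
```

Obviously this only scales to node trees you can hand-translate, but it sidesteps the GPU readback problem entirely.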

I have some small experience writing toy webgl/openglES programs so I kinda know the mess I’m getting into. Blender is a big program though. Can anyone point me in the right direction? I’m not sure where to begin looking.

Correct me if I’m wrong somewhere


Today there are several kinds of node evaluation in play at once:

  • Multifunction
  • Field
  • Shader stack (in our context)

Following the logic:

  • Multifunction: this is a CPU-side calculation. In materials (although maybe I misunderstood this), it happens for single values. That is, if there is a set of nodes that can be simplified on the CPU, this happens (although it may not).
  • Field (what I think you need the most, but it exists only in geometry nodes, so you would have to rewrite the material as geometry nodes and check whether it can be done without ray-tracing data): it is like a single value, but calculated over data supplied from upstream, not produced locally.
  • The stack is the shader document that the material constructs as it updates the node tree. During construction, any resources you use (textures, …) are attached to the document, and the shader refers back to them by their number in the document. At this point the stack can be compiled into a program for the video card. I’m not sure whether you can get back the text version of the shader code here (although in theory it exists). Also, if you were interested, at this point you could run another program on the video card, with your own data, in order to reuse this one.
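The multifunction-vs-field distinction above can be sketched in a few lines: a field is the same function as a single-value computation, just applied across a domain of per-element data handed in from upstream. The `brightness` function and the domain below are invented for illustration:

```python
def brightness(color):
    """A 'multifunction'-style computation: one input, one output.
    (Rec. 709 luma weights, just as a concrete formula.)"""
    r, g, b = color
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# Single-value ("multifunction") evaluation: one concrete input.
one = brightness((1.0, 1.0, 1.0))

# Field-style evaluation: the same function, run over data supplied
# from upstream -- here, one color per point in some domain.
domain = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
field_result = [brightness(c) for c in domain]
```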

Hey @Coalth, what do you mean exactly by “… evaluate the resultant RGB value for arbitrary points in space”? Do you mean anywhere in your scene, or anywhere on a part of your scene visible to the camera?

Interesting, the latter I suppose, since shaders can depend on camera position (although the one I had in mind doesn’t).

This is very helpful! I’ll see if I can dig into the code a bit later today and see if I can come up with further, more specific questions.

Can you elaborate a little on this? What file(s) would I find these definitions in?

I have some small experience with OpenGL, which is where I drew this conclusion from. But in terms of code, I’m not sure which part of the API will help you. If you’re interested, maybe someone from the GPU module can help you?

But if you want to explore it yourself, here is the code that builds the shader document. By tracing it, you can find the place where it might be possible.
https://developer.blender.org/diffusion/B/browse/master/source/blender/nodes/shader/node_shader_util.cc$247
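To make the “shader document” idea from the earlier post concrete, here is a toy model of it in Python. This is not Blender’s actual data structure (the real mechanism is the `GPUMaterial`/`GPU_stack_link` code in the file linked above); the class and method names below are invented purely to show the pattern of attaching resources and accumulating shader calls that reference them by slot number:

```python
class ShaderDocument:
    """Toy model: nodes append function calls; resources get slots."""

    def __init__(self):
        self.textures = []   # attached resources (textures, ...)
        self.lines = []      # accumulated shader function calls

    def add_texture(self, name):
        """Attach a resource; return the slot number the shader uses."""
        self.textures.append(name)
        return len(self.textures) - 1

    def stack_link(self, func, *args):
        """Each node contributes one call to the shader document."""
        self.lines.append(f"{func}({', '.join(args)});")

    def source(self):
        """The 'text version' of the accumulated shader body."""
        return "\n".join(self.lines)

doc = ShaderDocument()
slot = doc.add_texture("my_image.png")
doc.stack_link("node_tex_image", f"sampler{slot}", "uv", "out_color")
doc.stack_link("node_mix_rgb", "fac", "out_color", "color2", "result")
```

In the real code, each shader node’s `gpu` callback plays the role of `stack_link` here, and the compiled result is what gets sent to the video card.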