I have been working on a texture node editor project for a while, and I think it could serve as a good starting point for the design of a new texture node editor in Blender.
Here are the main points:
- Nodes are executed in topological order as usual
- Sockets hold references to GPU texture objects (or a GPU offscreen)
- Node Preview: since each node generates one or more GPU textures, we can simply use one of them as a preview too
- Each node has an execute method
- An example of an execute method:
- Get a texture from an input socket
- Create an offscreen
- Pass the texture to a shader, render to the offscreen
- Set the output socket data to the color attachment GPU Texture of the offscreen
- Optionally set it as the node's preview (store it in a node-preview hash table, or have the node hold a reference to the preview texture, so a draw handler can loop over the nodes and draw their previews)
- For input sockets, data is fetched from the output socket they are linked to; otherwise the socket's default value is used (the "effective socket data")
- Reroute nodes are supported by an execute method that simply:
- Sets the data of each output socket of the reroute node to the data of its input socket
- GPU textures can be downloaded into data-blocks by reading their pixels back into CPU memory
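To make the evaluation model concrete, here is a minimal pure-Python sketch of the points above: topological execution, effective socket data, and reroute pass-through. The class and method names (Node, Socket, execute, effective_data) are illustrative only, not actual Blender API, and the socket data here is plain values where the real editor would pass GPU textures:

```python
class Socket:
    """An input or output socket; `data` would be a GPUTexture in practice."""
    def __init__(self, default=None):
        self.default = default
        self.data = None
        self.linked_output = None  # output Socket this input is linked to

    def effective_data(self):
        # Fetch from the linked output socket, else fall back to the default.
        if self.linked_output is not None:
            return self.linked_output.data
        return self.default


class Node:
    def __init__(self, inputs=(), outputs=1):
        self.inputs = [Socket(default=d) for d in inputs]
        self.outputs = [Socket() for _ in range(outputs)]
        for s in self.inputs + self.outputs:
            s.owner = self

    def dependencies(self):
        # Nodes feeding this node, derived from its input links.
        return [s.linked_output.owner for s in self.inputs
                if s.linked_output is not None]

    def execute(self):
        raise NotImplementedError


class RerouteNode(Node):
    def __init__(self):
        super().__init__(inputs=(None,))

    def execute(self):
        # Copy the input socket's data to every output socket.
        for out in self.outputs:
            out.data = self.inputs[0].effective_data()


def evaluate(nodes):
    """Execute nodes in topological order (depth-first postorder).

    Assumes an acyclic graph, as node trees are.
    """
    done = set()

    def visit(node):
        if id(node) in done:
            return
        done.add(id(node))
        for dep in node.dependencies():
            visit(dep)
        node.execute()

    for node in nodes:
        visit(node)


# --- Demo with two toy node types standing in for texture operations ---
class ValueNode(Node):
    def __init__(self, value):
        super().__init__()
        self.value = value

    def execute(self):
        self.outputs[0].data = self.value


class AddNode(Node):
    def __init__(self):
        super().__init__(inputs=(0.0, 0.0))

    def execute(self):
        a = self.inputs[0].effective_data()
        b = self.inputs[1].effective_data()
        self.outputs[0].data = a + b


v, r, a = ValueNode(3.0), RerouteNode(), AddNode()
r.inputs[0].linked_output = v.outputs[0]
a.inputs[0].linked_output = r.outputs[0]
# a's second input is unlinked, so its default (0.0) is used
evaluate([a])
print(a.outputs[0].data)  # 3.0
```

Note that unlinked inputs fall back to their defaults inside effective_data, so node execute methods never need to care whether an input is linked.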
There's also a working prototype Blender addon that uses the gpu module.
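For reference, the render pass inside a node's execute method might look roughly like this with the gpu module. This is an untested sketch, not code from the prototype: the socket accessors are hypothetical, the offscreen size is arbitrary, and '2D_IMAGE' is the builtin shader name up to Blender 3.3 (renamed 'IMAGE' in 3.4):

```python
import gpu
from gpu_extras.batch import batch_for_shader

def execute(self):
    # 1. Get the texture from the input socket (effective socket data).
    input_texture = self.inputs[0].effective_data()

    # 2. Create an offscreen to render into.
    offscreen = gpu.types.GPUOffScreen(512, 512)

    # 3. Pass the texture to a shader and draw a full-frame quad into the offscreen.
    shader = gpu.shader.from_builtin('2D_IMAGE')
    batch = batch_for_shader(shader, 'TRI_FAN', {
        "pos": ((-1, -1), (1, -1), (1, 1), (-1, 1)),
        "texCoord": ((0, 0), (1, 0), (1, 1), (0, 1)),
    })
    with offscreen.bind():
        shader.bind()
        shader.uniform_sampler("image", input_texture)
        batch.draw(shader)

    # 4. Expose the offscreen's color attachment on the output socket.
    self.outputs[0].data = offscreen.texture_color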
Let me know what you think, what could be improved, and how this could be integrated to replace the old texturing system.
Best regards