Node-based texturing system design proposal (won't work as a replacement; this is an idea for a GPU-based 2D texture authoring node editor)

I have been working on a texture node editor project for a while, and I think it could serve as a good initial design for a new texture node editor in Blender.

Here are the main points:

  • Nodes are executed in topological order as usual
  • Sockets hold references to GPU texture objects (or a GPU offscreen)
  • Node Preview: since each node generates one or more GPU textures, we can simply use one of them as a preview too
  • Each node has an execute method
  • An example of an execute method:
    • Get a texture from an input socket
    • Create an offscreen
    • Pass the texture to a shader, render to the offscreen
    • Set the output socket data to the color attachment GPU Texture of the offscreen
    • Optionally set it as the node's preview (store it in a node-preview hash table, or let the node hold a reference to the preview texture, so a draw handler can loop through the nodes and draw their previews)
  • For input sockets, data is fetched from the output socket they are linked to; otherwise the default value is used (the "effective" socket data)
  • Reroute nodes are supported by defining their execute method as follows:
    • Set the data of each output socket of the reroute node to the data of the reroute input socket
  • GPU textures can be downloaded to data-blocks by reading the pixels into CPU memory
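The evaluation model above can be sketched in plain Python. This is a minimal sketch under stated assumptions: the names (`Socket`, `Node`, `effective_data`, `evaluate`, the example node classes) are hypothetical, and in the actual add-on the socket data would be GPU texture objects produced by rendering into offscreens with Blender's `gpu` module, not plain Python values.

```python
# Hypothetical sketch of the proposed node evaluation model.
# Socket data here is plain values; in the real add-on it would be
# GPU textures (color attachments of offscreens).
from graphlib import TopologicalSorter


class Socket:
    def __init__(self, default=None):
        self.default = default  # value used when the socket is unlinked
        self.link = None        # output Socket this input is linked to
        self.data = None        # value produced by the owning node

    def effective_data(self):
        """Fetch data from the linked output socket, else the default."""
        return self.link.data if self.link is not None else self.default


class Node:
    def __init__(self):
        self.inputs = {}
        self.outputs = {}

    def execute(self):
        raise NotImplementedError


class AddNode(Node):
    """Stand-in for a shader-based node: read inputs, write an output."""
    def __init__(self):
        super().__init__()
        self.inputs = {"a": Socket(default=0), "b": Socket(default=0)}
        self.outputs = {"result": Socket()}

    def execute(self):
        a = self.inputs["a"].effective_data()
        b = self.inputs["b"].effective_data()
        # The GPU version would render into an offscreen here and store
        # its color-attachment texture on the output socket instead.
        self.outputs["result"].data = a + b


class RerouteNode(Node):
    """Reroute: copy the input socket's data to every output socket."""
    def __init__(self):
        super().__init__()
        self.inputs = {"in": Socket()}
        self.outputs = {"out": Socket()}

    def execute(self):
        for sock in self.outputs.values():
            sock.data = self.inputs["in"].effective_data()


def evaluate(nodes):
    """Execute nodes in topological order of their links."""
    deps = {
        node: {
            other
            for sock in node.inputs.values() if sock.link is not None
            for other in nodes if sock.link in other.outputs.values()
        }
        for node in nodes
    }
    for node in TopologicalSorter(deps).static_order():
        node.execute()


# Tiny graph: add(3, 4) -> reroute -> add(result, 10)
n1, rr, n2 = AddNode(), RerouteNode(), AddNode()
n1.inputs["a"].default = 3
n1.inputs["b"].default = 4
rr.inputs["in"].link = n1.outputs["result"]
n2.inputs["a"].link = rr.outputs["out"]
n2.inputs["b"].default = 10

evaluate([n2, rr, n1])  # order of the input list does not matter
print(n2.outputs["result"].data)  # -> 17
```

Downloading a result to a data-block would then just mean reading the final output socket's texture pixels back into CPU memory, as the last bullet describes.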

There’s also a working prototype Blender add-on that uses the gpu module.
Let me know what you think, what could be improved, and how this could be integrated to replace the old texturing system.

Best regards


I suggest reading through this task and topic:

With the assumption of a 2D image and GPU support, this cannot serve as a replacement for the old texture nodes. We should build on the existing function/geometry nodes CPU implementation, and then as a second step consider how to make that more powerful or faster for the subset of use cases where 2D image operations and GPU support are possible.


OK, I see now. Many thanks for the feedback :pray:

Have you implemented this in C or as an addon?

An addon.

Please implement this functionality natively in Blender. It is a must-have for a lot of people; many have wanted this for a long time!


Agreed, it will happen eventually, but not with this system. This is more suitable as an add-on than as a native feature; or perhaps it could be native, but not as a replacement for the texturing system. It is a different thing, as I now understand better.

It’s true this system is not fully optimized and not good for previews. Ideally the preview would be inside the node box, below the inputs and outputs, and it could update in real time (like in Unity) if the node is animated. It would be better natively, but it’s true this system is not a good fit for a native implementation.

Some rewriting and new ideas are needed to implement it properly!


Thanks for the suggestions :pray:

This looks cool.

But there should definitely be crossover talks between updating the texturing system and the compositor improvements, as they technically should operate in the same domain: rasterized images.

There’s a real opportunity here for improving BOTH.


That’s true, but the two systems need to be kept separate. You should work on one system at a time, not two at once, because the systems are different. And the system should probably be written with Vulkan shaders to increase performance, because Blender will soon be moving to Vulkan from OpenGL.
