Suggestions / feedback on the extensions for the gpu module

This topic is intended for add-on developers who want to port their code to use the gpu module.

As discussed in BGL deprecation, the bgl module has reached the end of its life and should be removed to make way for a switchable drawing backend in Blender (Vulkan + OpenGL).

To this end, patches are being proposed to extend the gpu API and make it possible for the bgl API to be replaced:

Feel free to comment on the patches, suggest changes and ask questions about how to replace the bgl module.

The documentation is updated automatically:

This topic will be updated as soon as new patches are proposed.


Please add the ability to draw clickable (selectable) text in the bpy.types.Gizmo draw method.

Could we have a way to render the depth buffer into a texture with the GPU python modules? Currently we can render the 3D View into a texture with “offscreen.draw_view3d” and read it via “offscreen.color_texture”. Could we have something like “offscreen.depth_texture”?

I find that in Blender 2.92, glReadPixels always produces 0.0 for bgl.GL_DEPTH_COMPONENT, while it works well with bgl.GL_RGBA. Making the depth buffer of the 3D Viewport accessible would really help a lot when adding effects. Thanks a million!

In fact, “offscreen.color_texture” needs to be updated (and “offscreen.depth_texture” would be a good addition).
Currently “offscreen.color_texture” returns an OpenGL bind handle, so it must be deprecated.
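In newer builds (Blender 3.0+), the replacement appears to be GPUOffScreen.texture_color, which returns a gpu.types.GPUTexture rather than an OpenGL handle. A sketch under that assumption:

```python
# Sketch: reading the color attachment of a GPUOffScreen through the gpu
# module instead of an OpenGL bind value. Assumes Blender 3.0+, where
# GPUOffScreen.texture_color returns a gpu.types.GPUTexture.
import gpu

WIDTH, HEIGHT = 512, 512
offscreen = gpu.types.GPUOffScreen(WIDTH, HEIGHT)

with offscreen.bind():
    # ... draw batches here ...
    pass

# texture_color is a GPUTexture, not an OpenGL name, so the code does not
# depend on the backend (OpenGL today, Vulkan later).
texture = offscreen.texture_color
buffer = texture.read()  # gpu.types.Buffer with the pixel data
```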


Just want to chime in and say +1 to being able to access the depth buffer. Also, is there any indication yet as to what will replace this after the current implementation is deprecated?

We have GPUFrameBuffer.read_color, so we could have a GPUFrameBuffer.read_depth.
But only if it is really necessary for porting current add-ons, as it can limit the Vulkan implementation.
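If read_depth lands with a signature mirroring read_color, usage inside a draw callback might look like this (a sketch under that assumption; the parameters of the depth variant are not final):

```python
# Sketch: reading back viewport pixels from inside a draw callback.
# Assumes Blender 3.0+, where gpu.state.active_framebuffer_get() and
# GPUFrameBuffer.read_color are available.
import gpu

def draw_callback():
    # The viewport framebuffer is only valid inside a draw callback.
    fb = gpu.state.active_framebuffer_get()

    # read_color exists today: (x, y, xsize, ysize, channels, slot, format)
    color = fb.read_color(0, 0, 64, 64, 4, 0, 'FLOAT')

    # Hypothetical read_depth mirroring it; depth has a single channel,
    # so no channels/slot/format arguments should be needed:
    # depth = fb.read_depth(0, 0, 64, 64)
```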

For my addon I am relying on an example that still uses the bgl module. Will there be an update to this example, too?

You can see the documentation under development at


Deprecating the bgl module as part of the switch to Vulkan is great. The current implementation of the gpu module is lacking, though. In particular, using Buffer or GPUTexture outside this module is pretty awkward.

I’ll just explain my use case to put everything into context. Previously I used the pyopengl library to generate a set of images and load them into Blender. That was fine, because the final buffer read returns a numpy array which is easily turned into a flat array of pixels to pass to pixels.foreach_set() and push into a Blender image. With the new gpu module and its framebuffer implementation I decided to switch and get rid of the pyopengl dependency. The problem is that the read returns a Buffer object shaped according to texture width, height and color depth. Passing this buffer to the pixels of a Blender image is non-trivial: the array has to be flattened first, and all the methods I’ve tried are super slow, taking at least 2 seconds for a simple 1024*1024 32-bit image.
The main bottleneck seems to be the Buffer.to_list() method, which returns a list of lists of pixels and takes around a second; but that list of lists still has to be flattened. numpy.ravel and numpy.flatten are very slow here for some reason, and generating a list by stepping through all the indices is also very slow.

So the whole drawing part takes a hundredth of a second, while copying the results into Blender image pixels takes several seconds, which is terrible.

So I wonder if there are any plans to integrate the gpu module more tightly into Blender. Or maybe I’m overlooking some simple solution, which you could perhaps share.
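One possible workaround for the flattening cost (a sketch, not authoritative: it assumes the Buffer object exposes the Python buffer protocol, which appears to be the case in recent Blender builds; if it does not in yours, the slow fallback is going through to_list()):

```python
import numpy as np

def buffer_to_flat_pixels(buf, width, height, channels=4):
    """Turn a width x height x channels pixel buffer into the flat float32
    array that Image.pixels.foreach_set() expects.

    Assumes `buf` is memoryview-compatible (buffer protocol), so numpy can
    wrap it without a slow per-element Python loop. Fallback if not:
    np.array(buf.to_list(), dtype=np.float32).ravel()
    """
    arr = np.asarray(buf, dtype=np.float32)
    return arr.reshape(width * height * channels)

# Usage inside Blender (names are illustrative):
#   buf = framebuffer.read_color(0, 0, W, H, 4, 0, 'FLOAT')
#   image.pixels.foreach_set(buffer_to_flat_pixels(buf, W, H))
```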

I also wonder whether this can still be part of the 2.93 release. It would be great if I could release an addon without forcing users onto 3.0.


In Blender 2.83 we could use:

buf = bgl.Buffer(bgl.GL_FLOAT, width * height * 4)
bgl.glReadPixels(x, y, width, height, bgl.GL_RGBA, bgl.GL_FLOAT, buf)

to ‘read’ any part of the 3D viewport (using glReadPixels) into a 2D buf, which could later be saved to an image.
Does anyone know how to redo this using only the gpu module?
There is offscreen.draw_view3d(), but it requires a camera to be present in the scene for the projection_matrix, and as mentioned in the example, there is no way to get the projection_matrix without a camera.
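One possibility worth trying (an untested sketch; it assumes RegionView3D.window_matrix, which the RNA docs describe as the region's current window/perspective matrix, can stand in for the camera's projection matrix):

```python
import gpu

def render_view_to_offscreen(context, width=512, height=512):
    """Sketch: draw the current 3D view into an offscreen buffer using the
    region's own matrices instead of a camera. Assumes the operator runs in
    a 3D Viewport, so context.space_data is a SpaceView3D and
    context.region_data is a RegionView3D."""
    offscreen = gpu.types.GPUOffScreen(width, height)
    rv3d = context.region_data
    offscreen.draw_view3d(
        context.scene,
        context.view_layer,
        context.space_data,   # the SpaceView3D
        context.region,
        rv3d.view_matrix,     # camera-free view matrix
        rv3d.window_matrix,   # assumed usable as projection_matrix
    )
    return offscreen
```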

See the read_color used in this example:

I forgot to mention that I wanted to render the view buffer to an image in Blender 2.93.
But even in the latest Blender 3.0 I have trouble getting the code to work (I get a black image when reading the framebuffer with read_color).
Here is the simple operator script that I made to render current 3d view to image:
ViewportBufferToImg_b30.txt (1.5 KB)
I’m not familiar with OpenGL stuff, so I’m probably doing something wrong… Any help would be appreciated.

GPUOffScreen is like an empty screen that can be drawn to in the background.
If nothing is drawn, it is black.
It has its own GPUFramebuffer.
GPUFramebuffer is a container for GPUTextures. (A texture can be used by more than one framebuffer).

Blender has a framebuffer for each viewport.
If you call gpu.state.active_framebuffer_get() inside a draw callback, you can access these framebuffers.
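Putting those pieces together, a minimal sketch of reading viewport pixels from a draw callback (assuming Blender 3.0+; the handler name and the hard-coded read size are illustrative):

```python
import bpy
import gpu

captured = {}

def capture_viewport(width, height):
    # gpu.state.active_framebuffer_get() is only meaningful inside a draw
    # callback; outside of drawing no viewport framebuffer is active.
    fb = gpu.state.active_framebuffer_get()
    # Read a width x height rectangle of RGBA bytes from slot 0.
    captured['pixels'] = fb.read_color(0, 0, width, height, 4, 0, 'UBYTE')

handle = bpy.types.SpaceView3D.draw_handler_add(
    capture_viewport, (512, 512), 'WINDOW', 'POST_PIXEL')

# When done:
#   bpy.types.SpaceView3D.draw_handler_remove(handle, 'WINDOW')
```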


When trying to copy this example I get an error that the dimensions attribute of the Buffer object is not writable.
I’m on 2.93, though.

Still badly missing a way to read data directly from a VBO, the bgl equivalent of:

    bgl.glBindBuffer(bgl.GL_ARRAY_BUFFER, vbo[0])
    bgl.glGetBufferSubData(bgl.GL_ARRAY_BUFFER, index * 24, 24, co)

We could expose these functions:

const void *GPU_vertbuf_read(GPUVertBuf *verts);
void *GPU_vertbuf_unmap(const GPUVertBuf *verts, const void *mapped_data);

I need to make a list to track these backlogs.


That would definitely be a huge step forward!

I am happy to see that GPUFrameBuffer will get a read_color(x, y, xsize, ysize, channels, slot, format, data=data) function, which allows us to read a rectangle from a framebuffer texture.
Question: is there also a function to write a rectangle into a texture or framebuffer object? Specifically, I’d like to have the functionality of glBlitFramebuffer.

Hello, I’ve been using the bgl module to generate a custom 3D texture and draw it:

self.texture = bgl.Buffer(bgl.GL_INT, 1)
bgl.glGenTextures(1, self.texture)
bgl.glBindTexture(bgl.GL_TEXTURE_3D, self.texture[0])
texture_data = bgl.Buffer(bgl.GL_FLOAT, [depth, height, width], images)
bgl.glTexImage3D(bgl.GL_TEXTURE_3D, 0, bgl.GL_R16F, width, height, depth, 0, bgl.GL_RED, bgl.GL_FLOAT, texture_data)
bgl.glTexParameteri(bgl.GL_TEXTURE_3D, bgl.GL_TEXTURE_MIN_FILTER, bgl.GL_LINEAR)
bgl.glTexParameteri(bgl.GL_TEXTURE_3D, bgl.GL_TEXTURE_MAG_FILTER, bgl.GL_LINEAR)

def draw(self, context):
    bgl.glBindTexture(bgl.GL_TEXTURE_3D, self.texture[0])

Is there a way to do the same thing with the gpu module in Blender 3.0+?

The gpu module's API for creating textures supports 3D textures.
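For reference, a sketch of what the bgl snippet above could become (assuming Blender 3.0+, where gpu.types.GPUTexture accepts a three-component size for 3D textures; check the format enum in your build, and note that the sampler uniform name below is hypothetical):

```python
import gpu

def make_3d_texture(width, height, depth, images):
    """Create a 3D single-channel float texture from a flat sequence of
    values, replacing the glGenTextures/glTexImage3D pattern above."""
    data = gpu.types.Buffer('FLOAT', width * height * depth, images)
    return gpu.types.GPUTexture(
        (width, height, depth), format='R16F', data=data)

# In the draw callback, bind it to a shader instead of glBindTexture:
#   shader.uniform_sampler("my_volume", texture)  # "my_volume" is illustrative
```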
