Drawing to GPUOffScreen from within an operator seems to freeze Blender until the 3D view is redrawn?

Hi All,

I’ve been developing MeasureIt-ARCH, an extension / alteration of Antonio Vazquez’s MeasureIt addon and have come up against a bit of a bizarre issue that I could use some help with.

After using the ‘Render MeasureIt-ARCH Image’ operator, which draws MeasureIt-ARCH elements to an off-screen buffer and then reads the buffer into a new image data block, Blender appears to freeze. The window can’t be closed, most things become unresponsive, and dragging window bounds creates drawing artifacts. Here’s a gif of the behavior.

As soon as the 3D view area is redrawn, however, Blender’s functionality returns to normal. The code that seems to be responsible for the issue is as follows.

import bpy
import bgl
import gpu


def render_main(self, context, animation=False):

    scene = context.scene
    sceneProps = scene.MeasureItArchProps
    sceneProps.is_render_draw = True

    clipdepth = context.scene.camera.data.clip_end
    objlist = context.view_layer.objects

    # Get resolution
    render_scale = scene.render.resolution_percentage / 100
    width = int(scene.render.resolution_x * render_scale)
    height = int(scene.render.resolution_y * render_scale)

    # Draw all lines to the offscreen buffer
    renderoffscreen = gpu.types.GPUOffScreen(width, height)

    view_matrix_3d = scene.camera.matrix_world.inverted()
    projection_matrix = scene.camera.calc_matrix_camera(
        context.view_layer.depsgraph, x=width, y=height)

    print("rendering offscreen")
    with renderoffscreen.bind(save=True):

        print("setting gl props")
        # Clear depth buffer, set clear depth to the camera's clip distance

        print("loading matrix")

        # Draw scene for the depth buffer
        # Note: the issue persists even if the draw_scene() function is not called
        draw_scene(self, context, projection_matrix)

        # Clear color buffer, we only need the depth info

        # -----------------------------
        # Loop to draw all objects
        # <I've excluded the code that goes here, since the issue occurs
        # regardless of whether the actual drawing code runs or not>
        # -----------------------------
        print("reading offscreen")
        buffer = bgl.Buffer(bgl.GL_BYTE, width * height * 4)
        bgl.glReadPixels(0, 0, width, height, bgl.GL_RGBA, bgl.GL_UNSIGNED_BYTE, buffer)

    # -----------------------------
    # Create image
    # -----------------------------
    image_name = "measureit_arch_output"
    if image_name not in bpy.data.images:
        bpy.data.images.new(image_name, width, height)

    print("writing buffer to image")
    image = bpy.data.images[image_name]
    image.scale(width, height)
    image.pixels = [v / 255 for v in buffer]

As far as I can tell, the render code is quite similar to Antonio’s, although I’m using a different view and projection matrix and making use of the off-screen depth buffer.

I’ve tried running with GPU debugging turned on, but there don’t seem to be any errors that consistently occur around use of the rendering operator.

Also, 2 of the 3 users reporting this issue happen to be running an NVIDIA GTX 1050, in case that might be related.

If anyone has any insight into what might be causing this, or where to look to start debugging more effectively I’d really appreciate it!


If anyone stumbles across this later: it turns out the issue had nothing to do with GPUOffScreen, but was happening because I wasn’t disabling GL_DEPTH_TEST after drawing.
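For completeness, here is a minimal sketch of the fix as I understand it (the helper name and exact placement are my own illustration, not the actual MeasureIt-ARCH code): make sure GL_DEPTH_TEST is disabled again before the operator returns, so the altered GL state isn’t left behind until the next 3D view redraw resets it.

```python
import bgl
import gpu


def draw_to_offscreen(width, height, draw_callback):
    # Hypothetical wrapper illustrating the fix: enable depth testing only
    # for the duration of the offscreen draw, and always disable it again.
    offscreen = gpu.types.GPUOffScreen(width, height)
    with offscreen.bind(save=True):
        bgl.glEnable(bgl.GL_DEPTH_TEST)   # needed while drawing with depth
        try:
            draw_callback()
        finally:
            # The crucial part: leaving GL_DEPTH_TEST enabled here is what
            # made the UI appear frozen until the 3D view was redrawn.
            bgl.glDisable(bgl.GL_DEPTH_TEST)
    offscreen.free()
```

This only runs inside Blender, but the pattern is the general one: restore any GL state you change inside `bind()` before handing control back to the UI.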