Blender 2.8: "Unable to open a display" when rendering in the background (Eevee)

Linux: CentOS 6
Broken: 2.8 94722121e57
Compiler: GCC 6.3

Rendering in the background crashes with this error message: “Unable to open a display”.

The command used:

blender --enable-autoexec -noaudio --background simple_cube.flamenco.blend --render-output ###### --render-frame 1

The scene used:


We don’t currently support headless rendering with Eevee; there must always be a display available, even for background rendering.

Hi Brecht, thanks for the explanation. I use Blender 2.8 over X11, and since the new GUI was added, the View3D with Eevee is black when using the Mesa v17/v18 libraries (llvmpipe, swr). Do you know what changed in OpenGL?


@brecht Will it be possible to render with Eevee headlessly, or through the Python API, in the future? If so, when do you expect that to happen?

The entire OpenGL implementation changed in Blender 2.8. We still have a lot of work to do to test and make things work on various GPUs.

We would like to have it working, but there is no specific date planned for it; it’s not even clear to what extent the OpenGL drivers of the various graphics cards support this well. Note that we are talking here about rendering on a computer without any display connected; background rendering works if there is a display. And Eevee is so fast that you usually don’t need to run it on a render farm; a single desktop computer can render out animations.

Hi @brecht,

Eevee now works with Mesa 17/18 (Intel compiler builds). It was a bug on my side.

Thanks for fixing :)


Hi brecht, just to add one point: Eevee is fast, but nothing is ever fast enough in 3D animation, and we will always need render farms, millions of dollars’ worth of them, even with Eevee. Please don’t underestimate the need for Eevee to be able to render on a farm.

Just an example: I am currently rendering a very light and small short movie with Eevee, and I hit the limits pretty fast. I would really like to use my $100k render farm, which is sitting idle while I struggle to render everything on my multi-Titan-Xp GPU workstation (and only one of those GPUs works with Eevee, since it doesn’t support multi-GPU either). In short, my movie could have been rendered in 1 hour, but I have already spent 48 hours trying to render it on 2 workstations, and it’s still not finished. I also have workstations without any monitor, and because of that I can’t render with them in Eevee, so I will need to find some cheap broken monitors to plug in just so the workstations and GPU render-farm nodes can render ;).

Thanks for your work. Other than that, Eevee is fantastic and has huge potential. I had so much fun playing with light, DoF, bloom, and fog!

If only there were a way to fake a monitor by plugging some kind of device into the GPU port, it would at least allow Eevee to render. How does Eevee choose the GPU? What exactly tells Eevee “this GPU has a monitor plugged into one of its ports, you can use it to render” or “there is no monitor, so you are not allowed to render”? If we knew, we could maybe fake it…
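For what it’s worth, on Linux Blender doesn’t detect a monitor directly: it connects to the X server named by the DISPLAY environment variable and creates an OpenGL context through GLX, so what matters is a running X server with working GL, not a physical screen. A quick way to check what a given display exposes (a sketch; the display number :0 is an assumption, use whatever your X server runs on):

```shell
# Query the OpenGL driver that the X server on display :0 exposes.
# If this prints a vendor/renderer/version without errors, Blender
# should be able to create a GL context on that display too.
DISPLAY=:0 glxinfo | grep -E "OpenGL (vendor|renderer|version)"
```

This is also why the dummy-plug trick works: it convinces the driver to bring up a display output, which lets an X server start.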

Maybe this could work:

I found these ready-made dummy plugs; I will try them!

Hi Strob,

I have not had time to try it, but you could try xvfb + Mesa with llvmpipe/swr.


@brecht Any updates on headless rendering with EEVEE?

No news. We are in the 2.80 beta phase, not focusing on new features until 2.80 is released.

Rendering via xvfb works, but it is super slow!
Demo command:
/usr/bin/xvfb-run -n 0 /opt/blender-2.80/blender -b eevee_render.blend -o suzanne.png -f 1

Using blender-2.80-e57ee5934a30-linux-glibc224-x86_64 on a Google VM instance running Ubuntu 18.04 LTS


Did you try Mesa 18 with llvmpipe, with the llvmpipe thread count set to maximum?
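If it helps, llvmpipe’s worker-thread count can be raised via Mesa’s LP_NUM_THREADS environment variable. A minimal sketch, reusing the demo command from above (the Blender path and .blend filename are assumptions):

```shell
# Let Mesa's llvmpipe software rasterizer use all CPU cores.
export LP_NUM_THREADS=$(nproc)

# -a makes xvfb-run pick a free display number automatically.
xvfb-run -a /opt/blender-2.80/blender -b eevee_render.blend -o suzanne.png -f 1
```

By default llvmpipe already spawns several threads, so this mainly matters on machines with many cores; it won’t get anywhere near GPU speed either way.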


Hey Milan,
is this working for you?
I can run Eevee via xvfb on an AWS GPU instance, but yeah, it’s slow: 50 s when it should take about 2 s.



I managed to get Eevee running and rendering fast on a Debian Google Cloud instance with a GPU (P100), so I guess it should be possible on AWS as well.

First I installed the NVIDIA drivers, then started Xorg, and it worked.

Just make sure glxinfo doesn’t report any errors (it checks that OpenGL is working properly).
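The steps above can be sketched roughly like this (a sketch, not a definitive recipe: the display number, .blend filename, and output path are assumptions, and driver installation differs per distribution):

```shell
# Generate an xorg.conf that allows X to start with no monitor attached
# (requires the proprietary NVIDIA driver to be installed already).
sudo nvidia-xconfig --allow-empty-initial-configuration

# Start a bare X server on display :0; no window manager is needed.
sudo X :0 &

# Point Blender at that display and render frame 1 in the background.
DISPLAY=:0 blender -b eevee_render.blend -o //render_#### -f 1
```

Running glxinfo with DISPLAY=:0 first is a good sanity check that the GPU-accelerated GL context actually comes up before starting a long render.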