Wayland and EGL support


I am looking into implementing support for the Wayland protocol (https://github.com/christianrauch/blender/tree/wayland) and have the basic EGL context creation and event handling ready.

@ideasman42, @brecht Is this something you would be interested in?

There is an alternative blender-wayland project, by Wander Lairson Costa, that implements Wayland support via the deprecated wl_shell interface for a much older blender version. I wasn’t able to rebase this on the current master because of many changes in the context creation API.
I eventually re-wrote the Wayland support from scratch, with inspirations from this project, using the modern xdg_shell protocol.


I had to make some changes to the GHOST_ContextEGL class to get it to compile. As far as I can tell, this class is not actively used in the blender codebase (and compiling with -DWITH_GL_EGL will indeed fail).

I still encounter issues using the EGL context; specifically, I am unable to switch contexts between windows, e.g. when opening and closing the file menu I get EGL_BAD_DISPLAY: An EGLDisplay argument does not name a valid EGL display connection. Since there is no working usage of that context class, I am not sure at this point whether this is caused by the handling of the context on the GHOST-Wayland side, or whether it is missing functionality in the GHOST_ContextEGL class.

Is the EGL context in Blender fully supported? How would I make sure that the context gets properly managed when constructing and destroying child windows?


Heh, I’ve also been working on the same thing :slight_smile: : GitHub - julianeisel/blender at wayland_experimental.

Part of the reason I did this was to familiarize myself with the ins and outs of Wayland. It should be really useful to have a maintainer with in-depth knowledge of this around. Currently I think there’s a lack of active maintainers for this part of GHOST; introducing Wayland may be a good opportunity to change that.

Differences in our implementations

I see some minor differences in our implementations (you use pkg-config for setting up libs, I use our usual find_package(); you changed EGLEW to GLEW, I just fixed our current EGL implementation; you allow X11 as a runtime fallback, I added an either-X11-or-Wayland compile-time option - we’ll likely need smarter runtime loading anyway, but that might be a bit more complex and should probably be done as a separate step).
You’re also a bit further: you already added input handling and added xdg_shell usage. I stopped while working on the latter.
Basically, you can launch and interact with Blender, I’m still at the crash-on-startup phase :slight_smile:

Here comes the big “however”:
Before spending a lot of time on this, we really should ask ourselves: To SDL or not to SDL?

Not only does SDL already support Wayland, it also has far more mature features compared to our GHOST code, e.g. much superior keyboard-layout support. It may solve our problems with maintaining platform-dependent GHOST code - SDL does the job for us. I personally don’t have experience with SDL, but @ZedDB has and recommended giving it another look.
While there are certain benefits to having our own solution, I think by now they don’t hold up as much anymore. We may get more benefits with less work than adding an entire Wayland implementation.


Platform Abstraction

Of course, if Blender decides to replace GHOST with some other platform-abstraction layer, be it SDL or GLFW, then there is no point in implementing platform-specific layers. However, if the Blender developers decide to go this way, it should be done for all platforms, not just to gain Wayland support. In that case, I think GLFW would be the better choice as a platform-abstraction layer, since it specialises in window/context and input handling only, without additional drawing functionality.

While the benefit of this would be the reduction in maintenance costs for these GHOST wrappers, you lose the ability to implement platform-specific user interaction, e.g. showing notifications, or showing the progress of a background process in a dock icon.

In any case, if the Blender developers decide to replace GHOST with a third-party platform-abstraction layer in the short term (within the next 2 years), I would not continue these native Wayland efforts; otherwise, I would.


Nice. Since you seem to have more experience with EGL context creation, I would be very happy if you could help me with it.

I used pkg-config since it works for all the libraries that I depend on. There are some third-party CMake modules to find and use the Wayland libraries, but I would need to find yet another CMake script for xkbcommon.

If this is the better choice, I would be very happy if you could contribute to my wayland branch.

A runtime selection between X11 and Wayland, as done for example in GTK, Qt or Firefox, allows using the same binary on different desktop sessions. However, I don’t know if there is a better approach to this than testing whether a connection is possible and failing otherwise (with an exception in this case, since the connection happens during class construction).

The question remains if this contribution would be useful for Blender, or if there are plans to drop the GHOST layer in the near future.

Is this true? I would have expected that such functionality would still be just as available as if not using an abstraction, but would (obviously) need to be implemented per platform.

I think so. How else would you implement the Wayland protocols at https://gitlab.freedesktop.org/wayland/wayland-protocols? I think there are limited ways to access the OS-specific window handles in SDL or GLFW, but I am not aware of how you would extend the functionality available in those platform-abstraction layers. After all, their purpose is to abstract from platform-specific APIs.

Let’s say you want to make use of the text_input_unstable_v3 protocol which provides methods to input text by means of other sources than a physical keyboard, e.g. an on-screen keyboard. To use this in Blender, you would need to wait until it has been implemented in your platform-abstraction layer of choice, or you might implement it yourself and send patches upstream to the platform-abstraction project and hope that they will be accepted.

It is not guaranteed that upstream platform-abstraction projects will implement all the features you need. With “platform-abstraction” I specifically have projects in mind that only care about the abstraction of the window and input handling. An alternative to those would be the use of toolkits like Qt and GTK, which also support GL context creation and more likely will implement Wayland protocol or desktop application features in general.

That’s my personal opinion, but I think that depending on a platform-abstraction layer while still requiring different implementations per platform is not a good choice. I would either use platform abstraction to not care about the underlying platform at all, or provide manual platform bindings to make the most of platform-specific features.

Years back there was a Wayland GHOST port someone had started; I looked into it and IIRC managed to get basic tests working. Just to note, it’s on our radar and there is a GHOST/Wayland port that could be updated if we wanted to. Edit - linked here.

On the other hand there is Wayland/X11 and Wayland/SDL, which are both functional, even if not ideal.

I think @julianeisel is referring to the existing WITH_GHOST_SDL support, which compiles Blender’s GHOST using SDL for windowing operations.

The Haiku version of Blender currently uses SDL.

The advantage with SDL is we already have support for it in master, and we can use it without any overhead until we choose to take advantage of Wayland-specific features.

It may even be the best use of time to stick with SDL until Blender moves to Vulkan, then evaluate how Vulkan+Wayland integration should be implemented.

It’s true enough; in some cases you could mix in platform-specific calls, which could be OK in limited situations but would likely get out of hand for more complex functionality.

This is a fork of the repository that I linked in my initial post. I wasn’t able to rebase it on current master; it uses deprecated interfaces and is missing functionality like custom cursors.

Yes, you can use the X11 version of Blender via Xwayland on any Wayland session. However, the aim of this work is to provide native Wayland communication with all its benefits :slight_smile:

I am aware of the GHOST-SDL bindings. However, they are not active by default, and even if you activate them manually, SDL does not use Wayland by default. To use Blender on Wayland, you have to compile it manually with GHOST_SystemSDL and set the corresponding environment variable. And if you go down that path, i.e. if GHOST_SystemSDL is promoted as the platform abstraction, it should be made the default and replace GHOST_SystemX11, GHOST_SystemWin32, GHOST_SystemCocoa, etc.
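For anyone wanting to try that route, the steps look roughly like the following (WITH_GHOST_SDL is the real CMake option; SDL_VIDEODRIVER is SDL's standard video-driver selector; paths are assumptions about a typical Linux build tree):

```shell
# Configure GHOST to use SDL instead of the native X11 backend
cmake -DWITH_GHOST_SDL=ON ..
make

# Ask SDL to use its Wayland video driver at runtime
SDL_VIDEODRIVER=wayland ./bin/blender
```

Without the environment variable, SDL typically falls back to its X11 driver even inside a Wayland session, which is the "not using Wayland by default" behaviour mentioned above.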

Since Wayland is independent of OpenGL or Vulkan, not much apart from the context creation would change. The majority of window and input handling, and the additional protocols, will stay the same. So once you switch from OpenGL to Vulkan, it’s worth either adapting the context creation in your manually maintained GHOST layer or waiting until SDL adopts Vulkan.

I actually did not intend to discuss too much about platform-abstracting alternatives to the custom GHOST layer. My reasons to go with a native Wayland implementation over SDL are mainly that

  1. GHOST_SystemSDL is not the default on Linux/Windows/macOS
  2. SDL is not using Wayland by default
  3. SDL, compared to a toolkit like Qt or GTK, will likely not implement all functionality that may be required by desktop applications, e.g. on-screen keyboards and touchpad gestures

But if the Blender developers decide that SDL will replace all custom platform-specific GHOST layers, thus making GHOST obsolete, and that Wayland patches will not be accepted, then there is no point for me (or anyone else) to continue this work.

Long term I think we will want native Wayland support, SDL/X11 can be a stop-gap.

While I’m interested in this, I’m not planning on spending development time on this, although I could help a bit with reviewing the branch or finalizing it once it’s working on a basic level.

As you’ve found, updating this branch is a significant project; if you are interested in working on it, that’s great - but there are all sorts of issues to investigate.

As far as I know this was more a test.

The Wayland support for UI interaction and EGL context creation has been implemented in https://developer.blender.org/D6567. Everything related to EGL context creation (onscreen / offscreen) and keyboard + mouse interaction with the UI (except mouse wrap) is supposed to work. It is ready for general testing (CMake option -DWITH_GHOST_WAYLAND=ON).

Some features, like setting the cursor position or accessing absolute screen coordinates, are not available on Wayland by design. Some other features, such as drag&drop / copy&paste, window decorations and relative mouse movements, haven’t been implemented yet. I intend to do this after the basic Wayland support has been merged.

I would be happy to get some feedback on this (@ideasman42, @julianeisel, @brecht).


I’ve tested the Blender build from the wayland branch. It’s working very well. I am running on Sway WM without XWayland.

A few things I noticed: the mouse cursor does not follow mouse movement when drawing with the 3D cursor, and I had a few crashes which are hard to reproduce.

I created a package, blender-wayland-git, on the Arch User Repository if anyone is interested.

Great work @christian.rauch
