Adding Multi Touch support - GHOST

I’m proposing adding multi-touch support to Blender, as I’ve just bought an XP-Pen 16TP and I can see that supporting multi-touch in the same way that Krita does would be very helpful. These multi-touch + pen devices are getting increasingly common, with Wacom, XP-Pen, Lenovo and Microsoft all making devices with both multi-touch and pen input these days.

As far as I can see this work splits nicely into two main components:

  1. Adding support for multi-touch within GHOST, so touch events are available to application developers
  2. Implementing the UX of using touch

My implementation plan is to do 1) first and expose touch events via the DNA event system, so the UX can be prototyped and refined in Python as an add-on (where I’m more comfortable for quick iteration). Then, once “how multi-touch should work in the UX” has been explored, it can be implemented properly.

Currently, I have multi-touch events captured on Windows, and I’m working on tagging touch-sourced cursor-move events so they can be ignored in the viewport in future.
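For context, Windows documents a way to tell whether a synthesized mouse message originated from pen or touch: the message’s extra info carries the `MI_WP_SIGNATURE` marker, with a low bit distinguishing touch from pen. A minimal Python sketch of that check (function names are illustrative, not actual GHOST code; the constants are from Microsoft’s “Distinguishing Pen Input from Mouse and Touch” guidance):

```python
# Sketch: classify whether a Windows mouse message was synthesized from
# pen/touch input, based on the documented MI_WP_SIGNATURE marker.
# Function names here are illustrative, not actual GHOST code.

MI_WP_SIGNATURE = 0xFF515700  # signature placed in the message extra info
SIGNATURE_MASK = 0xFFFFFF00   # upper 24 bits hold the signature
TOUCH_FLAG = 0x80             # set when the source was touch rather than pen


def came_from_pen_or_touch(extra_info: int) -> bool:
    """True if a mouse message carries the pen/touch signature."""
    return (extra_info & SIGNATURE_MASK) == MI_WP_SIGNATURE


def came_from_touch(extra_info: int) -> bool:
    """True if the message was synthesized specifically from touch input."""
    return came_from_pen_or_touch(extra_info) and bool(extra_info & TOUCH_FLAG)
```

In a real GHOST handler the `extra_info` value would come from `GetMessageExtraInfo()` while processing the mouse message; the tag could then travel with the cursor-move event so the viewport can choose to ignore it.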

Looking at the way things have been implemented, I’ve decided that touch events should be a separate GHOST event type, but this is perhaps not ideal, so I’d like input if possible. Whether the touch event type should carry multiple touch points, fire off an event for each touch point, or simply fire off multiple cursor-move events are all choices I feel people should weigh in on.
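To make that design question concrete, the two candidate event shapes could be sketched roughly like this (plain Python stand-ins for illustration, not actual GHOST types):

```python
# Sketch of the two candidate touch-event shapes being discussed.
# These are illustrative Python stand-ins, not actual GHOST types.
from dataclasses import dataclass
from typing import List


@dataclass
class TouchPoint:
    finger_id: int  # stable id for as long as the finger stays down
    x: float
    y: float


# Option A: one event per update carrying every active touch point.
@dataclass
class TouchFrameEvent:
    points: List[TouchPoint]


# Option B: one event per finger per change (down / move / up).
@dataclass
class TouchPointEvent:
    phase: str  # "down", "move" or "up"
    point: TouchPoint
```

Option A lets a gesture recognizer see all fingers at once; Option B is closer to how cursor-move events already flow through GHOST one at a time.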

Thanks in advance,
Giles (New contributor)


That sounds great! – This would really complement the development that is currently happening in the sculpting/painting field.

Speaking from the user side, the interaction methods that I think are most needed are:

touch + pen (this means not interpreting touch as left-click, but only as gestures)
(1-finger-move → rotation) — (2-finger-move → pan) — (2-finger-pinch → zoom)

touch only
(1-finger-move → left-click) — (2-finger-move → rotation) — (2-finger-pinch → zoom) — (3-finger-move → pan)

pen without buttons + keyboard (I already posted this - I apologize for the duplication)
(pen movement in empty spaces of the viewport → rotation) — (additionally pressing a modifier key → pan / zoom)

  • (touch + pen) works really well in nomad-sculpt
  • (pen without buttons + keyboard) works really well in zbrush


For implementing the touch events, I would recommend at least starting with SDL to get them.

SDL is currently an optional dependency for Blender, but I think it would be good to use it here, as then you don’t have to implement touch support on every supported platform yourself.
We will have to discuss making SDL a more central part of Blender with the rest of the developers later, as I think it is the way forward to make it easier for us to maintain good platform support.
I have already had some discussions about this, but we have not come to a concrete conclusion as this would be a very big project.
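For reference, SDL2 reports touch screens through `SDL_FINGERDOWN` / `SDL_FINGERMOTION` / `SDL_FINGERUP` events, whose `SDL_TouchFingerEvent` payload carries coordinates normalized to the 0..1 range of the window, so GHOST would have to convert them to pixels. A pure-Python sketch of that conversion (no actual SDL binding involved; the function name is illustrative):

```python
# Sketch: SDL2's SDL_TouchFingerEvent reports x/y (and dx/dy) normalized
# to the 0..1 range of the window, while GHOST deals in window pixels.
# Pure-Python illustration; no actual SDL binding is used here.

def finger_to_pixels(norm_x: float, norm_y: float,
                     win_w: int, win_h: int) -> tuple:
    """Convert SDL's normalized finger coordinates to window pixels."""
    return (norm_x * win_w, norm_y * win_h)
```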

On the multi-touch side of things it seems like you have a plan of attack already, which is good :)
After you have gotten a solid idea of how you would like multi-touch support to work in Blender, be sure to create a detailed proposal on how and why things should work the way you are envisioning.

DO NOT start any big code rewrites or feature implementations before you have gotten a green light on that proposal. I want to make sure this is clear because we have had multiple people end up very disappointed when their contribution was rejected because the core developers did not think it was a good approach.

To me it feels like you are currently in the brainstorming stage, so I can’t really give any concrete feedback.
However, if you want to live chat, you can hit me up for a quicker turnaround.

Hi ZedDB,

I hadn’t come across the idea of using SDL - from what I understand, it does some of the work that GHOST does?

Where would you suggest I make such a proposal? I’m a bit lost between here, developer. and RCS! I’ve started a proof-of-concept implementation in GHOST for Windows, so there’s definitely some stuff to look at already. It’s definitely not production-ready, but not far off if that’s the implementation approach we decide to take.

There’s definitely more to talk about regarding how the UX would work, but as previously said, I’m mostly focused on getting the base support added first.

Have a look here, multi-touch support for Windows is supposed to land in Blender 3.2: D7660 GHOST: Add support for precision touchpad gestures on Windows

That is just gestures on the touchpad, it does not support touch screens.