2024-03-18 Tablet Support: UX and other implications

Attendees:

  • Dalai Felinto
  • Francesco Siddi

This meeting was an initial exploration of what needs to be discussed to design first-class support for tablets in Blender. This encompasses both graphic tablets (Wacom Bamboo, Wacom Cintiq, …) and hand-held tablets (such as the iPad, …).

The main input methods considered were:

  • Touch
  • Pen

Initial considerations

  • Most interfaces could be considered tablet-ready (better widgets can always help).
    • However, any editor which relies heavily on shortcuts needs attention.
  • Blender's flexible layout system allows various workflows/workspaces.
    • Blender could/should ship with workspaces optimized for tablet.
  • Blender relies on modes for giving access to tools.
  • Challenge: Blender has a “One hand on mouse, one hand on the keyboard” design. Artists already use Blender with a graphic tablet, but always need a keyboard to perform basic operations.
  • Challenge: Let’s think in terms of universal design!
  • Challenge: Blender was designed for a single active input device.

Tablet (and pen) input solution

  • Create custom toolbars (similar to Quick Favourites); see the sketch after this list.
    • Tools stored there can be activated by tap or tap and hold.
  • Identify common patterns (“Add” for all entities, etc.) and expose them more prominently in default toolbars.
  • Identify common gestures used elsewhere and provide default input mapping for them (e.g. 2-3 finger tap → undo-redo, pen tap → switch tool, etc).
  • Context menus callable with gestures, multi touch, or simply via small widgets that appear after selection.
  • Allow “tweaking” of tools as they run (e.g. constraining transform on single axis).

Topics and questions to discuss

  • The use of text-based, nested menus (as opposed to icon-based).
    • In Blender, the functionality of each editor is consistently advertised through menus, which often contain nested options.
    • Most tablet app designs rely on visual cues/icons, and resort to popovers when dealing with many options.
    • How could Blender deal with this?
  • Design patterns that combine pen and gestures.
  • Assigning exclusive tools/behaviors to pen vs. fingers.
  • Thoughts on pie menus.

Other considerations as we explore this

  • Various screen sizes and orientations.
  • Other devices (VR/AR, mobile, etc).
27 Likes

I do everything with a tablet pen instead of a mouse.
Blender also needs alternatives to shortcuts that use the mouse wheel.

6 Likes

@mhjang Yes! I've been advocating for that for a very long time, and although some voids have been filled, some remain. Off the top of my head, I can think of a few situations where the mouse wheel has no easy alternative: the transform operator's proportional size (in all relevant modes, but especially painful in mesh edit mode), and navigating popovers and menus (also possible with arrow keys, but tedious).

That led me to get a macropad with rotary knobs to emulate a mouse wheel. It works great, but it shouldn't be required just to use basic functionality as a tablet user, especially when health reasons make switching to a mouse difficult, even temporarily.

I don’t own a “screen tablet” so I won’t have great insights into what is necessary to get Blender workable on such devices. However, if this project leads to more customizability (especially the mention of custom toolbars), color me very excited for it.
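For what it's worth, some wheel-bound actions can already be given keyboard alternatives through the add-on keymap API. Below is a minimal sketch that binds view zoom to Page Up/Down; the keys and the zoom operator are arbitrary example choices.

```python
# Minimal sketch: keyboard alternatives for one wheel-bound action (view zoom),
# registered in the add-on keyconfig. Key choices are arbitrary examples.
import bpy


def register_wheel_alternatives():
    kc = bpy.context.window_manager.keyconfigs.addon
    if kc is None:  # e.g. when Blender runs in background mode
        return
    km = kc.keymaps.new(name="3D View", space_type='VIEW_3D')
    kmi = km.keymap_items.new("view3d.zoom", type='PAGE_UP', value='PRESS')
    kmi.properties.delta = 1   # zoom in, like one wheel step
    kmi = km.keymap_items.new("view3d.zoom", type='PAGE_DOWN', value='PRESS')
    kmi.properties.delta = -1  # zoom out


register_wheel_alternatives()
```

The modal cases above, such as adjusting the proportional size during a transform, live in modal keymaps and are much harder to reach this way, which is where the real gap seems to be.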

11 Likes

Personally, I have never had any difficulties using Blender with a pen and tablet.
With only one major exception: there are a few places in Blender where a list can only be scrolled with the mouse wheel.

Blender has a very neat feature of scrolling panels by pressing and dragging the middle button; there are only a few places where that functionality is missing.

4 Likes

I would consider improved Tablet/Pen input to be a wildly different problem than Touch Interfaces.

We handle pen input fairly well now, although that could certainly be improved. But that is simply adding a new device to supplement the user’s existing devices. Touch is an interface that most often replaces other devices and has limitations that greatly affect our design.

The first issue is that the minimum hit size needed to accommodate finger presses is substantially larger than our current minimum feature sizes. Although much of our interface could just be "made bigger", we have a very grid-like design with fixed and consistent line heights, so it is difficult to have text and icon sizes that differ wildly from each other.
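To put rough numbers on that (the ~9 mm target is a common platform guideline and the ~20 px row height is an approximation, so treat both as assumptions):

```python
# Rough comparison of a common minimum touch-target size against an
# approximate default widget row height. Both numbers are assumptions.
def mm_to_px(mm: float, dpi: float) -> float:
    """Convert a physical size in millimetres to pixels at a given display DPI."""
    return mm / 25.4 * dpi


for dpi in (96, 144, 264):             # desktop monitor, hi-dpi laptop, tablet
    touch_target = mm_to_px(9.0, dpi)  # ~9 mm guideline minimum touch target
    widget_row = 20 * dpi / 96         # ~20 px row at 1x UI scale, scaled with DPI
    print(f"{dpi} dpi: touch target ~{touch_target:.0f} px vs. row ~{widget_row:.0f} px")
```

Whatever the DPI, the touch target comes out roughly 70% taller than the default row, so touch sizing is not a small tweak to the grid.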

Where this becomes a difficult problem is in all of our dropdown menus. It isn’t just that the initial trigger area has to be finger-sized, but that each item in the resulting menu has to be as well. We’d just run out of screen space if we need each item to be that large. This is why touch designs will usually bring up larger, complex, and more persistent (not disappearing when you move your mouse) areas, rather than use menu item lists. An example of doing this in Blender would be for us to add optional overlapping areas to the outer design. So press “File” (or folder icon on a toolbar) and an area would slide out from the left edge (overlapping all areas) with items that are related to the file system. The top might have a row of icons for each template type, recent items might be in a collapsed section, some items might bring up popovers, etc. Press “Edit” and a different panel of options slide out.

The bigger limitation is that touch lacks a "tracking state", in that it does not have a persistent location when you are not actively interacting. Our design features multiple areas with only one being "active", defined as the one that currently contains the mouse pointer. Without any device with a tracking state, we lack a way of indicating the active area. For keyboard shortcuts to continue to work when your hands are not on the screen, we would need to make area activation "sticky". Doing so would naturally also require a stronger visual indication of the currently active area. This would probably mean adding "activation via interaction", with the challenge of doing so without breaking the habits of people using current devices like mice and pens. Without such area-activation changes it is difficult to add touch-friendly toolbars that cause actions outside of their own area, since the target area is ambiguous.

Our current design of multiple non-overlapping areas is also harder to use with touch. Just resizing areas would be troublesome. Allowing area maintenance (joining, splitting, tearing off) using docking gestures from each header might help. But I'd guess we'd ultimately also want to add the option of using area tabs, so touch users could choose to work with fewer areas simultaneously.

14 Likes

I'm in pain too, so I'm using a keyboard with knobs as an alternative.
Developing a touchscreen UX is certainly beneficial, but it also seems necessary to evaluate the UX for pen tablet devices in a PC environment. In particular, an alternative to the mouse wheel is essential.

1 Like

While we handle screen-less tablets well as a replacement for the mouse, we handle graphic tablets (like the Cintiq) rather poorly as far as workflow goes.

It is not practical to use the keyboard when using graphic tablets, and the pen-only workflow is a hindrance.

1 Like

What do you mean? I'm a full-time Cintiq user, and even though I usually prefer to use the mouse while in Blender and treat the Cintiq as a regular monitor (unless doing textures, sculpting, or Grease Pencil), it has nothing to do with the keyboard in my case.

The most frustrating thing that immediately comes to mind when thinking about Cintiq/Blender issues is what others have pointed out here: the lack of a mouse wheel substitute when working with a pen.

Things like setting the proportional editing size without the mouse wheel are… kind of impossible? You can adjust the slider in the popover and then try it, then adjust it again because it wasn't the right amount… but it's a PITA and just not practical, so I drop the pen, grab the mouse, and use the wheel instead.

I honestly don't know how to solve it, but it's the kind of interaction that really bugs me when using Blender with the Cintiq.

9 Likes

The pen/keyboard combo isn't that much of a problem, actually. At least for me, so this is just my perspective. The things that really annoy me are:

  • like @Hadriscus already mentioned, modifier inputs that are only reachable through mouse gestures
  • navigation with Emulate 3 Button Mouse blocks other things, like the Alt-Enter multi-selection modifier, even though I find it far superior as a pen navigation style to the MMB-click solution in the native Blender layout.
  • and the biggest one for me: scroll lists with only a tiny scrollbar on the side. With tablet input, I'd love for these to simply be scrollable like a list on a smartphone with touch input.

Otherwise, though, it's really cool to see any improvement in this area. :heart:

4 Likes