I just got a MacBook Pro a few weeks ago and quickly discovered that Blender has no Touch Bar implementation at all.
So it ends up as wasted space that could be used to improve either workflow efficiency or UX.
I already posted on Right-Click Select with some propositions for possible implementations, and I've already started working on it.
There are half a billion ways I could implement this touchbar, and I have no experience in UX, so a bit of guidance would be awesome.
Things I see you would need to do:
- figure out what workspace is active
- figure out what space is active
- show buttons for some operators in the touchbar
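The steps above could be sketched as a simple lookup from the active space type to a list of operator buttons. This is a hypothetical sketch, not Blender code: the space identifiers mirror Blender's editor types, but the operator lists are invented examples, not a real mapping.

```python
# Hypothetical: which operator buttons to show on the Touch Bar,
# keyed by the active editor (space) type. Operator lists are examples only.
TOUCHBAR_BUTTONS = {
    "VIEW_3D": ["transform.translate", "transform.rotate", "transform.resize"],
    "IMAGE_EDITOR": ["image.save", "image.reload"],
    "SEQUENCE_EDITOR": ["sequencer.split", "screen.animation_play"],
}

def buttons_for_space(space_type):
    """Return the operator idnames to show for the active space."""
    # Fall back to a generic set when the space has no specific mapping.
    return TOUCHBAR_BUTTONS.get(space_type, ["screen.animation_play"])
```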
I don’t know the touchbar myself (I really only use the display brightness and audio volume part of it), so I can’t really help much there. But things to figure out:
- how to put items there
- figure out what kind of events you get through it
To have this work well you’ll be poking around in the GHOST code of Blender. This is the layer that sits between Blender and the operating system: it translates OS-level input into events Blender understands.
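Conceptually, that translation step looks something like the sketch below. Real GHOST is C++ (with Objective-C++ on macOS), and all the names here — the event classes and the item-to-event mapping — are invented for illustration, not actual GHOST types:

```python
# Conceptual sketch of the GHOST layer's job: raw OS event in,
# Blender-understandable event out. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class OSTouchBarEvent:
    item_id: str       # identifier of the Touch Bar item that fired

@dataclass
class BlenderEvent:
    type: str          # e.g. an event/operator Blender knows how to handle

# Invented mapping from Touch Bar item ids to Blender event types.
ITEM_TO_EVENT = {
    "tb.play": "ANIM_PLAY",
    "tb.render": "RENDER_STILL",
}

def translate(os_event):
    """GHOST-style translation: OS event in, Blender event out (or None)."""
    event_type = ITEM_TO_EVENT.get(os_event.item_id)
    return BlenderEvent(event_type) if event_type else None
```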
I can see some value in mirroring the toolbar in the touch bar, since it is designed to show context relevant controls and would save lots of cursor movement up to the top of the screen and back.
Like I said (and tweeted), the GHOST part is what I’ve already worked on: I can push a button that does simple stuff (such as logging to the console). Now I need to bind it to the interface.
Though I have a question: why don’t you use it? Which features would you want on it to make it useful enough to start using it? (That could help me make something efficient in Blender.)
I personally find it very convenient when using Xcode (with all the debug shortcuts on it) and when watching movies (I can hide the whole interface since the controls are shown on the bar); predictive input/emoji when chatting also feels very cool, among other things.
I don’t use it because it destroys hand orientation on the keyboard. Further, each application does its own thing which forces me to look down from the screen onto my keyboard. Those are needless distractions. That said, no doubt there are people who are fluent in using it, just not me (:
I can’t really say what I’d want to see in the touchbar. But perhaps the most obvious things would be: minimizing blender, fullscreen toggle of Blender, fullscreen toggle of space, start render, start animation render.
But I guess it’d be even better if these were configurable through the regular preferences screen. Say you expose 4 or 5 buttons with their Blender events; then you’d just have to make sure those events are configurable in the preferences for any space. Then it becomes a matter of having some code tell the touchbar what to show whenever the active space changes. Or something like that.
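That configurable-slots idea can be sketched as a small registry: a fixed number of slots the user binds to Blender events per space type, refreshed whenever the active space changes. All class, method, and event names below are hypothetical, not Blender API:

```python
# Hypothetical registry for user-configurable Touch Bar slots.
N_SLOTS = 5  # "4 or 5 buttons" as suggested above

class TouchBarConfig:
    def __init__(self):
        # per-space bindings: space type -> list of up to N_SLOTS event names
        self._bindings = {}

    def bind(self, space_type, slot, event):
        """Bind a Blender event name to a slot for the given space type."""
        if not 0 <= slot < N_SLOTS:
            raise IndexError("slot out of range")
        slots = self._bindings.setdefault(space_type, [None] * N_SLOTS)
        slots[slot] = event

    def on_space_change(self, space_type):
        """What to show on the bar when the active space changes."""
        return self._bindings.get(space_type, [None] * N_SLOTS)

cfg = TouchBarConfig()
cfg.bind("VIEW_3D", 0, "RENDER_STILL")
cfg.bind("VIEW_3D", 1, "ANIM_PLAY")
```

Unbound slots stay `None`, so the bar can simply leave those positions empty for spaces the user hasn’t configured.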
I don’t use the touchbar either.
But maybe I would if it had “numpad” keys on it, so that we wouldn’t need to emulate it.
Numpad keys, that is a great suggestion!
View navigation is a great idea; also things that slide. The Touch Bar’s specialty is that it can provide sliding interfaces (volume and brightness, video scrubbing, and photo scrubbing are examples). A timeline slider might be useful, for example, etc. It would be great if it was configurable, and if it was mode-specific.
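A timeline slider of the kind described above boils down to mapping a normalized touch position along the bar onto the scene’s frame range. The function below is a minimal sketch with invented names and example frame ranges, not real Blender or Touch Bar API:

```python
# Hypothetical: map a normalized Touch Bar slider position (0.0-1.0)
# onto a scene frame range.
def slider_to_frame(position, frame_start, frame_end):
    position = min(max(position, 0.0), 1.0)  # clamp to the bar's extent
    return frame_start + round(position * (frame_end - frame_start))
```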
Do we know what other programs do? Maya and C4D are the other two big DCC programs with mac support.
Useful or not, with all the usability problems and UI needs Blender already has, developing a touchbar feature that can only benefit a fraction of Mac users, instead of all Blender users…
This argument doesn’t work: why work on OptiX when not all Blender users have an NVIDIA card, why work on Linux or Mac ports when not all Blender users run those OSes, why work on tablet pressure when not all Blender users have one, and the list goes on…
On the contrary, I think it would be great for every single Blender user to be able to take full advantage of their machine, including its specific features. The touchbar is an awesome usability feature that brings both efficiency (it works like a keyboard shortcut) and a smooth learning curve (it displays icons directly), with a few additional perks: you can slide (useful for the timeline or precise float adjustment), display previews, and more.
In fact, there is almost one workflow and usage per Blender user, so there is no “for all Blender users”; there are just usability features that some will love and some won’t even notice.
I can’t use OptiX and I’m fine with that, but I have an awesome touchbar that I intend to use as much as I can.
Oh, and for now I’m the only one working on this implementation (AFAIK), and I’m not even part of the official Blender dev team, so it isn’t even taking time from the dev team.
I think you’re mixing common user features that exist on all platforms (graphics cards, tablets) with features that exist not even across one whole platform, but only on a small subset of devices on that platform (the touchbar doesn’t exist on all Macs, not even all MacBooks), and that isn’t a reference point for anything in the professional world (I have a MacBook Pro).
And personally, spending development time improving the experience of the… 2–3%? of users (being very generous), when 100% of users have to suffer interface problems such as baking, dysfunctional menus, icons, an unfinished Sidebar, sculpt… Well, I think it would be a bad decision.
The usefulness of the support is irrelevant in this conversation, as the market penetration of a piece of hardware or software is not really an argument in itself. There is just efficiency, and hardware/software to support. I want more efficiency from my touchbar, therefore I’m implementing it. Besides, the topic is not “if” but “how”.
Though the point about touch controls gave me the idea of looking at this touch bar like any touch controller with display capability. I could generalize a UX for anything with touch input, whether it has a display (Wacom Cintiq, iPad Pro with Sidecar, MBP touchbar, Iiyama touchscreens, etc.) or not (multitouch trackpads). That way the touchbar would be my base case, but it would benefit more hardware in the long run.
I think I’m gonna try to display a workspace header in it and see where I go from there.
I’m not a developer, just a curious PC user.
What is that bar internally, for macOS? An extra slim second monitor? Or a bunch of little LEDs with a touch layer on top, to display whatever you want and react to it?
imho, the touchbar is a gimmick at best, and it’s just Apple being Apple.
Blender devs should focus on the 98%, i.e. PC users on Windows and/or Linux, and on OpenGL & Vulkan, not Apple’s way of working.
The time and money at Blender Foundation for devs to work on Blender is not infinite, and spending it on things that only a handful of people -might- use is just not worth it.
I’d love to see the touchbar better integrated. I could imagine a timeline, zoom, or the quick menu with favourite commands; in Edit Mode, switching between vertex, edge and polygon select, as well as some basic mesh editing commands. Of course, the ultimate goal would be to let users configure their touchbars within Blender as needed.
Except that in reality, NVIDIA GPUs are the standard for 3D modelling workstations and the only real option for anyone who wants to work seriously in that field.
Did you really just compare OS support to the MacBook Pro touchbar?
You mean something that is essential for sculpting?
Those features are must-haves to compete with other packages; the MacBook Pro touch bar isn’t.
Could also be used for:
- switching tools & brushes
- scrubbing timeline (a la FCPX touchbar)
When other platforms adopt touch bars, we can port the implementation.
If you look at other technologies like color displays, sound, mouse support, support for multiple fonts, HiDPI, etc., none of those used to exist on PCs either. If we were constrained by the slow progress of PCs, we’d never get support for modern input and display technologies.