Hi there, fairly new Blender user, but seasoned cross-platform open-source programmer here.
I started tinkering with support for MIDI input in Blender. The GHOST part was fairly trivial, but of course the devil is in the details of how it should show up in the UI and so on. The code in, for instance, source/blender/windowmanager/intern is much harder to understand than that in intern/ghost.
For now I have implemented it only on macOS, but I also have Windows and Linux machines and know how to code for those platforms as well.
But before I proceed any further, so as not to waste my time, it's best to ask whether it is worth it at all… would such work eventually be accepted into the repository?
The intended use case for this work is not to be able to “play” Blender in real time using MIDI instruments (although, if/once this is in, somebody will possibly do that too; I see that people have done things like that using the existing Python-based MIDI add-ons).
The intended use case is to use MIDI controllers with rotary knobs, sliders, and to some extent buttons (possibly with “aftertouch”, i.e. pressure sensitivity) as an input source, conceptually parallel to keyboard and mouse. I don’t have any exact use case in mind yet, but I could imagine that, for instance, if you have a controller with at least three rotary knobs, it might be easier to do walk-style navigation inside your model with those than with the keyboard. Or knobs and sliders could be used to adjust various parameters and values of selected objects.