I’d like to use one or more mobile phones as graspable user interfaces for Blender. I’m planning a phone app that shows a number of icons; when an icon is pressed, a configurable key combination should be sent to Blender via WLAN over TCP/IP. There are commercial solutions for this (cf. e.g. https://www.unifiedremote.com/), but a free, customizable Blender add-on would be better.

bpy.context.window.event_simulate looks perfect for this, but apparently it only works when Blender is started with the argument --enable-event-simulate, in which case normal keyboard input is ignored. Would it be possible to add another argument, --enable-event-simulate-and-normal-keyboard-input, to Blender, so that simulated events and regular keyboard input can coexist and I could do the implementation?
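
As an aside on the receiving end: if the goal is to trigger configurable actions from the phone, an add-on could also listen on a TCP socket and map incoming command strings to actions directly, sidestepping event simulation. The following is only a minimal sketch under my own assumptions (command names, port, and the mapping are illustrative); inside Blender the callbacks would invoke bpy.ops.* operators, and the accept loop should run on a daemon thread or be driven by a modal timer so the UI stays responsive:

```python
import socket
import threading

# Hypothetical mapping from command strings sent by the phone app to
# actions. In a real add-on the callbacks would call bpy.ops.* operators;
# here they just return the operator name as a placeholder string.
COMMANDS = {
    "render": lambda: "bpy.ops.render.render",
    "undo": lambda: "bpy.ops.ed.undo",
}

def handle_client(conn):
    # Read one command, look it up, and reply with the result.
    with conn:
        data = conn.recv(1024).decode().strip()
        action = COMMANDS.get(data)
        reply = action() if action else "unknown command"
        conn.sendall(reply.encode())

def serve(host="0.0.0.0", port=56789):
    # Simple blocking accept loop; in Blender this would run on a
    # daemon thread started from register(), not on the main thread.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen()
    while True:
        conn, _ = srv.accept()
        threading.Thread(target=handle_client, args=(conn,), daemon=True).start()
```

The phone app would then only need to open a TCP connection and send the configured command string for the pressed icon.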