The Blender 3.x roadmap discusses a real-time mode for the viewport.
Such a mode would allow Blender to be used in a similar way to TouchDesigner, Vuo, etc.
From a user's POV, I would like to open a discussion on what a useful implementation of a real-time viewport would look like.
Specifically:
- Multi-window output (the ability to output more than one viewport from Blender, each in fullscreen)
- A camera locked to each window's output
- Ability to apply shaders directly to cameras.
- The ability to pipe multiple cameras into nodes (2D shader nodes) for effects, or multi-camera / multi-pass effects (VR, dome mapping) from EEVEE.
Such additions would allow Blender to be used for real-time graphics (events, theatre), or even for real-time viewing of content on specialist screens (e.g. EEVEE dome stitching with real-time display, or VR).