Real-time mode viewport

The Blender 3.x roadmap discusses a real-time mode for the viewport.

Such a mode would allow Blender to be used in a similar way to TouchDesigner, Vuo, etc.

From a user's point of view, I would like to open a discussion about what a useful implementation of a real-time viewport would look like.

Specifically:

  • Multi-window output (the ability to output more than one viewport from Blender, each fullscreen)
  • A camera locked to each window output (a rough scripting approximation of this is sketched after this list)
  • The ability to apply shaders directly to cameras
  • The ability to pipe the cameras themselves into nodes (2D shader nodes) to allow effects, or multi-camera / multi-pass effects (VR, dome mapping) from EEVEE

Such additions would allow Blender to be used for real-time graphics (events, theatre), or even for real-time viewing of content on specialist displays (e.g. EEVEE dome stitching and real-time VR display).
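For reference, something close to the "camera locked to window output" item can already be approximated with Blender's Python API by opening a second window, forcing its viewport into a specific camera with rendered shading, and toggling it fullscreen. This is only a rough sketch of that workaround, not the requested feature: the camera name `OutputCamera`, the `windows[-1]` lookup for the new window, and Blender 3.2+ (for `temp_override`) are all assumptions.

```python
import bpy

# Hypothetical camera name; replace with the camera that should drive the output window.
OUTPUT_CAMERA = "OutputCamera"

def open_output_window(camera_name=OUTPUT_CAMERA):
    # Open a second window so the main workspace stays usable.
    bpy.ops.wm.window_new()
    # Assumption: the newly created window is the last one in the list.
    window = bpy.context.window_manager.windows[-1]

    for area in window.screen.areas:
        if area.type != 'VIEW_3D':
            area.type = 'VIEW_3D'          # force the area to be a 3D viewport
        space = area.spaces.active
        space.shading.type = 'RENDERED'    # live EEVEE/Cycles shading in this viewport
        space.use_local_camera = True      # per-viewport camera override
        space.camera = bpy.data.objects[camera_name]
        space.region_3d.view_perspective = 'CAMera'.upper()  # lock the view to the camera
        space.lock_camera = False          # navigation should not drag the camera around
        break

    # Toggle the new window to fullscreen (requires Blender 3.2+ for temp_override).
    with bpy.context.temp_override(window=window):
        bpy.ops.wm.window_fullscreen_toggle()

open_output_window()
```

Overlays and gizmos can be hidden per viewport, but the result is still an ordinary editor window rather than a dedicated output, which is part of what a proper real-time mode would address.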


I’ve done several workarounds to use Blender as a rendering engine for interactive works, but it’s clunky right now.
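For readers following along, one common class of workaround (not necessarily what is meant here) is a modal timer operator that pushes updates into the scene many times per second while a viewport sits in rendered shading. A minimal sketch, assuming an object named `Cube` exists in the scene:

```python
import bpy

class WM_OT_realtime_tick(bpy.types.Operator):
    """Update scene data on a timer, e.g. to drive objects from external input."""
    bl_idname = "wm.realtime_tick"
    bl_label = "Real-time Tick"

    _timer = None

    def modal(self, context, event):
        if event.type == 'ESC':
            context.window_manager.event_timer_remove(self._timer)
            return {'CANCELLED'}
        if event.type == 'TIMER':
            # Placeholder update: spin an object named "Cube" (assumption).
            obj = context.scene.objects.get("Cube")
            if obj is not None:
                obj.rotation_euler[2] += 0.02
        return {'PASS_THROUGH'}

    def execute(self, context):
        wm = context.window_manager
        # Roughly 60 updates per second; the viewport redraws as the data changes.
        self._timer = wm.event_timer_add(1 / 60, window=context.window)
        wm.modal_handler_add(self)
        return {'RUNNING_MODAL'}

bpy.utils.register_class(WM_OT_realtime_tick)
# Start it from the UI or the Python console with:
# bpy.ops.wm.realtime_tick('INVOKE_DEFAULT')
```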

I have used the old Blender Game Engine in the past for creating real-time artworks that would play fullscreen.

Are there any updates on the new real-time engine?

I’d love to get involved in helping develop this and think about the functionality.

Morgan

As someone who is about to embark on using Blender as a platform for ‘V-tubing’ (see link below), I am extremely buoyed by the talk of potentially re-introducing a logic-based system to provide an environment for real-time projects. Has anything been started yet, or is this planned for the latter half of the 3.x roadmap?

Tom
