My vision for EEVEE’s ideal state for reflections and GI would be to move toward deferred path tracing, similar to what Enscape did, as shown here:
I don’t expect EEVEE to actually use this as its primary engine, though: with the limited development resources Blender has, this could end up as just a slight variation on Cycles and not be worth it at all.
However, this technique might be an interesting way to handle better (more bounces, more accurate) screen-space reflections and refractions in the future.
Essentially, it uses rasterization instead of primary rays to establish a good basis for screen-space effects, then traces secondary rays with either screen-space raymarching or BVH traversal, chosen per pixel by a binary noise mask (pure black and white: white picks screen space, black picks the BVH).
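To make the per-pixel dispatch concrete, here is a minimal sketch in Python. All names are hypothetical (this is not EEVEE or Enscape code), and a cheap integer hash stands in for a proper blue-noise texture; `ss_trace` and `bvh_trace` are placeholder callbacks for the two tracing paths.

```python
def hash_noise(x, y, frame):
    """Cheap integer hash -> one pseudo-random bit per pixel per frame.
    (A stand-in for a real blue-noise mask; illustrative only.)"""
    h = (x * 73856093) ^ (y * 19349663) ^ (frame * 83492791)
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return (h >> 16) & 1  # pure black-or-white decision


def trace_secondary_ray(x, y, frame, ss_trace, bvh_trace):
    """Dispatch one secondary ray based on the noise bit.

    White (1): try the cheap screen-space raymarch first; if it misses
    (e.g. the ray leaves the screen), fall back to the BVH.
    Black (0): go straight to full BVH traversal.
    """
    if hash_noise(x, y, frame) == 1:
        hit = ss_trace(x, y)
        if hit is not None:
            return hit
    return bvh_trace(x, y)
```

Because the mask is binary and re-randomized per frame, each pixel pays for only one tracing method per frame, and the temporal accumulation described below averages the two paths together over time.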
It uses temporal caches for both denoising and speed, and only calculates one bounce per frame, so each successive GI bounce arrives with a one-frame delay.
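The one-bounce-per-frame idea can be shown with a toy scalar model: each frame computes a single bounce, but it reads *last frame’s* cached result as the incoming indirect light, so the second bounce shows up one frame later, the third two frames later, and so on, while the same cache doubles as a temporal accumulation buffer for denoising. This is a hypothetical sketch under those assumptions, not any engine’s actual implementation.

```python
def gi_step(direct, albedo, cache_prev, blend=0.1):
    """One frame of progressive GI on a single diffuse surface patch.

    direct:     direct lighting this frame
    albedo:     surface reflectance (0..1)
    cache_prev: last frame's cached outgoing radiance (carries all
                bounces computed so far, each delayed by one frame)
    blend:      exponential moving-average factor (temporal denoise)
    """
    # One new bounce: last frame's cache acts as the indirect source.
    current = direct + albedo * cache_prev
    # Temporally blend into the cache to smooth noise across frames.
    return (1.0 - blend) * cache_prev + blend * current


# Over many frames the cache converges to the infinite-bounce solution
# direct / (1 - albedo), with one extra bounce "arriving" per frame.
cache = 0.0
for frame in range(400):
    cache = gi_step(direct=1.0, albedo=0.5, cache_prev=cache)
```

With `direct = 1.0` and `albedo = 0.5`, the fixed point is `1.0 / (1 - 0.5) = 2.0`, which the cache approaches geometrically; that convergence lag is exactly the per-bounce frame delay described above.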
This way, it leverages the speed of rasterization and screen-space techniques while avoiding baking and most of the artifacts of screen-space reflections and refractions. It isn’t strictly real-time (and doesn’t need to be), but with 1–2 diffuse bounces it gets close on high-end machines, even without RTX.