Decoupling Ambient Occlusion sampling in Eevee


I was wondering if there are any plans to decouple Eevee's real-time ambient occlusion sampling, and possibly other effects (shadows, anti-aliasing?), from global sampling.
The main issue is that, during animation playback, Eevee renders only one sample per frame, leaving noisy ambient occlusion:

Sub-sampling at 4-6 samples per frame would almost entirely solve it:

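To illustrate why even a handful of samples per frame makes such a difference, here is a toy Monte Carlo sketch (pure Python, no actual Eevee code; the `ao_sample` function is a made-up stand-in for a single noisy occlusion test). Averaging N independent samples within one frame reduces the frame-to-frame variance roughly by a factor of N, which is why 4-6 samples already look much more stable than 1:

```python
import random

random.seed(42)

def ao_sample(true_ao=0.5):
    # Toy stand-in for one noisy AO sample: a Bernoulli occlusion
    # test whose expected value equals the true AO term.
    return 1.0 if random.random() < true_ao else 0.0

def frame_ao(samples_per_frame, true_ao=0.5):
    # Average several independent samples within a single frame,
    # as a decoupled per-frame sub-sampling loop would.
    total = sum(ao_sample(true_ao) for _ in range(samples_per_frame))
    return total / samples_per_frame

def playback_variance(samples_per_frame, frames=2000, true_ao=0.5):
    # Variance of the per-frame AO value across a playback sequence;
    # visually, this variance is the flickering noise.
    values = [frame_ao(samples_per_frame, true_ao) for _ in range(frames)]
    mean = sum(values) / frames
    return sum((v - mean) ** 2 for v in values) / frames

v1 = playback_variance(1)  # one sample per frame (current playback)
v6 = playback_variance(6)  # six samples per frame (proposed sub-sampling)
print(v1, v6)
```

With 6 samples per frame the measured variance comes out close to one sixth of the single-sample case, matching the usual 1/N Monte Carlo behaviour.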
Eevee does of course have a temporal denoiser, but its drawback is the ghosting effect.
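For context on why the temporal denoiser ghosts, here is a deliberately simplified sketch (pure Python; Eevee's actual accumulation is more involved, this just models the exponential-moving-average blend that temporal accumulation schemes typically use). When the underlying signal changes abruptly, the history buffer lags behind for many frames, and that lag is what shows up as ghosting:

```python
def temporal_accumulate(samples, alpha=0.1):
    # Exponential moving average over a per-frame signal: each frame,
    # blend a fraction `alpha` of the new sample into the history.
    history = samples[0]
    out = [history]
    for s in samples[1:]:
        history = (1.0 - alpha) * history + alpha * s
        out.append(history)
    return out

# A sudden scene change: the AO value jumps from 0.2 to 0.8.
signal = [0.2] * 5 + [0.8] * 15
smoothed = temporal_accumulate(signal)
# Even 15 frames after the jump, the accumulated value has not yet
# converged to 0.8 -- this trailing history is the ghosting artifact.
print(smoothed[-1])
```

Lowering the blend factor reduces noise but lengthens the trail, which is exactly the trade-off per-frame sub-sampling would sidestep.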
I’ve dug into the code, but so far no luck… If a developer familiar with this area (@Hypersomniac ?) could point me in the right direction, it would be very appreciated. Is this easily feasible, or does it require a lot of changes to Eevee’s code structure?

Bumping up the thread with hope :slight_smile:

Hi, I found this thread and wanted to ask the same thing. My biggest issue is anti-aliasing during viewport playback; I want to use Blender for asset presentations. Denoising seems pretty useless in this case.

I understand this is quite a niche use case, but it will need to be tackled for VR projects, am I right?
From my humble experience presenting in Unreal VR, I can say anti-aliasing is really important to get right.

Will this change somehow with Vulkan?


Unfortunately, it seems like no one cares about it so far :frowning: