So I had an idea for how to implement baking functionality in Eevee, including screen-space effects like ambient occlusion. I’ve implemented this in Python (with a small patch to the offscreen py API to support 16-bit buffers). The basic idea is to render the 3D viewport into an offscreen buffer, render a bake mesh, and then, inside the fragment shader, look up each bake pixel in the viewport buffer. Bake pixels that face away from the camera (or meet it at too oblique an angle) are ignored.
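The oblique-angle rejection can be sketched like this. A real implementation does the test in the fragment shader in GLSL; this is plain Python standing in for it, and the threshold value is an arbitrary illustrative choice, not taken from my patch:

```python
import math

# cos of the maximum accepted angle between the surface normal and the
# view direction (~70 degrees here); this threshold is a placeholder.
MIN_FACING = 0.35

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def accept_bake_pixel(normal, to_camera, min_facing=MIN_FACING):
    """Return True if the surface faces the camera closely enough to
    trust the viewport sample for this bake pixel."""
    facing = dot(normalize(normal), normalize(to_camera))
    # facing <= 0 means back-facing; a small positive value means the
    # surface is at a grazing angle, where viewport samples are smeared.
    return facing > min_facing
```

So a pixel looking straight at the camera is accepted, while back-facing or grazing-angle pixels are skipped and left for another viewpoint to fill in.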
It works surprisingly well for something hacked together in a Python script. Anyway, to try it out, apply the attached patch, then open accumbake.blend and follow the instructions. Basically: run the “Bake EEvee” operator from the search menu, rotate the 3D viewport until the bake image overlaying the mesh is completely filled in, then hit Escape (I imagine a real implementation would generate viewpoints automatically).
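Generating viewpoints automatically could be as simple as distributing camera positions evenly on a sphere around the object and rendering from each one until the bake image is full. A Fibonacci spiral gives roughly even coverage; this is just a sketch of that idea, with the count and radius as placeholders:

```python
import math

def sphere_viewpoints(count, radius=1.0):
    """Distribute `count` camera positions roughly evenly on a sphere
    of the given radius using a Fibonacci spiral. Each position would
    be used as a camera location looking back at the origin."""
    golden_angle = math.pi * (3.0 - math.sqrt(5.0))
    points = []
    for i in range(count):
        # z runs from near +1 down to near -1, so both poles get coverage.
        z = 1.0 - (2.0 * i + 1.0) / count
        r = math.sqrt(max(0.0, 1.0 - z * z))
        theta = golden_angle * i
        points.append((radius * r * math.cos(theta),
                       radius * r * math.sin(theta),
                       radius * z))
    return points
```

The bake loop would then render the offscreen buffer from each viewpoint and accumulate accepted pixels, instead of relying on the user rotating the viewport by hand.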
So, I have no idea how the Eevee development process works. It would be great if someone were already implementing bake, or had plans to. That said, I need functional baking for various projects I’m involved with, and Cycles just isn’t cutting it. It’s extremely annoying to wait 20 minutes for a noisy bake that has black artifacts caused by minor defects in the mesh — defects that are hard to catch and hard to fix when the meshes are procedurally generated via Displace modifiers (which, by the way, aren’t compatible with Cycles’ displacement functionality). So I’m willing to do this myself too.