Baking with Eevee


So I had an idea for how to implement baking functionality in Eevee, including screen-space effects like ambient occlusion, and I’ve implemented it in Python (with a small patch to the offscreen py API to support 16-bit buffers). The basic idea is to render the 3D viewport into an offscreen buffer, render a bake mesh, and then, inside the fragment shader, look up the bake pixels in the viewport buffer. Bake pixels that face away from the camera (or are at too oblique an angle to it) are ignored.
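To make that concrete, here is a minimal sketch of the per-texel test the fragment shader performs, written in plain Python with mathutils for readability (the real work happens on the GPU; `viewport_buffer` and `min_facing` are illustrative names, not from my script):

```python
from mathutils import Vector

def sample_viewport(world_pos, world_normal, view_matrix, proj_matrix,
                    viewport_buffer, min_facing=0.3):
    # Transform the bake texel's surface point into view space
    # (mathutils treats a 3D vector as having w = 1 here).
    view_pos = view_matrix @ world_pos
    # Reject texels facing away from, or too oblique to, the camera
    # (assumes no non-uniform scale in the view matrix).
    view_normal = (view_matrix.to_3x3() @ world_normal).normalized()
    if view_normal.dot(-view_pos.normalized()) < min_facing:
        return None
    # Project into clip space, then normalized device coordinates.
    clip = proj_matrix @ Vector((*view_pos, 1.0))
    if clip.w <= 0.0:
        return None  # behind the camera
    ndc = clip.xyz / clip.w
    if abs(ndc.x) > 1.0 or abs(ndc.y) > 1.0:
        return None  # outside the rendered viewport
    # Sample the offscreen viewport render at the projected 0..1 UV.
    return viewport_buffer(ndc.x * 0.5 + 0.5, ndc.y * 0.5 + 0.5)
```

In the actual shader this runs once per fragment of the bake mesh, with the viewport render bound as a texture.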

It works surprisingly well for something hacked together from a Python script. To try it out, apply the attached patch, then open accumbake.blend and follow the instructions. Basically, run the “Bake EEvee” operator from the search menu, rotate the 3D viewport until the bake image overlaying the mesh is fully covered, then hit Escape. (I imagine a real implementation would generate viewpoints automatically; see the sketch below.)
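For the automatic viewpoint generation, one plausible approach is to distribute camera positions on a sphere around the object and build a view matrix for each. A sketch (not part of the posted script; `sphere_viewpoints` is a hypothetical helper):

```python
import math
from mathutils import Matrix, Vector

def sphere_viewpoints(center, radius, rings=4, segments=8):
    """Yield view matrices looking at `center` from points on a surrounding sphere."""
    for i in range(1, rings):                      # skip the degenerate poles
        theta = math.pi * i / rings                # polar angle
        for j in range(segments):
            phi = 2.0 * math.pi * j / segments     # azimuth
            eye = center + Vector((math.sin(theta) * math.cos(phi),
                                   math.sin(theta) * math.sin(phi),
                                   math.cos(theta))) * radius
            # Orient the camera so -Z looks at the target and Y is roughly up.
            rot = (center - eye).to_track_quat('-Z', 'Y').to_matrix().to_4x4()
            yield (Matrix.Translation(eye) @ rot).inverted()
```

Each yielded matrix could then drive one offscreen render pass in place of a manual viewport rotation.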

https://dev-files.blender.org/file/download/biwhoemtuxvpmeb46hxz/PHID-FILE-v3o75rmyj2rs6zgus7ww/offscreen_float_16.diff

https://dev-files.blender.org/file/download/rxud4dtl2bxdczrh77cc/PHID-FILE-at3jxf5qgwjtwycslfy7/accumbake.blend

So, I have no idea how the Eevee development process works. It would be great if someone were already implementing baking, or had plans to. That said, I need functional baking for various projects I’m involved with, and Cycles just isn’t cutting it. It’s extremely annoying to wait 20 minutes for a noisy bake with black artifacts caused by minor defects in the mesh, defects that are hard to catch and hard to fix when the meshes are procedurally generated via Displace modifiers (which, by the way, aren’t compatible with Cycles’ displacement functionality). So I’m willing to do this myself too.


Good idea; it could probably be used to bake procedural textures too. But tasks are for bugs, so submit it as a patch instead: https://wiki.blender.org/wiki/Tools/CodeReview

You can .zip your files and upload the zip file.

I didn’t see .zip in the list of allowed extensions. I’m having computer troubles, but I’ll try it afterwards.

It’s not a patch; it’s a proof of concept. The “patch” just adds support for 16-bit offscreen buffers to the py API.

A patch is no more than a text file describing the differences between two versions of code: which files change, on which lines, and how. Patches carry changes to code, whether that’s adding functionality, fixing a bug, etc. Contributing code to Blender is done through such patches, or diffs.

So yes, it is a patch. You can submit it as a proof of concept and then develop it further if you want to.

Anyway (and by the way, I wrote the original wiki article for new developers), I just need to figure out who the Eevee developers are and how to get in contact with them.

As far as I know, @fclem is the main developer of Eevee. Check this:
https://developer.blender.org/project/view/81/

You can also chat with developers here:
https://blender.chat/channel/blender-coders


Hey, this is pretty cool, but I assume it doesn’t support transferring maps from high-poly to low-poly meshes?

So you cache some complex material into a texture. Pretty nice.

Has anyone experimented with this much since Joseph’s original post on the idea? Not being able to bake in Eevee is a pretty serious limitation, so I’m wondering if this is a reasonable short-term workaround.

Hi Joseph. I’m interested in the patch you are proposing to support a higher bit depth in the offscreen buffer.
Do you think it’s possible to send it for review, so that it can be integrated into master?
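Something like this hypothetical usage is what I have in mind, assuming the patch exposes a `format` argument on `gpu.types.GPUOffScreen` (the names are illustrative, not the actual patched API):

```python
import gpu

# Ask for a half-float colour buffer instead of the default 8-bit RGBA.
# (`format` is the hypothetical argument the patch would add.)
offscreen = gpu.types.GPUOffScreen(512, 512, format='RGBA16F')

with offscreen.bind():
    pass  # draw the viewport here; colours keep more than 8 bits per channel

offscreen.free()
```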

I would love to see something like this implemented

Hey, has anyone gotten this working? I’m having trouble getting the patch to work.

Sorta stumbled onto this thread. It sounds like it would be useful, but it’s basically the same as rendering and then using the result as a brush to texture paint (or, alternatively, using it on a project-from-view unwrap, projected from the same camera used to render). Which is sometimes nice, but of course pretty limited, and dealing with the borders between different perspectives can be pretty troublesome.

It is possible to do a rasterizer bake that works better than that. I’ve implemented one for MikuMikuDance using DX9.0 (because I was too scared to learn Blender baking, lol; that was a long time ago). There were occasional problems, but something is better than nothing, and I didn’t have a very good understanding of shaders at the time (not that I have a great understanding now, but some things have sunk in a little).

The basic technique was to assign a screen position of (UV.x, UV.y, 0, 0.5) to every vertex in the vertex shader, in lieu of doing the world × camera × projection transform, with backface culling off. Assigning a constant W prevented any perspective-correction shenanigans. In hindsight, I suspect clipping issues were the cause of the occasional problems I had; that could probably be addressed by sending only pre-clipped, in-bounds faces. Better still would probably be to compute a camera matrix for each face, if that’s possible (not sure off the top of my head), and then render with an orthographic projection.

You still compute the actual world position, etc., which you can send to the pixel/fragment shader for calculating view-dependent effects, irrespective of screen space. I can’t remember if I had any tricks to properly calculate shadow buffers; it seems like it should be possible by using the world position to depth-check your shadow buffers, but I can’t remember if I did that or not.
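For what it’s worth, here is roughly what that trick might look like in GLSL, wrapped as a Blender `gpu.types.GPUShader`. This is a sketch of the technique I described, not my original DX9 code; in GL, a constant w of 1.0 with z = 0 does the same job as the 0.5 I used.

```python
import gpu

# Vertex stage: place every vertex at its UV coordinate in clip space, so the
# rasterizer fills texture space directly. The constant w disables perspective
# correction; the real position is still passed through for shading.
vert_src = """
in vec3 pos;
in vec2 uv;
out vec3 world_pos;

void main()
{
    world_pos = pos;
    gl_Position = vec4(uv * 2.0 - 1.0, 0.0, 1.0);  // map 0..1 UVs to -1..1
}
"""

# Fragment stage: shade with the real world position, independent of the screen.
frag_src = """
in vec3 world_pos;
out vec4 frag_color;

void main()
{
    frag_color = vec4(world_pos, 1.0);  // placeholder: real shading goes here
}
"""

shader = gpu.types.GPUShader(vert_src, frag_src)
```

Drawn with backface culling off, the render target then *is* the baked texture.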

Obviously, there are no post-processes like bloom or screen-space reflections. But I do still miss being able to get an (admittedly amateurish) bake in 16 ms…

That’s definitely the right way to do this sort of thing. I think we’ll end up tackling this for the texture paint rewrite.
