I am trying to understand how to bake textures with Cycles, not from Blender, but through the Cycles API used as an external render engine. There are not many differences between the bake(…) and render(…) functions in blender_session.cpp. I have set up a simple scene (a plane with UVs, a cube, one light source, and a camera), and I would like to bake the Combined pass for the plane.
As I understand it, the session setup for baking differs from the setup for rendering in two ways (sketched just after this list):
- Call `scene->bake_manager->set(scene, object_name, shader_type, bake_pass_filter);`
- Define the `session->read_bake_tile_cb` callback
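Here is a condensed sketch of how I read these two steps. The enum values `SHADER_EVAL_COMBINED` and `BAKE_FILTER_COMBINED`, the object name "Plane", and the overall call order are my own assumptions, not something I have verified against a working bake:

```cpp
#include "render/buffers.h"
#include "render/scene.h"
#include "render/session.h"
#include "render/bake.h"

/* Sketch only: my reading of the two bake-specific setup steps.
 * SHADER_EVAL_COMBINED and BAKE_FILTER_COMBINED are guesses for a Combined
 * bake; "Plane" is the object I want to bake in my test scene. */
static void bake_plane(ccl::Session *session,
                       const ccl::BufferParams &buffer_params,
                       int samples)
{
  ccl::Scene *scene = session->scene;

  /* 1. Tell the bake manager which object to bake and what kind of data. */
  scene->bake_manager->set(scene, "Plane", ccl::SHADER_EVAL_COMBINED, ccl::BAKE_FILTER_COMBINED);

  /* 2. Install the tile-read callback (exact signature as declared in
   * render/session.h). Its body is precisely the part I do not understand. */
  session->read_bake_tile_cb = [](ccl::RenderTile &rtile) {
    /* What should be written into rtile.buffers here? */
    (void)rtile;
  };

  session->reset(buffer_params, samples);
  session->start();
  session->wait();
}
```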
But here is the problem: what data should be passed to the render engine in this callback? In the Blender sources this callback contains the following code:
```cpp
for (BL::RenderPass &b_pass : b_rlay.passes) {
  /* find matching pass type */
  PassType pass_type = BlenderSync::get_pass_type(b_pass);
  int components = b_pass.channels();
  rtile.buffers->set_pass_rect(
      pass_type, components, (float *)b_pass.rect(), rtile.num_samples);
}
```
What data is contained in `b_pass.rect()`? And what should I do in my case? If I do not set the callback at all, the renderer simply renders an ordinary image from the scene camera.
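My guess is that in a standalone application this callback would have to fill those pass rectangles itself, along these lines. The helper below is purely hypothetical: I do not know which pass types the bake actually expects or how their pixels should be laid out; only the `set_pass_rect()` call is taken from the snippet above:

```cpp
#include <vector>
#include "render/buffers.h"

/* Hypothetical: fill one pass of the tile's buffers by hand instead of
 * reading it from a BL::RenderPass. The per-pixel content (UVs? primitive
 * indices? differentials?) is exactly what I am asking about. */
static void fill_pass(ccl::RenderTile &rtile, ccl::PassType pass_type, int components)
{
  const int pixel_count = rtile.w * rtile.h;
  std::vector<float> rect(pixel_count * components, 0.0f);

  /* ... per-pixel data would have to be computed here ... */

  rtile.buffers->set_pass_rect(pass_type, components, rect.data(), rtile.num_samples);
}
```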
Maybe the render session requires some additional setup? For example, how does the renderer know the UV coordinates onto which the baked data is projected? Does it use the default UVs (stored in the `ATTR_STD_UV` attribute), or do these coordinates have to be set up manually?
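For context, this is roughly how the UVs of the plane are created in my scene setup. The use of `ATTR_STD_UV` with one `float2` per triangle corner is my assumption, copied from what `attr_create_uv_map()` in blender_mesh.cpp appears to do, and may be wrong for other Cycles versions:

```cpp
#include "render/mesh.h"
#include "util/util_types.h"

/* Assumption: the standard UV attribute stores one float2 per triangle corner.
 * The two triangles of the plane are assumed to be already added to the mesh. */
static void add_plane_uvs(ccl::Mesh *mesh)
{
  ccl::Attribute *attr = mesh->attributes.add(ccl::ATTR_STD_UV, ccl::ustring("UVMap"));
  ccl::float2 *uv = attr->data_float2();

  /* Two triangles -> six corners, covering the whole 0..1 UV square. */
  uv[0] = ccl::make_float2(0.0f, 0.0f);
  uv[1] = ccl::make_float2(1.0f, 0.0f);
  uv[2] = ccl::make_float2(1.0f, 1.0f);
  uv[3] = ccl::make_float2(0.0f, 0.0f);
  uv[4] = ccl::make_float2(1.0f, 1.0f);
  uv[5] = ccl::make_float2(0.0f, 1.0f);
}
```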
Or maybe baking in Cycles is not the same as baking in the general sense? Is it possible to obtain, for example, lightmaps of objects in the scene?
Here is a link to the source file with the code of my console application: https://github.com/Tugcga/CyclesBake/blob/master/bake_app.cpp
What am I missing in this code, and what should be added to obtain a proper baking result?
I am sorry for so many questions.