Hello everyone,
Currently, I am attempting to merge complex lens ray tracing into the Cycles renderer. The purpose of this work is to use the powerful ray tracing capabilities of Cycles to render realistic images through a camera with a complicated lens setup. Basically, the steps are as follows:
1. Rays are sampled following the strategy of `ccl_device_inline void camera_sample`.
2. Instead of putting the rays directly into the scene, the rays first pass through an extra group of lenses, which computes updated ray origins and directions (see the sketch after this list).
3. After step 2, the rays finally go into the scene and do the intersections and the rest of the path tracing.
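For concreteness, here is a minimal standalone sketch of how I currently picture step 2: the camera ray is pushed through a stack of spherical lens elements, refracting at each surface via Snell's law. The names `LensElement`, `intersect_sphere`, `refract`, and `trace_through_lenses` are just illustration names I made up, not Cycles API, and the sketch ignores aperture stops and element apertures; in an actual patch this logic would probably sit next to `camera_sample()` in the kernel.

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 operator*(float s, Vec3 v) { return {s * v.x, s * v.y, s * v.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 normalize(Vec3 v) { return (1.0f / std::sqrt(dot(v, v))) * v; }

/* One spherical lens surface; a full lens description would also carry an
 * aperture radius per element. These names are hypothetical, for illustration. */
struct LensElement {
  float radius;    /* signed curvature radius of the surface */
  float z_center;  /* center of the sphere on the optical (z) axis */
  float ior;       /* index of refraction of the medium behind this surface */
};

/* Intersect a ray (direction must be normalized) with a sphere on the z axis. */
static bool intersect_sphere(Vec3 o, Vec3 d, const LensElement &e, float &t)
{
  Vec3 oc = o - Vec3{0.0f, 0.0f, e.z_center};
  float b = dot(oc, d);
  float disc = b * b - (dot(oc, oc) - e.radius * e.radius);
  if (disc < 0.0f)
    return false;
  t = -b - std::sqrt(disc);
  if (t < 1e-6f)
    t = -b + std::sqrt(disc);
  return t > 1e-6f;
}

/* Refract d about normal n (facing the incident side); eta = n_in / n_out. */
static bool refract(Vec3 d, Vec3 n, float eta, Vec3 &out)
{
  float cos_i = -dot(n, d);
  float k = 1.0f - eta * eta * (1.0f - cos_i * cos_i);
  if (k < 0.0f)
    return false; /* total internal reflection: drop the ray */
  out = normalize(eta * d + (eta * cos_i - std::sqrt(k)) * n);
  return true;
}

/* Push a camera ray through every element in order, updating its origin and
 * direction in place; returns false if the ray is lost inside the lens stack. */
static bool trace_through_lenses(Vec3 &origin, Vec3 &direction,
                                 const std::vector<LensElement> &elements)
{
  float ior_current = 1.0f; /* the camera side starts in air */
  for (const LensElement &e : elements) {
    float t;
    if (!intersect_sphere(origin, direction, e, t))
      return false;
    origin = origin + t * direction;
    Vec3 n = normalize(origin - Vec3{0.0f, 0.0f, e.z_center});
    if (dot(n, direction) > 0.0f)
      n = -1.0f * n; /* make the normal face the incoming ray */
    if (!refract(direction, n, ior_current / e.ior, direction))
      return false;
    ior_current = e.ior;
  }
  return true;
}
```

After a successful `trace_through_lenses()` call, the updated origin and direction would be written back into the kernel's ray before intersection, which is what I mean by step 3.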
Maybe I am missing some part of the code, but I could not find ray tracing with 3 rays per pixel per sample. What I mean is that light is dispersed: R, G, and B come from different wavelengths of light, so each of the R/G/B channels should have its own ray. Each ray depends on the lens material, the curvature of the lenses, and the wavelength. However, in Cycles I only found 1 ray initialized per pixel per sample.

How do I merge realistic ray tracing through complex lenses into Cycles? Is there a piece of code I missed? Or, in the worst case, is there another open-source renderer that I could work on in the context of Blender? Any help is appreciated!
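To show why one ray per sample is not enough for chromatic effects, here is a tiny self-contained example: with a wavelength-dependent index of refraction (Cauchy's equation, with illustrative crown-glass-like coefficients I picked for the demo, not values from Cycles or a real lens file), the same incident ray refracts into three different directions for R, G, and B, so each channel needs its own ray, or a spectrally sampled wavelength per path.

```cpp
#include <cmath>
#include <cstdio>

/* Cauchy's approximation n(lambda) = A + B / lambda^2, lambda in micrometers.
 * A and B are illustrative crown-glass-like coefficients. */
static float cauchy_ior(float lambda_um, float A = 1.5046f, float B = 0.00420f)
{
  return A + B / (lambda_um * lambda_um);
}

int main()
{
  const float pi = 3.14159265f;
  const float theta_in = 30.0f * pi / 180.0f;            /* incident angle in air */
  const float lambdas_um[3] = {0.650f, 0.532f, 0.450f};  /* approx. R, G, B peaks */
  const char *channel[3] = {"R", "G", "B"};

  for (int i = 0; i < 3; i++) {
    float n = cauchy_ior(lambdas_um[i]);
    /* Scalar Snell's law at a single air/glass surface:
     * sin(theta_t) = sin(theta_i) * n_air / n_glass. */
    float theta_t = std::asin(std::sin(theta_in) / n);
    std::printf("%s: n = %.4f, refracted angle = %.3f deg\n",
                channel[i], n, theta_t * 180.0f / pi);
  }
  return 0;
}
```

In the lens-tracing sketch above, this would mean evaluating each element's IOR at the ray's wavelength, so a camera sample would need either three wavelength-tagged rays (one per channel) or a single sampled wavelength whose result is weighted into the matching channel, and that is exactly the part I cannot find in the Cycles camera code.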
Best Regards,
Skye Fang