Cycles generates distorted depth

Hi,

I have an issue with the depth map generated by Cycles: it appears to have (radial) distortion, while the depth map from the default Blender renderer does not.

So I am looking for information on the internal camera model used by the Cycles engine, or for a way to retrieve/compute the correct camera intrinsic parameters. I suspect the lens model introduces distortion, but I could not find the relevant parameters, or even the code responsible for this, in Cycles.

I am posting here because this is more of an internal code question than a usage/UI/UX one.

I posted a more detailed question on Blender Stack Exchange: export - Cycles generates distorted depth - Blender Stack Exchange

Thanks!

Cycles uses a pinhole camera (when depth of field is turned off) and stores the distance between a given point and the pinhole as its Z depth.
There is no explicitly modeled lens distortion; it arises implicitly. Consider what kind of surface would be needed so that every point on it has the same distance from the pinhole: a sphere centered on the camera. Any other surface, such as a flat plane, will have non-uniform Z depth.
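
As a quick numeric illustration (hypothetical coordinates; camera pinhole at the origin, looking along +Z):

```python
import math

# Two points on the same flat plane at z = 10, one on the optical axis
# and one off-axis. Their planar depth is identical, but their distance
# to the pinhole (what Cycles stores, as described above) is not.
p_center = (0.0, 0.0, 10.0)
p_corner = (4.0, 3.0, 10.0)

def distance_depth(p):
    # Euclidean distance from the pinhole at the origin.
    return math.sqrt(p[0]**2 + p[1]**2 + p[2]**2)

print(distance_depth(p_center))  # 10.0
print(distance_depth(p_corner))  # ~11.18 -- same plane, larger "depth"
```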

The source code for camera ray generation is in kernel_camera.h, in camera_sample_perspective().

To convert to rectilinear 'depth' (where every plane perpendicular to the camera's view direction has a constant value), you can apply some trigonometry to the 'spherical' (ray length) depth data. You only need each pixel's angle from the view direction, which can be computed from the camera's horizontal and vertical FOV, or equivalently from the sensor size and focal length. There are online calculators to help with this. A sketch of the conversion follows below.
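
A minimal sketch of that conversion, assuming the depth map is a NumPy array and the horizontal/vertical FOV are known (the function and parameter names here are illustrative, not part of any Blender API):

```python
import numpy as np

def ray_depth_to_planar(depth, fov_x_deg, fov_y_deg):
    """Convert an (H, W) 'ray length' depth map to planar depth.

    fov_x_deg / fov_y_deg: full horizontal/vertical field of view
    of the camera, in degrees.
    """
    h, w = depth.shape
    # Tangent-space coordinates of each pixel centre relative to the
    # optical axis, spanning [-tan(fov/2), +tan(fov/2)].
    tan_x = np.tan(np.radians(fov_x_deg) / 2.0)
    tan_y = np.tan(np.radians(fov_y_deg) / 2.0)
    xs = (2.0 * (np.arange(w) + 0.5) / w - 1.0) * tan_x
    ys = (2.0 * (np.arange(h) + 0.5) / h - 1.0) * tan_y
    xx, yy = np.meshgrid(xs, ys)
    # The ray through a pixel points along (x, y, 1), so the cosine of
    # its angle from the view axis is 1 / |(x, y, 1)|.
    cos_theta = 1.0 / np.sqrt(xx**2 + yy**2 + 1.0)
    # planar depth = ray length * cos(angle from view axis)
    return depth * cos_theta
```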

Thanks for your answers!

So I just need to project each point from the camera plane onto the unit sphere (i.e. normalize the ray direction) and scale it by the depth value to recover the correct 3D position. Works great.
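
For anyone finding this later, a sketch of that back-projection, reusing the same pixel-to-ray mapping as the conversion above (names and the +Z-facing camera convention are assumptions, not a Blender API):

```python
import numpy as np

def depth_to_points(depth, fov_x_deg, fov_y_deg):
    """Back-project an (H, W) 'ray length' depth map to camera-space points."""
    h, w = depth.shape
    tan_x = np.tan(np.radians(fov_x_deg) / 2.0)
    tan_y = np.tan(np.radians(fov_y_deg) / 2.0)
    xs = (2.0 * (np.arange(w) + 0.5) / w - 1.0) * tan_x
    ys = (2.0 * (np.arange(h) + 0.5) / h - 1.0) * tan_y
    xx, yy = np.meshgrid(xs, ys)
    # Per-pixel ray directions, projected onto the unit sphere.
    dirs = np.stack([xx, yy, np.ones_like(xx)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    # Scale each unit ray by its stored distance to get (H, W, 3) points
    # (camera looks along +Z here; flip signs to match your convention).
    return dirs * depth[..., None]
```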

I am surprised that the default Blender renderer behaves differently, though.