I use the Cycles engine of Blender (v2.82) to obtain the strength and distance (light-object-camera) of each pixel for some radar signal processing applications, as in the following papers:
[1] A simple radar simulation tool for 3D objects based on Blender, IEEE Conference Publication, IEEE Xplore
[2] General Purpose Radar Simulator based on Blender Cycles Path Tracer, 1570649487.pdf (sbrt.org.br)
The radar processing itself is not important here; what matters is the per-pixel distance and amplitude information.
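For context, a minimal sketch of one way to get per-pixel camera distance out of Cycles (the Z pass written to a full-float EXR; the output path is a placeholder, and for my colocated light/camera setup the round-trip distance is simply twice the Z value):

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Enable the Z (depth) pass so every pixel carries its camera distance.
bpy.context.view_layer.use_pass_z = True

# Multilayer OpenEXR stores the Z pass as full 32-bit floats,
# so the distances are not quantized on save.
scene.render.image_settings.file_format = 'OPEN_EXR_MULTILAYER'
scene.render.image_settings.color_depth = '32'

scene.render.filepath = "/tmp/radar_frame.exr"  # placeholder path
bpy.ops.render.render(write_still=True)
```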
The issues I ran into are:
(i) I set the light source and the camera at one location, and a target model at a distance of around 8 meters. In principle, the distance matrix should contain 16 meters (round trip, for pixels covering the target) and 0 (for pixels without the target). I found that most of the values are correct; however, some pixels hold other values such as 3 or 5 meters, and even large values like 25 or 50 meters.
I guess the small values like 3 meters come from pixels on the target boundary. The large values may be due to sharp parts/points of the target, or to some singularity in Blender; I am not sure (a diagnostic sketch for this guess follows the questions below).
Q (1): Can I change some settings of Blender to obtain accurate distances?
(ii) For the same scenario, I move only the camera location by a small distance, for example 0.5 mm. In principle, the changes in the distance matrix should then be less than 0.5 mm; however, I found that some pixels change by meters.
Q (2): Is Blender sensitive and accurate enough to capture the distance differences between two camera positions that are only 0.5 mm apart?
Q (3): Where can I find documents/resources on the distance rendering accuracy of Blender?
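To make the boundary guess in (i) testable, here is a small numpy sketch (assuming the distance matrix has already been loaded from the EXR as a 2D array; the expected value and tolerance are placeholders for my scene) that separates silhouette outliers from interior ones:

```python
import numpy as np

def classify_depths(depth, expected=16.0, tol=0.01):
    """Sort pixels of a round-trip distance matrix into three bins.

    depth    : 2D array of per-pixel distances in meters
    expected : nominal round-trip distance (16 m for my scene)
    tol      : tolerance in meters (placeholder value)
    """
    target = np.isclose(depth, expected, atol=tol)
    background = depth == 0.0
    outliers = ~(target | background)

    # An outlier sits "on the silhouette" if its 4-neighbors switch
    # between target and background, i.e. antialiasing blended the two.
    pad = np.pad(target, 1, mode="edge")
    mixed = ((pad[:-2, 1:-1] != pad[2:, 1:-1]) |
             (pad[1:-1, :-2] != pad[1:-1, 2:]))
    on_boundary = outliers & mixed

    print(f"outliers: {outliers.sum()}, on silhouette: {on_boundary.sum()}")
    return outliers, on_boundary
```

If nearly all the small values (3 m, 5 m) land in the silhouette bin, they are filter/antialiasing blends rather than wrong intersections.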
Leaving aside the radar part, which is beyond my knowledge, it looks like an aliasing problem, as you describe yourself. There is a "pixel filter" setting in the render options. You can set the filter width to 0.01, but unfortunately not below that. That gives you for the most part sharp edges, but a small amount of aliasing remains.
For the devs: this is a small papercut, with probably 10 people bothered by it, but I ran into it when exporting color/position passes to point clouds. With 0.01 it's nearly perfect, but 0.0 would be nicer!
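In case it helps, the same setting can be driven from Python (property names as exposed by the Cycles add-on in 2.8x):

```python
import bpy

scene = bpy.context.scene
scene.cycles.pixel_filter_type = 'GAUSSIAN'
scene.cycles.filter_width = 0.01  # the UI minimum; 0.0 is rejected
```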
There are no specific accuracy guarantees in Cycles, nor documentation about it. There is a balance between performance and trying to avoid rendering artifacts.
Ray intersections can be inaccurate, either missing certain triangles or self-intersecting when bouncing off a surface. Smooth normals can cause issues due to the discrepancy between the shading normal and the actual surface.
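One way to rule out the smooth-normal factor (a sketch; "Target" is a placeholder object name) is to force flat shading so the shading normal equals the geometric face normal:

```python
import bpy

mesh = bpy.data.objects["Target"].data  # placeholder object name
mesh.use_auto_smooth = False            # ignore auto/custom split normals
for poly in mesh.polygons:
    poly.use_smooth = False             # flat shading: shading == face normal
```

If the large outliers disappear with flat normals, the discrepancy between shading and geometric normals is the likely cause.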
I think you will need to figure out exactly what the cause is if you want to resolve this: isolate the issue as much as possible and analyze what goes wrong.
There's no setting for that, but maybe it's possible to set things up in some way that avoids the problem. Still, Cycles is not designed for accurate simulation, so it may be difficult.
A small change in camera start position could trigger one of the precision issues described above.
There are no specific accuracy guarantees. Also, renderers use floating-point math, and the question is incomplete in that context: it is not only about absolute units but also about how far you are from the origin. If you work in millimeters near the origin it may be fine, but if you are 1 km away from the origin you will have problems.
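A quick numpy illustration of that point (float32 is what the render kernels use internally):

```python
import numpy as np

# The size of one float32 step (ulp) grows with distance from the origin:
for x in (1.0, 10.0, 1000.0):
    print(f"{x:7.1f} m  ulp = {np.spacing(np.float32(x)):.1e} m")
# 1 m    -> ~1.2e-07 m
# 10 m   -> ~9.5e-07 m
# 1 km   -> ~6.1e-05 m: a 0.5 mm move spans only ~8 representable steps

# The 0.5 mm step itself is already distorted at 1 km:
print(np.float32(1000.0005) - np.float32(1000.0))  # ~4.88e-04, not 5.0e-04
```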
Thanks for so many detailed and prompt replies.
In fact, I had noticed the floating-point issue; I use .exr files rather than .hdr files as the outputs.
In my rendering scenario, the objects are around 10 Blender units from the origin, with the unit set to 1 meter. Under this condition, when I move the camera position by ~0.5 mm, the distances of many pixels vary a lot. Hence I ask about the distance accuracy of Blender's outputs, hoping to figure out whether I did something wrong or whether it is due to Cycles. If it is the latter, I may have to limit this engine to single-camera simulation.
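For reference, this is essentially the check I run (a sketch; the two depth matrices are assumed to be already loaded from the .exr outputs as 2D numpy arrays):

```python
import numpy as np

def compare_runs(depth_a, depth_b, max_expected=0.0005):
    """Compare per-pixel distances from two renders 0.5 mm apart.

    depth_a, depth_b : 2D numpy arrays (meters) of the same scene,
                       with the camera shifted by 0.5 mm between renders
    max_expected     : largest per-pixel change the geometry allows here
    """
    diff = np.abs(depth_b - depth_a)
    stable = diff <= max_expected
    print(f"pixels within 0.5 mm: {stable.sum()} / {stable.size}")
    if (~stable).any():
        worst = np.sort(diff[~stable])[-5:]
        print("largest jumps (m):", worst)  # the meter-scale outliers
    return diff
```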