Interpolate 2D to 3D

Hi all. I'm a beginner with Blender and I'm writing a bpy script to process some data.

My issue is: just as you can move the cursor over an object to select it, is there a way to find the 3D vertex (or location on a mesh) of a model that corresponds to a given 2D point in a rendered image?

Input: 1) a rendered image of a given 3D model at some pose; 2) a point on that image
Expected output: the equation of the line from the camera origin through the hit point on the 3D object.

My idea is to use ray_cast, but it is limited to object (local) space.
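One way around the object-space limitation (a sketch, not a complete Blender script): `Object.ray_cast` expects coordinates in the object's local space, so you can transform a world-space ray with the inverse of the object's world matrix first. In bpy that is simply `obj.matrix_world.inverted() @ point`; the pure-Python sketch below writes out the same math for a rigid transform (rotation `R`, translation `t`) so the idea is visible without mathutils.

```python
# Sketch: transforming a world-space ray into object space before a
# local-space ray cast. In bpy this is obj.matrix_world.inverted() @ v;
# here the rigid-transform inverse is written out by hand.

def mat_vec(R, v):
    # 3x3 matrix times 3-vector
    return tuple(sum(R[i][j] * v[j] for j in range(3)) for i in range(3))

def transpose(R):
    return tuple(tuple(R[j][i] for j in range(3)) for i in range(3))

def world_to_object(R, t, point):
    # inverse of x_world = R @ x_obj + t  ->  x_obj = R^T @ (x_world - t)
    d = tuple(point[i] - t[i] for i in range(3))
    return mat_vec(transpose(R), d)

def world_dir_to_object(R, direction):
    # directions transform with the rotation only (no translation)
    return mat_vec(transpose(R), direction)
```

Note that ray *origins* use the full inverse transform while ray *directions* ignore the translation part; mixing those up is a common bug.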

Thanks for any help or suggestion.

My first thought was that you were asking about UV data, which is obviously already a 2D-to-3D mapping. However, what you want is a coordinate translation. Assuming the rendered frame's camera data (sensor size, FoV, aspect ratio, …) and position/rotation are known, you can translate that 2D point into world space, yes. You will need to learn how to use the matrix operators: https://docs.blender.org/api/current/mathutils.html

Ironically, I don’t believe you would need the image at all, only the camera data.

Edit: Perhaps a better question would be: is there any way to extrapolate the angle/position/f-stop/FoV/aspect-ratio/exposure of a photo from various kinds of reference data about one particular subject? I believe this problem is tackled nowadays with machine-learning models, and some of these techniques are used in photogrammetry.

Edit 2: Some common sense:
imagine you have the model open in the 3D view
you go into camera perspective view
the camera settings match the ones used for the render
you go into edit/face-select mode and click with the mouse at the image coordinate
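Under the hood, that click-to-select boils down to casting the camera ray against the mesh's triangles and keeping the nearest hit, which is what `Scene.ray_cast` does for you in world space. As a sketch of the core operation, here is the standard Möller–Trumbore ray/triangle intersection in plain Python:

```python
def ray_triangle(orig, dirn, v0, v1, v2, eps=1e-9):
    # Moller-Trumbore ray/triangle intersection.
    # Returns the distance t along the ray (hit = orig + t * dirn),
    # or None if the ray misses the triangle.
    def sub(a, b):
        return tuple(a[i] - b[i] for i in range(3))
    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0])
    def dot(a, b):
        return sum(a[i] * b[i] for i in range(3))

    e1, e2 = sub(v1, v0), sub(v2, v0)
    pvec = cross(dirn, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:            # ray parallel to triangle plane
        return None
    inv = 1.0 / det
    tvec = sub(orig, v0)
    u = dot(tvec, pvec) * inv     # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(dirn, qvec) * inv     # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, qvec) * inv
    return t if t > eps else None
```

Casting the pixel's ray against every triangle and taking the smallest positive `t` gives the same hit point a mouse click would select; in practice you would let Blender's ray_cast (which uses a BVH) do this instead of looping over triangles yourself.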

This thread may be of some interest:

UV mapping can be very helpful. The reason it may still take some extra steps is that my input is two images of the model in different poses, with keypoints extracted from each. I want to measure how accurate these point pairs are, so I need to map the keypoints in either image back to 3D, transform the model (along with those points) into the pose aligned with the other image, and render again. Then I can compare the pixelwise distance between the points in each pair.
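The final comparison step you describe is just a per-pair Euclidean distance in pixel space; a minimal sketch (the function name is mine, not from any library):

```python
import math

def pairwise_pixel_error(pts_a, pts_b):
    # Euclidean pixel distance for each matched keypoint pair,
    # plus the mean error over all pairs.
    errs = [math.dist(a, b) for a, b in zip(pts_a, pts_b)]
    return errs, sum(errs) / len(errs)
```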

What I have also found is the world-to-camera projection matrix (https://blender.stackexchange.com/questions/15102/what-is-blenders-camera-projection-matrix-model/38189). I am trying to see whether its inverse can give the coefficients of the epipolar line when extrapolating. If this doesn’t work, I will try to find the nearest triangle or vertex as shown in the next reply. Thank you!
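One caveat about inverting the projection: a pinhole projection loses depth, so its inverse yields a ray, not a point; every point along that ray projects to the same pixel. A minimal round-trip sketch (simple pinhole model in camera coordinates, Blender's -Z-forward convention; the full matrix from the linked answer additionally handles sensor fit and shift):

```python
def project(X, f, cx, cy):
    # Pinhole projection; the camera looks down -Z, so depth is -X[2].
    z = -X[2]
    return (cx + f * X[0] / z, cy - f * X[1] / z)

def unproject(u, v, f, cx, cy):
    # Inverting the projection gives only a ray direction:
    # every point X = s * d along it maps back to the same (u, v).
    return ((u - cx) / f, (cy - v) / f, -1.0)
```

So the inversion does give you exactly the line you need: with two cameras, projecting that ray into the second image is what produces the epipolar line.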

Thanks a lot! This method looks directly helpful for my goal. I will use it if it has been published; either way, it is a good direction to search.