Because the camera is off the floor (z=1.25m), points with z=0 should never project to the center of the image (only at infinity).
However, for z=0 and x=0 all points are projected to exactly the center of the image. Why is that?
Minimal example, with a 512x512 camera image (so cx = cy = 256) and an arbitrary focal length. This code:
import cv2
import numpy as np

r = np.asarray([1.57079633, 0., 0.])   # rvec: 90-degree rotation about the x-axis
K = np.asarray([[128.,   0., 256.],
                [  0., 128., 256.],
                [  0.,   0.,   1.]])   # arbitrary focal length, cx = cy = 256
position = np.asarray([0., 0., 1.25])  # intended camera position: 1.25 m off the ground
points3d = np.asarray([(0, y, 0) for y in range(1, 10)]).astype(float)  # ground-plane points with x = 0
points2d = np.squeeze(cv2.projectPoints(points3d, rvec=r, tvec=position, cameraMatrix=K, distCoeffs=None)[0]).round().astype(int)
print(points3d)
print(points2d)
outputs
points3d
[[0. 1. 0.]
[0. 2. 0.]
[0. 3. 0.]
[0. 4. 0.]
[0. 5. 0.]
[0. 6. 0.]
[0. 7. 0.]
[0. 8. 0.]
[0. 9. 0.]]
points2d
[[256 256]
[256 256]
[256 256]
[256 256]
[256 256]
[256 256]
[256 256]
[256 256]
[256 256]]
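For reference, the same (256, 256) result can be reproduced by applying the pinhole model by hand (a minimal sketch, assuming OpenCV's convention p_cam = R · p_world + tvec, with R obtained from rvec via cv2.Rodrigues): with this rvec, every ground point ends up on the optical axis, so it projects to (cx, cy).

import cv2
import numpy as np

R, _ = cv2.Rodrigues(np.asarray([1.57079633, 0., 0.]))  # 90 degrees about x
t = np.asarray([0., 0., 1.25])

for y in range(1, 10):
    p_cam = R @ np.asarray([0., y, 0.]) + t   # ~= (0, 0, y + 1.25): x_cam = y_cam = 0
    u = 128. * p_cam[0] / p_cam[2] + 256.
    v = 128. * p_cam[1] / p_cam[2] + 256.
    print(round(u), round(v))                 # always 256 256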
So it seems cv2.projectPoints(..., rvec, tvec) first rotates the points by rvec and then translates them by tvec, i.e. tvec is interpreted in the rotated (camera) frame, not the world frame, so it is not simply the camera's position in the world. Thus, if one wants to represent the camera being off the ground, given this specific rvec=[1.57079633, 0., 0.], one needs to alter the 2nd coordinate of tvec instead of the third. Namely, setting
position = np.asarray([0., 1.25, 0.])

produces the expected result: projected points starting at the bottom of the image and moving towards the center, the further away the 3D point is.
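More generally, here is a sketch of the conversion, assuming the camera's world position C and its orientation (as rvec) are known: since tvec is the world origin expressed in the camera frame, it can be computed as tvec = -R @ C, with R obtained from rvec via cv2.Rodrigues. For this rvec and C = (0, 0, 1.25) that gives approximately (0, 1.25, 0), matching the fix above.

import cv2
import numpy as np

r = np.asarray([1.57079633, 0., 0.])      # camera looks along world +y
R, _ = cv2.Rodrigues(r)
C = np.asarray([0., 0., 1.25])            # desired camera position in the world
tvec = -R @ C                             # ~= [0., 1.25, 0.] for this rvec

K = np.asarray([[128., 0., 256.],
                [0., 128., 256.],
                [0., 0., 1.]])
points3d = np.asarray([(0, y, 0) for y in range(1, 10)]).astype(float)
points2d = np.squeeze(cv2.projectPoints(points3d, rvec=r, tvec=tvec,
                                        cameraMatrix=K, distCoeffs=None)[0])
print(points2d.round().astype(int))       # u stays at 256; v starts at 416 and
                                          # decreases towards cy = 256 as y grows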