Project a point cloud into a depth image in Python


I have a point cloud that is in the same coordinate system as the cameras used to capture the images.

I have a set of points of shape (1000000, 3)

The camera extrinsic matrix Rt

The camera intrinsic matrix K

And my image of shape (480, 640, 3)

The clipping values near = 0.05 and far = 20, as well as a depth scale of 1000

What I want to get out is the depth image for each camera frame I have. However, I am stuck and just get an image with a weird offset.
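My understanding of the projection I am trying to implement is the standard pinhole model (assuming Rt maps world coordinates into the camera frame, and that the depth scale of 1000 means the depth image stores millimetres):

    \mathbf{p}_{\text{cam}} = R\,\mathbf{p}_{\text{world}} + \mathbf{t}, \qquad Rt = [\,R \mid \mathbf{t}\,]
    u = f_x \, X_{\text{cam}} / Z_{\text{cam}} + c_x, \qquad
    v = f_y \, Y_{\text{cam}} / Z_{\text{cam}} + c_y
    D(u, v) = \mathrm{round}(Z_{\text{cam}} \cdot 1000), \qquad \text{for } near \le Z_{\text{cam}} \le far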

I use open3d to read the point cloud file, numpy for the calculations, and matplotlib for visualization.

    import os

    import numpy as np
    import open3d as o3d

    if os.path.exists(PATH):
        # Read the point cloud and show it
        pcd = o3d.io.read_point_cloud(PATH)
        o3d.visualization.draw_geometries([pcd])

    point_cloud = np.asarray(pcd.points)  # (N, 3) array of XYZ coordinates

    # Load intrinsic and extrinsic camera parameters
    K = np.array([[fx, 0, cx],
                  [0, fy, cy],
                  [0, 0, 1]])
    extrinsics_matrix = np.load('rt.npy')  # 4x4 extrinsic matrix Rt

    # Transform the point cloud into the camera frame using the extrinsics
    # (this assumes rt.npy maps world coordinates to camera coordinates)
    cam_coords = np.dot(extrinsics_matrix[:3, :3], point_cloud.T) + extrinsics_matrix[:3, 3][:, np.newaxis]

    # Project camera coordinates to pixel coordinates using the intrinsic matrix
    homogeneous_coords = np.dot(K, cam_coords)
    image_coords = homogeneous_coords[:2, :] / homogeneous_coords[2, :]

    # Clamp pixel coordinates to the image bounds (u in [0, 639], v in [0, 479])
    image_coords = np.clip(image_coords, 0, np.array([639, 479])[:, np.newaxis])
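
For reference, this is roughly how I expected the missing rasterisation step to look. It is only a sketch, and it assumes that rt.npy maps world coordinates into the camera frame, that the depth scale of 1000 means the depth image stores millimetres as uint16, and that the nearest point should win when several points fall on the same pixel:

    # Sketch of the remaining steps, under the assumptions stated above
    width, height = 640, 480
    near, far, depth_scale = 0.05, 20.0, 1000.0

    cam = extrinsics_matrix[:3, :3] @ point_cloud.T + extrinsics_matrix[:3, 3][:, np.newaxis]  # (3, N)
    z = cam[2]

    # Keep only points in front of the camera and inside the near/far range
    valid = (z > near) & (z < far)
    cam, z = cam[:, valid], z[valid]

    # Pinhole projection to integer pixel coordinates
    uv = K @ cam
    u = np.round(uv[0] / uv[2]).astype(int)
    v = np.round(uv[1] / uv[2]).astype(int)

    # Drop points that fall outside the image instead of clipping them to the border
    inside = (u >= 0) & (u < width) & (v >= 0) & (v < height)
    u, v, z = u[inside], v[inside], z[inside]

    # Z-buffer: sort far-to-near so the nearest point is written last for each pixel
    order = np.argsort(-z)
    depth_image = np.zeros((height, width), dtype=np.uint16)
    depth_image[v[order], u[order]] = (z[order] * depth_scale).astype(np.uint16)

The main differences from my code above are that out-of-view points are dropped rather than clamped to the image border, and that overlapping points are resolved with a simple z-buffer. Is this the right way to go about it, or is the offset coming from somewhere else (for example, the extrinsics mapping in the other direction)?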