How to evaluate error from projecting points into camera image?


I'm doing lane estimation based on line detection in lidar point clouds from PandaSet. I have the lines estimated and need a way to compare them against some reference and quantify the error. I'd like to compare them to lines labeled in the camera images (these labels are not part of PandaSet), but those images are plain .jpg files without any depth information, so I cannot transform the drawn lines into any other coordinate frame. However, I am able to project the estimated lines from the lidar points onto the images themselves. I have the lidar position (xyz) and heading (wxyz quaternion), the camera position (xyz) and heading (wxyz quaternion), and the camera intrinsics fx, fy, cx, cy.
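
For reference, this is roughly how I do the projection. It is only a minimal sketch: I assume the estimated line points are already in world coordinates (as the PandaSet devkit provides them), that the camera pose maps camera coordinates to world coordinates, and that the camera frame follows the usual x-right, y-down, z-forward convention. The function names are my own.

```python
import numpy as np
from scipy.spatial.transform import Rotation


def quat_wxyz_to_matrix(q_wxyz):
    # scipy expects scalar-last (x, y, z, w); the pose stores scalar-first (w, x, y, z)
    w, x, y, z = q_wxyz
    return Rotation.from_quat([x, y, z, w]).as_matrix()


def project_points_to_image(points_world, cam_pos, cam_heading_wxyz, fx, fy, cx, cy):
    """Project Nx3 world-frame points into a pinhole camera image.

    Assumes the camera pose is camera-to-world, i.e. p_world = R @ p_cam + t,
    so the inverse transform is applied here.
    Returns Nx2 pixel coordinates and the per-point depth (camera-frame z).
    """
    R_cam = quat_wxyz_to_matrix(cam_heading_wxyz)
    t_cam = np.asarray(cam_pos)

    # world -> camera (row-vector form of R_cam.T @ (p - t))
    p_cam = (np.asarray(points_world) - t_cam) @ R_cam

    # keep only points in front of the camera
    p_cam = p_cam[p_cam[:, 2] > 0]

    u = fx * p_cam[:, 0] / p_cam[:, 2] + cx
    v = fy * p_cam[:, 1] / p_cam[:, 2] + cy
    return np.stack([u, v], axis=1), p_cam[:, 2]
```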

Now I was thinking of computing a reprojection error: calculate the distances between the pixels of the projected line estimates and the drawn lines. The result would be just a pixel error, which isn't of much use for this application. But if I took the depth of the lidar points projected into the camera frame as the working distance, I could estimate the corresponding metric (real-world) error, like in the example here.
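
A minimal sketch of what I mean, assuming fx ≈ fy, that the labeled line is available as sampled pixel coordinates, and using the small-angle relation metric_error ≈ pixel_error * depth / fx (the name `lane_error` is just illustrative):

```python
import numpy as np


def lane_error(projected_px, projected_depth, labeled_px, fx):
    """Compare projected lane points to hand-labeled lane pixels.

    projected_px    : Nx2 pixel positions of the lidar-estimated line
    projected_depth : N camera-frame depths (metres) of the same points
    labeled_px      : Mx2 pixel positions sampled along the labeled line
    Returns the per-point pixel error and an approximate metric error.
    """
    # nearest labeled pixel for each projected point (brute force for clarity)
    diffs = projected_px[:, None, :] - labeled_px[None, :, :]
    pixel_err = np.linalg.norm(diffs, axis=2).min(axis=1)   # N pixel distances

    # scale pixel error by depth to get an approximate error in metres
    metric_err = pixel_err * projected_depth / fx
    return pixel_err, metric_err
```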

Would you please advise whether this is a valid approach, or perhaps suggest a better one? Thank you!
