I have two 2D images, one the source image and the other the target image; I need to rotate the source image to match the target using Python (scikit-image & NumPy). I have 3 landmark coordinates for each image, as follows:

image1_points = [(12,16),(7,4),(25,20)]
image2_points = [(15,22),(1,22),(25,10)]

I believe the following steps are what's needed (a rough sketch of them follows the list):

  • Create a rotation matrix from the 3 landmark coordinates using a least-squares approach
  • Use the rotation matrix to get theta
  • Convert theta to degrees (for the angle)
  • Use the apply_angle method with the angle to rotate the image
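
To make that concrete, here is a rough sketch of how I picture those steps, assuming image1_points are the source landmarks, image2_points are the target landmarks, the two lists correspond point-for-point as (x, y) pairs, and my apply_angle helper ultimately calls something like skimage.transform.rotate. The rotation matrix comes from an SVD-based least-squares fit of the centred point sets (the Kabsch method); depending on the image coordinate convention the sign of the angle may need flipping:

import numpy as np
from math import degrees
from skimage import transform

source_points = np.array([(12, 16), (7, 4), (25, 20)], dtype=float)
target_points = np.array([(15, 22), (1, 22), (25, 10)], dtype=float)

# centre both point sets so only the rotation is left to estimate
src = source_points - source_points.mean(axis=0)
dst = target_points - target_points.mean(axis=0)

# least-squares rotation from the SVD of the cross-covariance matrix (Kabsch method)
u, _, vt = np.linalg.svd(dst.T @ src)
d = np.sign(np.linalg.det(u @ vt))      # guard against a reflection
rotation = u @ np.diag([1.0, d]) @ vt   # 2x2 rotation matrix mapping src -> dst

# theta from the rotation matrix, then convert to degrees for the rotation call
theta = np.arctan2(rotation[1, 0], rotation[0, 0])
angle = degrees(theta)

source_img = np.zeros((40, 40))                # placeholder for the actual source image
rotated = transform.rotate(source_img, angle)  # rotate() expects the angle in degrees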

I've been trying to use these points and the least squares approach to compute a linear transformation matrix that transforms points from the source to the target image.
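
For what it's worth, scikit-image can also do the least-squares fit directly: skimage.transform.estimate_transform fits a similarity transform (rotation + uniform scale + translation) to two point sets and exposes the fitted rotation. A minimal sketch, again assuming image1_points are the source landmarks and image2_points the target:

import numpy as np
from math import degrees
from skimage import transform

src = np.array([(12, 16), (7, 4), (25, 20)], dtype=float)   # source landmarks
dst = np.array([(15, 22), (1, 22), (25, 10)], dtype=float)  # target landmarks

# least-squares fit of a similarity transform mapping src onto dst
tform = transform.estimate_transform('similarity', src, dst)

theta = tform.rotation   # fitted rotation, in radians
angle = degrees(theta)   # rotation angle in degrees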

I know I need to create a rotation matrix, but having never taken linear algebra I'm a bit lost. I've done a lot of reading and tried using SciPy's built-in procrustes analysis in the code below (which may be all wrong).

import numpy as np
from numpy.linalg import norm
from math import atan, cos, radians
import scipy.spatial

# m1 is the standardised target point set, m2 the source set fitted to it, d the disparity
m1, m2, d = scipy.spatial.procrustes(target_points, source_points)

# flatten to interleaved [x0, y0, x1, y1, ...] so the slicing below picks out x and y
m1 = m1.ravel()
m2 = m2.ravel()

a = np.dot(m1, m2) / norm(m1)**2

# separate x and y for the sake of convenience
ref_x = m2[::2]
ref_y = m2[1::2]

x = m1[::2]
y = m1[1::2]

b = np.sum(x*ref_y - ref_x*y) / norm(m1)**2

scale = np.sqrt(a**2 + b**2)
theta = atan(b / max(a, 10**-10))  # avoid dividing by 0

degrees = cos(radians(theta))

apply_angle(source_img, degrees)

However, this is not giving me the result I would expect: it returns a value of about 1 degree, where I would expect something around 72 degrees. I suspect the angle in degrees is what's needed as the angle parameter when rotating the image.
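
For reference, this is how I understand the final conversion and rotation call would look, assuming theta comes out in radians and that apply_angle wraps something like skimage.transform.rotate (which takes its angle in degrees):

from math import degrees
import numpy as np
from skimage import transform

theta = 1.26                       # example rotation in radians (about 72 degrees)
source_img = np.zeros((40, 40))    # placeholder for the actual source image

angle = degrees(theta)             # radians -> degrees
rotated = transform.rotate(source_img, angle)  # rotate() expects degrees, counter-clockwise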

Any help would be hugely appreciated. Thank you!
