I am attempting to calibrate the extrinsics of four cameras that I have mounted on a set-up; they point 90 degrees apart. I have already calibrated the intrinsic parameters, and I am thinking of using an image of a calibration pattern to find the extrinsics. What I have done so far: I placed the calibration pattern flat on the table, so its roll and yaw angles are 0 and its pitch is 90 degrees (as it lies parallel with the camera). The cameras have yaw angles of 0, 90, 180 and 270 degrees (as they are 90 degrees apart), and their roll angles are 0 (as they do not tilt). So what is left to calculate is the pitch angle of each camera.
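To make my assumptions concrete, this is roughly how I picture each camera's rotation being built from the three angles. The helper below and the axis convention (yaw about Z, pitch about X, roll about Y) are just my own guesses, which may well be part of what I have wrong:

```cpp
#include <opencv2/core.hpp>
#include <cmath>

// Hypothetical helper: build a rotation matrix from yaw/pitch/roll in degrees,
// composed as R = Rz(yaw) * Rx(pitch) * Ry(roll). For my cameras, yaw is one
// of 0/90/180/270, roll is 0, and pitch is the unknown I want to find.
cv::Matx33d rotationFromEuler(double yawDeg, double pitchDeg, double rollDeg)
{
    const double y = yawDeg   * CV_PI / 180.0;
    const double p = pitchDeg * CV_PI / 180.0;
    const double r = rollDeg  * CV_PI / 180.0;

    cv::Matx33d Rz( std::cos(y), -std::sin(y), 0,
                    std::sin(y),  std::cos(y), 0,
                    0,            0,           1);
    cv::Matx33d Rx( 1, 0,            0,
                    0, std::cos(p), -std::sin(p),
                    0, std::sin(p),  std::cos(p));
    cv::Matx33d Ry( std::cos(r), 0, std::sin(r),
                    0,           1, 0,
                   -std::sin(r), 0, std::cos(r));
    return Rz * Rx * Ry;
}
```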
I can't quite wrap my head around how to calculate it, as I am not used to mapping between coordinate systems, so any help is welcome. I have already written the part of the program that computes the rotation vector of the calibration pattern in the image using cv::solvePnPRansac(), so I have the rotation vector (which I believe I can turn into a rotation matrix using cv::Rodrigues()).
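Here is a minimal sketch of that part, with the detected corner lists and my intrinsics passed in as placeholders (the function name is just mine):

```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/core.hpp>
#include <vector>

// Returns the 3x3 rotation of the calibration pattern in the camera frame.
cv::Mat patternRotation(const std::vector<cv::Point3f>& objectPoints, // known 3D pattern corners
                        const std::vector<cv::Point2f>& imagePoints,  // detected 2D corners
                        const cv::Mat& cameraMatrix,                  // from my intrinsic calibration
                        const cv::Mat& distCoeffs)
{
    cv::Mat rvec, tvec;
    cv::solvePnPRansac(objectPoints, imagePoints, cameraMatrix, distCoeffs,
                       rvec, tvec);

    cv::Mat R;
    cv::Rodrigues(rvec, R); // rotation vector -> 3x3 rotation matrix

    return R; // this is where I am stuck: how do I get the camera's pitch from R?
}
```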
What would the next step be for me in my calculations?