Dual camera capturing photos with unexpected scale and offset on iPhone 14 Pro


I have a setup in which I capture photos from both constituent devices of the virtual .builtInDualCamera. I set the videoZoomFactor of the virtual device to the first value in virtualDeviceSwitchOverVideoZoomFactors. Taking a photo then results in two AVCapturePhotos: one from the wide-angle camera and one from the telephoto camera. Since the photo from the wide-angle camera is zoomed in, part of the photo is black (this is to be expected).

On an iPhone X, the content of the wide-angle photo is scaled by 1 divided by that videoZoomFactor (1.8), which is what I would expect.
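To be explicit about what I expect: the visible content should be a centered rectangle scaled by 1 / videoZoomFactor. A minimal sketch of that geometry (pure CoreGraphics, not tied to any device):

```swift
import CoreGraphics

/// Expected visible region of the wide-angle photo when the virtual device
/// is zoomed by `zoomFactor`: a centered rect, scaled by 1 / zoomFactor.
func expectedContentRect(imageSize: CGSize, zoomFactor: CGFloat) -> CGRect {
    let scale = 1.0 / zoomFactor
    let size = CGSize(width: imageSize.width * scale,
                      height: imageSize.height * scale)
    return CGRect(x: (imageSize.width - size.width) / 2,
                  y: (imageSize.height - size.height) / 2,
                  width: size.width,
                  height: size.height)
}
```

On the iPhone X (switch-over factor 1.8) this matches what I observe; on the iPhone 14 Pro (factor 3.0) the actual content rect matches neither this scale nor this center.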

On an iPhone 14 Pro, however, the scale differs from the expected 1.0 / 3.0. It is often smaller, but in the examples below it is even larger. The image is also not completely centered:

wide angle camera image

telephoto camera image

Why is this? And is there a way to calculate this scale and offset?

My setup (simplified):

import AVFoundation

class DualCamera {
    
    enum DualCameraError: Error {
        case initialisation
    }
    
    private let device: AVCaptureDevice
    private let deviceInput: AVCaptureDeviceInput
    private let session = AVCaptureSession()
    private let photoOutput = AVCapturePhotoOutput()
    private let queue = DispatchQueue(label: "camera queue")
    private let cameraDelegate = DualCameraDelegate()
    
    init() throws {
        guard
            let device = AVCaptureDevice.DiscoverySession(
                deviceTypes: [.builtInDualCamera],
                mediaType: .video,
                position: .back)
                .devices
                .first,
            let deviceInput = try? AVCaptureDeviceInput(device: device)
        else {
            throw DualCameraError.initialisation
        }
        
        self.device = device
        self.deviceInput = deviceInput
        
        session.sessionPreset = AVCaptureSession.Preset.photo
        session.addInput(deviceInput)
        session.addOutput(photoOutput)
        
        photoOutput.isVirtualDeviceConstituentPhotoDeliveryEnabled = true
        
        queue.async { [self] in
            // The session must be running before capturePhoto(with:delegate:) is called.
            session.startRunning()
        }
        
        deviceConfiguration { device in
            if let zoomFactor = device.virtualDeviceSwitchOverVideoZoomFactors.first {
                device.videoZoomFactor = CGFloat(zoomFactor.floatValue)
            }
        }
    }
    
    func deviceConfiguration(_ configuration: @escaping (AVCaptureDevice)->()) {
        queue.async { [self] in
            session.beginConfiguration()
            do {
                try device.lockForConfiguration()
                
                configuration(device)
                
                device.unlockForConfiguration()
            } catch {
                print(error.localizedDescription)
            }
            session.commitConfiguration()
        }
    }
    
    func shoot() {
        queue.async { [self] in
            let photoSettings = AVCapturePhotoSettings()
            photoSettings.virtualDeviceConstituentPhotoDeliveryEnabledDevices = device.constituentDevices

            photoOutput.capturePhoto(with: photoSettings, delegate: cameraDelegate)
        }
    }
}

class DualCameraDelegate: NSObject, AVCapturePhotoCaptureDelegate {

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        
        // Do something with the photo
        print(photo)
    }
}

Note: I expect AVCameraCalibrationData to be of help here, but I cannot figure out how to use it, given the limited documentation available.
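For completeness, this is how I would request the calibration data (a sketch based on my setup above; I have not yet worked out how to derive the scale and offset from the intrinsic matrix):

```swift
// In shoot(), before capturing. Calibration data delivery is only
// supported when constituent photo delivery is enabled.
if photoOutput.isCameraCalibrationDataDeliverySupported {
    photoSettings.isCameraCalibrationDataDeliveryEnabled = true
}

// In the delegate, each constituent photo then carries its own calibration:
func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    if let calibration = photo.cameraCalibrationData {
        // Column-major 3x3 intrinsic matrix: focal lengths (in pixels) on
        // the diagonal, principal point in the last column. The reference
        // dimensions tell which pixel grid these values refer to.
        let K = calibration.intrinsicMatrix
        let dims = calibration.intrinsicMatrixReferenceDimensions
        print("fx:", K.columns.0.x, "fy:", K.columns.1.y,
              "cx:", K.columns.2.x, "cy:", K.columns.2.y,
              "reference dimensions:", dims)
    }
}
```

My hope is that comparing the intrinsic matrices of the two constituent photos would yield the actual scale (ratio of focal lengths) and offset (difference of principal points), but I have not been able to confirm this.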
