On my current project I'm working with the camera again. I use two streams: a video stream to detect rectangles and a photo stream to capture a still with the flash. After some investigation I narrowed the bug down: on the 12 Pro and 13 Pro Max, photos taken in a bright room come out overexposed, while the same capture in a dark room is fine. This does not happen on older iPhones.
I look forward to all your suggestions and comments.
Environment: iOS 15.4.1, iPhone 12, 12 Pro, 12 Pro Max, 13, 13 Pro, 13 Pro Max
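For context, the two-stream setup looks roughly like this (a simplified sketch, not the exact project code; CaptureController and configureSession are just placeholder names):

import AVFoundation

final class CaptureController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let videoOutput = AVCaptureVideoDataOutput()
    let photoOutput = AVCapturePhotoOutput()

    // One session, two outputs: the video stream feeds rectangle detection,
    // the photo output captures the still with the flash.
    func configureSession() throws {
        session.beginConfiguration()
        defer { session.commitConfiguration() }
        session.sessionPreset = .photo

        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        if session.canAddOutput(videoOutput) {
            videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "video.frames"))
            session.addOutput(videoOutput)
        }
        if session.canAddOutput(photoOutput) {
            session.addOutput(photoOutput)
        }
    }

    // Called once a rectangle has been found and a still should be taken.
    func capturePhoto() {
        let settings = AVCapturePhotoSettings()
        settings.flashMode = .on   // the flash is requested for the still
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    // Rectangle detection runs on the live video frames.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // ... rectangle detection ...
    }

    // photoOutput(_:didFinishProcessingPhoto:error:) is implemented below.
}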
Additional info: I capture the photo in
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {...}
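The body of that callback is essentially just reading out the processed data (simplified; the UIImage conversion is only for illustration):

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    guard error == nil, let data = photo.fileDataRepresentation() else { return }
    // On 12 Pro / 13 Pro Max in a bright room, this image comes out overexposed.
    let image = UIImage(data: data)
    // ... hand the image off to the rest of the app ...
}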
What I've discovered to help so far: AVCapturePhotoSettings has isAutoStillImageStabilizationEnabled, which has been deprecated since iOS 13 with a reference to photoQualityPrioritization. Using .balanced or .quality (and setting the same prioritization on the AVCapturePhotoOutput) didn't help, but .speed did.
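Concretely, the change that stopped the overexposure looks roughly like this (again a sketch; photoOutput is the AVCapturePhotoOutput from the setup above):

// Once, during session setup (the output's default is .balanced):
photoOutput.maxPhotoQualityPrioritization = .speed

// Then per capture:
let settings = AVCapturePhotoSettings()
settings.flashMode = .on
settings.photoQualityPrioritization = .speed   // .balanced and .quality still overexposed, .speed is fine
photoOutput.capturePhoto(with: settings, delegate: self)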