According to https://forums.developer.apple.com/thread/21694:
Both iPhone 6s and 6s Plus can capture 12 megapixel photos (4032x3024) on the rear-facing camera via the AVCaptureStillImageOutput, and can deliver up to 30 fps 12 MP frames to your process via AVCaptureVideoDataOutput. When you use AVCaptureSessionPresetPhoto as your AVCaptureSession’s -sessionPreset, the 12 megapixel ‘420f’ format is chosen by default.
So I tried to do so with the following code:
self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto;

AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.alwaysDiscardsLateVideoFrames = YES;

// Serial queue on which sample buffers are delivered
dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL);
[videoOutput setSampleBufferDelegate:self queue:videoQueue];

if ([self.captureSession canAddOutput:videoOutput]) {
    [self.captureSession addOutput:videoOutput];
}
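For completeness, the only other configuration I do is to request the bi-planar full-range '420f' pixel format explicitly and start the session (the dictionary value is my choice; it matches the format shown in the device dump below):

```objc
// Ask for '420f' (kCVPixelFormatType_420YpCbCr8BiPlanarFullRange),
// the same FourCC reported by the device's active format
videoOutput.videoSettings = @{
    (id)kCVPixelBufferPixelFormatTypeKey :
        @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
};
[self.captureSession startRunning];
```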
My device's active format is set to:
device format: AVCaptureDeviceFormat: 0x12c684630 'vide'/'420f' 4032x3024, { 3- 30 fps}, fov:57.716, max zoom:189.00 (upscales @1.00), AF System:2, ISO:23.0-1840.0, SS:0.000013-0.333333
which looks good.
Then I added the following code to the captureOutput:didOutputSampleBuffer:fromConnection: delegate callback:
CVImageBufferRef imageBufferRef = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBufferRef, 0);
size_t width = CVPixelBufferGetWidth(imageBufferRef);
size_t height = CVPixelBufferGetHeight(imageBufferRef);
CVPixelBufferUnlockBaseAddress(imageBufferRef, 0);
NSLog(@"ImageCameraSource:\n width:%zu\n height:%zu\n", width, height);
And it outputs:
ImageCameraSource: width:1000 height:750
Why isn't the resolution in the callback 4032x3024?
Try using AVCaptureSessionPreset3840x2160 to get the highest-resolution frames in real time. With AVCaptureSessionPresetPhoto, AVCaptureVideoDataOutput only receives preview-sized buffers (hence the 1000x750 you see); the full 12 MP image is reserved for the still-image output.
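For example, guarding the preset change since not every device supports 4K:

```objc
if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset3840x2160]) {
    self.captureSession.sessionPreset = AVCaptureSessionPreset3840x2160;
}
```

After this, the buffers arriving in the video-data callback should report 3840x2160.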