I am using AVFoundation and AVCaptureVideoDataOutputSampleBufferDelegate to record a video.
I need to implement zoom functionality in the video being recorded. I am using the following delegate method:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
I am using this to get the video frames because I need to add text and images to them later, before appending each frame to the AVAssetWriterInput using
[assetWriterVideoIn appendSampleBuffer:sampleBuffer];
The only way I can think of to perform zoom is to scale and crop the (CMSampleBufferRef)sampleBuffer that I get from the delegate method.
Please help me out on this. I need to know the possible ways to scale and crop a CMSampleBufferRef.
One solution is to convert the CMSampleBufferRef to a CIImage, scale that, render it back into a CVPixelBufferRef, and append that; a sketch follows after the link below.
You can see how to do that in this answer, which contains the code structure:
Adding filters to video with AVFoundation (OSX) - how do I write the resulting image back to AVWriter?
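Roughly, that approach looks like the following. This is a minimal sketch, not a drop-in implementation: it assumes you keep a pre-created CIContext around (here _ciContext), and the helper name zoomedPixelBufferFromSampleBuffer:zoomFactor: is something you would add to your delegate class yourself; neither is part of AVFoundation.

#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

// Scales the frame up around its center by zoomFactor, crops back to the
// original extent, and renders the result into a fresh CVPixelBufferRef.
- (CVPixelBufferRef)zoomedPixelBufferFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
                                           zoomFactor:(CGFloat)zoomFactor
{
    CVPixelBufferRef srcBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    size_t width  = CVPixelBufferGetWidth(srcBuffer);
    size_t height = CVPixelBufferGetHeight(srcBuffer);

    // Zoom = scale about the center, then crop to the original frame size.
    CIImage *image = [CIImage imageWithCVPixelBuffer:srcBuffer];
    CGAffineTransform t = CGAffineTransformMakeTranslation(width * 0.5, height * 0.5);
    t = CGAffineTransformScale(t, zoomFactor, zoomFactor);
    t = CGAffineTransformTranslate(t, -width * 0.5, -height * 0.5);
    CIImage *zoomed = [[image imageByApplyingTransform:t]
                        imageByCroppingToRect:CGRectMake(0, 0, width, height)];

    // Render into a new pixel buffer with the same size and pixel format.
    CVPixelBufferRef dstBuffer = NULL;
    NSDictionary *attrs = @{ (id)kCVPixelBufferIOSurfacePropertiesKey : @{} };
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        CVPixelBufferGetPixelFormatType(srcBuffer),
                        (__bridge CFDictionaryRef)attrs, &dstBuffer);
    [_ciContext render:zoomed toCVPixelBuffer:dstBuffer];
    return dstBuffer; // caller is responsible for CVPixelBufferRelease
}

Since the result is a CVPixelBufferRef rather than a CMSampleBufferRef, you would append it through an AVAssetWriterInputPixelBufferAdaptor (appendPixelBuffer:withPresentationTime:) instead of appendSampleBuffer:, passing the timestamp from CMSampleBufferGetPresentationTimeStamp(sampleBuffer).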
Another alternative is to scale the video at composition time using layer instructions. A minimal sketch, assuming mutableCompositionTrack is the track in your composition (any AVAssetTrack works here):
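AVMutableVideoCompositionLayerInstruction *layerInstruction =
    [AVMutableVideoCompositionLayerInstruction
        videoCompositionLayerInstructionWithAssetTrack:mutableCompositionTrack];
[layerInstruction setTransform:CGAffineTransformMakeScale(2.0f, 2.0f)
                        atTime:kCMTimeZero];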
This tells the composition to scale the mutableCompositionTrack (or whatever variable name you use for the track) by a factor of 2.0, starting at the beginning of the video.
Now when you composite the video, add the array of layer instructions (see the sketch after this paragraph) and you'll get your scaling without needing to worry about manipulating CMSampleBuffers (it will also be a lot faster).
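For completeness, wiring the layer instruction into a composition might look like the sketch below; asset and the 30 fps frame rate are placeholders you would replace with your own values:

AVMutableVideoCompositionInstruction *instruction =
    [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
instruction.layerInstructions = @[ layerInstruction ];

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.instructions = @[ instruction ];
videoComposition.frameDuration = CMTimeMake(1, 30); // placeholder: 30 fps
videoComposition.renderSize = mutableCompositionTrack.naturalSize;

// Hand videoComposition to an AVAssetExportSession via its
// videoComposition property when exporting.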