I am writing an app which needs to apply a filter to video captured using AVCaptureSession. The filtered output is written to an output file. I am currently using CIFilter and CIImage to filter each video frame. Here is the code:
```swift
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    ...
    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
    let cameraImage = CIImage(cvImageBuffer: pixelBuffer)

    // Apply the filter
    let filter = CIFilter(name: "CIGaussianBlur")!
    filter.setValue(70.0, forKey: kCIInputRadiusKey)
    filter.setValue(cameraImage, forKey: kCIInputImageKey)
    let result = filter.outputImage!

    // Create a destination pixel buffer with the same size, format, and attachments
    var pixBuffer: CVPixelBuffer? = nil
    let fmt = CVPixelBufferGetPixelFormatType(pixelBuffer)
    CVPixelBufferCreate(kCFAllocatorSystemDefault,
                        CVPixelBufferGetWidth(pixelBuffer),
                        CVPixelBufferGetHeight(pixelBuffer),
                        fmt,
                        CVBufferGetAttachments(pixelBuffer, .shouldPropagate),
                        &pixBuffer)
    CVBufferPropagateAttachments(pixelBuffer, pixBuffer!)

    // Render the filtered image into the destination buffer
    let eaglContext = EAGLContext(api: .openGLES3)!
    eaglContext.isMultiThreaded = true
    let contextOptions = [kCIContextWorkingColorSpace: NSNull(),
                          kCIContextOutputColorSpace: NSNull()]
    let context = CIContext(eaglContext: eaglContext, options: contextOptions)
    CVPixelBufferLockBaseAddress(pixBuffer!, CVPixelBufferLockFlags(rawValue: 0))
    context.render(result, to: pixBuffer!)
    CVPixelBufferUnlockBaseAddress(pixBuffer!, CVPixelBufferLockFlags(rawValue: 0))

    // Wrap the rendered buffer in a new sample buffer, reusing the original timing
    var timeInfo = CMSampleTimingInfo(duration: CMSampleBufferGetDuration(sampleBuffer),
                                      presentationTimeStamp: CMSampleBufferGetPresentationTimeStamp(sampleBuffer),
                                      decodeTimeStamp: CMSampleBufferGetDecodeTimeStamp(sampleBuffer))
    var sampleBuf: CMSampleBuffer? = nil
    CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault,
                                             pixBuffer!,
                                             CMSampleBufferGetFormatDescription(sampleBuffer)!,
                                             &timeInfo,
                                             &sampleBuf)

    // Write to the video file
    let ret = assetWriterInput.append(sampleBuf!)
    ...
}
```
The ret from AVAssetWriterInput.append is always false. What am I doing wrong here? Also, this approach is very inefficient: several temporary copies are created along the way. Is it possible to do the filtering in-place?
I used almost the same code and hit the same problem. As I found out, there was something wrong with the pixel buffer created for rendering: append(sampleBuffer:) was always returning false and assetWriter.error was set. They say this is a bug, already reported: https://bugreport.apple.com/web/?problemID=34574848.
But unexpectedly, I found that the problem goes away when the original pixel buffer is used for rendering. See the code below:
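A minimal sketch of the idea (names like assetWriterInput are placeholders for your own setup): render the filter output straight back into the pixel buffer that came from the capture session, then append the original sample buffer unchanged.

```swift
import AVFoundation
import CoreImage

// Create the CIContext once, not per frame
let context = CIContext(options: [kCIContextWorkingColorSpace: NSNull()])

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
    let cameraImage = CIImage(cvImageBuffer: pixelBuffer)

    let filter = CIFilter(name: "CIGaussianBlur")!
    filter.setValue(70.0, forKey: kCIInputRadiusKey)
    filter.setValue(cameraImage, forKey: kCIInputImageKey)
    let result = filter.outputImage!

    // Render straight back into the buffer the camera handed us...
    CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))
    context.render(result, to: pixelBuffer)
    CVPixelBufferUnlockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))

    // ...and append the original sample buffer: no copy, no
    // CMSampleBufferCreateReadyWithImageBuffer, and append succeeds.
    assetWriterInput.append(sampleBuffer)
}
```

This also answers the "in-place" part of the question: there is no intermediate pixel buffer at all.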
Sure, we all know you are not allowed to modify a sample buffer, but somehow this approach produces normally processed video. The trick is dirty, and I can't say whether it will hold up when you have a preview layer or concurrent processing routines.
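If mutating the incoming buffer makes you nervous, a cleaner variant of your original approach is to keep one long-lived CIContext and recycle destination buffers from a CVPixelBufferPool, instead of creating an EAGLContext, CIContext, and fresh pixel buffer on every frame. A rough sketch (untested against the append bug above; treat filteredBuffer as a hypothetical helper):

```swift
import AVFoundation
import CoreImage

// Long-lived objects: create once when the session starts.
let context = CIContext(options: [kCIContextWorkingColorSpace: NSNull()])
var bufferPool: CVPixelBufferPool? = nil

func makePool(width: Int, height: Int, format: OSType) {
    let attrs: [String: Any] = [
        kCVPixelBufferWidthKey as String: width,
        kCVPixelBufferHeightKey as String: height,
        kCVPixelBufferPixelFormatTypeKey as String: format
    ]
    CVPixelBufferPoolCreate(kCFAllocatorDefault, nil, attrs as CFDictionary, &bufferPool)
}

// Per frame: grab a recycled buffer from the pool and render into it.
func filteredBuffer(for source: CVPixelBuffer, image: CIImage) -> CVPixelBuffer? {
    if bufferPool == nil {
        makePool(width: CVPixelBufferGetWidth(source),
                 height: CVPixelBufferGetHeight(source),
                 format: CVPixelBufferGetPixelFormatType(source))
    }
    var destination: CVPixelBuffer? = nil
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, bufferPool!, &destination)
    guard let dst = destination else { return nil }
    CVBufferPropagateAttachments(source, dst)
    context.render(image, to: dst)
    return dst
}
```

This removes the per-frame allocations you were worried about, but it still goes through CMSampleBufferCreateReadyWithImageBuffer, so it may still trip the bug linked above.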