I want to capture iPhone camera frames with AVCaptureSession in an Xcode/Swift 3 project.
I have instantiated the AVCaptureSession, AVCaptureVideoDataOutput and related objects. That part works: the didOutputSampleBuffer delegate is called for each frame. Now I want to perform a simple task on each frame: a threshold. It should be cheap, since it only requires a single pass over the frame's pixels.
I have read tutorials explaining how to convert the raw pixel pointer to a UIImage and display the result in a UIImageView.
But this is very slow, and I do not understand why: the work is just a threshold plus some image conversion.
Do you know if I made a mistake, or if there is a better way to do this?
Thanks
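For context, my session setup looks roughly like this (the preset, queue label, and function name are illustrative, not my exact code):

```swift
import AVFoundation

// Rough sketch of the capture setup (identifiers are illustrative).
func setupCapture(delegate: AVCaptureVideoDataOutputSampleBufferDelegate) -> AVCaptureSession {
    let session = AVCaptureSession()
    session.sessionPreset = AVCaptureSessionPreset640x480

    let device = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)!
    let input = try! AVCaptureDeviceInput(device: device)
    session.addInput(input)

    let output = AVCaptureVideoDataOutput()
    // Ask for BGRA so each pixel is 4 bytes: B, G, R, A.
    output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as AnyHashable: Int(kCVPixelFormatType_32BGRA)]
    output.setSampleBufferDelegate(delegate, queue: DispatchQueue(label: "camera.frames"))
    session.addOutput(output)

    session.startRunning()
    return session
}
```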
// PixelData matches the ARGB layout expected by the CGImage created below.
struct PixelData {
    var a: UInt8
    var r: UInt8
    var g: UInt8
    var b: UInt8
}

class MyClass: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate
{
    @IBOutlet weak var myimageview: UIImageView!

    func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!)
    {
        connection.videoOrientation = .portrait
        connection.isVideoMirrored = true

        let pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
        CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))

        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)

        let black = PixelData(a: 255, r: 0, g: 0, b: 0)
        var pixelData = [PixelData](repeating: black, count: width * height)

        if let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer)
        {
            // The buffer is assumed to be BGRA: B = +0, G = +1, R = +2.
            let buf = baseAddress.assumingMemoryBound(to: UInt8.self)
            var cpt = 0
            for y in 0..<height
            {
                for x in 0..<width
                {
                    let idx = x + y * width
                    let offset = bytesPerRow * y + x * 4
                    // Threshold: mark "red enough" pixels green; everything
                    // else stays black (the array is pre-filled with black).
                    if buf[offset + 2] > 150 && buf[offset + 1] < 150 && buf[offset + 0] < 150
                    {
                        pixelData[idx].r = 0
                        pixelData[idx].g = 255
                        pixelData[idx].b = 0
                        cpt += 1
                    }
                }
            }
        }
        // pixelData is a copy, so the buffer can be unlocked before rendering.
        CVPixelBufferUnlockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))

        // Wrap the thresholded pixels in a CGImage, then a UIImage.
        let providerRef = CGDataProvider(
            data: NSData(bytes: &pixelData, length: pixelData.count * MemoryLayout<PixelData>.size)
        )
        let cgim = CGImage(
            width: width,
            height: height,
            bitsPerComponent: 8,
            bitsPerPixel: 32,
            bytesPerRow: width * MemoryLayout<PixelData>.size,
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedFirst.rawValue),
            provider: providerRef!,
            decode: nil,
            shouldInterpolate: true,
            intent: .defaultIntent
        )
        let image = UIImage(cgImage: cgim!)

        // UI updates must happen on the main queue.
        DispatchQueue.main.async { [unowned self] in
            self.myimageview.image = image
        }
    }
}
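One alternative I am considering (a sketch only, not benchmarked): moving the per-pixel test to the GPU with a CIColorKernel, so no Swift loop or intermediate pixel array is needed. The kernel below applies the same test as my loop (red above roughly 150/255, green and blue below it); the function and variable names are illustrative:

```swift
import CoreImage
import UIKit

// GPU-side threshold: same test as the CPU loop, expressed as a color kernel.
let thresholdKernel = CIColorKernel(string:
    "kernel vec4 threshold(__sample s) {" +
    "  bool hit = (s.r > 0.59) && (s.g < 0.59) && (s.b < 0.59);" +
    "  return hit ? vec4(0.0, 1.0, 0.0, 1.0) : vec4(0.0, 0.0, 0.0, 1.0);" +
    "}"
)
let ciContext = CIContext()

func thresholded(_ pixelBuffer: CVPixelBuffer) -> UIImage? {
    let input = CIImage(cvPixelBuffer: pixelBuffer)
    guard let kernel = thresholdKernel,
          let output = kernel.apply(withExtent: input.extent, arguments: [input]),
          let cg = ciContext.createCGImage(output, from: output.extent)
    else { return nil }
    return UIImage(cgImage: cg)
}
```

The idea is that creating the CIImage directly from the CVPixelBuffer avoids copying every pixel into a Swift array on the CPU, but I have not measured whether this is actually faster in my case.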