CVPixelBuffer to CGImage Conversion Only Gives a Black-and-White Image

I am trying to convert raw camera sensor data into a color image. The data are first provided as a [UInt16] array and then converted to a CVPixelBuffer.

The following Swift 5 code only creates a black-and-white image and disregards the RGGB color filter array of the pixel data.

I also tried VTCreateCGImageFromCVPixelBuffer, to no avail: the image it produces is nil.

   import CoreImage
   import CoreVideo
   import VideoToolbox

   // imgRawData is a [UInt16] array holding the raw RGGB sensor samples

   var pixelBuffer: CVPixelBuffer?
   // The pixel-format attribute expects the numeric OSType constant, not the FourCC string "rgg4"
   let attrs: [CFString: Any] = [kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_14Bayer_RGGB]

   // Wrap the raw samples in a CVPixelBuffer; 2 bytes per sample, so bytesPerRow = 2 * width
   CVPixelBufferCreateWithBytes(kCFAllocatorDefault, width, height,
                                kCVPixelFormatType_14Bayer_RGGB,
                                &imgRawData, 2 * width,
                                nil, nil, attrs as CFDictionary,
                                &pixelBuffer)
   
   // This creates a black-and-white image, not a color one
   let ciimg = CIImage(cvPixelBuffer: pixelBuffer!)

   let context = CIContext(options: [.workingFormat: CIFormat.RGBA16])
   guard let cgi = context.createCGImage(ciimg,
                                         from: ciimg.extent,
                                         format: CIFormat.RGBA16,
                                         colorSpace: CGColorSpace(name: CGColorSpace.sRGB),
                                         deferred: false)
   else { return dummyImg! }   // dummyImg is a placeholder image defined elsewhere


   // This call also fails: the image returned via imageOut stays nil
   var cgI: CGImage?
   VTCreateCGImageFromCVPixelBuffer(pixelBuffer!, options: nil, imageOut: &cgI)
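
For reference, a minimal sanity check of the buffer would be the following (just a sketch; it only verifies the pixel format and the row stride passed in above):

   // Sketch: confirm the buffer carries the Bayer RGGB format and the expected stride
   if let pb = pixelBuffer {
       print(CVPixelBufferGetPixelFormatType(pb) == kCVPixelFormatType_14Bayer_RGGB) // expected: true
       print(CVPixelBufferGetBytesPerRow(pb))                                        // expected: 2 * width
   }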

Any hint is highly appreciated.

As for the demosaicing, I want Core Image or Core Graphics to take care of the RGGB pixel-color interpolation.
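
If nothing else works, I could demosaic by hand. The following is a crude nearest-neighbor sketch of such a fallback (my own placeholder code, assuming 14-bit samples and even dimensions; it halves the resolution and does no real interpolation or white balance), but I would much prefer Core Image or Core Graphics to do this properly:

   import CoreGraphics

   // Sketch only: collapse every 2x2 RGGB quad into one RGB pixel and scale the
   // assumed 14-bit samples down to 8 bit. No proper interpolation, no white balance.
   func makeCGImage(fromRGGB raw: [UInt16], width: Int, height: Int) -> CGImage? {
       let outW = width / 2, outH = height / 2
       var rgba = [UInt8](repeating: 255, count: outW * outH * 4)    // alpha stays 255
       for y in 0..<outH {
           for x in 0..<outW {
               let r  = UInt32(raw[(2 * y)     * width + 2 * x])         // R
               let g1 = UInt32(raw[(2 * y)     * width + 2 * x + 1])     // G
               let g2 = UInt32(raw[(2 * y + 1) * width + 2 * x])         // G
               let b  = UInt32(raw[(2 * y + 1) * width + 2 * x + 1])     // B
               let i = (y * outW + x) * 4
               rgba[i]     = UInt8(min(255, r >> 6))                     // 14 bit -> 8 bit
               rgba[i + 1] = UInt8(min(255, ((g1 + g2) / 2) >> 6))
               rgba[i + 2] = UInt8(min(255, b >> 6))
           }
       }
       return rgba.withUnsafeMutableBytes { buf -> CGImage? in
           guard let ctx = CGContext(data: buf.baseAddress,
                                     width: outW, height: outH,
                                     bitsPerComponent: 8,
                                     bytesPerRow: outW * 4,
                                     space: CGColorSpaceCreateDeviceRGB(),
                                     bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
           else { return nil }
           return ctx.makeImage()
       }
   }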

There are 0 answers