Image from CMSampleBufferRef is always white


I am trying to get each frame from ReplayKit using startCaptureWithHandler.

startCaptureWithHandler delivers a CMSampleBufferRef, which I need to convert to an image.

I'm using this method to convert it to a UIImage, but the result is always white.

- (UIImage *) imageFromSampleBuffer3:(CMSampleBufferRef) sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey, [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options, &pxbuffer);

    CVPixelBufferLockFlags flags = (CVPixelBufferLockFlags)0;
    CVPixelBufferLockBaseAddress(pxbuffer, flags);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, width, height, 8, CVPixelBufferGetBytesPerRow(pxbuffer), rgbColorSpace, kCGImageAlphaPremultipliedFirst);

    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, flags);

    UIImage *image = [UIImage imageWithCGImage:quartzImage scale:1.0f orientation:UIImageOrientationRight];

    CGImageRelease(quartzImage);
    return image;
}

Can anyone tell me where I'm going wrong?

1 Answer

Answer from user1418067:

The sampleBuffer from ReplayKit is in 420f (bi-planar YCbCr) format, so it has two planes.
Lock the memory with CVPixelBufferLockBaseAddress(imageBuffer, 0).
To get the Y (luma) plane data, use CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0).
To get the CbCr (chroma) plane data, use CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 1).
Do not forget to unlock the memory afterwards. I am not sure how to convert YUV to RGB.
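
If it helps, here is a minimal sketch of reading the two planes as described above. It assumes the buffer really is 420f (kCVPixelFormatType_420YpCbCr8BiPlanarFullRange); the method name is just for illustration.

- (void)readPlanesFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    // Plane 0: luma (Y), one byte per pixel.
    uint8_t *yPlane = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
    size_t yBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0);
    size_t yHeight = CVPixelBufferGetHeightOfPlane(imageBuffer, 0);

    // Plane 1: interleaved chroma (CbCr), half the width and height of the Y plane.
    uint8_t *uvPlane = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 1);
    size_t uvBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 1);
    size_t uvHeight = CVPixelBufferGetHeightOfPlane(imageBuffer, 1);

    // ... read / convert the pixel data here ...

    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
}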

In your code, you never read any image data from imageBuffer. You only work on pxbuffer and pxdata, which contain no image data, so the rendered image is blank.
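
As for the YUV-to-RGB step: one way to avoid the manual conversion entirely is to let Core Image render the pixel buffer. A rough sketch (needs CoreImage, i.e. #import <CoreImage/CoreImage.h>; the method name is illustrative, and in a real app you would create the CIContext once and reuse it per frame):

- (UIImage *)imageFromSampleBufferWithCoreImage:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Core Image handles the 420f -> RGB conversion internally.
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
    CIContext *ciContext = [CIContext contextWithOptions:nil];

    CGRect rect = CGRectMake(0, 0,
                             CVPixelBufferGetWidth(imageBuffer),
                             CVPixelBufferGetHeight(imageBuffer));
    CGImageRef cgImage = [ciContext createCGImage:ciImage fromRect:rect];

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}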