Clear a CVPixelBufferRef using the GPU


I am writing some video processing code using AVComposition. To give only the necessary background: I receive a CVPixelBuffer from an Apple API that I do not control. This pixel buffer contains a previously rendered video frame, as the buffers are apparently recycled by that API. My goal is to set every pixel in the CVPixelBufferRef to [0, 0, 0, 0] (in the RGBA color space). I can do this on the CPU with this function:

- (void)processPixelBuffer: (CVImageBufferRef)pixelBuffer
{
    CVPixelBufferLockBaseAddress( pixelBuffer, 0 );

    size_t bufferWidth = CVPixelBufferGetWidth(pixelBuffer);
    size_t bufferHeight = CVPixelBufferGetHeight(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    unsigned char *base = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);

    for( size_t row = 0; row < bufferHeight; row++ ) {
        // Step by bytesPerRow: rows may be padded beyond bufferWidth * 4 bytes.
        unsigned char *pixel = base + row * bytesPerRow;
        for( size_t column = 0; column < bufferWidth; column++ ) {
            pixel[0] = 0;
            pixel[1] = 0;
            pixel[2] = 0;
            pixel[3] = 0;
            pixel += 4;
        }
    }
    CVPixelBufferUnlockBaseAddress( pixelBuffer, 0 );
}

Is there some way I can accomplish the same thing using the GPU? Additionally, is it possible to do this via Core Image? I don't know OpenGL, and it appears quite complicated to set up.


There are 4 answers

Tommy (3 votes)

Assuming your pixel buffer is backed by an IOSurface, you can use CVOpenGLESTextureCacheCreateTextureFromImage with a CVOpenGLESTextureCache to get a CVOpenGLESTextureRef. That will vend a texture target and name so that you can bind the texture in OpenGL.

In OpenGL you can use a framebuffer object to render to a texture. Having done that, you can use glClear to clear it. Call glFinish to create a synchronisation point with the GL pipeline, and the next time you inspect the buffer from the CPU its memory should be cleared.
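A minimal sketch of that approach, assuming an EAGLContext is already current on the calling thread, a single-plane 32BGRA IOSurface-backed buffer, and no error handling; the helper name ClearPixelBufferWithGLES is just for illustration, and in real code you would create the texture cache once and reuse it:

#import <CoreVideo/CoreVideo.h>
#import <CoreVideo/CVOpenGLESTextureCache.h>
#import <OpenGLES/EAGL.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>

static void ClearPixelBufferWithGLES(EAGLContext *eaglContext, CVPixelBufferRef pixelBuffer)
{
    // Texture cache that can wrap IOSurface-backed pixel buffers as GL textures.
    CVOpenGLESTextureCacheRef textureCache = NULL;
    CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, eaglContext, NULL, &textureCache);

    CVOpenGLESTextureRef texture = NULL;
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                 textureCache,
                                                 pixelBuffer,
                                                 NULL,
                                                 GL_TEXTURE_2D,
                                                 GL_RGBA,
                                                 (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
                                                 (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
                                                 GL_BGRA,
                                                 GL_UNSIGNED_BYTE,
                                                 0,
                                                 &texture);

    // Attach the texture to a framebuffer object and clear it to [0, 0, 0, 0].
    GLuint fbo = 0;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           CVOpenGLESTextureGetTarget(texture),
                           CVOpenGLESTextureGetName(texture), 0);

    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    // Synchronise with the GL pipeline before anything reads the buffer on the CPU.
    glFinish();

    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glDeleteFramebuffers(1, &fbo);
    CFRelease(texture);
    CFRelease(textureCache);
}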

user1118321 (3 votes)

From this page it looks like you can directly access the CVImageBufferRef as an OpenGL texture via:

glBindTexture( CVOpenGLTextureGetTarget( image ), CVOpenGLTextureGetName( image ) );

Once you have it as a texture, you can use it as the color attachment of an FBO and simply call glClear(GL_COLOR_BUFFER_BIT) on it.
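A rough sketch of that step, assuming a Mac OpenGL context is current, the framebuffer object entry points are available (core GL 3+ or ARB_framebuffer_object), and `image` is the CVOpenGLTextureRef from above:

GLuint fbo = 0;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       CVOpenGLTextureGetTarget(image),
                       CVOpenGLTextureGetName(image), 0);

glClearColor(0.0f, 0.0f, 0.0f, 0.0f);   // transparent black, i.e. [0, 0, 0, 0]
glClear(GL_COLOR_BUFFER_BIT);
glFinish();                              // make the result visible to the CPU

glBindFramebuffer(GL_FRAMEBUFFER, 0);
glDeleteFramebuffers(1, &fbo);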

Ian Bytchek (1 vote)

You can use Accelerate's vImageBufferFill_ARGB8888 to do exactly that (not on the GPU, though). This fills a BGRA buffer with opaque black in Swift 4:

import Accelerate
import CoreVideo

let width: Int = 3
let height: Int = 3

var pixelBuffer: CVPixelBuffer?
var imageBuffer: vImage_Buffer

// Scaffolding for the timing loop mentioned below (the loop itself is not shown).
var date: Date
let iterations: Int = 10000

let pixelBufferAttributes: [CFString: Any] = [kCVPixelBufferCGImageCompatibilityKey: true, kCVPixelBufferCGBitmapContextCompatibilityKey: true]
if CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_32BGRA, pixelBufferAttributes as CFDictionary, &pixelBuffer) != kCVReturnSuccess || pixelBuffer == nil { fatalError("Cannot create pixel buffer.") }
if CVPixelBufferLockBaseAddress(pixelBuffer!, []) != kCVReturnSuccess { fatalError("Cannot lock pixel buffer base address.") }

// Wrap the locked pixel buffer memory in a vImage_Buffer and fill it.
imageBuffer = vImage_Buffer(data: CVPixelBufferGetBaseAddress(pixelBuffer!), height: UInt(height), width: UInt(width), rowBytes: CVPixelBufferGetBytesPerRow(pixelBuffer!))

// The fill color is given in memory order; for a BGRA buffer [0, 0, 0, 0xFF] is opaque black.
vImageBufferFill_ARGB8888(&imageBuffer, [0, 0, 0, 0xFF], UInt32(kvImageNoFlags))

if CVPixelBufferUnlockBaseAddress(pixelBuffer!, []) != kCVReturnSuccess { fatalError("Cannot unlock pixel buffer base address.") }

In a playground, a plain memset filling the same small buffer with zeroes over 10,000 iterations was considerably faster. In a release build with real data the results will probably not differ that much.

  • accelerate: 1629.0
  • memset: 314.0
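
For reference, the memset variant being compared presumably looks something like this (a sketch in the question's Objective-C context; it assumes a single-plane buffer and that a zero fill is acceptable, since memset cannot write a per-channel pattern such as opaque black):

CVPixelBufferLockBaseAddress(pixelBuffer, 0);
// Zero the whole plane, including any per-row padding.
memset(CVPixelBufferGetBaseAddress(pixelBuffer), 0,
       CVPixelBufferGetBytesPerRow(pixelBuffer) * CVPixelBufferGetHeight(pixelBuffer));
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);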


bcattle (0 votes)

I have this same problem (forcing a recycled buffer supplied by -[AVVideoCompositionRenderContext newPixelBuffer] to be black).

What if you use Core Image to do the heavy lifting? Something like this:

EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
CIContext *ciContext = [CIContext contextWithEAGLContext:eaglContext];

// imageWithColor: produces an infinite-extent solid color; colorWithRed:green:blue:
// defaults to alpha 1.0, so this is opaque black (use alpha:0 for transparent black).
CIImage *image = [CIImage imageWithColor:[CIColor colorWithRed:0 green:0 blue:0]];
CVPixelBufferRef destination = [request.renderContext newPixelBuffer];
[ciContext render:image toCVPixelBuffer:destination];
[request finishWithComposedVideoFrame:destination];
CVBufferRelease(destination);

You would have to test the performance to know whether this is any better than memset, but for my money it is a better approach than trying to peer into OpenGL APIs just to set a few bits.