I am writing some video processing code using AVComposition. To give only the necessary background: I receive a CVPixelBuffer from an Apple API that I do not control. This CVPixelBuffer contains a previously rendered video frame, as the buffers are apparently recycled by that API. My goal is to set every pixel in the CVPixelBufferRef to [0, 0, 0, 0] (in RGBA color space). I can do this on the CPU via this function:
- (void)processPixelBuffer:(CVImageBufferRef)pixelBuffer
{
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    size_t bufferWidth  = CVPixelBufferGetWidth(pixelBuffer);
    size_t bufferHeight = CVPixelBufferGetHeight(pixelBuffer);
    // Rows may be padded, so step by bytes-per-row rather than width * 4.
    size_t bytesPerRow  = CVPixelBufferGetBytesPerRow(pixelBuffer);
    unsigned char *base = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);
    for (size_t row = 0; row < bufferHeight; row++) {
        unsigned char *pixel = base + row * bytesPerRow;
        for (size_t column = 0; column < bufferWidth; column++) {
            pixel[0] = 0;
            pixel[1] = 0;
            pixel[2] = 0;
            pixel[3] = 0;
            pixel += 4;
        }
    }
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}
Is there some way I can accomplish the same thing on the GPU? Additionally, is it possible to do this via Core Image? I don't know OpenGL, and it appears quite complicated to set up.

Assuming your pixel buffer is backed by an IOSurface, you can use CVOpenGLESTextureCacheCreateTextureFromImage with a CVOpenGLESTextureCache to get a CVOpenGLESTextureRef. That can vend a texture target and name so that you can bind the texture in OpenGL.

In OpenGL you can use a framebuffer object to render to that texture. Having done so, you can use glClear to clear it. Call glFinish to create a synchronisation point with the GL pipeline, and when you next inspect the buffer from the CPU, the memory should be cleared.
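The steps above might look roughly like this. This is an untested sketch, not a drop-in implementation: it assumes an EAGLContext is current on this thread, that you have already created a CVOpenGLESTextureCacheRef (here called textureCache), and that the buffer is BGRA and IOSurface-backed; all error checking is omitted.

```
CVOpenGLESTextureRef texture = NULL;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
    textureCache,
    pixelBuffer,
    NULL,                       // texture attributes
    GL_TEXTURE_2D,
    GL_RGBA,                    // internal format
    (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
    (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
    GL_BGRA,                    // format of the buffer's pixels
    GL_UNSIGNED_BYTE,
    0,                          // plane index
    &texture);

// Attach the texture to a framebuffer object so we can render into it.
GLuint fbo;
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       CVOpenGLESTextureGetTarget(texture),
                       CVOpenGLESTextureGetName(texture), 0);

// Clear to [0, 0, 0, 0].
glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
glClear(GL_COLOR_BUFFER_BIT);

// Synchronise with the GL pipeline before touching the memory on the CPU.
glFinish();

glBindFramebuffer(GL_FRAMEBUFFER, 0);
glDeleteFramebuffers(1, &fbo);
CFRelease(texture);
```

Keeping the texture cache and FBO around between frames (rather than recreating them per call, as shown here for brevity) avoids most of the per-frame overhead.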