Render a CVPixelBuffer to an NSView (macOS)


I have a CVPixelBuffer that I'm trying to efficiently draw on screen.

The inefficient way of turning it into an NSImage works, but it is very slow, dropping about 40% of my frames.

Therefore, I've tried rendering it on screen using CIContext's drawImage:inRect:fromRect:. The CIContext was initialized with an NSOpenGLContext whose view was set to my VC's view. When I have a new image, I call the drawImage method, which doesn't spit out any errors... but doesn't display anything on screen either (it did log errors when my contexts were not set up correctly).

I've tried to find an example of how this is done on macOS, but everything seems to be for iOS nowadays.

EDIT:

Here's some of the code I am using; I've left out irrelevant sections.

In viewDidLoad I initialize the GL and CI contexts:

NSOpenGLPixelFormatAttribute pixelFormatAttr[] = {
  kCGLPFAAllRenderers, 0
};
NSOpenGLPixelFormat *glPixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes: pixelFormatAttr];
NSOpenGLContext *glContext = [[NSOpenGLContext alloc] initWithFormat:glPixelFormat shareContext:nil];
glContext.view = self.view;

self.ciContext = [CIContext contextWithCGLContext:glContext.CGLContextObj pixelFormat:glPixelFormat.CGLPixelFormatObj colorSpace:nil options:nil];

Then, when a new frame is ready, I do:

dispatch_async(dispatch_get_main_queue(), ^{
    [vc.ciContext drawImage:ciImage inRect:vc.view.bounds fromRect:ciImage.extent];
    vc.isRendering = NO;
});

I am not sure I'm calling draw in the right place, but I can't seem to find out where this is supposed to go.
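(Editorial aside: with a CGL-backed CIContext, a draw typically only becomes visible once the GL context is current and its back buffer is flushed. A minimal sketch of that pattern, assuming the glContext created in viewDidLoad is kept in a hypothetical glContext property on the VC:)

    dispatch_async(dispatch_get_main_queue(), ^{
        // Make the GL context current before issuing CI draw calls.
        [vc.glContext makeCurrentContext];
        [vc.ciContext drawImage:ciImage inRect:vc.view.bounds fromRect:ciImage.extent];
        // Present the rendered back buffer on screen.
        [vc.glContext flushBuffer];
        vc.isRendering = NO;
    });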


1 Answer

Answered by fumoboy007:

If the CVPixelBuffer has the kCVPixelBufferIOSurfaceCoreAnimationCompatibilityKey attribute, the backing IOSurface (retrieved via CVPixelBufferGetIOSurface) can be passed directly to the contents property of a CALayer.

This is probably the most efficient way to display a CVPixelBuffer.
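(A sketch of that approach, assuming a layer-backed view with a hypothetical videoLayer sublayer, i.e. view.wantsLayer = YES; the pixel-buffer dimensions here are placeholders:)

    #import <QuartzCore/QuartzCore.h>
    #import <CoreVideo/CoreVideo.h>

    // 1. Create pixel buffers with the IOSurface Core Animation
    //    compatibility attribute so they are backed by an IOSurface
    //    that CALayer can display directly.
    NSDictionary *attrs = @{
        (id)kCVPixelBufferIOSurfaceCoreAnimationCompatibilityKey : @YES
    };
    size_t width = 1280, height = 720; // placeholder dimensions
    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32BGRA,
                        (__bridge CFDictionaryRef)attrs, &pixelBuffer);

    // 2. When a frame is ready, hand its backing IOSurface to the layer.
    IOSurfaceRef surface = CVPixelBufferGetIOSurface(pixelBuffer);
    if (surface) {
        [CATransaction begin];
        [CATransaction setDisableActions:YES]; // suppress implicit animations
        videoLayer.contents = (__bridge id)surface;
        [CATransaction commit];
    }

This avoids any pixel copies or GL round-trips: Core Animation composites the IOSurface directly.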