How to create a CGImageRef from a NSBitmapImageRep?


How can I create a CGImageRef from an NSBitmapImageRep? Or how can I define a completely new CGImageRef in the same way as the NSBitmapImageRep below? Creating the NSBitmapImageRep works fine, but I need the image as a CGImageRef.

unsigned char *plane = (unsigned char *)[data bytes]; // data = 3 bytes for each RGB pixel

NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc]
                              initWithBitmapDataPlanes: &plane
                                             pixelsWide: width
                                             pixelsHigh: height
                                          bitsPerSample: depth
                                        samplesPerPixel: channel
                                               hasAlpha: NO
                                               isPlanar: NO
                                         colorSpaceName: NSCalibratedRGBColorSpace
                                          //bitmapFormat: NSAlphaFirstBitmapFormat
                                            bytesPerRow: channel * width
                                           bitsPerPixel: channel * depth
                             ];

I have no idea how to create the CGImageRef from the NSBitmapImageRep or how to define a new CGImageRef:

CGImageRef imageRef = CGImageCreate(width, height, depth, channel*depth, channel*width, CGColorSpaceCreateDeviceRGB(), ... );

Please, can somebody give me a hint?


There are 2 answers

justin (best answer):

The easy way is to use the CGImage property (introduced in OS X 10.5):

CGImageRef image = imageRep.CGImage;

Documentation:

https://developer.apple.com/library/mac/documentation/Cocoa/Reference/ApplicationKit/Classes/NSBitmapImageRep_Class/index.html#//apple_ref/occ/instm/NSBitmapImageRep/CGImage

Return Value

Returns an autoreleased CGImageRef opaque type based on the receiver’s current bitmap data.

Discussion

The returned CGImageRef has pixel dimensions that are identical to the receiver’s. This method might return a preexisting CGImageRef opaque type or create a new one. If the receiver is later modified, subsequent invocations of this method might return different CGImageRef opaque types.
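Putting the two pieces together, a minimal sketch of the full flow might look like the following. The CGImageRetain/CGImageRelease pair is an assumption about the image needing to outlive the image rep and the current autorelease pool; skip it if you only use the image immediately:

CGImageRef cgImage = [imageRep CGImage]; // lifetime tied to imageRep / the autorelease pool

// Assumption: the CGImageRef must outlive imageRep, so take our own reference.
CGImageRetain(cgImage);

// ... use cgImage, e.g. draw it into a CGContextRef ...

CGImageRelease(cgImage);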

Ken Thomases:

From your code snippet, it seems you're starting with an NSData object. So, your question seems to be how to create a CGImage from a data object. In that case, there's no reason to go through NSBitmapImageRep.

You were almost there with the call to CGImageCreate(). You just needed to figure out how to supply a CGDataProvider to it. You can create a CGDataProvider from an NSData pretty directly, once you realize that NSData is toll-free bridged with CFData. So:

CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);
CGColorSpaceRef colorspace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
// width, height, bits per component, bits per pixel, bytes per row
// (depth = bitsPerSample and channel = samplesPerPixel from your code; bytesPerRow assumes depth == 8)
CGImageRef image = CGImageCreate(width, height, depth, channel * depth, channel * width,
                                 colorspace, kCGImageAlphaNone, provider, NULL, TRUE,
                                 kCGRenderingIntentDefault);
CGDataProviderRelease(provider);
CGColorSpaceRelease(colorspace);
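
For completeness, here is a hedged sketch of one way the resulting CGImageRef might be used and released afterwards; wrapping it in an NSImage is just one option and is not part of the answer above:

// Assumption: an AppKit NSImage is wanted for display;
// -initWithCGImage:size: requires OS X 10.6 or later.
NSImage *nsImage = [[NSImage alloc] initWithCGImage:image size:NSMakeSize(width, height)];

// ... use nsImage, e.g. set it on an NSImageView ...

// CGImageCreate() returned an owned reference, so release it when done.
CGImageRelease(image);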