iOS SDK: render image from unsigned char array, CGBitmapContextCreate returns NULL


I have an unsigned char* data with the following values, as seen while debugging in Xcode:

\xc1\xc1\xc1\xc1\xc1\xc1\xc1\xc0\x84\x03\x03\x02\x03\x02\x03\x03\x03\x02\x02\x03\x02\x03\x02\x02\x03\x02\x02\x02\x02\x03\x03\x03\x03\x03\x02\x02\x03\x03\x03\x02\x02\x03\x02\x02\x02\x02\x03\x03\x03\x02\x02\x02\x02\x03\x02\x03\x02\x03\x02\x03\x02\x03\x02\x03\x02\x02\x02\x03\x03\x03\x03\x03\x02\x03\x02\x03\x02\x03\x03\x02\x02\x03\x03\x03\x02\x02\x02\x02\x02\x03\x02\x02\x03\x02\x03\x02\x02\x03\x03\x03

This is the data array for a QR code.

Symbol data is represented as an array containing width*width uchars. Each uchar represents a module (dot). If the least significant bit of the uchar is 1, the corresponding module is black.

In the above case the width is 177.
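
Since only the least significant bit of each uchar matters, the module array has to be expanded into real pixel values before Quartz can draw it. A minimal sketch of that expansion (the function and buffer names here are made up for illustration):

#include <stdlib.h>

// Expand the QR module array into 8-bit grayscale pixels:
// a set least significant bit means black (0x00), otherwise white (0xFF).
unsigned char *pixelsFromModules(const unsigned char *modules, size_t width) {
    unsigned char *pixels = malloc(width * width);
    if (pixels == NULL) return NULL;
    for (size_t i = 0; i < width * width; i++) {
        pixels[i] = (modules[i] & 0x01) ? 0x00 : 0xFF;
    }
    return pixels; // caller must free()
}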

I have tried various combinations of parameters to CGBitmapContextCreate, but it always returns NULL.
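
For reference, CGBitmapContextCreate returns NULL (and logs an error to the console) when the combination of bits per component, bytes per row, color space, and alpha info is not one of the pixel formats Quartz supports. One combination that is documented as supported for an 8-bit, one-byte-per-pixel buffer is device gray with no alpha; a minimal sketch, assuming pixels holds width*width grayscale bytes (one per module, as above):

CGColorSpaceRef gray = CGColorSpaceCreateDeviceGray();
CGContextRef ctx = CGBitmapContextCreate(pixels,
                                         width,    // pixels wide
                                         width,    // pixels high
                                         8,        // bits per component
                                         width,    // bytes per row (1 byte per pixel)
                                         gray,
                                         kCGImageAlphaNone);
CGColorSpaceRelease(gray);
if (ctx == NULL) {
    // The parameter combination is still unsupported.
}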

Please advise.

Your help will be much appreciated.


1 Answer

Kaj (best answer)

I have successfully used:

CGImageRef refImage = [myUIImage CGImage];
size_t w = CGImageGetWidth(refImage);           // CGImageGet* return size_t, not int
size_t h = CGImageGetHeight(refImage);
size_t bytesPerRow = CGImageGetBytesPerRow(refImage);

// rawData is the unsigned char pixel buffer that backs the context
unsigned char *rawData = malloc(h * bytesPerRow);

CGContextRef ctx = CGBitmapContextCreate(rawData,
                                         w,
                                         h,
                                         8,            // bits per component
                                         bytesPerRow,
                                         CGImageGetColorSpace(refImage),
                                         kCGImageAlphaPremultipliedLast);

// Fill in rawData, which is the unsigned char pixel array

refImage = CGBitmapContextCreateImage(ctx);
UIImage *myUIImage2 = [UIImage imageWithCGImage:refImage];

CGContextRelease(ctx);
CGImageRelease(refImage);
free(rawData);
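
For the QR case in the question there is no source UIImage to copy the parameters from, so the same approach can be driven directly from the module data. A hedged end-to-end sketch (qrData and width, 177 above, are the asker's inputs; the RGBA layout matches the kCGImageAlphaPremultipliedLast format used in the answer):

size_t bytesPerRow = width * 4;                         // 4 bytes per RGBA pixel
unsigned char *rawData = malloc(width * bytesPerRow);
for (size_t i = 0; i < width * width; i++) {
    unsigned char v = (qrData[i] & 0x01) ? 0x00 : 0xFF; // LSB set -> black module
    rawData[i * 4 + 0] = v;                             // R
    rawData[i * 4 + 1] = v;                             // G
    rawData[i * 4 + 2] = v;                             // B
    rawData[i * 4 + 3] = 0xFF;                          // A (opaque, so premultiplying is a no-op)
}

CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(rawData, width, width, 8,
                                         bytesPerRow, rgb,
                                         kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(rgb);

CGImageRef cgImage = CGBitmapContextCreateImage(ctx);
UIImage *qrImage = [UIImage imageWithCGImage:cgImage];  // 177x177; scale up for display
CGImageRelease(cgImage);
CGContextRelease(ctx);
free(rawData);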