NSImage and UIImage give different NSData representations


Scenario: I have an image in the iPhone camera roll. I access it via ALAssetsLibrary and get an ALAsset object. I then get a UIImage and an NSData object from it using something like the following code.

ALAssetRepresentation *rep = [myasset defaultRepresentation];
CGImageRef iref = [rep fullResolutionImage];
if (iref)
{
    // Wrap the CGImage in a UIImage and re-encode it as JPEG at maximum quality.
    UIImage *largeimage = [UIImage imageWithCGImage:iref];
    NSData *data = UIImageJPEGRepresentation(largeimage, 1.0f);
}

I then copy the image from the camera roll to my Mac using Image Capture. In my Mac code I open the copied file with NSImage and try to get an NSData representation using the following code.

NSImage *image = [[NSImage alloc] initWithContentsOfURL:fileURL];
// Take the first representation (a bitmap, for a JPEG file) and re-encode it as JPEG.
NSBitmapImageRep *imgRep = (NSBitmapImageRep *)[[image representations] objectAtIndex:0];
NSData *data = [imgRep representationUsingType:NSJPEGFileType properties:nil];

Problem: Unfortunately, the two NSData representations I get are very different. I want to get the same NSData representation in both cases (since it is the same file). I can then hash the NSData objects and compare the hashes to conclude that the two are (possibly) the same image. Ideally I would want functions like the following:

//In iOS
-(NSData *) getDataFromALAsset:(ALAsset*)asset;
//or 
-(NSData *) getDataFromUIImage:(UIImage*)image;

//In OS X
-(NSData *) getDataFromFileAtURL:(NSURL*)url;
//or 
-(NSData *) getDataFromNSImage:(NSImage*)image;

such that the NSData representations I get on iOS and OS X are byte-for-byte identical, given that they come from the same source image.
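
For reference, this is the kind of comparison I have in mind once the bytes match: hash each NSData and compare the hex digests. A minimal sketch using CommonCrypto's CC_SHA1, which is available on both platforms (the helper name is my own):

#import <CommonCrypto/CommonDigest.h>

// Hypothetical helper: hex-encoded SHA-1 digest of an NSData's bytes.
static NSString *SHA1HexOfData(NSData *data)
{
    unsigned char digest[CC_SHA1_DIGEST_LENGTH];
    CC_SHA1([data bytes], (CC_LONG)[data length], digest);

    NSMutableString *hex = [NSMutableString stringWithCapacity:CC_SHA1_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_SHA1_DIGEST_LENGTH; i++) {
        [hex appendFormat:@"%02x", digest[i]];
    }
    return hex;
}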

What I have tried:

I have played around with how I get the UIImage object from the ALAsset object, tried UIImagePNGRepresentation (and the corresponding PNG representation in OS X), and tried different parameters for getting the representation in OS X, but nothing has worked. I have also tried to create a CGImageRef on both platforms, convert it to a bitmap, and read the pixels one by one, and even those seem to be off (and yes, I do realise that NSBitmapImageRep has a different coordinate system).

1 Answer

Answered by Tayyab (accepted):

I did eventually find a way to do what I wanted. ALAssetRepresentation's getBytes:fromOffset:length:error: method can be used to get an NSData object that is identical to what [NSData dataWithContentsOfURL:fileURL] gives you in OS X. Note that going through the UIImage is not possible, since UIImage performs some processing (decoding and re-encoding) on the image. Here is what the requested functions would look like.

//In iOS
-(NSData *) getDataFromALAsset:(ALAsset*)asset {
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    // Read the raw file bytes straight out of the asset, bypassing UIImage.
    Byte *buffer = (Byte *)malloc((size_t)rep.size);
    NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:(NSUInteger)rep.size error:nil];
    // NSData takes ownership of the buffer and frees it when deallocated.
    NSData *assetData = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
    return assetData;
}

//In OS X
-(NSData *) getDataFromFileAtURL:(NSURL*)url
{
    return [NSData dataWithContentsOfURL:url];
}
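
With these, the comparison from the question should work as expected; a sketch using the SHA1HexOfData helper sketched above (variable names are mine):

// On iOS
NSData *assetData = [self getDataFromALAsset:myasset];
NSString *assetHash = SHA1HexOfData(assetData);

// On OS X
NSData *fileData = [self getDataFromFileAtURL:fileURL];
NSString *fileHash = SHA1HexOfData(fileData);

// The two hex digests should now match, since both NSData objects
// contain the original, untouched bytes of the same file.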