The problem isn't getting the pixel data; I was able to find some source code for that:
-(NSArray *)getRGBAtLocationOnImage:(UIImage *)theImage X:(int)x Y:(int)y
{
    // Draw the image into an RGBA8888 byte buffer.
    CGImageRef image = [theImage CGImage];
    NSUInteger width = CGImageGetWidth(image);
    NSUInteger height = CGImageGetHeight(image);
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * width;
    NSUInteger bitsPerComponent = 8;
    unsigned char *rawData = calloc(height * width, bytesPerPixel); // zero-initialized
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(rawData, width, height, bitsPerComponent, bytesPerRow, colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);
    CGContextRelease(context);

    // rawData now holds the pixels row by row; each pixel is 4 bytes (R, G, B, A).
    NSUInteger byteIndex = (bytesPerRow * y) + x * bytesPerPixel;
    int red   = rawData[byteIndex];
    int green = rawData[byteIndex + 1];
    int blue  = rawData[byteIndex + 2];
    //int alpha = rawData[byteIndex + 3];
    NSLog(@"Red: %d Green: %d Blue: %d", red, green, blue);

    NSArray *result = @[@(red), @(green), @(blue)];
    free(rawData);
    return result;
}
The problem is finding the location of the pixels I want. I have no idea how to figure out where the pixels I'm looking for are located. What is a way of figuring that out?
I'm not sure I understand your issue, but...
Take a look at your method: it takes an x and a y and returns an array containing the RGB data of the point (x, y) you passed in.
Suppose you have a 100×100-pixel image: you would have to call your method 10,000 times (once per pixel) to check every pixel in the image.
In that case, you can try something like this:
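A minimal sketch of such a full scan, assuming width and height are the image's pixel dimensions, theImage is your UIImage, and the method above lives on self (the target color here is just an example):

```objectivec
for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
        NSArray *rgb = [self getRGBAtLocationOnImage:theImage X:x Y:y];
        // Compare against whatever color you are searching for,
        // e.g. pure red (255, 0, 0):
        if ([rgb[0] intValue] == 255 &&
            [rgb[1] intValue] == 0 &&
            [rgb[2] intValue] == 0) {
            NSLog(@"Found a matching pixel at (%d, %d)", x, y);
        }
    }
}
```

Be aware, though, that calling the method once per pixel redraws the entire bitmap on every call; if you need to scan the whole image, it is far cheaper to build the rawData buffer once and index into it directly inside the loop.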