In one class I have an array of five UIImages, such as triangle.png and circle.png.
In the current class I have a list of colors. After tapping a UIImage and then tapping a color, it should be possible to change the color of the tapped UIImage. This is done by masking the image, filling it with the selected color, and replacing the old UIImage with the recolored one.
But something is wrong in the method that should change the color:
- (void)changeColor:(UITapGestureRecognizer *)gestureRecognizer {
    UIGraphicsBeginImageContext(test.newView.image.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGPoint loc = [gestureRecognizer locationInView:self.view];
    CGContextSetFillColorWithColor(context, [[self colorOfPoint:loc] CGColor]);
    CGContextTranslateCTM(context, 0, test.newView.image.size.height);
    CGContextScaleCTM(context, 1.0, -1.0);
    CGRect rect = CGRectMake(test.newView.frame.origin.x, test.newView.frame.origin.x,
                             test.newView.image.size.width, test.newView.image.size.height);
    CGContextSetBlendMode(context, kCGBlendModeColorBurn);
    CGContextClipToMask(context, rect, test.newView.image.CGImage);
    CGContextAddRect(context, rect);
    CGContextDrawPath(context, kCGPathFill);
    CGImageRef imgRef = CGBitmapContextCreateImage(context);
    UIImage *img = [UIImage imageWithCGImage:imgRef];
    CGImageRelease(imgRef);
    CGContextRelease(context);
    UIGraphicsEndImageContext();
    test.newView.image = img;
}
The only thing that happens is that the tapped UIImage becomes opaque.
The UIImageView holding the UIImage is not removed in the process.
I'm not sure that grabbing an image with a CGBitmapContextCreateImage() call is appropriate inside a UIGraphics image context. Every example I've ever seen uses

    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();

to grab the image before closing the image context. You might try that.
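As an untested sketch of how the whole method could look with that change (assuming, as in the question, that test.newView is a UIImageView and colorOfPoint: returns a UIColor): besides grabbing the image via UIGraphicsGetImageFromCurrentImageContext(), the fill rect should start at (0, 0), since the context is only as large as the image, and the context returned by UIGraphicsGetCurrentContext() must not be released by the caller, because it is owned by the UIGraphics stack.

```objc
- (void)changeColor:(UITapGestureRecognizer *)gestureRecognizer {
    UIImage *original = test.newView.image;
    UIGraphicsBeginImageContextWithOptions(original.size, NO, original.scale);
    CGContextRef context = UIGraphicsGetCurrentContext();

    CGPoint loc = [gestureRecognizer locationInView:self.view];
    CGContextSetFillColorWithColor(context, [[self colorOfPoint:loc] CGColor]);

    // Flip the coordinate system so the CGImage mask is not applied upside down.
    CGContextTranslateCTM(context, 0, original.size.height);
    CGContextScaleCTM(context, 1.0, -1.0);

    // The context is exactly the size of the image, so draw at the origin,
    // not at the image view's frame origin.
    CGRect rect = CGRectMake(0, 0, original.size.width, original.size.height);
    CGContextSetBlendMode(context, kCGBlendModeColorBurn); // as in the question; adjust or remove to taste
    CGContextClipToMask(context, rect, original.CGImage);
    CGContextFillRect(context, rect);

    // Grab the result BEFORE closing the context. Do not call
    // CGContextRelease() here -- we did not create this context.
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    test.newView.image = img;
}
```

UIGraphicsBeginImageContextWithOptions() is used instead of UIGraphicsBeginImageContext() so the result keeps the original image's scale on Retina screens; the plain variant would also work, at 1x.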