CIGaussianBlur image size


I want to blur my view, and I use this code:

//Get a UIImage from the UIView
NSLog(@"blur capture");
UIGraphicsBeginImageContext(BlurContrainerView.frame.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

//Blur the UIImage
CIImage *imageToBlur = [CIImage imageWithCGImage:viewImage.CGImage];
CIFilter *gaussianBlurFilter = [CIFilter filterWithName: @"CIGaussianBlur"];
[gaussianBlurFilter setValue:imageToBlur forKey: @"inputImage"];
[gaussianBlurFilter setValue:[NSNumber numberWithFloat: 5] forKey: @"inputRadius"]; //change number to increase/decrease blur
CIImage *resultImage = [gaussianBlurFilter valueForKey: @"outputImage"];

//create UIImage from filtered image
blurredImage = [[UIImage alloc] initWithCIImage:resultImage];

//Place the UIImage in a UIImageView
UIImageView *newView = [[UIImageView alloc] initWithFrame:self.view.bounds];
newView.image = blurredImage;

NSLog(@"%f,%f",newView.frame.size.width,newView.frame.size.height);
//insert blur UIImageView below transparent view inside the blur image container
[BlurContrainerView insertSubview:newView belowSubview:transparentView];

And it blurs the view, but not all of it. How can I blur all of the view?

Screenshot: postimg.org/image/9bee2e4zx/


There are 3 answers

Rob (best answer)

The issue isn't that it's not blurring all of the image, but rather that the blur extends past the boundary of the original image, making the resulting image larger, so it doesn't line up properly.

To keep the image the same size, after the line:

CIImage *resultImage    = [gaussianBlurFilter valueForKey: @"outputImage"];

You can grab the CGRect for a rectangle the size of the original image in the center of this resultImage:

// note, adjust rect because blur changed size of image

CGRect rect             = [resultImage extent];
rect.origin.x          += (rect.size.width  - viewImage.size.width ) / 2;
rect.origin.y          += (rect.size.height - viewImage.size.height) / 2;
rect.size               = viewImage.size;

And then use CIContext to grab that portion of the image:

CIContext *context      = [CIContext contextWithOptions:nil];
CGImageRef cgimg        = [context createCGImage:resultImage fromRect:rect];
UIImage   *blurredImage = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg);

Alternatively, on iOS 7, you can download iOS_UIImageEffects.zip from Apple's UIImageEffects sample code and grab the UIImage+ImageEffects category. That category provides a few new methods:

- (UIImage *)applyLightEffect;
- (UIImage *)applyExtraLightEffect;
- (UIImage *)applyDarkEffect;
- (UIImage *)applyTintEffectWithColor:(UIColor *)tintColor;
- (UIImage *)applyBlurWithRadius:(CGFloat)blurRadius tintColor:(UIColor *)tintColor saturationDeltaFactor:(CGFloat)saturationDeltaFactor maskImage:(UIImage *)maskImage;

So, to blur an image and lighten it (giving that "frosted glass" effect), you can then do:

UIImage *newImage = [image applyLightEffect];

Interestingly, Apple's code does not employ CIFilter, but rather calls vImageBoxConvolve_ARGB8888 of the vImage high-performance image processing framework. This technique is illustrated in WWDC 2013 video Implementing Engaging UI on iOS.
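To make the vImage approach concrete, here is a minimal sketch of a box-convolve blur (this is my own illustration, not Apple's actual UIImage+ImageEffects code, which also repeats the convolve and mixes in tint/saturation; error handling is omitted, and the kernel size must be odd):

```objc
#import <Accelerate/Accelerate.h>
#import <UIKit/UIKit.h>

static UIImage *BoxBlurredImage(UIImage *image, uint32_t kernelSize) {
    vImage_Buffer src, dest;
    vImage_CGImageFormat format = {
        .bitsPerComponent = 8,
        .bitsPerPixel     = 32,
        .colorSpace       = NULL, // defaults to sRGB
        .bitmapInfo       = kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little,
    };

    // Decode the CGImage into a vImage buffer, and allocate a matching destination
    vImageBuffer_InitWithCGImage(&src, &format, NULL, image.CGImage, kvImageNoFlags);
    vImageBuffer_Init(&dest, src.height, src.width, 32, kvImageNoFlags);

    // The blur itself: edge-extend so the result stays the same size
    vImageBoxConvolve_ARGB8888(&src, &dest, NULL, 0, 0,
                               kernelSize, kernelSize, NULL, kvImageEdgeExtend);

    CGImageRef blurredRef = vImageCreateCGImageFromBuffer(&dest, &format, NULL, NULL,
                                                          kvImageNoFlags, NULL);
    UIImage *blurred = [UIImage imageWithCGImage:blurredRef];
    CGImageRelease(blurredRef);
    free(src.data);
    free(dest.data);
    return blurred;
}
```

Note that `kvImageEdgeExtend` sidesteps the size-growth problem entirely, since vImage clamps edge pixels rather than blurring into transparency.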

Noah Witherspoon

Looks like the blur filter is giving you back an image that’s bigger than the one you started with, which makes sense since pixels at the edges are getting blurred out past them. The easiest solution would probably be to make newView use a contentMode of UIViewContentModeCenter so it doesn’t try to squash the blurred image down; you could also crop blurredImage by drawing it in the center of a new context of the appropriate size, but you don’t really need to.
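Using the `newView` from the question, that suggestion amounts to two lines (`clipsToBounds` is my addition so the oversized image doesn't spill past the view's frame):

```objc
UIImageView *newView = [[UIImageView alloc] initWithFrame:self.view.bounds];
newView.contentMode  = UIViewContentModeCenter; // don't scale the oversized blur
newView.clipsToBounds = YES;                    // crop the excess at the edges
newView.image = blurredImage;
```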

Cœur

A faster solution is to avoid CGImageRef altogether and perform all transformations at the lazy CIImage level.

So, instead of this, which produces the wrong size:

// create UIImage from filtered image (but size is wrong)
blurredImage = [[UIImage alloc] initWithCIImage:resultImage];

A nice solution is to write:

Objective-C

// cropping rect because blur changed size of image
CIImage *croppedImage = [resultImage imageByCroppingToRect:imageToBlur.extent];
// create UIImage from filtered cropped image
blurredImage = [[UIImage alloc] initWithCIImage:croppedImage];

Swift 3

// cropping rect because blur changed size of image
let croppedImage = resultImage.cropping(to: imageToBlur.extent)
// create UIImage from filtered cropped image
let blurredImage = UIImage(ciImage: croppedImage)

Swift 4

// cropping rect because blur changed size of image
let croppedImage = resultImage.cropped(to: imageToBlur.extent)
// create UIImage from filtered cropped image
let blurredImage = UIImage(ciImage: croppedImage)
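A related trick (my own suggestion, not part of the answer above) is to clamp the input before blurring, so the blur samples edge pixels instead of transparency, and then crop back to the original extent; this avoids the soft transparent fringe at the edges as well as the size change (requires iOS 8+ for `imageByClampingToExtent`):

```objc
CIImage *clamped = [imageToBlur imageByClampingToExtent];
[gaussianBlurFilter setValue:clamped forKey:kCIInputImageKey];
CIImage *resultImage = [[gaussianBlurFilter valueForKey:@"outputImage"]
                        imageByCroppingToRect:imageToBlur.extent];
```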