Using GaussianBlur on image in viewDidLoad blocks UI


I'm creating a blur effect using the function below, called from viewDidLoad of a view controller:

func applyBlurEffect(image: UIImage) {
    let imageToBlur = CIImage(image: image)!
    let blurfilter = CIFilter(name: "CIGaussianBlur")!
    blurfilter.setValue(10, forKey: kCIInputRadiusKey)
    blurfilter.setValue(imageToBlur, forKey: "inputImage")
    let resultImage = blurfilter.value(forKey: "outputImage") as! CIImage
    let croppedImage = resultImage.cropping(to: CGRect(x: 0, y: 0, width: imageToBlur.extent.size.width, height: imageToBlur.extent.size.height))
    let context = CIContext(options: nil)
    let blurredImage = UIImage(cgImage: context.createCGImage(croppedImage, from: croppedImage.extent)!)
    self.backImage.image = blurredImage
}

But this code blocks the UI, and the view controller opens after a 3-4 second lag. I don't want to present the UI without the blur effect, but I also don't want the user to wait 3-4 seconds for the view controller to open. Please suggest an optimal solution to this problem.


There are 4 answers

ubiAle

Can you present the view controller with the original image, perform the blur on a background thread, and use a nice effect to swap in the blurred image once it's ready?
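
A minimal sketch of that idea, assuming the backImage outlet from the question and a hypothetical blurredCopy(of:) helper that wraps the question's Core Image code but returns the image instead of assigning it:

    func showBlurredWhenReady(original: UIImage) {
        backImage.image = original // present the unblurred image immediately
        DispatchQueue.global(qos: .userInitiated).async {
            // blurredCopy(of:) is a hypothetical helper, not part of the question.
            guard let blurred = self.blurredCopy(of: original) else { return }
            DispatchQueue.main.async {
                // Cross-fade so the blurred version doesn't pop in abruptly.
                UIView.transition(with: self.backImage,
                                  duration: 0.3,
                                  options: .transitionCrossDissolve,
                                  animations: { self.backImage.image = blurred })
            }
        }
    }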

Also, maybe you could use a UIVisualEffectView and see if performance is better?

A while ago Apple also released a sample, UIImageEffects, that performs a blur. It's written in Obj-C, but you could easily use it from Swift: https://developer.apple.com/library/content/samplecode/UIImageEffects/Listings/UIImageEffects_UIImageEffects_h.html
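
For reference, a hedged sketch of calling that category from Swift, assuming the UIImageEffects header is exposed via a bridging header (the exact bridged method name may vary by Swift version):

    // Parameter values are illustrative; the call is assumed from the Obj-C method
    // imageByApplyingBlurToImage:withRadius:tintColor:saturationDeltaFactor:maskImage:.
    let blurred = UIImageEffects.imageByApplyingBlur(to: sourceImage,
                                                     withRadius: 10,
                                                     tintColor: nil,
                                                     saturationDeltaFactor: 1.0,
                                                     maskImage: nil)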

Warif Akhand Rishi

Core Image Programming Guide

Performance Best Practices

Follow these practices for best performance:

  • Don’t create a CIContext object every time you render. Contexts store a lot of state information; it’s more efficient to reuse them. (A sketch of this follows the list.)
  • Evaluate whether your app needs color management. Don’t use it unless you need it. See Does Your App Need Color Management?
  • Avoid Core Animation animations while rendering CIImage objects with a GPU context. If you need to use both simultaneously, you can set up both to use the CPU.
  • Make sure images don’t exceed CPU and GPU limits. Image size limits for CIContext objects differ depending on whether Core Image uses the CPU or GPU. Check the limit by using the methods inputImageMaximumSize and outputImageMaximumSize.
  • Use smaller images when possible. Performance scales with the number of output pixels. You can have Core Image render into a smaller view, texture, or framebuffer. Allow Core Animation to upscale to display size.
  • Use Core Graphics or Image I/O functions to crop or downsample, such as the functions CGImageCreateWithImageInRect or CGImageSourceCreateThumbnailAtIndex.
  • The UIImageView class works best with static images. If your app needs to get the best performance, use lower-level APIs.
  • Avoid unnecessary texture transfers between the CPU and GPU. Render to a rectangle that is the same size as the source image before applying a contents scale factor.
  • Consider using simpler filters that can produce results similar to algorithmic filters. For example, CIColorCube can produce output similar to CISepiaTone, and do so more efficiently.
  • Take advantage of the support for YUV images in iOS 6.0 and later. Camera pixel buffers are natively YUV but most image processing algorithms expect RGBA data. There is a cost to converting between the two. Core Image supports reading YUV from CVPixelBuffer objects and applying the appropriate color transform.
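
As a sketch of the first practice (reusing the CIContext), a small helper along these lines creates the context once and reuses it across renders; the Blurrer type and its API are illustrative, not from the guide:

    final class Blurrer {
        // Creating a CIContext is expensive; keep one and reuse it for every render.
        private let context = CIContext(options: nil)

        func blur(_ image: UIImage, radius: Double = 10) -> UIImage? {
            guard let input = CIImage(image: image),
                  let filter = CIFilter(name: "CIGaussianBlur") else { return nil }
            filter.setValue(input, forKey: kCIInputImageKey)
            filter.setValue(radius, forKey: kCIInputRadiusKey)
            guard let output = filter.outputImage else { return nil }
            // Crop back to the source extent; a Gaussian blur grows the image's extent.
            let cropped = output.cropping(to: input.extent)
            guard let cgImage = context.createCGImage(cropped, from: cropped.extent) else { return nil }
            return UIImage(cgImage: cgImage)
        }
    }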

Also have a look at Brad Larson's GPUImage; you might want to use it. See this answer: https://stackoverflow.com/a/12336118/1378447

Vincent Joy

Make use of dispatch queues. This one worked for me:

func applyBlurEffect(image: UIImage) {
    DispatchQueue.global(qos: .userInitiated).async {
        let imageToBlur = CIImage(image: image)!
        let blurfilter = CIFilter(name: "CIGaussianBlur")!
        blurfilter.setValue(10, forKey: kCIInputRadiusKey)
        blurfilter.setValue(imageToBlur, forKey: "inputImage")
        let resultImage = blurfilter.value(forKey: "outputImage") as! CIImage
        let croppedImage = resultImage.cropping(to: CGRect(x: 0, y: 0, width: imageToBlur.extent.size.width, height: imageToBlur.extent.size.height))
        let context = CIContext(options: nil)
        let blurredImage = UIImage(cgImage: context.createCGImage(croppedImage, from: croppedImage.extent)!)

        // UIKit may only be touched from the main thread.
        DispatchQueue.main.async {
            self.backImage.image = blurredImage
        }
    }
}

But this method still leaves a 3-4 second delay before the image becomes blurred (though it won't block the loading of other UI content). If you don't want that delay either, applying a UIBlurEffect to the image view will produce a similar effect:

func applyBlurEffect(image: UIImage) {
    // No Core Image work here; the blur view overlays whatever backImage
    // is already showing, so the `image` parameter goes unused.
    self.backImage.backgroundColor = UIColor.clear
    let blurEffect = UIBlurEffect(style: .extraLight)
    let blurEffectView = UIVisualEffectView(effect: blurEffect)
    blurEffectView.frame = self.backImage.bounds
    blurEffectView.alpha = 0.5
    blurEffectView.autoresizingMask = [.flexibleWidth, .flexibleHeight] // for supporting device rotation
    self.backImage.addSubview(blurEffectView)
}

By changing the blur effect style to .light or .dark and the alpha value between 0 and 1, you can get your desired effect.

azzie

GPUImage (https://github.com/BradLarson/GPUImage) blur works much faster than the Core Image one:

extension UIImage {
    func imageWithGaussianBlur() -> UIImage? {
        let source = GPUImagePicture(image: self)
        let gaussianFilter = GPUImageGaussianBlurFilter()
        gaussianFilter.blurRadiusInPixels = 2.2
        source?.addTarget(gaussianFilter)
        // Capture the next processed frame so it can be read back as a UIImage.
        gaussianFilter.useNextFrameForImageCapture()
        source?.processImage()
        return gaussianFilter.imageFromCurrentFramebuffer()
    }
}

However, a small delay is still possible (depending on image size), so if you can't preprocess the image before the view loads, I'd suggest resizing the image first, blurring and displaying the resulting thumbnail, and then, once the original image has been processed on a background queue, replacing the thumbnail with the blurred original.
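
A sketch of that progressive approach, assuming the imageWithGaussianBlur() extension above and an image view to display into (the 4x downscale factor is an arbitrary choice):

    func showProgressiveBlur(original: UIImage, in imageView: UIImageView) {
        DispatchQueue.global(qos: .userInitiated).async {
            // 1. Downscale first: blur cost scales with the number of pixels.
            let targetSize = CGSize(width: original.size.width / 4,
                                    height: original.size.height / 4)
            UIGraphicsBeginImageContextWithOptions(targetSize, false, 0)
            original.draw(in: CGRect(origin: .zero, size: targetSize))
            let thumbnail = UIGraphicsGetImageFromCurrentImageContext()
            UIGraphicsEndImageContext()

            // 2. Blur the cheap thumbnail and show it right away.
            if let blurredThumb = thumbnail?.imageWithGaussianBlur() {
                DispatchQueue.main.async { imageView.image = blurredThumb }
            }

            // 3. Blur the full-size original and swap it in when done.
            if let blurredFull = original.imageWithGaussianBlur() {
                DispatchQueue.main.async { imageView.image = blurredFull }
            }
        }
    }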