Convert SwiftUI View to UIImage on iOS 14+


I can convert any SwiftUI View to a high resolution UIImage, using the code below. It works great... until... I try to use an image size larger than CGSize(width: 2730, height: 2730).

If I increase the image size to CGSize(width: 2731, height: 2731) or larger, the line:

self.drawHierarchy(in: self.layer.bounds, afterScreenUpdates: true)

in "extension UIView" can no longer draw the UIImage — the renderer returns a blank result.

Any idea on why there is a size limitation?

One note: I can overcome the size limitation by uncommenting the 2 lines in "extension View" and replacing:

self.drawHierarchy(in: self.layer.bounds, afterScreenUpdates: true)

With:

layer.render(in: context.cgContext)

in "extension UIView"... but then "layer.render" will not render effects such as blur, SceneKit subviews, or Metal content. So using "self.drawHierarchy" is a must.

// 1: Set breakpoint on line: print("done") to inspect the high res image
// 2: Run, then tap the image on screen to inspect the highresImage
// 3: repeat after changing the size to CGSize = CGSize(width: 2731, height: 2731)

import SwiftUI

struct ContentView: View {
    @State var blurRadius: CGFloat = 4.0
    let imageSize: CGSize = CGSize(width: 2730, height: 2730)

    var body: some View {
        testView
            .frame(width: 300, height: 300)
            .onTapGesture {
                // Adjust blur radius based on high res image scale
                blurRadius *= imageSize.width * 0.5/300

                // Capture high res image of swiftUI view
                let highresImage = testView.asImage(size: imageSize)
                // set breakpoint here to inspect the high res image size, quality, etc.
                print("done")
                
                // reset blur radius back to 4
                blurRadius = 4
            }
    }

    var testView: some View {
        ZStack {
            Color.blue
            Circle()
                .fill(Color.red)
        }
        .blur(radius: blurRadius)
    }
}

extension UIView {
    func asImage() -> UIImage {
        let format = UIGraphicsImageRendererFormat()
        format.scale = 1
        return UIGraphicsImageRenderer(size: self.layer.frame.size, format: format).image { context in
            self.drawHierarchy(in: self.layer.bounds, afterScreenUpdates: true)
            //layer.render(in: context.cgContext)

        }
    }
}


extension View {
    func asImage(size: CGSize) -> UIImage {
        let controller = UIHostingController(rootView: self)
        controller.view.bounds = CGRect(origin: .zero, size: size)
        //UIApplication.shared.windows.first!.rootViewController?.view.addSubview(controller.view)
        let image = controller.view.asImage()
        //controller.view.removeFromSuperview()
        return image
    }
}

1 Answer

Answered by milan.elsen:

I can't figure out why, but it works for me after changing the y-coordinate of the view's origin to anything non-zero. This may be a bug in UIHostingController.

If you use a very small offset, you can't see the difference, e.g.:

controller.view.bounds = CGRect(origin: CGPoint(x: 0, y: 0.0001), size: size)
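Putting the workaround together with the original code, only the bounds line in the question's "extension View" needs to change. This is a sketch under the same assumptions as the question (iOS 14+, the `asImage()` UIView extension from above unchanged); the 0.0001 value is just any non-zero offset small enough to be invisible:

```swift
import SwiftUI

extension View {
    /// Renders the view into a UIImage at the given size.
    /// Workaround: a tiny non-zero y origin avoids the blank output that
    /// drawHierarchy(in:afterScreenUpdates:) produced for sizes above
    /// roughly 2730x2730 points in the question.
    func asImage(size: CGSize) -> UIImage {
        let controller = UIHostingController(rootView: self)
        // Non-zero y origin; visually indistinguishable from .zero.
        controller.view.bounds = CGRect(origin: CGPoint(x: 0, y: 0.0001),
                                        size: size)
        let format = UIGraphicsImageRendererFormat()
        format.scale = 1
        return UIGraphicsImageRenderer(size: size, format: format).image { _ in
            // drawHierarchy (unlike layer.render) captures composited
            // effects such as blur, SceneKit subviews, and Metal content.
            controller.view.drawHierarchy(in: controller.view.bounds,
                                          afterScreenUpdates: true)
        }
    }
}
```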