How to efficiently move image data from an XPC service to the main app

TL;DR: What is the most efficient way to decode images in an XPC service, move the decoded data to the main application, and then display it?

In several of the WWDC videos that deal with XPC services on macOS, decoding images and videos is presented as a good candidate for an XPC service. I have an application that does a lot of image decoding that I would like to move into an XPC service, but I would like a better understanding of how to efficiently move the decoded image data from the service into the main application. There are numerous APIs one can use when working with images, and it's not clear to me which is the most efficient (and performant) when used from an XPC service.

Image Reading:

  • Using Apple's Image I/O framework and CGImage
  • Using Core Image and CIImage
  • Using third-party libraries like OpenImageIO

All three of those APIs can take a file path and return "image data". Apple's APIs tend to decode lazily, only decompressing the image data when the image needs to be drawn. Third-party libraries tend to just return a buffer of decoded pixels that can be wrapped in a CGImage later.
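
As a minimal sketch of that lazy behavior (the helper name is mine, not an Apple API), the Image I/O path looks roughly like this; passing kCGImageSourceShouldCacheImmediately asks Image I/O to decompress up front rather than at first draw:

    import Foundation
    import ImageIO

    // Decode an image from disk with Image I/O. By default the returned
    // CGImage defers decompression until it is first drawn; setting
    // kCGImageSourceShouldCacheImmediately forces decoding at creation time.
    func decodeImage(at url: URL, eagerly: Bool) -> CGImage? {
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else {
            return nil
        }
        let options = [kCGImageSourceShouldCacheImmediately: eagerly] as CFDictionary
        return CGImageSourceCreateImageAtIndex(source, 0, options)
    }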

When you decode an image using Apple's APIs in an XPC service, you need a way to get the results back to the main application, and neither CGImage nor CIImage can be transported across an XPC connection natively. However, Apple does provide two APIs for efficiently transferring generic data across an XPC connection (see the transport sketch after this list):

  • IOSurface
  • dispatch_data
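
For concreteness, here is one shape the transport could take with NSXPCConnection; the protocol, service name, and reply signature below are my own assumptions, and the approach relies on the IOSurface class (macOS 10.12+) conforming to NSSecureCoding so the surface can cross the connection by reference rather than by copying pixels through the message:

    import Foundation
    import IOSurface

    // Hypothetical service protocol: the service decodes on its side and
    // replies with an IOSurface containing the decoded pixels.
    @objc protocol ImageDecoding {
        func decodeImage(at url: URL, reply: @escaping (IOSurface?) -> Void)
    }

    // Main-app side: connect to the service and request a decode.
    let connection = NSXPCConnection(serviceName: "com.example.ImageDecoder") // placeholder
    connection.remoteObjectInterface = NSXPCInterface(with: ImageDecoding.self)
    connection.resume()

    if let proxy = connection.remoteObjectProxy as? ImageDecoding {
        proxy.decodeImage(at: URL(fileURLWithPath: "/path/to/image.raw")) { surface in
            // `surface` references shared memory; the pixel data itself was
            // not serialized and copied through the XPC message.
        }
    }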

However, using either of those APIs, IOSurface or dispatch_data, requires that the XPC service fully unpack the image: a CGImage would need to be rendered into a bitmap context or data consumer, while a CIImage would need to be rendered to a destination using a CIContext. Does doing so negate any of the benefits that would otherwise have been realized had that unpacking been done within the main application?
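
To make the CGImage half of that unpacking concrete, here is a sketch of one manual approach. There is no dedicated CGImage-to-IOSurface API, so this draws into a bitmap context whose backing store is the surface's own memory; the BGRA pixel format, device RGB color space, and omitted error handling are simplifying assumptions:

    import CoreGraphics
    import IOSurface

    // Render a decoded CGImage into an IOSurface so the pixels end up in
    // memory that can be shared with the main app without another copy.
    func renderToSurface(_ image: CGImage) -> IOSurfaceRef? {
        let width = image.width
        let height = image.height
        let properties: [CFString: Any] = [
            kIOSurfaceWidth: width,
            kIOSurfaceHeight: height,
            kIOSurfaceBytesPerElement: 4,
            kIOSurfaceBytesPerRow: IOSurfaceAlignProperty(kIOSurfaceBytesPerRow,
                                                          width * 4),
            kIOSurfacePixelFormat: UInt32(0x42475241) // 'BGRA'
        ]
        guard let surface = IOSurfaceCreate(properties as CFDictionary) else {
            return nil
        }

        IOSurfaceLock(surface, [], nil)
        defer { IOSurfaceUnlock(surface, [], nil) }

        // A bitmap context whose backing store is the surface itself.
        guard let context = CGContext(
            data: IOSurfaceGetBaseAddress(surface),
            width: width,
            height: height,
            bitsPerComponent: 8,
            bytesPerRow: IOSurfaceGetBytesPerRow(surface),
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue |
                        CGBitmapInfo.byteOrder32Little.rawValue
        ) else { return nil }

        context.draw(image, in: CGRect(x: 0, y: 0, width: width, height: height))
        return surface
    }

    // With the C-level XPC API, the surface crosses the connection as an
    // xpc_object_t via IOSurfaceCreateXPCObject(); the receiving side
    // recovers it with IOSurfaceLookupFromXPCObject().

The lock/unlock pair around the CPU writes is what keeps the surface's contents coherent between the CPU and the GPU.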

(Note that the final destination of each rendered image is the contents property of a CALayer.)

Using an IOSurface, rather than a dispatch_data object, seems like the better solution, because an IOSurface can apparently be used directly as the contents of a CALayer. CIContext has an API for rendering into an IOSurface, but CGImage and CGImageDestination do not.
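
For comparison, the Core Image path is more direct, since CIContext can render straight into a surface. This sketch assumes a surface created as in the previous sketch and a long-lived CIContext owned by the service:

    import CoreImage
    import IOSurface

    // Render a CIImage into an existing IOSurface. The CIContext decides
    // whether the rendering happens on the GPU or the CPU.
    func render(_ image: CIImage, to surface: IOSurfaceRef, using context: CIContext) {
        let bounds = CGRect(x: 0, y: 0,
                            width: IOSurfaceGetWidth(surface),
                            height: IOSurfaceGetHeight(surface))
        context.render(image,
                       to: surface,
                       bounds: bounds,
                       colorSpace: CGColorSpaceCreateDeviceRGB())
    }

On the app side, the received surface could then be assigned directly to the layer's contents, which is the "apparently" supported path mentioned above.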

(There is a roundabout way of using vImage to go from a CGImage to a CVPixelBuffer and then to an IOSurface, but that seems... wrong?)

Using Core Image seems like overkill when all I want to do is display thumbnails of each image. Image I/O offers CGImageSourceCreateThumbnailAtIndex for doing exactly that, and it works great, especially on RAW files, where it will use the embedded JPEG thumbnail if one is present, whereas Core Image will decode the full RAW contents.
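
A minimal sketch of that thumbnail path (the helper is hypothetical, but the option keys are standard Image I/O):

    import Foundation
    import ImageIO

    // Build a thumbnail with Image I/O. With the ...IfAbsent option, an
    // embedded thumbnail (e.g. a RAW file's JPEG preview) is used when
    // present, and one is only generated from the full-size image when
    // it is missing.
    func thumbnail(at url: URL, maxPixelSize: Int) -> CGImage? {
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else {
            return nil
        }
        let options: [CFString: Any] = [
            kCGImageSourceCreateThumbnailFromImageIfAbsent: true,
            kCGImageSourceThumbnailMaxPixelSize: maxPixelSize,
            kCGImageSourceCreateThumbnailWithTransform: true // honor EXIF orientation
        ]
        return CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary)
    }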

And, finally, where does Metal fit into all of this? Do I gain anything by rendering to a Metal texture in the XPC service and sending that across instead?

By using an XPC service, how am I impacting Image I/O's or Core Image's ability to use the GPU efficiently when appropriate? And how do I minimize the amount of data being copied around, both between processes and between the CPU and GPU? These thumbnails are rendered in a scrolling grid view, so performance is a priority.
