Create pixel buffer in lab colour space


I'm implementing a colour selection tool similar to Photoshop's magic wand tool in iOS.

I already have it working in RGB, but to make it more accurate I want it to work in the CIELAB colour space.

The way it currently works: it takes a UIImage and creates a CGImage version of that image. It then creates a CGContext in an RGB colour space, draws the CGImage into that context, takes the context data, and binds it to a pixel buffer that uses a struct RGBA32.

let colorSpace       = CGColorSpaceCreateDeviceRGB()
let width            = inputCGImage.width
let height           = inputCGImage.height
let bytesPerPixel    = 4
let bitsPerComponent = 8
let bytesPerRow      = bytesPerPixel * width
let bitmapInfo       = RGBA32.bitmapInfo

guard let context = CGContext(data: nil, width: width, height: height, bitsPerComponent: bitsPerComponent, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo: bitmapInfo) else {
    print("unable to create context")
    return nil
}
context.draw(inputCGImage, in: CGRect(x: 0, y: 0, width: width, height: height))

guard let buffer = context.data else {
    print("unable to get context data")
    return nil
}

let pixelBuffer = buffer.bindMemory(to: RGBA32.self, capacity: width * height)


struct RGBA32: Equatable {

    var color: UInt32

    var redComponent: UInt8 {
        return UInt8((color >> 24) & 255)
    }

    var greenComponent: UInt8 {
        return UInt8((color >> 16) & 255)
    }

    var blueComponent: UInt8 {
        return UInt8((color >> 8) & 255)
    }

    var alphaComponent: UInt8 {
        return UInt8((color >> 0) & 255)
    }


    init(red: UInt8, green: UInt8, blue: UInt8, alpha: UInt8) {
        color = (UInt32(red) << 24) | (UInt32(green) << 16) | (UInt32(blue) << 8) | (UInt32(alpha) << 0)

    }
    static let clear   = RGBA32(red: 0,   green: 0,   blue: 0,   alpha: 0)

    static let bitmapInfo = CGImageAlphaInfo.premultipliedLast.rawValue | CGBitmapInfo.byteOrder32Little.rawValue

    static func ==(lhs: RGBA32, rhs: RGBA32) -> Bool {
        return lhs.color == rhs.color
    }
}

It then uses that pixel buffer to quickly compare the colour values to the selected pixel, using a very simple Euclidean distance as detailed here.

https://en.wikipedia.org/wiki/Color_difference
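For illustration, the Euclidean comparison I'm doing is essentially this (a sketch using the RGBA32 struct above; the squared-tolerance trick just avoids a sqrt per pixel):

```swift
// Sketch: squared Euclidean distance between two RGBA32 pixels,
// compared against a squared tolerance so no sqrt is needed.
func isSimilar(_ p1: RGBA32, _ p2: RGBA32, tolerance: Int) -> Bool {
    let dr = Int(p1.redComponent)   - Int(p2.redComponent)
    let dg = Int(p1.greenComponent) - Int(p2.greenComponent)
    let db = Int(p1.blueComponent)  - Int(p2.blueComponent)
    return dr * dr + dg * dg + db * db <= tolerance * tolerance
}
```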

As I said, it works, but for more accurate results I want it to work in the CIELAB colour space.

Initially I tried converting each pixel to LAB as it was checked, then used the CIE94 comparison detailed in the colour difference link above. It worked but was very slow, I guess because it had to convert a million (or so) pixels to LAB before checking.
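The per-pixel conversion was along these lines (a standard sRGB → XYZ → CIELAB conversion with a D65 white point; a sketch, not my exact code):

```swift
import Foundation
import CoreGraphics

// Sketch: convert 8-bit sRGB components to CIELAB (D65 white point).
func rgbToLab(r: UInt8, g: UInt8, b: UInt8) -> (l: CGFloat, a: CGFloat, b: CGFloat) {
    // 1. Normalise to 0...1 and undo the sRGB gamma curve.
    func linearise(_ c: CGFloat) -> CGFloat {
        return c > 0.04045 ? pow((c + 0.055) / 1.055, 2.4) : c / 12.92
    }
    let rl = linearise(CGFloat(r) / 255)
    let gl = linearise(CGFloat(g) / 255)
    let bl = linearise(CGFloat(b) / 255)

    // 2. Linear RGB -> XYZ (sRGB matrix), normalised by the D65 white point.
    let x = (0.4124 * rl + 0.3576 * gl + 0.1805 * bl) / 0.95047
    let y = (0.2126 * rl + 0.7152 * gl + 0.0722 * bl) / 1.0
    let z = (0.0193 * rl + 0.1192 * gl + 0.9505 * bl) / 1.08883

    // 3. XYZ -> LAB.
    func f(_ t: CGFloat) -> CGFloat {
        return t > 0.008856 ? pow(t, 1.0 / 3.0) : 7.787 * t + 16.0 / 116.0
    }
    let fx = f(x), fy = f(y), fz = f(z)
    return (l: 116 * fy - 16, a: 500 * (fx - fy), b: 200 * (fy - fz))
}
```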

It then struck me that, to make it fast, it would be better to store the pixel buffer in LAB colour space in the first place (it isn't used for anything else).

So I created a similar struct, LABA32:

struct LABA32: Equatable {

    var colour: UInt32

    var lComponent: UInt8 {
        return UInt8((colour >> 24) & 255)
    }
    var aComponent: UInt8 {
        return UInt8((colour >> 16) & 255)
    }
    var bComponent: UInt8 {
        return UInt8((colour >> 8) & 255)
    }
    var alphaComponent: UInt8 {
        return UInt8((colour >> 0) & 255)
    }

    init(lComponent: UInt8, aComponent: UInt8, bComponent: UInt8, alphaComponent: UInt8) {
        colour = (UInt32(lComponent) << 24) | (UInt32(aComponent) << 16) | (UInt32(bComponent) << 8) | (UInt32(alphaComponent) << 0)
    }

    static let clear = LABA32(lComponent: 0, aComponent: 0, bComponent: 0, alphaComponent: 0)
    static let bitmapInfo = CGImageAlphaInfo.premultipliedLast.rawValue | CGBitmapInfo.byteOrder32Little.rawValue

    static func ==(lhs: LABA32, rhs: LABA32) -> Bool {
        return lhs.colour == rhs.colour
    }
}

I might be wrong, but in theory, if I draw the CGImage into a context with a LAB colour space instead of device RGB, it should map the data to this new struct.

The problem I'm having is actually creating the colour space (let alone testing whether this theory will actually work).

To create a LAB colour space I am trying to use this initialiser:

 CGColorSpace(labWhitePoint: <UnsafePointer<CGFloat>!>, blackPoint: <UnsafePointer<CGFloat>!>, range: <UnsafePointer<CGFloat>!>)

According to Apple's documentation:

- whitePoint: An array of 3 numbers that specify the tristimulus value, in the CIE 1931 XYZ-space, of the diffuse white point.
- blackPoint: An array of 3 numbers that specify the tristimulus value, in CIE 1931 XYZ-space, of the diffuse black point.
- range: An array of 4 numbers that specify the range of valid values for the a* and b* components of the color space. The a* component represents values running from green to red, and the b* component represents values running from blue to yellow.

So I've created three arrays of CGFloat:

var whitePoint: [CGFloat] = [0.95947, 1, 1.08883]
var blackPoint: [CGFloat] = [0, 0, 0]
var range: [CGFloat] = [-127, 127, -127, 127]

I then try to construct the colour space:

let colorSpace = CGColorSpace(labWhitePoint: &whitePoint, blackPoint: &blackPoint, range: &range)

The problem is that I keep getting the error "unsupported color space", so I must be doing something completely wrong. I've spent a lot of time looking for others trying to construct a LAB colour space, but there doesn't seem to be anything relevant, even when searching for Objective-C versions.

So how do I actually create a LAB colour space correctly?

Thanks.

1 Answer

Answered by Gilles:

The documentation also says:

Important: iOS does not support device-independent or generic color spaces. iOS applications must use device color spaces instead.

So if you want to work in LAB, I guess you have to do the transformation manually.
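If you do convert manually, one sketch of the storage side: convert the RGB buffer to LAB once up front, then quantise into the 8-bit LABA32 layout from the question. The mapping below (L in [0, 100] scaled to 0...255, a* and b* offset by 128) is an illustrative choice, not an API, and it loses precision, so comparisons should tolerate rounding error:

```swift
import CoreGraphics

// Sketch: quantise floating-point LAB values into the question's
// 8-bit LABA32 layout. Clamping guards against out-of-range inputs.
func packLab(l: CGFloat, a: CGFloat, b: CGFloat, alpha: UInt8) -> LABA32 {
    let lByte = UInt8(max(0, min(255, l * 255 / 100)))
    let aByte = UInt8(max(0, min(255, a + 128)))
    let bByte = UInt8(max(0, min(255, b + 128)))
    return LABA32(lComponent: lByte, aComponent: aByte, bComponent: bByte, alphaComponent: alpha)
}

// Inverse mapping, for unpacking before a CIE94 comparison.
func unpackLab(_ pixel: LABA32) -> (l: CGFloat, a: CGFloat, b: CGFloat) {
    return (l: CGFloat(pixel.lComponent) * 100 / 255,
            a: CGFloat(pixel.aComponent) - 128,
            b: CGFloat(pixel.bComponent) - 128)
}
```

Converting the million pixels once (or memoising the sRGB → LAB conversion in a lookup table, since there are only 2^24 possible inputs) should be far cheaper than converting on every comparison.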