CIAreaHistogram + CIHistogramDisplayFilter to get Luminance ONLY Histogram

I believe this is relevant to Core Image on iOS as well as on Mac OS.

I am able to get an RGB histogram to show up using CIAreaHistogram + CIHistogramDisplayFilter in Core Image. Is there a way to get just LUMINANCE instead of the separate RGB channels?

There is 1 answer

James Bush

Here's how to generate a histogram image (iOS and Mac OS X), provided you've already created a CIImage object (ciImage):

ciImage = [CIFilter filterWithName:@"CIAreaHistogram" keysAndValues:kCIInputImageKey, ciImage, @"inputExtent", [CIVector vectorWithCGRect:ciImage.extent], @"inputScale", [NSNumber numberWithFloat:1.0], @"inputCount", [NSNumber numberWithFloat:256.0], nil].outputImage;
ciImage = [CIFilter filterWithName:@"CIHistogramDisplayFilter" keysAndValues:kCIInputImageKey, ciImage, @"inputHeight", [NSNumber numberWithFloat:100.0], @"inputHighLimit", [NSNumber numberWithFloat:1.0], @"inputLowLimit", [NSNumber numberWithFloat:0.0], nil].outputImage;

There are a hundred different solutions out there for displaying a histogram; this one is simple (merely two lines of code) and works on both iOS and Mac OS X.

On to outputting only the luminance channel of a color image and passing it to the histogram-related filters...

Do you know how to create a custom Core Image filter that returns the output of a CIKernel (or CIColorKernel) object? If not, you should; I'd be happy to provide easy-to-understand instructions for doing that.

Assuming you do, here's the kernel code (written in Core Image's GLSL-based kernel language) that returns only the luminance values of the image it processes:

vec4 rgb2hsl(vec4 color)
{
    //Compute min and max component values
    float MAX = max(color.r, max(color.g, color.b));
    float MIN = min(color.r, min(color.g, color.b));

    //Make sure MAX > MIN to avoid division by zero later
    MAX = max(MIN + 1e-6, MAX);

    //Compute lightness (the L in HSL)
    float l = (MIN + MAX) / 2.0;

    //Compute saturation
    float s = (l < 0.5 ? (MAX - MIN) / (MIN + MAX) : (MAX - MIN) / (2.0 - MAX - MIN));

    //Compute hue
    float h = (MAX == color.r ? (color.g - color.b) / (MAX - MIN) : (MAX == color.g ? 2.0 + (color.b - color.r) / (MAX - MIN) : 4.0 + (color.r - color.g) / (MAX - MIN)));
    h /= 6.0;
    h = (h < 0.0 ? 1.0 + h : h);

    return vec4(h, s, l, color.a);
}

kernel vec4 hsl(sampler image)
{
    //Get pixel from image and unpremultiply it (divide RGB by alpha) before converting to HSL
    vec4 pixel = unpremultiply(sample(image, samplerCoord(image)));

    //Convert to HSL; output only the lightness value (the third component) as a grayscale pixel
    return premultiply(vec4(vec3(rgb2hsl(pixel).b), 1.0));
}

The above kernel is based on OpenGL Shading Language code originally written by Apple developers; I modified it to output only the luminance (lightness) values.

Again: if you don't know how to at least plug kernels into a custom Core Image filter, learn how. I can show you.