Cocoa + Quartz Newbie: Getting the color of pixels in an NSImage & drawing offscreen


I'm attempting to create a custom color picker for my Cocoa (Mac, not iOS) application, and I'm having some trouble. I'm familiar enough with the usual patterns of computer graphics and drawing, but I'm new to Cocoa, and I think the issue is basically that I'm getting lost amid its multitiered graphics technologies.

What I'm trying to implement is a color picker that's docked into a larger sidebar view, so I thought it would be best to draw it myself rather than using the standard floating panel. (I have good reason; warnings about diverging from Apple's HIG need not be mentioned.) In researching the best way to retrieve the color of the pixel clicked on an NSImage (in this case, colorWheel.PNG), I found that it's best to go down into Core Graphics and draw the image into my own bitmap context.

...And that's as far as I've gotten. I've successfully created the context, but I'm unsure where to go from here and whether or not this is overkill. I guess what I'm looking for is a solid explanation of the hierarchy under which these classes link together, and how I might best create the effect I'm looking for. Drawing offscreen will be useful in many parts of this application, and I'd love to understand it better.
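For reference, the offscreen bitmap context described above can be created like this (a sketch in Swift; the 256×256 size and RGBA format are assumptions, since the question doesn't show its exact setup):

```swift
import AppKit

// Offscreen RGBA bitmap context. Passing nil for `data` and 0 for
// `bytesPerRow` lets Core Graphics allocate the buffer and compute
// the row stride itself.
let width = 256
let height = 256
let context = CGContext(
    data: nil,
    width: width,
    height: height,
    bitsPerComponent: 8,
    bytesPerRow: 0,
    space: CGColorSpaceCreateDeviceRGB(),
    bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue
)
// To draw Cocoa content into it, wrap it with
// NSGraphicsContext(cgContext:flipped:) and make that current.
```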

Thanks in advance.

1 Answer

Answered by Anne:

Quick solution:

There is no need to draw it offscreen.
Simply create an NSBitmapImageRep from the NSImage,
and use colorAtX:y: to retrieve the NSColor.
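In Swift terms, those two steps might look like this (a minimal sketch; the function name `pickedColor` and the use of the image's TIFF representation to build the rep are my own choices, not from the answer):

```swift
import AppKit

// Returns the color of the pixel at `point` in `image`, or nil if the
// image can't be converted to a bitmap rep.
func pickedColor(in image: NSImage, at point: NSPoint) -> NSColor? {
    // NSBitmapImageRep(data:) parses the image's TIFF data into a bitmap
    // whose individual pixels can be queried.
    guard let tiff = image.tiffRepresentation,
          let rep = NSBitmapImageRep(data: tiff) else { return nil }
    // colorAt(x:y:) takes pixel coordinates, with the first row at the
    // top of the bitmap.
    return rep.colorAt(x: Int(point.x), y: Int(point.y))
}
```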

Of course, you only need to create the NSBitmapImageRep once.
Then look up the color every time the user clicks or moves the mouse.
In my apps this does not cause any performance problems at all.

Use an NSView subclass to display the PNG and retrieve the click coordinates.
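Putting the pieces together, a view subclass along these lines would work (a sketch; the names `ColorWheelView` and `onPick` are hypothetical, and the view-to-pixel coordinate mapping assumes the image is drawn to fill the view's bounds):

```swift
import AppKit

// Displays the wheel image and reports the color under each click.
final class ColorWheelView: NSView {
    var image: NSImage? { didSet { cachedRep = nil; needsDisplay = true } }
    var onPick: ((NSColor) -> Void)?
    private var cachedRep: NSBitmapImageRep?   // built once, reused per click

    override func draw(_ dirtyRect: NSRect) {
        image?.draw(in: bounds)
    }

    override func mouseDown(with event: NSEvent) {
        // Convert the window-relative click into this view's coordinates.
        let p = convert(event.locationInWindow, from: nil)
        if cachedRep == nil, let tiff = image?.tiffRepresentation {
            cachedRep = NSBitmapImageRep(data: tiff)
        }
        guard let rep = cachedRep, bounds.width > 0, bounds.height > 0 else { return }
        // View coordinates are bottom-left origin; bitmap rows run top-down,
        // so flip y while scaling view points to pixel coordinates.
        let x = Int(p.x * CGFloat(rep.pixelsWide) / bounds.width)
        let y = Int((bounds.height - p.y) * CGFloat(rep.pixelsHigh) / bounds.height)
        if let color = rep.colorAt(x: x, y: y) {
            onPick?(color)
        }
    }
}
```

The `didSet` on `image` clears the cached rep, so the bitmap is rebuilt only when the image actually changes, matching the "create once, look up often" advice above.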