Apple recently enabled 30-bit color support on OS X. They’ve posted some sample code that shows how to enable this. However, they don’t seem to provide an example for how you can detect when your app is running on a display that supports 30-bit color.
We’d like to detect which displays support 30-bit color so we can enable it only on those displays and fall back to 24-bit color otherwise.
Does anyone know how to do that?
So far I’ve tried using the CGDisplay APIs (`CGDisplayCopyDisplayMode` and `CGDisplayModeCopyPixelEncoding`) to query the display’s pixel encoding, but these seem to always return 24-bit encodings, and `CGDisplayModeCopyPixelEncoding` was deprecated in Mac OS X 10.11. I’ve also tried NSScreen’s `depth` property, but it too reports 24 bits per pixel.
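In case it helps, here’s roughly what I’ve been doing (a minimal sketch in Objective-C, main display only; the helper name is just for illustration):

```objc
#import <AppKit/AppKit.h>
#import <CoreGraphics/CoreGraphics.h>

// Sketch of the two queries described above.
static void LogDisplayDepthInfo(void)
{
    // CGDisplay route: ask the current display mode for its pixel encoding.
    CGDirectDisplayID displayID = CGMainDisplayID();
    CGDisplayModeRef mode = CGDisplayCopyDisplayMode(displayID);
    if (mode != NULL) {
        // Deprecated since 10.11 and, on my hardware, always reports a 24-bit encoding.
        CFStringRef encoding = CGDisplayModeCopyPixelEncoding(mode);
        NSLog(@"Pixel encoding: %@", (__bridge NSString *)encoding);
        if (encoding != NULL) {
            CFRelease(encoding);
        }
        CGDisplayModeRelease(mode);
    }

    // NSScreen route: the depth property also comes back as 24 bits per pixel.
    NSWindowDepth depth = [[NSScreen mainScreen] depth];
    NSLog(@"NSScreen depth: %ld bits per pixel", (long)NSBitsPerPixelFromDepth(depth));
}
```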
The built-in System Information app is obviously able to get at this information; I just can’t figure out how it does it. Any hints?
As of macOS 10.12, Apple has new APIs that let you detect whether a display is capable of wide gamut (i.e. deep) color. There are a few ways to do this:
1. Use `NSScreen`'s `- (BOOL)canRepresentDisplayGamut:(NSDisplayGamut)displayGamut`
2. Use `CGColorSpaceIsWideGamutRGB(...)`
3. `NSWindow` also has `- (BOOL)canRepresentDisplayGamut:(NSDisplayGamut)displayGamut`.

I don't know that you're guaranteed to be on a 30-bit capable display when the display is considered "wide gamut RGB" or capable of `NSDisplayGamutP3`, but this appears to be Apple's official way of determining whether a display is capable of wide gamut color.