I have some code which renders RGB images from a physical simulation. Those images have a linear intensity scale, so they must be gamma-corrected before display on a normal PC monitor. It's easy enough for my application to apply the necessary power law at some point in its display pipeline (generally I use an exponent somewhere from 1.6 to 2.2 on a fairly ad hoc basis; whatever I think looks best).
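
For concreteness, the gamma step I apply looks roughly like the sketch below (a minimal NumPy version; the function name, the clipping step, and the 8-bit quantisation are illustrative, not my actual pipeline):

    import numpy as np

    def gamma_encode(linear_rgb, display_gamma=2.2):
        """Map linear-intensity RGB (floats in [0, 1]) to display values.

        Applies the inverse of the monitor's assumed power-law response:
        out = in ** (1 / display_gamma).
        """
        clipped = np.clip(linear_rgb, 0.0, 1.0)   # guard against out-of-range simulation values
        return clipped ** (1.0 / display_gamma)

    # Example: quantise for an 8-bit framebuffer.
    linear = np.random.rand(256, 256, 3)          # stand-in for simulation output
    display = np.round(gamma_encode(linear) * 255).astype(np.uint8)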

Now it is likely that in the future the application may be run by users with DICOM-calibrated displays. It's entirely unclear to me how these differ from a normal PC monitor (other than being "more accurate" in some way). Is there a particular gamma value that should be used, or is some completely different response function needed, in order to reproduce the original linear-intensity image reasonably accurately on such a display?


There are 2 answers

Matt (BEST ANSWER)

The definitive reference on the topic is here.

Andreas Brinck

Looking at this document:

http://www.docstoc.com/docs/6460598/White-Paper-DICOM-Display-calibration

It seems that at least some brands of display are calibrated (via an internal LUT) to an effective overall gamma of 1.
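
If that's right, the practical consequence for your pipeline would be something like the sketch below (Python/NumPy; treating display_gamma=None as "already-linearised display" is a convention I'm inventing here, not something from the linked document):

    import numpy as np

    def encode_for_display(linear_rgb, display_gamma=None):
        """display_gamma=None means the display is already linearised (gamma 1)."""
        clipped = np.clip(linear_rgb, 0.0, 1.0)
        if display_gamma is None:
            # Linearised display: the monitor's LUT has already flattened the
            # response, so the linear simulation data passes through unchanged.
            return clipped
        # Conventional monitor: apply the inverse power law as usual.
        return clipped ** (1.0 / display_gamma)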