Monitor calibration best practices
- Subject: Monitor calibration best practices
- From: Roger Breton <email@hidden>
- Date: Mon, 13 Oct 2008 18:54:40 -0400
Dear fellow ColorSyncers,
Suppose I bring the calibrated luminance of an Apple Cinema Display down from its native 350 cd/m2 to 120 cd/m2, through, say, the “brightness” knob on the side of the display.
Would anyone know how to go about characterizing the ensuing “loss of levels” when calibrating at that reduced luminance versus calibrating at the native luminance?
The argument on this list always goes that one should never change the native luminance on an LCD display (that is, on displays without built-in high-bit LUTs, unlike the Eizos, the higher-end NECs, or the HP DreamColor, which have them), in order to preserve the maximum number of discrete calibrated levels.
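On paper the arithmetic behind that argument seems simple enough. Here is a little Python sketch I put together (my own back-of-the-envelope, nothing authoritative); it assumes the luminance cut is made in an ordinary 8-bit video-card LUT rather than in the backlight, and that the panel's native response is roughly gamma 2.2, then counts how many distinct output codes survive the rescaling:

    # A back-of-the-envelope sketch (my own illustration, not a vendor tool):
    # simulate cutting white from 350 to 120 cd/m2 inside an 8-bit video-card
    # LUT instead of the backlight. The gamma-2.2 native response and the
    # 8-bit LUT depth are both assumptions on my part.

    NATIVE_CDM2 = 350.0   # assumed native white luminance
    TARGET_CDM2 = 120.0   # calibrated target white luminance
    GAMMA = 2.2           # assumed native tone response of the panel

    # The luminance ratio translates into a signal-space scale factor
    signal_scale = (TARGET_CDM2 / NATIVE_CDM2) ** (1.0 / GAMMA)

    # Push all 256 input codes through the scaled LUT, rounding back to 8 bits
    lut = [round(code * signal_scale) for code in range(256)]

    print(f"white lands at code {lut[255]} instead of 255")
    print(f"distinct output levels remaining: {len(set(lut))} of 256")

If I have that right, a 350-to-120 cut made purely in the video LUT leaves only about 158 of the 256 levels, whereas the same cut made in the backlight leaves all 256 intact, which I take to be the nub of the list's argument.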
But that is only arithmetic. Would there be a way to unequivocally quantify the effect on an actual display? If so, what sort of procedure would one go through to prove it to oneself?
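The best procedure I can imagine for proving it on real hardware would be to display a full 0-255 gray ramp, read every patch with the instrument, and simply count how many genuinely distinct luminance steps come back, once at native luminance and once at the 120 cd/m2 calibration. A sketch of the counting side (the readings list and the noise threshold are placeholders for whatever your instrument actually delivers):

    def count_distinct_steps(readings_cdm2, noise_floor=0.05):
        """Count how many measurably distinct luminance levels a gray ramp
        produced. readings_cdm2: per-patch luminances for codes 0..255, in
        display order; noise_floor: an assumed instrument repeatability in
        cd/m2, below which two patches are treated as the same level."""
        distinct = 0
        last = None
        for y in readings_cdm2:
            if last is None or y - last > noise_floor:
                distinct += 1
                last = y
        return distinct

    # Compare count_distinct_steps(native_ramp) against
    # count_distinct_steps(calibrated_ramp) to put a number on the loss.

Whether that counting threshold does justice to real instrument noise is exactly the sort of thing I am hoping someone here can speak to.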
Roger Breton