Re: Monitor calibration best practices
- Subject: Re: Monitor calibration best practices
- From: Steve Upton <email@hidden>
- Date: Mon, 13 Oct 2008 16:17:10 -0700
At 6:54 PM -0400 10/13/08, Roger Breton wrote:
>Dear fellow ColorSyncers,
>
>Suppose I bring the calibrated luminance of an Apple Cinema Display down from its native 350 cd/m2 to 120 cd/m2, through, say, the "brightness" knob on the side of the display.
>
>Would anyone know how to go about characterizing the ensuing "loss of levels" at that calibrated luminance versus calibrating at the native luminance?
>
>The argument always goes, on this list, that one should never change the native luminance of any LCD display that lacks built-in LUTs (unlike the Eizos, the higher-end NECs, or the HP DreamColor), in order to preserve the maximum number of discrete calibrated luminance levels.
>
>Would there be a way to unequivocally quantify this effect at all? If so, what sort of procedure would one go through in order to prove this to oneself?
>
As in many things color, consult Bruce Lindbloom's great site;
he has worried about this as well:
<http://www.brucelindbloom.com/LevelsCalculator.html>
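If you want a quick back-of-the-envelope version of what his Levels
Calculator does, here is a minimal Python sketch. It assumes the dimming
is done entirely in an 8-bit video-card LUT (not the backlight) on a
simple 2.2-gamma display; both are illustrative assumptions, not a
description of the Cinema Display itself.

    # Count how many of 256 input levels survive when an 8-bit
    # video-card LUT is used to pull white from 350 down to 120 cd/m2.
    # Assumptions (illustrative only): pure 2.2-gamma display, dimming
    # done entirely in the 8-bit LUT, no dithering.

    GAMMA = 2.2
    scale = (120.0 / 350.0) ** (1.0 / GAMMA)  # signal scale for the drop

    lut = [round(255 * (i / 255.0) * scale) for i in range(256)]
    print(len(set(lut)))  # -> 158, i.e. roughly 100 of 256 levels lost

A backlight adjustment, by contrast, never touches the LUT, which is why
the two ways of reaching 120 cd/m2 differ in how many levels they cost.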
Regards,
Steve
________________________________________________________________________
o Steve Upton CHROMiX www.chromix.com
o (hueman) 866.CHROMiX
o email@hidden 206.985.6837
________________________________________________________________________