Re: Monitor calibration best practices
- Subject: Re: Monitor calibration best practices
- From: Marco Ugolini <email@hidden>
- Date: Tue, 14 Oct 2008 01:53:39 -0700
- Thread-topic: Monitor calibration best practices
In a message dated 10/13/08 3:54 PM, Roger Breton wrote:
> Would anyone know how to go about characterizing the ensuing "loss of levels"
> at that calibrated luminance versus calibrating at the native luminance?
Hi Roger.
My understanding is that, whereas altering the display's native *color
temperature* (via a vcgt curve loaded from the monitor profile into the
graphics card's lookup table) causes a loss of discrete levels, altering
just the display's *luminance* does not.
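To make the mechanism concrete, here is a minimal sketch (assuming an 8-bit video LUT and a simple power-law correction curve, both chosen for illustration): remapping 256 input codes through a curve and re-quantizing to 8 bits merges some codes into the same output value, so fewer than 256 distinct levels survive. A backlight change rescales light output without touching the LUT, so it causes no such merging.

```python
def apply_vcgt(levels, gamma=0.9, bits=8):
    """Apply a hypothetical power-law vcgt-style curve and re-quantize.

    Rounding back to `bits` precision is where distinct input codes
    can collapse onto the same output code.
    """
    max_code = (1 << bits) - 1
    return [round(((v / max_code) ** gamma) * max_code) for v in levels]

identity = list(range(256))
corrected = apply_vcgt(identity)

print(len(set(identity)))    # 256 distinct levels before correction
print(len(set(corrected)))   # fewer distinct levels after re-quantization
```

The exact number of surviving levels depends on the curve's shape; wherever the curve's slope is below 1, neighboring input codes round to the same output code.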
As long as one preserves the display's native white point/color temperature,
there should be no loss of levels, whether the backlight's intensity is kept
at maximum or lowered.
(Though lowering the luminance might reduce the display's *gamut* perhaps? I
have not tested that.)
I may be getting some of it wrong, though. In which case, please correct me.
Best.
Marco Ugolini
_______________________________________________
Colorsync-users mailing list (email@hidden)