Re: Monitor calibration best practices
- Subject: Re: Monitor calibration best practices
- From: cdtobie <email@hidden>
- Date: Tue, 14 Oct 2008 09:07:04 -0400
On Oct 14, 2008, at 4:53:39 AM, "Marco Ugolini"
<email@hidden> wrote:

> My understanding is that, whereas altering the display's native *color
> temperature* (via a vcgt curve applied to the computer's graphics card
> by the monitor profile) causes a loss of discrete levels, altering just
> the display's *luminance* does not.
That depends on how you lower it. If you lower it with the backlight
control, the digital levels are not affected. If you lower it in the
video LUTs, then unless there is a high-bit video stream from that
point onward, you have reduced the number of levels accordingly.
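A quick sketch of the LUT case (my own illustration, not from the thread; the 20% reduction factor is arbitrary): scaling an 8-bit LUT's output to lower luminance forces multiple input code values onto the same output value, so distinct levels are lost.

```python
# Sketch: lowering luminance by ~20% inside an 8-bit video LUT.
# Several input code values round to the same output, so levels collapse.
levels = range(256)                     # 8-bit input code values
lut = [round(v * 0.8) for v in levels]  # luminance cut applied in the LUT
unique = len(set(lut))                  # distinct output levels remaining
print(unique)                           # 205 of the original 256
```

With a high-bit path after the LUT (10 bits or more), the scaled values need not be quantized back to 8 bits, which is why the level loss is avoided in that case.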
> As long as one preserves the display's native white point/color
> temperature, there should be no loss of levels, whether the backlight's
> intensity is kept at maximum or lowered.
>
> (Though lowering the luminance might perhaps reduce the display's
> *gamut*? I have not tested that.)
In theory, luminance is capital Y (in CIE xyY), and the color gamut is
defined in lowercase x/y chromaticity, so lowering the backlight won't
change the gamut. Whether there are practical limits to that is another
question, but within a reasonable range, color gamut is separate from
luminance.
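The x/y vs. Y separation can be shown numerically (my own sketch; the D65-like tristimulus values are just an example). Since x = X/(X+Y+Z) and y = Y/(X+Y+Z), a uniform scale of all three tristimulus values cancels out of the chromaticity ratios:

```python
# Sketch: halving the backlight scales X, Y, Z uniformly;
# chromaticity (x, y) is unchanged, only luminance Y drops.
def xyY(X, Y, Z):
    s = X + Y + Z
    return (X / s, Y / s, Y)  # (x, y, Y)

white = (95.047, 100.0, 108.883)          # example D65-like white point
full = xyY(*white)
dimmed = xyY(*(0.5 * c for c in white))   # backlight at half intensity

print(full[:2] == dimmed[:2])             # True: chromaticity unchanged
print(dimmed[2] == 0.5 * full[2])         # True: luminance halved
```

In practice, LCD black-level leakage and backlight spectral shifts at very low drive levels can perturb the measured primaries, which is the kind of practical limit mentioned above.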
--
C. David Tobie
W.W. Product Technology Manager
Digital Imaging & Home Theater
Datacolor
email@hidden
www.datacolor.com/spyder3/
Colorsync-users mailing list (email@hidden)