Re: UGRA/UDACT-conforming displays [was: i1Display x i1Pro]
- Subject: Re: UGRA/UDACT-conforming displays [was: i1Display x i1Pro]
- From: Koch Karl <email@hidden>
- Date: Tue, 31 Mar 2009 09:22:39 +0200
Hi Roger,
the only important thing is: how many distinct steps get through to the panel?
It doesn't matter where you reduce the number of steps per channel (R, G, B). As long as you apply a tonal response curve to your data, be it in your favourite imaging software, be it in a graphics card (LUT), or be it when converting an image from gamma 1.8 to L* or gamma 2.2 etc., and you do that in 8 bits per channel, you are bound to lose steps.
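Karl's point can be sketched numerically. The example below is my own illustration (the exponents are arbitrary, not anything from the thread): re-encode gamma 1.8 data as gamma 2.2 entirely in 8 bits and count how many distinct codes survive.

```python
import numpy as np

# All 256 possible 8-bit input codes
codes = np.arange(256)

# Re-encode gamma 1.8 data as gamma 2.2 while staying in 8 bits
# (the lossy case described above); the exponents are illustrative.
linear = (codes / 255.0) ** 1.8
out = np.round(linear ** (1 / 2.2) * 255).astype(int)

# Fewer than 256 distinct steps survive the requantization
print(len(np.unique(out)))
```

Because the combined curve is not the identity, rounding back to 8 bits collapses some neighbouring codes onto the same output value, which is exactly the step loss described above.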
As soon as you go up in bit depth (where the hardware allows it), you have 4 times (10 bit) or even 16 times (12 bit) as many steps available for manipulation as you can output in the end. That means, as long as the gradient of your transfer function doesn't exceed 4:1 or 16:1 (to put it simply), you don't lose anything on the output side.
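A minimal sketch of that headroom argument (again my own illustration, same illustrative curve as before): compute the curve at 10-bit precision and quantize to the 8-bit panel only at the end. Since this curve's slope stays well under 4:1 over almost the whole range, every 8-bit output step survives.

```python
import numpy as np

# Same tonal-response change, but computed at 10-bit precision
codes10 = np.arange(1024)
out10 = np.round((codes10 / 1023.0) ** (1.8 / 2.2) * 1023).astype(int)

# Quantize down to the 8-bit panel only at the very end
panel = out10 >> 2
print(len(np.unique(panel)))
```

With four internal codes per output code, collisions introduced by the curve still leave at least one 10-bit value in every 8-bit bucket, so no panel step goes missing.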
And that's exactly what so-called hardware calibration is about: more bits for the manipulation of a transfer curve, hence less banding.
You can also avoid banding by not altering the tonal response curve, i.e. by not manipulating your data at all.
If we look at it from the perspective of the human observer, we have an L* characteristic here. So it makes sense to calibrate your monitor to L* (and, if you take ambient light effects into account, a CIECAM model is even better). Consequently, your (8-bit) data should bear an L* gradation as well. That's why we designed LStar-RGB (now eciRGB_v2). This allows for a "banding-free" workflow from RAW (with all necessary manipulations in 16 bit) directly to L* (eciRGB) to the L* (CIECAM)-calibrated monitor.
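For reference, the L* curve in question is the standard CIE 1976 lightness function, which maps relative luminance to a perceptually roughly uniform scale (the helper name below is mine):

```python
def y_to_lstar(y):
    """CIE 1976 lightness L* from relative luminance y in [0, 1]."""
    d = 6.0 / 29.0
    # Cube root above the dark threshold, linear segment below it
    f = y ** (1.0 / 3.0) if y > d ** 3 else y / (3.0 * d * d) + 4.0 / 29.0
    return 116.0 * f - 16.0

print(y_to_lstar(0.18))  # mid gray (18% luminance) sits near L* = 50
```

An L*-encoded working space spends its codes evenly in perceptual terms, which is the basis of the "banding-free" claim above.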
Best,
Karl
On 31.03.2009, at 02:53, Roger wrote:

Karl,
Do you mean to say that, depending on the presence of a hardware LUT in the monitor, the choice of one gamma (transfer function) over another may or may not have an impact on banding?
Suppose, for example, that I have a monitor with an internal 10-bit LUT, and I choose to calibrate with the "antique" (your words) L*. Would my choice of RGB working space, such as AdobeRGB (2.2), eciRGB (L*) or ColorMatchRGB (1.8), then not matter with respect to loss of levels?
Thank you in advance for taking a few minutes of your precious time to briefly explain,
MfG / Roger
This depends solely on the characteristics of the panel. Unfortunately, a lot of LCD monitor manufacturers believe that they should force the behavior of a flat screen to mimic the characteristic of a CRT. Thus they impose a gamma behavior in the controller instead of the "naturally" linear behavior of the panel. If the LUT is deformed twice, this can indeed result in banding.
A hardware calibratable monitor is far more tolerant.
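The double deformation described here can be sketched the same way (illustrative exponents, not measured monitor curves): bend the curve once in an 8-bit video-card LUT, then bend it back in an 8-bit panel controller, requantizing at each stage.

```python
import numpy as np

def apply_8bit(values, exponent):
    """Apply a power curve and requantize to 8 bits."""
    return np.round((values / 255.0) ** exponent * 255).astype(int)

codes = np.arange(256)
once = apply_8bit(codes, 1 / 1.2)   # video-card LUT bends the curve
twice = apply_8bit(once, 1.2)       # panel controller bends it back

# Each 8-bit requantization can only lose steps, never recover them
print(len(np.unique(once)), len(np.unique(twice)))
```

Even though the second curve is the mathematical inverse of the first, the intermediate 8-bit rounding means the round trip cannot restore the levels lost on the way in.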
Best regards,
Karl Koch
On 30.03.2009, at 16:17, C D Tobie wrote:
On Mar 30, 2009, at 10:09 AM, Todd Shirley wrote:
What exactly does it mean to "lose 20% of your levels"? Smaller
gamut? Banding? What are the real-world implications of "losing
levels"?
Loss of levels on screen, which may be visible as banding or lack of detail on screen. This does not affect the image or the print, except insofar as it may affect your ability to make edits.
C. David Tobie
Global Product Technology Manager
Digital Imaging & Home Theater
email@hidden
Datacolor
www.datacolor.com/Spyder3