RE: UGRA/UDACT-conforming displays [was: i1Display x i1Pro]
- Subject: RE: UGRA/UDACT-conforming displays [was: i1Display x i1Pro]
- From: Roger <email@hidden>
- Date: Tue, 31 Mar 2009 07:51:16 -0400
Thomas and Karl,
I find it hard to buy the notion that we have to marry the calibrated
transfer curve (hardware or software of whichever flavour) to our choice of
RGB working space; that is quite restrictive. And why not, by the same
logic, marry the calibrated transfer curve to a choice of CMYK output
profile; why would it have to be different: losing steps is losing
steps, regardless of color space, right?
I always thought calibration and profiling were independent, as in
device-independence.
Granted, if a monitor does NOT have internal LUTs, then we're left with
calibrating inside the video RAM's own LUT. And, at 8 bit, this could
potentially turn into some banding, depending on how severe the per-channel
curve adjustments are, I agree. But, please excuse my ignorance, I would not
have thought that this limitation *should* have turned into restrictions on
the choice of gamma-encoded or L*-encoded RGB working space.
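For argument's sake, here is a quick back-of-the-envelope sketch (purely
hypothetical Python, nothing to do with any actual calibration package) of
how many levels survive a single per-channel trim written into an 8-bit
video LUT:

    # Hypothetical example: trimming the blue channel to ~90% in the 8-bit
    # video-card LUT, the sort of per-channel move a white-point correction
    # makes. Several of the 256 input codes collapse onto the same output.
    blue_lut = [round(i * 0.90) for i in range(256)]
    print(len(set(blue_lut)), "distinct blue levels out of 256")  # roughly 230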
If I may, to further clarify, I'd like to take the example of a monitor we
know is devoid of internal LUTs, the ACD 23.
So, I proceed to take one happy ACD 23 out of the box and put it on my desk.
The units I've worked on had about 6000K to 6600K native CCT. Gamma-wise, I
confess I have not checked. Luminance-wise, it's above 300 cd/m2, for sure.
So, what's a poor man to do? Leave it all native? Leave its native gamma
alone? Leave its native luminance alone?
Personally, I always reduce its brightness down to around 160 cd/m2, using
the System Preferences Monitors panel. That move alone creates banding, you
would say?
Gamma-wise, I use 2.2 or L*, depending on the application I happen to use
for calibrating. I never use 1.8, that's for sure. And so far I have only
used L* with basICcolor Display and ColorEyes Display Pro.
By choosing 2.2, I'm further creating banding, you say?
What about grayscale tracking: don't I want the calibrated chromaticities to
be the same from the white point down to the shadows? How else would you be
able to do this if not in the per-channel curves? In the monitor profile?
Isn't the profile going to create banding anyway, in the final analysis, in
the course of matching the PCS on screen? Because 8 bit is still 8 bit. I
wish I had the time to measure all this...
I love color...
Best regards / Roger
From: Koch Karl [mailto:email@hidden]
Sent: 31 March 2009 03:23
To: ColorSync Users Mailing List
Cc: Roger
Subject: Re: UGRA/UDACT-conforming displays [was: i1Display x i1Pro]
Hi Roger,
the only important thing is: how many distinct steps get through to the
panel?
It doesn't matter where you reduce the number of steps per channel (R, G, B).
As long as you apply a tonal response curve to your data (be it in your
favourite imaging software, be it in a graphics card LUT, or be it when
converting an image from gamma 1.8 to L* or gamma 2.2, etc.) and you do that
in 8 bits per channel, you are bound to lose steps.
As soon as you go up in bit depth in the LUT, you have 4 times (10 bit) or
even 16 times (12 bit) as many steps available for manipulation as you can
output in the end. That means, as long as the gradient of your transfer
function doesn't exceed 4:1 or 16:1 (to put it simply), you don't lose
anything on the output side.
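A simple way to see this (just an illustrative Python sketch of the
counting, not how any particular monitor firmware actually does it): write
the same gamma 1.8 to gamma 2.2 re-encoding into LUTs of different output
depths and count how many of the 256 incoming codes still land on distinct
output codes:

    # Illustrative only: re-encode gamma-1.8 data for gamma 2.2 in a
    # 256-entry LUT and count distinct output codes at various output
    # depths. With 8-bit output some codes collapse; with 10 or 12 bits of
    # headroom this curve's gradient stays well inside 4:1 / 16:1, so none do.
    def distinct_levels(out_bits, in_gamma=1.8, out_gamma=2.2):
        out_max = 2 ** out_bits - 1
        lut = [round(((i / 255) ** (in_gamma / out_gamma)) * out_max)
               for i in range(256)]
        return len(set(lut))

    for bits in (8, 10, 12):
        print(bits, "bit LUT output:", distinct_levels(bits), "distinct levels")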
And that's exactly what the so-called hardware calibration is about: more
bits for the manipulation of the transfer curve means less banding.
You can avoid banding by not altering the tonal response curve and not
manipulating your data at all.
If we look at it from the perspective of the human observer, we have an L*
characteristic here. So it makes sense to calibrate your monitor to L* (and,
if you take ambient light effects into account, a CIECAM model is even
better). Consequently, your (8-bit) data should bear an L* gradation as well.
That's why we designed LStar-RGB (now eciRGB_v2).
This allows for a "banding-free" workflow from RAW (with all necessary
manipulations in 16 bit) directly to L* (eciRGB) to the L*
(CIECAM)-calibrated monitor.
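For reference, the L* gradation is simply the CIE 1976 lightness function
used as a transfer curve; a minimal sketch of the encoding and decoding (my
own shorthand in Python, not text from the eciRGB_v2 specification) looks
like this:

    # CIE 1976 L* as a transfer curve: relative luminance Y (0..1, white at
    # 1) to a normalised 0..1 code value, and back. This is the gradation an
    # L*-calibrated monitor and an L*-encoded working space share.
    def lstar_encode(Y):
        L = 116.0 * Y ** (1.0 / 3.0) - 16.0 if Y > 0.008856 else 903.3 * Y
        return L / 100.0

    def lstar_decode(code):
        L = code * 100.0
        return ((L + 16.0) / 116.0) ** 3 if L > 8.0 else L / 903.3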
Best,
Karl
On 31.03.2009, at 02:53, Roger wrote:
Karl,
Do you mean to say that, depending on the presence of a hardware LUT in the
monitor, the choice of one gamma (transfer function) over another may or may
not have an impact on banding?
Suppose, for example, that I have a monitor with an internal 10-bit LUT, and
I choose to calibrate with the "antique" (your words) L*; then my choice of
RGB working space, such as AdobeRGB (2.2), eciRGB (L*) or ColorMatchRGB
(1.8), would not matter with respect to loss of levels?
Thank you in advance for taking a few minutes of your precious time to
briefly explain,
Best regards / Roger
This depends solely on the characteristics of the panel. Unfortunately,
a lot of LCD monitor manufacturers believe that they should force the
behavior of a flat screen to mimic the characteristic of a CRT. Thus
they impose a gamma behavior in the controller instead of the
"naturally" linear behavior of the panel. If the LUT is deformed
twice, this could indeed result in banding.
A hardware calibratable monitor is far more tolerant.
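To make the "deformed twice" point concrete, here is a toy sketch (purely
illustrative Python, with assumed numbers and no relation to any vendor's
firmware): a natively linear panel forced to gamma 2.2 in an 8-bit
controller LUT, with the graphics-card LUT bending the signal a second time
towards an L* response; distinct levels are counted after each 8-bit
quantisation:

    # Toy model: two 8-bit stages in series. The graphics card pre-distorts
    # the desired L* response for the controller's gamma-2.2 behaviour; the
    # controller then applies its own gamma-2.2 LUT. Each stage quantises to
    # 8 bits, so distinct levels are lost twice.
    def quantise(values, bits=8):
        m = 2 ** bits - 1
        return [round(v * m) / m for v in values]

    def lstar(y):
        return (116 * y ** (1 / 3) - 16) / 100 if y > 0.008856 else 903.3 * y / 100

    codes = [i / 255 for i in range(256)]
    card_out = quantise([lstar(c) ** (1 / 2.2) for c in codes])  # video card LUT
    panel_in = quantise([v ** 2.2 for v in card_out])            # monitor controller
    print(len(set(card_out)), "distinct levels after the graphics card")
    print(len(set(panel_in)), "distinct levels reaching the panel")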
Best regards,
Karl Koch
On 30.03.2009, at 16:17, C D Tobie wrote:
On Mar 30, 2009, at 10:09 AM, Todd Shirley wrote:
What exactly does it mean to "lose 20% of your levels"? Smaller
gamut? Banding? What are the real-world implications of "losing
levels"?
Loss of levels on screen, which may be visible as banding or lack of
detail on screen. This does not affect the image or the print,
except as it may affect your ability to make edits.
C. David Tobie
Global Product Technology Manager
Digital Imaging & Home Theater
email@hidden
Datacolor
www.datacolor.com/Spyder3