Re: gamma bewilderment wrt/ Argyll’s documentation
- Subject: Re: gamma bewilderment wrt/ Argyll’s documentation
- From: Graeme Gill <email@hidden>
- Date: Thu, 31 Mar 2016 17:31:20 +1100
Uli Zappe wrote:
Why are you talking about television here? Conceptually, a computer has nothing to do
with television.
Hmm. Do you really not know anything about the development of computers and electronic
display technology ?
Quick summary :- When computers needed an electronic display, they made use of
a widely deployed and relatively cheap technology :- the CRT. As mentioned previously,
CRTs have a gamma of approximately 2.4. This is close to a perceptual weighting curve,
something of considerable importance in getting a good visual signal-to-noise ratio
in analog TV transmission, and also under limited bit depth.
For continuity/backwards compatibility and the above technical signal encoding
reasons, new display technologies are generally made to be backwards compatible
with the CRT EOTF.
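As a rough numeric sketch of the bit-depth point (my own illustration, assuming an 8-bit channel and an idealized pure 2.4 power-law EOTF):

```python
# Rough sketch of the bit-depth argument above, assuming an 8-bit
# channel and an idealized pure 2.4 power-law EOTF (my simplification).

GAMMA = 2.4          # approximate CRT gamma mentioned above
STEP = 1.0 / 255     # one 8-bit code step

def gamma_coded_step(l):
    """Linear-light size of one code step when coding with exponent 1/GAMMA."""
    v = l ** (1.0 / GAMMA)               # gamma-encode
    return (v + STEP) ** GAMMA - l       # decode the next code value

# Near black (1% luminance), gamma coding gives much finer steps than
# linear coding, so shadow detail survives the limited bit depth:
print(gamma_coded_step(0.01))   # ~0.00065, vs ~0.0039 for linear coding
print(gamma_coded_step(0.8))    # coarser than linear in the highlights
```

The codes are spent roughly in proportion to visual sensitivity, which is the signal-to-noise advantage described above.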
Apple started out with using a gamma of 1.8 which reflected printing technologies
rather than displays. This alone should make clear that television played no role
whatsoever when Apple conceived ColorSync.
1.8 meant 1.8. When Apple switched to 2.2/sRGB because it had become the de facto web
standard, they meant 2.2 just the same; you won’t find any mention anywhere in their
documents that this should be used in conjunction with a 2.4 display (and certainly not
a 2.4 CRT – those were completely dead when Apple switched to 2.2 in 2009).
I'm not sure why you are so certain you understand the basis for Apple's choice,
or what it does or does not imply.
As someone developing micro-computers at the same time Apple was developing
the Mac, and having read a few accounts of the company at that time, and
guessing that there was not any particular expertise in imaging or Color Science
available to them then, 1.8 strikes me as something of an engineering decision
made for a project that was assumed to be a closed system. A fact that supports
my view is that 1.8 is not an optimal choice of perceptual encoding (to minimize
the impact of limited bit depth), 2.2 - 2.4 being closer to the mark.
*Actually, this is the core of what I do not understand in Argyll’s documentation:*
What does it even mean in an ICC color managed system “to use gamma 2.2 with a gamma
2.4 display”? Assuming you calibrate your display to gamma 2.4 and then profile it, the
monitor profile will also sport a gamma 2.4 TRC (if it’s built correctly and without
color appearance adjustments). And when the system is using gamma 2.2, the CMM will
match colors to gamma 2.4 as soon as they are sent to the display, so there is no
difference at all vs. a display with gamma 2.2.
You are mixing up a bunch of different aspects.
In short:
1. In ICC color management, the display gamma is completely irrelevant in principle (=
assuming no limitations in bit depth), because changing it will not change the image
display at all.
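The cancellation described in point 1 can be sketched with idealized TRCs (my own illustration, ignoring bit-depth limits):

```python
# Idealized sketch of point 1 (my illustration, ignoring quantization):
# with a correct display profile, the CMM cancels the display gamma,
# so the displayed light does not depend on it.

def decode(v, g):          # TRC: encoded value -> linear light
    return v ** g

def encode(l, g):          # inverse TRC
    return l ** (1.0 / g)

def displayed_light(v_src, g_src, g_display):
    """Source value -> CMM (source + display profile) -> screen light."""
    linear = decode(v_src, g_src)        # undo the source encoding
    v_dst = encode(linear, g_display)    # apply the display's inverse TRC
    return decode(v_dst, g_display)      # the display's physical response

v = 0.5
print(displayed_light(v, 2.2, 2.4))  # identical to displayed_light(v, 2.2, 2.2)
```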
Yes. But then I'm not talking about the physical display gamma in that context, but
the target response of the display under color management. Discussing
the idealized device responses of (say) the TV system sets the stage for
understanding how to configure a color management system to comply with
the expectations and standards of image encoding and display. (That
and the fact that fully color managed applications are more the exception
than the rule, and everything else assumes sRGB.)
2. In ICC color management, it isn’t even possible, nor does it make sense, to “use a
system gamma 2.2 with a gamma 2.4 monitor”. Whatever monitor gamma you have, you’ll have
a monitor profile with the same gamma, and every differing gamma will be matched to that.
Well, of course it is. If the display profile is set up expecting sRGB encoding
and viewing conditions, and the perceptual table of the profile is set up
for the display under dim viewing conditions, then an appearance transform will
have an effect quite similar to a gamma adjustment of around 1.1 - 1.2.
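Approximating that appearance adjustment as a single end-to-end power (a common simplification, and my own here), the numbers line up with the 1.1 - 1.2 figure:

```python
# My simplification: treating the dim-surround appearance adjustment as
# a single end-to-end power. Encoding at gamma 2.2 but displaying at
# gamma 2.4 yields a mild contrast expansion of about 1.09.

encode_gamma = 2.2
display_gamma = 2.4
system_gamma = display_gamma / encode_gamma
print(round(system_gamma, 2))        # 1.09

# Mid-grey gets darker, i.e. contrast increases:
print(round(0.18 ** system_gamma, 3))
```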
I read this of course, but again, it has little to do with computers.
Course ? Book! Yes, it has a lot of relevance to computers.
Poynton comes
from the video world, which is a very different world.
Except for the fact that computer displays trace their history and standard
back to the video world.
BT.1886 is also pertinent.
Certainly not.
Certainly is, since it is a standardization of the CRT display
characteristic that underlies computer display standards.
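For reference, the BT.1886 reference EOTF is a 2.4 power law with black-level handling; a sketch from the published formula:

```python
# Sketch of the BT.1886 reference EOTF (from the ITU-R BT.1886 spec):
# a gamma-2.4 power law with black-level handling, standardizing the
# CRT characteristic referred to above.

def bt1886_eotf(v, lw=100.0, lb=0.0, gamma=2.4):
    """Map a normalized signal V in [0,1] to luminance in cd/m^2."""
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma

# With zero black level this reduces to a pure 2.4 power curve:
print(bt1886_eotf(1.0))   # 100.0 (white)
print(bt1886_eotf(0.5))   # ~18.9
```

With a non-zero black level, the `b` offset lifts the curve so that signal zero lands exactly on the display's black.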
BT.1886 is a spec for watching videos in a dim environment, using calibrated monitors
instead of ICC color management (in conjunction with a color appearance model) to
achieve desired color appearance adjustments. Computers, on the other hand, are
typically used in a bright environment,
That depends on the display. If you have a 50 cd/m^2 CRT on a computer, you
are viewing it in a dim environment, or you aren't seeing much. Of course if
you have a 350 cd/m^2 LCD display, you can use it in a much brighter environment,
and the setup for viewing standard images will be slightly different.
In the same vein, the simplified gamma TRCs of REC 709 and SMPTE 240M are 1.961 and
1.932, respectively, and certainly not gamma 2.2, as the Argyll documentation
says.
I don't say they are 2.2 though, I say that they are "approximately 2.2".
Well, if even 1.961 still counts as “approximately 2.2”, then almost any gamma curve
will be “approximately 2.2”. ;-) Gamma curves outside of the 2.2 ± 0.3 range are
probably quite rare.
CRTs are typically 2.4-2.5, which is distinct from 2.0-2.2.
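A sketch of the standard Rec.709 OETF shows where the ~1.96 figure comes from: the linear toe and the offset pull the effective exponent well below 1/0.45 ≈ 2.22 (the mid-scale measure below is my own quick check, not a formal fit):

```python
import math

# Standard Rec.709 OETF (ITU-R BT.709): a linear toe below 0.018 and a
# 0.45-power segment with an offset above it. The offset makes the best
# single-power approximation noticeably lower than 1/0.45 = 2.22.

def rec709_oetf(l):
    """Linear scene light [0,1] -> encoded signal [0,1]."""
    if l < 0.018:
        return 4.5 * l
    return 1.099 * l ** 0.45 - 0.099

def rec709_inverse(v):
    """Encoded signal -> linear light (inverse of the OETF)."""
    if v < 4.5 * 0.018:
        return v / 4.5
    return ((v + 0.099) / 1.099) ** (1 / 0.45)

# Effective pure-gamma exponent at mid-scale (my quick measure):
v = 0.5
g_eff = math.log(rec709_inverse(v)) / math.log(v)
print(round(g_eff, 2))  # about 1.95, not 2.22
```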
Why does the documentation assume that a color appearance adjustment (= contrast
expansion of gamma 1.1) is desirable by default?
Because that's how color appearance works.
? Where is the “law of color appearance” that says “contrast expansion is always
desirable”?
There has been considerable research into color appearance, much of it
embodied in CIECAM02 and the associated scientific papers. Mark Fairchild's
book is a good introduction.
As noted, a contrast expansion is desirable when adjusting for the difference
between a bright viewing environment and a dim viewing environment.
A contrast expansion is desirable if and only if the viewing environment is
dimmer than the production environment. This is typically not the case with computers.
It was very typical, but may well be less so now. Having been given an understanding
of what an appearance adjustment is and why it may be desirable, you can now figure out
how to allow for that.
Television wasn't developed with any editing. What came out of the camera was what
was broadcast.
Yep, but that’s history. And we’re talking about computers here. Why this constant
reference to a technology that has nothing to do with computers (and is outdated, and
is polluted with commercial interests (“contrast sells”))?
Those who cannot remember the past are condemned to repeat it, or at least be
bewildered by why the world they live in is arranged as it is.
You're confusing what was established by standard and practice at the time with later
developments that have to work within the established standards. What happened was
that (out of the available display technologies), CRTs were chosen as the primary
display medium. [...]
I’m well aware of the *history* of *television* color processing.
You aren't showing it.
What I don’t understand is why *computer* users should care about that *today*.
If you have some magic wand that can suddenly switch everyone's system to a new
standard overnight, I'm sure there are many companies that would like to buy it off you.
But this is history now. Today you can color manage video just as you have been able to
color manage still images for the last 20 years. So whatever historical standards
existed, you can emulate in a corresponding video profile which “virtualises” the
historic video hardware, so to speak.
Yes. So what - that's exactly what ArgyllCMS and other color management systems
achieve.
2) Determines the look of non-color managed output.
There shouldn’t be any non-color managed output anymore in 2016 ... oh well ...
I'm not sure what dream world you live in, but the vast majority of systems and
software are color managed only by the expectation that everything is sRGB.
Apple systems are amongst the best in terms of pervasive color management, but it's
still not in everything, and not in any hand-held device.
But if you agree that these are the two reasons for calibrating a display for an ICC
color managed computer, aren’t you saying yourself that appearance adjustments (i.e.
contrast adjustments via gamma settings) are *not* among the reasons for a specific
calibration?
Not for color managed applications no, but certainly for everything else.
Or does all this “gamma 2.2 vs. 2.4 vs. BT.1886” talk only refer to your
point 2), i.e. non-ICC color managed output that today shouldn’t even exist anymore?
See above.
I think the monitor profile is the only logical place to perform any output appearance
adjustments, because color appearance phenomena happen at the screen → observer stage.
Yes and no. It's a practical location to do it, but like color management
itself, there is both a source and a destination which together determine the
overall transform needed. You can try and cut it into two pieces with a common
appearance space as an intermediary (which the ICC PCS attempts to do to some degree),
so you can mix and match source and destination profiles, but this may not
always give you a good result, or be practical.
Graeme Gill.
Colorsync-users mailing list (email@hidden)