Monitor Gamma
- Subject: Monitor Gamma
- From: Roger Breton via colorsync-users <email@hidden>
- Date: Wed, 15 Jan 2020 20:38:03 -0500
I'm doing some exploratory experiments with regard to gamma. I know it's
far from gamut discussions. It's a much more complex subject than I
thought. I confess I got drawn into this exploration in the wake of 8-bit vs
10-bit capabilities. I just want to share some progress. In the following
Excel sheet (yes, Excel is a great color management tool), you'll find two
tabs at the bottom. The first one is labeled Step=32 and the second
Step=4:
https://1drv.ms/x/s!AkD78CVR1NBqktUgigDMsTGPE-U6zw?e=FVTqrn
I'd suggest you start with the Step=32 tab. Towards the top left, in the range
B12:B20, are the RGB values I sent to my monitor. In the range C12:C20 are
the measured Luminances (Y) of the corresponding RGB values in Visual Studio
(C#), whereas in the range D12:D20 are the measured Luminances of the
corresponding RGB values as displayed in Photoshop. Basically, I generated
the chart that sits in columns T to W in code, to test some hypotheses
about gamma, and that's when I noticed a difference between the chart viewed
in my application development environment and a PNG version of the chart
viewed in Photoshop. (I did not share that PNG, but it is the same as what I
created in Excel, in all these 9 rectangles sitting between columns T and W.)
In the process of experimenting with gamma, I wanted to see what those
rectangles would look like had they not had any gamma correction, in other
words, in terms of equal brightness increments. The theory goes (as far as
I read) that perceived brightness follows a power law. Fair enough. So
you'll see I built two charts, one which shows the measured Luminances
gathered from Visual Studio and Photoshop, and the other, the theory behind
gamma encoding and gamma expansion, as I found it called. Sorry, I did not
give that second graph a title. And there is a screen capture below it.
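For anyone who wants to reproduce the numbers I quote below: assuming a pure
2.2 power law (not the piecewise sRGB curve), the "expansion" I apply is
V' = 255 x (V/255)^(1/2.2). Here is a minimal C# sketch of that, where
GammaExpand is my own hypothetical helper name, not any library call:

    using System;

    class GammaDemo
    {
        // Gamma-"expand" an 8-bit code value: V' = 255 * (V/255)^(1/gamma).
        // Assumes a pure power law of 2.2, not the piecewise sRGB curve.
        static int GammaExpand(int v, double gamma = 2.2)
        {
            return (int)Math.Round(255.0 * Math.Pow(v / 255.0, 1.0 / gamma));
        }

        static void Main()
        {
            Console.WriteLine(GammaExpand(32)); // prints 100, the jump discussed below
        }
    }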
Now, if you turn your attention to the step wedge sitting between columns X
and AA, you'll see that it's the same 9 rectangles but filled with different
RGB values, gamma-expanded values which, in theory, ought to look equally
spaced, perceptually, in terms of brightness. At that point, it
occurred to me that, because of gamma expansion, RGB 32,32,32 becomes RGB
100,100,100. I thought, aren't we losing a whole lot of levels to this
gamma expansion? I mean, is there life between RGB 0 and RGB 100? So,
wanting to address this question, I turned my attention to creating a step
wedge with finer increments, Step=4; that's the second tab. I think it's
relatively easy to follow at this point: basically, studying the relationship
between input RGB code values and their appearances with and without gamma
expansion. On the left, between columns B and E, are straight RGB values,
from 0 to 256, in steps of 4. Between columns J and M are the same
rectangles but gamma expanded. In Excel, you can zoom way out, to
25%, to see the whole gradient.

Now, I was somewhat reassured by this study of gamma in steps of 4. But I was
still not happy with what was going on below 4. I mean, look at the relatively
large jump between RGB 4 (which becomes 39) and RGB 0. That's not the same
jump as the rest of the scale. And somehow, I got the impression that I'm
losing depth or contrast to this gamma scheme, in terms of tone range. So I
created yet another series of rectangles, in columns S to Z, where I went as
small as steps of 1, from RGB 0 to RGB 16. And even at such small increments,
with a gamma of 2.2, there is still quite a jump between RGB 0 and RGB 1,
which becomes RGB 21 in a gamma-expanded world.
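To put numbers on that near-black jump, the loop below can be dropped into the
Main of the earlier sketch (same assumptions, same hypothetical GammaExpand
helper):

    // The first step, 0 -> 21, dwarfs every later step on the scale.
    for (int v = 0; v <= 4; v++)
        Console.WriteLine($"{v} -> {GammaExpand(v)}");
    // Prints: 0 -> 0, 1 -> 21, 2 -> 28, 3 -> 34, 4 -> 39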
Call me nuts, but I still was not satisfied with the result of this
experiment. At one point, I toyed with the idea: what if I were to lower
the gamma value? Like 1.8 or 1.5 or, gasp!, 1.0? Well, with this
monitor, there is a utility called MultiProfiler that allows changing
the monitor gamma on the fly. So I was off to the races and, voilà!, gamma
of 1.0, no recalibration, nothing. I can't post how the image appeared on
my monitor here; you'd have to have your monitor set to a gamma of 1.0 as
well. But suffice to say that, all of a sudden, there were no jumps
anywhere throughout the entire scale, no blocked shadow details, perfect
1:1 tonal mapping, if I dare say. Of course, all the windows, dialogs, and
UI looked completely washed out, and there is nothing anyone can do about
that; that's just the nature of Windows, and macOS as well. There is an
underlying system gamma of 2.2, so all controls and UI are designed to
work at 2.2. What I'd like to see is a monitor profile created with a
gamma of 1.0 while the system gamma would remain at 2.2, if that was ever
possible. I realize that with this crazy idea of gamma = 1.0, I'm beginning to
sound like Timo Autiokari, for those who remember, many years ago...
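If I understand the mechanics correctly (my assumption here, not something I
have verified against MultiProfiler or any profiling package), a video-card
LUT could in principle bridge the two worlds: remap each code value so that a
natively gamma-1.0 panel still presents the 2.2 response the OS expects. A
sketch, in the same vein as above:

    // Build a 256-entry calibration LUT that makes a panel with native
    // gamma `native` respond like one with gamma `target`.
    // Assumption: pure power-law responses on both sides.
    static byte[] BuildLut(double native, double target)
    {
        var lut = new byte[256];
        for (int v = 0; v < 256; v++)
            lut[v] = (byte)Math.Round(255.0 * Math.Pow(v / 255.0, target / native));
        return lut;
    }
    // BuildLut(1.0, 2.2) would darken the washed-out UI back to normal, but
    // it funnels everything through 8 bits, which is exactly where levels
    // get thrown away.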
At this point, the question I have is: does gamma calibration matter? For
color management? I know, in my humble and perhaps naïve practice, I never
questioned the need for gamma calibration, whether 1.8 or 2.2, depending on
religion. But in terms of preserving image quality, what's the trade-off?
The majority of LCD monitors sold today emulate the CRT response in
circuitry, that's a given. My cheapo Samsung 2693HM has three choices of
gamma in its OSD, Gamma Mode 1, Gamma Mode 2, and Gamma Mode 3: no
documentation, no numbers of any kind to be found in the OSD. But, clearly,
according to my measurements, one of these modes emulates a 2.2 gamma
encoding. So that, with a 2.2 system gamma such as in Windows and on the
Mac, the whole world is not going to come crumbling down the moment the
monitor is plugged into the video card. So, gamma is an evil that can't be
avoided. Monitor manufacturers play by it, television manufacturers play by
it. You would think that, the day LCDs were invented, we would have abandoned
the CRT gamma for a linear response? But I understand the need for legacy. It
reminds me of what happened when CTP first entered prepress; everybody
was wondering the same thing: "What should we set our plate calibration
to?", now that they had the possibility of making a 50% dot in Photoshop print
as 50% on the plate. It took a while for the industry to digest that
innovation...
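(One trade-off I can already see, to hedge my own enthusiasm for gamma 1.0: in
a linear 8-bit encoding, the step from code 1 to code 2 doubles the luminance,
while the step from code 200 to 201 changes it by only half a percent, so the
codes are spent where vision needs them least; a 2.2 encoding spreads those
luminance ratios far more evenly, which is presumably why the legacy
survived.)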
Also, as I discovered, gamma is also built into digital images. So that, the
way I understand it, JPEG has a built-in default (maybe hard-coded?) gamma
value of 2.2. So, it's everywhere. It can't be turned off. Now, one of the
lingering questions I have is: how does this gamma encoding translate to ICC
profiles? Are there still these limitations in terms of usable device-level
values, or are we free to do as we want? I mean, between RGB 0 and
RGB 1, whether it's coming from an ICC profile with a gamma of 2.2, or
straight from a 2.2 gamma-calibrated monitor, isn't there the same
limitation? Which makes me want to study what happens in a 16-bit-per-channel
RGB world. My initial foray into this brave new world showed me that
the relationship between input RGB levels and measured Luminances is
*linear* -- yes, linear, at least at a monitor gamma of 2.2, in Windows.
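One way I plan to quantify that 8-bit limitation (a sketch only, no ICC
machinery involved, and the round trip is my own construction):

    using System;
    using System.Collections.Generic;

    class LevelCount
    {
        static void Main()
        {
            // Encode 0..255 with 1/2.2, decode with 2.2, rounding to 8 bits
            // both times, and count how many distinct codes survive.
            var survivors = new HashSet<int>();
            for (int v = 0; v < 256; v++)
            {
                int encoded = (int)Math.Round(255.0 * Math.Pow(v / 255.0, 1.0 / 2.2));
                int decoded = (int)Math.Round(255.0 * Math.Pow(encoded / 255.0, 2.2));
                survivors.Add(decoded);
            }
            // Fewer than 256 survive: rounding collapses neighboring codes,
            // mostly near white on encode. A 16-bit intermediate would keep
            // essentially all of them, which is why 16-bit is my next stop.
            Console.WriteLine(survivors.Count);
        }
    }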
Thanks for bearing with me this far / Roger