Are gamma discrepancies relevant? [was: UGRA/UDACT-conforming displays]
- Subject: Are gamma discrepancies relevant? [was: UGRA/UDACT-conforming displays]
- From: Marco Ugolini <email@hidden>
- Date: Thu, 02 Apr 2009 16:03:22 -0700
- Thread-topic: Are gamma discrepancies relevant? [was: UGRA/UDACT-conforming displays]
In a message dated 4/1/09 4:56 AM, Thomas Holm/pixl wrote:
> If you have a high-bit display, why not put it to an empirical test (I
> did on 2 different monitors).
> Make 3 profiles, one at L*, one at gamma 2.2 and one at gamma 1.8; leave
> luminosity and whitepoint the same for all profiles. The only change
> between these profiles is, well, the profiles, and the gradation made
> in the high-bit LUT in the display. This will allow you to change
> profiles without affecting the VCGT.
>
> Open your editor of choice, open color settings, turn dither off.
> Make an RGB document; ideally the width should be a multiple of 256,
> but not exceed your monitor's horizontal resolution
> (say 1536 px which will give you 6 pixels of each level), and make two
> duplicates. Assign Adobe RGB, Apple RGB and your L* space of choice.
> Then make a black to white gradation in each document - be very picky
> about placement of the endpoints and angle!
> Position all three documents so they are all visible, and load the
> varying monitor profiles and see if it makes a difference.
> My experience tells me that you will see varying degrees of banding,
> especially in the two windows whose working-space tone curve/gamma
> differs from your monitor calibration.
Sorry, but I don't agree that this test actually provides incontrovertible
evidence of the theory's validity.
(Disclaimer: my monitor does not have a built-in LUT. Even so, if the theory
were valid, the effects should still be visible, at least to a significant
degree.)
To introduce a measure of objectivity into the discussion: if the test were
to prove the point, the file encoded with the same gamma as the TRC of the
active monitor profile would *always* appear to be the most banding-free of
the three. Put another way, whichever profile is active, the image that
appears most free of banding would be the one encoded with the same gamma as
that profile's TRC, and much of the banding in that file would be noticeably
reduced when the matching profile is activated. That would constitute strong
empirical evidence.
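To put a rough number on what a mismatch would be expected to do in a purely
8-bit pipeline, here is a back-of-the-envelope sketch of my own (it assumes
plain power-law gammas, not real profile TRCs): it re-encodes an 8-bit ramp
from gamma 2.2 to gamma 1.8 and counts how many distinct output levels
survive. Every level lost means two neighboring input levels now share a
value, which is exactly what shows up as banding.

import numpy as np

def reencode_8bit(src_gamma, dst_gamma):
    # Decode 8-bit codes through the source gamma, re-encode through the
    # destination gamma, and round back to 8 bits.
    x = np.arange(256) / 255.0
    linear = x ** src_gamma
    return np.round((linear ** (1.0 / dst_gamma)) * 255).astype(int)

print(len(np.unique(reencode_8bit(2.2, 1.8))), "unique levels after 2.2 -> 1.8")
print(len(np.unique(reencode_8bit(2.2, 2.2))), "unique levels when gammas match")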
But a test on my display shows that this is not the case. (Technical note:
other than the differences in TRC, each of my three profiles -- made with
basICColor display 4.1.2 -- is 16-bit LUT-based, has CAT02 chromatic
adaptation, uses the monitor's native white point, and is based on 140 cd/m2
luminance.)
Result: the amount of visible banding in any of the three files does *not*
change in any significant or consistently one-sided way when I switch from
one of the three profiles to another.
This leads me to conclude, at least provisionally, that whatever banding is
visible on screen in any of those three files, viewed through any of the
three display profiles, is due to each monitor profile's own built-in
limitations and to the physical limitations of the display itself in an
8-bit environment, and is not directly correlated with a mismatch between
the active monitor profile's TRC and the gamma of the working space used in
a given image file.
The results might differ when using an on-board high-bit monitor LUT coupled
with linear curves in the computer's display card (VCGT). I'd like to hear
the impressions of other users whose monitors have an on-board high-bit LUT.
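(For what it's worth, the following quick sketch, again my own and under the
simplifying assumption that the on-board LUT applies a pure power-law
correction, hints at why its bit depth should matter: the same 2.2-to-1.8
remapping, quantized at 10 bits or more, keeps essentially all 256 source
levels distinct, while at 8 bits some of them collapse.)

import numpy as np

def surviving_levels(lut_bits, src_gamma=2.2, dst_gamma=1.8):
    # Quantize the gamma remapping to a LUT of the given bit depth and count
    # how many of the 256 8-bit source levels still map to distinct entries.
    x = np.arange(256) / 255.0
    max_code = (1 << lut_bits) - 1
    out = np.round((x ** (src_gamma / dst_gamma)) * max_code).astype(int)
    return len(np.unique(out))

for bits in (8, 10, 14, 16):
    print(f"{bits}-bit LUT: {surviving_levels(bits)} of 256 levels distinct")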
Marco Ugolini