RE: Monitor Luminance pattern
- Subject: RE: Monitor Luminance pattern
- From: Roger Breton via colorsync-users <email@hidden>
- Date: Fri, 10 Jan 2020 09:12:39 -0500
Wayne,
I figured as much, that the values are getting "corrected" by virtue of
passing through the monitor's internal 14-bit LUTs. Two things, if I may.
First, there has to be a 16-bit to 10-bit reduction somewhere on the way to
the monitor, presumably in the video driver; does that make sense? Second, I
wonder to what extent the monitor actually "gamma corrects" the values by
expanding them to 14 bits, since the slope of the resulting 'curve' is a
straight line. I could not find any literature on the workings of these
monitor LUTs yesterday...
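Just to make my question concrete, here is a rough sketch (Python, entirely
my own assumptions about the gamma and the LUT contents, not anything I know
about the actual driver or monitor firmware) of what a 16-bit to 10-bit
reduction followed by a lookup in a 14-bit monitor LUT could look like:

    # Hypothetical model only: 16-bit source code reduced to 10 bits on the
    # way to the monitor, then mapped through a 14-bit LUT inside the monitor.
    GAMMA = 2.2  # assumed power law; the real LUT contents are unknown to me

    def to_10bit(v16):
        # simple rescale/round from a 0..65535 code to a 0..1023 code
        return round(v16 * 1023 / 65535)

    # A 1024-entry LUT holding 14-bit output codes. Here it applies a pure
    # power function; a real monitor LUT would also hold calibration data.
    lut_14bit = [round(((i / 1023) ** GAMMA) * 16383) for i in range(1024)]

    def monitor_output(v16):
        return lut_14bit[to_10bit(v16)]

    for v in (0, 64, 16384, 32768, 65535):
        print(v, "->", to_10bit(v), "->", monitor_output(v))

If something like this is what happens, the "expansion" to 14 bits buys finer
placement of the output levels, not more input codes; the 10-bit link would
remain the bottleneck.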
/ Roger
-----Original Message-----
From: Wayne Bretl <email@hidden>
Sent: Thursday, January 9, 2020 10:29 PM
To: email@hidden; 'Andrew Rodney via colorsync-users'
<email@hidden>
Subject: RE: Monitor Luminance pattern
If I had to guess without further info, it looks like the 10-bit values are
gamma corrected, and the monitor then applies the inverse gamma, making the
output luminance linear, but with larger steps at higher luminance, which is
what you want for best perceptual uniformity and least visibility of the
quantizing steps at all luminance levels.
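A quick numerical illustration of that point (toy numbers, assuming a pure
2.2 power law for the monitor's decoding, which may not match your panel):

    # Relative luminance step between adjacent 10-bit codes, assuming the
    # monitor decodes with a pure 2.2 power function (toy model only).
    GAMMA = 2.2

    def lum(code):                    # 0..1023 code -> relative luminance 0..1
        return (code / 1023) ** GAMMA

    print(lum(11) - lum(10))          # step near black: roughly 1e-5
    print(lum(1023) - lum(1022))      # step near white: roughly 2e-3

The step near white is a couple of hundred times larger in absolute
luminance, yet the two steps are roughly comparable perceptually, which is
the trade-off described above.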
-----Original Message-----
From: colorsync-users
<colorsync-users-bounces+waynebretl=email@hidden> On Behalf Of
Roger Breton via colorsync-users
Sent: Thursday, January 09, 2020 8:11 PM
To: email@hidden
Subject: Monitor Luminance pattern
In trying to separate "myths from facts", I've created a small application to
directly feed my monitor 16-bit/channel RGB values, from 0 all the way to
65535. The results are here:
https://1drv.ms/b/s!AkD78CVR1NBqkpYWyI61LO82SM-FVA?e=rPFvOe
In analyzing the results, I confess I'm no programming wiz or mathematician,
and I'm having a hard time separating which part of my display "pipeline" is
responsible for what. The graph shows measured luminance values for RGB
levels in increments of 64 (0, 64, 128, 192, 256 ... 65535). It is clear
that the mapping is pretty linear in the darkest shadows and progressively
"deteriorates" (deliberately?) as we move towards the highlights. In an
ideal world, we would enjoy 1:1 tone mapping throughout, such that if we
start with 16-bit/channel RGB photography, we work in 16-bit mode in
Photoshop and get 16-bit/channel out to the monitor through a
16-bit/channel video card. Maybe in a few years.
To summarize: 16-bit RGB -> video card -> which sends 10-bit to the monitor
-> which passes it through a 14-bit LUT.
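If that pipeline is right, then a step of 64 at 16 bits corresponds to
roughly one code at 10 bits (65536 / 1024 = 64), so the sweep in my graph
would be landing on essentially every code the 10-bit link can carry. A
quick check, assuming the reduction is a simple rescale (my assumption only):

    # If the 16-bit -> 10-bit reduction is a plain rescale, stepping the
    # input by 64 advances the link value by about one 10-bit code.
    for v16 in (0, 64, 128, 192, 65344, 65408, 65472):
        print(v16, "->", round(v16 * 1023 / 65535))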
/ Roger Breton
_______________________________________________
Do not post admin requests to the list. They will be ignored.
colorsync-users mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden