If I had to guess without further info, it looks like the 10-bit values are gamma corrected, and the monitor then applies the inverse gamma, making the output luminance linear but with larger steps at higher luminance. That is what you want for the best perceptual uniformity and the least visibility of the quantizing steps at all luminance levels.

-----Original Message-----
From: colorsync-users <colorsync-users-bounces+waynebretl=cox.net@lists.apple.com> On Behalf Of Roger Breton via colorsync-users
Sent: Thursday, January 09, 2020 8:11 PM
To: colorsync-users@lists.apple.com
Subject: Monitor Luminance pattern

In trying to separate "myths from facts", I've created a small application to directly feed my monitor 16-bit/channel RGB values, from 0 all the way to 65535. The result is here:

https://1drv.ms/b/s!AkD78CVR1NBqkpYWyI61LO82SM-FVA?e=rPFvOe

In analyzing the result, I confess I'm no programming wiz or mathematician, and I'm having a hard time separating which part of my display "pipeline" is responsible for what. The graph shows measured luminance values for RGB levels in increments of 64 (0, 64, 128, 192, 256 ... 65535). It is clear that the mapping is pretty linear in the darkest shadows and progressively "deteriorates" (deliberately?) as we move towards the highlights.

In an ideal world, we would want 1:1 tone mapping throughout, such that, if we start with 16-bit/channel RGB photography, we get 16-bit mode in Photoshop and 16-bit/channel off the monitor through a 16-bit/channel video card. Maybe in a few years.

To summarize, 16-bit RGB -> video card, which sends 10-bit to the monitor, which passes this through a 14-bit LUT.
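
To make that concrete, here is a minimal Python sketch of that pipeline. It is not the test application above: it assumes a pure 2.2 power-law EOTF in the panel and a 100 cd/m^2 peak white (the actual contents of the 14-bit LUT are unknown), and it only shows how the luminance step per code grows toward the highlights, which is the shape the graph shows.

    # Minimal sketch of the pipeline described above, NOT the actual test
    # program: assumes a 10-bit link to the panel, a pure 2.2 power-law
    # EOTF, and 100 cd/m^2 peak white; the real 14-bit LUT is unknown.

    GAMMA = 2.2     # assumed display gamma
    PEAK = 100.0    # assumed peak white luminance, cd/m^2

    def simulated_luminance(v16):
        """16-bit code -> quantize to 10-bit link -> panel inverse gamma."""
        v10 = round(v16 / 65535 * 1023)
        return PEAK * (v10 / 1023) ** GAMMA

    # Sweep the 16-bit range (coarsely, for readable output) and print the
    # luminance step from one sample to the next.
    prev = None
    for v16 in list(range(0, 65536, 4096)) + [65535]:
        lum = simulated_luminance(v16)
        note = "" if prev is None else f"   step {lum - prev:8.4f}"
        print(f"{v16:5d} -> {lum:8.4f} cd/m^2{note}")
        prev = lum

With those assumed numbers, the printed step grows from a fraction of a cd/m^2 near black to over 10 cd/m^2 near white, which is the same general behaviour as the measured curve even though every 10-bit code is being used.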
/ Roger Breton