Andrew, I'll start "humble": at this point I'm only interested in determining what kind of "information reduction" is at play between pixel values in a file and measured pixel values on screen. For sure, we all *know* it's not a 1:1 relationship. This is crazy, but I confess I never stopped to think about what happens in Photoshop when working in 16-bit mode. Suppose my video card can only honor 8 bits: can you imagine the huge "reduction" going on behind the scenes between the image in the file and the image on screen? Now, what happens in a 10-bit video card scenario? Assuming a 10-bit capable display like your 271Q and my 271W, and Wire's UP2916, surely the reduction can never be as severe. But I'd like to know, quantitatively, what kind of "compression" we're talking about. For now, I'm only interested in looking at R=G=B values, which will give plenty of data to graph and play with. But I don't think I'll expand this to the whole 16.7 million (let alone 1.07 billion) colors.

/ Roger

-----Original Message-----
From: colorsync-users <colorsync-users-bounces+graxx=videotron.ca@lists.apple.com> On Behalf Of Andrew Rodney via colorsync-users
Sent: Tuesday, January 7, 2020 3:25 PM
To: Roger Breton via colorsync-users <colorsync-users@lists.apple.com>
Subject: Re: perceptual differences in Lab deltaE

So if (ugh, that'd be hard) you're going to use your 16 million color test file, what's going to happen if it's in Adobe RGB vs. sRGB vs. ProPhoto RGB vs. the gamut and calibration of your display?

Andrew Rodney
http://www.digitaldog.net/
On Jan 7, 2020, at 1:21 PM, Roger Breton via colorsync-users <colorsync-users@lists.apple.com> wrote:
Ilah,
Glad you brought this up because that's exactly one of my current "pet projects". My code is not finished yet, but basically I'm going to be sweeping the entire 1024-level scale with my instrument to determine, exactly, what is what. I'll be glad to report as soon as I'm done -- can't wait!
/ Roger
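For what it's worth, the "compression" for neutrals that Roger describes can be estimated in a few lines of Python. This is only a sketch: it assumes a simple rounding remap from the file's bit depth to the display path's bit depth, ignoring the video card's LUT, dithering, and Photoshop's internal 15-bit+1 representation, all of which change the real numbers.

```python
# Sketch of bit-depth reduction for neutral (R=G=B) values, assuming
# plain rounding from source to destination scale. Real GPU/LUT
# pipelines (and any dithering) will behave differently.

def quantize(value, src_bits, dst_bits):
    """Map an integer code value from a src_bits scale to a dst_bits scale."""
    src_max = (1 << src_bits) - 1
    dst_max = (1 << dst_bits) - 1
    return round(value * dst_max / src_max)

# How many distinct on-screen levels survive for the 65,536 file grays?
levels_8  = {quantize(v, 16, 8)  for v in range(65536)}
levels_10 = {quantize(v, 16, 10) for v in range(65536)}

print(len(levels_8))    # 256 distinct levels through an 8-bit path
print(len(levels_10))   # 1024 distinct levels through a 10-bit path
print(65536 // 256)     # ~256 file values collapse onto each 8-bit level
print(65536 // 1024)    # ~64 file values collapse onto each 10-bit level
```

So, quantitatively, an 8-bit path folds roughly 256 distinct 16-bit file grays onto every single screen level, while a 10-bit path folds only about 64 — a 4x less severe reduction, before any measurement is made.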