Re: High End Displays
- Subject: Re: High End Displays
- From: Terry Wyse <email@hidden>
- Date: Thu, 01 May 2008 05:40:43 -0700
Hi Mark,
I'm basically saying that doing something like luminance adjustment (and color temperature, for that matter) via software and the video card LUTs is not optimal compared to doing it at the hardware level. When you do it "in software" but via DDC, the adjustments are actually being made in the monitor's hardware, not in the video card LUTs. You can see this clearly in ColorEyes Display Pro (CED), which shows both the video card LUT curves and the monitor LUT curves. In my case, I calibrated my EIZO CG211 via DDC, which left the video LUT linear and made the adjustments in the monitor's hardware LUT at its higher bit depth. With my "tanning bed" 24" Gateway, which doesn't support DDC AFAIK, I had to rely on video card LUT adjustments instead, which I could also see in CED.
My understanding is that the difference comes down to bit depth: the video card LUTs are 8-bit versus 10 bits or more in the monitor LUTs. In the case of the Gateway display, where I was taking it from 350 cd/m2 all the way down to, say, 150 cd/m2, I probably had only 4-6 bits of effective precision left once the luminance was reduced, limiting the number of levels available for the remaining calibration adjustments...and compromising the accuracy of the profile.
Make sense? Of course, someone should straighten me out if this is not exactly correct.
Regards,
Terry Wyse
On Wednesday, April 30, 2008, at 07:08PM, "Mark Segal" <email@hidden> wrote:
>Terry,
>
>This is interesting - but I'm not sure how well I understand what you're saying here. When I set up CED to profile the monitor, I set three parameters in the CED interface (and the DDC business does the rest), one of which is luminance in candelas. I usually select 110 cd/m2. Are you saying that dialing down the display this way is inferior to using the brightness setting on the display panel itself to do the same thing? On my display, the brightness scale is in percentages, not candelas, and even if I set it there, I still need to tell CED what luminance target to use, which presumably overrides anything I set on the display itself because the DDC system is functioning. So what piece am I missing in this story?
>
>Mark
> ----- Original Message -----
> From: Terence Wyse
> To: 'colorsync-users@lists.apple.com' List
> Sent: Wednesday, April 30, 2008 1:59 PM
> Subject: Re: High End Displays
>
>
> ...I recently scrapped my entire MacBook Pro system
> and had to re-install all my apps, including ColorEyes Display. I took
> the opportunity to use it on my piece-of-crap 24" Gateway FPD2485W
> display on one of my other Macs and tested using the software to dial
> down the luminance vs. doing it using the display controls. I started
> at maximum/native luminance (360 cd/m2! you could use this display as
> a tanning bed!) and then dropped it in roughly 20% increments down to
> 120 cd/m2 using software/video LUT adjustment only. I then went back
> and repeated the test using hardware luminance adjustment, leaving the
> software at "maximum" luminance. ColorEyes' "Validation" history
> clearly showed that hardware luminance adjustment is superior to
> having the software adjust the video LUTs to bring down the
> brightness...
>
> Regards,
> Terry
>
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Colorsync-users mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden