re: On SpectraView 2180WG LCD
- Subject: re: On SpectraView 2180WG LCD
- From: "paul graham" <email@hidden>
- Date: Sun, 15 Jan 2006 10:04:33 -0500
- Importance: Normal
From: paul graham [mailto:email@hidden]
Sent: Friday, January 13, 2006 10:48 AM
To: 'email@hidden'
Subject: re: On SpectraView 2180WG LCD
on the subject of >8-bit display pipelines: this was posted on the Yahoo T2x
forum, and I thought it might be of some interest here.
There may, of course, be mistakes in the thoughts/information, as he freely
states.
paul
>>> snip:
Most DirectX9 graphics cards (I believe) have support for 10-bit
output - there are A2R10G10B10 and A2B10G10R10 standard modes;
some could, theoretically, use 16-bit floating-point video frame
buffers, although I'm not sure how good the support is. DirectX only
supports the 10-bit modes full-screen, so you can't select them
for the desktop, at least in XP - although the LUTs which most
cards use when you calibrate the monitor usually have 10 bits
of output (so you get 10 bits of accuracy with your 8-bit desktop).
However, this applies only to the RAMDAC outputs - although the
newer ATi cards do dither corrected colours when you use DVI, I
think most others just truncate DVI to 8 bits. Hence one of the
"CRTs are better than LCDs" arguments, and why I've tried to
jump through hoops to profile my LCDs independently of attempts
to calibrate them using graphics card LUTs (which my colorimeter
software insisted on trying to do by default).
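To make the 10-bit modes mentioned above concrete, here is a minimal sketch of how a 2-bit alpha and three 10-bit channels fit into the 32-bit A2R10G10B10 word (the function names are mine for illustration, not part of any DirectX API):

```python
def pack_a2r10g10b10(a2, r10, g10, b10):
    """Pack a 2-bit alpha and three 10-bit channels into one 32-bit word
    (A2R10G10B10 layout: alpha in the top two bits, then R, G, B)."""
    assert 0 <= a2 < 4 and all(0 <= c < 1024 for c in (r10, g10, b10))
    return (a2 << 30) | (r10 << 20) | (g10 << 10) | b10

def unpack_a2r10g10b10(word):
    """Recover (alpha, red, green, blue) from a packed 32-bit word."""
    return ((word >> 30) & 0x3, (word >> 20) & 0x3FF,
            (word >> 10) & 0x3FF, word & 0x3FF)
```

Note that 2 + 10 + 10 + 10 is exactly 32 bits, which is why these modes fit the same frame buffer footprint as ordinary 8-bit-per-channel ARGB.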
I was under the impression that the extra colour depth on these
monitors is mostly for calibration purposes, and that they have
custom LUT support, like the dithering built in to the T221. At
least one of the two brands above blatantly dithers - the number
of colours supported not being a power of two - but whether you
can see it is another matter. I've expressed concern about
dithering in the past, mainly because by the time Photoshop has
done it, your graphics card has done it, and your monitor has
done it, you could end up with anything... but I'm probably
paranoid.
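For a sense of what any one stage of that dithering chain does, here is a minimal 1-D error-diffusion sketch (my own illustration, not what any particular card or panel actually implements) that quantizes 10-bit samples down to 8 bits while preserving the average level:

```python
def dither_10_to_8(samples10):
    """Quantize 10-bit samples (0-1023) to 8 bits (0-255), diffusing the
    rounding error into the next sample so the mean level is preserved."""
    out, err = [], 0.0
    for s in samples10:
        ideal = s / 1023 * 255 + err          # target value plus carried error
        q = min(255, max(0, round(ideal)))    # nearest representable 8-bit level
        err = ideal - q                       # pass the residual forward
        out.append(q)
    return out
```

Feed it a flat field of 10-bit value 512 and it alternates between 127 and 128 so that the average lands on the unrepresentable 127.65 - which is exactly why stacked, uncoordinated dithers (Photoshop, card, monitor) could interact unpredictably.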
I didn't know that the NEC supported 10-bit input; interesting.
The DVI spec (section 2.2.3) does say you can do high colour
support by sending the MSB over the primary link and the LSB
over the secondary (so it's sixteen bit or nothing). 3.2.3 of
"DI-EXT" (extension block for EDID - out of my depth here, but
downloadable from VESA) shows how to specify 48-bit high colour
depth modes, so it looks like it's all standard. So I'm not sure
what NEC are worried about, unless they just haven't tested it.
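The MSB/LSB split described above can be sketched in a few lines (the helper names are mine, and the real links carry pixels inside the DVI TMDS framing rather than bare bytes):

```python
def split_dual_link(sample16):
    """Split a 16-bit channel sample into the 8-bit MSB (primary link)
    and the 8-bit LSB (secondary link), per the scheme described above."""
    assert 0 <= sample16 < 0x10000
    return (sample16 >> 8) & 0xFF, sample16 & 0xFF

def join_dual_link(msb, lsb):
    """Reassemble the 16-bit sample at the receiving end."""
    return (msb << 8) | lsb
```

Since each link contributes exactly eight bits per channel, intermediate depths like 10 or 12 bits have no natural home in this scheme - hence "sixteen bit or nothing".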
I'd expect the newer (AVIVO) ATi cards to be able to handle this,
if anything can - I believe Brightside used one to drive a high
dynamic range display with a 16-bit input, although they may have
been doing custom things with the display rather than running in a
simple high bit depth mode. Please don't buy one on my say so
without checking, though!
I don't believe Photoshop currently offers running in a 10-bit
full screen mode as an option, and that would be the most
obvious application to me. There are rumours that Vista might
support higher bit depths, I think (if Aero Glass uses DirectX
for rendering everything, there's no reason why it shouldn't).
This may be what "future compatibility" means.
A number of medical devices support 10-bit greyscale, but I
think they mostly use custom display modes (eight bits from
green and two from one of the other channels), or reconstitute
luminance from bit stealing (see a paper mentioned on this group
some time ago). I've not met a dual link one.
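One plausible reading of the "eight bits from green, two from another channel" scheme, sketched as code (the choice of blue for the low bits, and the exact bit layout, are my assumptions):

```python
def encode_grey10(g10):
    """Encode a 10-bit grey level (0-1023) as an (R, G, B) triple:
    top eight bits in green, bottom two in blue (channel choice assumed)."""
    assert 0 <= g10 < 1024
    return (0, g10 >> 2, g10 & 0x3)

def decode_grey10(r, g, b):
    """Recover the 10-bit grey level at the display end."""
    return (g << 2) | (b & 0x3)
```

The appeal is that the whole thing rides over an ordinary 8-bit-per-channel link; the display firmware just has to know which bits to reassemble.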
Some plasma televisions claim to support an enormous number of
colours. I'm a little hazy on *how* - presumably it's only down
the analogue input, since they're certainly not dual link. Is
there a YUV format for HDMI which I don't know about? 10:7:7
or 12:6:6 might actually look better than 8:8:8 RGB on TV inputs,
and I know some DVD players work with more than 8 bits of luminance
data...
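The arithmetic behind that suggestion: 10:7:7 and 12:6:6 spend the same 24 bits per pixel as 8:8:8 RGB, but buy four or sixteen times as many luminance levels at the cost of chroma precision. A quick sketch:

```python
# At a fixed 24 bits per pixel, Y'CbCr-style splits such as 10:7:7 trade
# chroma precision for extra levels on the first (luminance) channel.
budgets = {"8:8:8 RGB": (8, 8, 8),
           "10:7:7 YUV": (10, 7, 7),
           "12:6:6 YUV": (12, 6, 6)}
for name, (b0, b1, b2) in budgets.items():
    assert b0 + b1 + b2 == 24                 # same total bit budget per pixel
    print(f"{name}: {2 ** b0} levels on the first channel")
```

Since the eye resolves luminance gradations far more finely than chroma, shifting the budget this way could plausibly look smoother on TV-style content.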
The gamut with the NEC and Eizo panels impresses me more than
the extra colour depth support (not that I've had much chance to
look at them). The extra colour depth is partly there to make up
for the stretched range - correctly calibrated, there wouldn't be
enough samples to make the screen look right, otherwise. AFAIK
you really do need to have them properly profiled, though,
otherwise the colours will be miles off because of the nonstandard
filters. I'm still coveting a CG220 to go with my T221, although
I might consider a 3007WFP (what was wrong with FPW?) as an
alternative. Not that I can afford them, but I can dream.
I hope that helps (and that the more TFT-savvy members of the
group will forgive me for any misinformation I've inadvertently
spread).
--
Fluppeteer
>>>
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Colorsync-users mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden