Re: On SpectraView 2180WG LCD
- Subject: Re: On SpectraView 2180WG LCD
- From: William Hollingworth <email@hidden>
- Date: Sun, 08 Jan 2006 19:05:09 -0600
At 12:34 PM 1/8/2006 +0000, Steve Kale wrote:
> Thanks for this. For a bit of a novice, can you elaborate on a couple
> of points? When you say the "frame buffer", how does this relate to the
> front-side (PCI) bus? I thought that the latter is a real bottleneck and
> that current OSs - well, Mac OS X Tiger at least - are >8 bit capable. Most
> expect the next generation of Mac desktops to adopt at least PCI Express x16,
> freeing up this bottleneck substantially.
The underlying bus, be it PCI or PCI Express, really has nothing to do with
the support of high bit depth video. PCI is quite capable of transferring 8,
16, 32 and 64 bit data. PCI Express is basically just a faster bus.
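To put rough numbers on that point, here is a small sketch comparing peak bus throughput (using nominal, back-of-the-envelope figures I am assuming, not measured values). The takeaway is that even classic PCI moves data at rates unrelated to per-channel bit depth; bit depth support lives in the GPU, frame buffer, and drivers.

```python
# Rough bus-bandwidth arithmetic (nominal figures, for illustration only).
# Bit depth support is a GPU/driver/API question, not a bus question.

def pci_bandwidth_mbps(bus_width_bits, clock_mhz):
    """Peak throughput in MB/s for a parallel bus (width x clock)."""
    return bus_width_bits / 8 * clock_mhz

classic_pci = pci_bandwidth_mbps(32, 33)  # classic 32-bit/33 MHz PCI
pcie_x16_gen1 = 16 * 250                  # PCIe 1.x: ~250 MB/s per lane, x16 lanes

print(f"Classic PCI:  ~{classic_pci:.0f} MB/s")
print(f"PCIe x16 1.x: ~{pcie_x16_gen1} MB/s")
```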
The main issue is the video card's RAM and GPU architecture. Currently most
hardware and video drivers are designed for 8 bits per color per pixel, not
to mention the APIs in the OSs that may need to be updated to support >8
bit color.
There are some highly specialized video cards that use frame buffers with
floating point pixel values. However those are for very specific
applications and hardware.
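As a quick illustration of what those extra bits buy, the sketch below works out how per-channel bit depth translates into distinct tone levels. More levels per channel means finer quantization steps, which is why >8-bit pipelines reduce visible banding in smooth gradients.

```python
# How per-channel bit depth translates into distinct tone levels.
# 8 bits/channel gives 256 steps per primary; 10 bits gives 1024.

def channel_levels(bits_per_channel):
    """Number of distinct levels a single color channel can represent."""
    return 2 ** bits_per_channel

for bits in (8, 10, 16):
    levels = channel_levels(bits)
    total = levels ** 3  # three primaries (R, G, B)
    print(f"{bits:2d} bits/channel: {levels:6d} levels/channel, {total:,} colors")
```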
> Re DVI, anything I've read on capacity gets confusing quickly. It would be
> great if you could add a little clarity. For example, I understood DVI and
> HDMI to have the same video bandwidth. hdmi.org claims that HDTV (with 8
> channel 192 kHz 24 bit audio alongside the video) uses less than half of
> HDMI's 5 Gbps bandwidth and hence has plenty of bandwidth for future
> upgrades. Various sources mention a single link DVI bandwidth of a "maximum
> of 165 MHz (1920x1080 at 60 Hz, 1280x1024 at 85 Hz)" without mentioning bit
> depth. ("Dual link DVI supports 2x165 MHz (2048x1536 at 60 Hz, 1920x1080 at
> 85 Hz)" again without mentioning bit depth.)
As far as RGB video, HDMI is basically DVI with a different connector and
the addition of digital audio on the same cable.
You are correct on the resolution limits of single-link DVI/HDMI, and those
are for 8 bit video. If you want to make hardware that supports >8 bit
video, then you need to use the dual link feature of DVI, which allows the
second channel to be used to carry the extra 2 (or more) video data bits
per color. The downside of doing this is that you then can't use the second
link to carry data for a higher resolution, so you are stuck choosing
between higher resolution and higher bit depth.
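The trade-off above can be sketched with a little pixel-clock arithmetic. A single DVI TMDS link tops out at a 165 MHz pixel clock carrying 8 bits per color; anything needing a higher clock must borrow the second link. The ~25% blanking overhead below is my own rough assumption (actual CRT/CVT timings vary), so treat the numbers as approximate.

```python
# Why higher resolution and >8-bit color compete for DVI's second link.
# Assumes ~25% blanking overhead on top of active pixels (rough estimate).

SINGLE_LINK_CLOCK_MHZ = 165.0  # max pixel clock per DVI TMDS link

def pixel_clock_mhz(h_active, v_active, refresh_hz, blanking=1.25):
    """Approximate pixel clock in MHz for a given mode."""
    return h_active * v_active * refresh_hz * blanking / 1e6

for w, h, hz in [(1920, 1080, 60), (1280, 1024, 85), (2048, 1536, 60)]:
    clk = pixel_clock_mhz(w, h, hz)
    links = "single link" if clk <= SINGLE_LINK_CLOCK_MHZ else "dual link"
    print(f"{w}x{h} @ {hz} Hz: ~{clk:.0f} MHz -> {links} at 8 bits/color")
```

With these assumed timings, 1920x1080 at 60 Hz and 1280x1024 at 85 Hz both fit under 165 MHz on one link at 8 bits/color, while 2048x1536 at 60 Hz needs both links; a mode that uses the second link for extra bit depth therefore can't also use it for extra pixels.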
> It would seem that, with "consumer video electronics" (i.e. HDTV and the
> like) embedding HDMI, and computer and display manufacturers grappling for
> ways to get past the 8 bit colour issue, the two are heading in different
> directions again. The new UDI is said to be compatible with HDMI, but are
> there going to be bit depth bottleneck issues due to bandwidth constraints?
> I guess my points are predicated on whether or not HDMI has in fact the
> same video bandwidth as DVI or whether it's already substantially better.
> Is UDI intended to address 16 or better bits per channel colour? How long
> do we have to wait for the display path to catch up with our editing
> workflow? Your thoughts and knowledge would be greatly appreciated.
All very good questions! Things are perhaps getting even more complicated
with VESA working on DisplayPort. At this point I don't have answers
regarding UDI. Perhaps someone from Apple would like to clarify....?
Will
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Colorsync-users mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden