Re: Video card LUT depth
- Subject: Re: Video card LUT depth
- From: neil snape <email@hidden>
- Date: Mon, 02 Jun 2008 12:25:22 +0200
- Thread-topic: Video card LUT depth
on 2/06/08 11:41, Graeme Gill wrote:
> In the process of tweaking my software, I've made an interesting observation
> about the NVidia 8600GT graphics card that I have installed in my MSWin2K
> machine: It has 10 bit D/A converters on its VGA output. This was discovered
> by measurement, none of the literature that accompanies the card actually
> mentions
> it. I would guess this would be of no benefit if you were using a DVI
> connected LCD
> display though.
>
When they announced this card I remember reading that it was supposed to do
10-bit display processing, but at the time I thought that wouldn't benefit
the current operating systems or connections, as you already know.
I still remember a very old Thunder card for Macs, a 10-bit PCI card
available around 1997 or 1998.
Do you think there might be a way of sending 10 bit display directly to the
monitor?
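Graeme's point that the DAC depth was "discovered by measurement" can be
illustrated with a small sketch (hypothetical Python, not from the thread):
LUT entries are typically 16-bit values, and the DAC quantizes each entry to
its physical depth, so a fine sweep of LUT values that collapses to a few
output levels on an 8-bit DAC stays distinct on a 10-bit one.

```python
# Hypothetical illustration: why measuring output levels reveals DAC depth.
# A video-card LUT entry is nominally 16-bit; the DAC quantizes it to its
# physical bit depth before driving the VGA output.

def quantize(value16, dac_bits):
    """Quantize a 16-bit LUT entry to the DAC's depth, scaled back to 16-bit."""
    levels = (1 << dac_bits) - 1
    return round(value16 / 65535 * levels) / levels * 65535

# Sweep small LUT steps near black: an 8-bit DAC collapses neighbouring
# entries onto the same output level, a 10-bit DAC resolves finer steps.
sweep = range(0, 1024, 64)
distinct8 = len({quantize(v, 8) for v in sweep})
distinct10 = len({quantize(v, 10) for v in sweep})
print(distinct8, distinct10)  # the 10-bit DAC yields more distinct levels
```

A measuring instrument pointed at the screen during such a sweep sees the
same effect as the `distinct` counts above, which is presumably how the
10-bit DACs on the 8600GT's VGA output were detected.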
Below from the Nvidia tech spec page:
Advanced Display Functionality
Two dual-link DVI outputs for digital flat panel display resolutions up to
2560x1600
One dual-link DVI output for digital flat panel display resolutions up to
2560x1600
One single-link DVI output for digital flat panel display resolutions up to
1920x1200
Dual integrated 400MHz RAMDACs for analog display resolutions up to and
including 2048x1536 at 85Hz
Integrated HDTV encoder provides analog TV-output
(Component/Composite/S-Video) up to 1080i resolution
NVIDIA nView® multi-display technology capability
10-bit display processing
--
Neil Snape photographer Paris France email@hidden
http://www.neilsnape.com
_______________________________________________
Colorsync-users mailing list (email@hidden)