Re: Are gamma discrepancies relevant?
- Subject: Re: Are gamma discrepancies relevant?
- From: Marco Ugolini <email@hidden>
- Date: Fri, 03 Apr 2009 11:39:39 -0700
- Thread-topic: Are gamma discrepancies relevant?
In a message dated 4/3/09 12:50 AM, Koch Karl wrote:
> Marco Ugolini wrote:
>> It creates the possibility of that sort of result, yes.
> That's all I ever meant to say ;-)
Me too. Sometimes the banding is close to nonexistent, other times it is
barely noticeable, and in a few instances it's readily visible.
> But "possibility" is very vague if you want to be on the safe side,
> you want to avoid "possibilities" altogether.
Yes, "possibility" is vague, but it was my way of pointing out that this
does not always result in horrible banding: sometimes it's just
"nothing-much" banding, and even practically none.
Obviously I agree with you completely that even a "possibility" of banding
is cause for concern (a "sizable concern", as I had put it myself). My remark
was never meant to dismiss the reality and seriousness of the phenomenon.
>>> (As the color spaces are not perceptually uniform, some of them are
>>> more annoying than others.)
>>
>> Meaning just monitor profile spaces, or all color spaces, working
>> spaces included?
> Even more so when they interact, and they always do. If you display
> a file on your monitor, a working space and a monitor space are
> always involved.
Of course. That is understood.
>> Again, I don't think that "co-ordinating working- and display TRCs"
>> does much to solve the problem, because it's a solution only in the
>> absence of vcgt curves, which always exist in monitor profiles.
> But they are linear and thus innocuous if you have a hardware-
> calibrated monitor.
Right -- if you have a monitor with its own on-board high-bit LUT, the 8-bit
LUT in the CPU's display card is kept perfectly linear, and, at least in
theory, the results ought to be practically banding-free. Still, 8-bit
environments are inherently limited by their rounding errors, and I'd think
that the results would be remarkably better if the video signal itself
originated as high-bit, correct? After all, no matter how good it is, a
monitor's on-board high-bit LUT still has to work from the 8-bit signal
coming out of the CPU's display card.
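To put a number on that last point, here is a back-of-the-envelope Python
model (my own illustration; the 2.2-to-1.8 conversion and the 12-bit signal
are made-up examples): a gray ramp is converted from the working-space
encoding to the display encoding at a given bit depth, and the monitor's
14-bit LUT is modeled as an ideal identity. It can redistribute the codes it
receives, but it cannot resurrect the levels already merged in an 8-bit
signal, whereas a high-bit signal keeps far more of them distinct.

# Model: a gray ramp is converted from a gamma-2.2 working space to a
# gamma-1.8 display space (example exponent, standing in for whatever
# correction the pipeline applies) at a given bit depth, then handed to
# the monitor's high-bit LUT, modeled here as an ideal 14-bit identity.

RATIO = 2.2 / 1.8   # example correction exponent, not a measured value

def ramp(bits):
    n = 2 ** bits
    return [v / (n - 1) for v in range(n)]

def convert_and_quantize(values, bits):
    n = 2 ** bits
    return [round((v ** RATIO) * (n - 1)) for v in values]

# Case 1: conversion done in the 8-bit pipeline feeding the display card.
signal_8bit = convert_and_quantize(ramp(8), bits=8)
# The monitor's 14-bit LUT only ever sees these 8-bit codes; rescaling
# them into its 14-bit domain creates no new distinct levels.
after_monitor_lut = {code * (2 ** 14 - 1) // 255 for code in signal_8bit}

# Case 2: the same conversion carried out on a hypothetical 12-bit signal.
signal_12bit = convert_and_quantize(ramp(12), bits=12)

print("8-bit signal, distinct levels after conversion: ", len(set(signal_8bit)))
print("  ...after the monitor's 14-bit LUT:            ", len(after_monitor_lut))
print("12-bit signal, distinct levels after conversion:", len(set(signal_12bit)))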
So, the question comes to mind again: why are we still without standard 12-,
14-, or 16-bit display cards in the year 2009? What's taking so long? It
cannot be just a technical problem: it must be in great part because it's
not perceived as a pressing user concern.
>> I agree much more strongly on the desirability and usefulness of on-
>> board high-bit monitor LUTs, or even ... heaven forfend ... high-bit
>> display cards in the CPU itself!
> We're on the way, if you look at the new Mac Pros.
On the Apple site I can't see any specific bit-depth information for the
Mac Pro's graphics cards. Are they high-bit?
> But there is still a long way to go. As a software manufacturer, I dread
> the consequences. As long as there is no established standard, and as long
> as we have a mixture of "old" 8-bit and new high-bit graphics cards plus
> high-bit LUTs in monitors, a general solution that suits all scenarios
> will be complicated to write and thus so expensive that no one will
> be prepared to pay the price ;-)
Maybe we should keep it simple for now, and limit ourselves just to a
high-bit display card, assuming a monitor that does not have its own
on-board high-bit LUT. Isn't it better to start with a high-bit video signal
in the CPU anyway? Whatever other problems come up after that has been
taken care of could then be tackled in their turn.
>> You may say I'm a dreamer, but I'm not the only one... ;-)
> no, you're not!
Peace, bro. :-)
Marco