Re: X-Rite color management (was Eye-One Diagnostics)
- Subject: Re: X-Rite color management (was Eye-One Diagnostics)
- From: Marc Levine <email@hidden>
- Date: Mon, 05 Jan 2004 22:58:49 -0500
> Is the gamut size (which many seem to look at first) really an indicator of
> the quality of either the profile or characterizing? What would one be
> looking for assuming they used two instruments (and the same host software
> to level the playing field)?
Andrew (et al),
Gamut size and shape are more important for comparing one device to another,
but they are less useful when trying to evaluate the performance of an
instrument and its resulting display profile. There are basically three
things you need to be concerned with in display profiling:
1) Device condition. Ideally, you want to have your device set up so that
you can utilize the full dynamic range that the device is delivering. In
other words, you don't necessarily want to run a device "wide open" if you
are trying to match it to something else (another monitor perhaps) with a
lower dynamic range. In such a case, you would need to "clip" a software
LUT, which could potentially degrade the tonality of your profile. Of
course, you don't want to restrict the gamut of the device too much in
calibration - this would prevent the display from being used to its full
potential.
2) Device Utilization. Once a profile is made, it's a good idea to check
that changing your system levels (the RGBeez in Photoshop) actually
correlates to a change in Lab (a visual change) on screen. I know of a great
test that Bruce prescribes using Photoshop to identify whether the profile
is clipping or not as it goes to 0,0,0 black (a rough version of that kind
of check is sketched after item 3).
3) Profile Visual Attributes (Accuracy/Tonality). These are really the big
things. Profiles should produce a smooth image when rendering Lab to device.
Many a Lab wedge has been created in Photoshop as a basic test. A fantastic
file was sent to me for testing that contained several ramps and gamma
indicators to help identify profile smoothness and accuracy. In terms of
color, a profile should be able to produce color in a measurably accurate
manner and achieve a desired balance of saturation and detail. To test
specific colors, you could display a series of colors, measure the Lab
values, and compare the two (MonacoOPTIX Pro software does this). To
evaluate saturation and detail, you would probably need to check out a
number of images and look for results in saturated areas.
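[Editor's note: the sketch below is a generic Python illustration of the
three checks above, not Bruce's test and not anything from the MonacoOPTIX
software. It assumes an sRGB-like response (gamma curve plus D65 matrix) as
a stand-in for a real measured profile, and the "measured" Lab value at the
end is made up purely for illustration.]

```python
import math

D65 = (0.95047, 1.00000, 1.08883)            # reference white, Y normalized to 1
SRGB_TO_XYZ = (                               # linear-RGB-to-XYZ matrix (D65)
    (0.4124, 0.3576, 0.1805),
    (0.2126, 0.7152, 0.0722),
    (0.0193, 0.1192, 0.9505),
)

def rgb_to_lab(r, g, b):
    """8-bit RGB -> Lab, assuming an sRGB transfer curve and primaries."""
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    lin = [linearize(c) for c in (r, g, b)]
    xyz = [sum(m * v for m, v in zip(row, lin)) for row in SRGB_TO_XYZ]
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16.0 / 116.0
    fx, fy, fz = (f(c / w) for c, w in zip(xyz, D65))
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def delta_e76(lab1, lab2):
    """CIE 1976 delta E: Euclidean distance in Lab."""
    return math.dist(lab1, lab2)

# 1) Device condition: clipping a software LUT to ~80% of range collapses
#    256 input levels into fewer distinct 8-bit output codes (lost tonality).
clipped_lut = [round(i * 0.8) for i in range(256)]
print("distinct output levels after clipping:", len(set(clipped_lut)))

# 2) Device utilization: do near-black RGB steps still map to distinct L*,
#    or does everything clip together on the way to 0,0,0?
for v in (0, 4, 8, 12, 16):
    L, _, _ = rgb_to_lab(v, v, v)
    print(f"RGB {v:2d},{v:2d},{v:2d} -> L* {L:5.2f}")

# 3) Accuracy: compare the Lab a profile predicts for a patch against what
#    an instrument actually measured on screen (hypothetical reading here).
predicted = rgb_to_lab(200, 30, 30)
measured = (45.0, 60.0, 38.0)                 # stand-in instrument reading
print(f"delta E (1976) vs. measurement: {delta_e76(predicted, measured):.2f}")
```

In real use you would substitute the actual display profile and actual
instrument readings, but the point stands: each of the three checks boils
down to a small, concrete comparison.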
> OK so getting back to the instrument spec's, are we at a point where either
> device would produce the same results (assuming the same host software which
> has to play a role) as far as the user is concerned? In other words, are
> either device spec's at a level of accuracy beyond what we'd be able to
> benefit from? If not, is there a spec at which one should take notice of?
> Let's take a scanner as an analogy. Most of us know that the spec's for
> dynamic range are spotty at best but they do provide some idea of where the
> scanner stands (dynamic range spec of 2.8 verses 3.4 is a huge difference
> even if both are off).
From what I have seen, the more accurate, the better. The more precise and
repeatable the instrument, the more accurate the data. The more accurate the
data, the more accurate the profile. Consider that you can discern a very
small difference on white, and that the perception of your image on screen
relies heavily on that established white point. A more precise instrument
means that, when the software gives you feedback on how to adjust your
monitor controls, that feedback is accurate! Also keep in mind that monitor
profiles are built on relatively small sets of data, and small differences
in that data can have a real impact on your profile's accuracy and
smoothness.
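[Editor's note: to put a rough number on "a very small difference on white,"
here is a back-of-the-envelope Python sketch. The 2% error on one channel of
a D65 white measurement is an invented figure, purely for illustration.]

```python
def xyz_to_lab(x, y, z, white=(0.95047, 1.00000, 1.08883)):
    """XYZ -> Lab relative to a given reference white (D65 by default)."""
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16.0 / 116.0
    fx, fy, fz = (f(c / w) for c, w in zip((x, y, z), white))
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

# nominal D65 white, and the same white with its Z (blue) channel read 2% high
X, Y, Z = 0.95047, 1.00000, 1.08883
true_lab = xyz_to_lab(X, Y, Z)
noisy_lab = xyz_to_lab(X, Y, Z * 1.02)

dE = sum((a - b) ** 2 for a, b in zip(true_lab, noisy_lab)) ** 0.5
print("true white  Lab:", tuple(round(v, 2) for v in true_lab))
print("noisy white Lab:", tuple(round(v, 2) for v in noisy_lab))
print(f"delta E (1976):  {dE:.2f}")   # roughly 1.3 -- a visible bluish shift
```

A delta E of around 1 is commonly quoted as about the smallest difference
most people can see side by side, so even a couple-percent error in a white
measurement is already on the edge of visible, and because the profile's
white point anchors every neutral on screen, that error doesn't stay
confined to white.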
> I ask this based on a very eye opening exercise done many years ago. Using
> Optical due to the large number of instruments supported, I used several
> instruments on one display naively expecting to get nearly identical results
> only to find pretty extreme differences in the final calibration and
> profiles. Of course it's not all that easy to switch profiles and evaluate
> effectively which instrument did a "better" job (hence the origin of this
> question).
The reality is that it's very difficult to get to the truth about an
instrument using this method because you don't know what manufacturers are
doing in the software. It is quite possible that - even though you are using
the same piece of software - data from different instruments may be handled
differently in order to get everything to look the same. I'm not saying this
is necessarily the case, but it's definitely a possibility.
> It's a lot easier to profile a printer using several packages (and even the
> same Spectrophotometer) and compare what you get from the same test image.
> When it comes to displays, it's a lot more difficult.
Yes. In general, I think the two are relatively similar. The biggest
difference is that some people are more in tune with what they like to see
on paper than with what they like to see on the monitor. Maybe it comes from
years of "you can't trust the monitor," or maybe it has to do with the fact
that the average person probably sees a lot more things on paper than they
do on monitors, so they just innately feel "more experienced" (or should I
say more comfortable) making a critical decision about color on a piece of
paper than on a screen. Whatever the case, your display profile is the
window into your imaging environment, and its accuracy has a direct impact
on the expectation of color before a print is made. Color management is not
just about liking the output (although that is HUGE!), it's about getting
the results you expect.
In terms of display hardware, I would say that accuracy is king. Look at the
specs, look at the price, make your decision. On the software side,
technology and toolset make the difference. Easy to use is good. Powerful is
good. Results will ultimately tell the story. What you'll find in the latest
display calibration software from X-Rite and Monaco is that both hardware
and software are unrivaled in their combination of capability and
performance.
Now that's a long post!
Marc
--
Marc Levine
Sales Guy
Technical Guy
Monaco Systems / X-Rite, Inc.
www.monacosys.com / www.xrite.com
email@hidden