Re: Re[2]: Best CRT or Flat Monitor for Softproofing?
- Subject: Re: Re[2]: Best CRT or Flat Monitor for Softproofing?
- From: Roger Breton <email@hidden>
- Date: Fri, 13 Feb 2004 08:33:10 -0500
> How do you test how "accurate" a measurement device is?
>
> Looking forward to hear from your experience.
>
> Peter
That is the eternal million-dollar question. The only thing you can hope
for is that the instrument is as close to factory calibration as can be,
to begin with. Heavy color measurement instrument users will send their
instruments in for 'certification' on an annual basis as part of their
ISO-900XX procedures. Beyond that, accuracy (IMHO) is a function of the
quality of the instrument, how closely you adhere to the manufacturer's
recommendations for correct usage, and the adequacy of the instrument for
the application at hand. For example, there is no use measuring metallic
inks with a 45/0 geometry; you need a sphere-based instrument to do that.
Same with foil. I, myself, am very concerned with this very subject of
instrument accuracy in the context of monitor measurements. It takes a lot
of reading to understand what's involved, electronically, physically and
conceptually, at the moment you press the read button. And I am constantly
searching for a better instrument, albeit one I -- mere mortal -- can
afford in this lifetime without taking a second mortgage on my house.
Bruce Fraser was alluding to a Minolta CA-100 yesterday for making black
luminance measurements. That's a type of instrument generally viewed as
accurate. A good spectroradiometer is also generally considered a more
accurate instrument. But there are opinions to the effect that a
spectroradiometer, which measures light energy at the wavelength level,
may not be best suited for measuring "broadband" devices like
self-luminous displays.
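To make the "wavelength level" point concrete, here is a minimal sketch of how tristimulus values fall out of a spectral measurement: XYZ is a weighted sum of the measured spectral power against the CIE Standard Observer color-matching functions. A spectroradiometer measures the spectrum directly and does this sum in software; a filter colorimeter has to approximate the same weighting with physical filters. The function names and the numeric table below are my own illustrative placeholders, not real CIE data.

```python
# Toy sketch of tristimulus integration. A spectroradiometer measures the
# spectral power distribution (SPD) per wavelength, then weights it by the
# Standard Observer color-matching functions (xbar, ybar, zbar) and sums.
# The CMF values below are ILLUSTRATIVE placeholders, not real CIE data.

TOY_CMF = {
    # wavelength_nm: (xbar, ybar, zbar) -- invented numbers for illustration
    450: (0.33, 0.04, 1.77),
    550: (0.43, 0.99, 0.01),
    650: (0.28, 0.11, 0.00),
}

def tristimulus(spd: dict, cmf: dict = TOY_CMF, step_nm: float = 100.0):
    """Approximate X, Y, Z by a coarse Riemann sum over the sample points."""
    x = y = z = 0.0
    for wl, power in spd.items():
        xbar, ybar, zbar = cmf[wl]
        x += power * xbar * step_nm
        y += power * ybar * step_nm
        z += power * zbar * step_nm
    return x, y, z

# An equal-energy source sampled at the three wavelengths:
print(tristimulus({450: 1.0, 550: 1.0, 650: 1.0}))
```

The hard part for a colorimeter is that its three hardware channels must physically reproduce the xbar/ybar/zbar weighting curves; any filter mismatch shows up as a source-dependent error in X, Y, Z.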
If you want accuracy, you are also going to have to define it relative to
something, since (others can correct me if I am wrong) a colorimeter, for
example, is very hard to make perfectly match the Standard Observer
functions. Only very, very expensive commercial instruments achieve such a
feat. What most colorimeter manufacturers do is fine-tune the response of
their instrument to a certain source of light. As far as I know, the
better they do it, the better the accuracy. But, caveat emptor, they must
use a source of light that is closely related to the types of lights YOU
intend to use the instrument on. Otherwise, the readings will be biased.
For example, if a manufacturer were to calibrate its colorimeters to an
illuminant A type of source (2856K), that would not be very well suited
for measuring light coming from CRTs and LCDs. Unfortunately,
manufacturers won't disclose the type of light they calibrate their
instruments to in their specs, for competitive reasons. But I've heard
from ColorVision that they use a D65-like source to calibrate their
Spyders. Is that true? I don't know. For its part, last time I checked,
X-Rite was using some kind of specially tuned monitor to calibrate its
instruments, the details of which I can't get. I would also think that
more and more US manufacturers will come to use NIST's specially developed
calibration facility for colorimeter certification. It's going to be
interesting to follow.
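A short sketch of why an illuminant A source is spectrally so different from a display white point: illuminant A is defined as a 2856 K Planckian (blackbody) radiator, and Planck's law puts its emission peak out in the infrared, so within the visible band it is heavily weighted toward red. The function names here are mine; the physics constants and formula are standard.

```python
import math

# Standard physical constants (CODATA values).
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck(lam_nm: float, t: float) -> float:
    """Planck's law: spectral radiance of a blackbody at temperature t (K)
    for wavelength lam_nm given in nanometers (absolute scale)."""
    lam = lam_nm * 1e-9
    return (2.0 * H * C**2 / lam**5) / (math.exp(H * C / (lam * K * t)) - 1.0)

def relative_spd(t: float, wavelengths=(450, 550, 650, 700)):
    """SPD normalized to 1.0 at 560 nm, the usual convention for illuminants."""
    ref = planck(560, t)
    return {w: planck(w, t) / ref for w in wavelengths}

# Illuminant A is defined as a 2856 K Planckian radiator. Its Wien peak
# (~2898/2856 um, about 1015 nm) lies in the infrared, so its relative
# power rises steadily from blue to red across the visible band.
spd_a = relative_spd(2856)
print(spd_a)
```

A CRT or LCD white point near 6500 K has the opposite tilt (relatively more blue), which is why a colorimeter whose filter errors were trimmed against a 2856 K source can read a display with a systematic bias.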
Please excuse my typos...
Roger Breton | Laval, Canada | email@hidden
http://pages.infinit.net/graxx
_______________________________________________
colorsync-users mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/colorsync-users
Do not post admin requests to the list. They will be ignored.