Re: Eye-One factory recalibration and Xrite Support
- Subject: Re: Eye-One factory recalibration and Xrite Support
- From: Klaus Karcher <email@hidden>
- Date: Thu, 10 Jul 2008 11:45:09 +0200
Randy Norian wrote:
X-rite claims a fairly small delta-E between similar x-rite
instruments. [...]
variation of dE > 4 between two identical instruments is a lot, in my
limited experience. Especially in light of the stated specification. [...]
The stated specs tell little about the variations under practical
conditions IMHO (laboratory conditions: new instruments, constant
temperature for at least 24 hours AFAIK, single-point measurements on
BCRA tiles, verification against the "manufacturing standard", which is
AFAIK some kind of moving average of i1Pros).
But no one uses their i1 under these conditions: we don't work in a
climate chamber, we measure prints and proofs on a wide variety of
substrates, our instruments are not brand new, and we typically measure
in scan mode.
quoted from X-Rite's i1System_Brochure for the i1Pro:
Inter-instrument agreement:
Average DE*94 0.4, max. DE*94 1.0
(deviation from the X-Rite manufacturing standard at 23°C, single measurement mode, on 12 BCRA tiles (D50, 2°))
Short-term repeatability:
DE*94 <= 0.1 (D50,2°),
with respect to the mean CIELab value of 10 measurements every 3 seconds on white
I recently made a round-robin test to learn more about the variations
under *practical* conditions.
Of course this test says nothing about the instruments in general, only
about the particular devices tested.
I measured 5 different charts (3 laser prints and 2 inkjet proofs) with
5 instruments (2 SpectroScans, 1 eye-one rev A, 1 eye-one rev B and 1
eye-one rev D, all without filter).
Here are some of the results:
<http://digitalproof.info/colorsync-users/instrumentTestApr08-3.pdf>
Unfortunately the graphs are not really self-explanatory, therefore some
additional explanations:
altMKkx, altMKkz and PantDC are laser print test charts,
altMKL and PantL are inkjet proofs.
Each chart has been measured with each instrument up to 3 times, the
results have been verified against the weighted mean of all instruments.
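The verification step above can be sketched as follows. This is not Klaus's actual procedure: it uses a plain Euclidean dE*76 and an unweighted per-patch mean across instruments (the original used a weighted mean), and the instrument names and Lab values are invented for illustration.

```python
import math
from statistics import mean

def delta_e76(lab1, lab2):
    """Plain Euclidean CIELAB distance (dE*76), used here for simplicity."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

def round_robin(measurements):
    """measurements: {instrument: [Lab tuple per patch]}, patches aligned.

    Returns per-instrument (mean dE, max dE) against the per-patch
    mean Lab of all instruments."""
    instruments = list(measurements)
    n_patches = len(measurements[instruments[0]])
    # Reference: unweighted mean Lab per patch across all instruments
    ref = [tuple(mean(measurements[i][p][c] for i in instruments)
                 for c in range(3))
           for p in range(n_patches)]
    report = {}
    for i in instruments:
        des = [delta_e76(measurements[i][p], ref[p]) for p in range(n_patches)]
        report[i] = (mean(des), max(des))
    return report

# Hypothetical two-instrument, one-patch example
print(round_robin({"i1sf": [(50.0, 0.0, 0.0)],
                   "SSsf": [(52.0, 0.0, 0.0)]}))
```

With only two instruments each deviates symmetrically from the shared mean; the method becomes more informative as more instruments (and repeats) are added.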
Spectrolino/SpectroScan 1: blue, labeled "spectroscan s&f" or "SSsf"
Spectrolino/SpectroScan 2: green, labeled "spectroscan LUP" or "SSl"
eye-one rev A: black, labeled "eye-one s&f" or "i1sf"
eye-one rev B: orange, labeled "eye-one LUP alt" or "i1la"
eye-one rev D: red, labeled "eye-one LUP neu" or "i1ln"
The comparison of the individual measurement series on pages 1-5
suggests that the repeatability of the rev B and D eye-ones is inferior
to that of the Spectrolinos or the rev A eye-one.
The Delta E 2000 histograms on the last pages show quite clearly that
the rev D eye-one is not in the ballpark, and the plots on pages
28-30 reveal that the variation is systematic.
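One simple way to distinguish a systematic deviation from random noise, given per-patch Lab deltas against the reference, is to check whether the mean offset in a channel is large compared to its standard error. This is my own crude sketch, not the analysis behind the plots; the threshold (|mean| > 2 standard errors) and the sample deltas are assumptions.

```python
from statistics import mean, stdev

def systematic_offset(deltas):
    """deltas: per-patch (dL, da, db) of one instrument vs the reference.

    For each channel, returns (mean, stdev, flag); the flag is True when
    the mean offset exceeds twice its standard error, hinting at a
    systematic (potentially correctable) error rather than noise."""
    report = {}
    n = len(deltas)
    for name, vals in zip(("dL", "da", "db"), zip(*deltas)):
        m, s = mean(vals), stdev(vals)
        report[name] = (m, s, abs(m) > 2 * s / n ** 0.5)
    return report

# Hypothetical deltas: consistent +1 offset in L*, none in a*/b*
print(systematic_offset([(1.0, 0.0, 0.0),
                         (1.1, 0.0, 0.0),
                         (0.9, 0.0, 0.0)]))
```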
As mentioned, this says nothing about the eye-one rev D in general. I
advised the customer to stop using this instrument for any critical
tasks and to send it to X-Rite service.
With regard to proof verification and certification, inter-instrument
agreement is a very important topic IMHO.
I am planning to extend my round-robin tests, as I think they are a good
method to learn something about the reliability of an instrument.
Carried out on a regular basis, they can be a good and inexpensive
module in the quality inspection chain.
If you are interested in the further development of my test procedure,
or would like to take part in the tests, feel free to contact me
off list.
Regards,
Klaus Karcher
Colorsync-users mailing list (email@hidden)