Thanks for the information.
That clears things up quite a bit.
I'll see about the UV filter on the Datacolor ... it's at another lab.
I'll make that suggestion.
Are you saying the added fluorescent effect of the Xenon bulb could
skew the Datacolor results toward a closer measurement relative to the
standard? And that adding the UV filter to the device would suppress
the fluorescence and yield a reading more similar to ours?
regards
Richard
email@hidden wrote:
Hello Richard,
I have seen the exact same phenomenon when measuring yellow, orange and
red BCRA tiles with five different instruments (including a NIST
traceable one). Some instruments gave errors in the 7-8 DeltaE range
(CIELAB) relative to the NIST traceable measurements, while others were
in the less than 1 DeltaE range.
All these colors have very sharp cut-off slopes, and this spectral
characteristic makes the measured color very sensitive to small
spectrometer wavelength calibration shifts.
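To illustrate that sensitivity, here is a toy Python sketch; the
sigmoid reflectance edge and the Gaussian stand-ins for the CIE
color-matching functions are invented for illustration (real CMFs and
tile spectra are tabulated data), so only the order of magnitude
matters:

import numpy as np

wl = np.arange(380.0, 731.0)  # wavelengths, nm

def cutoff_reflectance(edge_nm, width_nm=5.0):
    # Sigmoid edge: ~5% reflectance below the edge, ~90% above it,
    # loosely mimicking an orange tile's sharp cut-off.
    return 0.05 + 0.85 / (1.0 + np.exp(-(wl - edge_nm) / width_nm))

# Rough Gaussian stand-ins for the CIE CMFs (toy values only):
xbar = 1.06 * np.exp(-0.5 * ((wl - 595) / 34) ** 2) \
     + 0.36 * np.exp(-0.5 * ((wl - 446) / 19) ** 2)
ybar = np.exp(-0.5 * ((wl - 557) / 47) ** 2)
zbar = 1.78 * np.exp(-0.5 * ((wl - 449) / 22) ** 2)

def to_lab(refl):
    # Equal-energy illuminant for simplicity.
    k = 100.0 / ybar.sum()
    X, Y, Z = (k * (refl * c).sum() for c in (xbar, ybar, zbar))
    Xn, Yn, Zn = (k * c.sum() for c in (xbar, ybar, zbar))
    f = lambda t: t ** (1/3) if t > (6/29) ** 3 else t / (3 * (6/29) ** 2) + 4/29
    fx, fy, fz = f(X / Xn), f(Y / Yn), f(Z / Zn)
    return np.array([116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)])

lab_ref     = to_lab(cutoff_reflectance(590.0))
lab_shifted = to_lab(cutoff_reflectance(591.0))  # 1 nm wavelength error
print("DeltaE76 from a 1 nm shift:", np.linalg.norm(lab_shifted - lab_ref))

The steeper the edge (smaller width_nm), the larger the DeltaE a given
wavelength error produces; a flat gray spectrum is nearly immune.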
First, get away from CIELAB. Using CIE94 or CIEDE2000, you will
measure color differences that are smaller by a factor of two or more,
and more realistic compared to visual assessment. Still, you probably
will not achieve the less than 1 DeltaE target.
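As a quick check on the factor-of-two point, here is a small Python
sketch comparing CIE 1976 DeltaE with CIE94 using the graphic-arts
weights (K1 = 0.045, K2 = 0.015); the Lab values are invented
placeholders for a chroma-rich orange, not anyone's real data:

import math

def delta_e_76(lab1, lab2):
    return math.dist(lab1, lab2)

def delta_e_94(lab1, lab2, K1=0.045, K2=0.015):
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2
    dL = L1 - L2
    C1, C2 = math.hypot(a1, b1), math.hypot(a2, b2)
    dC = C1 - C2
    da, db = a1 - a2, b1 - b2
    dH2 = max(da * da + db * db - dC * dC, 0.0)  # guard rounding noise
    SC, SH = 1.0 + K1 * C1, 1.0 + K2 * C1        # SL = 1
    return math.sqrt(dL ** 2 + (dC / SC) ** 2 + dH2 / SH ** 2)

std    = (62.0, 45.0, 72.0)   # hypothetical orange standard
sample = (62.5, 49.0, 83.0)   # hypothetical reading, off mostly in chroma
print("DeltaE76:", round(delta_e_76(std, sample), 2))   # ~11.7
print("DeltaE94:", round(delta_e_94(std, sample), 2))   # ~2.6

High-chroma colors like this orange get large SC and SH weights, which
is why CIE94 (and CIEDE2000) shrink the number so much relative to the
plain Euclidean CIELAB distance.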
You may want to look at an inter-instrument calibration solution, such
as NetProfiler from GMB, but this one in particular may not be
compatible with the instruments you have.
I have also done inter-instrument matching on my own, using from 8 to
30 samples, and have improved matching in all cases (a factor of two
improvement is not uncommon).
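A minimal sketch of one way to do this, assuming an affine Lab-to-Lab
correction fitted by least squares over the shared samples is adequate
(the data here is synthetic; your own paired readings go in its place):

import numpy as np

rng = np.random.default_rng(0)
n = 12                                   # e.g. a dozen shared samples
lab_ref = rng.uniform([20, -60, -60], [90, 60, 60], size=(n, 3))

# Simulate a second instrument with a small gain/rotation plus offset:
M = np.array([[1.02, 0.00, 0.00],
              [0.00, 0.97, 0.01],
              [0.00, 0.02, 1.03]])
lab_b = lab_ref @ M.T + [0.8, -0.5, 1.2] + rng.normal(0, 0.1, (n, 3))

# Fit lab_ref ~= lab_b @ C + t by least squares (bias column gives t):
X = np.hstack([lab_b, np.ones((n, 1))])
coef, *_ = np.linalg.lstsq(X, lab_ref, rcond=None)
corrected = X @ coef

print("mean DeltaE76 before:", np.linalg.norm(lab_b - lab_ref, axis=1).mean())
print("mean DeltaE76 after: ", np.linalg.norm(corrected - lab_ref, axis=1).mean())

With 8 to 30 well-chosen samples the fit is overdetermined, and the
residual after correction is typically much smaller than the raw
inter-instrument difference.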
The Xenon vs. LED difference may also be the cause of the problem,
since Xenon is likely to excite more fluorescence. Can you try a UV
filter on the Datacolor?
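The effect, in caricature (all numbers below are invented): a
fluorescent pigment returns its true reflectance plus emission excited
by whatever UV the source delivers, so a UV-rich Xenon flash reads
higher than a UV-poor LED, and a UV filter removes the extra term:

uv_power = {"xenon": 1.00, "led": 0.05, "xenon + UV filter": 0.00}
true_reflectance_600nm = 0.70   # hypothetical orange at 600 nm
emission_per_uv_unit = 0.06     # hypothetical fluorescence yield

for source, uv in uv_power.items():
    measured = true_reflectance_600nm + emission_per_uv_unit * uv
    print(f"{source:18s} apparent reflectance at 600 nm: {measured:.3f}")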
Danny Pascale
email@hidden
www.BabelColor.com
On Fri, 25 Aug 2006 12:40:21 -0500
Richard Brackin <email@hidden> wrote:
We have supplied identical (or as close as can be) color standards to
some manufacturers. We are all measuring L*a*b* values of a series of
films against those standards (Green, Blue, Red, Purple, Gray, Orange,
etc.).
We, as well as their color laboratories, are taking measurements on
these very specific pigmented colors and we're all using different
measuring devices.
In discussing the issue with the head of color matching at one lab, we
found that we are seeing similar (not extremely close, but similar)
L*a*b* measurements for the standard.
However, we're all seeing color variances of approx 1 delta E on all of
the colors except this particular orange.
While they are seeing 1 delta E comparing the orange to the standard,
we're seeing 13 delta E.
We're measuring Delta E (CIE 1976) with D50 and the 10-degree
observer. (CIE 1994 Graphics shows approx 6 delta E, but we've been
asked to use CIE 1976.)
What are some factors that would make one color show up so differently?
Should we be using a different color model for measurement?
The standard probably has a variance of 0.5 delta E across the surface.
They are using a Datacolor SF600 Plus and we're using a BYK-Gardner
ColorGuide Sphere. The Datacolor uses Xenon; the ColorGuide uses LED.
Could the pigments used to make their orange color cause such a drastic
difference in measurement between the two light sources?
many thanks
Richard