
Re: Colorsync-users Digest, Vol 4, Issue 393


  • Subject: Re: Colorsync-users Digest, Vol 4, Issue 393
  • From: MSP Graphics <email@hidden>
  • Date: Sat, 3 Nov 2007 12:30:55 -0700




------------------------------

Message: 3
Date: Fri, 02 Nov 2007 20:45:23 +0100
From: Klaus Karcher <email@hidden>
Subject: Re: Accuracy of Instruments
To: email@hidden

Mike Strickler wrote:

Wait a minute: "In all probability"? The "expert eye"? And "I guess"?

Would you please be so kind as to /not/ truncate quotes beyond recognition?



Strictly speaking, it was a selection, not a truncation, but a telling one. I guess I'll have to leave your entire post intact (you don't need to do that with mine).


... but back to the facts:

The University of Wuppertal, the University of Applied Sciences Stuttgart, and Fogra carried out an investigation into inter-instrument differences among 8 instruments from 5 manufacturers and presented the results at the 4th Digitalproof-Forum 2004 in Stuttgart.


The samples (BCRA tiles, digital proofs, and offset prints on APCO paper)
were measured with a reference instrument before and after the test so
that samples damaged during the test could be excluded.


The result of the investigation was a Delta E 76 of up to 2 relative to
the average, at the 95% confidence level. I think one can argue in good
conscience that differences of this size are visible and significant in
process control and quality management.


Klaus Karcher
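
(As a rough illustration of what a figure like "Delta E 76 of up to 2 relative to the average" involves: CIE76 is simply the Euclidean distance between two CIELAB values. Below is a minimal Python sketch that computes each instrument's Delta E 76 against the average reading of one patch, plus a crude 95th-percentile summary. The readings and the percentile shortcut are invented for illustration; they are not the Fogra/Wuppertal data or their evaluation method.)

import math
import statistics

def delta_e_76(lab1, lab2):
    # CIE76 color difference: Euclidean distance in CIELAB.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical readings of one patch by eight instruments, as (L*, a*, b*).
readings = [
    (52.1, 38.4, 21.0), (52.6, 38.9, 20.4), (51.8, 37.9, 21.6),
    (52.3, 38.1, 20.9), (53.0, 39.2, 20.1), (51.5, 38.6, 21.3),
    (52.4, 38.3, 20.7), (52.0, 38.7, 21.1),
]

# Component-wise average reading across all instruments.
mean_lab = tuple(statistics.mean(channel) for channel in zip(*readings))

# Delta E 76 of each instrument against the average, sorted ascending.
diffs = sorted(delta_e_76(r, mean_lab) for r in readings)

# Crude nearest-rank 95th percentile of those differences.
p95 = diffs[min(len(diffs) - 1, math.ceil(0.95 * len(diffs)) - 1)]
print(f"max dE76 to mean: {max(diffs):.2f}, ~95th percentile: {p95:.2f}")

In a real comparison one would of course repeat this for every patch on the test chart and pool the results, but the arithmetic is no more than that.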


I understand what you're saying, Klaus, and I agree it sounds terrible, but two issues occur to me. First, in what practical scenario should we worry about this? I think Mike E. tries to address this below. Second, you're raising an additional issue: agreement between instruments of different manufacturers and designs. This latter factor may greatly complicate any attempt at significantly better inter-instrument agreement. I do wish we could hear more from the designers and makers of these instruments, because without that we don't know whether what you are asking for is even possible.

In a way, what this whole discussion boils down to is mistrust of and unhappiness with these manufacturers (which now is mainly just one company). I'm still agnostic on the question: I haven't seen compelling evidence that they are falling short of their best effort, given cost and other practical constraints.

All that said, if there were a requirement for very close matching of measurements across different locations, I personally would insist that all the instruments be of the same make and model and that all had been recently recertified, because we all know that different makes and models can read quite differently (think of the different UV-filtering schemes!). That would go a long way toward cutting down on the sort of variance you're talking about.
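
(Just to make that last point concrete, here is a minimal sketch, in Python, of the kind of fleet check I mean. The instrument records, field names, and the twelve-month recertification window are all assumptions for illustration, not anyone's published requirement.)

from datetime import date, timedelta

# Hypothetical fleet of instruments at different sites; model names and
# recertification dates are invented for illustration.
fleet = [
    {"id": "site-A", "model": "ExampleSpectro 2", "recertified": date(2007, 6, 1)},
    {"id": "site-B", "model": "ExampleSpectro 2", "recertified": date(2007, 9, 15)},
    {"id": "site-C", "model": "ExampleSpectro 2", "recertified": date(2006, 8, 30)},
]

max_age = timedelta(days=365)   # assumed recertification interval
as_of = date(2007, 11, 3)       # date of this check

same_model = len({unit["model"] for unit in fleet}) == 1
overdue = [u["id"] for u in fleet if as_of - u["recertified"] > max_age]

print("single make/model:", same_model)
print("overdue for recertification:", overdue if overdue else "none")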