proof verification (was: Creating a SWOP proof with an Epson)
- Subject: proof verification (was: Creating a SWOP proof with an Epson)
- From: Klaus Karcher <email@hidden>
- Date: Sat, 15 Jul 2006 16:06:34 +0200
Roger Breton wrote:
On 14 Jul 2006, at 12:47 pm, Roger Breton wrote:
In principle, the numeric method should agree with our visual
assessment.
That is, if a proof is on average less than 1 deltaE away from TR-001,
then it should agree with our visual sensation as well.
How well a given Delta E conforms to the visual assessment depends on
numerous factors. Some of the most important are:
- size and viewing distance of the samples
- surround conditions
- illumination (luminance, ...)
- the color difference formula applied
For example, if you compare results of DE76 (which is still used for the
FOGRA Media Wedge evaluation) with DE94 (which most measurement device
manufacturers use to specify the precision of their instruments) and
DE00 (currently known to have the best agreement with visual assessment)
in the yellow region, you will find that a DE94 or DE00 of 1 can
correspond to a DE76 of up to 5.4!
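As a rough illustration, here is a small Python sketch comparing DE76 and
CIE94 for a saturated yellow (the Lab values are invented for illustration,
and DE00 is left out for brevity):

import math

def delta_e76(lab1, lab2):
    """Plain Euclidean distance in CIELAB (CIE 1976)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

def delta_e94(lab1, lab2, k1=0.045, k2=0.015):
    """CIE94 with graphic-arts weighting (kL=kC=kH=1, K1=0.045, K2=0.015),
    taking lab1 as the reference color."""
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2
    dL = L1 - L2
    c1 = math.hypot(a1, b1)
    c2 = math.hypot(a2, b2)
    dC = c1 - c2
    dH_sq = max((a1 - a2) ** 2 + (b1 - b2) ** 2 - dC ** 2, 0.0)
    sC = 1 + k1 * c1
    sH = 1 + k2 * c1
    return math.sqrt(dL ** 2 + (dC / sC) ** 2 + dH_sq / sH ** 2)

# Hypothetical saturated yellow: reference vs. a proof that differs by
# 5 units of b* (pure chroma difference).
ref   = (89.0, -5.0, 93.0)
proof = (89.0, -5.0, 98.0)
print("DE76: %.2f" % delta_e76(ref, proof))   # ~5.0
print("DE94: %.2f" % delta_e94(ref, proof))   # ~1.0

With these invented values the pair is 5 DE76 apart, yet only about 1 under
CIE94, because the chroma difference is strongly down-weighted at high chroma.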
Also keep in mind that the precision of typical measurement devices is of
the same order of magnitude.
That wouldn't be workable - what printer is going to accept an
average of 1 dE as a contract proof?
Counter-questions: how many printers do you think have ever seen a
proof with less than an average DE76 of 1? How many are able to stay below
this DE even within a single press run?
I must be misunderstanding you. Are you saying that your proofs show an
average of 3 dE away from the ISO Coated standard, which you say you have
embraced and been certified for below? You must mean that
"Fogra allows up to 3 dE for paper white (on the proof vs the reference
process), 5 dE for CMYK (on the proof vs the reference) and a max of 10 dE
(on the proof vs the reference)". But even then, I don't get it. Those
numbers seem a bit high. I am confused.
FOGRA tried to reduce the tolerances for CMYK to a DE76 of 2.5 and had to
accept that this is neither viable nor useful with today's proofing
systems and measurement devices.
The University of Wuppertal carried out a very interesting research project
on proof verification, which resulted in an alternative analysis method
based on DE00 and a new set of reference points that much better
represents the colors of average print jobs. Unfortunately the
documentation is only available in German:
http://www.dmt.uni-wuppertal.de/proof/index.php?tmpl=alternativ.php
One of the incidental results was that a jury of recognized experts was
unable to gauge the quality of a proof by visual comparison of normal-sized,
printed FOGRA media wedges.
Of course no measurement procedure can substitute for all the important
visual aspects of a proof with reasonable effort, and no one should forget
that the ultimate purpose of a proof is to be a visual reference -- but a
"visually matched" proofing reference is neither an objective benchmark nor
feasible in today's production workflows.
Moreover, there are serious metamerism issues as long as we have to live
with today's poor D50 simulators in viewing booths (this unfortunately
also applies to measurement devices).
Would you mind being more specific, Martin: what works for you?
FOGRA's tolerances for the media wedge have proved to be useful.
Here are two "typical" results from our proofing systems for two different
printing conditions (each line shows measured value / tolerance, and the
percentage of the tolerance used):

Printing condition 1:
DE76 paper white: 0.7 / 3 (22%)
avg. DE76: 1.6 / 4 (39%)
max. DE76: 3.1 / 10 (31%) at 0/100/100/0
max. DE76 CMYK: 2.2 / 5 (44%) at 100/0/0/0

Printing condition 2:
DE76 paper white: 0.3 / 3 (9%)
avg. DE76: 1.5 / 4 (38%)
max. DE76: 3.2 / 10 (32%) at 0/70/70/0
max. DE76 CMYK: 1.8 / 5 (36%) at 0/100/0/0
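For completeness, a sketch of how such a wedge report can be computed from
measured Lab pairs (Python; the patch structure is made up for illustration,
the tolerances are the ones quoted above):

import math

# Tolerances quoted above: paper white <= 3, average <= 4, maximum <= 10,
# CMYK primaries <= 5 (all DE76).
TOL_PAPER, TOL_AVG, TOL_MAX, TOL_PRIMARY = 3.0, 4.0, 10.0, 5.0

def de76(ref, meas):
    """Euclidean distance between two Lab triples (CIE 1976)."""
    return math.sqrt(sum((r - m) ** 2 for r, m in zip(ref, meas)))

def evaluate(patches):
    """patches: list of dicts with 'name', reference 'ref' (Lab), measured
    'meas' (Lab) and flags 'is_paper' / 'is_primary' (hypothetical layout)."""
    des = {p["name"]: de76(p["ref"], p["meas"]) for p in patches}
    paper = max(des[p["name"]] for p in patches if p["is_paper"])
    avg = sum(des.values()) / len(des)
    worst = max(des, key=des.get)
    worst_primary = max((p["name"] for p in patches if p["is_primary"]),
                        key=lambda n: des[n])
    print("DE76 paper white: %.1f / %g (%.0f%%)"
          % (paper, TOL_PAPER, 100 * paper / TOL_PAPER))
    print("avg. DE76: %.1f / %g (%.0f%%)"
          % (avg, TOL_AVG, 100 * avg / TOL_AVG))
    print("max. DE76: %.1f / %g (%.0f%%) at %s"
          % (des[worst], TOL_MAX, 100 * des[worst] / TOL_MAX, worst))
    print("max. DE76 CMYK: %.1f / %g (%.0f%%) at %s"
          % (des[worst_primary], TOL_PRIMARY,
             100 * des[worst_primary] / TOL_PRIMARY, worst_primary))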
If someone else were to measure the very same media wedge with a different
device, I'm totally sure the results would be significantly worse
because of inter-instrument differences.
I'm not at all convinced by the "iterative" trimming (or cheating ;-)
approach, for two reasons:
- as mentioned on 14 Dec 2005 (Re: Editing profiles), iteration can damage
the reliability and smoothness of a profile even if the measured results
seem better than before (you have to split the reference data into two
sets to measure the "real" effects of iteration -- see the sketch after
this list)
- a perfect match to the reference data does not mean a perfect proof.
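A minimal sketch of that two-set check (Python; the patch list and the
measure_proof() callable are hypothetical, not part of any existing tool):

import math
import random

def de76(lab1, lab2):
    """Euclidean CIELAB distance (CIE 1976)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

def split_reference(patches, fraction=0.5, seed=0):
    """Split the reference patches into a 'fit' set (used to drive the
    iteration) and a 'holdout' set that is never fed back into it.
    `patches` is a hypothetical list of (cmyk, reference_lab) tuples."""
    shuffled = patches[:]
    random.Random(seed).shuffle(shuffled)
    cut = int(len(shuffled) * fraction)
    return shuffled[:cut], shuffled[cut:]   # fit_set, holdout_set

def avg_de_on(holdout, measure_proof):
    """Average DE76 on the holdout set only. `measure_proof` is a
    hypothetical callable mapping a CMYK patch to the measured proof Lab.
    If this number does not improve after an iteration, the 'improvement'
    seen on the fit set is just over-fitting."""
    return sum(de76(ref_lab, measure_proof(cmyk))
               for cmyk, ref_lab in holdout) / len(holdout)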
For example, Gerhard Fuernkranz recommended a promising approach to
estimate the optimal amount of smoothness in argyllcms for scattered,
noisy data, based on Generalized Cross Validation; see
http://www.freelists.org/archives/argyllcms/06-2006/msg00008.html
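The GCV idea in a nutshell, reduced to a 1-D toy problem (only an
illustration of the principle with an arbitrary polynomial basis -- not the
ArgyllCMS implementation):

import numpy as np

def gcv_score(X, y, lam):
    """Generalized Cross Validation score for a ridge-type smoother:
    GCV(lam) = n * RSS / (n - trace(S))**2, with S = X (X'X + lam I)^-1 X'."""
    n, p = X.shape
    S = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
    resid = y - S @ y
    return n * float(resid @ resid) / (n - np.trace(S)) ** 2

# Toy data: noisy samples of a smooth curve, fitted on a Legendre basis.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 60)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.15, size=x.size)
X = np.polynomial.legendre.legvander(2 * x - 1, 10)   # arbitrary basis choice
lams = 10.0 ** np.arange(-8.0, 3.0)
best = min(lams, key=lambda l: gcv_score(X, y, l))
print("smoothing parameter chosen by GCV:", best)

The smoothing parameter with the lowest GCV score is, roughly speaking, the
one that predicts left-out data best, without having to set a holdout set
aside explicitly.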
Unfortunately, none of what is mentioned above can help Lee Blevins make
his proof as long as he has no access to the Characterization Data and
evaluation criteria for SWOP.
Regards, Klaus