WMU Profiling Review 1.1
- Subject: WMU Profiling Review 1.1
- From: Abhay Sharma <email@hidden>
- Date: Tue, 09 Apr 2002 10:13:28 -0400
Dear Friends
Many thanks for your interest in this review. I have had over 100 requests
for the document.
1. Please note that the review has been updated. It is now at version 1.1.
The new version has been sent automatically to all those who requested
version 1.0.
2. What is new in version 1.1? One very small but significant change: the
result for Monaco monitor profiling. The monitor profiling was originally
tested with an early beta. Monaco has since supplied a newer beta that is
the release candidate for Monaco Profiler 4.0, and we re-tested the monitor
profiling. The Delta E in the earlier test was 8.15; in the new test it is
greatly reduced to 0.68, which is representative of what will be in the
final released product.
3. Rules for this review. I have learnt from this process and would like to
suggest the following rules for the review. All products tested are release
versions. All products should be running on a full license, not a temporary
dongle. The review will be updated and released twice a year, on July 1 and
January 1. The next release, on July 1, will include printer profiles and
some vendors omitted from the first review.
4. One new issue to mention. We have found that Gretag Macbeth ProfileMaker
v4.0 can make monitor profiles with and without a vcgt tag. We found that
sometimes it can give us a gamma of 1.8 and sometimes a gamma of 3.0. This
is being queried directly with Gretag. More in the next release of the
review.
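For context on what a reported monitor "gamma" means here: the vcgt tag
stores the tone curves loaded into the video card, and the single gamma
number a tool reports is typically an exponent fitted to such a curve.
Below is a minimal, hypothetical Python sketch of one such fit (log-log
least squares); it is illustrative only and is not ProfileMaker's actual
method.

```python
import math

def fit_gamma(curve):
    """Fit y = x**g to a normalized tone curve by least squares in
    log-log space (where the model becomes ln y = g * ln x)."""
    num = den = 0.0
    for x, y in curve:
        # skip endpoints, where log is undefined or zero
        if 0.0 < x < 1.0 and 0.0 < y < 1.0:
            lx, ly = math.log(x), math.log(y)
            num += lx * ly
            den += lx * lx
    return num / den

# A synthetic 256-entry curve sampled from an exact gamma-1.8 response
samples = [(i / 255.0, (i / 255.0) ** 1.8) for i in range(256)]
print(f"fitted gamma = {fit_gamma(samples):.2f}")  # -> fitted gamma = 1.80
```

Two tools reporting 1.8 versus 3.0 for the same display would therefore be
fitting (or generating) very different curves, which is what makes the
ProfileMaker result worth querying.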
5. Below is my answer to correspondence on the Colorsync list.
Abhay
--
Dr Abhay Sharma, Associate Professor
Department of Paper and Printing Science
Welborn Hall, Western Michigan University
Kalamazoo, MI, 49008-5362
Tele (616) 387 2825
Fax (616) 387 2813
e-mail email@hidden
--__--__--
Message: 10
Date: Tue, 02 Apr 2002 19:03:07 -0500
Subject: Re: WMU Profiling Review 1.0
From: Terry Wyse <email@hidden>
To: <email@hidden>
CC: <email@hidden>
on 4/2/02 4:43 PM, Roger Breton wrote:
> WMU Profiling Review 1.0
>
> This is to tell you of a report we have done on profiling software to be
> presented at the TAGA Technical Conference, Asheville, North Carolina, April
> 14, 2002.
I'm sure you're going to get a lot of "armchair quarterbacking" as a result
of this report, but here are a few questions I had re: the WMU Profiling
Review:
* Regarding the pricing listed in the report: even though on page 9 you were
evaluating scanner profiling only, the prices you listed were for the
complete application bundles rather than just the scanner profiling
application. If the scanner profiling was available separately, that price
should have been listed. As it stands, it appears that it costs between
$2,000 and $4,000 to build a scanner profile!
I will try to pay more attention to this part of the review in future. Abhay
* On page 18 of the monitor profile eval, you listed the price of the
Gretag monitor module only, while for the others you still listed the
package price. This, at the very least, is not consistent with the way
prices were listed for the scanner profile eval. We all know better, but it
would appear that it costs $4,000+ to profile a monitor with Monaco
software!
See above. Abhay
* All the ColorVision and Profile City applications were conspicuous by
their absence. There are more than a few of us who have a high regard for
these two companies' products. I realize that if you had to include every
single profiling application out there you would still be testing, but the
question still must be asked: why weren't they included?
This was my omission. In the first instance we just wanted to try the idea
of a review; as the user community seems to like it, I will widen it to
include other vendors. I am contacting other vendors and have already
spoken to ColorBlind. Abhay
* The reference data used for the various scanning targets, was it "generic"
batch data or were the targets actually custom measured prior to
scanning/profiling? If so, with what?
The measurements used were generic (batch) data. Abhay
* Were the targets scanned multiple times, once for each profile
application, or were the targets simply scanned once and this raw scan
re-used for each profile? (I hope the latter)
The latter. Abhay
* I'm not sure how to interpret the Delta E results. Of the top 4, the Mean
Delta E seems to track OK but it's the Max Delta E that has me perplexed.
Looking at the results, I guess I would've gone with the FujiFilm ColourKit
Profiler as it exhibited the least variance in Max Delta E between the three
targets. Looking at the top 3, it seems each package went at least slightly
whacko with one of the targets. The Fuji Profiler didn't appear to have this
trait.
Good point. Thanks for your comments. Abhay
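For readers unfamiliar with how mean and max Delta E figures like the ones
discussed above are derived, here is a minimal Python sketch using the CIE
1976 formula (Euclidean distance in Lab space). The patch values are
invented for illustration and are not data from the review.

```python
import math

def delta_e76(lab1, lab2):
    """CIE 1976 color difference: Euclidean distance between two Lab values."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical target reference values vs. profile-predicted values
reference = [(50.0, 10.0, -5.0), (70.0, -20.0, 30.0), (30.0, 0.0, 0.0)]
predicted = [(50.5, 10.2, -4.8), (69.0, -19.0, 31.5), (30.1, 0.3, -0.2)]

errors = [delta_e76(r, p) for r, p in zip(reference, predicted)]
mean_de = sum(errors) / len(errors)
max_de = max(errors)
print(f"mean dE = {mean_de:.2f}, max dE = {max_de:.2f}")
```

Terry's point falls out of the definitions: the mean summarizes typical
accuracy over all patches, while the max is driven by the single worst
patch, so a package can post a good mean and still go "whacko" on one
target.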
I also noticed at the end of the report, it was stated that a similar test
was being conducted for digital cameras, LCD panels and output profiles. In
the case of the digital cameras I would strongly recommend that they be
tested using at least three different cameras, possibly a high-end "chip"
back, a high-end scanning back and maybe a "pro-sumer" 35mm SLR-type camera.
And PLEASE include Profile City's ICC Capture Pro in the mix! One other
thing, don't be afraid to use a "scanner profiling" application for
profiling a digital camera. I've done it with the Gretag software and the
results might surprise you!
Next?
Regards,
Terry
_____________________________
Terence L. Wyse
Color Management Specialist
All Systems Integration, Inc.
http://www.allsystems.com
email@hidden
_____________________________
--__--__--
Message: 15
From: "Marc Aguilera" <email@hidden>
To: <email@hidden>
Subject: RE: WMU Profiling Review 1.0 / Vendor Response
Date: Thu, 4 Apr 2002 17:50:11 -0800
ColorBlind Software gladly offers its current applications as well as
any future products available for testing, providing the parameters are
fair and well thought out.
Terry Wyse's questions directly reflect our own as we read the analysis:
> * The reference data used for the various scanning targets, was it "generic"
> batch data or were the targets actually custom measured prior to
> scanning/profiling? If so, with what?
This is an extremely valid point. The accuracy of the measured color data
has a profound effect on the overall performance of the resultant profile.
If the batch data was used as reference data, then the results may be
flawed due to the unavoidable discrepancy between the individual
manufactured target and the batch-averaged reference data supplied for that
particular allotment. We often recommend users build their own custom
reference files for their given targets. We have been testing the Barbieri
100 xy, whose calibration to white is specified to be consistent to a dEab
under 0.02; if a value is higher, the unit must be recalibrated. When
producing custom Kodak IT8 Q60 reflective reference files and then
producing an input profile, the results are ALWAYS visually better than a
profile built with batch reference data.
Using batch-measured data versus user-measured data would produce no change
in the Delta E measurements shown in this review, because we are asking the
profile to map RGB to Lab (generic) or RGB to some slightly shifted Lab
(custom). The result will be the same in Delta E. Further, the review
attempted to mimic the typical user, and most users would not measure the
chart themselves. Abhay
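Abhay's argument can be illustrated numerically: if custom measurement
shifts both the reference values and the profile's output by the same Lab
offset, the Delta E between them is unchanged. The sketch below uses
invented numbers and the idealizing assumption that the batch-to-custom
difference is a uniform Lab translation.

```python
def delta_e76(lab1, lab2):
    """CIE 1976 color difference: Euclidean distance in Lab space."""
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

def shift(lab, offset):
    """Apply a uniform Lab offset to a color."""
    return tuple(v + o for v, o in zip(lab, offset))

generic_ref = (50.0, 10.0, -5.0)   # hypothetical batch reference for one patch
profile_out = (51.0, 9.5, -4.0)    # what a profile built on batch data predicts

offset = (0.8, -0.3, 0.5)          # hypothetical batch-to-custom measurement shift
custom_ref = shift(generic_ref, offset)
custom_out = shift(profile_out, offset)  # profile built on custom data shifts likewise

# Both evaluations yield the same Delta E, since translation preserves distance
print(delta_e76(generic_ref, profile_out))
print(delta_e76(custom_ref, custom_out))
```

Of course, this holds only insofar as the batch-to-custom discrepancy
really is a near-uniform shift; a non-uniform discrepancy could still
affect the profile's behavior in ways this idealization hides.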
I am sure there will be more discussion regarding this paper.
It's disconcerting that we were not "informed" of the potential of being
part of such an analysis. As said before, we will gladly offer software
to WMU for consideration of future analysis.
Apologies for this. We are now in contact with you. Abhay
Marc Aguilera
ColorBlind Software
www.color.com
--__--__--
_______________________________________________
colorsync-users mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/colorsync-users
Do not post admin requests to the list. They will be ignored.