Re: Monitor profile verification


  • Subject: Re: Monitor profile verification
  • From: Klaus Karcher <email@hidden>
  • Date: Sun, 02 Dec 2007 19:41:14 +0100

Klaus Karcher wrote:
It's somewhat misleading to use an instrument to verify its own results:

Marco Ugolini wrote:
Not if one's purpose is to verify the *internal consistency* of a monitor profile.

Agreed. It's not misleading to perform internal consistency checks, but it's definitely misleading to "sell" the results as a sufficient quality criterion for softproofing applications.


My intention was to place emphasis on the difference between this (definitely relevant) internal consistency check and the validation of a system against *external* references with tools like the UDACT <http://ugra.ch/index.php?show=299>: neither method can evaluate absolute accuracy, but only the latter can tell whether a system is able to reproduce the colors of a specified application (assuming the measurement device is sufficiently accurate). Tools like UDACT are indispensable for the evaluation of softproofing systems IMO.
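To illustrate the distinction, here is a minimal sketch in Python. The Lab values are invented for illustration: an internal consistency check compares the profile's predicted Lab value with a re-measurement by the same instrument, while an external check compares the measurement against an independent reference value. Both use a color-difference metric; CIE76 Delta E (plain Euclidean distance in CIELAB) is shown for simplicity.

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical patch values (not from any real device):
predicted = (50.0, 20.0, -30.0)   # Lab the profile predicts for an RGB stimulus
measured  = (50.3, 19.8, -29.6)   # re-measurement with the *same* instrument
reference = (52.0, 18.5, -27.0)   # independent external reference value

internal_de = delta_e76(predicted, measured)   # internal consistency only
external_de = delta_e76(measured, reference)   # accuracy vs. external reference
```

A small internal Delta E says only that the profile is self-consistent with its own instrument; it says nothing about how far both may be from the external reference, which is exactly the gap the paragraph above describes.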

Reading display "reviews" in several user-to-user forums sometimes makes me smile, especially when the reviewer quotes those internal validation results and in the same breath claims to "see" what he thinks the validation report told him to see. A fictitious example: "the validation tool X revealed a distinct weakness in the blue region (Delta E > 0.4), and in fact shades of blue look off the mark on display Y". I ask myself: what visual reference did the reviewer use to arrive at this conclusion? Under which viewing conditions? What was the precision of his measurement device? (Graeme Gill alluded to Delta Es above 5 between different display colorimeters a few weeks ago.) I'm sure that even the dispersion among high-quality reference prints is a multiple of the variations those "experts" claim to see.

When I see that the gamut deficiencies of the displays we use today for softproofing are roughly 20 times as large as typical internal validation errors, I feel obliged to put the different validation methods and their areas of application in perspective.

Regards,
Klaus Karcher
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Colorsync-users mailing list      (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden


  • Follow-Ups:
    • Re: Monitor profile verification
      • From: Marco Ugolini <email@hidden>
  • References:
    • Re: Monitor profile verification [was: Eye One Pro for monitor calibration? [was: Re: NEC 2690 SpectraView]]
      • From: Marco Ugolini <email@hidden>
