RE: Monitor profiling - what is 'correct'
- Subject: RE: Monitor profiling - what is 'correct'
- From: tom lianza <email@hidden>
- Date: Fri, 16 Feb 2007 08:13:30 -0500
This is an interesting question, and I thought I would share some of the
experiences I have had over the years designing software and hardware in
this area. There are two elements of the process that users often
confuse: calibration and profiling. During calibration the display is
physically altered; during profiling it is characterized. Large shifts
from the native color temperature will often compromise the calibration
ability of the display/display card combination. Large deviations from
the native gamma of the display will often lead to problems in the
reproduction of gray ramps. So, as a general rule of thumb: the less
"calibration" you need, the better the final image of the display will be.
The author of the original thread said the following of a display
calibrated to D65 in L*:
The differences are very slight - perhaps a slightly different looking
white. Noticeable if you switch between profiles, but any choice quickly
'looks' OK after a few moments use.
While it is true that different instruments measure differently, it is also true that if the display itself changed slightly between calibrations with each instrument, one would expect changes as different profiles are swapped in. A better test is to calibrate two displays side by side with the same instrument first, to test the repeatability of the instrument/software combination; then calibrate each display with competing instruments; then swap the instrument/display combinations and repeat the test. Realistically, the average person doesn't have the time to do this test correctly, and most "pundits" wouldn't bother doing it anyway.
When you actually run hundreds of these tests with large populations of units, you get a sense of the magnitude of variation between devices, as well as of the ability to make the measurement itself. Let me give you some numbers. In the lab, with some of the finest equipment from Minolta (CS1000) or the PhotoResearch PR650, the ability to repeatedly set up and measure a single display has a nominal delta x-y uncertainty of +/- .0005 xy (one sigma). This means that about 68% of the time we can set up and make the same measurement on a display with that repeatability. Most of this error comes from small variations on the display itself (which, by the way, can easily reach well over +/- .01 depending upon location and viewing angle).

Testing that I did a number of years ago, using hundreds of short-term mount-dismount tests, indicated that the act of simply putting a unit on the screen led to an uncertainty of measurement of +/- .0006 for CRTs and +/- .0008 for LCDs. This is because LCDs have slightly higher short-distance variation than a CRT, but much lower overall spatial change. The combined uncertainty of the process gets to about +/- .0009 xy (one sigma). We would typically use a two-sigma variation to determine manufacturing limits, and from this we see an uncertainty of +/- .0018 xy. Keep in mind that this is the uncertainty in making a measurement; it doesn't include manufacturing variances.
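Under the usual assumption that these error sources are independent, the combined figure follows from adding the one-sigma terms in quadrature (root sum of squares). A minimal sketch, using the numbers from the paragraph above:

```python
import math

def rss(*components):
    """Combine independent 1-sigma uncertainties in quadrature (root sum of squares)."""
    return math.sqrt(sum(c * c for c in components))

display_repeat = 0.0005  # setup/measurement repeatability on one display (1 sigma, xy)
mount_lcd = 0.0008       # mount/dismount uncertainty for an LCD (1 sigma, xy)

one_sigma = rss(display_repeat, mount_lcd)  # combines to about 0.0009 xy
two_sigma = 2 * one_sigma                   # about 0.0018 xy, the manufacturing limit
```

This reproduces the ~0.0009 xy (one sigma) and ~0.0018 xy (two sigma) figures quoted above.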
Graeme Gill wrote:
I did a quick
check recently between a DTP92, DTP94, Eye One Display 1, Eye One Display 2
and a Spectrolino, and all differed from each other by noticeable amounts,
typically 3-5 delta E worst case measuring a single display.
If you look at the base uncertainty and do the calculation, the delta E uncertainty is on the order of 2 or 3 delta E. If Graeme used the Spectrolino as a standard, I would expect the DTP94/DTP92 to differ because of differences in calibration standards. That alone could add 1 or 2 delta E to the estimate. Graeme's numbers are basically spot on for a large population of different instruments, and they represent a reasonable bound on the current state of the art.
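As a rough way to see how an xy uncertainty translates into delta E, one can convert chromaticity coordinates to CIELAB near the white point and take the Euclidean distance (plain CIE76 delta E). This is a generic illustrative sketch, not anyone's product code; it assumes a D65-like white at Y = 100, and the resulting delta E depends on the direction of the chromaticity shift:

```python
import math

D65_WHITE = (95.047, 100.0, 108.883)  # XYZ of D65, Y normalized to 100

def xy_to_xyz(x, y, Y=100.0):
    """Convert xyY chromaticity to tristimulus XYZ."""
    return (x * Y / y, Y, (1.0 - x - y) * Y / y)

def xyz_to_lab(X, Y, Z, white=D65_WHITE):
    """Standard XYZ -> CIELAB conversion relative to a reference white."""
    def f(t):
        d = 6 / 29
        return t ** (1 / 3) if t > d ** 3 else t / (3 * d * d) + 4 / 29
    fx, fy, fz = (f(c / w) for c, w in zip((X, Y, Z), white))
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

def delta_e76(lab1, lab2):
    """CIE76 delta E: Euclidean distance in Lab."""
    return math.dist(lab1, lab2)

# Shift the white chromaticity by the 2-sigma xy uncertainty in one direction.
white = xyz_to_lab(*xy_to_xyz(0.3127, 0.3290))
shifted = xyz_to_lab(*xy_to_xyz(0.3127 + 0.0018, 0.3290 + 0.0018))
de = delta_e76(white, shifted)  # magnitude depends on the shift direction
```

With two instruments each contributing their own uncertainty, plus inter-instrument calibration differences, the few-delta-E spread quoted above is plausible.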
In my own experience with on-screen proofing, the single biggest issue has been building the profile with the proper black point in the LUT description. The current ICC spec is a little vague about how the CMM should handle the black point tag, if it is present. In general, the tag seems to be ignored and the "black point" in the LUT is used. Many programs put a zero in that location, which causes massive errors when trying to proof print output on screen. That simple problem is bigger than any display measurement error, and it is more responsible for customer disappointment in display calibration than anything else.
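To make the failure mode concrete: the well-known black-point-compensation idea maps the source black to the destination black instead of leaving a zero in the LUT, so shadow detail is compressed rather than clipped. This is a hypothetical sketch in L* terms, with illustrative numbers that are not from any particular profile or product:

```python
def bpc_scale(L, src_black, dst_black, white=100.0):
    """Linear black point compensation in L*: map [src_black, white]
    onto [dst_black, white] so print black lands on display black."""
    t = (L - src_black) / (white - src_black)
    return dst_black + t * (white - dst_black)

# Example: a print black of L* = 18 proofed on a display whose real black
# is L* = 2. With compensation, print black maps to display black and
# shadow detail survives; with a zero black point in the LUT, everything
# below the display's real black simply clips to the same value.
mapped_black = bpc_scale(18.0, 18.0, 2.0)  # print black -> display black
mapped_white = bpc_scale(100.0, 18.0, 2.0)  # white is preserved
```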
Regards,
Tom Lianza
Xrite Corporation.
Colorsync-users mailing list (email@hidden)