Hello all,

Looking for recommendations on software that can process fairly large amounts of measurement data (10-50+ data sets) with the purpose of coming up with an optimal data set. Currently doing this in Excel, which is painful.

Some of the specific features I'm looking for:

- Import or drag/drop a set of measurements, compute the AVERAGE of the set, but also highlight data outliers that should not be included.
- Import or drag/drop a set of measurements, compute the MEDIAN of the set, but also show data outliers that should not be included.
- It should be able to do either of the functions above on a patch-by-patch basis rather than including/excluding the entire data set(s).
- Ideally the tool would use a CRF (cumulative relative frequency)-like method of displaying the data and the outliers, let me un-check the outliers either manually or based on metrics (a dE threshold), and then produce a final data set.

This isn't necessarily for the purpose of profiling but more for creating an ideal set of reference measurements for a fleet of devices.

ColorLogic's ColorANT and basICColor's IMprove look promising but seem to lack a few features I'm looking for. BabelColor's PatchTool may offer something as well, but I'm unsure. Vendors of any of these products, feel free to contact me.

Regards,
Terry Wyse
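The per-patch workflow Terry describes can be sketched in a few lines of Python. This is only an illustrative sketch, not any vendor's actual algorithm: it uses the per-channel median of a patch's readings as a robust center, flags readings beyond a CIE76 dE threshold as outliers, and averages the rest. The function names (`de76`, `average_patch`) are made up for this example.

```python
import math
import statistics

def de76(lab1, lab2):
    """CIE76 color difference between two (L, a, b) tuples."""
    return math.dist(lab1, lab2)

def average_patch(measurements, de_threshold=1.0):
    """Average one patch across several data sets, excluding outliers.

    measurements: list of (L, a, b) tuples, one reading of the same patch
    per data set.  The per-channel MEDIAN is used as a robust center; any
    reading farther than de_threshold (CIE76) from it is flagged as an
    outlier and left out of the final AVERAGE.
    Returns (average, kept, outliers).
    """
    center = tuple(statistics.median(m[i] for m in measurements)
                   for i in range(3))
    kept = [m for m in measurements if de76(m, center) <= de_threshold]
    outliers = [m for m in measurements if de76(m, center) > de_threshold]
    if not kept:  # threshold rejected everything; fall back to full set
        kept, outliers = list(measurements), []
    avg = tuple(sum(m[i] for m in kept) / len(kept) for i in range(3))
    return avg, kept, outliers
```

Running this patch-by-patch over 10-50 data sets, rather than once per file, gives the include/exclude granularity asked for above.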
Terry,

We perform this type of comparison in two situations. The first is building averaged profiles for media vendors and printer manufacturers. The underlying concern is that a particular printer, media, or ink batch may be an outlier.

The second instance sounds more along the lines of what you are looking for. Our largest client has fleets of hundreds of printers from several vendors and different model types. We construct a constantly evolving expected data set for each printer manufacturer/model (or model family) variation. Measurements can then be compared to the expected data, and also compared against historical data from the particular machine, to detect outlier behavior. The results are fed into our profiling algorithms and, when significant, flagged for follow-up.

Initially we used Excel as a processor. As you found, this quickly becomes impractical. We then switched to home-brewed software that, albeit flexible and powerful, was a PITA to maintain and add features to. We now use SAS JMP. It can easily import custom data sets; if you are already importing CGATS files into Excel, JMP is not going to be any more difficult. The real power, however, is the included statistical tests and graphing routines. Depending on t-test values, we typically use either Tukey's HSD or a Thompson tau test to determine whether a measurement is a true outlier at a given confidence level. Cumulative percent plots are also easy to generate; there isn't a faster way to visualize your data, see if the distribution is non-normal, and check whether more than one mechanism is driving the measured behavior.

JMP isn't cheap (unless you are associated with an academic institution), but we've found it to be worth the price. There is a free (30 days?) evaluation.

Cheers,
Ethan
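The tests Ethan names (Tukey's HSD, Thompson tau) need t-distribution critical values, which JMP supplies. As a rough stand-in that runs on the Python standard library alone, here is the much simpler Tukey "fences" (boxplot IQR) rule; it is a different and cruder Tukey method than HSD, but it illustrates the same idea of flagging a reading as an outlier at a fixed cutoff:

```python
import statistics

def tukey_fences(values, k=1.5):
    """Split values into (kept, outliers) using Tukey's fences: anything
    outside [Q1 - k*IQR, Q3 + k*IQR] is flagged.  Note this is the simple
    boxplot rule, NOT the Tukey HSD multiple-comparison test JMP offers."""
    q1, _, q3 = statistics.quantiles(values, n=4, method="inclusive")
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    kept = [v for v in values if lo <= v <= hi]
    outliers = [v for v in values if v < lo or v > hi]
    return kept, outliers
```

Applied per patch (e.g. to the L* readings of one patch across all data sets), this gives an automatic first pass before any manual un-checking.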
On 2 Aug 2016, at 20:22, Ethan Hansen <ehansen@drycreekphoto.com> wrote:
JMP isn't cheap (unless you are associated with an academic institution), but we've found it to be worth the price. There is a free (30 days?) evaluation.
Anybody using FileMaker for this sort of thing? It might not have the visualising capabilities of JMP, but its storage, processing, and statistical capabilities easily exceed anything you'd need for a lifetime of chart reads. And, when you're not analysing chart data, you can use it to run the rest of your business :-)

--
Martin Orpen
Idea Digital Imaging Ltd
Hello Terry,

FYI, with PatchTool (which I sell) you can do part of what you want:

1. Drag and drop sets of measurements.
2. Do an average. I would suggest a weighted average, which takes care of outliers by assigning a lower weight to these values when computing the average. In practice, for 20+ files, the effect of outliers on the average is very small to non-existent. Note: the weighted average is discussed in more detail in the PatchTool 5 Help manual, page 91 (and the manual can be downloaded without the program from the BabelColor download page).

You can compare a reference against the average, or the average against any individual file, and get stats (CRF) for all the patches in the files, as well as apply Delta E thresholds. However, in PatchTool you will NOT get stats on the individual patches.

Danny Pascale
www.babelcolor.com
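PatchTool's actual weighting scheme is documented in its manual; the sketch below is not that formula, just a generic illustration of the idea Danny describes: instead of hard-rejecting outliers, give each reading a weight that shrinks with its distance from a robust center, so stray values barely move the average. The `eps` damping constant is an arbitrary choice for this example.

```python
import math
import statistics

def weighted_average_lab(labs, eps=0.5):
    """Average (L, a, b) readings of one patch, down-weighting readings
    that sit far from the per-channel median.  NOT PatchTool's actual
    formula -- only an illustration of the weighted-average idea: an
    outlier gets a small weight rather than being excluded outright."""
    center = tuple(statistics.median(m[i] for m in labs) for i in range(3))
    weights = [1.0 / (eps + math.dist(m, center)) for m in labs]
    total = sum(weights)
    return tuple(sum(w * m[i] for w, m in zip(weights, labs)) / total
                 for i in range(3))
```

With a 20-reading set and one stray value, the weighted result lands very close to the consensus, which matches Danny's observation that outliers have little effect on the average for 20+ files.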
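The CRF display both Terry and Danny mention is straightforward to compute once you have a dE value per measurement. A minimal sketch (the function name `crf` is made up for this example):

```python
def crf(de_values):
    """Cumulative relative frequency of a set of dE values: for each
    sorted value, the fraction of measurements at or below it.  Plotting
    value (x) against frequency (y) gives the familiar CRF curve;
    outliers show up as a long, nearly flat tail at the top right."""
    ordered = sorted(de_values)
    n = len(ordered)
    return [(v, (i + 1) / n) for i, v in enumerate(ordered)]
```

Reading off the curve where it crosses, say, 95% gives the dE level below which 95% of the readings fall, which is a natural place to set the un-check threshold.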
participants (4)

- Danny Pascale
- Ethan Hansen
- Martin Orpen
- Terence Wyse