
Measurement Data Optimisation

Time permitting, I like to average out multiple chart readings to "smooth" measurements before profiling.

Does anybody have any experience with the following software? Comments?

ColorLogic ColorAnt - What is ColorAnt?

basICColor IMProve - Software for intelligent correction, optimization and processing of color measurement data | basICColor GmbH


Stephen Marsh

Optimizing and averaging data is a red herring in the sense that the real problem is variation. If one has a high level of variation, even good optimization of the data will not result in a consistent, predictable process or a more accurate profile.

Reducing variation requires a physical change to the process.

This probably does not help with your question but I am only trying to put it into perspective. Good luck.
 
Thanks for the reply, I agree about averaging data and high levels of variation, Eric! It is a given that process control must be in place before attempting to measure and profile. Not all prints have high levels of variation, for example output from a good inkjet printer. That being said, even a single print measured multiple times will vary slightly between measurements, even when using, say, an i1iO table. Does the minor measurement variation matter? Does averaging multiple readings help, or does it simply slow down the process for no appreciable gain?
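For what it's worth, here is a rough Python sketch of what per-patch averaging of repeated reads amounts to. It is not how ColorAnt or IMProve work internally, and the parse_cgats() helper and the file names are hypothetical placeholders:

```python
# Sketch: average L*a*b* values patch-by-patch across repeated reads
# of the same chart. parse_cgats() is a hypothetical helper returning
# {patch_id: (L, a, b)}; real CGATS parsing needs a proper library.

def average_readings(readings):
    """readings: list of dicts {patch_id: (L, a, b)}, one per read."""
    averaged = {}
    for pid in readings[0]:
        labs = [r[pid] for r in readings]
        n = len(labs)
        averaged[pid] = tuple(sum(channel) / n for channel in zip(*labs))
    return averaged

# Usage (file names are examples only):
# reads = [parse_cgats(f) for f in ("read1.txt", "read2.txt", "read3.txt")]
# smoothed_data = average_readings(reads)
```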

I am curious about the previously linked optimisation software apps. The ability to account for OBA content may be helpful if one's profiling package does not have this ability. Then there are the "smoothing or optimisation" aspects, etc. I am just wondering what others' experiences are: do these tools make that big a difference to the final profile and output? Is it all smoke and mirrors?

Stephen Marsh
 
I'm gonna' disagree with Erik a bit. Data smoothing is a very good thing.

The point of averaging and/or smoothing data is to minimize the effects of spurious measurements or odd happenings that are not indicative of the process. For example, if you measure one profiling target and that target happens to have a hickey on the one 50% K patch in the chart, then you have a measurement that distorts the whole effort. I say that the more targets you can measure, the better. Measure until you can't stand it any more, and then measure a few more.
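If it helps to picture it, here's a rough sketch (Python) of one way to combine several measured targets so a single bad patch doesn't drag the result; the data structure is just an assumption for illustration, not how any particular package stores measurements:

```python
# Sketch: per-patch median across several measured targets, so one damaged
# patch (e.g. a hickey on the 50% K patch of one target) is ignored rather
# than averaged into the profile data.
from statistics import median

def median_combine(targets):
    """targets: list of dicts {patch_id: (L, a, b)}, one per measured chart."""
    combined = {}
    for pid in targets[0]:
        labs = [t[pid] for t in targets]
        combined[pid] = tuple(median(channel) for channel in zip(*labs))
    return combined
```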

One of the issues I've seen crop up with certain profiling packages is the introduction of banding. A profile can have too many grid points, and it can be too "accurate" in describing the colorspace you're separating into.

Anyway, Stephen, I haven't used the packages you referred to. I have had friends smooth data using Heidelberg PrintOpen, and I've been very pleased with the result.

Variation isn't a problem, it's a natural occurrence. Minimizing variation is a good thing in a manufacturing process, but you'll never eliminate it.
 
Depending on the use of a profile, smoothing (as distinct from averaging) of measurement data can benefit even a theoretical device free of variation, if that device is quite non-linear, though possibly at the expense of overall accuracy. Matrix-based vs. LUT-based profiles for monitors, for example.
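As a toy illustration of smoothing as distinct from averaging, here is a simple centred moving average applied to a made-up tone curve; real profiling tools use far more sophisticated fitting:

```python
# Sketch: smooth a measured tone-reproduction curve with a centred moving
# average. The curve values are invented and the window size is arbitrary.

def smooth(curve, window=3):
    """Centred moving average; endpoints are left untouched."""
    half = window // 2
    out = list(curve)
    for i in range(half, len(curve) - half):
        out[i] = sum(curve[i - half:i + half + 1]) / window
    return out

measured = [0.0, 0.11, 0.24, 0.33, 0.47, 0.55, 0.68, 0.79, 0.91, 1.0]
print(smooth(measured))
```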
 
I was not meaning to say that data smoothing or optimization was bad. Of course it is helpful.

My view is just that reducing process variation will result in better data but smoothing data will not affect process variation that much.

Getting an average that has eliminated the odd faulty measurement is good, but one should also look at the calculated standard deviation. That is an important value with respect to statistical process control and analysis. If you do not obtain the standard deviation, you don't really have a measure of what your variation is. It is the normal method commonly used in manufacturing to report variation.
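As a concrete example, here is a small sketch of reporting the standard deviation alongside the mean for repeated readings of a single patch; the L* values are invented:

```python
# Sketch: report per-patch mean and standard deviation for repeated readings,
# so the variation itself gets a number rather than being averaged away.
from statistics import mean, stdev

def patch_stats(values):
    """values: repeated readings of one channel for one patch, e.g. L*."""
    return mean(values), stdev(values)

# Example: five L* readings of the same 50% K patch (made-up numbers)
l_star = [54.2, 54.5, 54.1, 54.6, 54.3]
avg, sd = patch_stats(l_star)
print(f"mean L* = {avg:.2f}, standard deviation = {sd:.2f}")
```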
 
I see what you're saying, but it is good to note that data smoothing and optimization can be very effective, even more so if the program uses a good algorithm. Of course, like you said, it won't work if you don't control your process. But there are several programs out there, like the ones the OP mentioned and a few others we use here in our shop, that provide very helpful information. Sometimes it's not just about eliminating odd measurements, but rather setting your press up based on how it runs on most occasions.

To Stephen Marsh: We don't use those specific programs, although we do have something along those lines. A good way to "smooth" out the results is to enable filters, if your program has them. Filtering out readings that are 3-6% off your best will usually give you nice, tight readings which you can use to generate an average.
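Roughly speaking, that filter-then-average idea could look like the sketch below; the 5% tolerance, the use of the median as the reference and the density values are all assumptions for illustration, not how any specific product implements it:

```python
# Sketch: keep only readings within a chosen percentage of a reference value
# (the median here), then average what's left.
from statistics import mean, median

def filtered_average(readings, tolerance=0.05):
    ref = median(readings)
    kept = [r for r in readings if abs(r - ref) <= tolerance * ref]
    return mean(kept) if kept else ref

densities = [1.42, 1.44, 1.43, 1.61, 1.41]  # one obvious outlier
print(filtered_average(densities))          # outlier dropped before averaging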
 
