When performing recalibration of a proofer, why do I get a different Maximum ΔE value between the samples view and statistics?
The ΔE values column of the second iteration is sorted from high to low values.
How can this be explained and what is the impact on the decision to finalize the process or not?
The maximum ΔE value in the statistics table is the highest ΔE value over all patches, excluding bad iterations. Here is an example of a bad iteration:
Sample 88 has undergone two iterations.
The first ΔE value is the "check" value. In this case, there is a ΔE of 2.5 compared with the Lab value stored in the proofer profile for this color.
The first iteration gives a ΔE value of 2.18.
The second iteration gives a ΔE value of 17.08. Clearly, this iteration does not improve the match with the reference Lab value (stored in the proofer profile) and can be considered a bad iteration.
Only the first iteration will be used when finalizing the recalibration process; the bad (second) iteration will be discarded. The maximum ΔE value used for this patch in the statistics is therefore 2.18.
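The discard-and-report logic above can be sketched in Python. Everything below is illustrative: the `delta_e76` helper (CIE76 formula), the patch names, and the Lab numbers are assumptions, and the actual proofing software may use a different ΔE formula (e.g. ΔE2000) and a different rule for flagging bad iterations.

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 color difference between two Lab triplets."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical data: reference Lab values stored in the proofer profile,
# and measured Lab values for each iteration of each patch.
reference = {
    "sample_88": (50.0, 20.0, -10.0),
    "sample_12": (70.0, 0.0, 0.0),
}
measurements = {
    "sample_88": [(51.5, 21.2, -9.0),   # iteration 1
                  (60.0, 5.0, 2.0)],    # iteration 2 (a "bad" iteration)
    "sample_12": [(70.5, 0.3, -0.2)],   # single iteration
}

# For each patch, keep only the iteration with the lowest ΔE; a bad
# iteration that worsens the match is never selected.
retained = {
    patch: min(delta_e76(lab, reference[patch]) for lab in labs)
    for patch, labs in measurements.items()
}

# The Maximum ΔE in the statistics is taken over the retained values
# only, so a discarded iteration cannot inflate it.
max_de = max(retained.values())
print(round(max_de, 2))
```

Note that the 17.08 value of the bad iteration would still appear in the samples view, but it never reaches `retained` and therefore never influences `max_de`.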
It is important to understand that the high ΔE values in the samples view typically belong to a specific iteration, which may not be used at all! If you want to see which sample is responsible for the maximum ΔE in the statistics, select the Maximum ΔE option in the Contents drop-down menu:
The decision to finalize the recalibration process should be based on the statistics tolerances and not on the samples view!