 # Bias


Definition

Also referred to as Accuracy, Bias is a measure of the distance between the measured value and the "True" or "Actual" value of the sample or part. If the distance is greater than zero, the bias is positive; if it is smaller than zero, the bias is negative. Thus, bias relates to the quality of the measurements (how close they are to the standard), as opposed to precision, which concerns the consistency of the measurements (how close they are to each other).

Examples

A sample part is measured 10 times by one operator, yielding the following measurements (in mm):
0.16, 0.20, 0.17, 0.15, 0.15, 0.18, 0.20, 0.15, 0.18, 0.21
The reference standard is 0.15 mm and the process variation for the part is 0.3 mm.

The average of the 10 measurements is: Average = 1.75/10 = 0.175 mm

Bias = Average – Reference value = 0.175 – 0.15 = 0.025 mm

Bias as a percentage of the process variation = 100*(|Bias|/Process variation) = 100*(0.025/0.3) = 8.33%

If the tolerance is known, we can also calculate the bias as a percent of the tolerance:

Tolerance = Upper specification – Lower specification = 0.25 mm
Bias as a percentage of tolerance = 100*(|Bias|/Tolerance) = 100*(0.025/0.25) = 10%
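The calculations above can be sketched in a short Python snippet (the variable names are illustrative, not part of any standard library):

```python
# Worked bias example: measurements of one part against a reference standard.
measurements = [0.16, 0.20, 0.17, 0.15, 0.15, 0.18, 0.20, 0.15, 0.18, 0.21]
reference = 0.15          # reference standard (mm)
process_variation = 0.3   # process variation for the part (mm)
tolerance = 0.25          # upper specification - lower specification (mm)

# Average of the repeated measurements.
average = sum(measurements) / len(measurements)

# Bias is the signed distance from the reference value.
bias = average - reference

# Express |bias| as a percentage of process variation and of tolerance.
bias_pct_pv = 100 * abs(bias) / process_variation
bias_pct_tol = 100 * abs(bias) / tolerance

print(f"Average: {average:.3f} mm")                    # 0.175 mm
print(f"Bias: {bias:.3f} mm")                          # 0.025 mm
print(f"% of process variation: {bias_pct_pv:.2f}%")   # 8.33%
print(f"% of tolerance: {bias_pct_tol:.2f}%")          # 10.00%
```

A positive `bias` here indicates the gage reads high relative to the standard; a negative value would indicate it reads low.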

Application

If Bias is high, examine the following potential root causes:
1. Appraisers not following the correct measurement procedure.
2. An error in measuring the Reference Value.
3. Instability in the measurement device. If a control chart shows a trend, the measurement device could be wearing or calibration could be drifting.