Definition

Gage linearity is a measure of the consistency of Bias over the operating range of the measurement device (gage). For example, if a bathroom scale reads 1.0 pound low when weighing a 150-pound person but is off by 5.0 pounds when weighing a 200-pound person, then the Bias is not constant, and it helps to understand how Bias changes over the range of weights being measured. This is gage linearity.

Linearity is assessed by selecting parts that span the entire operating range of the measuring instrument, establishing a confirmed reference value for each part, and comparing the gage's measurements against those references. The results are analyzed using a scatter plot of Bias versus Reference value and a fitted regression line, along with the R² value, which indicates the fraction of the bias variance accounted for by the linear trend. In general, the lower the slope, the 'better' the gage linearity.
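Concretely, the scatter-plot analysis amounts to fitting a simple linear regression of Bias on Reference. In standard regression notation (a sketch in notation of my choosing, not taken from the original):

```latex
\mathrm{Bias}_i = \beta_0 + \beta_1 \,\mathrm{Reference}_i + \varepsilon_i,
\qquad
R^2 = 1 - \frac{\sum_i \left(\mathrm{Bias}_i - \widehat{\mathrm{Bias}}_i\right)^2}{\sum_i \left(\mathrm{Bias}_i - \overline{\mathrm{Bias}}\right)^2}
```

Here the estimated slope β₁ is the linearity slope discussed below, and an R² near 1 indicates that the bias changes almost perfectly linearly across the range.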

Examples

Five parts representing the operating range of the measurement device are selected, and their reference values are determined using a master gage. An appraiser measures the parts over five trials; in each trial, the five parts are measured in random order, giving 25 observations. The Bias for each observation is calculated as Observed value − Reference value (a signed difference, since a gage can read high or low). A scatter plot of the Bias vs. Reference values is shown.
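As a minimal sketch of this data layout in Python (the reference and observed values below are hypothetical, chosen only to illustrate the bias calculation):

```python
import numpy as np

# Hypothetical data: 5 reference values (from a master gage) and
# 5 trials per part. All numbers are illustrative, not from the article.
reference = np.array([2.0, 4.0, 6.0, 8.0, 10.0])

# observed[t, p] = appraiser's reading of part p on trial t (5 trials x 5 parts)
observed = np.array([
    [1.9, 4.2, 6.4, 8.7, 11.0],
    [2.1, 4.1, 6.3, 8.6, 10.9],
    [2.0, 4.3, 6.5, 8.5, 11.1],
    [1.8, 4.2, 6.4, 8.8, 10.8],
    [2.2, 4.0, 6.2, 8.6, 11.2],
])

# Bias is signed: Observed - Reference (broadcasts reference across trials).
bias = observed - reference
print(bias.round(2))
```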

The best-fit (regression) line is also obtained:

Bias = -0.6352 + 0.2478 × Reference

The slope of the line (0.2478) represents the linearity present in the gage.
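A sketch of how such a line can be fitted with NumPy's polyfit (the pooled bias values here are illustrative, not the data behind the 0.2478 slope above):

```python
import numpy as np

# Pooled (reference, bias) observations; values are illustrative.
ref = np.repeat([2.0, 4.0, 6.0, 8.0, 10.0], 5)   # each part measured 5 times
bias = np.array([-0.1, 0.1, 0.0, -0.2, 0.2,
                  0.2, 0.1, 0.3, 0.2, 0.0,
                  0.4, 0.3, 0.5, 0.4, 0.2,
                  0.7, 0.6, 0.5, 0.8, 0.6,
                  1.0, 0.9, 1.1, 0.8, 1.2])

# Fit Bias = intercept + slope * Reference (ordinary least squares).
slope, intercept = np.polyfit(ref, bias, 1)

# R^2: fraction of bias variance explained by the linear trend.
pred = intercept + slope * ref
ss_res = np.sum((bias - pred) ** 2)
ss_tot = np.sum((bias - bias.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

print(f"Bias = {intercept:.4f} + {slope:.4f} * Reference,  R^2 = {r2:.3f}")
```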

Linearity can be expressed in two ways:

Absolute linearity:

Linearity = slope × process variation

OR

Linearity as a percentage of the process variation:

%Linearity = 100 × slope

In our example, suppose the process variation is 0.1. Then:

Linearity = slope × process variation = 0.2478 × 0.1 = 0.02478

%Linearity = 100 × slope = 24.78%
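The same arithmetic in Python, using the slope from the fitted line and the assumed process variation of 0.1:

```python
slope = 0.2478              # slope of the fitted bias line, from the example
process_variation = 0.1     # assumed process variation, from the example

linearity = slope * process_variation   # absolute linearity
pct_linearity = 100 * slope             # linearity as % of process variation

print(f"{linearity:.5f}")       # 0.02478
print(f"{pct_linearity:.2f}")   # 24.78
```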

Application

The chart on the left shows constant bias over the range of reference values. The one on the right shows bias increasing linearly as the reference value increases.
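A hypothetical sketch of how two such charts could be produced with matplotlib; the simulated bias values are made up solely to reproduce the two patterns described:

```python
import numpy as np
import matplotlib.pyplot as plt

ref = np.linspace(2, 10, 25)
rng = np.random.default_rng(0)
noise = rng.normal(0, 0.05, ref.size)

constant_bias = 0.3 + noise               # bias does not depend on reference
linear_bias = -0.6 + 0.25 * ref + noise   # bias grows with reference (poor linearity)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3), sharey=True)
for ax, bias, title in [(ax1, constant_bias, "Constant bias"),
                        (ax2, linear_bias, "Bias increasing with reference")]:
    ax.scatter(ref, bias)
    ax.set_xlabel("Reference value")
    ax.set_title(title)
ax1.set_ylabel("Bias")
plt.tight_layout()
plt.show()
```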