Measurement Systems Analysis (MSA)
[Figure: decomposition of measurement variation, with a pointer to the actual variation. Precision comprises Repeatability and Reproducibility; Accuracy comprises Bias and Stability.]
Standard Deviation: $s = \sqrt{\dfrac{\sum_{i=1}^{n} (x_i - \bar{x})^2}{n - 1}}$
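As a minimal sketch, this sample standard deviation can be computed directly in Python (the measurement values are illustrative):

import math

def sample_std_dev(values):
    """s = sqrt(sum((x_i - mean)^2) / (n - 1))"""
    n = len(values)
    if n < 2:
        raise ValueError("need at least two measurements")
    mean = sum(values) / n
    return math.sqrt(sum((x - mean) ** 2 for x in values) / (n - 1))

# Example: repeated measurements of the same part
measurements = [10.02, 9.98, 10.01, 10.00, 9.99]
print(sample_std_dev(measurements))  # ≈ 0.0158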
Instrumentation Concepts & Terminology
Interval
The two endpoints between which an instrument can be used for measurement.
Range
The difference between the two endpoints of the interval. This is also called Span or Full Scale. For example, a thermometer usable from −10 °C to 110 °C has an interval of (−10 °C, 110 °C) and a range (span) of 120 °C.
Resolution
The smallest value of the variable that can be measured by the instrument. This is sometimes also called sensitivity, threshold, or detection limit.
Precision vs. Accuracy
Accuracy
Accuracy is an indicator of how true a measurement is: how close the average of all measurements is to the true value of the quantity. The true value of a variable is given by a standard (reference) instrument.
Accuracy is generally quantified as "Bias":
Bias = Average of all observed values − True value
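As a sketch, bias can be computed from repeated readings of a reference part whose true value comes from a standard instrument (all numbers below are illustrative):

reference_value = 10.00  # true value from a standard instrument
readings = [10.03, 10.05, 10.02, 10.04, 10.06]

# Bias = average of observed values - true value
bias = sum(readings) / len(readings) - reference_value
print(f"Bias = {bias:+.3f}")  # +0.040 -> the instrument reads high on average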
Precision
Precision is an indicator of the consistency of measurements: how close the measurements are to each other.
It is basically calculated as the standard deviation, and it is characterized by Repeatability & Reproducibility (Gauge R&R).
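As a minimal sketch (not a full ANOVA-based Gauge R&R study), repeatability and reproducibility could be estimated from repeated trials by several operators on the same part; the operator names and data here are illustrative:

import statistics

# Each operator measures the same part several times (illustrative data)
trials = {
    "operator_A": [10.01, 10.02, 10.00, 10.01],
    "operator_B": [10.04, 10.05, 10.03, 10.04],
    "operator_C": [10.02, 10.01, 10.02, 10.03],
}

# Repeatability: same operator, same part, repeated measurements;
# estimated here as the pooled within-operator standard deviation.
within_variances = [statistics.variance(v) for v in trials.values()]
repeatability = (sum(within_variances) / len(within_variances)) ** 0.5

# Reproducibility: variation between operators;
# estimated here as the standard deviation of the operator means.
operator_means = [statistics.mean(v) for v in trials.values()]
reproducibility = statistics.stdev(operator_means)

print(f"repeatability  ≈ {repeatability:.4f}")
print(f"reproducibility ≈ {reproducibility:.4f}")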
Precision vs. Accuracy
In industrial practice, precision is commonly expressed as ±3σ: for a normal distribution, 99.7% of the values fall within μ ± 3σ.
Suitability of an Instrument in a Process – The P/T Ratio
P/T Ratio – Precision to Tolerance ratio
The standard P/T ratio:
$P/T = \dfrac{6\,\sigma_{\text{Measurement Error}}}{\text{Specified Tolerance of the Process}}$
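A minimal sketch of the P/T calculation, assuming the tolerance is given by spec limits (the function name, sigma value, and spec limits are illustrative); the acceptance cut-offs in the comments are commonly cited rules of thumb and vary by organization:

def pt_ratio(sigma_measurement, lower_spec, upper_spec):
    """P/T = 6 * sigma_measurement_error / (USL - LSL)"""
    tolerance = upper_spec - lower_spec
    return 6 * sigma_measurement / tolerance

# Illustrative example: gauge sigma = 0.01, spec limits 9.85 .. 10.15
ratio = pt_ratio(0.01, 9.85, 10.15)
print(f"P/T = {ratio:.2f}")  # 0.20

# Commonly cited rules of thumb (exact cut-offs vary by organization):
#   P/T <= 0.10        instrument is acceptable for the process
#   0.10 < P/T <= 0.30 marginal; may be acceptable for some applications
#   P/T > 0.30         instrument is not suitable for the process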