Calibration and Standards
BENEFITS OF CALIBRATION
Calibration is the process of comparing the performance of
measurement instruments and processes against accepted
standards in order to detect and correct errors. Calibration
therefore assures that devices and processes meet their
expected performance specifications to accepted levels of
accuracy. Hence, calibration has the following benefits:
Voltage standards
AC measurement standards
Resistance standards
Ratio standards
Current shunts
Frequency standards
RF calibration standards
Maintenance apparatus
Resistance bridges
Standard resistors
Piston gauges
Deadweight testers
Calibration Steps
A calibration process starts with the basic step of comparing a
known with an unknown to determine the error or value of the
unknown quantity. However, in practice, a calibration process
may consist of "as found" verification, adjustment, and "as left"
verification.
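The "as found"/adjust/"as left" sequence above can be sketched in code. This is an illustrative sketch only: the `TOLERANCE` value, the function names, and the `adjust` callback (standing in for a physical, electrical, or firmware adjustment) are all assumptions, not part of any standard procedure.

```python
# Sketch of the "as found" / adjust / "as left" calibration workflow.
# All names and the tolerance value are illustrative assumptions.

TOLERANCE = 0.5  # allowed error in instrument units (assumed)

def as_found_error(dut_reading, reference_value):
    """Error of the device under test relative to the reference standard."""
    return dut_reading - reference_value

def calibrate(dut_reading, reference_value, adjust):
    """Return (as-found, as-left) errors; adjust() models a physical,
    electrical, or firmware adjustment that realigns the DUT."""
    found = as_found_error(dut_reading, reference_value)
    if abs(found) <= TOLERANCE:
        return found, found            # in tolerance: no adjustment needed
    adjusted_reading = adjust(dut_reading, found)
    left = as_found_error(adjusted_reading, reference_value)
    return found, left                 # "as left" verifies the adjustment

# Example: a gauge reading 101.2 against a 100.0 reference,
# with an idealized adjustment that removes the found error.
found, left = calibrate(101.2, 100.0, lambda reading, err: reading - err)
print(f"as found: {found:.3f}, as left: {left:.3f}")
```

The "as left" step runs only after an adjustment, mirroring the text: an in-tolerance device is simply reported as found.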
Many measurement devices are adjusted physically (turning an
adjustment screw on a pressure gauge), electrically (turning a
potentiometer in a voltmeter), or through internal firmware
settings in a digital instrument.
For some devices, the data obtained during calibration is
kept with the device as correction factors, which the user
may apply to compensate for the device's known errors. RF
attenuators are an example: their attenuation values are
measured across a frequency range, and the data is kept with
the instrument in the form of correction factors, which the
end user applies to improve the quality of their
measurements. It is generally assumed that the device in
question will not drift significantly, so the corrections will
remain within the measurement uncertainty reported at
calibration for the length of the calibration interval. It is a
common mistake to assume that all calibration data can be
used as correction factors; the short- and long-term variation
of a device may exceed the measurement uncertainty reported
at calibration.
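As a concrete sketch of the attenuator case, the correction factors from a calibration report can be interpolated at the user's working frequency. The calibration points, function name, and the choice of linear interpolation are illustrative assumptions, not data from any real certificate.

```python
# Hypothetical sketch: applying stored attenuator correction factors.
# A calibration report lists measured attenuation at a few frequencies;
# the user interpolates linearly between the nearest points.

from bisect import bisect_left

# (frequency in MHz, measured attenuation in dB) -- illustrative values only
CAL_DATA = [(10, 20.02), (100, 20.05), (1000, 20.11), (3000, 20.25)]

def corrected_attenuation(freq_mhz):
    """Interpolate the calibrated attenuation at an arbitrary frequency,
    clamping to the end points outside the calibrated range."""
    freqs = [f for f, _ in CAL_DATA]
    if freq_mhz <= freqs[0]:
        return CAL_DATA[0][1]
    if freq_mhz >= freqs[-1]:
        return CAL_DATA[-1][1]
    i = bisect_left(freqs, freq_mhz)
    (f0, a0), (f1, a1) = CAL_DATA[i - 1], CAL_DATA[i]
    return a0 + (a1 - a0) * (freq_mhz - f0) / (f1 - f0)

# Interpolated value between the 100 MHz and 1000 MHz calibration points.
print(f"{corrected_attenuation(550):.3f} dB")
```

As the text cautions, this is only valid while the attenuator's drift stays within the uncertainty reported at calibration.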
Non-adjustable instruments, sometimes referred to as
“artifacts”, such as RTDs (resistance temperature detectors),
resistors, and Zener diodes, are often calibrated by
characterization. Calibration by
characterization usually involves some type of mathematical
relationship that allows the user to use the instrument to get
calibrated values. The mathematical relationships vary from
simple error offsets calculated at different levels of the
required measurement, like different temperature points for
a thermocouple thermometer, to a slope and intercept
correction algorithm in a digital voltmeter, to very complicated
polynomials such as those used for characterizing reference
standard radiation thermometers.
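The slope-and-intercept case mentioned above can be sketched as an ordinary least-squares line fitted through (indicated, reference) calibration points. The calibration values and function names are illustrative assumptions; real characterization procedures and uncertainty analyses are more involved.

```python
# Sketch of calibration by characterization using a slope-and-intercept
# (gain/offset) model, as a digital voltmeter might use.
# All calibration points below are illustrative assumptions.

def fit_slope_intercept(indicated, reference):
    """Least-squares slope m and intercept b so reference ~= m*indicated + b."""
    n = len(indicated)
    mx = sum(indicated) / n
    my = sum(reference) / n
    sxx = sum((x - mx) ** 2 for x in indicated)
    sxy = sum((x - mx) * (y - my) for x, y in zip(indicated, reference))
    m = sxy / sxx
    b = my - m * mx
    return m, b

# What the meter indicated vs. what the reference standard supplied (volts).
indicated = [0.998, 2.001, 5.006, 9.012]
reference = [1.000, 2.000, 5.000, 9.000]
m, b = fit_slope_intercept(indicated, reference)

def corrected(reading):
    """Apply the characterization to a raw meter reading."""
    return m * reading + b

print(f"corrected 5.006 V reading: {corrected(5.006):.4f} V")
```

The same pattern generalizes: per-point offsets for a thermocouple thermometer, or higher-order polynomials for reference radiation thermometers, simply swap in a different mathematical relationship.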
The “as left” verification step is required any time an
instrument is adjusted to ensure the adjustment works
correctly. Artifact instruments are measured “as-is” since they
can’t be adjusted, so “as found” and “as left” steps don’t apply.
A calibration professional performs calibration by comparing
a device under test with a calibrated reference standard of
known uncertainty (established through the calibration
traceability pyramid). The professional records the readings
from the device under test, compares them with the readings
from the reference source, and may then adjust the device
under test to correct it.
Calibration Example
Let’s say that you use a precise thermometer to control the
temperature in your pharmaceutical plant processes and you
need to calibrate it regularly to ensure that your products are
created within specified temperature ranges. You could send
your thermometer to a calibration lab or perform the calibration
yourself by purchasing a temperature calibrator, such as a liquid
bath calibrator or dry-well calibrator. A liquid-bath calibrator
(like the Fluke Calibration models 6109A or 7109A portable
calibration baths) will have a temperature-controlled tank filled
with a calibration fluid connected to a calibrated temperature
display. The dry-well calibrator is similar but a metal
temperature-controlled block will have measurement wells that
are sized to fit the diameter of the DUT thermometer. The
calibrator has been calibrated to a known accuracy. You place
your thermometer, the device under test (DUT), in the calibrator
tank or measurement well then you note the difference between
the calibrator display and the DUT over a distributed set of
temperatures within the range for which your thermometer is
used. In this way, you verify if your thermometer is within
specification or not. If the thermometer needs to be adjusted,
you may be able to adjust the display of the thermometer, if it
has one, or you can use the calibration results to determine new
offsets or characterization values for the probe. If you make
adjustments, then the calibration process is repeated to ensure
the adjustments worked correctly and verify that the
thermometer is within specification. You can also use the
calibrator to occasionally check the thermometer to make sure
it's still in tolerance. This same general process can be used for
many different measurement devices like pressure gauges,
voltmeters, etc.
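The verification part of this example can be sketched as a simple comparison at each set point. The set points, readings, and the tolerance value are illustrative assumptions, not the specification of any real thermometer or calibrator.

```python
# Illustrative sketch of the thermometer check described above: compare the
# DUT against the calibrator display at a distributed set of temperatures
# and flag any point outside the specified tolerance. All values assumed.

SPEC_TOLERANCE_C = 0.25  # assumed accuracy specification for the DUT

# (calibrator display in C, DUT reading in C) at set points across the use range
readings = [(50.0, 50.08), (75.0, 75.12), (100.0, 100.21), (121.0, 121.30)]

def verify(readings, tolerance):
    """Return per-point errors (DUT minus reference) and an overall verdict."""
    errors = [dut - ref for ref, dut in readings]
    in_spec = all(abs(e) <= tolerance for e in errors)
    return errors, in_spec

errors, in_spec = verify(readings, SPEC_TOLERANCE_C)
# With these assumed numbers, the highest set point exceeds the tolerance,
# so the DUT would need adjustment or new characterization values.
print(f"errors: {[round(e, 2) for e in errors]}, in spec: {in_spec}")
```

If the verdict is out of tolerance, the text's next steps apply: adjust or re-characterize the thermometer, then repeat the comparison as the "as left" verification.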
A Dry-well Calibrator (Fluke 9190A) with Reference and DUT Thermometer Probes
Metrologists
Lab managers
Calibration engineers
Calibration technicians
Manufacturing engineers
Instrument technicians
COSTS OF CALIBRATION