Calibration and Standards


Name: Ali Mohammed Khalil
Subject: Measuring Instruments
Stage / Branch: Second / Air Conditioning
Supervisor: Dr. Ali Dawood
Introduction
Calibration of instruments and processes is essential for checking their performance against known standards. This provides consistency in readings and reduces errors, thus validating the measurements universally. The calibration procedure involves comparison of the instrument against primary or secondary standards. In some cases, it may be sufficient to calibrate a device against another one with a known accuracy. After the calibration of a device or a process, future operation is considered to be error bound for a given period of time under similar operational conditions.

The process of calibration is carried out in a hierarchical order. At the highest level, the primary reference standard is assigned a value by direct comparison with the reference base of SI units. The primary standards are designated and widely acknowledged as having the highest metrological qualities, with values accepted without reference to other standards of the same quantity. At the second level, secondary reference standards are calibrated by comparison with primary standards of the same quantity, using a high-precision comparator and making appropriate corrections. At the third level, working standards are routinely used to calibrate instruments and processes against the secondary reference standards or their representatives. More is provided about physical standards in Article 43, Units.

The most important element in calibration is the relationship between a single measurement and the reference base for the unit of measurement: the reference base is the prime source of authority. The base units of measurement are the Système International d'Unités (SI) units maintained at the Bureau International des Poids et Mesures, Paris. These are the kilogram for mass, meter for length, second for time, candela for luminous intensity, kelvin for thermodynamic temperature, ampere for current, and mole for amount of substance. Other reference bases, such as the newton for force, the hertz for frequency, and so on, are derived from the base units and maintained by national standards bodies.

Recently, with the wide application of digital systems, many intelligent instruments can perform self-calibration – see Article 160, Smart Sensor System Features. In these cases, post-measurement corrections are made, and the magnitudes of various errors are stored in memory to be recalled and used in laboratory and field applications. A new trend is that certain calibrations can be conducted over the Internet by entering the appropriate websites of manufacturers or calibration authorities.
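
As an illustration of the post-measurement correction idea mentioned above, here is a minimal sketch in Python; the set points and correction values are hypothetical and not taken from any particular device.

```python
# Minimal sketch of post-measurement correction in a "smart" instrument:
# error magnitudes found during calibration are stored in memory and
# later applied to raw readings. All values are hypothetical.

# Corrections determined at calibration time, indexed by set point.
stored_corrections = {0.0: +0.02, 50.0: -0.01, 100.0: +0.03}

def corrected_reading(raw_value: float) -> float:
    """Apply the correction stored for the nearest calibration point."""
    nearest = min(stored_corrections, key=lambda point: abs(point - raw_value))
    return raw_value + stored_corrections[nearest]

print(corrected_reading(49.7))  # 49.69: raw reading plus the -0.01 correction
```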

BENEFITS OF CALIBRATION
Calibration is a process of testing and comparing the errors of
measurement instruments and processes with accepted
standards in order to detect and correct variations in
performance. Therefore, calibration assures that devices and
processes meet expected performance specifications within
universally accepted levels of accuracy. Hence, calibration
has the following benefits:

• It determines whether measurements made before the calibration were valid.

• It gives confidence that future measurements will be accurate.

• It assures consistency and compatibility with measurements made elsewhere.

• It leads to repeatability and reproducibility assessments of the instruments and processes.

• It provides confidence that products meet their specifications, thus reducing legal liability – see Article 25, Introduction to Signals in Physical Systems.

• Without calibration, product quality may be poor, opening up legal challenges and high failure rates of the products, thus increasing costs.

• It increases efficiency by ensuring that measurements are correct.

• In the process industry, calibration of devices assures that the processes are well controlled and that the products meet expected specifications.

• It leads to documentation of the performance of instruments and processes to meet quality standards such as ISO 9000, ISO 14000, and QS-9000.

• Frequent calibrations can provide a graphical view of equipment uncertainty over time, thus leading to reliable performance. This gives in-service life analysis; hence, depreciation and replacements can be predicted in an informed manner.

• Measurements made within international standards promote global acceptance, thus increasing competitiveness.

• It helps convenient implementation of related regulations and legislation that govern the use of equipment in a particular application.

• As technology changes, the regulations and legislation governing test and measuring instruments change continually, and calibration helps keep measurements and processes valid and compliant under changing conditions.

• In some cases, calibration can be used as a gain: a value that, multiplied by some input, produces a scaled output (see the sketch below).
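
The gain usage in the last bullet can be illustrated with a minimal Python sketch; the reference value, instrument reading, and resulting gain are hypothetical numbers chosen for illustration.

```python
# Sketch of calibration used as a gain (scale factor): a known input and
# the instrument's measured output give a multiplier that is applied to
# later readings. All values are hypothetical.

known_input = 10.000     # reference value applied to the instrument
measured_output = 9.950  # value the instrument actually reported

gain = known_input / measured_output  # correction gain, about 1.005

def scaled(reading: float) -> float:
    """Scale a raw reading by the calibration gain."""
    return reading * gain

print(scaled(4.975))  # prints approximately 5.000 after scaling
```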

Why do we need calibration standards?


Imagine a trip to the grocery store where you buy a pound of
hamburger. How do you know that the pound you buy at one
store is the same weight as one you can get at another? The
answer, of course, is that the meat is weighed before it is
packaged. But how do we know that the scales used to weigh
the meat are delivering accurate measurements? We trust this
is true because periodically the scales are calibrated with more
accurate weights and then adjusted to be within their
specifications. The process of comparing one scale against a
more accurate instrument continues, with the measurements
becoming more and more accurate, until we reach the “top” of
the measurement pyramid, which is the kilogram.

This is a simplistic example of traceability. Traceability is how we refer to a chain of measurements that ranges from the lowest level of the calibration pyramid all the way up to the SI standards. A traceable calibration is one performed as part of an unbroken chain of measurements that can be traced back to an SI unit. Each measurement also has an evaluation of the measurement uncertainty, to quantify the quality of each measurement. More information about traceability, the SI, and the calibration pyramid is available on the Fluke Calibration About Calibration page.

Having traceable measurements allows us to understand and trust them. Traceable measurements also enable reciprocal agreements between countries; these agreements facilitate fair international trade.

Calibration and adjustment – what are the differences?

Calibration: comparison of measurement values, traceable to a (national) standard.

Adjustment: setting a measuring instrument to the smallest possible deviation from the correct value. Adjustment requires an intervention in the instrument.

Where do we find calibration standards?

Calibration standards can be found in primary calibration laboratories around the world. Primary laboratories perform the most accurate calibrations and are often classified as National Metrology Institutes (NMIs). NMIs can be found in almost every country, with a network of less precise working laboratories that branch out into a system that forms the measurement infrastructure of each country.

Primary calibration laboratories are typically accredited by an organization that has been independently qualified to review and certify the labs’ quality, accuracy, and processes. Most large countries have one or more providers of accreditation, and countries often agree to trust each other’s measurements based on their trust in the respective accreditations.
How do we determine which standards are needed?

Measurement devices typically have specifications for their measurement ranges, accuracies, and uncertainties. You can find this information in the product manual. The manual also often specifies what is required to calibrate the device. Typically, you are going to use standards that are at least four times more accurate than the device you want to calibrate (as sketched below). You’ll want to balance the need for accuracy against your budget. And you’ll want to look at other product features like usability, form factor, and the ability to calibrate multiple devices. Depending on your workload, you might also want to look for a calibration standard that can be automated with calibration software.
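
The four-to-one rule of thumb above is often expressed as a test uncertainty ratio (TUR). A minimal sketch of that check, with hypothetical accuracy figures:

```python
# Sketch of the "four times more accurate" rule as a test uncertainty
# ratio (TUR) check. Accuracy figures are hypothetical.

dut_accuracy = 0.10       # e.g. +/-0.10 unit for the device under test
standard_accuracy = 0.02  # e.g. +/-0.02 unit for the proposed standard

tur = dut_accuracy / standard_accuracy
print(f"TUR = {tur:.1f}:1")

if tur >= 4.0:
    print("Standard is adequate (at least 4:1).")
else:
    print("Choose a more accurate standard.")
```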

What types of calibration standards does Fluke Calibration manufacture?

Fluke Calibration manufactures multiple types of calibration standards:

Electrical calibration standards:
• Voltage standards
• AC/DC transfer standards
• AC measurement standards
• Resistance standards
• Ratio standards
• Current shunts
• Frequency standards

RF calibration standards:
• Low phase noise reference sources

Temperature calibration standards:
• Standard platinum resistance thermometers
• ITS-90 fixed-point cells
• Maintenance apparatus
• Liquid nitrogen comparison calibrator
• Resistance bridges
• Standard resistors

Pressure calibration standards:
• Piston gauges
• Deadweight testers
• Pressure controllers and calibrators
• Pressure comparators and digital pressure gauges
• Portable/handheld pressure calibrators

Flow calibration standards:
• Gas flow calibration standards
• Reference flow monitors

Why is Calibration Important?

Calibration helps keep your world up, running, and safe. Though most never realize it, thousands of calibrations are quietly conducted every day around the world for your benefit. When on your next flight, taking medication, or passing a nuclear facility, you can expect that the systems and processes used to create and maintain them are calibrated regularly to prevent failure both in production and in ongoing use.

What about quality standards?

When people talk about calibration standards, they don’t always mean physical instruments. Sometimes they might be referring to quality standards or regulations that specify calibration. For example, ISO 9001 is a quality standard that requires certified companies to calibrate their measurement equipment and document the processes and procedures involved. ISO/IEC 17025 is the quality standard that calibration laboratories use to ensure they produce valid results.

How is a Calibration Performed?


There are several ways to calibrate an instrument, depending on the type of instrument and the chosen calibration scheme. There are two general calibration schemes:

1. Calibration by comparison with a source of known value. An example of a source calibration scheme is measuring an ohmmeter using a calibrated reference standard resistor. The reference resistor provides (sources) a known value of the ohm, the desired calibration parameter. A more sophisticated calibration source than the resistor is a multifunction calibrator that can source known values of resistance, voltage, current, and possibly other electrical parameters. A resistance calibration can also be performed by measuring a resistor of unknown value (not calibrated) with both the DUT instrument and a reference ohmmeter; the two measurements are compared to determine the error of the DUT (a minimal sketch of this comparison follows below).

2. Calibration by comparison of the DUT measurement with the measurement from a calibrated reference standard. A variant of the source-based calibration is calibrating the DUT against a source of known natural value, such as the chemical melt or freeze temperature of a material like pure water.

From this basic set of calibration schemes, the calibration options expand with each measurement discipline.
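
The resistance comparison described in scheme 1 reduces to a simple difference between two readings. A minimal sketch, with hypothetical readings:

```python
# Sketch of a comparison calibration: the same (uncalibrated) resistor is
# measured with a reference ohmmeter and with the DUT, and the difference
# between the readings is taken as the DUT error. Values are hypothetical.

reference_reading = 100.004  # ohms, from the calibrated reference ohmmeter
dut_reading = 100.051        # ohms, from the device under test

error = dut_reading - reference_reading
print(f"DUT error: {error:+.3f} ohm")  # prints +0.047 ohm
```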

Calibration Steps
A calibration process starts with the basic step of comparing a
known with an unknown to determine the error or value of the
unknown quantity. However, in practice, a calibration process
may consist of "as found" verification, adjustment, and "as left"
verification.
Many measurement devices are adjusted physically (turning an
adjustment screw on a pressure gauge), electrically (turning a
potentiometer in a voltmeter), or through internal firmware
settings in a digital instrument.
For example, for some devices, the data obtained in calibration is maintained on the device as correction factors, which the user may choose to apply to compensate for the known errors of the device. An example of this is RF attenuators, where attenuation values are measured across a frequency range. The data is kept with the instrument in the form of correction factors, which the end user applies to improve the quality of their measurements (see the sketch after this paragraph). It is generally assumed that the device in question will not drift significantly, so the corrections will remain within the measurement uncertainty provided during the calibration for the calibration interval. It is a common mistake to assume that all calibration data can be used as correction factors, because the short- and long-term variation of the device may be greater than the measurement uncertainty during the calibration interval.
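
A minimal sketch of applying stored correction factors like the attenuator example above; the frequency points and attenuation values are hypothetical, and linear interpolation is just one simple choice:

```python
# Sketch of using per-frequency correction factors: measured attenuation
# values are stored with the instrument and linearly interpolated at the
# frequency of use. Data points are hypothetical.

cal_data = [        # (frequency in GHz, measured attenuation in dB)
    (1.0, 20.05),
    (2.0, 20.12),
    (4.0, 20.31),
]

def attenuation_at(freq_ghz: float) -> float:
    """Linearly interpolate the calibrated attenuation at freq_ghz."""
    for (f1, a1), (f2, a2) in zip(cal_data, cal_data[1:]):
        if f1 <= freq_ghz <= f2:
            return a1 + (a2 - a1) * (freq_ghz - f1) / (f2 - f1)
    raise ValueError("frequency outside the calibrated range")

print(attenuation_at(3.0))  # 20.215 dB, interpolated between 2 and 4 GHz
```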
Non-adjustable instruments, sometimes referred to as
“artifacts”, such as temperature RTDs, resistors, and Zener
diodes, are often calibrated by characterization. Calibration by
characterization usually involves some type of mathematical
relationship that allows the user to use the instrument to get
calibrated values. The mathematical relationships vary from
simple error offsets calculated at different levels of the
required measurement, like different temperature points for
a thermocouple thermometer, to a slope and intercept
correction algorithm in a digital voltmeter, to very complicated
polynomials such as those used for characterizing reference
standard radiation thermometers.
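
The slope-and-intercept correction mentioned for digital voltmeters can be sketched with an ordinary least-squares fit; the reference values and readings below are hypothetical, and the standard-library fit stands in for whatever algorithm a real instrument uses:

```python
# Sketch of calibration by characterization with a slope-and-intercept
# correction: a least-squares line maps instrument readings back to
# reference values. All data are hypothetical. Requires Python 3.10+.

import statistics

reference = [0.0, 2.5, 5.0, 7.5, 10.0]      # values applied by the standard
readings = [0.03, 2.52, 5.06, 7.55, 10.08]  # what the instrument showed

# Fit reference = slope * reading + intercept.
slope, intercept = statistics.linear_regression(readings, reference)

def corrected(reading: float) -> float:
    """Map a raw reading to a characterized (corrected) value."""
    return slope * reading + intercept

print(corrected(5.06))  # approximately 5.0 after characterization
```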
The “as left” verification step is required any time an instrument is adjusted, to ensure the adjustment works correctly. Artifact instruments are measured “as is” since they cannot be adjusted, so separate “as found” and “as left” steps do not apply.
A calibration professional performs calibration by using a
calibrated reference standard of known uncertainty (by virtue
of the calibration traceability pyramid) to compare with a
device under test.  He or she records the readings from the
device under test and compares them to the readings from the
reference source. He or she may then make adjustments to
correct the device under test.
Calibration Example
Let’s say that you use a precise thermometer to control the
temperature in your pharmaceutical plant processes and you
need to calibrate it regularly to ensure that your products are
created within specified temperature ranges.  You could send
your thermometer to a calibration lab or perform the calibration
yourself by purchasing a temperature calibrator, such as a liquid
bath calibrator or dry-well calibrator.  A liquid-bath calibrator
(like the Fluke Calibration models 6109A or 7109A portable
calibration baths) will have a temperature-controlled tank filled
with a calibration fluid connected to a calibrated temperature
display.  The dry-well calibrator is similar but a metal
temperature-controlled block will have measurement wells that
are sized to fit the diameter of the DUT thermometer.  The
calibrator has been calibrated to a known accuracy.  You place
your thermometer, the device under test (DUT), in the calibrator
tank or measurement well, then note the difference between the calibrator display and the DUT over a distributed set of temperatures within the range for which your thermometer is used. In this way, you verify whether your thermometer is within specification. If the thermometer needs to be adjusted,
you may be able to adjust the display of the thermometer, if it
has one, or you can use the calibration results to determine new
offsets or characterization values for the probe.  If you make
adjustments, then the calibration process is repeated to ensure
the adjustments worked correctly and verify that the
thermometer is within specification.  You can also use the
calibrator to occasionally check the thermometer to make sure
it's still in tolerance.  This same general process can be used for
many different measurement devices like pressure gauges,
voltmeters, etc.
Figure: A dry-well calibrator (Fluke 9190A) with reference and DUT thermometer probes.
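
The “as found” check in this example amounts to comparing DUT readings with the calibrator display at several set points against a tolerance. A minimal sketch, with a hypothetical tolerance and hypothetical readings:

```python
# Sketch of an "as found" verification: DUT readings are compared with the
# calibrator display at several set points and checked against a tolerance.
# All numbers are hypothetical.

tolerance = 0.25  # degrees C, allowed deviation for this thermometer

# (calibrator display, DUT reading) pairs in degrees C
points = [(20.00, 20.08), (40.00, 40.19), (60.00, 60.31)]

for true_temp, dut_temp in points:
    error = dut_temp - true_temp
    status = "PASS" if abs(error) <= tolerance else "FAIL"
    print(f"{true_temp:6.2f} C  error {error:+.2f} C  {status}")

# The 60 C point fails here, so the thermometer would be adjusted or given
# new offsets, then re-verified ("as left").
```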

Who Performs Calibrations?


People who perform calibration in laboratories include:

• Metrologists
• Lab managers
• Calibration engineers
• Calibration technicians

People who perform calibration work in the field include:

• Manufacturing engineers
• Instrument technicians

COSTS OF CALIBRATION

A successful calibration process requires hardware and software, special equipment, and manpower; hence, the costs vary depending on the intensity of use of these resources. The cost of calibration depends on what is calibrated and who performs the calibration. In simple cases where a one-off instrument is involved, the cost can be lower than one hundred dollars, but complex cases can cost thousands of dollars. Calibration cost also depends on whether the calibration is carried out on the premises of calibrating laboratories, on the factory floor, or outsourced to third parties. Certification under ISO 10012-1, ISO 9001, MIL-STD 45662A, and MIL-HDBK-52B requires calibration of measuring equipment. In many situations, such as the calibration of weighing systems, it is a statutory requirement.
One of the major factors in cost is the frequency of calibration of an instrument; most calibration systems set a recalibration interval for each instrument accordingly.

REFERENCES
a) Halit Eren, Calibration Process, March 2005.

b) https://static-int.testo.com/media/5e/fb/0f62e1610132/Factsheet-push-food-coldchain-EN.pdf

c) https://us.flukecal.com/literature/about-calibration
