Common Terms Used in Instrument Calibration

Calibration Range
The calibration range of an instrument is defined as the region between the limits within
which a quantity is measured, received, or transmitted, expressed by stating the lower
range value (LRV) and the upper range value (URV). These limits are defined by the zero
and span values: the zero value is the lower end of the range (the LRV), and the upper end
of the range is the URV. For example, if an instrument is to be calibrated to measure
pressure in the range 0 psig to 400 psig, then LRV = 0 psig and URV = 400 psig. The
calibration range is therefore 0 to 400 psig.

Span
Span is defined as the algebraic difference between the upper and lower range values.
Span = URV – LRV
For the example considered above, where the calibration range is 0 to 400 psig, the
span = 400 – 0 = 400 psig.
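As a quick numeric check, here is a minimal sketch of the span calculation in Python, using the 0 to 400 psig example above:

```python
# Minimal sketch: span is the algebraic difference between the upper (URV)
# and lower (LRV) range values of the calibration range.
lrv = 0.0    # lower range value, psig
urv = 400.0  # upper range value, psig

span = urv - lrv
print(f"Calibration range: {lrv:g} to {urv:g} psig, span = {span:g} psig")
# -> Calibration range: 0 to 400 psig, span = 400 psig
```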

Instrument Range
The instrument range refers to the capability of the instrument. It is often the nameplate
rating of the instrument. For example, an instrument nameplate may read:
Instrument range 0 to 800 psig; Output 4 to 20 mA.

Never confuse the instrument range with the calibration range; they are two different
things. Although the instrument range here is 0 to 800 psig, we may decide to calibrate it
over a range of 0 to 400 psig, or over the full 0 to 800 psig for an application with high
input pressure, in which case the instrument range and the calibration range of the
device coincide.
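One practical consequence is that any chosen calibration range must fit inside the instrument range. Below is a minimal sketch of such a check, assuming the 0 to 800 psig nameplate rating above; the function name is only for illustration:

```python
def within_instrument_range(cal_lrv, cal_urv, instr_min=0.0, instr_max=800.0):
    """Check that a proposed calibration range lies inside the instrument's
    nameplate range (defaults follow the 0 to 800 psig example above)."""
    return instr_min <= cal_lrv < cal_urv <= instr_max

print(within_instrument_range(0.0, 400.0))   # True: 0 to 400 psig is acceptable
print(within_instrument_range(0.0, 1000.0))  # False: exceeds the instrument range
```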

Ranging an Instrument
To range an instrument means to set the lower and upper range values so it responds
with the desired sensitivity to changes in input. Suppose we want to use a pressure
transmitter to measure pressure in the range 0 to 100 bar and give an output of 4 to 20 mA.
To range this transmitter, we simply set:

0 bar = 4 mA

100 bar = 20 mA

Closely related to ranging is re-ranging, which simply means resetting the lower and
upper range values to a different measurement range. For example, suppose we want to
re-range the above transmitter to measure pressure in the range 50 to 150 bar; we
simply reset as follows:

50 bar = 4 mA
150 bar = 20 mA.
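Ranging and re-ranging amount to choosing the LRV and URV for a linear 4 to 20 mA output. The sketch below shows the scaling this implies; the function name is only for illustration:

```python
def pressure_to_ma(pressure, lrv, urv):
    """Linearly map a pressure between lrv and urv onto the 4 to 20 mA output."""
    return 4.0 + 16.0 * (pressure - lrv) / (urv - lrv)

# As ranged above: 0 bar = 4 mA, 100 bar = 20 mA
print(pressure_to_ma(0.0, 0.0, 100.0), pressure_to_ma(100.0, 0.0, 100.0))      # 4.0 20.0

# After re-ranging to 50 to 150 bar: 50 bar = 4 mA, 150 bar = 20 mA
print(pressure_to_ma(50.0, 50.0, 150.0), pressure_to_ma(150.0, 50.0, 150.0))   # 4.0 20.0
```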
Zero and Span Adjustments
Zero and Span Adjustments are commonly done on analog and smart instruments. By
adjusting both zero and span, we may set the instrument for any range of measurement
within the manufacturer’s limits. For most analog instruments, zero and span
adjustments are interactive. That is, adjusting one has an effect on the other. Specifically,
changes made to the span adjustment almost always alter the instrument’s zero point.
An instrument with interactive zero and span adjustments requires much more effort to
calibrate accurately, as one must switch back and forth between the lower and upper
range points repeatedly to adjust for accuracy.

For smart instruments, however, there is no interaction between the zero and span
adjustments.
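To see why interactive adjustments force repeated passes, consider the toy model below, in which trimming the span also shifts the zero. The coupling constant and starting errors are made-up illustrative values, not data for any real instrument:

```python
# Toy model of an analog instrument whose span adjustment disturbs the zero point.
k = 0.3                    # made-up coupling of the span setting into the zero
zero, span = 0.05, 0.90    # initial miscalibration (ideal: zero = 0, span = 1)

def reading(x):
    """Indicated output for an input x expressed as a fraction of range."""
    return span * x + zero + k * (span - 1.0)

for pass_no in range(1, 4):
    zero -= reading(0.0)          # trim zero so the 0% point reads correctly,
    span -= reading(1.0) - 1.0    # then trim span at the 100% point, which
                                  # shifts the zero again through k
    print(f"pass {pass_no}: 0% error = {reading(0.0):+.4f}, "
          f"100% error = {reading(1.0) - 1.0:+.4f}")
```

The printed errors shrink with each pass, mirroring the back-and-forth a technician performs on an interactive analog instrument.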

Five Point Calibration
When calibrating an instrument, as a general rule, the data points should include
readings taken at 0%, 25%, 50%, 75%, and 100% of the calibration range of the
instrument. This is often referred to as a five-point calibration. During a five-point
calibration exercise, both upscale (increasing) and downscale (decreasing) testing should
be done to determine the repeatability and hysteresis of the particular instrument.
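Here is a minimal sketch of how as-found five-point data might be tabulated, with hysteresis estimated as the difference between the downscale and upscale readings at each point. The transmitter range and the milliamp readings are made-up illustrative numbers:

```python
# Illustrative five-point check of a 0-400 psig transmitter with a 4-20 mA output.
points_pct = [0, 25, 50, 75, 100]
upscale_ma = [4.02, 8.05, 12.07, 16.04, 19.98]    # readings taken with increasing input
downscale_ma = [4.06, 8.11, 12.15, 16.09, 19.98]  # readings taken with decreasing input

print("   %    psig   ideal     up    down  hysteresis (mA)")
for pct, up, down in zip(points_pct, upscale_ma, downscale_ma):
    psig = pct / 100 * 400        # applied test pressure
    ideal = 4 + 16 * pct / 100    # ideal output for a 4-20 mA instrument
    print(f"{pct:4d} {psig:7.0f} {ideal:7.2f} {up:6.2f} {down:7.2f} {down - up:11.2f}")
```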

Field Calibration
In field calibration, the instrument is not removed from the process; in fact, it remains in
its mounting brackets. Field calibration allows the field instrument to be tested or
calibrated at the true process and ambient conditions. Calibration done under field
conditions is often very different from calibration done under shop conditions, and the
two can produce different results. Most field instruments have an isolating valve
manifold that makes it easy to disconnect them from the process. After disconnection,
the instrument is vented to the atmosphere before the test or calibration signal is
applied.

In-Shop or Bench Calibration
A bench calibration is a procedure in which the instrument is calibrated at a calibration
bench, using calibration devices to simulate the process rather than using the actual
process itself as the input, as is done in the field. Here, the instrument is disconnected
from the process, cleaned, and taken to the shop, where it is mounted on a test stand at
a calibration bench.

Bench Tester
A bench tester is used to carry out bench calibration of an instrument or device. It
consists of a highly accurate standard gauge and a pressure source for producing the test
pressure required for testing the instrument. Most bench testers are fabricated on the
job site by instrument technicians, while some are ordered as complete systems from
vendors. A standard bench should have various hoses and pumps that are well labelled
and organized to aid technicians in the calibration process.
Calibrators
Calibrators are devices used to test and adjust other instruments. They vary in form
and function with the equipment or device they are designed to calibrate. Typical
calibrators include:

(a) Block calibrators and fluidized baths are used to calibrate temperature probes such
as RTDs and thermocouples.

(b) A signal reference is used to calibrate panel meters and temperature controllers. It is a
type of calibrator that can generate a known electrical signal. There are voltage, current,
and frequency signal references. Once a signal from one of these calibrators is fed into
the equipment in question, the display or output value of the equipment can be
adjusted until it matches the known signal.

The simulator, a special kind of signal reference, generates sensor output. Signal
references and simulators can often read as well as generate signals.

(c) Pneumatic calibrators provide the regulated pressure required to test or calibrate
pressure instruments. They are often used in conjunction with a pressure source.

Calibration Records
Calibration records are the documentation kept to ensure that the history of the
device or instrument is not lost. They also aid in troubleshooting any drift in the
instrument’s performance over time. Calibration records should show:

(a) The as found data

(b) The current calibration date

(c) The final calibration or as left data

(d) The name or initials of the technician who did the calibration

(e) The date the instrument is due for the next calibration
As Found Data
The as found data of an instrument to be calibrated is the response (reading) from the
device at the points of calibration (0%, 25%, 50%, 75% and 100%) before the actual
calibration exercise begins.
As Left Data
The as left data of an instrument is the response (reading) from the device at the points
of calibration (0%, 25%, 50%, 75% and 100%) after the instrument has been calibrated.
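As a rough illustration, a single calibration record covering the items listed above might be represented as follows. The field names and readings are hypothetical, not taken from any particular system:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CalibrationRecord:
    """Hypothetical structure holding the items a calibration record should show."""
    tag: str                  # instrument identification
    calibration_date: str
    next_due_date: str        # when the next calibration is due
    technician: str           # name or initials of who performed the calibration
    points_pct: List[int] = field(default_factory=lambda: [0, 25, 50, 75, 100])
    as_found: List[float] = field(default_factory=list)   # readings before adjustment
    as_left: List[float] = field(default_factory=list)    # readings after adjustment

record = CalibrationRecord(
    tag="PT-101",
    calibration_date="2024-03-15",
    next_due_date="2025-03-15",
    technician="J.D.",
    as_found=[4.10, 8.14, 12.18, 16.12, 20.05],   # made-up as-found mA readings
    as_left=[4.00, 8.00, 12.01, 16.00, 20.00],    # made-up as-left mA readings
)
print(record.tag, record.calibration_date, record.as_found)
```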
Traceability
All calibrations should be traceable to a nationally or internationally recognized
standard. Traceability is defined as the property of a measurement result whereby it can
be related to appropriate standards, generally national or international standards,
through an unbroken chain of comparisons. In the U.S., the national standards body is
NIST. The National Institute of Standards and Technology (NIST), part of the U.S.
Department of Commerce, oversees the development of measurement standards and
technology consistent with the International System of Units (SI).

Traceability is achieved by ensuring that the test standards we use for calibration
operations are regularly calibrated by higher-level reference standards. Typically, the
measurement standards we use in a workshop are sent out periodically to a standards
laboratory which has more accurate test equipment. The standards from the calibration
laboratory are in turn periodically checked against higher-level standards, and so on,
until eventually the standards are tested against primary standards maintained by NIST
or another internationally recognized standards body.
