
Lecture 4

Calibration of Measurement Systems

• Calibration may be defined as the process of comparing the output of the instrument under test against the output of an instrument of known accuracy, when the same input is applied to both instruments.

• This procedure is carried out for a range of inputs covering the whole
measurement range of the instrument.

• Calibration ensures that the measuring accuracy of every instrument used in a measurement system is known over its whole measurement range, provided the instruments are used under the same environmental conditions as those in which they were calibrated.

• There are two basic types of calibration: static and dynamic calibration.
Static Calibration
• Static calibration is the most common type of calibration. All measurements
involving a static calibration are referred to as static measurements.

• In static calibration, values of the variables involved remain constant (do not
change with time). In a static measurement, only the magnitudes of the
known input and the measured output are important.

• By applying a range of known values of the input and observing the system output, a direct calibration curve can be developed for the measurement system. On such a curve the known input is plotted on the x-axis, whereas the measured output is plotted on the y-axis. (The input value is a controlled independent variable, while the measured output value is the dependent variable.)

• The static calibration curve describes the static input-output relationship for a measurement system and forms the logic by which the indicated output can be interpreted during an actual measurement. Calibration curves are used to validate the instrument over its full range of values, and they are usually developed by the manufacturer.
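The procedure above can be sketched numerically: apply a set of known inputs, record the indicated outputs, and fit a straight calibration line that can later be inverted to interpret readings. This is a minimal sketch with hypothetical data (the input/output values and units are illustrative, not from the lecture), using a least-squares fit.

```python
import numpy as np

# Hypothetical calibration data: known applied inputs and indicated outputs
known_input = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])          # e.g. pressure, kPa
indicated_output = np.array([0.1, 1.05, 2.0, 3.1, 3.95, 5.02])  # e.g. voltage, V

# Least-squares straight-line fit: output = slope * input + intercept
slope, intercept = np.polyfit(known_input, indicated_output, 1)

def interpret(output_reading):
    """Invert the calibration line: indicated output -> estimated input."""
    return (output_reading - intercept) / slope

print(f"slope = {slope:.3f}, intercept = {intercept:.3f}")
print(f"reading 2.5 V -> input of about {interpret(2.5):.2f} kPa")
```

In practice the manufacturer would use many more calibration points and quote the residual uncertainty of the fit along with the curve itself.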
• Some important characteristics of static calibration are defined as follows:

1. Accuracy: The closeness with which a measuring instrument can measure the true value of the measurand under stated environmental conditions. The true value is elusive (like the value of π, it can never be stated exactly), so an instrument should strike the right balance of accuracy for the application.

2. Sensitivity: The ratio of the change in the output reading to a given change in the input. (An instrument with a large sensitivity will indicate a large movement of the indicator for a small input change.) A galvanometer is an example of a highly sensitive sensing device.
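Since sensitivity is the slope of the output-versus-input relationship, it can be computed directly from calibration data. A minimal sketch with hypothetical thermocouple-style readings (values are illustrative assumptions):

```python
import numpy as np

# Hypothetical readings: input temperature (degC) and output e.m.f. (mV)
temp = np.array([20.0, 40.0, 60.0, 80.0])
emf = np.array([0.8, 1.6, 2.4, 3.2])

# Sensitivity = change in output per unit change in input (local slope)
sensitivity = np.diff(emf) / np.diff(temp)   # mV per degC between points
print(sensitivity)
```

For this (perfectly linear) data the sensitivity is a constant 0.04 mV/°C; a real instrument's sensitivity may vary across the range.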

3. Linearity: Most instruments are specified to function over a particular range, and an instrument can be said to be linear when equal incremental changes in the input produce equal incremental changes in the output over the specified range. Linearity depends on the type of sensor: if the sensing element shows linear behaviour over a certain range, the instrument should be designed for that range.
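One common way to quantify linearity (an assumption here, as the lecture does not specify a method) is to fit a best straight line to the calibration points and quote the largest deviation as a percentage of full-scale output:

```python
import numpy as np

# Hypothetical calibration points for an instrument specified over 0-10 units
x = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
y = np.array([0.0, 2.1, 4.0, 6.2, 8.1, 10.0])

# Best-fit straight line over the specified range
slope, intercept = np.polyfit(x, y, 1)
fit = slope * x + intercept

# Non-linearity: largest deviation from the line, as % of full-scale output
full_scale = y.max() - y.min()
nonlinearity_pct = 100.0 * np.max(np.abs(y - fit)) / full_scale
print(f"non-linearity of about {nonlinearity_pct:.2f} % of full scale")
```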

4. Resolution: It is defined as the smallest input increment that gives some small but definite numerical change in the output. For example, a ruler graduated in millimetres has a resolution of 1 mm.
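Resolution can be modelled as quantization: the output only changes in fixed steps, just as a ruler reading jumps by whole millimetres. A minimal sketch with a hypothetical display step of 0.1 units:

```python
# Hypothetical digital display with a resolution of 0.1 units
RESOLUTION = 0.1

def displayed(value):
    """Round the true value to the nearest step the display can show."""
    steps = round(value / RESOLUTION)
    return round(steps * RESOLUTION, 1)  # second round cleans float noise

print(displayed(2.34))   # stays at 2.3
print(displayed(2.37))   # an input change of 0.03 crossed a step: 2.4
```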

5. Threshold: If the instrument input is very gradually increased from zero, there will be a minimum value required to give a detectable output change. This minimum value defines the threshold of the instrument.

6. Repeatability: This is the ability of a measuring instrument to give identical indications, or responses, for repeated applications of the same value of the measurand under stated conditions of use.
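Repeatability is commonly quantified as the scatter of repeated readings of the same input; quoting it as a standard deviation is one conventional choice (an assumption here, with hypothetical readings):

```python
import statistics

# Hypothetical repeated readings for the same true input of 50.0 units
readings = [50.1, 49.9, 50.0, 50.2, 49.8, 50.1]

# Repeatability is often quoted as the standard deviation (or a multiple
# of it) of readings taken under identical conditions
mean = statistics.mean(readings)
spread = statistics.stdev(readings)
print(f"mean = {mean:.2f}, repeatability (1 sigma) = {spread:.3f}")
```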

7. Drift: It is the variation in the output of an instrument that is not caused by any change in the input; it is commonly caused by internal temperature changes and component instability. Such behaviour reflects the instrument itself rather than any change in the measured quantity.

8. Zero Stability: It is a measure of the ability of an instrument to return to a zero reading after the measurand has returned to zero and other variations such as temperature, pressure, vibration, etc. have been removed.

9. Dead Band: It is the largest change in the measurand to which the instrument does not respond. This may be produced by friction, backlash or hysteresis in the instrument. On the calibration graph, the dead band appears as a region of input over which no output change is given.

10. Readability: It is defined as the ease with which readings may be taken with an instrument. Readability depends on the quality and design of the output display; in analog instruments, a well-designed scale can be read at a glance.

11. Range: Scale range is defined as the difference between the nominal values of the measured quantities corresponding to the terminal scale marks. (This is normally expressed in the form 'A to B', where A is the minimum scale value and B is the maximum scale value.) Instrument range is the total range of values which an instrument is capable of measuring.
Dynamic Calibration

• A dynamic calibration determines the relationship between an input of known dynamic behavior and the measurement system output. Such calibrations involve either a sinusoidal signal or a step change as the known input signal. All measurements involving dynamic calibration are referred to as dynamic measurements. For example, measurement of a cyclically changing pressure is a dynamic measurement: the variables are the same as in the static case, but they depend on time.
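A step-change dynamic calibration can be sketched for a hypothetical first-order instrument (for example, a thermometer with time constant τ, an assumption not taken from the lecture): after a step in the input, the output approaches the new value exponentially, and τ can be read off as the time to reach about 63.2 % of the step.

```python
import math

# Hypothetical first-order instrument: TAU is its time constant (s)
TAU = 2.0

def step_response(t, initial=0.0, final=1.0):
    """Output at time t after a step change in the input from initial to final."""
    return final + (initial - final) * math.exp(-t / TAU)

# At t = TAU the output has covered about 63.2 % of the step;
# by t = 5 * TAU it has essentially settled at the final value
print(step_response(TAU))
print(step_response(5 * TAU))
```

Fitting this exponential to the recorded output is one way to extract the instrument's dynamic characteristics from a step-change calibration.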
References/Further Reading

• Alan S. Morris, Measurement and Instrumentation Principles, Butterworth-Heinemann, Noida, 2001.
• Richard S. Figliola, Donald E. Beasley, Theory and Design for Mechanical Measurements, John Wiley, Singapore, 2004.
• C. V. Collet, A. D. Hope, Engineering Measurements, Pitman, London, 1983.
