ME8501 - Metrology and Measurements - Unit - I
Applications:
1. Industrial measurement
2. Commercial transactions
3. Ensuring public health and human safety
Functions:
[Block diagram: Input → Primary (sensing) stage → Conversion stage (analogous output) → Manipulation stage (amplified signal) → Readout/Recording stage]
Primary or Input sensing stage
Secondary stage or Conversion stage
Tertiary stage or Manipulation stage
Final stage or Readout – Recording stage
Classification of measuring instruments:
1. On the basis of function
a) Length measuring instruments
b) Angle measuring instruments
c) Geometrical form checking instruments
d) Surface finish – checking instruments
2. On the basis of accuracy
a) Most accurate instruments, e.g. light interference instruments
b) Moderately accurate instruments, e.g. comparators
c) Below-moderate accuracy instruments, e.g. dial indicators
3. On the basis of precision
a) Precision measuring instruments
b) Non-precision measuring instruments
Characteristics of measuring instrument:
Sensitivity:
It is the degree of response of an instrument to an incoming signal. It is the
ratio of the change in the output of an indicating instrument or transducer to
the change in the value of the measured quantity.
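As a small numeric sketch of this ratio definition (the instrument and readings below are hypothetical, not from the text):

```python
# Sensitivity = change in output / change in the measured quantity.
# Hypothetical example: a mercury thermometer whose column rises
# from 10.0 mm to 12.0 mm as the temperature rises from 20.0 to 21.0 degC.
def sensitivity(delta_output, delta_input):
    """Ratio of the change in output to the change in the measured quantity."""
    return delta_output / delta_input

s = sensitivity(delta_output=12.0 - 10.0, delta_input=21.0 - 20.0)
print(s)  # 2.0 (mm per degC)
```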
Stability:
It is the property of a measuring instrument whereby its metrological
properties remain constant with time. It helps to evaluate the performance of
a measuring device over time.
Hysteresis:
It is the difference between the indications of a measuring instrument when
the same value of the measured quantity is reached by increasing or
decreasing that quantity.
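A minimal sketch of this definition, using hypothetical gauge readings:

```python
# Hysteresis at a given input value: the difference between the reading
# obtained when the input is increased up to that value and the reading
# obtained when it is decreased down to the same value.
def hysteresis(reading_increasing, reading_decreasing):
    return abs(reading_decreasing - reading_increasing)

# Hypothetical pressure gauge: indicates 50.2 kPa approaching 50 kPa
# from below, and 50.6 kPa approaching the same 50 kPa from above.
h = hysteresis(50.2, 50.6)
print(round(h, 1))  # 0.4 (kPa)
```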
Range:
It describes the lowest to highest values that can be measured using that
particular instrument.
Span:
The difference between the highest and lowest calibration point is called span.
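The two definitions above can be illustrated side by side (the calibration points are hypothetical):

```python
# Range: the lowest and highest values the instrument can measure.
# Span: the numerical difference between those calibration points.
low_cal, high_cal = -20.0, 120.0        # hypothetical thermometer, degC

instrument_range = (low_cal, high_cal)  # range: -20 degC to 120 degC
span = high_cal - low_cal               # span: 140 degC
print(instrument_range, span)
```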
Linearity:
Linearity of an instrument means that the output is directly proportional to
the input over its entire range.
Non-linearity:
Non-linearity of an instrument means that the output is not proportional to
the input over its entire range.
Threshold:
The minimum value of input below which no output value is achieved is
known as threshold.
Repeatability and reproducibility:
Repeatability may be defined as the ability of the instrument to give same
output reading when the same input value is applied repeatedly under the
same operating conditions.
Reproducibility may be defined as the degree of closeness among the repeated
measurements of the output for the same value of input under the same
operating conditions at different times.
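One common way to quantify repeatability is the spread (sample standard deviation) of repeated readings of the same input; a sketch with hypothetical readings:

```python
from statistics import mean, stdev

# Hypothetical: the same 10.00 mm gauge block measured five times
# under the same operating conditions.
readings = [10.01, 10.03, 9.99, 10.02, 10.00]

average = mean(readings)  # best estimate of the indicated value
spread = stdev(readings)  # smaller spread means better repeatability
print(round(average, 3), round(spread, 4))
```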
Response time:
The time taken by an instrument to approach its true output when subjected to
a step input is sometimes referred to as its response time.
Dead band/time:
Dead band of an instrument is the range of input values for which the
instrument does not respond. The dead band is typically a region of input
close to zero at which the output remains zero.
Dead time is the time taken by the sensor from the application of input to
begin its response and change.
Zero Drift:
Drift is the variation in the output of an instrument for a given input over a
period of time. Zero drift is the change in output over time when the input is
held at zero.
Resolution:
It is defined as the smallest change in the input that can be detected by an
instrument. It can also be defined as the minimum value of the input
required to cause an appreciable change or an increment in the
output.
Fidelity:
The degree to which an instrument indicates the changes in a measured
value without dynamic error is called fidelity.
Accuracy Vs Precision
Accuracy is the closeness of a measured value to the true value, whereas
precision is the closeness of repeated measurements to one another; an
instrument can be precise without being accurate.
Fundamental Units
1. Length - Meter (m)
2. Mass - Kilogram (kg)
3. Time - Second (s)
4. Electric current - Ampere (A)
5. Temperature - Kelvin (K)
Supplementary Units
Thermal units: Units for specific heat capacity, latent heat, sensible
heat, etc.
In science, the observer effect refers to changes that the act of observation
makes on the phenomenon being observed.
Depending on the functions and applications, different types of standards of measurement are
classified as follows:
1. International standards
2. Primary standards
3. Secondary standards
4. Working standards
Factors affecting the accuracy of measurement:
1. Device accuracy
2. Procedural effects
3. Environmental effects
Errors in measurement:
Error is the difference between the measured value and the true value.
The errors in measurement can be expressed either as an absolute error or relative error.
True absolute error: Algebraic difference between the results of measurement and the true
value of the quantity measured is called true absolute error.
Apparent absolute error: While taking a series of measurements, the algebraic
difference between one of the results of measurement and the arithmetic mean is
called apparent absolute error.
Relative error: It is defined as the ratio of the absolute error to the value
of comparison used for the calculation of that absolute error.
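Taking the true value itself as the value of comparison (the numbers below are hypothetical), the two error measures can be computed as:

```python
# Absolute error = measured value - true value.
# Relative error = absolute error / value of comparison (here, the true value).
true_value = 25.00       # mm, hypothetical
measured_value = 25.04   # mm, hypothetical

absolute_error = measured_value - true_value
relative_error = absolute_error / true_value
print(round(absolute_error, 2), f"{relative_error:.2%}")
```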
Types of errors:
1. Characteristic error
2. Instrumental error
3. Random error
4. Environmental error
Problem 1:
Problem 2: