
METROLOGY – ETAT 307

METROLOGY

SCIENCE OF MEASUREMENT
Legal Metrology. That part of metrology which treats units of measurement, methods of measurement and measuring instruments, in relation to the technical and legal requirements.

Dynamic Metrology. The technique of measuring small variations of a continuous nature. The technique has proved very valuable, and a record of continuous measurement, over a surface for instance, has obvious advantages over individual measurements of an isolated character.

Deterministic Metrology. This is a new philosophy in which part measurement is replaced by process measurement. New techniques, such as 3D error compensation by CNC (Computer Numerical Control) systems and expert systems, are applied, leading to fully adaptive control. This technology is used for very high precision manufacturing machinery and control systems to achieve micro technology and nanotechnology accuracies.
SYLLABUS – ETAT 307

UNIT - I                                                                                 
Principles of measurement: Definition of Metrology, difference between
precision and accuracy. Sources of errors: Controllable and Random Errors,
Effects of Environment and Temperature, Effects of support, alignment
errors, application of Least Square principles, errors in measurement of a
quantity which is a function of other variables. Introduction to Coordinate
Measuring Machine (CMM).
Length Standards: Line standards, end standards and wavelength standards,
transfer from line standards to end standards.  Numerical based on line
standards. Slip gauges – its use and care, methods of building different heights
using different sets of slip gauges.
Limits, fits and Tolerances: Various definitions, IS 919-1963, different types
of  fits and methods to provide these fits.  Numerical to calculate the limits, fits
and tolerances as per IS 919-1963.  ISO system of limits and fits; Gauges and its
types, limit gauges – plug and ring gauges.  Gauge Design – Taylor’s Principle,
wear allowance on gauges.  Different methods of giving tolerances on gauges,
Numericals.
[T1,T2][No. of Hrs. 11]

UNIT - II
Comparators: Mechanical Comparators: Johansson Mikrokator and Sigma Mechanical
Comparator. Mechanical-optical comparator. Principles of Electrical and electronic
comparators.  Pneumatic comparators – advantages, systems of Pneumatic gauging:-
Flow type and back pressure type,  Principle of working of back pressure gauges,
different type of sensitivities and overall magnification, Solex Pneumatic gauges and
differential comparators. Numericals based on pneumatic comparators.

Angular Measurement:  Sine Bar – different types of sine bars, use of sine bars in
conjunction with slip gauges, precautions and calibration of sine bars.  Use of angle
gauges, spirit level, errors in use of sine bars. Numericals. Circular Division: dividing
head and circular tables, circular division by precision Polygons. Caliper Principle,
Calibration of polygons.  Numerical based on circular division.
[T1,T2][No. of Hrs. 11]

UNIT - III
Straightness and Flatness: Definition of Straightness and Flatness error. Numericals
based on determination of straightness error of straight edge with the help of spirit
level and auto collimator.  Numericals based on determination of flatness error of a
surface plate with the help of spirit level or auto collimator. Surface texture, different
types of irregularities, standard measures for assessment and measurement of surface
finish.     
Screw Thread Measurement: Errors in threads, Measurement of elements of screw
threads –major dia, minor dia, pitch, flank angle and effective diameter (Two and
three wire methods).  Effect of errors in pitch and flank angles and its mathematical
derivation. Numericals.
Gear Measurement:  Measurement of tooth thickness – Gear tooth vernier caliper,
Constant chord method, base tangent method and derivation of mathematical
formulae for each method.  Test plug method for checking pitch diameter and tooth
spacing.  Measurement of Gear Pitch, Parkinson Gear Tester, Numericals.
[T1,T2][No. of Hrs. 11]

UNIT - IV
Machine Tool Alignment: Machine tool tests and alignment tests on lathe.  Alignment
tests on milling machine.  Alignment tests on a radial drilling machine.
Interferometry:  Principle of measurement, Interferometry applied to flatness testing,
surface contour tests, optical flats, testing of parallelism of a surface with the help
of optical flat. Quantitative estimate of error in parallelism, NPL Flatness Interferometer,
NPL Gauge Length Interferometer for checking the error in slip gauges. Numericals based on
Interferometry.
Introduction to Seismic Transducers - Displacement and acceleration measurement, Pressure
measurement - Bourdon pressure gauge, bulk modulus gauge, pirani gauge.
[T1,T2][No. of Hrs. 10]
Text Books:
[T1] R.K. Jain, “Engineering Metrology”, Khanna Publishers, Delhi
[T2] I.C. Gupta, “Engineering Metrology”, Dhanpat Rai Publications, Delhi
 
Reference Books:
[R1] F.W. Galyer & C.R. Shotbolt, “Metrology for Engineers”, ELBS edition
[R2] Beckwith, Buck, Lienhard, “Mechanical Measurements”, Pearson Education
[R3] Anand K. Bewoor, Vinay A. Kulkarni, “Metrology and Measurement”, TMH
Introduction to Metrology

 The word metrology is derived from two Greek words: ‘metro’, meaning measurement, and ‘logy’, meaning science.

 Metrology is the science of precision measurement. The engineer
can say it is the science of measurement of lengths and angles and
all related quantities, like width, depth, diameter and straightness,
with high accuracy.

 Metrology demands pure knowledge of certain basic mathematical
and physical principles. The development of industry largely
depends on engineering metrology.

 Metrology is concerned with the establishment, reproduction,
conservation and transfer of units of measurement and their
standards. Irrespective of the branch of engineering, all engineers
should know about various measuring instruments and techniques.
Objective of Metrology

 To standardize the measuring methods

 To maintain the accuracies of measurement.

 Complete evaluation of newly developed products.

 Determination of the process capabilities and ensure that these are better than
the relevant component tolerances.

 Determination of the measuring instrument capabilities and ensure that they are
quite sufficient for their respective measurements.

 Minimizing the cost of inspection by effective and efficient use of available
facilities.

 Reducing the cost of rejects and rework through application of Statistical Quality
Control Techniques.

 To prepare designs for all gauges and special inspection fixtures.


Need of Inspection/ Metrology

 To ensure the material, parts and components conform to the
established standards
 To meet the interchangeability of manufacture
 To provide the means of finding the problem area for meeting the
established standards
 To produce the parts having acceptable quality levels with reduced
scrap and wastage
 To purchase good quality of raw materials, tools and equipment
that govern the quality of finished products
 To take necessary efforts to measure and reduce the rejection
percentage
 To judge the possibility of rework of defective parts
Metrology and Manufacturing
Elements of Metrology

• Standard
• The most basic element of measurement is the standard, without which no
measurement is possible.
• A standard is a physical representation of the unit of measurement.
• Different standards have been developed for various units including
fundamental units as well as derived units.
• Workpiece
• Work piece is the object to be measured/measured part
• Variations in geometry and surface finish of the measured part
directly affect measuring system’s repeatability
• Compressible materials such as plastics or nylon pose a different type of
problem: any gauging pressure will distort the material. This can be
avoided by fixing the gauging pressure at a value agreed by the industry so
that everyone obtains uniform results
Elements of Metrology
• Instruments
• Instrument is a device with the help of which the measurement can be
done
• The instrument must be selected based on the tolerance of the
parts to be measured, the type of environment and the skill
level of operators
• It should also be remembered what type of instrument the customer
prefers for measurement.
• Person
• There must be some person or mechanism to carry out the
measurement
• Modern gauges are increasingly easy to use, but failure to
adequately train the operating personnel will lead to poor performance
Effects of elements of metrology on Precision and
Accuracy
Accuracy of Measurements:
The purpose of measurement is to determine the true dimensions of a part. But
no measurement can be made absolutely accurate. There is always some error.
The amount of error depends upon the following factors:
 The accuracy and design of the measuring instrument
 The skill of the operator
 Method adopted for measurement
 Temperature variations
 Elastic deformation of the part or instrument etc

The true dimension of the part cannot be determined but can only be approximated.
The agreement of the measured value with the true value of the measured quantity is called
accuracy. If the measurement of a dimension of a part approximates very closely to the true value of
that dimension, it is said to be accurate.
The term accuracy denotes the closeness of the measured value to the true value. The difference
between the measured value and the true value is the error of measurement. The smaller the error,
the greater the accuracy.

Precision and Accuracy

Accuracy:
Accuracy may be defined as the ability of an instrument to respond to a true value of the measured
variable under the reference conditions. It refers to how closely the measured value agrees
with the true value. The difference between the measured value and the true value is known
as the error of measurement.
Precision:
Precision may be defined as the degree of exactness for which an instrument is designed or
intended to perform. It refers to the repeatability or consistency of measurement when
the measurements are carried out under identical conditions
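The distinction can be illustrated numerically: accuracy is the closeness of the mean reading to the true value, while precision is the spread of repeated readings. A minimal sketch (the instruments, readings and true value below are illustrative, not from the text):

```python
from statistics import mean, stdev

def accuracy_error(readings, true_value):
    """Accuracy: closeness of the mean measured value to the true value."""
    return abs(mean(readings) - true_value)

def precision_spread(readings):
    """Precision: repeatability of the readings (sample standard deviation)."""
    return stdev(readings)

true_length = 25.000  # mm, assumed true dimension of the part

# Instrument A: readings cluster tightly but away from the true value
# (precise but not accurate).
a = [25.012, 25.011, 25.013, 25.012]
# Instrument B: readings scatter around the true value
# (more accurate on average, but less precise).
b = [24.996, 25.005, 24.999, 25.002]

print(accuracy_error(a, true_length), precision_spread(a))
print(accuracy_error(b, true_length), precision_spread(b))
```

Instrument A would suit comparative work once calibrated (its offset is systematic), while B's scatter limits its repeatability.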
Factors affecting elements of metrology

 Factors affecting the standard of measurement


• Coefficient of thermal expansion
• Elastic properties of a material
• Stability with time
• Calibration interval
• Geometric compatibility

 Factors affecting the workpiece to be measured


• Coefficient of thermal expansion of material
• Elastic properties of a material
• Cleanliness, surface finish, surface defects such as scratches,
waviness etc.,
• Adequate datum on the workpiece
• Thermal equalization

 Factors affecting the characteristics of an instrument


• Scale error
• Repeatability and readability
• Calibration errors
• Effect of friction, zero drift, backlash etc.,
• Inadequate amplification
• Deformation when heavy workpieces are measured
• Constant geometry for both workpiece and standard.

 Factors affecting person


• Training/skill
• Ability to select the measuring instruments and standard
• attitude towards accuracy
• Planning measurement techniques for minimum
cost, consistent with precision requirements etc.

 Factors affecting environment


• Temperature, humidity, atmosphere, pressure etc.,
• Clean surrounding and minimum vibration enhance precision
• Temperature equalization between standard, workpiece
and instrument.
• Thermal expansion effects due to heat radiation from
lights, heating elements, sunlight and people.
• Manual handling may also introduce thermal expansion.

Higher accuracy can be achieved only if all the sources of error due to the
above five elements in the measuring system are analyzed and steps taken
to eliminate them.

The above analysis of five basic metrology elements can be condensed into
the acronym SWIPE for convenient reference, where:
S - STANDARD
W- WORKPIECE
I - INSTRUMENT
P-PERSON
E – ENVIRONMENT
Errors in Measurement

• An error may be defined as the difference between the measured value
and the actual value
• True value may be defined as the average value of an infinite
number of measured values
• Measured value can be defined as the estimated value of the true value,
found by taking several readings during an experiment.

• Error in measurement = Measured value – True value

The errors in measurement can be expressed either as absolute or
relative errors
Errors in Measurement

 Absolute Error:
• True absolute error: It is the algebraic difference between the
result of measurement and the conventional true value of the
quantity measured.
• Apparent absolute error: If a series of measurements is made,
then the algebraic difference between one of the results of
measurement and the arithmetic mean is known as the apparent
absolute error.

 Relative Error:
It is the quotient of the absolute error and the value of comparison
used for the calculation of that absolute error. This value of comparison
may be the true value, the conventional true value or the arithmetic
mean of a series of measurements.
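These definitions translate directly into arithmetic. The sketch below (the readings and the conventional true value are illustrative) computes the true absolute, apparent absolute and relative errors as defined above:

```python
from statistics import mean

def true_absolute_error(measured, conventional_true):
    """Algebraic difference between one measurement and the conventional true value."""
    return measured - conventional_true

def apparent_absolute_error(one_reading, readings):
    """Algebraic difference between one reading and the mean of the series."""
    return one_reading - mean(readings)

def relative_error(absolute_error, comparison_value):
    """Quotient of the absolute error and the value of comparison."""
    return absolute_error / comparison_value

readings = [10.02, 10.04, 10.03, 10.03]  # mm, illustrative series
conventional_true = 10.00                # mm, assumed

abs_err = true_absolute_error(readings[0], conventional_true)  # ~0.02 mm
rel_err = relative_error(abs_err, conventional_true)           # ~0.002
app_err = apparent_absolute_error(readings[1], readings)       # ~0.01 mm
```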
Types of Errors

1. Systematic Error:
 These errors include calibration errors, errors due to variation in
atmospheric conditions, variation in contact pressure, etc.
• If properly analyzed, these errors can be determined and reduced or even
eliminated; hence they are also called controllable errors.
• All systematic errors except personal errors can be controlled in
magnitude and sense.
• These errors result from an irregular procedure that is consistent in action.
• These errors are repetitive in nature and are of constant and similar form.
Types of Errors

2. Random Error:
• These errors are caused due to variation in position of setting standard and
work-piece errors.
• Due to displacement of level joints of instruments, due to backlash and
friction, these error are induced.
• Specific cause, magnitude and sense of these errors cannot be determined
from the knowledge of measuring system or condition of measurement.
• These errors are non-consistent and hence the name random errors.
Types of Errors

3. Environmental Error:
• These errors are caused by the effect of the surrounding
temperature, pressure and humidity on the measuring
instrument.
• External factors like nuclear radiation, vibrations and magnetic
fields also lead to errors.
• Temperature plays an important role where high precision is
required, e.g. while using slip gauges: due to handling, the slip
gauges may acquire human body temperature, whereas the
work is at 20°C. A 300 mm length will then be in error by about
5 microns, which is quite a considerable error.
• To avoid errors of this kind, all metrology laboratories and
standard rooms worldwide are maintained at 20°C.
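The slip-gauge example above can be checked with the linear expansion relation ΔL = α·L·ΔT. Assuming a handbook expansion coefficient for gauge steel of about 11.5 × 10⁻⁶ per °C, a temperature rise of roughly 1.5 °C from handling reproduces the quoted figure of about 5 microns on a 300 mm length:

```python
def thermal_error_mm(length_mm, alpha_per_degC, delta_t_degC):
    """Change in length dL = alpha * L * dT caused by a temperature offset."""
    return alpha_per_degC * length_mm * delta_t_degC

# Assumed handbook value for gauge steel; real gauges vary.
ALPHA_STEEL = 11.5e-6  # per degC

# A 300 mm slip gauge stack warmed about 1.5 degC above 20 degC by handling:
err_mm = thermal_error_mm(300, ALPHA_STEEL, 1.5)
print(f"{err_mm * 1000:.1f} microns")  # roughly the 5 microns quoted above
```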
Types of Errors

4. Alignment Error (Cosine Error):
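The figure for this slide is not reproduced here. As a sketch of the standard relation (assuming the usual setup where the measuring axis is tilted by a small angle θ relative to the dimension being measured), the reading is too long by L(1 − cos θ), which is why this is called cosine error:

```python
import math

def cosine_error(measured_length, misalignment_deg):
    """Cosine (alignment) error: with the measuring axis tilted by a small
    angle theta, the reading exceeds the true length by L * (1 - cos(theta))."""
    theta = math.radians(misalignment_deg)
    return measured_length * (1 - math.cos(theta))

# A 1 degree misalignment while measuring a nominal 100 mm length:
print(f"{cosine_error(100, 1.0) * 1000:.1f} microns")
```

Because the error grows with θ², even a small misalignment quickly becomes significant at high precision.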


Types of Errors

5. Elastic Deformation or Support Error:


Long bars, due to improper support or due to self-weight, may undergo deflection or may
bend. As shown in the figure, with too small or too large a distance between the supports, a
long bar tends to deform.

Such errors can be reduced if the distance between the support points is kept at 0.577 of
the total length of the bar.
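The 0.577 figure corresponds to a support spacing of L/√3. A minimal sketch (assuming a uniform bar supported symmetrically):

```python
import math

def support_spacing(bar_length_mm):
    """Distance between the two support points that minimizes bending of a
    uniform bar: 0.577 * L, i.e. L / sqrt(3)."""
    return bar_length_mm / math.sqrt(3)

def support_positions(bar_length_mm):
    """Symmetric support positions, measured from one end of the bar."""
    d = support_spacing(bar_length_mm)
    offset = (bar_length_mm - d) / 2
    return offset, offset + d

print(support_spacing(1000))    # ~577.4 mm apart for a 1 m bar
print(support_positions(1000))  # supports at ~211.3 mm and ~788.7 mm
```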
Types of Errors

6. Dirt Error:
Sometimes, dirt particles can enter the inspection room through the door and the
windows. These particles can create small dirt errors at the time of measurement.
These errors can be reduced by making the laboratories dust-proof.
Types of Error

7. Contact Error:
• Consider a ring, as shown in the figure, whose thickness is to be measured.
• Often, the contact of the jaws with the workpiece plays an important
role while measuring in the laboratory or workshop.
• The following example shows the contact error: if the jaws of the instrument
are placed as shown in the figure, the error ‘e’ is developed, which is due to
poor contact only.
Types of Error

8. Parallax Error (Reading Error):


• The position of the observer at the time of taking a reading (on a scale)
can create errors in measurement.
• Two positions of the observer (X and Y) are shown; these are the
error-generating positions. Position Z shows the correct position of the
observer, i.e. readings should be taken with the eye positioned exactly
perpendicular to the scale
Methods of Measurement

• Direct method
• In this method, the quantity to be measured is directly compared with
the primary or secondary standard.
• This method is widely employed in production field.
• In this method, a very slight difference exists between the actual and the
measured values because of the limitation of the human being
performing the measurement.

• Indirect method
• In this method, the value of quantity is obtained by measuring other
quantities that are functionally related to the required value.
• Measurement of the quantity is carried out directly and then the value is
determined by using a mathematical relationship.
• Eg: angle measurement using sine bar
Methods of Measurement

• Fundamental or absolute method
• In this method, the measurement is based on the measurements of base
quantities used to define the quantity.
• The quantity under consideration is directly measured and is then linked with the
definition of that quantity.
• Comparative method
• The quantity to be measured is compared with the known value of the
same quantity or any other quantity practically related to it.
• The quantity is compared with the master gauge and only the deviations from the
master gauge are recorded after comparison.
• Eg. Dial indicators
Methods of Measurement

• Transposition method
• This method involves making the measurement by direct comparison,
wherein the quantity to be measured (V) is initially balanced by
a known value (X) of the same quantity. V is then transposed
(moved to the other side of the balance) and balanced again by
another known value (Y). The value of the quantity is then
• V = √(X·Y)
• Eg. Determination of mass by balancing methods
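The balanced result of the transposition method is the geometric mean of the two known values, V = √(X·Y). A minimal sketch (the balancing values are illustrative):

```python
import math

def transposition_value(x, y):
    """Transposition method: the unknown V balances known value X, is then
    transposed, and balances known value Y; the result is V = sqrt(X * Y)."""
    return math.sqrt(x * y)

# Illustrative balancing values for a mass measurement:
print(transposition_value(4.0, 4.9))  # geometric mean of 4.0 and 4.9
```

Note that taking the geometric mean cancels any constant ratio error in the balance arms, which is the point of transposing.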
• Coincidence methods
• In this method, a very minute difference between the quantity
to be measured and the reference is determined by careful
observation of certain lines and signals
• Eg: Vernier caliper
Methods of Measurement

• Deflection method
• This method involves the indication of the value of the quantity to be
measured by deflection of a pointer on a calibrated scale.
• Eg. Pressure measurement

• Null measurement method
• In this method, the difference between the value of the quantity to be
measured and the known value of the same quantity with which
comparison is to be made is brought to zero.

• Substitution method
• This method involves the replacement of the value of the quantity to be
measured with a known value of the same quantity, so selected that the
effects produced in the indicating device by these two values are the same.
Methods of Measurement

• Contact method
• In this method, the surface to be measured is touched by the sensor or
measuring tip of the instrument.
• Eg. Micrometer, Vernier calliper and dial indicator
• Contactless method
• As the name indicates, there is no direct contact
with the surface to be measured
• Eg. Tool makers microscope, profile projector
• Composite method
• The actual contour of a component to be
checked is compared with its maximum and minimum tolerance
limits.
• Cumulative errors of the interconnected elements of the component which are
controlled through a combined tolerance can be checked by this method.
Methods of least square

•The least squares method is a statistical procedure to find the best fit for a
set of data points by minimizing the sum of the squared offsets, or residuals, of
the points from the plotted curve.

•Least squares regression is used to predict the behavior of dependent
variables.

Example of the Least Squares Method


An example of the least squares method is an analyst who wishes to test the
relationship between a company’s stock returns, and the returns of the index for
which the stock is a component. In this example, the analyst seeks to test the
dependence of the stock returns on the index returns. To achieve this, all of the
returns are plotted on a chart. The index returns are then designated as the
independent variable, and the stock returns are the dependent variable. The line of
best fit provides the analyst with coefficients explaining the level of dependence.

The Line of Best Fit Equation

The line of best fit determined from the least squares method has an equation
that tells the story of the relationship between the data points. Line-of-best-fit
equations may be determined by computer software models, which include a
summary of outputs for analysis, where the coefficients and summary outputs
explain the dependence of the variables being tested.
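The slope and intercept of the line of best fit follow from the standard least-squares formulas m = Σ(x − x̄)(y − ȳ) / Σ(x − x̄)² and c = ȳ − m·x̄. A minimal sketch (the index/stock return data are illustrative):

```python
def least_squares_line(xs, ys):
    """Slope and intercept of the least-squares line of best fit, found by
    minimizing the sum of squared residuals."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    sxx = sum((x - x_bar) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = y_bar - slope * x_bar
    return slope, intercept

# Illustrative data: index returns (x) against stock returns (y).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.2, 1.9, 3.2, 3.8, 5.1]
m, c = least_squares_line(xs, ys)
print(f"y = {m:.3f}x + {c:.3f}")  # the fitted coefficients
```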
Introduction to Coordinate Measuring Machine (CMM)

• A coordinate measuring machine (CMM) is a device that measures
the geometry of physical objects by sensing discrete points on the
surface of the object with a probe.

• Various types of probes are used in CMMs, including mechanical,
optical, laser, and white light.

• Depending on the machine, the probe position may be manually
controlled by an operator or it may be computer controlled.

• CMMs typically specify a probe's position in terms of its displacement
from a reference position in a three-dimensional Cartesian coordinate
system (i.e., with XYZ axes).

• In addition to moving the probe along the X, Y, and Z axes, many
machines also allow the probe angle to be controlled to allow
measurement of surfaces that would otherwise be unreachable.

The typical 3D "bridge" CMM allows probe movement along three axes, X, Y
and Z, which are orthogonal to each other in a three-dimensional Cartesian
coordinate system.

Each axis has a sensor that monitors the position of the probe on that axis,
typically with micrometer precision.

When the probe contacts (or otherwise detects) a particular location on the
object, the machine samples the three position sensors, thus measuring the
location of one point on the object's surface.

This process is repeated as necessary, moving the probe each time, to produce
a "point cloud" which describes the surface areas of interest.
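The touch-and-record cycle described above can be sketched in a few lines. Everything here is a simulation, not a real CMM interface: probe_point stands in for a touch-trigger reading, and the "flatness" figure is just the peak-to-valley spread of the sampled points:

```python
import random

def probe_point(x, y):
    """Stand-in for a real CMM touch trigger: returns the surface height z at
    nominal (x, y).  Here it simulates a nearly flat plate with small errors."""
    return 0.002 * random.uniform(-1, 1)  # mm of simulated surface deviation

def scan_grid(nx, ny, pitch_mm):
    """Move the probe over an nx-by-ny grid, recording one (x, y, z) point
    per touch to build a point cloud."""
    cloud = []
    for i in range(nx):
        for j in range(ny):
            x, y = i * pitch_mm, j * pitch_mm
            cloud.append((x, y, probe_point(x, y)))
    return cloud

cloud = scan_grid(5, 5, 10.0)  # 25 points at 10 mm pitch
zs = [p[2] for p in cloud]
flatness = max(zs) - min(zs)   # crude peak-to-valley flatness estimate
print(f"{len(cloud)} points, flatness ~ {flatness:.4f} mm")
```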
