

Sensors and Transducers

1st Lecture APHY 101


Prof. N.C. Altoveros Physics Division, IMSP UPLB

Transducer: a device which transforms energy from one form to another. In instrumentation, it is more appropriately defined as a device used in the measurement and control of physical properties in laboratories and industrial plants.

Sensor and detector: two words more or less synonymous with transducer. Strictly, a sensor or detector is only one part of a transducer.

Example: A transducer might be composed of a platinum resistance sensor and an associated bridge circuit.

Example: A simple oven in which the input signal from the temperature sensor is compared with a set-point voltage. The op-amp output is then used to control a heater or a silicon controlled rectifier (SCR).

Standards and Traceability


The basis of accepted measurement is the Système International d'unités (SI). The SI is regularly reviewed and refined as new techniques for measurement are developed.

The SI base units

Length (meter): the distance traveled by light in vacuum in 1/299 792 458 second.

The SI base units


Mass (kilogram): the mass of a platinum-iridium cylinder kept at the International Bureau of Weights and Measures (BIPM) in Sèvres, near Paris.

The SI base units


Time (second): the time taken for 9 192 631 770 oscillations of microwave radiation from the isotope cesium-133.

The SI base units


Temperature (kelvin): defined such that the temperature difference between absolute zero and the triple point of water is 273.16 kelvins (K).

The SI base units


Electric current (ampere): the current flowing through two infinitely long, thin, parallel wires one meter apart which produces a force of 2 x 10^-7 newtons per meter length of the wire.

The SI base units


Luminous intensity (candela): the intensity of visible light emitted by a blackbody of area 1/600 000 m² at the temperature of freezing platinum.

Base & Some Derived Units


Base units:
- Length (meter)
- Mass (kilogram)
- Time (second)
- Temperature (kelvin)
- Electric current (ampere)
- Luminous intensity (candela)
- Amount of substance (mole)

Some derived units:
- Frequency (hertz)
- Emf (volt)
- Resistance (ohm)
- Luminous flux (lumen)
- Density (kg m^-3)
- Viscosity (Pa s)
- Power (watt)
- Energy (joule)

Traceability
National standard laboratory -> Approved calibration laboratory -> Manufacturer's calibration -> Reference instruments -> Transducer -> Variable to be measured

Time and Frequency Standards


The most accurately determined, and probably the most important, single standard is time (or frequency). The second is defined as 9 192 631 770 oscillations of the Cs-133 atomic clock.

Cesium atomic clock

Electrical Standards
Although the ampere is the SI base unit for electrical quantities, it is difficult to measure the force between conductors to much better than 1 part in 10^5.
National standard laboratories therefore maintain primary standards of the volt and the ohm (related to the ampere by the equation V = IR and by power P = IV).

Voltage
The traditional standard of emf or voltage was the Weston standard cell, a cadmium-mercury electrochemical cell with a nominal emf of 1.018636 volts.

Voltage
The international standard was redefined in terms of the Josephson effect.

Josephson effect
The voltage steps depend only on the frequency f of the applied radiation and are given by

V = n h f / (2e),  n = 1, 2, 3, ...

where h is Planck's constant and e the elementary charge.

Josephson junction array chip developed by NIST as a standard volt

Josephson effect
V = n h f / (2e)

The Josephson effect provides an exactly reproducible conversion between frequency and voltage. Since the frequency is already defined precisely and practically by the cesium standard, the Josephson effect is used, for most practical purposes, to give the definition of a volt.

Josephson effect
V = n h f / (2e)
This definition has two advantages.

First, it is the same for any two superconducting materials, and does not depend on the purity and preparation of the materials.

Josephson effect
V = n h f / (2e)
Second, the only other measurement involved is one of frequency, which is the most accurately measurable base unit.

Josephson effect
The Josephson constant KJ is defined as KJ = 2e/h = 483.5979 x 10^12 Hz V^-1, so the voltage becomes

V = n f / KJ

Progress check
Q. What is the first voltage step given by a Josephson junction if the incident microwave radiation has a frequency of 10 GHz? A.

Progress check
Q. What is the first voltage step given by a Josephson junction if the incident microwave radiation has a frequency of 10 GHz? A. V = (1)(10.0 x 10^9 Hz)/(483.5979 x 10^12 Hz/V) = 20.678 x 10^-6 V = 20.678 μV
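The frequency-to-voltage conversion can be checked numerically. A minimal Python sketch, using the conventional value KJ = 483.5979 x 10^12 Hz/V from the slide (the function name is ours, chosen for illustration):

```python
# Josephson voltage steps: V = n * f / K_J
K_J = 483.5979e12  # Josephson constant, Hz/V (conventional value)

def josephson_voltage(n, f_hz):
    """Voltage of the n-th step for applied microwave frequency f_hz."""
    return n * f_hz / K_J

v1 = josephson_voltage(1, 10.0e9)  # first step at 10 GHz
print(f"V = {v1 * 1e6:.3f} uV")    # about 20.678 uV
```

Note the answer is tens of microvolts; practical standards chain thousands of junctions in series to reach the 1 V or 10 V level.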

Voltage standards
Practical laboratory voltage standards of fair accuracy are available from semiconductor technology. The simple Zener diode has a reverse breakdown voltage Vz which varies by about 0.1% with temperature (over 20-70 °C) and about 1% with load current.

Voltage standards
The preferred device is the band-gap reference, which has a very low temperature coefficient and good line and load stability.
A band-gap voltage reference is a temperature-independent voltage reference circuit widely used in integrated circuits, usually with an output voltage around 1.25 V, close to the theoretical 1.22 eV band gap of silicon at 0 K.

Voltage standards
Typical specs of a band-gap reference diode:
- Nominal voltage: 10.240 V
- Tolerance: 0.03% (300 ppm)
- Load stability: 0.01% (I < 5 mA)
- Line stability: 0.02%
- Aging: less than 0.2% per year
- Temperature coefficient: 0.004% °C^-1
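As an illustration, the spec figures above can be combined into a rough worst-case error budget. The combination (a simple sum of terms over an assumed 10 °C excursion and one year of aging) is our own illustrative assumption, not a datasheet method:

```python
# Rough worst-case error budget for the example band-gap reference above.
# The spec values come from the slide; summing them is an illustrative choice.
V_NOM = 10.240      # nominal output, V
TOLERANCE = 300e-6  # initial tolerance, 0.03% = 300 ppm
TEMPCO = 0.004e-2   # temperature coefficient, 0.004% per degree C
LOAD = 0.01e-2      # load stability, 0.01%
LINE = 0.02e-2      # line stability, 0.02%
AGING = 0.2e-2      # aging, 0.2% per year

def worst_case_error(delta_t_c, years):
    """Sum of the fractional error terms, scaled to volts."""
    fraction = TOLERANCE + TEMPCO * delta_t_c + LOAD + LINE + AGING * years
    return V_NOM * fraction

print(f"{worst_case_error(10, 1) * 1e3:.2f} mV")  # about 30.72 mV
```

Note how the aging term dominates; for metrology-grade work the reference would be recalibrated against a Josephson standard periodically.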

Resistance
Resistance standards are traditionally based on the measured resistance of metal resistors of high purity and exact dimensions.

Resistance
A more recent standard uses the quantized Hall effect, for which Klaus von Klitzing was awarded the 1985 Nobel Prize in Physics.
The Hall effect is a DC voltage generated when a magnetic field B is applied at right angles to a conducting strip.

Resistance
The Hall voltage VH is measured at right angles to both B and current I.

Resistance

The Hall resistance RH is defined as:

RH = VH / I

Resistance
To get the quantum Hall effect, the conductor is a very thin (about 0.1 μm thick) layer of n-type semiconductor in the channel of a field-effect transistor (FET). A high magnetic field (~1 T) is applied at right angles to the layer, and the whole apparatus must be maintained at liquid-helium temperatures (less than 4.2 K).

Resistance
The Hall resistance then shows a series of steps at values:

RH = (1/n) (h / e^2),  n = 1, 2, 3, ...

as the field is increased.
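The plateau values can be computed directly from the SI values of h and e. A minimal sketch (the function name is ours):

```python
# Quantized Hall resistance plateaus: R_H = h / (n e^2)
h = 6.62607015e-34   # Planck constant, J s (exact in the 2019 SI)
e = 1.602176634e-19  # elementary charge, C (exact in the 2019 SI)

def hall_resistance(n):
    """Resistance of the n-th quantum Hall plateau, in ohms."""
    return h / (n * e**2)

for n in (1, 2, 4):
    print(f"n={n}: R_H = {hall_resistance(n):.3f} ohm")
# n=1 gives the von Klitzing constant, about 25812.807 ohm
```

Because h and e are now exact by definition, the n = 1 plateau gives an exactly known resistance, which is why the quantum Hall effect makes such a good standard.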

Resistance
Practical laboratory resistance standards are precision wire-wound resistors made from special alloys with very low temperature coefficients of resistance.

Resistance
An example specification of a precision wire-wound resistor:
- Temperature coefficient: 5 ppm/°C
- Aging: 35 ppm/year
- Thermal emf: 0.2 μV/°C

QUIZ #1
1. What are the six fundamental quantities used in physics? Give the units and standards used for measuring them (note: a one-sentence description suffices). 12 points.

2. Discuss, using an illustration, the concept of Hall resistance. 8 points.

Instrument Properties

- Resolution
- Sensitivity
- Range
- Calibration
- Precision
- Accuracy
- Traceability
- Linearity
- Offset
- Hysteresis
- Drift
- Noise
- Cross sensitivity
- Response time

Instrument Properties
Resolution: defined as the smallest amount of the quantity being measured which the sensor will detect.

Instrument Properties
Sensitivity: a measure of how much the output signal varies with respect to the input signal. The sensor will produce some output O which varies as the input I changes.

S = (change in output ΔO) / (change in input ΔI)

ΔO = S ΔI
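A minimal sketch of computing sensitivity from two calibration points; the numbers reuse the thermocouple progress check that appears later in the deck, and the function name is ours:

```python
# Static sensitivity from two calibration points: S = dO / dI
def sensitivity(i1, o1, i2, o2):
    """Change in output per unit change in input."""
    return (o2 - o1) / (i2 - i1)

# thermocouple: 0 mV at 27 C, 43 mV at 100 C
S = sensitivity(27.0, 0.0, 100.0, 43.0)
print(f"S = {S:.2f} mV/C")  # about 0.59 mV/C
```

For a nonlinear sensor the two-point slope is only an average; the local sensitivity is the derivative dO/dI at the operating point.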

Instrument Properties
Range: the region of operation where the relationship between the output and the input is known and is predictable.

Instrument Properties
Calibration is the process of comparing a measurement with known values of the physical variable.

Instrument Properties
Precision (reproducibility): a measure of the nearness of the measured values to each other; a measure of repeatability. It relates to the uncertainty in measurement and, for normal data distributions, can be expressed through the standard deviation.
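For example, the standard deviation of a set of repeated readings (hypothetical values) can be computed with Python's statistics module:

```python
# Precision expressed as the sample standard deviation of repeated readings
import statistics

readings = [10.1, 10.3, 9.9, 10.2, 10.0]  # hypothetical repeated measurements
mean = statistics.mean(readings)
sd = statistics.stdev(readings)           # sample standard deviation
print(f"mean = {mean:.2f}, s = {sd:.3f}")
```

A small standard deviation indicates high precision; it says nothing about accuracy, since all readings could share a common offset from the true value.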

Instrument Properties
Accuracy: the nearness of a measured value to a standard or true value. It relates to the discrepancy in measurement and can be defined in terms of the fractional error.

Instrument Properties
Accuracy vs. Precision

Instrument Properties
Traceability is a term used to describe the chain of calibrations that links the assigned value of any component to the national standard.

Instrument Properties
Linearity determines the degree to which the output is proportional to the input. Deviations from linearity can be expressed as either proportional or independent.

Instrument Properties
Proportional or Independent Linearity

Instrument Properties
Offset or zero error: a finite output for a zero input; the instrument provides a reading when it is not measuring anything.

Instrument Properties
Hysteresis: the output depends on whether the input is increasing or decreasing; the reading obtained while the parameter is increasing differs from the reading obtained while it is decreasing. It can be eliminated by averaging the outputs for ascending and descending values of the input.
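The averaging correction can be sketched as follows; the sweep values and the function name are hypothetical, and the descending sweep is assumed to revisit the same input values in reverse order:

```python
# Hysteresis correction: average the ascending and descending readings
def average_hysteresis(ascending, descending):
    """Average output over ascending and descending sweeps, point by point.
    descending is assumed recorded at the same inputs, in reverse order."""
    return [(a + d) / 2 for a, d in zip(ascending, reversed(descending))]

# hypothetical readings over the same five input values
up   = [0.0, 1.1, 2.3, 3.4, 4.5]   # input increasing
down = [4.5, 3.6, 2.5, 1.3, 0.2]   # input decreasing
print(average_hysteresis(up, down))
```

The width of the gap between the two sweeps at each point is itself a useful figure of merit for the sensor.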

Instrument Properties
Drift describes the property of some instruments where the output changes with time. It is a problem common to electronic instrumentation.

Instrument Properties
Noise is defined as a signal originating in the measuring system which varies randomly with time and is superimposed on the output signal.

Instrument Properties
The relationship between signal and noise can be expressed as an amplitude ratio

S/N = (average signal) / (rms noise)

or as a power ratio

S/N = (signal power) / (noise power)
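Both forms are easy to evaluate numerically. A minimal sketch with a hypothetical amplifier (1.2 V signal, 24 μV rms noise; the function name is ours):

```python
# Signal-to-noise ratio as an amplitude ratio and in decibels
import math

def snr_db(signal_amplitude, rms_noise):
    """SNR in dB from an amplitude ratio: 20 log10(S/N)."""
    return 20.0 * math.log10(signal_amplitude / rms_noise)

ratio = 24e-6 / 1.2               # noise as a fraction of signal: 20 ppm
print(ratio, snr_db(1.2, 24e-6))  # about 94 dB
```

The factor 20 (rather than 10) appears because power goes as the square of amplitude, so a power ratio in dB is 10 log10(S/N)^2 = 20 log10(S/N).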

Instrument Properties
Cross Sensitivity

a response of the sensor to other inputs from the system. For instance, a CO2 gas sensor might respond also to H2O vapor.

Instrument Properties
Response Time

Any sensor will need a finite time to respond to a sudden change in the measured property.

Progress check
Q. A thermocouple has an emf of zero at 27 °C and 43 mV at 100 °C. What is its sensitivity? A.

Progress check
Q. A thermocouple has an emf of zero at 27 °C and 43 mV at 100 °C. What is its sensitivity? A. S = 43 mV / (100 °C - 27 °C) = 0.59 mV/°C

Progress check
Q. The output of an instrumentation amplifier is 1.2 V and the rms noise is 24 μV. What is the noise (i) as an amplitude ratio (ii) in dB?

Progress check
Q. The output of an instrumentation amplifier is 1.2 V and the rms noise is 24 μV. What is the noise (i) as an amplitude ratio (ii) in dB?

A.

(i) 24 x 10^-6 / 1.2 = 0.000 02 (20 ppm)
(ii) 20 log10(1.2 / 24 x 10^-6) = 94 dB (signal-to-noise ratio)

Thank You
