
Biometrics

Dr. Saravanan Chandran
Associate Professor
Department of Computer Science and Engineering, NIT Durgapur
cs@cse.nitdgp.ac.in
+91-94347-88036

Introduction
• Biometrics are automated methods of recognizing a person based on a
  • Physiological characteristic (hand, finger, palmprint, face, iris, vein, etc.)
    • a method for uniquely recognizing an individual using his or her intrinsic physical traits
  • Behavioral characteristic (signature, keystroke dynamics, voice, gait, mouse movement, etc.)
    • a measure of uniquely identifying and measurable patterns in human activities
• Biometrics applications are deployed at airports, offices, banking, colleges, law enforcement, health services, etc.
• Example Video: https://youtu.be/5glvuLUr06w

History
• Cataloguing of fingerprints dates back to 1881, when Juan Vucetich started a collection of fingerprints of criminals in Argentina.[17]
• Josh Ellenbogen and Nitzan Lebovic argued that biometrics originated in the identification systems of criminal activity developed by Alphonse Bertillon (1853–1914) and by Francis Galton's theory of fingerprints and physiognomy.

Fingerprint


Iris and Retina

Hand Geometry

Handwriting

Facial Recognition


Vein

Nailbed


Palmprint

Others
• Body salinity
• Body odor – electronic nose
• Thermography
• DNA
• Ear shape
• Gait
• Sweat pores


Others
• Skin luminescence
• Brain wave pattern
• Footprint recognition
• Foot dynamics

Need
• Security breaches and transaction fraud are increasing.
• Biometric technologies are becoming the foundation of an extensive array of highly secure identification and personal verification solutions.
• Confidential financial transactions and personal data privacy.


Applications
• Biometric-based authentication applications include
  • workstation, network, and domain access
  • single sign-on, application logon
  • data protection
  • remote access to resources
  • transaction security and Web security

Distinct Stages of Biometrics
• Enrollment
  • Provide an identifier and a biometric sample
  • Factors are position, distance, pressure, environment
  • Features can change over time; re-enroll
• Verification
  • One-to-one matching – Attendance
• Identification
  • One-to-N matching – Law enforcement


Basic Image Operations
• Enhancement
• Filter
• Edge Detection
• Localisation
• Smoothing
• Sharpening
• Thresholding

Enhancement
• A process of improving the visual quality of images degraded by a non-ideal image acquisition process (e.g., poor illumination, coarse quantization, etc.)
• No reference (original) image is available for comparison
• The human vision system (HVS) is the judge.


Enhanced Image

Technique Types
• Point operations
• Histogram Equalization
• Unsharp masking
• Homomorphic filtering



Point operations
• Point operations are zero-memory operations where a given gray level x ∈ [0, L] is mapped to another gray level y ∈ [0, L] according to a transformation.
• Based only on the intensity of single pixels.
• Linear
• Applications – Contrast Enhancement / Feature Enhancement


Important point operations
• Image negatives
• Contrast stretching
• Gray-level slicing
• Bit-plane slicing

Image negatives
• The transformation is very simple:

  s = T(r)
  T(r) = (L - 1) - r

  where L is the number of gray levels.
• The result of this transformation is that low intensities are made high and vice versa.
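A minimal NumPy sketch of the negative transformation; the pixel values below are arbitrary example data, not from the slides:

    import numpy as np

    L = 256                                   # number of gray levels in an 8-bit image
    img = np.array([[0, 64, 128],
                    [192, 230, 255]], dtype=np.uint8)   # arbitrary example pixels
    negative = (L - 1) - img                  # s = T(r) = (L - 1) - r
    print(negative)                           # [[255 191 127]
                                              #  [ 63  25   0]]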



Contrast stretching
• Contrast stretching (often called normalization) attempts to improve the contrast in an image by "stretching" the range of intensity values to a desired range of values.

  Pout = (Pin - c)((b - a)/(d - c)) + a

• For 8-bit graylevel images the lower and upper limits of the desired range, a and b, might be 0 and 255; c and d are the lowest and highest pixel values currently present in the image.


Contrast stretching Example

Given the following 3 x 3 block from an 8-bit grayscale image, where the minimum pixel value in the image is 10 and the maximum pixel value is 200, calculate the contrast-stretched matrix.

  50 60 70        a = 0
  60 50 80        b = 255
  80 70 50        c = 10, d = 200

  Pout = (Pin - c)((b - a)/(d - c)) + a
  Pout = (50 - 10)((255 - 0)/(200 - 10)) + 0 = 53.68 ≈ 54
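Applying the same formula to every element completes the example; a short NumPy check of the worked numbers:

    import numpy as np

    block = np.array([[50, 60, 70],
                      [60, 50, 80],
                      [80, 70, 50]], dtype=np.float64)
    a, b = 0, 255        # desired output range
    c, d = 10, 200       # current minimum and maximum pixel values in the image
    stretched = np.round((block - c) * ((b - a) / (d - c)) + a).astype(np.uint8)
    print(stretched)     # [[54 67 81]
                         #  [67 54 94]
                         #  [94 81 54]]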



Gray-level slicing
• Give a high value for all the gray levels in the specified range and a very low value for all the other gray levels.
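A minimal sketch of gray-level slicing; the range [100, 180] and the output values 255 and 10 are illustrative choices, not values from the slides:

    import numpy as np

    img = np.array([[30, 120, 200],
                    [90, 150, 60]], dtype=np.uint8)
    low, high = 100, 180                          # gray-level range of interest (assumed)
    in_range = (img >= low) & (img <= high)
    sliced = np.where(in_range, 255, 10).astype(np.uint8)   # high value inside the range, low value elsewhere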

Bit-plane slicing
• The intensity of each pixel of an image is defined by several bits – the highest-order bits are dominant.
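A short sketch of bit-plane slicing for an 8-bit image; the pixel values are arbitrary:

    import numpy as np

    img = np.array([[137, 200],
                    [ 15,  64]], dtype=np.uint8)
    planes = [(img >> k) & 1 for k in range(8)]   # planes[0] = least significant, planes[7] = most significant
    msb = planes[7] * 255                         # highest-order (dominant) bit plane, scaled for display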


Histogram Equalisation
• A histogram with a small spread has low contrast
• A histogram with a wide spread has high contrast
• An image with its histogram clustered at the low end of the range corresponds to a dark image

Histogram Equalisation steps
1. Find the running sum of the histogram values
2. Normalize the values from step 1 by dividing by the total number of pixels
3. Multiply the values from step 2 by the maximum gray level value and round to the closest integer
4. Map the gray level values to the results from step 3 using a one-to-one correspondence


Histogram Example

Step 1: Find the running sum of the histogram values.

  Gray level value   Number of pixels (Histogram values)   Running Sum
  0                  10                                    10
  1                   8                                    10+8=18
  2                   9                                    10+8+9=27
  3                   2                                    29
  4                  14                                    43
  5                   1                                    44
  6                   5                                    49
  7                   2                                    51

Example continues ...

Step 2: Normalizing by dividing by the total number of pixels (51) we get 10/51, 18/51, 27/51, 29/51, 43/51, 44/51, 49/51, 51/51

Step 3: Multiplying by the maximum gray level value (7) and rounding we obtain 1, 2, 4, 4, 6, 6, 7, 7

Step 4: Map the original gray level values to the results from step 3
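The four steps can be reproduced in a few lines of NumPy; the histogram below is the one from the table above:

    import numpy as np

    hist = np.array([10, 8, 9, 2, 14, 1, 5, 2])     # pixels per gray level 0..7 (from the table)
    running_sum = np.cumsum(hist)                    # step 1: 10, 18, 27, 29, 43, 44, 49, 51
    normalized = running_sum / hist.sum()            # step 2: divide by the total number of pixels (51)
    mapping = np.round(normalized * 7).astype(int)   # step 3: scale by the maximum gray level (7) and round
    print(mapping)                                   # step 4 lookup table: [1 2 4 4 6 6 7 7]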

Histogram equalisation

Unsharp masking
• Combine histogram modification and filtering operations.
• Input image -> filter -> histogram modification -> output image
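The slides do not spell out the unsharp masking computation itself; the classic formulation subtracts a blurred copy from the original and adds the resulting high-frequency detail back. A minimal sketch under that assumption, with illustrative parameter values:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def unsharp_mask(img, sigma=2.0, amount=1.0):
        # sigma and amount are illustrative parameters, not values from the slides
        img = img.astype(np.float64)
        blurred = gaussian_filter(img, sigma=sigma)   # low-pass (smoothed) copy
        detail = img - blurred                        # high-frequency mask
        return np.clip(img + amount * detail, 0, 255).astype(np.uint8)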



Homomorphic filter
• Simultaneously normalizes the brightness across an image and increases contrast.
• To make the illumination of an image more even, the high-frequency components are increased and the low-frequency components are decreased.
• The high-frequency components are assumed to represent mostly the reflectance in the scene (the amount of light reflected off the objects in the scene), whereas the low-frequency components are assumed to represent mostly the illumination in the scene.
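A minimal sketch of a homomorphic filter built from these ideas, assuming a Gaussian-shaped high-frequency emphasis filter; gamma_low, gamma_high, and cutoff are hypothetical parameter names with illustrative defaults:

    import numpy as np

    def homomorphic_filter(img, gamma_low=0.5, gamma_high=2.0, cutoff=30.0):
        img = img.astype(np.float64)
        log_img = np.log1p(img)                          # log domain: illumination x reflectance becomes a sum
        F = np.fft.fftshift(np.fft.fft2(log_img))

        rows, cols = img.shape
        u = np.arange(rows) - rows / 2
        v = np.arange(cols) - cols / 2
        V, U = np.meshgrid(v, u)
        D2 = U ** 2 + V ** 2
        # boost high frequencies (reflectance), attenuate low frequencies (illumination)
        H = (gamma_high - gamma_low) * (1 - np.exp(-D2 / (2 * cutoff ** 2))) + gamma_low

        filtered = np.fft.ifft2(np.fft.ifftshift(F * H)).real
        return np.expm1(filtered)                        # back from the log domain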


Frequency Domain Methods
• Compute the Fourier Transform of the image to be enhanced.
• Multiply the result by a filter.
• Take the inverse transform to produce the enhanced image.
• Will diminish camera noise, spurious pixel values, missing pixel values, etc.
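A minimal sketch of the three steps, using a Gaussian low-pass filter as the example filter (the cutoff value is illustrative):

    import numpy as np

    def frequency_domain_filter(img, cutoff=40.0):
        F = np.fft.fftshift(np.fft.fft2(img.astype(np.float64)))    # step 1: Fourier transform
        rows, cols = img.shape
        u = np.arange(rows) - rows / 2
        v = np.arange(cols) - cols / 2
        V, U = np.meshgrid(v, u)
        H = np.exp(-(U ** 2 + V ** 2) / (2 * cutoff ** 2))          # Gaussian low-pass filter
        G = F * H                                                   # step 2: multiply by the filter
        return np.fft.ifft2(np.fft.ifftshift(G)).real               # step 3: inverse transform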



Neighbourhood Averaging
• Smoothed image F(x,y) = average pixel value in a neighbourhood of I(x,y)
• For example, with a 3 x 3 neighbourhood:
  • Each pixel value is multiplied by 1/9
  • The sum of the 9 pixel values is the output
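A short sketch of 3 x 3 neighbourhood averaging; the input array is arbitrary example data:

    import numpy as np
    from scipy.signal import convolve2d

    img = np.array([[10, 10, 10, 10],
                    [10, 90, 90, 10],
                    [10, 90, 90, 10],
                    [10, 10, 10, 10]], dtype=np.float64)
    kernel = np.ones((3, 3)) / 9.0                       # each neighbour is weighted by 1/9
    smoothed = convolve2d(img, kernel, mode='same', boundary='symm')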



Edge Preserving
• Also called Median Filtering
• Takes the median of the neighbourhood pixel values
• The result is more like its neighbours
• Edges are preserved
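A minimal sketch of median filtering with SciPy; the 255 value plays the role of an impulse-noise spike:

    import numpy as np
    from scipy.ndimage import median_filter

    noisy = np.array([[10, 10, 10, 10],
                      [10, 255, 12, 10],    # isolated spike (impulse noise)
                      [10, 11, 10, 10],
                      [10, 10, 10, 10]], dtype=np.uint8)
    denoised = median_filter(noisy, size=3)  # the spike is replaced by the neighbourhood median; edges stay sharp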


Image Sharpening
• Human perception is highly sensitive to edges and fine details of an image, and since they are composed primarily of high-frequency components, the visual quality of an image can be enormously degraded if the high frequencies are attenuated or completely removed.



Sharpening
• In contrast, enhancing the high-frequency components of an image leads to an improvement in the visual quality.
• Image sharpening is widely used in the printing and photographic industries for increasing the local contrast and sharpening the images.
• Enhance detail that has been blurred.
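The slides do not name a specific sharpening filter; one common way to emphasize high-frequency components is to subtract a Laplacian response from the image. A minimal sketch under that assumption:

    import numpy as np
    from scipy.signal import convolve2d

    laplacian = np.array([[0,  1, 0],
                          [1, -4, 1],
                          [0,  1, 0]], dtype=np.float64)

    def sharpen(img, strength=1.0):
        # strength is an illustrative parameter controlling how much detail is added back
        img = img.astype(np.float64)
        high_freq = convolve2d(img, laplacian, mode='same', boundary='symm')
        return np.clip(img - strength * high_freq, 0, 255).astype(np.uint8)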


Edge Detection
• Identifying and locating sharp discontinuities in an image.
• Discontinuities are abrupt changes in pixel intensity.
• Sobel operator / filter
• Canny edge operator



Sobel Operator
• 3 x 3 convolution kernels convolved with the original image.
• * denotes the convolution operation

  Gx = [ -1  0  +1 ]          Gy = [ +1  +2  +1 ]
       [ -2  0  +2 ] * A           [  0   0   0 ] * A
       [ -1  0  +1 ]               [ -1  -2  -1 ]

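A short NumPy/SciPy sketch applying the two kernels; A is an arbitrary example image containing a vertical edge:

    import numpy as np
    from scipy.signal import convolve2d

    Gx_kernel = np.array([[-1, 0, 1],
                          [-2, 0, 2],
                          [-1, 0, 1]], dtype=np.float64)
    Gy_kernel = np.array([[ 1,  2,  1],
                          [ 0,  0,  0],
                          [-1, -2, -1]], dtype=np.float64)

    A = np.array([[10, 10, 80, 80],
                  [10, 10, 80, 80],
                  [10, 10, 80, 80],
                  [10, 10, 80, 80]], dtype=np.float64)

    Gx = convolve2d(A, Gx_kernel, mode='same', boundary='symm')
    Gy = convolve2d(A, Gy_kernel, mode='same', boundary='symm')
    magnitude = np.hypot(Gx, Gy)            # strong response along the vertical edge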

Canny Edge Operator
1. Smoothing: Blurring of the image to remove noise.
2. Finding gradients: The edges should be marked where the gradients of the image have large magnitudes.
3. Non-maximum suppression: Only local maxima should be marked as edges.
4. Thresholding: Potential edges are determined by thresholding.
5. Edge tracking by hysteresis: Final edges are determined by suppressing all edges that are not connected to a very certain (strong) edge.
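All five steps are available through OpenCV's Canny function; a minimal usage sketch (the file name and threshold values are illustrative assumptions):

    import cv2

    img = cv2.imread('fingerprint.png', cv2.IMREAD_GRAYSCALE)   # hypothetical input file
    blurred = cv2.GaussianBlur(img, (5, 5), 1.4)                # step 1: Gaussian smoothing
    edges = cv2.Canny(blurred, threshold1=50, threshold2=150)   # steps 2-5: gradients, non-maximum
                                                                # suppression, double thresholding, hysteresis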



Smoothing
The image is first smoothed by applying a Gaussian filter, for example the 5 x 5 kernel

  B = (1/159) * [  2   4   5   4   2 ]
                [  4   9  12   9   4 ]
                [  5  12  15  12   5 ]
                [  4   9  12   9   4 ]
                [  2   4   5   4   2 ]

Finding gradients
• Finds edges where the grayscale intensity of the image changes the most.
• These areas are found by determining gradients of the image.
• Gradients at each pixel in the smoothed image are determined by applying the Sobel operator.

Non-maximum suppression
• Convert the blurred edges in the image of the gradient magnitudes to "sharp" edges.
• This is done by preserving all local maxima in the gradient image, and deleting everything else.
• Example gradient magnitudes:

  2 3 5 4
  4 5 7 6
  5 6 4 3
  3 4 3 1
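A minimal sketch of non-maximum suppression, assuming the gradient magnitude and direction (in degrees, e.g. np.degrees(np.arctan2(Gy, Gx)) from the Sobel responses) have already been computed:

    import numpy as np

    def non_max_suppression(mag, angle_deg):
        # Keep a pixel only if it is a local maximum along its gradient direction.
        h, w = mag.shape
        out = np.zeros_like(mag)
        angle = angle_deg % 180                      # quantize direction to 0, 45, 90, 135 degrees
        for i in range(1, h - 1):
            for j in range(1, w - 1):
                a = angle[i, j]
                if a < 22.5 or a >= 157.5:           # ~0 deg: compare left/right neighbours
                    nbrs = (mag[i, j - 1], mag[i, j + 1])
                elif a < 67.5:                       # ~45 deg
                    nbrs = (mag[i - 1, j + 1], mag[i + 1, j - 1])
                elif a < 112.5:                      # ~90 deg: compare up/down neighbours
                    nbrs = (mag[i - 1, j], mag[i + 1, j])
                else:                                # ~135 deg
                    nbrs = (mag[i - 1, j - 1], mag[i + 1, j + 1])
                if mag[i, j] >= max(nbrs):
                    out[i, j] = mag[i, j]            # local maximum: keep it
        return out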


Thresholding
• The edge pixels remaining after non-maximum suppression will probably be true edges in the image.
• But some may be caused by noise or color variations.
• A way to discern between these is to use a threshold.
• The Canny edge detection algorithm uses double thresholding: strong and weak.


Edge Tracking by Hysteresis
• Strong edges are included in the final edge image.
• Weak edges are included if and only if they are connected to strong edges.
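A minimal sketch covering both the double thresholding and the tracking step; the threshold values are illustrative, and connected regions are found with scipy.ndimage.label:

    import numpy as np
    from scipy.ndimage import label

    def hysteresis(mag, low=50, high=150):
        strong = mag >= high                        # definitely edges
        weak = (mag >= low) & (mag < high)          # edges only if connected to a strong edge
        labels, n = label(strong | weak)            # connected components of all candidate pixels
        keep = np.zeros(n + 1, dtype=bool)
        keep[np.unique(labels[strong])] = True      # components containing at least one strong pixel
        keep[0] = False                             # label 0 is the background
        return keep[labels]                         # final boolean edge map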

