NEURAL NETWORK BASED FACE RECOGNITION

BATCH MEMBERS: K.Archana, K.Aruna, T.C.Indhumathi

Guide: Mrs. Kanimozhi

INTRODUCTION
A neural network is an information processing system based on a mathematical model. The connection between the artificial network and its biological counterpart is also investigated and explained. The aim of the project is to recognize a face from a database; the neural network is trained on a set of different face images.

Block Diagram
[Block diagram, training phase: the different set of face images is passed through the hit-miss algorithm and then through the artificial neural network (input neurons, hidden layer, output neurons) to produce the trained face images.]

Block Diagram Explanations


Scan the different set of face images from the database.
Resize each image to 30x30 pixels.
Form a neural network with an input layer of 900 nodes, a hidden layer of 10 nodes, and an output layer of 4 nodes.
Give all 40 database images to the neural network in the training phase (see the sketch below).
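A minimal sketch of these steps in Python, assuming NumPy and Pillow; the 900-10-4 layer sizes and the 40 training images come from the slides, while the helper names, sigmoid activation, and random initialisation are illustrative assumptions.

    import numpy as np
    from PIL import Image  # assumed for resizing; any image library would do

    N_INPUT, N_HIDDEN, N_OUTPUT = 900, 10, 4   # 30x30 pixels -> 900 inputs, 10 hidden, 4 outputs

    rng = np.random.default_rng(0)
    W1 = rng.normal(scale=0.1, size=(N_HIDDEN, N_INPUT))   # input -> hidden weights
    b1 = np.zeros(N_HIDDEN)
    W2 = rng.normal(scale=0.1, size=(N_OUTPUT, N_HIDDEN))  # hidden -> output weights
    b2 = np.zeros(N_OUTPUT)

    def preprocess(path):
        """Scan one face image, resize it to 30x30, and flatten it to a 900-element vector."""
        img = Image.open(path).convert("L").resize((30, 30))
        return np.asarray(img, dtype=float).ravel() / 255.0

    def forward(x):
        """One pass through the 900-10-4 network with sigmoid units."""
        sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))
        return sigmoid(W2 @ sigmoid(W1 @ x + b1) + b2)

    # Training phase: all 40 database images would be preprocessed and fed to the network,
    # e.g. X = np.stack([preprocess(p) for p in face_image_paths])  (face_image_paths is a placeholder list).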

HIT-MISS ALGORITHM

The hit-and-miss algorithm detects certain patterns in a binary image, using a structuring element containing 1's, 0's, and blanks for "don't care" positions. For example, four structuring elements can detect corners of four different orientations; the four resulting images are then combined with a logical OR.
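As an illustration, SciPy's binary hit-or-miss transform can perform this kind of corner detection. The 3x3 hit/miss masks below (an upper-left corner detector, rotated four times) are a common textbook choice, not necessarily the exact structuring elements from the slide.

    import numpy as np
    from scipy import ndimage

    # Hits: pixels that must be foreground; misses: pixels that must be background.
    # Positions set in neither array are "don't care".
    hits   = np.array([[0, 0, 0],
                       [0, 1, 1],
                       [0, 1, 0]])
    misses = np.array([[0, 1, 0],
                       [1, 0, 0],
                       [0, 0, 0]])

    def detect_corners(binary_image):
        """OR together the hit-or-miss responses of the four rotated corner detectors."""
        result = np.zeros_like(binary_image, dtype=bool)
        for k in range(4):  # rotate the detector to cover all four corner orientations
            result |= ndimage.binary_hit_or_miss(binary_image,
                                                 structure1=np.rot90(hits, k),
                                                 structure2=np.rot90(misses, k))
        return result

    # Example: the four corners of a filled square are detected.
    square = np.zeros((9, 9), dtype=bool)
    square[2:7, 2:7] = True
    print(detect_corners(square).astype(int))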

ARTIFICIAL NEURON
An artificial neural network is a system based on the operation of biological neural networks.

Biological neural activity

Each neuron has a body, an axon, and many dendrites.
A neuron can be in one of two states: firing and rest.
A neuron fires if the total incoming stimulus exceeds its threshold.
A synapse is the thin gap between the axon of one neuron and a dendrite of another.

The artificial neural network


[Diagram: inputs x_i feed the hidden units z_1, z_2, z_3 through weights w_ik; the hidden units feed the output unit y_j through weights w_1j, w_2j, w_3j and the activation function f().]

Hidden layer:  z_k = f( Σ_i w_ik x_i + w_0k )
Output layer:  y_j = f( Σ_k w_kj z_k + w_0j ) = f( Σ_k w_kj f( Σ_i w_ik x_i + w_0k ) + w_0j )
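A small numeric check of this formula, assuming a sigmoid for f; the weights and inputs below are arbitrary example values.

    import numpy as np

    f = lambda v: 1.0 / (1.0 + np.exp(-v))      # sigmoid chosen as the activation f (an assumption)

    x  = np.array([0.5, -1.0, 2.0])             # inputs x_i
    W1 = np.array([[0.1, -0.2, 0.3],            # w_ik: one row per hidden unit k
                   [0.4,  0.0, -0.1]])
    b1 = np.array([0.05, -0.05])                # w_0k
    W2 = np.array([0.7, -0.3])                  # w_kj for a single output j
    b2 = 0.1                                    # w_0j

    z = f(W1 @ x + b1)                          # hidden activations z_k
    y = f(W2 @ z + b2)                          # y_j = f( Σ_k w_kj z_k + w_0j )
    print(z, y)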

Feed-forward nets

Information flow is unidirectional


Data is presented to the input layer.
Passed on to the hidden layer.
Passed on to the output layer.
Information is distributed.

Information processing is parallel

Perceptron

[Diagram: inputs x_0 = 1, x_1, ..., x_n enter the perceptron through weights w_0, w_1, ..., w_n; the unit computes the weighted sum Σ_{i=0..n} w_i x_i and thresholds it.]

o(x) =  1 if Σ_{i=0..n} w_i x_i > 0
       -1 otherwise
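A direct translation of this output rule into Python (NumPy assumed; the weights and inputs are arbitrary example values).

    import numpy as np

    def perceptron_output(w, x):
        """o(x) = 1 if Σ_i w_i * x_i > 0, else -1 (x includes the bias input x_0 = 1)."""
        return 1 if np.dot(w, x) > 0 else -1

    w = np.array([-0.3, 0.5, 0.5])        # w_0 (bias weight), w_1, w_2 -- example values
    x = np.array([1.0, 1.0, 0.0])         # x_0 = 1, x_1, x_2
    print(perceptron_output(w, x))        # -> 1, since -0.3 + 0.5 = 0.2 > 0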

Perceptron Learning Rule


w_i = w_i + Δw_i
Δw_i = η (t - o) x_i

t = c(x) is the target value.
o is the perceptron output.
η is a small constant (e.g. 0.1) called the learning rate.
If the output is incorrect (t ≠ o), the weights w_i are changed so that the output of the perceptron for the new weights is closer to t.
If the output is correct (t = o), the weights w_i are not changed.
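The update step as code; the output function from the previous sketch is restated so this snippet runs on its own.

    import numpy as np

    def perceptron_output(w, x):
        """o(x) = 1 if the weighted sum is positive, else -1 (restated from the previous sketch)."""
        return 1 if np.dot(w, x) > 0 else -1

    def perceptron_update(w, x, t, eta=0.1):
        """Apply w_i = w_i + eta * (t - o) * x_i to every weight at once."""
        o = perceptron_output(w, x)
        return w + eta * (t - o) * x       # no change when t == o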

Perceptron Training

A linear threshold unit is used, where W is the weight value and t is the threshold value.

Learning algorithm
Target Value, T: When we are training a network, we present it not only with the input but also with the value we require the network to produce. For example, if we present the network with [1,1] for the AND function, the training value will be 1.

Output, O: The output value from the neuron.
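Putting the output rule, the update rule, and the target values together, a hedged sketch of training a perceptron on the AND function; the targets use the -1/+1 convention of the output rule above, so the training value for [1,1] is 1 and the others are -1.

    import numpy as np

    def output(w, x):
        return 1 if np.dot(w, x) > 0 else -1

    # AND truth table with a constant bias input x_0 = 1; targets T use -1 for "false".
    data = [(np.array([1.0, 0.0, 0.0]), -1),
            (np.array([1.0, 0.0, 1.0]), -1),
            (np.array([1.0, 1.0, 0.0]), -1),
            (np.array([1.0, 1.0, 1.0]),  1)]

    w, eta = np.zeros(3), 0.1
    for epoch in range(20):                       # a few passes are enough for AND
        for x, t in data:
            o = output(w, x)
            w = w + eta * (t - o) * x             # perceptron learning rule

    print([output(w, x) for x, _ in data])        # -> [-1, -1, -1, 1]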

Back Propagation Method

Can theoretically perform any input-output mapping.
Can learn to solve linearly inseparable problems.
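As a sketch of the second claim, a small 2-4-1 network trained with backpropagation can learn XOR, a classic linearly inseparable problem that a single perceptron cannot solve; the architecture, sigmoid activations, learning rate, and iteration count below are illustrative assumptions.

    import numpy as np

    sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))

    # XOR: a linearly inseparable input-output mapping
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)

    rng = np.random.default_rng(1)
    W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))   # input -> hidden
    W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))   # hidden -> output
    eta = 0.5

    for _ in range(20000):
        # forward pass
        Z = sigmoid(X @ W1 + b1)            # hidden activations
        Y = sigmoid(Z @ W2 + b2)            # network output
        # backward pass: propagate the error back through the layers (squared-error loss)
        dY = (Y - T) * Y * (1 - Y)          # output-layer delta
        dZ = (dY @ W2.T) * Z * (1 - Z)      # hidden-layer delta
        # gradient-descent weight updates
        W2 -= eta * Z.T @ dY
        b2 -= eta * dY.sum(axis=0, keepdims=True)
        W1 -= eta * X.T @ dZ
        b1 -= eta * dZ.sum(axis=0, keepdims=True)

    print(np.round(Y.ravel(), 2))           # typically approaches [0, 1, 1, 0]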

Advantages of neural networks


More like a real nervous system.
Parallel organisation permits solutions to problems where multiple constraints must be satisfied simultaneously.
Rules are implicit rather than explicit.

FACIAL ANIMATION

Applications
It is mostly used in banks, to help prevent bank robbery.
For greater safety, this technique can also be used in airports.
This type of technique can be used in jewellery shops as well.
Face animation can also be implemented with this technique.
