Neural Networks
Agenda
• Neural Networks
• Neural Network Learning
• Network Architectures
• Dimensions of a Neural Network
• Backpropagation Training Algorithm
• Over-Training Prevention
• Successful Applications
Neural Networks
• A neural network (NN) is a machine learning approach
inspired by the way the brain performs a particular
learning task:
– Knowledge about the learning task is given in the form of
examples.
Learning
• Supervised Learning
– Examples: recognizing hand-written digits, pattern
recognition, regression.
– Labeled examples: (input, desired output) pairs.
– Neural network models: perceptron, feed-forward networks,
radial basis function networks, support vector machines.
• Unsupervised Learning
– Examples: finding groups of similar documents on the web,
content-addressable memory, clustering.
– Unlabeled examples: different realizations of the input alone.
– Neural network models: self-organizing maps, Hopfield
networks.
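As an illustration of supervised learning from labeled (input, desired output) pairs, here is a minimal perceptron sketch. The data set, learning rate, and epoch count are illustrative choices, not from the slides; the network learns the linearly separable AND function:

```python
import numpy as np

def perceptron_train(X, y, epochs=20, lr=1.0):
    """Perceptron learning rule on labeled examples (X, y), y in {-1, +1}."""
    w = np.zeros(X.shape[1])             # synaptic weights
    b = 0.0                              # bias
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:   # example is misclassified
                w += lr * yi * xi        # move the boundary toward it
                b += lr * yi
    return w, b

# Learn the logical AND function (linearly separable).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = perceptron_train(X, y)
preds = np.sign(X @ w + b)               # [-1, -1, -1, 1]
```

For separable data such as AND, the perceptron convergence theorem guarantees this loop stops updating after finitely many mistakes; XOR, by contrast, would oscillate forever.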
Network architectures
Single Layer Feed-forward
Multi-layer feed-forward
A 3-4-2 network: an input layer of 3 units, a hidden
layer of 4 units, and an output layer of 2 units.
[Figure: 3-4-2 feed-forward network — input layer, hidden layer, output layer]
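The forward pass of such a 3-4-2 network can be sketched as follows. The random weight values and the tanh hidden activation are assumptions for illustration; the slides only fix the layer sizes:

```python
import numpy as np

rng = np.random.default_rng(0)

# 3-4-2 network: 3 inputs -> 4 hidden units -> 2 outputs, fully connected.
W1 = rng.standard_normal((4, 3))   # hidden-layer weights (illustrative values)
b1 = np.zeros(4)
W2 = rng.standard_normal((2, 4))   # output-layer weights
b2 = np.zeros(2)

def forward(x):
    h = np.tanh(W1 @ x + b1)       # hidden-layer activations
    return W2 @ h + b2             # linear output layer

y = forward(np.array([0.5, -1.0, 2.0]))   # 3 input values in, 2 outputs out
```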
Recurrent network
A recurrent network with hidden neuron(s): the unit-delay
operator z^-1 implies a dynamic system.
[Figure: recurrent network — input, hidden, and output units with feedback connections through z^-1 unit delays]
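A minimal sketch of the dynamic behaviour the unit delay introduces (the scalar weights and tanh activation are illustrative assumptions): the hidden state h[t] is fed back through z^-1, so the network's output persists after the input stops:

```python
import math

w_in, w_rec, w_out = 0.5, 0.8, 1.0        # illustrative scalar weights

def run(inputs):
    h = 0.0                               # delayed hidden state h[t-1], initially 0
    outputs = []
    for x in inputs:
        h = math.tanh(w_in * x + w_rec * h)  # feedback through the z^-1 delay
        outputs.append(w_out * h)
    return outputs

ys = run([1.0, 0.0, 0.0, 0.0])            # a single impulse, then silence
```

With |w_rec| < 1 the response decays geometrically after the impulse; a static feed-forward network would output exactly zero for every zero input.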
Neural Network Architectures
Real Neurons
• Cell structures
– Cell body
– Dendrites
– Axon
– Synaptic terminals
Real Neural Learning
[Figure: model of an artificial neuron — input signals x1 ... xm, synaptic weights w1 ... wm, a summing function producing v, an activation function φ(·), and output signal y]
Bias of a Neuron
• Bias b has the effect of applying an affine
transformation to the weighted sum u:
v = u + b, where u = Σ_{j=1..m} w_j x_j
• v is the induced local field of the neuron.
Bias as extra input
• Bias is an external parameter of the neuron. It can be
modeled by adding an extra input x_0 = +1 with weight
w_0 = b:
v = Σ_{j=0..m} w_j x_j
[Figure: neuron model — inputs x_0 = +1, x_1, ..., x_m, synaptic weights w_0 = b, w_1, ..., w_m, a summing function producing the local field v, an activation function φ(·), and output signal y]
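A quick numerical check that the two formulations agree (the input, weight, and bias values are arbitrary):

```python
import numpy as np

# Illustrative values for inputs, weights, and bias.
x = np.array([1.0, 0.5, -2.0])
w = np.array([0.4, -0.3, 0.1])
b = 0.2

v_affine = w @ x + b                   # v = u + b, with u = sum_j w_j x_j
x_aug = np.concatenate(([1.0], x))     # extra input x_0 = +1
w_aug = np.concatenate(([b], w))       # extra weight w_0 = b
v_extra = w_aug @ x_aug                # v = sum_{j=0..m} w_j x_j
# both give v = 0.25
```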
Dimensions of a Neural Network
• Various types of neurons
• Various applications
Backpropagation Training Algorithm
• Create the 3-layer network with H hidden units with full
connectivity between layers. Set weights to small random real
values.
• Until all training examples produce the correct value (within ε),
or mean squared error ceases to decrease, or other termination
criteria are met:
Begin epoch
For each training example, d, do:
Calculate network output for d's input values
Compute error between current output and correct output for d
Update weights by backpropagating error and using the learning rule
End epoch
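The algorithm above can be sketched in code. This is an illustrative implementation with sigmoid units trained on XOR; the network size, learning rate, epoch count, and termination by fixed epoch budget are assumptions, not prescriptions from the slides:

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 3-layer network (input -> H hidden -> output), small random weights.
n_in, H, n_out = 2, 4, 1
W1 = rng.uniform(-0.5, 0.5, (H, n_in)); b1 = np.zeros(H)
W2 = rng.uniform(-0.5, 0.5, (n_out, H)); b2 = np.zeros(n_out)

# XOR training set: the classic non-linearly-separable task.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def mse():
    outs = np.array([sigmoid(W2 @ sigmoid(W1 @ x + b1) + b2) for x in X])
    return float(np.mean((outs - T) ** 2))

mse_before = mse()
lr = 0.5
for epoch in range(5000):                 # fixed epoch budget as termination
    for x, t in zip(X, T):
        h = sigmoid(W1 @ x + b1)          # forward pass: hidden activations
        y = sigmoid(W2 @ h + b2)          # forward pass: network output
        delta_out = (y - t) * y * (1 - y)             # output-layer delta
        delta_hid = (W2.T @ delta_out) * h * (1 - h)  # backpropagated delta
        W2 -= lr * np.outer(delta_out, h); b2 -= lr * delta_out  # learning rule
        W1 -= lr * np.outer(delta_hid, x); b1 -= lr * delta_hid
mse_after = mse()
```

The mean squared error drops well below its initial value; as the next slide notes, convergence to zero training error is not guaranteed, so a run can still end in a local optimum.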
Comments on Training Algorithm
• Not guaranteed to converge to zero training error; it
may converge to a local optimum or oscillate indefinitely.
[Figure: error vs. number of training epochs — error on training data keeps decreasing, while error on test data eventually starts to rise]
• Keep a hold-out validation set and test accuracy on it after
every epoch. Stop training when additional epochs actually
increase validation error.
• To avoid losing training data for validation:
– Use internal 10-fold CV on the training set to compute the average
number of epochs that maximizes generalization accuracy.
– Train final network on complete training set for this many epochs.
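The hold-out early-stopping rule can be sketched as follows. Here `train_one_epoch` and `evaluate` are hypothetical stand-ins for the real training and validation routines, and the simulated validation-error curve (falling, then rising, as in the figure above) is illustrative only:

```python
import numpy as np

rng = np.random.default_rng(2)

def train_one_epoch(model):
    model["epochs"] += 1            # placeholder for one epoch of training
    return model

def evaluate(model):
    # Simulated validation error: falls, reaches a minimum, then rises.
    e = model["epochs"]
    return (e - 30) ** 2 / 900 + rng.normal(0, 0.005)

model = {"epochs": 0}
best_err, best_epoch, patience, bad = np.inf, 0, 10, 0
for epoch in range(1, 201):
    model = train_one_epoch(model)
    err = evaluate(model)
    if err < best_err:
        best_err, best_epoch, bad = err, epoch, 0   # new best: save weights here
    else:
        bad += 1
        if bad >= patience:         # validation error stopped improving
            break
# best_epoch is the epoch count to reuse when retraining on the full set
```

The "patience" counter is a common practical refinement: rather than stopping at the first uptick, it waits several epochs to make sure the increase is not validation-set noise.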
Determining the Best Number of Hidden Units
• Too few hidden units prevent the network from
adequately fitting the data.
• Too many hidden units can result in over-fitting.
[Figure: error vs. number of hidden units — error on training data decreases as units are added, while error on test data rises again once there are too many]