Machine Learning
Dr. Hammad Afzal
hammad.afzal@mcs.edu.pk
Agenda
• Preliminary Concepts
– Derivatives
• Perceptron
• Gradient Descent
• Perceptron Learning
• Multi-Layer Perceptron
• Backpropagation Algorithm
Derivatives
Session-21
Contents
• Derivatives
• Computation Graphs and Chain Rule
Derivatives
• The derivative of a function is the slope of the function at a point
– Line: the same slope everywhere
– Curve: different slopes at different points
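A quick numeric check of the slope idea, as a minimal sketch using finite differences (the example functions and step size are illustrative, not from the slides):

```python
# Approximate the slope f'(x) ~ (f(x + h) - f(x)) / h at several points.
def slope(f, x, h=1e-6):
    return (f(x + h) - f(x)) / h

line = lambda x: 3 * x + 2   # a line: slope is 3 everywhere
curve = lambda x: x ** 2     # a curve: slope 2x changes with x

for x in (0.0, 1.0, 2.0):
    print(x, round(slope(line, x), 3), round(slope(curve, x), 3))
# line: 3.0 at every point; curve: 0.0, 2.0, 4.0 -- a different slope at each point
```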
Computation Graph
Chain Rule
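A small worked example of the chain rule on a computation graph; the function J = 3(a + bc) is an illustrative assumption, not taken from the slides:

```python
# Computation graph for J = 3 * (a + b*c): evaluate forward, then apply
# the chain rule backward, node by node.
a, b, c = 5.0, 3.0, 2.0

# Forward pass
u = b * c      # u = 6
v = a + u      # v = 11
J = 3 * v      # J = 33

# Backward pass: dJ/dv = 3, dv/da = 1, dv/du = 1, du/db = c, du/dc = b
dJ_dv = 3.0
dJ_da = dJ_dv * 1.0   # 3
dJ_du = dJ_dv * 1.0   # 3
dJ_db = dJ_du * c     # 6
dJ_dc = dJ_du * b     # 9
print(dJ_da, dJ_db, dJ_dc)   # 3.0 6.0 9.0
```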
Artificial Neural Networks
What is it?
• An artificial neural network is a crude attempt to simulate the human brain digitally
• Human brain: approximately 10 billion neurons
• Each neuron is connected to thousands of others
• Parts of a neuron
– Cell body
– Dendrites: receive input signals
– Axon: transmits the output signal
What is ANN?
• An ANN is made up of artificial neurons
– Digitally modeled biological neurons
• Each input into the neuron has its own associated weight
• As each input enters the nucleus (blue circle), it is multiplied by its weight
http://www-cse.uta.edu/~cook/ai1/lectures/figures/neuron.jpg
Let w0 = -T and x0 = 1, so the threshold folds into the weighted sum D = Σ wi·xi.
Output is 1 if D > 0; output is 0 otherwise.
[Figure: four activation functions plotted as Y vs. X, each with Y ranging from -1 to +1.]
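A minimal sketch of this threshold unit with the bias folded into the weight vector; the particular threshold and input weights below are illustrative assumptions:

```python
# Threshold unit: output 1 if D = sum(w_i * x_i) > 0, else 0.
# The bias is folded in as w0 = -T with x0 fixed at 1.
def threshold_unit(x, w):
    D = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if D > 0 else 0

T = 1.5                # illustrative threshold
w = [-T, 1.0, 1.0]     # [w0, w1, w2]
print(threshold_unit([1, 1, 1], w))  # x0=1, x1=1, x2=1 -> D = 0.5  -> 1
print(threshold_unit([1, 1, 0], w))  # x0=1, x1=1, x2=0 -> D = -0.5 -> 0
```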
Linear neuron
• A linear neuron estimates the price of a meal from the portions ordered: price = w_fish·x_fish + w_chips·x_chips + w_drink·x_drink
• True prices per portion: fish = 150, chips = 50, drink = 100
• Training case: 2 portions of fish, 5 portions of chips, 3 portions of drink
• After one learning step from initial guesses of 50 each, this gives new weights of 70, 100, 80
• Notice that the weight for chips got worse!
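A sketch of the delta-rule step consistent with the numbers above; the initial weights of 50 and the learning rate of 1/35 are assumptions chosen to reproduce 70, 100, 80:

```python
# One delta-rule update for the linear meal-price neuron.
x = [2, 5, 3]             # portions of fish, chips, drink
true_w = [150, 50, 100]   # true per-portion prices
target = sum(xi * wi for xi, wi in zip(x, true_w))   # 850

w = [50.0, 50.0, 50.0]    # assumed initial guesses
lr = 1 / 35               # assumed learning rate

y = sum(xi * wi for xi, wi in zip(x, w))             # estimate: 500
residual = target - y                                # 350
w = [wi + lr * xi * residual for xi, wi in zip(x, w)]
print(w)  # [70.0, 100.0, 80.0] -- chips moved from 50 to 100: worse than the true 50
```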
Perceptron
• In 1958, Frank Rosenblatt introduced a training algorithm that provided the first procedure for training a simple ANN: a perceptron.
[Figure: A Two-Input Perceptron — inputs x1 and x2 with weights w1 and w2 feed a linear combiner, followed by a hard limiter with a threshold, producing output Y.]
Implementing ‘OR’ with Perceptron
Implementing ‘AND’ with Perceptron
Implementing ‘XOR’ with Perceptron
Linearly Separable
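As a sketch of the three implementations above: one choice of hand-picked weights realizes OR and AND with a single threshold unit, while no weights exist for XOR because its classes are not linearly separable (the specific weights and thresholds below are one valid choice, not taken from the slides):

```python
# Single threshold unit with hand-picked weights for OR and AND.
def perceptron(x1, x2, w1, w2, threshold):
    return 1 if w1 * x1 + w2 * x2 > threshold else 0

for x1 in (0, 1):
    for x2 in (0, 1):
        print((x1, x2),
              "OR:",  perceptron(x1, x2, 1, 1, 0.5),   # fires if x1 + x2 > 0.5
              "AND:", perceptron(x1, x2, 1, 1, 1.5))   # fires if x1 + x2 > 1.5

# XOR has no such (w1, w2, threshold): its positive and negative points
# cannot be separated by a single straight line.
```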
Perceptron Learning
Simple Perceptron
Perceptron Learning Algorithm
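A minimal sketch of the perceptron learning rule (the learning rate, epoch count, and zero initialization are illustrative assumptions): on each misclassified example, nudge the weights by lr * (target - output) * input:

```python
# Perceptron learning rule for a two-input unit with bias.
def train_perceptron(data, lr=0.1, epochs=20):
    w = [0.0, 0.0]   # input weights
    b = 0.0          # bias (equivalently w0 with x0 = 1)
    for _ in range(epochs):
        for (x1, x2), t in data:
            y = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = t - y                   # 0 when the example is correct
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b
```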
Learning Example
Mean Square Algorithm – Gradient Descent
Gradient Descent vs. Batch Gradient Descent
Perceptron Training Rule (X)
Incremental Gradient Descent (X)
Gradient Descent - Algorithm
Example of perceptron learning: the logical operation AND
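A usage example: training the perceptron sketch above on the AND data converges to a separating line (the learned weights depend on the assumed learning rate and initialization):

```python
# Train on AND and check the learned decision rule.
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
for (x1, x2), t in and_data:
    y = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
    print((x1, x2), "target:", t, "predicted:", y)   # all four match
```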
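A compact sketch contrasting the two update schedules for a linear unit under mean squared error: batch gradient descent sums the gradient over all examples before stepping, while incremental (stochastic) gradient descent steps after each example. The learning rate and epoch count are illustrative:

```python
# Batch vs. incremental gradient descent for a two-input linear unit.
def batch_gd(data, lr=0.05, epochs=100):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        dw, db = [0.0, 0.0], 0.0
        for (x1, x2), t in data:            # accumulate over all examples
            err = t - (w[0] * x1 + w[1] * x2 + b)
            dw[0] += err * x1
            dw[1] += err * x2
            db += err
        w = [w[0] + lr * dw[0], w[1] + lr * dw[1]]   # one step per epoch
        b += lr * db
    return w, b

def incremental_gd(data, lr=0.05, epochs=100):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), t in data:            # one step per example
            err = t - (w[0] * x1 + w[1] * x2 + b)
            w = [w[0] + lr * err * x1, w[1] + lr * err * x2]
            b += lr * err
    return w, b
```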
Multi-Layer Perceptron
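A minimal sketch of a multi-layer perceptron forward pass, assuming one hidden layer of sigmoid units and a single sigmoid output (the architecture is illustrative):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def mlp_forward(x, W_hidden, b_hidden, w_out, b_out):
    # Hidden layer: one sigmoid unit per row of W_hidden.
    hidden = [sigmoid(sum(wi * xi for wi, xi in zip(row, x)) + b)
              for row, b in zip(W_hidden, b_hidden)]
    # Output layer: one sigmoid unit over the hidden activations.
    out = sigmoid(sum(wi * hi for wi, hi in zip(w_out, hidden)) + b_out)
    return hidden, out
```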
Backpropagation
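Building on mlp_forward above, a sketch of one backpropagation step for squared error with sigmoid units (the learning rate is an assumption):

```python
# One backpropagation step: forward pass, error terms, weight updates.
def backprop_step(x, t, W_hidden, b_hidden, w_out, b_out, lr=0.5):
    hidden, out = mlp_forward(x, W_hidden, b_hidden, w_out, b_out)
    # Output error term: (t - out) times the sigmoid derivative out*(1 - out).
    delta_out = (t - out) * out * (1 - out)
    # Hidden error terms: push delta_out back through the output weights.
    delta_hidden = [h * (1 - h) * w * delta_out
                    for h, w in zip(hidden, w_out)]
    # Gradient-descent updates for both layers.
    w_out = [w + lr * delta_out * h for w, h in zip(w_out, hidden)]
    b_out += lr * delta_out
    W_hidden = [[w + lr * d * xi for w, xi in zip(row, x)]
                for row, d in zip(W_hidden, delta_hidden)]
    b_hidden = [b + lr * d for b, d in zip(b_hidden, delta_hidden)]
    return W_hidden, b_hidden, w_out, b_out
```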
Backpropagation - Example
[Figure: the points (0,0), (0,1), (1,0), (1,1) plotted for AND and XOR — AND can be separated by a single line; XOR cannot.]
Backpropagation Algorithm
Computational Complexity
[Figure: influence of output units on a map layer (two panels).]
• Sensitivity to noise
– Very tolerant
• Transparency
– Neural networks are essentially black boxes
– There is no explanation or trace for a particular answer
– Tools for the analysis of networks are very limited
Different non-linearly separable problems
Other Neural Networks
Multilayer neural networks trained with the back-propagation algorithm are used for pattern recognition problems.
A recurrent neural network has feedback loops from its outputs to its inputs.
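A minimal sketch of that feedback idea: each step's output is fed back in as part of the next step's input (the weights are illustrative assumptions, and sigmoid is reused from the MLP sketch above):

```python
# One-unit recurrent network: output at step n becomes input at step n+1.
def recurrent_steps(inputs, w_in=0.8, w_back=0.5):
    y = 0.0          # previous output, initially zero
    outputs = []
    for x in inputs:
        y = sigmoid(w_in * x + w_back * y)   # feedback loop
        outputs.append(y)
    return outputs
```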
RESOURCES
Acknowledgements
• Dr. Imran Siddiqi: Bahria University, Islamabad