3-Intro to Deep Learning and Perceptron
Neural Networks
Artificial Intelligence vs. Machine Learning vs. Deep Learning and Neural Networks

What is AI?
The idea of artificial intelligence came into existence, and the term "AI" was first coined, in 1956 by John McCarthy:
“Artificial Intelligence is a technique which enables machines to mimic human behavior.”
What is Learning?
What are Neural Networks (NNs)?
Neural Networks, or Artificial Neural Networks (ANNs), are machine learning techniques inspired by the functionality of our brain cells, which are known as biological neurons.
Neural Network Learning Algorithms (2)
• Both the features and the output classes are allowed to be real-valued
Types of Machine Learning Algorithms (1)

Types of Machine Learning Algorithms (2)
Target problems in supervised, unsupervised, and reinforcement learning, respectively (from left to right)

Types of Machine Learning Algorithms (3)
Nature of data in supervised, unsupervised, and reinforcement learning, respectively (from left to right)

Types of Machine Learning Algorithms (4)
Training process in supervised, unsupervised, and reinforcement learning, respectively (from left to right)

Types of Machine Learning Algorithms (5)
Purpose of supervised, unsupervised, and reinforcement learning, respectively (from left to right)
Traditional ML vs. Deep Learning (1)

Traditional ML vs. Deep Learning (2)

Traditional ML vs. Deep Learning (3)
Perceptron: The First Neural Network

Types of Artificial Neural Networks
• ANNs can be categorized based on the number of hidden layers contained in the network architecture.

Unit Function
• Linear function: a weighted sum of the inputs x1 … xn with weights w1 … wn, followed by an activation function
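The unit function described above can be sketched in a few lines of Python (an illustrative sketch; the function name is mine, not from the slides):

```python
def perceptron_output(inputs, weights):
    """A single perceptron unit: a weighted sum of the inputs,
    followed by a threshold activation (+1 if S >= 0, else -1)."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= 0 else -1
```

With four equal weights of 0.25, as in the pixel example that follows, `perceptron_output([1, 0, 0, 0], [0.25, 0.25, 0.25, 0.25])` returns +1.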
Perceptron Example
[Diagram: four pixel inputs x1–x4, each with weight 0.25, feed the sum S; output ŷ = +1 (Bright) if S ≥ 0, otherwise −1 (Dark).]

Weight updates, with learning rate ɳ = 0.1, target t(E) = −1, output o(E) = +1, and inputs x1 = 1, x2 = x3 = x4 = 0:

∆1 = ɳ (t(E) − o(E)) · x1 = 0.1 · (−1 − 1) · 1 = −0.2    wʹ1 = w1 + ∆1 = 0.25 − 0.2 = 0.05
∆2 = ɳ (t(E) − o(E)) · x2 = 0.1 · (−1 − 1) · 0 = 0       wʹ2 = w2 + ∆2 = 0.25 + 0 = 0.25
∆3 = ɳ (t(E) − o(E)) · x3 = 0.1 · (−1 − 1) · 0 = 0       wʹ3 = w3 + ∆3 = 0.25 + 0 = 0.25
∆4 = ɳ (t(E) − o(E)) · x4 = 0.1 · (−1 − 1) · 0 = 0       wʹ4 = w4 + ∆4 = 0.25 + 0 = 0.25
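The four updates above all follow one rule, ∆i = ɳ (t(E) − o(E)) · xi. A minimal sketch in Python (function and variable names are mine), reproducing the slide's numbers:

```python
def update_weights(weights, inputs, target, output, eta=0.1):
    """One perceptron learning step: w_i' = w_i + eta * (t - o) * x_i."""
    return [w + eta * (target - output) * x
            for w, x in zip(weights, inputs)]

weights = [0.25, 0.25, 0.25, 0.25]   # initial weights from the slide
x = [1, 0, 0, 0]                     # only pixel 1 is bright
new_w = update_weights(weights, x, target=-1, output=1)
print(new_w)  # w1 drops to about 0.05 (floating point); the others stay at 0.25
```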
Perceptron Example
• Calculation (Step-2): x1 = 1, x2 = 0, x3 = 0, x4 = 0; weights w1 = 0.05, w2 = w3 = w4 = 0.25
S = 0.05 · 1 + 0.25 · 0 + 0.25 · 0 + 0.25 · 0 = 0.05
• Since 0.05 ≥ 0, the output of the ANN is +1 (Bright).
• After another update on the same example (∆1 = 0.1 · (−1 − 1) · 1 = −0.2), w1 becomes 0.05 − 0.2 = −0.15.
• Calculation (Step-3): x1 = 1, x2 = 0, x3 = 0, x4 = 0; weights w1 = −0.15, w2 = w3 = w4 = 0.25
S = −0.15 · 1 + 0.25 · 0 + 0.25 · 0 + 0.25 · 0 = −0.15
• Since −0.15 < 0, the output of the ANN is −1 (Dark).
Another Example (AND)
Weights W1 = 0.5, W2 = 0.5 (no bias); output ŷ = +1 (ON) if S ≥ 0, otherwise −1 (OFF).

X1  X2  X1 AND X2
0   0   0
0   1   0
1   0   0
1   1   1

Observed output: +1, +1, +1, +1
Another Example (AND)
Weights W1 = 0.5, W2 = −0.1; output ŷ = +1 (ON) if S ≥ 0, otherwise −1 (OFF).

X1  X2  X1 AND X2
0   0   0
0   1   0
1   0   0
1   1   1

Observed output: +1, −1, +1, +1
Another Example (AND)
Weights W1 = −0.1, W2 = −0.1; output ŷ = +1 (ON) if S ≥ 0, otherwise −1 (OFF).

X1  X2  X1 AND X2
0   0   0
0   1   0
1   0   0
1   1   1

Observed output: +1, −1, −1, −1
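These runs hint at why a bias is introduced next: with no bias term, the input (0, 0) always gives S = 0, which the S ≥ 0 rule maps to +1, so the (0, 0) row of AND can never come out right. A quick check (illustrative sketch; names are mine):

```python
def predict(inputs, weights):
    # No bias: S is just the weighted sum of the inputs
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= 0 else -1

# Whatever the weights, (0, 0) sums to 0 and therefore outputs +1,
# even though 0 AND 0 should give -1 (OFF).
for weights in ([0.5, 0.5], [0.5, -0.1], [-0.1, -0.1]):
    print(predict([0, 0], weights))  # prints 1 each time
```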
Use of Bias
[Diagram: bias input x0 joins inputs x1 and x2 (weights 0.5, 0.5); output ŷ = +1 (ON) if S ≥ 0, otherwise −1 (OFF).]
• Bias is just like an intercept added in a linear equation:
output = sum(weights * inputs) + bias
y = mx + c
• It allows us to move the line up and down, fitting the prediction to the data better. If the constant c is absent, the line passes through the origin (0, 0) and we get a poorer fit.
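Treating the bias as an extra input x0 fixed at 1 with its own weight w0 gives exactly the `output = sum(weights * inputs) + bias` formula above (a sketch; names are mine, and the S > 0 threshold matches the bias slides that follow):

```python
def predict_with_bias(inputs, weights, w0):
    # The bias enters as w0 * x0 with x0 fixed at 1
    s = w0 + sum(w * x for w, x in zip(weights, inputs))
    return 1 if s > 0 else -1

# A negative bias shifts the decision line away from the origin,
# so (0, 0) can finally be classified as OFF:
print(predict_with_bias([0, 0], [0.5, 0.5], -0.3))  # prints -1
```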
Example (AND) with Bias
Bias input x0 = 1 with W0 = 0.5; weights W1 = 0.5, W2 = 0.5; output ŷ = +1 (ON) if S > 0, otherwise −1 (OFF).

X1  X2  X1 AND X2
0   0   0
0   1   0
1   0   0
1   1   1

Observed output: +1, +1, +1, +1
Example (AND) with Bias
Bias input x0 = 1 with W0 = −0.1; weights W1 = 0.5, W2 = −0.1; output ŷ = +1 (ON) if S > 0, otherwise −1 (OFF).

X1  X2  X1 AND X2
0   0   0
0   1   0
1   0   0
1   1   1

Observed output: −1, −1, +1, +1
Example (AND) with Bias
Bias input x0 = 1 with W0 = −0.3; weights W1 = 0.3, W2 = −0.1; output ŷ = +1 (ON) if S > 0, otherwise −1 (OFF).

X1  X2  X1 AND X2
0   0   0
0   1   0
1   0   0
1   1   1

Observed output: −1, −1, −1, −1
Example (AND) with Bias
After 2 epochs: W0 = −0.3, W1 = 0.3, W2 = 0.1; bias input x0 = 1; output ŷ = +1 (ON) if S > 0, otherwise −1 (OFF).

X1  X2  X1 AND X2
0   0   0
0   1   0
1   0   0
1   1   1

Final Weights
w0 = −0.3
w1 = 0.3
w2 = 0.1

(The last update step used x1 = 0, x2 = 1, ɳ = 0.1, t(E) = −1.)
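The final weights can be checked against the full AND truth table (a small verification sketch; +1 stands for ON and −1 for OFF):

```python
# Final weights from the slide, learned after 2 epochs
w0, w1, w2 = -0.3, 0.3, 0.1

def and_perceptron(x1, x2):
    s = w0 * 1 + w1 * x1 + w2 * x2  # x0 = 1 is the bias input
    return 1 if s > 0 else -1

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, and_perceptron(x1, x2))
# Only (1, 1) yields +1, matching X1 AND X2
```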
Learning in Perceptron
• Need to learn:
• the weights between the input and output units
• the value of the bias