
G5AIAI Neural Networks

Neural Networks
• AIMA – Chapter 19

• Fundamentals of Neural Networks: Architectures, Algorithms and Applications. Fausett, L., 1994

• An Introduction to Neural Networks (2nd Ed). Morton, I.M., 1995

Neural Networks
• McCulloch & Pitts (1943) are generally recognised as the designers of the first neural network

• Many of their ideas are still used today (e.g. many simple units combining to give increased computational power, and the idea of a threshold)

Neural Networks

• Hebb (1949) developed the first learning rule, on the premise that if two neurons are active at the same time then the strength of the connection between them should be increased
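
As a minimal sketch of that premise: the textbook rate-based form is Δw = η·x·y, where η is a learning rate; the function name, the value of η and the example values below are illustrative and not taken from the slide.

```python
def hebbian_update(w, x, y, eta=0.1):
    """Hebb's rule: strengthen the connection w between two units
    in proportion to how strongly they are active together."""
    return w + eta * x * y

w = 0.5
w = hebbian_update(w, x=1.0, y=1.0)   # both units active -> w increases to 0.6
```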

Neural Networks
• During the 50’s and 60’s many researchers worked on the perceptron amidst great excitement.
• 1969 saw the death of neural network research for about 15 years – Minsky & Papert
• Only in the mid 80’s (Parker and LeCun) was interest revived (in fact Werbos had discovered the algorithm, backpropagation, in 1974)

Neural Networks

• A human brain consists of approximately 10^11 computing elements called neurons. They communicate through a connection network of axons and synapses having a density of approximately 10^4 synapses per neuron.

• The neuron is able to respond to the total of its inputs aggregated within a short time interval called the period of latent summation. The neuron's response is generated if the total potential of its membrane reaches a certain level.
• A more precise condition for firing is that the excitation should exceed the inhibition by an amount called the threshold of the neuron, typically a value of about 40 mV.

Neural Networks
• Signals “move” between neurons as electrochemical signals

• The synapses release a chemical transmitter – the sum of which can cause a threshold to be reached – causing the neuron to “fire”

• Synapses can be inhibitory or excitatory

The First Neural Networks

McCulloch and Pitts produced the first neural network in 1943

Many of the principles can still be seen in neural networks of today

Comparison of Brains and Traditional Computers

Brain:
• 200 billion neurons, 32 trillion synapses
• Element size: 10^-6 m
• Energy use: 25 W
• Processing speed: 100 Hz
• Parallel, distributed
• Fault tolerant
• Learns: yes
• Intelligent/conscious: usually

Traditional computer:
• 1 billion bytes RAM, but trillions of bytes on disk
• Element size: 10^-9 m
• Energy use: 30-90 W (CPU)
• Processing speed: 10^9 Hz
• Serial, centralized
• Generally not fault tolerant
• Learns: some
• Intelligent/conscious: generally no

Difference between conventional computing and ANNs

The First Neural Networks

[Network: inputs X1 and X2 connect to unit Y with weight 2 each; X3 connects to Y with weight -1]

The activation of a neuron is binary. That is, the neuron either fires (activation of one) or does not fire (activation of zero).

The First Neural Networks

[Network: inputs X1 and X2 connect to unit Y with weight 2 each; X3 connects to Y with weight -1]

For the network shown here the activation function for unit Y is

    f(y_in) = 1 if y_in >= θ, else 0

where y_in is the total input signal received and θ is the threshold for Y.
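
A small sketch of such a unit in Python. The function name is illustrative, and the threshold value of 4 used in the example calls is an assumption, since the slide leaves θ unspecified for this particular network.

```python
def mcp_unit(inputs, weights, theta):
    """McCulloch-Pitts unit: fires (returns 1) if the weighted sum
    of its binary inputs reaches the threshold theta, else returns 0."""
    y_in = sum(w * x for w, x in zip(weights, inputs))
    return 1 if y_in >= theta else 0

# The network above, with an assumed threshold of 4
print(mcp_unit([1, 1, 0], [2, 2, -1], theta=4))   # y_in = 4 -> fires (1)
print(mcp_unit([1, 1, 1], [2, 2, -1], theta=4))   # y_in = 3 -> does not fire (0)
```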

The First Neural Networks

AND Function: X1 and X2 each connect to Y with weight 1; Threshold(Y) = 2

X1  X2 | Y
 1   1 | 1
 1   0 | 0
 0   1 | 0
 0   0 | 0
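
Using the mcp_unit sketch above, the AND network can be checked directly with the slide's weights of 1 on each input and threshold 2:

```python
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, mcp_unit([x1, x2], [1, 1], theta=2))
# Only the input (1, 1) reaches the threshold, reproducing the AND column above.
```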

The First Neural Networks

OR Function: X1 and X2 each connect to Y with weight 2; Threshold(Y) = 2

X1  X2 | Y
 1   1 | 1
 1   0 | 1
 0   1 | 1
 0   0 | 0

The First Neural Networks

AND NOT Function: X1 connects to Y with weight 2, X2 with weight -1; Threshold(Y) = 2

X1  X2 | Y
 1   1 | 0
 1   0 | 1
 0   1 | 0
 0   0 | 0

The First Neural Networks

XOR Function: X1 → Z1 (weight 2), X2 → Z1 (weight -1), X1 → Z2 (weight -1), X2 → Z2 (weight 2), Z1 → Y (weight 2), Z2 → Y (weight 2)

X1  X2 | Y
 1   1 | 0
 1   0 | 1
 0   1 | 1
 0   0 | 0

X1 XOR X2 = (X1 AND NOT X2) OR (X2 AND NOT X1)
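
The two-layer composition can be checked with the mcp_unit sketch from above, assuming (as on the previous slides) a threshold of 2 for every unit; the function name is illustrative.

```python
def xor_net(x1, x2, theta=2):
    z1 = mcp_unit([x1, x2], [2, -1], theta)   # Z1 = X1 AND NOT X2
    z2 = mcp_unit([x1, x2], [-1, 2], theta)   # Z2 = X2 AND NOT X1
    return mcp_unit([z1, z2], [2, 2], theta)  # Y  = Z1 OR Z2

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, xor_net(x1, x2))   # fires only when exactly one input is 1
```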

Find the truth table of the MCP network



How neural networks learn

• A neural network learns about its environment through an interactive process of adjustments applied to its synaptic weights and bias levels. Learning is the process by which the free parameters of a neural network are adapted through stimulation by the environment in which the network is embedded. The type of learning is determined by the manner in which the parameter changes take place.
• A set of well-defined rules for the solution of a learning problem is called a learning algorithm. Learning algorithms differ from one another in the way the adjustment to a neuron's synaptic weights is formulated. The manner in which the network of inter-connected neurons relates to its environment must also be considered.

Hebb's rule

Hebb's rule contd.

Hebb's rule contd.

Modelling a Neuron

in_i = Σ_j W_j,i · a_j

• a_j : activation value of unit j
• W_j,i : weight on the link from unit j to unit i
• in_i : weighted sum of inputs to unit i
• a_i : activation value of unit i
• g : activation function
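
A minimal sketch of this unit model in Python. The function name is illustrative, and the sigmoid default is just one possible choice of g from the next slide.

```python
import math

def unit_activation(a, w, g=lambda x: 1 / (1 + math.exp(-x))):
    """Compute in_i = sum_j W_j,i * a_j and return a_i = g(in_i).
    a: activations a_j of the units feeding unit i
    w: weights W_j,i on the links from each unit j to unit i
    g: activation function (sigmoid by default)"""
    in_i = sum(w_ji * a_j for w_ji, a_j in zip(w, a))
    return g(in_i)

# Example: three incoming units with arbitrary illustrative weights
print(unit_activation([1.0, 0.0, 1.0], [0.5, -0.3, 0.2]))
```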

Activation Functions

• Step_t(x) = 1 if x >= t, else 0
• Sign(x) = +1 if x >= 0, else -1
• Sigmoid(x) = 1/(1 + e^-x)
• Identity function
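
The same functions written out in Python as a small sketch; the default threshold t = 0 for the step function is just for illustration.

```python
import math

def step(x, t=0.0):
    return 1 if x >= t else 0          # Step_t(x)

def sign(x):
    return +1 if x >= 0 else -1        # Sign(x)

def sigmoid(x):
    return 1 / (1 + math.exp(-x))      # Sigmoid(x) = 1/(1+e^-x)

def identity(x):
    return x                           # Identity function
```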

Simple Networks

AND
Input 1: 0 0 1 1
Input 2: 0 1 0 1
Output:  0 0 0 1

OR
Input 1: 0 0 1 1
Input 2: 0 1 0 1
Output:  0 1 1 1

NOT
Input:  0 1
Output: 1 0

Simple Networks

[Network: inputs -1 (weight W = 1.5), x (weight W = 1) and y feed a unit with threshold t = 0.0]
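
A quick check of this network, assuming the weight on the y input (not legible in this export) is also 1; under that assumption the unit reproduces the AND column from the truth tables above.

```python
def and_net(x, y):
    # bias input -1 with weight 1.5, weight 1 on x, and (assumed) weight 1 on y;
    # the unit fires when the weighted sum reaches the threshold t = 0.0
    return 1 if (-1 * 1.5 + 1 * x + 1 * y) >= 0.0 else 0

for x, y in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, y, and_net(x, y))   # prints 0, 0, 0, 1
```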

Perceptron
• Synonym for single-layer, feed-forward network
• First studied in the 50’s
• Other networks were known about but the perceptron was the only one capable of learning, and thus all research was concentrated in this area

Training a perceptron

Aim: learn the AND function

AND
Input 1: 0 0 1 1
Input 2: 0 1 0 1
Output:  0 0 0 1

Training a perceptron

[Network: inputs I1 = -1 (weight W = 0.3), I2 = x (weight W = 0.5), I3 = y (weight W = -0.4) feed a unit with threshold t = 0.0]

I1 I2 I3 Summation Output
-1 0 0 (-1*0.3) + (0*0.5) + (0*-0.4) = -0.3 0
-1 0 1 (-1*0.3) + (0*0.5) + (1*-0.4) = -0.7 0
-1 1 0 (-1*0.3) + (1*0.5) + (0*-0.4) = 0.2 1
-1 1 1 (-1*0.3) + (1*0.5) + (1*-0.4) = -0.2 0
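
The table shows a single evaluation pass with the initial weights. As a sketch of how training would then proceed: the error-correction update and the learning rate of 0.1 below are the standard perceptron procedure, not spelled out on this slide, and the function name is illustrative.

```python
def train_perceptron(examples, weights, lr=0.1, epochs=100):
    """Error-correction training: after each example, adjust every weight
    by lr * (target - output) * input.  Each example pairs an input vector
    (with the fixed bias input -1 first, as in the table) with a target."""
    for _ in range(epochs):
        for inputs, target in examples:
            summation = sum(w * x for w, x in zip(weights, inputs))
            output = 1 if summation >= 0.0 else 0
            error = target - output
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
    return weights

# The AND task from the "Aim" slide, starting from the weights above
examples = [([-1, 0, 0], 0), ([-1, 0, 1], 0), ([-1, 1, 0], 0), ([-1, 1, 1], 1)]
print(train_perceptron(examples, weights=[0.3, 0.5, -0.4]))
```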
