
INTRODUCTION TO ARTIFICIAL NEURAL NETWORKS
Biological Neuron to Artificial Neuron
McCulloch-Pitts Perceptron Model
Layer of Neurons
Activation Function
Artificial Learning
Types of Learning
Introduction to Back Propagation Networks
Applications of Neural Network
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
The human nervous system can be broken down into three stages that
may be represented in block diagram form as:

• The receptors collect information from the environment – e.g. photons
on the retina.
• The effectors generate interactions with the environment – e.g.
activate muscles.
• The flow of information/activation is represented by arrows –
feedforward and feedback.
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
• The brain is a highly complex, non-linear, and parallel computer, composed of
some 10^11 neurons that are densely connected (~10^4 connections per neuron).
• A neuron is much slower (10^-3 sec) than a silicon logic gate (10^-9 sec);
however, the massive interconnection between neurons makes up for the
comparatively slow rate.
–  Complex perceptual decisions are arrived at quickly (within a few hundred
milliseconds).
• 100-steps rule: since individual neurons operate in a few milliseconds,
calculations cannot involve more than about 100 serial steps, and the
information sent from one neuron to another is very small (a few bits).
• Plasticity: Some of the neural structure of the brain is present at birth, while
other parts are developed through learning, especially in early stages of life, to
adapt to the environment (new inputs).
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
BASIS FOR COMPARISON | BRAIN | COMPUTER
Construction | Neurons and synapses | ICs, transistors, diodes, capacitors, etc.
Memory growth | Increases each time by connecting synaptic links | Increases by adding more memory chips
Backup system | Built-in backup system | Backup system is constructed manually
Memory power | 100 teraflops (100 trillion calculations/second) | 100 million megabytes
Memory density | 10^7 circuits/cm^3 | 10^14 bits/cm^3
Energy consumption | 12 watts of power | Gigawatts of power
Information storage | Stored in electrochemical and electric impulses | Stored in numeric and symbolic form (i.e. in binary bits)
Size and weight | The brain's volume is 1500 cm^3 and its weight is around 3.3 pounds | Variable weight and size, from a few grams to tons
Transmission of information | Uses chemicals to fire the action potential in the neurons | Communication is achieved through electrically coded signals
Information processing power | Low | High
Input/output equipment | Sensory organs | Keyboards, mouse, web cameras, etc.
Structural organization | Self-organized | Pre-programmed structure
Parallelism | Massive | Limited
Reliability and damageability | The brain is self-organizing, self-maintaining and reliable | Computers perform a monotonous job and cannot correct themselves
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
Biological Neural Network:
It is well-known that the human brain consists of a huge
number of neurons, approximately 10^11, with numerous
interconnections. A schematic diagram of a biological
neuron is shown in Figure.
The biological neuron depicted in Figure consists of three
main parts:
1. Soma or cell body- where the cell nucleus is located. The
neuron’s nucleus contains the genetic material in the form of
DNA. This exists in most types of cells, not just neurons.
2. Dendrites- where the nerve is connected to the cell body.
3. Axon- which carries impulses of the neuron.
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
• Dendrite — Dendrites are fibres which emanate from the cell body and provide the receptive
zones that receive activation from other neurons.
• Soma (cell body) — It sums all the incoming signals to generate input. The neuron’s cell body
(soma) processes the incoming activations and converts them into output activations.
• Axon —  An axon is a single, long connection extending from the cell body and carrying
signals from the neuron. The end of the axon splits into fine strands. It is found that each
strand terminates into a small bulb-like organ called synapse. When the sum reaches a
threshold value, neuron fires and the signal travels down the axon to the other neurons.
• Synapses — The point of interconnection of one neuron with other neurons. The amount of
signal transmitted depends upon the strength (synaptic weights) of the connections.
The receiving ends of these synapses on the nearby neurons can be found both on the dendrites
and on the cell body. There are approximately 10^4 synapses per neuron in the human brain.
• The connections can be inhibitory (decreasing strength) or excitatory (increasing strength) in
nature.
So, a neural network, in general, is a highly interconnected network of billions of neurons with
trillions of interconnections between them.
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
• Electric impulses are passed between the synapse and the dendrites. This
type of signal transmission involves a chemical process in which specific
transmitter substances are released from the sending side of the junction.
• This results in increase or decrease in the electric potential inside the body
of the receiving cell. If the electric potential reaches a threshold, then the
receiving cell fires and a pulse or action potential of fixed strength and
duration is sent out through the axon to the synaptic junction of the other
cells.
• After firing, a cell has to wait for a period of time called the refractory
period before it can fire again. The synapses are said to be inhibitory if they
let passing impulses hinder the firing of the receiving cell or excitatory if
they let passing impulses cause the firing of the receiving cell.
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
ANNs are composed of multiple nodes, which imitate the biological neurons of the
human brain. The neurons are connected by links and interact with each other.
The nodes can take input data and perform simple operations on the data. The
result of these operations is passed to other neurons. The output at each node
is called its activation or node value.
(Figure: schematic diagram of a neuron)
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
ANN is an information processing system that has certain performance characteristics in common with biological
nets.
Several key features of the processing elements of ANN are suggested by the properties of biological neurons:
1.The processing element receives many signals.
2.Signals may be modified by a weight at the receiving synapse.
3.The processing element sums the weighted inputs.
4.Under appropriate circumstances (sufficient input), the neuron transmits a single output.
5.The output from a particular neuron may go to many other neurons.
From experience: examples / training data.
The strength of the connection between neurons is stored as a weight value for the specific connection.
Learning the solution to a problem = changing the connection weights.

ANNs have been developed as generalizations of mathematical models of neural biology, based on the
assumptions that:
1.Information processing occurs at many simple elements called neurons.
2.Signals are passed between neurons over connection links.
3.Each connection link has an associated weight, which, in a typical neural net, multiplies the signal
transmitted.
4.Each neuron applies an activation function to its net input to determine its output signal.
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
Model Of A Neuron

X1 --Wa-->
X2 --Wb--> sum --> f() --> Y
X3 --Wc-->

Input units (dendrite) -> connection weights (synapse) -> summing/computation function (soma) -> output (axon)
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
• A neural net consists of a large number of simple processing elements called neurons, units,
cells or nodes.
• Each neuron is connected to other neurons by means of directed communication links, each
with associated weight.
• The weights represent information being used by the net to solve a problem.
• Each neuron has an internal state, called its activation or activity level, which is a function
of the inputs it has received. Typically, a neuron sends its activation as a signal to several
other neurons.

• It is important to note that a neuron can send only one signal at a time, although that signal
is broadcast to several other neurons.
• Neural networks are configured for a specific application, such as pattern recognition or
data classification, through a learning process
• In a biological system, learning involves adjustments to the synaptic connections between
neurons; the same holds for artificial neural networks (ANNs).
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
Artificial Neural Network

Inputs x1 and x2 enter through the dendrites with weights w1 and w2 (the synapses); the cell
body (nucleus) sums them, and the axon carries the output y.

Net input: y_in = x1*w1 + x2*w2
Activation function: f(y_in) = 1 if y_in >= θ, and f(y_in) = 0 otherwise

- A neuron receives input, determines the strength or weight of each input, calculates the total weighted input,
and compares the total weighted input with a threshold value.
- The threshold value is in the range of 0 to 1.
- If the total weighted input is greater than or equal to the threshold value, the neuron produces an output;
if it is less than the threshold value, no output is produced.
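The fire/no-fire rule just described can be sketched in Python (a minimal illustration, not code from the slides; the function name and values are ours):

```python
def neuron_output(inputs, weights, threshold):
    """Fire (return 1) if the total weighted input reaches the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Two inputs with illustrative weights 0.6 and 0.4, threshold 0.5:
print(neuron_output([1, 0], [0.6, 0.4], 0.5))  # 0.6 >= 0.5, so the neuron fires: 1
print(neuron_output([0, 1], [0.6, 0.4], 0.5))  # 0.4 < 0.5, so no output: 0
```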
BIOLOGICAL NEURON TO ARTIFICIAL NEURON

Dendrites: Input
Cell body: Processor
Synapse: Link
Axon: Output
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
Analogy of ANN with BNN
• The dendrites in a biological neural network are analogous to the weighted inputs, based on
their synaptic interconnections, in an artificial neural network.
• The cell body is analogous to the artificial neuron unit, which comprises the summation and
threshold units.
• The axon carries the output and is analogous to the output unit of an artificial neural network.
So, ANNs are modelled on the working of basic biological neurons.
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
Comparing ANN with BNN
As the ANN concept is borrowed from the BNN, there are a lot of similarities, though
there are differences too.
• Similarities are in the following table
Biological Neuron Artificial Neuron

Cell Neuron

Dendrites Weights or interconnections

Soma Net input

Axon Output
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
Criteria | BNN | ANN
Speed | Cycle time of execution is a few milliseconds | Cycle time of execution is a few nanoseconds
Processing | Massively parallel; slow but superior to ANN | Massively parallel; fast but inferior to BNN
Size & complexity | 10^11 neurons and 10^15 interconnections; the size and complexity of a BNN is greater than that of an ANN | The size and complexity of an ANN depend on the chosen application and the network designer
Learning | Can tolerate ambiguity | Very precise, structured and formatted data is required to tolerate ambiguity
Fault tolerance | Performance degrades with even partial damage | Capable of robust performance, hence has the potential to be fault tolerant
Storage capacity | Stores information in its interconnections or in synapses; no loss of memory | Stores information in continuous memory locations; loss of memory may happen sometimes
Control mechanism | There is no such control unit for monitoring in the brain | Very simple as compared to BNN
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
ANN possesses the following characteristics:
• It is a neurally implemented mathematical model
• There exist a large number of highly interconnected processing elements called
neurons in an ANN.
• The interconnections with their weighted linkages hold the informative
knowledge.
• The input signals arrive at the processing elements through connections and
connecting weights.
• The processing elements of the ANN have the ability to learn, recall and
generalize from the given data by suitable assignment or adjustment of weights.
• The computational power can be demonstrated only by the collective behavior of
neurons; it should be noted that no single neuron carries specific information.
These characteristics make ANNs connectionist models, parallel distributed processing
models, self-organizing systems, neuro-computing systems and neuro-morphic systems.
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
Evolution of Neural Networks
1943 McCulloch-Pitts neurons
1949 Hebb’s law
1958 Perceptron (Rosenblatt)
1960 Adaline, better learning rule (Widrow, Hoff)
1969 Limitations (Minsky, Papert)
1972 Kohonen nets, associative memory
1977 Brain State in a Box (Anderson)
1982 Hopfield net, constraint satisfaction
1985 ART (Carpenter, Grossberg)
1986 Backpropagation (Rumelhart, Hinton, Williams)
1988 Neocognitron, character recognition (Fukushima)
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
Important terminologies of ANNs
• Weights
• Bias
• Threshold
• Learning rate
• Momentum factor
• Vigilance parameter
• Notations used in ANN
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
• Weights:
 Each neuron is connected to every other neuron by means of directed links
 Links are associated with weights
 Weights contain information about the input signal and are represented as a matrix
 The weight matrix is also called the connection matrix
• Bias:
The bias acts like another weight. It is included by adding a component x0=1 to the input vector X:
X=(1,X1,X2…Xi,…Xn)
Bias is of two types – positive bias (increases the net input) and negative bias (decreases the net input).
• The relationship between input and output is given by the equation of a straight line, y=mx+C, where C is the bias.
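The trick of folding the bias into the weight vector via x0=1 can be sketched as follows (illustrative Python; the function name and values are ours, not from the slides):

```python
def net_input_with_bias(x, w, bias):
    """Net input where the bias is treated as an extra weight on x0 = 1."""
    x_aug = [1] + list(x)      # prepend the fixed component x0 = 1
    w_aug = [bias] + list(w)   # the bias becomes just another weight
    return sum(xi * wi for xi, wi in zip(x_aug, w_aug))

# y = mx + C with m = 2, C = 3 at x = 4:
print(net_input_with_bias([4], [2], 3))  # 2*4 + 3 = 11
```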
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
• Threshold:
 A set value based upon which the final output of the network is calculated
 Used in the activation function
 The activation function using the threshold can be defined as
f(net) = 1 if net >= θ
f(net) = -1 if net < θ
• Learning rate:
 Denoted by α
 Used to control the amount of weight adjustment at each step of training
 The learning rate, ranging from 0 to 1, determines the rate of learning at each time step
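A minimal sketch of the threshold activation together with a learning-rate-scaled weight adjustment (the update rule shown is a perceptron-style step used only to illustrate how α scales each change; names and values are illustrative, not from the slides):

```python
THETA = 0.0   # threshold (illustrative value)
ALPHA = 0.2   # learning rate, in [0, 1]

def f(net):
    """Bipolar threshold activation: +1 if net reaches the threshold, else -1."""
    return 1 if net >= THETA else -1

def update_weights(w, x, target, output):
    """Perceptron-style step: ALPHA scales the size of each adjustment."""
    return [wi + ALPHA * (target - output) * xi for wi, xi in zip(w, x)]

w = [0.0, 0.0]
x = [1, 1]
out = f(sum(wi * xi for wi, xi in zip(w, x)))    # net = 0, so the neuron fires +1
w = update_weights(w, x, target=-1, output=out)  # wrong answer, so weights move
print(out, w)
```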
BIOLOGICAL NEURON TO ARTIFICIAL NEURON
• Momentum factor:
– Added to the weight-updation process to speed up convergence.
– Used in back-propagation networks.
• Vigilance parameter:
– Denoted by ρ
– Used to control the degree of similarity required for patterns
to be assigned to the same cluster
– It ranges approximately from 0.7 to 1 to perform useful work
in controlling the number of clusters.
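One common way the momentum factor enters the weight update can be sketched as follows (the slides do not give the exact formula, so this follows the usual formulation with illustrative names and values):

```python
ALPHA = 0.1   # learning rate (illustrative)
MU = 0.9      # momentum factor (illustrative)

def momentum_step(w, grad, prev_delta):
    """New step blends the current gradient step with the previous step."""
    delta = -ALPHA * grad + MU * prev_delta
    return w + delta, delta

w, prev = 1.0, 0.0
w, prev = momentum_step(w, grad=2.0, prev_delta=prev)  # first step: delta = -0.2
w, prev = momentum_step(w, grad=2.0, prev_delta=prev)  # momentum enlarges the step
print(round(w, 2))
```

Because each step carries a fraction MU of the previous step, repeated moves in the same direction grow larger, which is what speeds up convergence.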
MCCULLOCH-PITTS PERCEPTRON MODEL
The early model of an artificial neuron was introduced by Warren McCulloch and
Walter Pitts in 1943.
It is usually called the M-P neuron. The M-P neurons are connected by directed
weighted paths. The activation of an M-P neuron is binary, that is, at any time
step the neuron may fire or may not fire. The weights associated with the
communication links may be excitatory (weight is positive) or inhibitory
(weight is negative). All the excitatory connection weights entering into a
particular neuron have the same weight.
The threshold plays a major role in the M-P neuron: there is a fixed threshold for
each neuron, and if the net input to the neuron is greater than the threshold then
the neuron fires. Any nonzero inhibitory input would prevent the neuron from
firing. The M-P neurons are most widely used in the case of logic functions.
MCCULLOCH-PITTS PERCEPTRON MODEL
McCulloch-Pitts neuron model

A simple M-P neuron is shown in Figure. As already discussed, the M-P neuron has both
excitatory and inhibitory connections. It is excitatory with weight w (w > 0) or inhibitory with
weight -p (p < 0). In Figure, inputs X1 to Xn possess excitatory weighted connections
and inputs X(n+1) to X(n+m) possess inhibitory weighted interconnections. Since the firing of
the output neuron is based upon the threshold, the activation function here is defined as

f(y_in) = 1 if y_in >= θ
f(y_in) = 0 if y_in < θ

For inhibition to be absolute, the threshold with the activation function should satisfy the
relation θ > n·w - p.
MCCULLOCH-PITTS PERCEPTRON MODEL
The output will fire if it receives, say, k excitatory inputs but no inhibitory
inputs, where
k·w >= θ > (k - 1)·w
The M-P neuron has no particular training algorithm. An analysis has to be
performed to determine the values of the weights and the threshold. Here the
weights of the neuron are set along with the threshold to make the neuron
perform a simple logic function. The M-P neurons are used as building blocks
from which we can model any function or phenomenon that can be represented
as a logic function.
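Setting the weights and threshold by analysis, as described above, can be illustrated with an M-P neuron realizing the AND function (a hypothetical sketch, not from the slides: both excitatory weights are 1 and the threshold is 2, so the neuron fires only when both inputs are 1):

```python
def mp_neuron(inputs, weights, theta):
    """M-P neuron: fire (1) when the net input reaches the fixed threshold."""
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= theta else 0

# Truth table for AND, realized with weights [1, 1] and threshold 2:
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, mp_neuron([x1, x2], [1, 1], theta=2))
# Only the input pair (1, 1) produces output 1
```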
Features of the McCulloch-Pitts model
• Allows binary 0/1 states only
• Operates under a discrete-time assumption
• Weights and the neurons’ thresholds are fixed in the model, and there is no
interaction among network neurons
• Just a primitive model
