
INTRODUCTION TO

NEURAL NETWORKS

Fundamentals of Neural Networks, Instr. Kamil Yurtkan
 Block diagram representation of the nervous system:
 Neurons encode their outputs as a series of brief electrical pulses.
 These pulses are commonly known as action potentials or spikes (firing).

 Neurons receive input (electric) signals from other neurons through dendrites, integrate (sum) them, and generate their own signal, which travels along the axon.
 The axon, in turn, makes contact with the dendrites of other neurons; thus, the output signal of one neuron becomes input to other neurons (parallelism).
 The points of electric contact between neurons are called synapses.
 Cytoarchitectural map of the cerebral cortex. The different areas are
identified by the thickness of their layers and types of cells within
them. Some of the key sensory areas are as follows: Motor cortex:
motor strip, area 4; premotor area, area 6; frontal eye fields, area 8.
Somatosensory cortex: areas 3, 1, and 2. Visual cortex: areas 17, 18,
and 19. Auditory cortex: areas 41 and 42. (From A. Brodal, 1981; with
permission of Oxford University Press.)

 The human brain is the accepted model for artificial neural networks.
 When a child is born, the number of neurons is in general fixed; during the lifetime, the connections between the neurons change and develop.
 Neural networks are mathematical systems adapted from the nervous system of the brain.

 Nowadays, there are many applications using artificial neural networks for problem solving.
 Neural networks are an active area of scientific research, with journals dedicated to the field.
 Computer Engineering, Electrical and Electronic Engineering, Industrial Engineering, and other engineering research fields are heavily involved with the applications.

 Journals:
 Neural Networks
 Neural Computation
 IEEE Transactions on Neural Networks
 Network: Computation in Neural Systems
 Neurocomputing
 Connection Science
 International Journal of Neural Systems
 Neural Processing Letters
 Neural Network Review
 Journal of Neural Network Applications
 Artificial Life
 Neurolinguistics
 Cognitive Brain Research
 Neural Computing and Applications
 Journal of Computational Neuroscience
 The main objective is to construct systems which can demonstrate human-like performance in information processing.
 Are computers capable of showing such performance?
 All computers are designed in accordance with Turing Machines (TM).
 Turing Machines can simulate all decidable problems.
 Developing strict algorithms for problem solving is very critical for a computer system (e.g., sorting several numbers).
 Computers are powerful, but we have to program them. They do not learn by themselves and are not adaptive. NO INTELLIGENCE.
 Classical Artificial Intelligence does not bring intelligence; we still need to program the systems.
 On the other hand, the human brain is not programmed, even though some genetic code is inherited.
 We learn by experience and make decisions accordingly (e.g., a child burned while playing with fire won't try it again).
 Neural Networks is a multi-disciplinary field of science:
 Neurobiology
 Psychology
 Maths
 Statistics
 Physics
 Computer Science
 Information Theory
 Electrical Engineering

 Formal Neuron:
 Inputs x0, x1, x2, ..., xn, each multiplied by its weight w0, w1, w2, ..., wn, feed a summation unit.
 The weighted sum y = Σ wixi is passed through an activation (threshold) function:
 output = 1 if y > 0, -1 otherwise.
 Basic unit in a neural network
 If y = +1 we say that the neuron is firing
 A formal neuron with a learning algorithm is called a Perceptron.
 Linear separator
 Parts
 n inputs, x1 ... xn (plus the fixed bias input x0)
 Weights for each input, w0, w1 ... wn
 Weighted sum of inputs, y = w0x0 + w1x1 + ... + wnxn
 A threshold function, i.e., 1 if y > 0, -1 if y ≤ 0
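The formal neuron above can be sketched in a few lines of Python. This is a minimal illustration; the input and weight values in the example are arbitrary assumptions, not from the slides.

```python
# Formal neuron: weighted sum of inputs followed by a hard threshold.
def neuron(inputs, weights):
    # Weighted sum y = w0*x0 + w1*x1 + ... + wn*xn
    y = sum(w * x for w, x in zip(weights, inputs))
    # Threshold function: fire (+1) if y > 0, otherwise -1
    return 1 if y > 0 else -1

# Example with arbitrary weights; x0 is the fixed bias input (+1).
print(neuron([1, 0.5, -0.2], [0.1, 0.8, 0.4]))  # weighted sum 0.42 > 0, so +1
```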

 The Activation functions:
 Hard limiter (non-linear) (Sign or Step function)
 Linear function
 Threshold logic
 Sigmoid function (Logistic, Hyperbolic Tangent)
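The common activation functions listed above can be written directly; a sketch using only the standard library:

```python
import math

# Hard limiter (sign/step): the non-linear threshold used by the formal neuron.
def hard_limiter(y):
    return 1 if y > 0 else -1

# Linear function: passes the weighted sum through unchanged.
def linear(y):
    return y

# Logistic sigmoid: smooth, outputs in (0, 1).
def sigmoid(y):
    return 1.0 / (1.0 + math.exp(-y))

# Hyperbolic tangent: smooth, outputs in (-1, 1).
def tanh(y):
    return math.tanh(y)
```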

 More activation functions:

 Perceptrons are linear separators
 A perceptron divides N-dimensional input space into two regions with a hyperplane
 Inputs are flexible
 any real values
 highly correlated or independent
 Target function may be discrete-valued, real-valued, or vectors of discrete or real values
 Resistant to errors in the training data
 Long training time
 Fast evaluation
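Linear separation can be made concrete with a small sketch: a perceptron with fixed weights splits the plane along the line w0 + w1·x1 + w2·x2 = 0. The weight values here are assumptions chosen for illustration.

```python
# A perceptron with weights (w0, w1, w2) = (-1, 1, 1) separates the plane
# along the line x1 + x2 = 1: points above it map to +1, below to -1.
def classify(x1, x2, w=(-1.0, 1.0, 1.0)):
    y = w[0] * 1 + w[1] * x1 + w[2] * x2  # x0 = 1 is the bias input
    return 1 if y > 0 else -1

print(classify(1, 1))  # x1 + x2 = 2 > 1, so +1
print(classify(0, 0))  # x1 + x2 = 0 < 1, so -1
```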

 Linear separation:
 (a) Linearly Separable pattern
 (b) Non-linearly Separable pattern

 Consider the 2-dimensional case: in the first pattern a single line separates the + points from the − points, but in the XOR pattern the + and − points lie on opposite diagonals and no single line can separate them.
 The XOR problem cannot be solved by a single neuron
 The following neural networks can implement it:
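One such two-layer network can be sketched in Python using a standard OR/NAND/AND construction (one of the infinitely many possible solutions; the exact weights are an assumption, not necessarily those shown in the slides). Inputs are encoded as +1/-1.

```python
def step(y):
    # Hard limiter: +1 if y > 0, otherwise -1.
    return 1 if y > 0 else -1

# Two-layer network for XOR on +/-1 inputs:
# the hidden layer computes OR and NAND, the output neuron ANDs them.
def xor_net(x1, x2):
    h1 = step(x1 + x2 + 1)    # OR neuron (bias weight +1)
    h2 = step(-x1 - x2 + 1)   # NAND neuron (bias weight +1)
    return step(h1 + h2 - 1)  # AND neuron (bias weight -1)

for x1, x2 in [(1, 1), (1, -1), (-1, 1), (-1, -1)]:
    print(x1, x2, xor_net(x1, x2))  # outputs -1, +1, +1, -1
```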

 The solutions shown are not the only possible solutions to the
XOR problem
 There are, in fact, infinitely many possible solutions
 In these examples the relationships between the thresholds,
weights, inputs and outputs can be analyzed in detail
 But in neural networks (both computer and biological) with
large numbers of inputs, outputs and hidden neurodes
(neurons), the task of determining weights and threshold
values required to achieve desired outputs from given inputs
becomes practically impossible.
 Computer models therefore attempt to train networks to
adjust their weights to give desired outputs from given inputs.

 AND: a single neuron with inputs x0, x1, weights w0 = 3, w1 = 4 and threshold Th = 2 computes Y = x0 AND x1:

X0  X1 | Y
+1  +1 | +1
+1  -1 | -1
-1  +1 | -1
-1  -1 | -1

 OR: a neuron with weights w0 = 3, w1 = 5 and threshold Th = -6 computes Y = x0 OR x1:

X0  X1 | Y
+1  +1 | +1
+1  -1 | +1
-1  +1 | +1
-1  -1 | -1
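The weight and threshold choices above can be checked directly, a small sketch (the neuron outputs +1 when the weighted sum exceeds the threshold):

```python
def fire(inputs, weights, threshold):
    # Output +1 if the weighted sum exceeds the threshold, else -1.
    y = sum(w * x for w, x in zip(weights, inputs))
    return 1 if y > threshold else -1

# Verify the AND neuron (w0=3, w1=4, Th=2) and OR neuron (w0=3, w1=5, Th=-6).
for x0 in (1, -1):
    for x1 in (1, -1):
        and_out = fire([x0, x1], [3, 4], 2)
        or_out = fire([x0, x1], [3, 5], -6)
        print(x0, x1, and_out, or_out)
```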

 IDENTITY: a single input x0 with weight w0 = 5 and threshold Th = 0 computes Y = X (identity mapping): X = -1 gives Y = -1, X = +1 gives Y = +1.
 NEGATION: weight w0 = -5 and threshold Th = 0 computes Y = NOT X: X = -1 gives Y = +1, X = +1 gives Y = -1.

 Learning Algorithm steps:
1. Initialise weights and threshold.
Set wi(t), (1 ≤ i ≤ m), to be the weight i at time t, and ø to be the threshold value in the output node.
Set each wi to a small random value, thus initialising the weights and threshold.
2. Present input and desired output.
Present inputs x0, x1, x2, ..., xm and the desired output d(t).
3. Calculate the actual output.
y(t) = f[w0(t)x0(t) + w1(t)x1(t) + w2(t)x2(t) + ... + wm(t)xm(t)]
4. Adapt weights.
wi(t + 1) = wi(t) + µ[d(t) − y(t)]xi(t), for each weight i.
Steps 3 and 4 are repeated until the iteration error is less than a user-specified error threshold or a predetermined number of iterations has been completed.
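The four steps above can be sketched as a training loop. This is a minimal illustration; the learning rate, epoch limit, and the AND training set are assumptions, and the threshold is folded into the bias weight w0.

```python
import random

def train_perceptron(samples, m, mu=0.1, max_iters=100):
    """Train a perceptron on (input vector, desired output) pairs.
    Each input vector includes the fixed bias input x0 = +1."""
    # Step 1: initialise weights to small random values.
    w = [random.uniform(-0.5, 0.5) for _ in range(m + 1)]
    for _ in range(max_iters):
        errors = 0
        for x, d in samples:
            # Step 2 is the presentation of (x, d); step 3: actual output
            # through the hard limiter.
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else -1
            # Step 4: adapt weights, wi <- wi + mu*(d - y)*xi.
            if y != d:
                errors += 1
                w = [wi + mu * (d - y) * xi for wi, xi in zip(w, x)]
        if errors == 0:  # stop once the iteration error reaches zero
            break
    return w

# Learn AND on +/-1 inputs (bias x0 = +1 prepended to each sample).
data = [([1, 1, 1], 1), ([1, 1, -1], -1), ([1, -1, 1], -1), ([1, -1, -1], -1)]
w = train_perceptron(data, m=2)
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop finds a separating weight vector in a finite number of updates.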

 µ is the learning rate
 d(t) is the desired output
 y(t) is the actual output
 Error = d(t) − y(t)
 µ is between 0 and 1.

 Some perceptron boundaries:

