07 ANN Architecture
Debasis Samanta
IIT Kharagpur
dsamanta@iitkgp.ac.in
31.01.2023
For the first three patterns output is 0 and for the last pattern
output is 1.
x1 x2 | y
 0  0 | 0
 0  1 | 0
 1  0 | 0
 1  1 | 1
[Figure: a single neuron — inputs x1 and x2 with weights w1 and w2, summed and thresholded to produce the output Y.]
Here, y = Σi wi xi − θ, and w1 = 0.5, w2 = 0.5 and θ = 0.9.
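This computation can be sketched in Python with the weights and threshold given above (firing when the weighted sum reaches θ is an assumption, since the slide does not state the exact firing condition):

```python
def and_neuron(x1, x2, w1=0.5, w2=0.5, theta=0.9):
    """Threshold neuron: fires (1) when the weighted sum reaches theta."""
    s = w1 * x1 + w2 * x2 - theta
    return 1 if s >= 0 else 0

# reproduces the AND truth table above
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, and_neuron(x1, x2))
```

Only the pattern (1, 1) gives a weighted sum (1.0) that reaches the threshold 0.9, so it alone produces output 1.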
Debasis Samanta (IIT Kharagpur) Soft Computing Applications 31.01.2023 5 / 27
Single layer feed forward neural network
The concept of the AND problem and its solution with a single
neuron can be extended to multiple neurons.
[Figure: single layer feed forward network — inputs x1, x2, x3, …, xm connected through weights w11, w12, w13, …, w1n to output neurons f1, f2, f3, …, fn with thresholds θ, producing outputs o1, o2, o3, …, on. Columns labelled INPUT and OUTPUT.]
Note that although the input layer and output layer, which receive input signals and transmit output signals, are called layers, they are actually the boundaries of the architecture and hence not truly layers.
The only layer in the architecture is the set of synaptic links carrying the weights, which connects every input to the output neurons.
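A minimal sketch of such a single layer computation (the sigmoid activation, weight matrix, and thresholds below are illustrative assumptions, not values from the slide):

```python
import math

def single_layer_forward(x, W, theta):
    """One output per neuron: o_j = f(sum_i W[j][i] * x[i] - theta[j]),
    with a sigmoid activation f."""
    outputs = []
    for w_row, t in zip(W, theta):
        s = sum(w * xi for w, xi in zip(w_row, x)) - t
        outputs.append(1.0 / (1.0 + math.exp(-s)))
    return outputs

# 3 inputs feeding 2 output neurons
print(single_layer_forward([1.0, 0.0, 1.0],
                           [[0.2, 0.4, 0.6], [0.5, 0.5, 0.5]],
                           [0.1, 0.3]))
```

Each output neuron sees every input, which is exactly the single weight layer described above.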
[Figure: multilayer feed forward network — inputs x1, x2, …, xp feed a hidden layer of neurons f11, f12, …, f1l, which feeds further layers f21, …, f2m and f31, …, f3n producing outputs o1, o2, …, on; each neuron has a threshold θ. Columns labelled INPUT, HIDDEN, and OUTPUT.]
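The multilayer computation can be sketched as a layer by layer forward pass (all weights, thresholds, and the step activation here are illustrative assumptions):

```python
def step(s):
    return 1 if s >= 0 else 0

def mlp_forward(x, W_hidden, theta_hidden, W_out, theta_out):
    """Forward pass: input -> hidden layer -> output layer."""
    h = [step(sum(w * xi for w, xi in zip(row, x)) - t)
         for row, t in zip(W_hidden, theta_hidden)]
    return [step(sum(w * hi for w, hi in zip(row, h)) - t)
            for row, t in zip(W_out, theta_out)]

# 3 inputs, 2 hidden neurons, 1 output neuron
print(mlp_forward([1, 0, 1],
                  [[0.5, -0.2, 0.3], [0.1, 0.4, -0.6]], [0.2, 0.0],
                  [[0.7, 0.7]], [0.5]))
```

The hidden layer's outputs become the inputs of the output layer, which is what distinguishes this architecture from the single layer case.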
Thus, in these networks, there can exist at least one layer with feedback connections.
There can also be neurons with self-feedback links, that is, the output of a neuron is fed back to itself as input.
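A neuron with a self-feedback link can be sketched as follows (the weights, threshold, and number of time steps are illustrative assumptions):

```python
def recurrent_step(x, y_prev, w, w_fb, theta):
    """One update of a neuron whose previous output is fed back as an input."""
    s = sum(wi * xi for wi, xi in zip(w, x)) + w_fb * y_prev - theta
    return 1 if s >= 0 else 0

# iterate a few time steps on a constant input; the feedback term
# w_fb * y_prev carries the neuron's own history into the next step
y = 0
for t in range(4):
    y = recurrent_step([1, 0], y, [0.6, 0.4], 0.5, 0.5)
    print(t, y)
```

Unlike a feed forward pass, the output here depends on time: the same input can produce different outputs depending on the previous state.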
[Figure: a neuron with inputs x1 and x2, weights w1 and w2, and bias w0 (b0) feeding a summing unit f; a feedback link carries the output back as an input.]
x1 x2 | y
 0  0 | 0
 0  1 | 0
 1  0 | 0
 1  1 | 1

[Plot: in the x1–x2 plane the points (0,0), (0,1) and (1,0) have y = 0 and (1,1) has y = 1; the line f = 0.5 x1 + 0.5 x2 − 0.9 = 0 separates the two classes.]
x1 x2 | Output (y)
 0  0 | 0
 0  1 | 1
 1  0 | 1
 1  1 | 0

[Plot: (0,1) and (1,0) have y = 1 while (0,0) and (1,1) have y = 0; no single straight line in the x1–x2 plane separates the y = 1 points from the y = 0 points.]
[Figure: a two layer network for XOR — inputs X1 and X2 with unit weights, intermediate thresholds 0.5 and 1.5, and a −1 weight into the output neuron f.]
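One common two layer construction consistent with the thresholds in the figure (0.5 and 1.5) computes XOR as OR minus AND; the exact wiring below is a reconstruction, not necessarily the slide's:

```python
def step(s):
    return 1 if s >= 0 else 0

def xor_net(x1, x2):
    """Two layer threshold network for XOR."""
    h_or = step(x1 + x2 - 0.5)    # fires for every input except (0, 0)
    h_and = step(x1 + x2 - 1.5)   # fires only for (1, 1)
    return step(h_or - h_and - 0.5)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, xor_net(x1, x2))
```

The hidden layer is what makes this possible: a single threshold neuron cannot represent XOR, since the problem is not linearly separable.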
[Figure: learning loop — INPUTS feed the NEURAL NETWORK ARCHITECTURE; its OUTPUT is compared with the TARGET in an ERROR CALCULATION step, and the error is fed back to adjust the weights / architecture.]
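The feedback loop in the figure can be sketched with a perceptron style update (the learning rate, initial weights, and the specific update rule are assumptions; the slide only names the blocks):

```python
def step(s):
    return 1 if s >= 0 else 0

def train(samples, w, theta, lr=0.1, epochs=20):
    """Loop of the figure: compute the output, compare it with the target,
    and feed the error back to adjust the weights and threshold."""
    for _ in range(epochs):
        for x, target in samples:
            y = step(sum(wi * xi for wi, xi in zip(w, x)) - theta)  # output
            err = target - y                                        # error calculation
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]        # adjust weights
            theta -= lr * err
    return w, theta

# learn the AND function from its truth table
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, theta = train(data, [0.0, 0.0], 0.0)
print(w, theta)
```

After training, the learned weights and threshold reproduce the AND truth table, closing the loop between error calculation and weight adjustment.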