EELU ANN ITF309 Lecture 11 Spring 2024
• Unsupervised Learning
• Clustering
[Figure: a 2-D array of neurons; each neuron j receives the input through a weight vector (w_j1, w_j2, w_j3, …, w_jn).]

For each neuron j, compute the squared distance to the input:
d_j = Σ_i (w_ij - x_i)²
4. Determine the winner: find the unit j with the minimum distance.
5. Update the winner so that it becomes more like x, together with the winner's neighbours, for units within the radius, according to:
   w_ij(n+1) = w_ij(n) + η(n)[x_i - w_ij(n)]
6. Adjust parameters: the learning rate and the 'neighbourhood function'.
7. Repeat from (2) until … ?
Note that the learning rate generally decreases with time: 0 < η(n+1) ≤ η(n) < 1.
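The steps above can be sketched as a short training loop. This is a minimal illustration, not the lecture's reference implementation: the grid size, iteration count, and exponential decay constants are illustrative assumptions.

```python
import numpy as np

def train_som(data, grid=(10, 10), n_iters=1000, eta0=0.5, sigma0=5.0):
    """Minimal SOM training loop following steps 2-7 above."""
    rng = np.random.default_rng(0)
    h, w = grid
    dim = data.shape[1]
    weights = rng.random((h, w, dim))                # random initial weights
    coords = np.stack(np.mgrid[0:h, 0:w], axis=-1)   # lattice coordinates of each unit
    for n in range(n_iters):
        x = data[rng.integers(len(data))]            # 2. pick an input vector x
        d = ((weights - x) ** 2).sum(axis=-1)        # 3. squared distances d_j
        bmu = np.unravel_index(np.argmin(d), d.shape)  # 4. winner (best-matching unit)
        eta = eta0 * np.exp(-n / n_iters)            # 6. decaying learning rate
        sigma = sigma0 * np.exp(-n / n_iters)        # 6. shrinking neighbourhood radius
        lat = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
        theta = np.exp(-lat / (2 * sigma ** 2))      # Gaussian neighbourhood function
        # 5. move the winner and its neighbours towards x
        weights += eta * theta[..., None] * (x - weights)
    return weights
```

Because every update moves a weight a fraction of the way towards a data point, weights stay inside the data's bounding box once the data lie in [0, 1].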
Neighborhood function
The neighborhood function Θ(u, v, s) depends on the lattice distance between the BMU (neuron u) and neuron v. In the simplest form it is 1 for all neurons close enough to the BMU and 0 for the others, but a Gaussian function is a common choice too. Regardless of the functional form, the neighborhood function shrinks with time.
The neighbourhood radius shrinks exponentially with time:
σ(n) = σ₀ exp(-n / T₁)

[Figure: the neighbourhood function plotted against distance from the winner at two different times; as time increases, the curve narrows around the winner.]

The learning rate decays on a similar schedule:
η(n) = η₀ exp(-n / T₂)
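These decay schedules can be evaluated directly. The initial values σ₀ = 10, η₀ = 0.5 and the time constants T₁ = T₂ = 1000 below are illustrative assumptions, not values from the slides.

```python
import numpy as np

def sigma(n, sigma0=10.0, T1=1000.0):
    """Neighbourhood radius: sigma(n) = sigma0 * exp(-n / T1)."""
    return sigma0 * np.exp(-n / T1)

def eta(n, eta0=0.5, T2=1000.0):
    """Learning rate: eta(n) = eta0 * exp(-n / T2)."""
    return eta0 * np.exp(-n / T2)

def theta(lattice_dist, n):
    """Gaussian neighbourhood: 1 at the winner, falling off with lattice distance."""
    return np.exp(-lattice_dist ** 2 / (2 * sigma(n) ** 2))

# Both schedules shrink monotonically with time n:
for n in (0, 500, 1000):
    print(n, round(float(eta(n)), 4), round(float(sigma(n)), 4))
```

Note that Θ is exactly 1 at the winner itself (lattice distance 0) and narrows as σ(n) decays, matching the figure.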
The initial radius is 0 and the learning rate is 0.5. Calculate the weight changes during the first cycle through the data, taking the training vectors in the given order.
Solution
Input vector 2 is closest to cluster unit 1, so update the weights to cluster unit 1 again:
Training vectors:
i1: (1, 1, 0, 0)
i2: (0, 0, 0, 1)
i3: (1, 0, 0, 0)
i4: (0, 0, 1, 1)
Output units: 1 and 2
• With only 2 outputs, neighborhood = 0
– Only update weights associated with the winning output unit (cluster) at each iteration
• Learning rate:
η(t) = 0.6, 1 ≤ t ≤ 4
η(t) = 0.5 η(1), 5 ≤ t ≤ 8
η(t) = 0.5 η(5), 9 ≤ t ≤ 12
etc.
• Initial weight matrix (random values between 0 and 1):
Unit 1: .2 .6 .5 .9
Unit 2: .8 .4 .7 .3
• Distance: d² = (Euclidean distance)² = Σ_{k=1..n} (i_{l,k} - w_{j,k}(t))²
• Weight update: w_j(t+1) = w_j(t) + η(t)(i_l - w_j(t))
Problem: Calculate the weight updates for the first four steps.
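Under this setup (neighbourhood 0, η = 0.6), the distance and update rules can be written as two small helper functions; a sketch, using the example's own vectors:

```python
import numpy as np

def sq_dist(x, w):
    """Squared Euclidean distance: d^2 = sum_k (x_k - w_k)^2."""
    return float(((np.asarray(x, dtype=float) - np.asarray(w, dtype=float)) ** 2).sum())

def update(w, x, eta=0.6):
    """Winner update: w(t+1) = w(t) + eta * (x - w(t))."""
    w = np.asarray(w, dtype=float)
    return w + eta * (np.asarray(x, dtype=float) - w)

# First step of the worked example: sample i1 against the initial weights
i1 = [1, 1, 0, 0]
unit1 = [0.2, 0.6, 0.5, 0.9]
unit2 = [0.8, 0.4, 0.7, 0.3]
print(sq_dist(i1, unit1), sq_dist(i1, unit2))  # approx. 1.86 and 0.98; unit 2 wins
print(update(unit2, i1))                       # approx. [.92 .76 .28 .12]
```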
First Weight Update
• Training sample: i1 = (1, 1, 0, 0)
• Current weights: Unit 1: .2 .6 .5 .9; Unit 2: .8 .4 .7 .3
– Unit 1 weights
• d² = (.2-1)² + (.6-1)² + (.5-0)² + (.9-0)² = 1.86
– Unit 2 weights
• d² = (.8-1)² + (.4-1)² + (.7-0)² + (.3-0)² = .98
– Unit 2 wins
– Weights on the winning unit are updated:
new unit 2 weights = [.8 .4 .7 .3] + 0.6([1 1 0 0] - [.8 .4 .7 .3]) = [.92 .76 .28 .12]
– Giving an updated weight matrix:
Unit 1: .2 .6 .5 .9
Unit 2: .92 .76 .28 .12
Second Weight Update
• Training sample: i2 = (0, 0, 0, 1)
• Current weights: Unit 1: .2 .6 .5 .9; Unit 2: .92 .76 .28 .12
– Unit 1 weights
• d² = (.2-0)² + (.6-0)² + (.5-0)² + (.9-1)² = .66
– Unit 2 weights
• d² = (.92-0)² + (.76-0)² + (.28-0)² + (.12-1)² = 2.28
– Unit 1 wins
– Weights on the winning unit are updated:
new unit 1 weights = [.2 .6 .5 .9] + 0.6([0 0 0 1] - [.2 .6 .5 .9]) = [.08 .24 .20 .96]
– Giving an updated weight matrix:
Unit 1: .08 .24 .20 .96
Unit 2: .92 .76 .28 .12
Third Weight Update
• Training sample: i3 = (1, 0, 0, 0)
• Current weights: Unit 1: .08 .24 .20 .96; Unit 2: .92 .76 .28 .12
– Unit 1 weights
• d² = (.08-1)² + (.24-0)² + (.20-0)² + (.96-0)² = 1.87
– Unit 2 weights
• d² = (.92-1)² + (.76-0)² + (.28-0)² + (.12-0)² = 0.68
– Unit 2 wins
– Weights on the winning unit are updated:
new unit 2 weights = [.92 .76 .28 .12] + 0.6([1 0 0 0] - [.92 .76 .28 .12]) = [.97 .30 .11 .05]
– Giving an updated weight matrix:
Unit 1: .08 .24 .20 .96
Unit 2: .97 .30 .11 .05
Fourth Weight Update
• Training sample: i4 = (0, 0, 1, 1)
• Current weights: Unit 1: .08 .24 .20 .96; Unit 2: .97 .30 .11 .05
– Unit 1 weights
• d² = (.08-0)² + (.24-0)² + (.20-1)² + (.96-1)² = .71
– Unit 2 weights
• d² = (.97-0)² + (.30-0)² + (.11-1)² + (.05-1)² = 2.73
– Unit 1 wins
– Weights on the winning unit are updated:
new unit 1 weights = [.08 .24 .20 .96] + 0.6([0 0 1 1] - [.08 .24 .20 .96]) = [.03 .10 .68 .98]
– Giving an updated weight matrix:
Unit 1: .03 .10 .68 .98
Unit 2: .97 .30 .11 .05
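The four hand-computed steps can be replayed in one short loop to confirm the final weight matrix. This sketch uses the training vectors, initial weights, and η = 0.6 from the example:

```python
import numpy as np

samples = np.array([[1, 1, 0, 0],    # i1
                    [0, 0, 0, 1],    # i2
                    [1, 0, 0, 0],    # i3
                    [0, 0, 1, 1]],   # i4
                   dtype=float)
weights = np.array([[0.2, 0.6, 0.5, 0.9],    # unit 1
                    [0.8, 0.4, 0.7, 0.3]])   # unit 2
eta = 0.6

for x in samples:
    d2 = ((weights - x) ** 2).sum(axis=1)     # squared distance to each unit
    win = int(np.argmin(d2))                  # winning (closer) unit
    weights[win] += eta * (x - weights[win])  # update only the winner
    print(f"winner: unit {win + 1}, weights now:\n{weights.round(2)}")
```

Rounded to two decimals, the loop reproduces the winners (unit 2, 1, 2, 1) and the final matrix shown above.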
Applying the SOM Algorithm
Data sample utilized; distance used: d² = Σ_{k=1..n} (i_{l,k} - w_{j,k}(t))²
• Sample: i1
– Distance from unit 1 weights
• (1-0)² + (1-0)² + (0-.5)² + (0-1.0)² = 1 + 1 + .25 + 1 = 3.25
– Distance from unit 2 weights
• (1-1)² + (1-.5)² + (0-0)² + (0-0)² = 0 + .25 + 0 + 0 = .25 (winner)
• Sample: i2
– Distance from unit 1 weights
• (0-0)² + (0-0)² + (0-.5)² + (1-1.0)² = 0 + 0 + .25 + 0 = .25 (winner)
– Distance from unit 2 weights
• (0-1)² + (0-.5)² + (0-0)² + (1-0)² = 1 + .25 + 0 + 1 = 2.25
• Sample: i3
– Distance from unit 1 weights
• (1-0)² + (0-0)² + (0-.5)² + (0-1.0)² = 1 + 0 + .25 + 1 = 2.25
– Distance from unit 2 weights
• (1-1)² + (0-.5)² + (0-0)² + (0-0)² = 0 + .25 + 0 + 0 = .25 (winner)
• Sample: i4
– Distance from unit 1 weights
• (0-0)² + (0-0)² + (1-.5)² + (1-1.0)² = 0 + 0 + .25 + 0 = .25 (winner)
– Distance from unit 2 weights
• (0-1)² + (0-.5)² + (1-0)² + (1-0)² = 1 + .25 + 1 + 1 = 3.25
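This winner-finding pass (unit 1 weights (0, 0, .5, 1.0), unit 2 weights (1, .5, 0, 0), taken from the distances above) can be checked in a few lines:

```python
import numpy as np

samples = np.array([[1, 1, 0, 0],    # i1
                    [0, 0, 0, 1],    # i2
                    [1, 0, 0, 0],    # i3
                    [0, 0, 1, 1]],   # i4
                   dtype=float)
weights = np.array([[0.0, 0.0, 0.5, 1.0],    # unit 1
                    [1.0, 0.5, 0.0, 0.0]])   # unit 2

for i, x in enumerate(samples, start=1):
    d2 = ((weights - x) ** 2).sum(axis=1)    # squared distance to each unit
    print(f"i{i}: d2 = {d2}, winner = unit {int(np.argmin(d2)) + 1}")
```

The samples split cleanly: i2 and i4 map to unit 1, i1 and i3 to unit 2, matching the hand computation.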