
LVQ

· Learning vector quantization (LVQ) is a pattern classification method in which each output unit represents a particular class or category.
· The weight vector for an output unit is often referred to as a reference (or codebook) vector for the class that the unit represents.
· During training, the output units are positioned to approximate the decision surfaces of the theoretical Bayes classifier.
· After training, an LVQ net classifies an input vector by assigning it to the same class as the output unit that has its weight vector (reference vector) closest to the input vector (see the sketch below).
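
The classification rule in the last bullet is simply a nearest-neighbour search over the reference vectors. A minimal sketch in Python/NumPy (the function and variable names here are illustrative, not from the text):

    import numpy as np

    def lvq_classify(x, weights, classes):
        # Each row of `weights` is the reference (codebook) vector of one
        # output unit; classes[j] is the class label that unit j represents.
        distances = np.linalg.norm(weights - x, axis=1)   # Euclidean distances
        return classes[int(np.argmin(distances))]         # class of the closest unit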
Architecture
The architecture of an LVQ neural net is essentially the same as that of a
Kohonen self-organizing map (without a topological structure being assumed
for the output units).

Figure 4.27 Learning vector quantization neural net.


Algorithm
· The motivation for the algorithm for the LVQ net is to find the output unit that is closest to the input vector.
· Toward that end, if x and wJ belong to the same class, then we move the weights toward the new input vector; if x and wJ belong to different classes, then we move the weights away from this input vector.
x           training vector (x1, ..., xi, ..., xn).
T           correct category or class for the training vector.
wj          weight vector for the jth output unit (w1j, ..., wij, ..., wnj).
Cj          category or class represented by the jth output unit.
||x - wj||  Euclidean distance between the input vector and (the weight vector for) the jth output unit.
Algorithm
Step 0. Initialize reference vectors (several strategies are discussed shortly);
        initialize learning rate, α(0).
Step 1. While stopping condition is false, do Steps 2-6.
Step 2. For each training input vector x, do Steps 3-4.
Step 3. Find J so that ||x - wJ|| is a minimum.
Step 4. Update wJ as follows:
        if T = CJ, then
            wJ(new) = wJ(old) + α[x - wJ(old)];
        if T ≠ CJ, then
            wJ(new) = wJ(old) - α[x - wJ(old)].

Step 5. Reduce learning rate.


Step 6. Test stopping condition:
The condition may specify a fixed number of iterations
(i.e., executions of Step 1) or the learning rate reaching a
sufficiently small value.
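
As a concrete reading of Steps 0-6, here is a minimal training-loop sketch in Python/NumPy. It assumes the reference vectors and their class labels have already been initialized (for example, from the first m training vectors, as discussed under Application); the function name, the multiplicative learning-rate decay, and the fixed-epoch stopping rule are illustrative choices, since the algorithm leaves the schedule and stopping condition open.

    import numpy as np

    def train_lvq(X, T, weights, classes, alpha=0.1, epochs=10, decay=0.5):
        # X: training vectors (one per row); T: their correct classes.
        # weights: initial reference vectors; classes[j]: class of output unit j.
        weights = weights.astype(float).copy()
        for _ in range(epochs):                                          # Step 1
            for x, t in zip(X, T):                                       # Step 2
                J = int(np.argmin(np.linalg.norm(weights - x, axis=1)))  # Step 3
                if t == classes[J]:                                      # Step 4: same class,
                    weights[J] += alpha * (x - weights[J])               #   move toward x
                else:                                                    # different class,
                    weights[J] -= alpha * (x - weights[J])               #   move away from x
            alpha *= decay                                               # Step 5
        return weights                                                   # Step 6: stop after a
                                                                         # fixed number of epochs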
Application
· The simplest method of initializing the weight (reference) vectors is to take the first m training vectors and use them as weight vectors; the remaining vectors are then used for training (Example 4.11). A brief sketch of this split follows this list.
· Another simple method is to assign the initial weights and classifications randomly (Example 4.12).
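
The first strategy amounts to splitting the data: the first m vectors become the reference vectors (and fix the class labels of the output units), while the rest are fed to the training loop. A brief NumPy sketch, using the five vectors of Example 4.11 below (the variable names are illustrative):

    import numpy as np

    X = np.array([[1, 1, 0, 0],
                  [0, 0, 0, 1],
                  [0, 0, 1, 1],
                  [1, 0, 0, 0],
                  [0, 1, 1, 0]], dtype=float)
    labels = np.array([1, 2, 2, 1, 2])

    m = 2                                   # number of output units
    weights = X[:m].copy()                  # first m vectors become reference vectors
    classes = labels[:m]                    # so C1 = 1 and C2 = 2
    X_train, T_train = X[m:], labels[m:]    # remaining vectors are used for training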


Example 4.11
· Learning vector quantization (LVQ): five vectors assigned to two classes.
· The following input vectors represent two classes, 1 and 2:

  VECTOR          CLASS
  (1, 1, 0, 0)      1
  (0, 0, 0, 1)      2
  (0, 0, 1, 1)      2
  (1, 0, 0, 0)      1
  (0, 1, 1, 0)      2

· The first two vectors will be used to initialize the two reference vectors.
· Thus, the first output unit represents class 1, the second class 2 (symbolically, C1 = 1 and C2 = 2).
Example 4.11
• The vectors (0, 0, 1, 1), (1, 0, 0, 0), and (0, 1, 1, 0) are used as the training vectors.

• Only one iteration (one epoch) is shown:

• Step 0. Initialize weights:
      w1 = (1, 1, 0, 0);
      w2 = (0, 0, 0, 1).
  Initialize the learning rate: α = 0.1.


Example 4.11
Step 1. Begin computations.
Step 2. For input vector x = (0, 0, 1, 1) with T = 2, do Steps 3-4.
Step 3. J = 2, since x is closer to w2 than to w1.
Step 4. Since T = 2 and C2 = 2, update w2 as follows:
        w2 = (0, 0, 0, 1) + 0.1 [(0, 0, 1, 1) - (0, 0, 0, 1)]
           = (0, 0, 0.1, 1).
Step 2. For input vector x = (1, 0, 0, 0) with T = 1, do Steps 3-4.
Step 3. J = 1.
Step 4. Since T = 1 and C1 = 1, update w1 as follows:
        w1 = (1, 1, 0, 0) + 0.1 [(1, 0, 0, 0) - (1, 1, 0, 0)]
           = (1, 0.9, 0, 0).
Example 4.11
Step 2. For input vector x = (0, 1, 1, 0) with T = 2, do Steps 3-4.
Step 3. J = 1.
Step 4. Since T = 2 but C1 = 1, update w1 as follows:
        w1 = (1, 0.9, 0, 0) - 0.1 [(0, 1, 1, 0) - (1, 0.9, 0, 0)]
           = (1.1, 0.89, -0.1, 0).
Step 5. This completes one epoch of training.
        Reduce the learning rate.
Step 6. Test the stopping condition.
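
For completeness, the epoch worked above can be reproduced with a short NumPy check (a sketch under the same assumptions as the earlier training-loop code, not part of the original text):

    import numpy as np

    w = np.array([[1.0, 1.0, 0.0, 0.0],     # w1, represents class 1
                  [0.0, 0.0, 0.0, 1.0]])    # w2, represents class 2
    classes = [1, 2]
    alpha = 0.1

    training = [((0, 0, 1, 1), 2),
                ((1, 0, 0, 0), 1),
                ((0, 1, 1, 0), 2)]

    for x, t in training:
        x = np.array(x, dtype=float)
        J = int(np.argmin(np.linalg.norm(w - x, axis=1)))   # Step 3: closest unit
        if t == classes[J]:
            w[J] += alpha * (x - w[J])                      # Step 4: move toward x
        else:
            w[J] -= alpha * (x - w[J])                      # Step 4: move away from x
        print("updated w%d =" % (J + 1), w[J])
    # Prints w2 = (0, 0, 0.1, 1), w1 = (1, 0.9, 0, 0), then w1 = (1.1, 0.89, -0.1, 0),
    # matching the hand computation above.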
