Thuat Toán TT
…at once or sequentially over time. The basis of this combination is the network's knowledge, which is formed, accumulated and maintained during the learning, training and application of the network. Training an ANN to perform a specific application means adjusting and establishing the values of the connection weights between the neurons of the network and of the biases, together called the network's connection weight set (denoted W in this paper). In supervised learning, pairs of input-output signals are used to train the network so that the network output approaches the desired output of the system (Figure 1). The prediction error is the deviation between the desired output and the network output:
e(i, W) = y(t) − ŷ(t, W)   (1)
The network weight set obtained after training is precisely the matrix W that minimizes the estimation criterion:
E(W) = (1/P)·Σ_{i=1}^{P} l(e(i, W)) → min   (2)
where P is the number of training samples and l(.) is a positive definite norm; this study uses the squared L2 norm.
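As a minimal sketch of criteria (1)-(2), assuming a scalar network output and the squared L2 norm named above (the function name `training_criterion` is ours, not the paper's):

```python
import numpy as np

def training_criterion(y_desired, y_network):
    """E(W) = (1/P) * sum_{i=1}^{P} l(e(i, W)) with l(e) = e**2 (squared L2 norm)."""
    e = np.asarray(y_desired, dtype=float) - np.asarray(y_network, dtype=float)  # errors e(i, W), eq. (1)
    return float(np.mean(e ** 2))

# Example: two training samples with errors 0 and 1 give E = (0 + 1) / 2 = 0.5
E_value = training_criterion([1.0, 2.0], [1.0, 1.0])
```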
Figure 1. Diagram of ANN training in supervised learning: ŷ is the network output, y is the desired output of the object, P is the network input; the error ε(W) = y − ŷ drives the adjustment of the weights.

In the field of autonomous robot control, the environment parameters either cannot all be determined in advance (very many situations can arise), or are determined in advance but their values change continuously over time. In such cases the problem of online training of the network is posed, and the training speed must therefore be given top priority. Many ANN training algorithms have been developed on the basis of minimizing the Taylor expansion of the output error function (2); they can be divided into three main groups: Gradient Descent methods, Conjugate Gradient methods and Newton methods. In this paper we construct a new network training algorithm based on the Conjugate Gradient method, with the stated goal of improving the convergence rate of ANN training.

The paper consists of five parts. Part I states the problem and summarizes the research direction. Part II presents the Conjugate Gradient method and some representative studies. Part III, the main part of the paper, presents the mathematical basis of the problem under consideration and a new ANN training algorithm built on that basis. Part IV is the experimental verification: we build a 5-5-1 neural network using the logsig(n) and purelin(n) transfer functions, write Matlab 7.1 programs that train the network with the new algorithm (TT*) and with the algorithm of [1], [2], [3] (referred to as algorithm [1] for short), run the simulations, and compare and verify the results. Part V gives the conclusions.

II. THE CONJUGATE GRADIENT METHOD
This section presents the Conjugate Gradient method as given in [1], [2]; to make its content concrete, we also present a Conjugate Gradient algorithm [3] that is applied in the computations of Matlab 7.1. The error function (2) under the L2 norm is:
E_r(W) = (1/P)·Σ_{i=1}^{P} e²(i, W)   (3)

Expanding (3) in a Taylor series about the point W_n, up to the second derivative, gives the approximate error function:

E(W) = E_r(W_n) + ∇E_r^T|_{W=W_n}·(W − W_n) + (1/2!)·(W − W_n)^T·∇²E_r|_{W=W_n}·(W − W_n)   (4)
- Optimal step size: suppose the search direction chosen at iteration n is p_n, with the optimal step size α_n; the weight point at iteration n+1 is then

W_{n+1} = W_n + α_n·p_n   (5)

[1] proves that the step is optimal when

∇E(W)^T|_{W=W_{n+1}}·p_n = 0   (6)

and the optimal step size is determined in [2], [3] as

α_n = −(g_n^T·p_n) / (p_n^T·A_n·p_n)   (8)

where g_n = ∇E(W)|_{W=W_n} and A_n = ∇²E(W)|_{W=W_n}.

- Search direction p_n: the search direction at step n is chosen as [1], [2], [3]:

p_{n+1} = −g_{n+1} + β_{n+1}·p_n   (9)

Several different coupling functions β_{n+1} are in use: Fletcher-Reeves (10), Polak-Ribiere (11) and Powell-Beale restarts (12). The Fletcher-Reeves form is

β_{n+1} = (g_{n+1}^T·g_{n+1}) / (g_n^T·g_n)   (10)

The theoretical basis presented above can be cast as the following algorithm [3]:

- Step 1: choose a starting point W_0; the first search direction is the negative (−) of the gradient at W_0: p_0 = −g_0.
- Step 2: choose the optimal learning rate

α_n = −(∇E(W)^T|_{W=W_n}·p_n) / (p_n^T·∇²E(W)|_{W=W_n}·p_n) = −(g_n^T·p_n) / (p_n^T·A_n·p_n)   (13)

- Step 3: choose the next search direction

p_{n+1} = −g_{n+1} + β_{n+1}·p_n
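A runnable sketch of Steps 1-3 on a quadratic error surface E(w) = ½·wᵀAw − bᵀw (an assumed test problem, not the paper's network; the theory above is exact for quadratics, and a k-dimensional quadratic is minimized in at most k iterations):

```python
import numpy as np

def conjugate_gradient(A, b, w0, iters=20, tol=1e-12):
    """Fletcher-Reeves CG for E(w) = 0.5*w.T@A@w - b.T@w, whose gradient is g = A@w - b."""
    w = np.asarray(w0, dtype=float)
    g = A @ w - b                            # g_0
    p = -g                                   # Step 1: p_0 = -g_0
    for _ in range(iters):
        alpha = -(g @ p) / (p @ (A @ p))     # optimal step, eqs. (8)/(13)
        w = w + alpha * p                    # eq. (5)
        g_new = A @ w - b
        if g_new @ g_new < tol:              # converged; also avoids a 0/0 in beta
            break
        beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves coupling, eq. (10)
        p = -g_new + beta * p                # Step 3, eq. (9)
        g = g_new
    return w

A = np.array([[3.0, 1.0], [1.0, 2.0]])       # symmetric positive definite
b = np.array([1.0, 1.0])
w_min = conjugate_gradient(A, b, np.zeros(2))   # exact minimizer is A^{-1} b = [0.2, 0.4]
```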
The Polak-Ribiere (11) and Powell-Beale (12) coupling functions are, respectively,

β_{n+1} = (g_{n+1}^T·(g_{n+1} − g_n)) / (g_n^T·g_n)   (11)

β_{n+1} = (g_{n+1}^T·(g_{n+1} − g_n)) / (p_n^T·(g_{n+1} − g_n))   (12)

So that the material in the following sections can be presented clearly, the authors of this paper state and prove the following lemma (Lemma 1.1).

Statement: If an arbitrary multivariable function F(X) has an exact Taylor expansion, and an approximate Taylor expansion up to the second derivative, at points in the neighborhood of X_0, as follows:

- Exact form:

F_r(X) = F(X_0) + ∇F^T|_{X=X_0}·(X − X_0) + (1/2!)·(X − X_0)^T·∇²F|_{X=X_0}·(X − X_0) + R_2   (14)

- Approximate form:

F(X) = F(X_0) + ∇F^T|_{X=X_0}·(X − X_0) + (1/2!)·(X − X_0)^T·∇²F|_{X=X_0}·(X − X_0)   (15)

where

R_2 = (1/3!)·(X − X_0)^T·[(X − X_0)^T·∇³F(ξ)·(X − X_0)]

and ξ is a point lying between X and X_0,
then the extremum points of the two functions F(X) and F_r(X) do not coincide.

Proof: From (15), for F(X) to reach an extremum at a point X we must have ∇F = 0:

∇F|_{X=X_0} + ∇²F|_{X=X_0}·(X − X_0) = 0

hence

X = X_0 − (∇²F|_{X=X_0})^{-1}·∇F|_{X=X_0}   (16)

Substituting this point into the gradient of the exact form (14) gives

∇F_r|_X = ∇F|_{X=X_0} + ∇²F|_{X=X_0}·(X − X_0) + ∇R_2|_X = ∇R_2|_X ≠ 0

that is, the extremum point X of (16) for the approximate form (15) is not an extremum point of the function F_r(X) (14), which is what had to be proved.
Here:

X^T = [x_1 x_2 … x_k];  X_0^T = [x_01 x_02 … x_0k]

and ∇²F|_{X=X_0} is the k×k matrix of second partial derivatives [∂²F/(∂x_i·∂x_j)], i, j = 1, …, k, evaluated at X = X_0.

1.2 Remarks
In the Conjugate Gradient algorithm of [1], [2], [3] presented in Section II, we observe the following. The step size α_n (13) is the solution of equation (6), in which E(W) is the approximation, up to the second derivative, of the Taylor expansion of the original error function E_r(W) (3). Hence, by taking the step α_n along the direction p_n we only reach the minimum of the approximate function E(W). On the other hand, by Lemma 1.1, the extremum point of the expansion E(W) is not an extremum point of the original function E_r(W); that is, the step α_n computed as in [1], [2], [3] is not yet the optimal step by which we would reach the minimum of the original error function E_r(W) (3). In fact, the optimal step is

α_n^op = α_n + Δα_n   (18)

where Δα_n is an increment, and α_n^op = α_n + Δα_n must be determined from the original function E_r(W) or from the exact expansion of form (14) of the error value. This deviation of the step size is the factor that slows the convergence of the algorithm of [1], [2], [3]. In this paper we present an approximate method for determining Δα_n, as the basis for building a new algorithm with a higher convergence rate.
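A one-dimensional, hypothetical illustration of Lemma 1.1 and of this remark (our own example, not from the paper): for F_r(x) = x⁴ the minimizer of the second-order Taylor model (15) built at x_0 = 1 is given by (16), and it differs from the true minimizer x = 0:

```python
x0 = 1.0
g = 4 * x0 ** 3        # F_r'(x0)  for F_r(x) = x**4
h = 12 * x0 ** 2       # F_r''(x0)

# Extremum of the quadratic model, eq. (16): x = x0 - (F_r''(x0))^-1 * F_r'(x0)
x_model = x0 - g / h   # = 1 - 4/12 = 2/3

# The model's minimizer (2/3) is not the minimizer of F_r itself (0),
# so the step it implies is not yet optimal; this is the deviation Delta-alpha corrects.
```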
1.3 Determining Δα_n
Suppose that for the error function at the point W_n we already have g_n and p_n, and we need to find the extremum point W_{n+1} along the direction p_n. The process is performed in two steps:

- From W_n, determine the point W*_n by moving along p_n with the step α_n:

α_n = −(g_n^T·p_n) / (p_n^T·A_n·p_n),   W*_n = W_n + α_n·p_n

- At the point W*_n, compute g*_n and A*_n, and determine the increment Δα_n of the step from the condition

∇E(W)^T|_{W=W*_n + Δα_n·p_n}·p_n = 0   (19)

which gives

Δα_n = −(∇E(W)^T|_{W=W*_n}·p_n) / (p_n^T·∇²E(W)|_{W=W*_n}·p_n) = −(g*_n^T·p_n) / (p_n^T·A*_n·p_n)   (20)

The quantity (20) is the increment of the step along the direction p_n which, in our view, brings the search closer to the extremum point of the original function E_r(W) along p_n. Thus, the optimal step at W_n is:
α_n^op = α_n + Δα_n = −(g_n^T·p_n) / (p_n^T·A_n·p_n) − (g*_n^T·p_n) / (p_n^T·A*_n·p_n)   (21)

On this basis, the new training algorithm (TT*) is:

- Step 1: choose a starting point W_0; the first search direction is p_0 = −g_0.
- Step 2: at the point W_n, compute g_n, β_n, A_n and determine the search direction p_n = −g_n + β_n·p_{n−1}.
- Step 3: determine the step α_n = −(g_n^T·p_n)/(p_n^T·A_n·p_n) and the point W*_n = W_n + α_n·p_n.
- Step 4: at the point W*_n, compute g*_n, A*_n and determine the increment of the step, Δα_n = −(g*_n^T·p_n)/(p_n^T·A*_n·p_n).
- Step 5: minimize along p_n to determine the point W_{n+1} = W*_n + Δα_n·p_n. If the algorithm has not converged, return to Step 2.

IV. EXPERIMENTAL VERIFICATION

In this part we use the two algorithms, TT* and [1], to train a 5-5-1 ANN (Figure 3) to recognize the feature vector of an image using an autocorrelation error function (Figure 2), a problem studied in image recognition [12] that is closely related to robotics. In the experiment, the feature vector v of the image and the four vectors vclj are: v = {1,2,1,0,1}; vcl1 = {1,1,2,1,0}; vcl2 = {0,1,1,2,1}; vcl3 = {1,0,1,1,2}; vcl4 = {2,1,0,1,1}. We examine the network error function E(W) over all training samples, and the network output for each individual training sample (v, vcl1,2,3,4 − tclj), j = 0, 1, …, 4. The results show that the convergence rate (the mean number of iterations) of TT* is higher than that of [1], and that the network output under TT* is more stable than under [1] (Figures 4 to 9).
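A minimal runnable sketch of the TT* two-stage step (Steps 3-5 of Part III) on a hypothetical strongly convex test function with analytic gradient and Hessian; this is our own illustration, not the paper's Matlab program, and the descent-direction reset is a practical safeguard we add for the non-quadratic case:

```python
import numpy as np

# Hypothetical test function: E(w) = 0.5*w.T@Q@w + 0.1*sum(w**4), unique minimum at 0
Q = np.diag([1.0, 3.0])
E = lambda w: 0.5 * w @ Q @ w + 0.1 * np.sum(w ** 4)
grad = lambda w: Q @ w + 0.4 * w ** 3          # gradient g(w)
hess = lambda w: Q + np.diag(1.2 * w ** 2)     # Hessian A(w)

def tt_star(w, iters=60):
    g = grad(w)
    p = -g                                     # Step 1: p_0 = -g_0
    for _ in range(iters):
        if g @ p >= 0.0:                       # safeguard (ours): keep p a descent direction
            p = -g
        A = hess(w)
        alpha = -(g @ p) / (p @ (A @ p))       # Step 3: alpha_n, eq. (13)
        w_star = w + alpha * p                 # W*_n = W_n + alpha_n * p_n
        g_star, A_star = grad(w_star), hess(w_star)           # Step 4
        d_alpha = -(g_star @ p) / (p @ (A_star @ p))          # Delta-alpha_n, eq. (20)
        w = w_star + d_alpha * p               # Step 5: W_{n+1} = W*_n + Delta-alpha_n * p_n
        g_new = grad(w)
        if g_new @ g_new < 1e-24:              # converged
            break
        beta = (g_new @ g_new) / (g @ g)       # Fletcher-Reeves coupling beta_n
        p = -g_new + beta * p                  # Step 2 of the next pass
        g = g_new
    return w

w_final = tt_star(np.array([2.0, 1.0]))        # should approach the origin
```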
[Figure 2 diagram: the input vector v = {v_1, v_2, …, v_m} passes through a SHIFT block producing the shifted vectors; for each shift j the target t_clj is formed from the component differences (a_i − b_i), and the error e_j compares t_clj with the network output t_clj(W).]
Figure 2. Training the neural network, using an autocorrelation error function, to recognize the feature vector of an image.
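The listed vectors vclj are exactly the cyclic shifts of v by j positions, and the listed targets tclj (0, 2, 2.4495, 2.4495, 2) match the Euclidean distance between v and each shift; a sketch under that reading (our interpretation of the garbled formula in Figure 2):

```python
import numpy as np

v = np.array([1, 2, 1, 0, 1], dtype=float)

# vcl_j is v circularly shifted by j: vcl1 = {1,1,2,1,0}, ..., vcl4 = {2,1,0,1,1}
shifts = [np.roll(v, j) for j in range(5)]
targets = [float(np.sqrt(np.sum((v - s) ** 2))) for s in shifts]   # t_clj

print([round(t, 4) for t in targets])   # [0.0, 2.0, 2.4495, 2.4495, 2.0]
```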
[Figure 3 diagram: structure of the 5-5-1 network - inputs v_0, v_clk(i,j); a nonlinear hidden layer of five neurons n_1 … n_5 with weights ω_0ij and biases b_1 … b_5; a linear output layer with the purelin(n) transfer function.]
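A sketch of a forward pass through the 5-5-1 structure of Figure 3, with a logsig hidden layer and a purelin output layer; the weight values here are random placeholders, not the trained set W:

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((5, 5)), rng.standard_normal(5)   # nonlinear (hidden) layer
W2, b2 = rng.standard_normal((1, 5)), rng.standard_normal(1)   # linear output layer

def logsig(n):
    return 1.0 / (1.0 + np.exp(-n))          # logsig transfer function

def forward(v):
    a1 = logsig(W1 @ v + b1)                 # hidden-layer activations
    return float((W2 @ a1 + b2)[0])          # purelin(n) = n: identity output

y_hat = forward(np.array([1.0, 2.0, 1.0, 0.0, 1.0]))   # the feature vector v from Part IV
```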
Figure 5. Network output and the error function t_v(W) under the two algorithms. Input-output training sample P0 = {v, vcl1,2,3,4 − tcl0 = 0}.

Figure 6. Network output and the error function t_cl1(W) under the two algorithms. Input-output training sample P1 = {v, vcl1,2,3,4 − tcl1 = 2}.

Figure 7. Network output and the error function t_cl2(W) under the two algorithms. Input-output training sample P2 = {v, vcl1,2,3,4 − tcl2 = 2.4495}.

Figure 8. Network output and the error function t_v(W) under the two algorithms. Input-output training sample P3 = {v, vcl1,2,3,4 − tcl3 = 2.4495}.

Figure 9. Network output and the error function t_v(W) under the two algorithms. Input-output training sample P4 = {v, vcl1,2,3,4 − tcl4 = 2}.
V. CONCLUSIONS
In this study we have constructed a new ANN training algorithm (TT*) whose mean number of iterations is smaller than that of the Conjugate Gradient algorithm of [1], [2], [3]. This matters for online neural network training in the many identification and control applications set in dynamic environments. Mathematically, the rigor of the TT* algorithm shows in the component Δα_n and in its meaning and role within the algorithm:

Δα_n = −(∇E(W)^T|_{W=W*_n}·p_n) / (p_n^T·∇²E(W)|_{W=W*_n}·p_n) = −(g*_n^T·p_n) / (p_n^T·A*_n·p_n)
- If α_n is the optimal step: then W*_n is an extremum point of E_r(W) (Figure 2) and hence, from (6) and (7) (proved in [1], [2]), it follows that g*_n^T·p_n = 0, i.e. Δα_n = 0 and α_n^op = α_n.
- If α_n is not the optimal step: by the same reasoning, Δα_n ≠ 0, and therefore α_n^op = α_n + Δα_n is the optimal step, steering the search toward the extremum point along the direction p_n at iteration n.

In practice, when training the 1-5-1 network (A = 16×16) and the 5-5-1 network (A = 36×36), we observed that in the first iterations Δα_n ≠ 0 as a rule; the contribution of Δα_n increases the convergence rate of the algorithm. In terms of the mean number of training iterations, TT* scores considerably lower than [1]; in terms of training time, however, TT* is faster than [1] only when the network's weight matrix is not large. When the weight matrix is large, TT* is not faster than [1] even though it needs fewer iterations. This is the problem posed for our next study, aimed at extending the range of application of TT*.
REFERENCES

1. Syed Muhammad Aqil Burney, Tahseen Ahmed Jilani, Cemal Ardil. "A Comparison of First and Second Order Training Algorithms for Artificial Neural Networks." International Journal of Computational Intelligence, Vol. 1, No. 3, 2004. ISSN 1304-4508.
2. Ajith Abraham. "Meta Learning Evolutionary Artificial Neural Networks." Neurocomputing 56 (2004), pp. 1-38. www.elsevier.com/locate/neucom.
3. Martin Hagan, Howard Demuth, Mark Beale. Neural Network Design. ISBN 0-9717321-0-8. http://www.ee.Okstate.edu/mhagan/nnd.html; http://www.cubooks.colorado.edu.
4. Bogdan M. Wilamowski, M. Önder Efe. "An Algorithm for Fast Convergence in Training Neural Networks." IEEE, 2001.
5. Deniz Erdogmus, Jose C. Principe. "Generalized Information Potential Criterion for Adaptive System Training." IEEE, 2002.
6. Nikolaos Ampazis, Stavros J. Perantonis. "Two Highly Efficient Second-Order Algorithms for Training Feedforward Networks." IEEE, 2002.
9. Sergio Lima Netto, Panajotis Agathoklis. "Efficient Lattice Realizations of Adaptive IIR Algorithms." IEEE, 1998.
10. Ryan Mukai, Victor A. Vilnrotter. "Adaptive Acquisition and Tracking for Deep Space Array Feed Antennas." IEEE, Vol. 13, No. 5, September 2002.
11. A.M.S. Zalzala, A.S. Morris. Neural Networks for Robotic Control: Theory and Application. Britain, 1996.
12. Nguyễn Đức Minh. "Controlling the Scorbot Robot Using Computer Vision." Master's thesis, code KKT-K13-009, Ho Chi Minh City University of Technology, 2004.