Markov Models
Hunh Dip Tn
Phan Tn Ton
Phan Th Hng
Contents
Introduction
Markov models; Hidden Markov Models (HMMs)
The three basic problems of HMMs
Implementation, properties, variants, and further reading
Introduction
Hidden Markov Models (HMMs) are the statistical foundation of modern speech recognition systems. Variants of HMMs are widely used and are often regarded as the most successful approach.
Introduction
Speech recognition.
The HMM chapter is placed at the beginning of the book's Grammar section because it works on the order of words in a sentence, a starting point for understanding sentence syntax. In this chapter the authors present: the theoretical foundations of HMMs, their connection to applications, and a summary of some suggestions for extending HMMs and related techniques.
Markov Models
Suppose we want to predict how many books the library will hold tomorrow. We only care about the current number of books, not the counts from yesterday, last week, or last year.
In this problem we predict the future state from information about the current state alone, without needing any information about the past.
Markov Models
Definition: Let mu = (S, A, pi), and let X = (X1, X2, ..., XT) be a sequence of random variables taking values in the state space S = {s1, s2, s3, ..., sN}, satisfying:
1. P(X_{t+1} = s_k | X1, X2, ..., X_t) = P(X_{t+1} = s_k | X_t)   (the Markov property)
2. The state-transition probabilities are independent of time.
3. The transition probabilities (stored in the matrix A): a_ij = P(X_{t+1} = s_j | X_t = s_i), with a_ij >= 0 and sum_{j=1..N} a_ij = 1.
4. The initial state probabilities: pi_i = P(X1 = s_i), with sum_{i=1..N} pi_i = 1.
A process mu described as above is a Markov model.
Markov Models
P(X1, X2, ..., XT) = P(X1) . P(X2|X1) . P(X3|X1,X2) ... P(XT|X1,X2,...,XT-1)
                   = P(X1) . P(X2|X1) . P(X3|X2) ... P(XT|XT-1)
                   = pi_{X1} . prod_{t=1..T-1} a_{Xt, Xt+1}
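The factorization above can be checked with a small sketch; the three states and all probability values below are made up purely for illustration:

```python
# P(X1..XT) = pi[X1] * prod_t A[Xt][Xt+1]  (chain rule + Markov property)
pi = {"low": 0.5, "mid": 0.3, "high": 0.2}            # initial probabilities (illustrative)
A = {"low":  {"low": 0.6, "mid": 0.3, "high": 0.1},   # transition matrix (illustrative)
     "mid":  {"low": 0.2, "mid": 0.5, "high": 0.3},
     "high": {"low": 0.1, "mid": 0.3, "high": 0.6}}

def chain_prob(states):
    """Probability of a whole state sequence under the Markov model."""
    p = pi[states[0]]
    for prev, nxt in zip(states, states[1:]):
        p *= A[prev][nxt]
    return p

print(chain_prob(["low", "mid", "mid"]))  # 0.5 * 0.3 * 0.5 = 0.075
```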
The crazy soft drink machine
This gives a visible Markov model. In reality, however, the machine only has a tendency to behave this way, so we also need the probability of producing each product:
P(Ot = k | Xt = si, Xt+1 = sj) = bijk   (the probability of emitting product k on the transition from si to sj),
where k is the value observed at time t.
For the observation sequence {lem, ice_t} over two purchases, the probability is:
0.7 * 0.3 * 0.7 * 0.1 + 0.7 * 0.3 * 0.3 * 0.1 + 0.3 * 0.3 * 0.5 * 0.7 + 0.3 * 0.3 * 0.5 * 0.7 = 0.084
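The four terms above enumerate every length-2 state path. A brute-force sketch that reproduces the sum; the CP/IP transition and emission values are inferred from the terms above (they match the classic crazy-soft-drink-machine example), and the emission is taken from the source state of each transition:

```python
from itertools import product

# two hidden states: CP (cola preferring), IP (iced-tea preferring)
A = {"CP": {"CP": 0.7, "IP": 0.3}, "IP": {"CP": 0.5, "IP": 0.5}}
B = {"CP": {"cola": 0.6, "ice_t": 0.1, "lem": 0.3},
     "IP": {"cola": 0.1, "ice_t": 0.7, "lem": 0.2}}

def sequence_prob(obs, start="CP"):
    """Sum over every hidden state path of P(path) * P(obs | path)."""
    total = 0.0
    for path in product(A, repeat=len(obs)):
        p, state = 1.0, start
        for nxt, o in zip(path, obs):
            # arc emission: the product is emitted from the state being left
            p *= A[state][nxt] * B[state][o]
            state = nxt
        total += p
    return total

print(round(sequence_prob(["lem", "ice_t"]), 3))  # 0.084
```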
The structure of an HMM
The three basic problems of HMMs
Problem 1 (evaluation problem): given an observation sequence O = o1 o2 ... oT and an HMM mu, compute the probability P(O | mu) that the model generates the sequence.
The three basic problems of HMMs
Problem 2 (decoding problem): given an observation sequence O = o1 o2 ... oT and an HMM mu, find the state sequence X = (X1 X2 ... XT) that maximizes the probability of generating O (the optimal path).
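Problem 2 is usually solved with the Viterbi dynamic-programming algorithm. A minimal sketch, using the common state-emission formulation and illustrative CP/IP parameters (the start distribution is an assumption, not taken from the slides):

```python
states = ["CP", "IP"]
pi = {"CP": 1.0, "IP": 0.0}                        # assumed: always start in CP
A = {"CP": {"CP": 0.7, "IP": 0.3}, "IP": {"CP": 0.5, "IP": 0.5}}
B = {"CP": {"cola": 0.6, "ice_t": 0.1, "lem": 0.3},
     "IP": {"cola": 0.1, "ice_t": 0.7, "lem": 0.2}}

def viterbi(obs):
    """Most probable state sequence for obs (Problem 2)."""
    delta = {s: pi[s] * B[s][obs[0]] for s in states}  # best prob ending in s
    backptrs = []
    for o in obs[1:]:
        prev, delta, back = delta, {}, {}
        for s in states:
            r = max(states, key=lambda r: prev[r] * A[r][s])  # best predecessor
            delta[s] = prev[r] * A[r][s] * B[s][o]
            back[s] = r
        backptrs.append(back)
    last = max(states, key=lambda s: delta[s])
    path = [last]
    for back in reversed(backptrs):   # follow back-pointers to recover the path
        path.append(back[path[-1]])
    return path[::-1], delta[last]

path, p = viterbi(["cola", "cola"])
print(path)  # ['CP', 'CP']
```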
The three basic problems of HMMs
Problem 3 (learning problem): adjust the HMM parameters mu to maximize the probability P(O | mu) of generating O (find the model that best fits the observation sequence).
1. Evaluation problem
2. Decoding problem
3. Learning problem
1. Finding the probability of the product sequence
Given the product sequence O = (o1, ..., oT) and the model mu = (A, B, pi).
[Trellis diagram: states s1, s2, s3, ..., sN on the vertical axis; time t = 1, ..., T+1 on the horizontal axis]
1. Finding the probability of the product sequence
[Trellis diagram: states s1, s2, s3, ..., sN on the vertical axis; time t = 1, ..., T+1 on the horizontal axis]
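The forward procedure fills this trellis column by column. A minimal sketch using the common state-emission formulation; the pi, A, B values below are illustrative, chosen so that P(lem, ice_t) matches the 0.084 computed earlier:

```python
states = ["CP", "IP"]
pi = {"CP": 1.0, "IP": 0.0}                        # assumed start distribution
A = {"CP": {"CP": 0.7, "IP": 0.3}, "IP": {"CP": 0.5, "IP": 0.5}}
B = {"CP": {"cola": 0.6, "ice_t": 0.1, "lem": 0.3},
     "IP": {"cola": 0.1, "ice_t": 0.7, "lem": 0.2}}

def forward(obs):
    """alpha_t(s) = P(o1..ot, Xt = s); returns P(O | mu)."""
    alpha = {s: pi[s] * B[s][obs[0]] for s in states}
    for o in obs[1:]:
        # each new column sums over all predecessors, then emits o
        alpha = {s: sum(alpha[r] * A[r][s] for r in states) * B[s][o]
                 for s in states}
    return sum(alpha.values())

print(round(forward(["lem", "ice_t"]), 3))  # 0.084
```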
The Backward Procedure
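The backward procedure fills the same trellis from right to left: beta_t(s) = P(o_{t+1}..oT | Xt = s). A sketch with the same illustrative parameters as before (assumed, not from the slides); it yields the same P(O | mu):

```python
states = ["CP", "IP"]
pi = {"CP": 1.0, "IP": 0.0}
A = {"CP": {"CP": 0.7, "IP": 0.3}, "IP": {"CP": 0.5, "IP": 0.5}}
B = {"CP": {"cola": 0.6, "ice_t": 0.1, "lem": 0.3},
     "IP": {"cola": 0.1, "ice_t": 0.7, "lem": 0.2}}

def backward(obs):
    """beta_t(s) = P(o_{t+1}..oT | Xt = s); returns P(O | mu)."""
    beta = {s: 1.0 for s in states}                 # beta_T(s) = 1
    for o in reversed(obs[1:]):
        beta = {s: sum(A[s][r] * B[r][o] * beta[r] for r in states)
                for s in states}
    # fold in the first emission and the start distribution
    return sum(pi[s] * B[s][obs[0]] * beta[s] for s in states)

print(round(backward(["lem", "ice_t"]), 3))  # 0.084, same as the forward procedure
```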
Combining the forward and backward procedures for the model mu = (A, B, pi).
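The combination rests on the identity P(O | mu) = sum_i alpha_t(i) * beta_t(i), which holds at every time t. A sketch that checks this numerically (all parameter values are illustrative assumptions):

```python
states = ["CP", "IP"]
pi = {"CP": 1.0, "IP": 0.0}
A = {"CP": {"CP": 0.7, "IP": 0.3}, "IP": {"CP": 0.5, "IP": 0.5}}
B = {"CP": {"cola": 0.6, "ice_t": 0.1, "lem": 0.3},
     "IP": {"cola": 0.1, "ice_t": 0.7, "lem": 0.2}}
obs = ["lem", "ice_t", "cola"]

# forward table: alpha[t][s] = P(o1..ot, Xt = s)
alpha = [{s: pi[s] * B[s][obs[0]] for s in states}]
for o in obs[1:]:
    alpha.append({s: sum(alpha[-1][r] * A[r][s] for r in states) * B[s][o]
                  for s in states})

# backward table: beta[t][s] = P(o_{t+1}..oT | Xt = s)
beta = [{s: 1.0 for s in states}]
for o in reversed(obs[1:]):
    beta.insert(0, {s: sum(A[s][r] * B[r][o] * beta[0][r] for r in states)
                    for s in states})

# P(O | mu) computed at each t: all entries are the same value
probs = [sum(alpha[t][s] * beta[t][s] for s in states) for t in range(len(obs))]
print(probs)  # three (numerically) identical values
```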
3. Estimating the parameters
There is no closed-form way to choose mu that maximizes P(O | mu), but P(O | mu) can be locally maximized with an iterative hill-climbing procedure, starting from a model mu (possibly chosen at random). By examining the computed results we can see which state transitions and which product emissions are used most often, and increase their probabilities accordingly.
Start from a model mu (chosen in advance or at random).
Run the sequence O through the model, re-estimate the parameters, and repeat with each newly estimated model.
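The loop described here is the Baum-Welch (forward-backward / EM) re-estimation. A compact sketch for a two-state model; all parameter values and the observation sequence are made up for demonstration:

```python
states = ["CP", "IP"]
symbols = ["cola", "ice_t", "lem"]
obs = ["lem", "ice_t", "cola", "cola", "lem"]

pi = {"CP": 0.5, "IP": 0.5}
A = {"CP": {"CP": 0.7, "IP": 0.3}, "IP": {"CP": 0.5, "IP": 0.5}}
B = {"CP": {"cola": 0.6, "ice_t": 0.1, "lem": 0.3},
     "IP": {"cola": 0.1, "ice_t": 0.7, "lem": 0.2}}

def forward_backward(pi, A, B):
    alpha = [{s: pi[s] * B[s][obs[0]] for s in states}]
    for o in obs[1:]:
        alpha.append({s: sum(alpha[-1][r] * A[r][s] for r in states) * B[s][o]
                      for s in states})
    beta = [{s: 1.0 for s in states}]
    for o in reversed(obs[1:]):
        beta.insert(0, {s: sum(A[s][r] * B[r][o] * beta[0][r] for r in states)
                        for s in states})
    return alpha, beta, sum(alpha[-1][s] for s in states)

def reestimate(pi, A, B):
    """One Baum-Welch update: raise the probability of the transitions and
    emissions that the forward-backward pass says were used most."""
    alpha, beta, P = forward_backward(pi, A, B)
    T = len(obs)
    gamma = [{s: alpha[t][s] * beta[t][s] / P for s in states} for t in range(T)]
    xi = [{(i, j): alpha[t][i] * A[i][j] * B[j][obs[t + 1]] * beta[t + 1][j] / P
           for i in states for j in states} for t in range(T - 1)]
    new_pi = {s: gamma[0][s] for s in states}
    new_A = {i: {j: sum(x[i, j] for x in xi) / sum(g[i] for g in gamma[:-1])
                 for j in states} for i in states}
    new_B = {s: {k: sum(g[s] for g, o in zip(gamma, obs) if o == k) /
                    sum(g[s] for g in gamma) for k in symbols} for s in states}
    return new_pi, new_A, new_B, P

likelihoods = []
for _ in range(5):
    pi, A, B, P = reestimate(pi, A, B)
    likelihoods.append(P)
# EM guarantees P(O | mu) never decreases across iterations
```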
HMMs
1. Implementation of HMMs
2. Initialization of parameter values
3. Variants of HMMs
4. Applications of HMMs
Implementation of HMMs
Multiplying long chains of probabilities quickly causes floating-point underflow.
Forward-Backward algorithm:
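A standard remedy is to run the forward recursion in log space (or with per-step scaling). A sketch, with illustrative parameters; the plain product over thousands of observations rounds to 0.0, while the log-space version stays finite:

```python
import math

states = ["CP", "IP"]
pi = {"CP": 0.5, "IP": 0.5}
A = {"CP": {"CP": 0.7, "IP": 0.3}, "IP": {"CP": 0.5, "IP": 0.5}}
B = {"CP": {"cola": 0.6, "ice_t": 0.1, "lem": 0.3},
     "IP": {"cola": 0.1, "ice_t": 0.7, "lem": 0.2}}

def logsumexp(xs):
    """log(sum(exp(x))) computed stably by factoring out the maximum."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def log_forward(obs):
    """Forward algorithm on log probabilities: sums of logs never underflow."""
    la = {s: math.log(pi[s] * B[s][obs[0]]) for s in states}
    for o in obs[1:]:
        la = {s: logsumexp([la[r] + math.log(A[r][s]) for r in states])
              + math.log(B[s][o]) for s in states}
    return logsumexp(list(la.values()))

lp = log_forward(["cola", "ice_t", "lem"] * 1000)   # 3000 observations
print(math.isfinite(lp))  # True; math.exp(lp) would underflow to exactly 0.0
```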
How should an HMM be set up when no data are available? Assume the entries of A are uniform; assign some values of pi to 0. The values of B are the most important, so good initialization schemes should be used for B (avoid random initialization).
Variants of HMMs
Applications of HMMs
In biology: gene analysis in DNA sequences.
In language processing: part-of-speech tagging of text.
http://www.cs.ubc.ca/~murphyk/Software/HMM/hmm.html
http://ghmm.org/
http://www.mathworks.com/products/bioinfo/