Certainty Factor and Naive Bayes
KNOWLEDGE-BASED INTELLIGENCE
INFERENCE UNDER UNCERTAINTY
CERTAINTY FACTOR
Uncertainty: Introduction
Examples:
- we can understand a sentence even when it is incomplete
- a doctor gives the correct medical treatment for an ambiguous symptom
So, how to define the term “Uncertainty”?
Uncertainty Handling Methods
E.g., suppose we know the rule: if x is a bird then x flies.
Abductive reasoning runs this rule backwards: observing that x flies, it hypothesizes that x is a bird, a plausible but uncertain conclusion.
Evaluation Criteria for Uncertainty Handling Methods
- Expressive power
- Logical correctness
- Computational efficiency of inference
Schemes Used by Expert Systems in Handling Uncertainty
UNCERTAIN EVIDENCE
Representing uncertain evidence:
CF(E) = CF(It will probably rain today) = 0.6
Similar to P(E); however, CFs are not probabilities but measures of confidence: the degree to which we believe that a fact is true.
The evidence is written as an exact statement, with a CF value attached.
Example: It will rain today, CF 0.6
Certainty Theory
Certainty factors measure the confidence that is placed on a conclusion based on the evidence known so far. They combine belief and disbelief into a single number: a certainty factor is the difference between the measure of belief and the measure of disbelief in hypothesis h given evidence e:
CF = MB[h:e] - MD[h:e]
The CF thus reflects the net level of belief in a hypothesis given the available information:
CF = MB - MD, where -1 ≤ CF ≤ 1
Range of CF values
Example:
IF there are dark clouds (E)
THEN it will rain (H), CF = 0.8
Example
From the previous rule, with CF(E) = 0.5, the certainty factor propagation (C.F.P.) is:
CF(H, E) = 0.5 × 0.8 = 0.4
In words: it may rain.
Certainty Factor Propagation (C.F.P.)
With CF(E) = -0.5 instead, the C.F.P. is:
CF(H, E) = -0.5 × 0.8 = -0.4
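The single-rule propagation above can be sketched in Python; `propagate` is a hypothetical helper name, and the CF values come from the rain example.

```python
def propagate(cf_evidence: float, cf_rule: float) -> float:
    """Propagate the CF of the evidence through one rule:
    CF(H, E) = CF(E) * CF(rule)."""
    return cf_evidence * cf_rule

# "IF there are dark clouds THEN it will rain", CF(rule) = 0.8
print(propagate(0.5, 0.8))   # belief in the evidence (0.5)  ->  0.4
print(propagate(-0.5, 0.8))  # disbelief in the evidence     -> -0.4
```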
Conjunctive Rule
IF E1 AND E2 AND ... THEN H, CF(RULE)
CF(H, E1 AND E2 AND ...) = min{CF(Ei)} × CF(RULE)
Example:
IF the sky is dark
AND the wind is getting stronger
THEN it will rain, CF = 0.8
Assume:
CF(the sky is dark) = 1.0
CF(the wind is getting stronger) = 0.7
CF(It will rain) = min{1.0, 0.7} × 0.8 = 0.56
In words: it probably will rain.
Certainty Factor Propagation (C.F.P.)
Disjunctive rule (OR premises combine with max). Example:
IF the sky is dark
OR the wind is getting stronger
THEN it will rain, CF = 0.8
CF(It will rain) = max{1.0, 0.7} × 0.8 = 0.8
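A minimal sketch of the two combination rules (min for AND premises, max for OR premises); function names are illustrative, values taken from the examples above.

```python
def cf_and(cf_premises, cf_rule):
    """Conjunctive rule: CF(H) = min{CF(Ei)} * CF(rule)."""
    return min(cf_premises) * cf_rule

def cf_or(cf_premises, cf_rule):
    """Disjunctive rule: CF(H) = max{CF(Ei)} * CF(rule)."""
    return max(cf_premises) * cf_rule

# sky is dark: 1.0, wind getting stronger: 0.7, rule CF: 0.8
print(cf_and([1.0, 0.7], 0.8))  # ~0.56: it probably will rain
print(cf_or([1.0, 0.7], 0.8))   # ~0.8
```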
Rule 1
IF the weatherman says it is going to rain (E1)
THEN it is going to rain (H)
CF(Rule 1) = 0.8
Rule 2
IF the farmer says it is going to rain (E2)
THEN it is going to rain (H)
CF(Rule 2) = 0.8
Certainty Factor Propagation (C.F.P.)
Example
Consider rain prediction with Rules 1 and 2, and explore several cases.
Case 4: several sources predict rain at the same level of belief, but one source predicts no rain.
If many sources predict rain at the same level, CF(Rain) = 0.8, the combined CF value converges towards 1:
CFcombined(CF1, CF2, ...) ≈ 0.999 = CFold
where CFold collects the information from the old sources. If a new source says the opposite, CFnew = -0.8, then (for CFs of opposite sign):
CFcombined(CFold, CFnew) = (CFold + CFnew) / (1 - min{|CFold|, |CFnew|})
= (0.999 - 0.8) / (1 - 0.8)
= 0.995
This shows that a single piece of disconfirming evidence does not have a major impact.
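The full MYCIN-style combination function covers three cases (both CFs positive, both negative, opposite signs); this sketch reproduces the 0.995 result above. The function name is illustrative.

```python
def cf_combine(cf1: float, cf2: float) -> float:
    """Combine two CFs for the same hypothesis (MYCIN-style)."""
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    # opposite signs
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

cf_old, cf_new = 0.999, -0.8
print(cf_combine(cf_old, cf_new))  # ~0.995: one disconfirming source barely moves it
```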
Certainty Factor Propagation (C.F.P.)
Example:
IF (E1 AND E2)
OR (E3 AND E4)
THEN H
(AND premises combine with min, OR alternatives with max.)
Given the premise CF, how do you combine it with the CF for the rule?
CF(H, Rule) = CF(E) × CF(Rule)
So, CF(mammal, R1) = CF(hasHair) × CF(R1) = 0.8 × 0.9 = 0.72
CF(mammal, R2) = CF(forwardEyes AND sharpTeeth) × CF(R2) = 0.3 × 0.7 = 0.21
Certainty factors
Given different rules with the same conclusion, how do you combine the CFs?
CF(H, Rule1 & Rule2) = CF(H, Rule1) + CF(H, Rule2) × (1 - CF(H, Rule1))
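Applied to the mammal example above (CF(H, R1) = 0.72, CF(H, R2) = 0.21), the incremental combination for two positive CFs can be sketched as:

```python
def combine_positive(cf1: float, cf2: float) -> float:
    """CF(H, Rule1 & Rule2) = CF1 + CF2 * (1 - CF1), valid for CF1, CF2 >= 0."""
    return cf1 + cf2 * (1 - cf1)

print(combine_positive(0.72, 0.21))  # ~0.7788
```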
EXERCISE
Rule 1
IF the weather looks lousy (E1)
OR I am in a lousy mood (E2)
THEN I shouldn’t go to the ball game (H1), CF = 0.9
Rule 2
IF I believe it is going to rain (E3)
THEN the weather looks lousy (E1), CF = 0.8
Rule 3
IF I believe it is going to rain (E3)
AND the weatherman says it is going to rain (E4)
THEN I am in a lousy mood (E2), CF = 0.9
Rule 4
IF the weatherman says it is going to rain (E4)
THEN the weather looks lousy (E1), CF = 0.7
Rule 5
IF the weather looks lousy (E1)
THEN I am in a lousy mood (E2), CF = 0.95
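One possible worked solution for the exercise, assuming (since the slides do not give them) that both base facts are fully believed: CF(E3) = CF(E4) = 1.0. All helper names are illustrative.

```python
def cf_combine_pos(cf1, cf2):
    """Combine two positive CFs for the same conclusion."""
    return cf1 + cf2 * (1 - cf1)

cf_e3 = 1.0  # assumed: I believe it is going to rain
cf_e4 = 1.0  # assumed: the weatherman says it is going to rain

# E1 "the weather looks lousy" is concluded by Rule 2 and Rule 4
cf_e1 = cf_combine_pos(cf_e3 * 0.8, cf_e4 * 0.7)           # 0.8 and 0.7 -> 0.94

# E2 "I am in a lousy mood" is concluded by Rule 3 (AND -> min) and Rule 5
cf_e2 = cf_combine_pos(min(cf_e3, cf_e4) * 0.9, cf_e1 * 0.95)

# H1 "I shouldn't go to the ball game" via Rule 1 (OR -> max)
cf_h1 = max(cf_e1, cf_e2) * 0.9
print(round(cf_h1, 3))  # ~0.89
```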
BAYES REASONING
PROBABILITY THEORY
Examples:
- tossing a coin: S = {H, T}
- rolling a die: S = {1, 2, 3, 4, 5, 6}
– Objective probability
Classical probability
- assumption: the outcomes of an experiment are all equally likely
Experimental probability
- based on observed relative frequencies
b. P(S) = 1
P (rolling a 1 or 2 or 3 or 4 or 5 or 6) = 1
e. P(A’) = 1-P(A)
P (not rolling a 5) = 1 – 1/6 = 5/6
PROPERTIES OF PROBABILITY (3)
P(weather = sunny | exercise = yes) = ?
CONDITIONAL PROBABILITY
P(B ∩ A) = P(B|A) · P(A)
P(A ∩ B) = P(B|A) · P(A)
Bayes' Rule:
P(A|B) = P(B|A) · P(A) / P(B)
With the denominator expanded:
P(A|B) = P(B|A) · P(A) / [P(B|A) · P(A) + P(B|¬A) · P(¬A)]
BAYESIAN REASONING
Given the rule:
meningitis → stiff neck [prob = 0.8]
and:
prob. that a person has meningitis = 0.0001
prob. that a person has a stiff neck = 0.1
Determine the probability that a person has meningitis given a stiff neck.
Let M = meningitis, S = stiff neck; then:
P(M|S) = P(S|M) × P(M) / P(S)
= 0.8 × 0.0001 / 0.1 = 0.0008
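The meningitis computation as a short sketch of Bayes' rule; variable names are illustrative.

```python
p_s_given_m = 0.8   # P(stiff neck | meningitis), from the rule
p_m = 0.0001        # prior P(meningitis)
p_s = 0.1           # P(stiff neck)

p_m_given_s = p_s_given_m * p_m / p_s  # Bayes' rule
print(p_m_given_s)  # ~0.0008
```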
BAYESIAN REASONING
For multiple hypotheses H1, ..., Hm and a single piece of evidence E:
P(Hi|E) = P(E|Hi) · P(Hi) / Σ_{k=1..m} P(E|Hk) · P(Hk)
For multiple pieces of evidence E1, E2, ..., En:
P(Hi|E1 E2 ... En) = P(E1 E2 ... En|Hi) · P(Hi) / Σ_{k=1..m} P(E1 E2 ... En|Hk) · P(Hk)
If the pieces of evidence are conditionally independent given each hypothesis (the naive Bayes assumption), the denominator factorizes:
Σ_{k=1..m} P(E1|Hk) · P(E2|Hk) · ... · P(En|Hk) · P(Hk)
EXAMPLE
Hypothesis probabilities:

Probability   i=1    i=2    i=3
P(Hi)         0.40   0.35   0.25
P(E1|Hi)      0.3    0.8    0.5
P(E2|Hi)      0.9    0.0    0.7
P(E3|Hi)      0.6    0.7    0.9
P(H1|E1 E3) = P(E1|H1) · P(E3|H1) · P(H1) / Σ_{k=1..3} P(E1|Hk) · P(E3|Hk) · P(Hk)
= (0.3 · 0.6 · 0.40) / (0.3 · 0.6 · 0.40 + 0.8 · 0.7 · 0.35 + 0.5 · 0.9 · 0.25)
= 0.072 / 0.3805 ≈ 0.19
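The multi-hypothesis computation can be sketched with the table values; this computes the posterior for H1 given E1 and E3 under the conditional-independence assumption.

```python
priors = [0.40, 0.35, 0.25]   # P(Hi)
p_e1 = [0.3, 0.8, 0.5]        # P(E1|Hi)
p_e3 = [0.6, 0.7, 0.9]        # P(E3|Hi)

# numerator terms P(E1|Hk) * P(E3|Hk) * P(Hk) for each hypothesis
terms = [a * b * p for a, b, p in zip(p_e1, p_e3, priors)]
posterior_h1 = terms[0] / sum(terms)
print(round(posterior_h1, 2))  # 0.19
```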
Main disadvantage: Bayesian reasoning requires many prior and conditional probabilities, which are often difficult to obtain in practice, and the conditional-independence assumption rarely holds exactly.