www.zemris.fer.hr/~bojana
bojana.dalbelo@fer.hr
Modeling uncertainty
V1.1
Possible causes:
Data is unavailable or missing;
Data exists, but it's unclear or unreliable;
Data representation is imprecise;
Default values may have exceptions;
MODELING UNCERTAINTY IN
KNOWLEDGE-BASED SYSTEMS
How to represent vague data
How to combine vague data
How to infer conclusions from vague data
3. Certainty factors
4. Dempster-Shafer theory
Monty Hall problem
The rules of the game: after you choose one door, that door remains
closed. The show presenter, Monty Hall, who knows what is hidden
behind each of the doors, opens one of the two remaining doors.
X – certain event
∅ – impossible event
~xi – it is not the case that xi (opposite event)
Definition
Probability is a function p : P(X) → [0, 1], i.e. 0 ≤ p(x) ≤ 1 for every x ⊆ X, such that:
(i) p(X) = 1;
(ii) if x1, x2, …, xk are mutually exclusive, then p(∪i xi) = Σi p(xi).
Definition
Events x1 and x2 are independent if p(x1 ∧ x2) = p(x1) p(x2).
BAYES SCHEME
Conditional probability
Let x and y be events, i.e., x, y ⊂ X.
Let us assume that x occurred
What is the probability of y if we know that x has occurred ?
We call this conditional probability - p(y|x).
Definition
Conditional probability is given by p(y|x) = p(x ∧ y) / p(x). (2)
[Venn diagram: the universal set X of elementary events, with overlapping events x and y; their intersection is y ∧ x]
p(y|x) = p(x|y) p(y) / p(x)
Rule denominator:
p(x) = p(x ∧ X) = p(x ∧ (y ∨ ~y)) = p((x ∧ y) ∨ (x ∧ ~y)) =
according to (ii) from the definition, (x ∧ y) ∩ (x ∧ ~y) = ∅ ⇒
p(x) = p(x ∧ y) + p(x ∧ ~y)
p(y|x) = p(x|y) p(y) / (p(x|y) p(y) + p(x|~y) p(~y)) (8)
BAYES SCHEME IN EXPERT SYSTEMS
IF hypothesis H is true,
THEN conclusion/evidence E is true, with probability p
Example:
IF the patient has flu (H)
THEN the patient has rhinitis (E) with probability 0.75
p(E|H) =0.75
VICE VERSA: if E is true (the patient has rhinitis), what can
we infer about H (the patient has flu)?
p(H|E) = ?
From H → E and E, we conclude H. Which inference rule is this?
Is it a sound rule?
p(H|E) = p(E|H) p(H) / p(E) (9)

p(H|E) = p(E|H) p(H) / (p(E|H) p(H) + p(E|~H) p(~H)) (10)
Example:
Does Ivan have flu (hypothesis), if he has rhinitis (fact)?
p(flu|rhinitis) = p(rhinitis|flu) p(flu) / (p(rhinitis|flu) p(flu) + p(rhinitis|~flu) p(~flu))
Then, with p(rhinitis|flu) = 0.75, p(flu) = 0.2 and p(rhinitis|~flu) = 0.2:
p(E) = p(Ivan has rhinitis) = (0.75)(0.2) + (0.2)(0.8) = 0.31
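The arithmetic above implies p(flu) = 0.2 and p(rhinitis|~flu) = 0.2. A minimal sketch of formula (10) with those numbers (the function and variable names are mine, not from the text):

```python
# Bayes' rule, formula (10), applied to the flu/rhinitis example.
# p(E|H) = 0.75 is given in the text; the prior p(H) = 0.2 and
# p(E|~H) = 0.2 are read off the slide's arithmetic (0.75*0.2 + 0.2*0.8 = 0.31).

def bayes(p_e_given_h, p_h, p_e_given_not_h):
    """Posterior p(H|E) = p(E|H)p(H) / (p(E|H)p(H) + p(E|~H)p(~H))."""
    numerator = p_e_given_h * p_h
    p_e = numerator + p_e_given_not_h * (1 - p_h)   # total probability p(E)
    return numerator / p_e

p_flu_given_rhinitis = bayes(0.75, 0.2, 0.2)
print(round(p_flu_given_rhinitis, 3))  # 0.484, i.e. 0.15 / 0.31
```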
[Diagram: the universal set X partitioned into mutually exclusive events H1, …, H7 that together cover X]
(Hi that satisfy this are called a complete system of events)
Bayes analysis
Notation:
Ai, i=1, 2, 3 – the car is behind door i
Mij, i, j=1, 2, 3 – presenter Monty has chosen door j after
the participant has chosen door i
The car can be behind any of the doors, thus each door has the
same prior probability of hiding the car.
In our case, prior means “before the show starts” or
“before we see a goat”.
Hence, prior probability of event Ai equals:
P(Ai) = 1 / 3, i = 1, 2, 3
The presenter will choose between the two doors that the
participant has not chosen. Moreover, the presenter will
always choose the door behind which there is no car.
P(A1|M13) = P(M13|A1) P(A1) / P(M13)
P(M13|A1) P(A1) = 1/2 × 1/3 = 1/6
The normalization constant in the denominator can be
expressed as:
P( M 13 )=P( M 13 ,A 1 )+P( M 13 ,A 2 ) +P( M 13 ,A 3 )
P( M 13 )=P( M 13∣ A1 ) P( A1 )+
P( M 13∣ A2 ) P( A 2 )+
P( M 13∣ A3 ) P( A 3 )
P(M13) = 1/2 × 1/3 + 1 × 1/3 + 0 × 1/3 = 1/2.
Hence:
P(A1|M13) = (1/6) / (1/2) = 1/3

P(A3|M13) = P(M13|A3) P(A3) / P(M13) = (0 × 1/3) / (1/2) = 0
It therefore follows:
P(A2|M13) = 1 − 1/3 − 0 = 2/3.
This demonstrates that the winning strategy is to always switch to the other door.
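The analysis can be cross-checked with a short Monte Carlo simulation; this sketch (the helper name and parameters are mine) makes Monty pick at random whenever he has two goat doors, matching P(M13|A1) = 1/2 in the derivation:

```python
# Monte Carlo check of the Monty Hall analysis: the participant who
# switches wins about 2/3 of the time, the one who stays about 1/3.
import random

def play(switch, trials=100_000, seed=0):
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)            # door hiding the car
        pick = rng.randrange(3)           # participant's first choice
        # Monty opens a goat door the participant did not pick; if he
        # has two options (pick == car) he chooses at random.
        opened = rng.choice([d for d in range(3) if d != pick and d != car])
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(f"stay:   {play(switch=False):.3f}")   # close to 1/3
print(f"switch: {play(switch=True):.3f}")    # close to 2/3
```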
BAYES SCHEME IN EXPERT SYSTEMS
Important remark:
Example:
Two symptoms A and B can each be indicative of an
illness with probability p. However, if both symptoms are
present simultaneously, they may reinforce each other (or
oppose each other).
p(H1|E1) = (0.3 · 0.6) / (0.3 · 0.6 + 0.8 · 0.3 + 0.3 · 0.1) = 0.40

p(H2|E1) = (0.8 · 0.3) / (0.3 · 0.6 + 0.8 · 0.3 + 0.3 · 0.1) = 0.53

p(H3|E1) = (0.3 · 0.1) / (0.3 · 0.6 + 0.8 · 0.3 + 0.3 · 0.1) = 0.07

If it now also turns out that the patient has a cough, then
(n = 2 and formula (11)):

p(H1|E1E2) = (0.3 · 0.6 · 0.6) / (0.3 · 0.6 · 0.6 + 0.8 · 0.9 · 0.3 + 0.3 · 0.0 · 0.1) = 0.33

p(H2|E1E2) = (0.8 · 0.9 · 0.3) / (0.3 · 0.6 · 0.6 + 0.8 · 0.9 · 0.3 + 0.3 · 0.0 · 0.1) = 0.67

p(H3|E1E2) = (0.3 · 0.0 · 0.1) / (0.3 · 0.6 · 0.6 + 0.8 · 0.9 · 0.3 + 0.3 · 0.0 · 0.1) = 0.00
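Under the conditional-independence assumption of formula (11), the posteriors above can be reproduced with a few lines of code (the dictionary layout and names are mine; the numbers are the worked example's priors and likelihoods):

```python
# Posteriors p(Hi | E1..En) ∝ p(Hi) * Π_j p(Ej|Hi), normalized over Hi,
# assuming the evidences are conditionally independent given each Hi.
import math

priors = {"H1": 0.6, "H2": 0.3, "H3": 0.1}
likelihoods = {          # (p(E1|Hi), p(E2|Hi))
    "H1": (0.3, 0.6),
    "H2": (0.8, 0.9),
    "H3": (0.3, 0.0),
}

def posteriors(evidence_count):
    scores = {h: priors[h] * math.prod(likelihoods[h][:evidence_count])
              for h in priors}
    total = sum(scores.values())
    return {h: s / total for h, s in scores.items()}

print(posteriors(1))  # H1 ≈ 0.40, H2 ≈ 0.53, H3 ≈ 0.07
print(posteriors(2))  # H1 ≈ 0.33, H2 ≈ 0.67, H3 = 0.00
```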
Advantages
Very good theoretical foundations
Most elaborated among all models for dealing with
uncertainty
Disadvantages
A large amount of data is needed to determine all the
probabilities
Example
We have a system with 50 hypotheses and 300
evidences. How many probabilities must be defined?
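The text leaves the count as an exercise. One common counting convention (an assumption here, not stated in the text) is that every hypothesis–evidence pair needs p(Ej|Hi) and p(Ej|~Hi), plus one prior per hypothesis:

```python
# Rough probability count for a Bayes-based expert system.
# Counting convention (an assumption): each (H, E) pair needs
# p(E|H) and p(E|~H), and each hypothesis needs a prior p(H).
hypotheses = 50
evidences = 300
count = 2 * hypotheses * evidences + hypotheses
print(count)  # 30050
```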
An example
A car mechanic describes in his own words how he infers
why the car won't start. This is a passage of the description
pertaining to the age of the accumulator battery as one
potential cause of the problem.
FUZZY LOGIC and FUZZY SETS
[Plot: membership grade (0 to 1) versus battery age in years (0–5), with sharp boundaries between "new", "less new", and "at the end of its lifetime"]
Is it natural to define expressions such as "old" and "new"
with such sharp boundaries?
Does it mean that a 2-year accumulator is new today, but
tomorrow it isn't new anymore?
Fuzzy logic is a mathematical model for the representation
of, and inference with, human knowledge, which is expressed
in words, in a vague and imprecise manner.
Why is classical logic not enough?
[Diagram: classical logic allows only "P is true" or "P is false"; intermediate cases remain unanswered]
Two truth values are inadequate for modelling
the inference based on human knowledge.
This knowledge is often incomplete, vague,
expressed in natural language, and a matter
of degree.
Vagueness - Fuzziness
Example
Linguistic variable: AGE
x = AGE
x – the name of the linguistic variable: AGE (in years)
Tx – the values of the linguistic variable: VERY YOUNG, YOUNG, OLD
Mx – the meaning of the linguistic values, i.e. the fuzzy sets
[Plot: membership functions of VERY YOUNG, YOUNG and OLD over the ages 5, 20, 25, 30, 35, …, 50, 55, 60, 65, …]
Fuzzy logic = computing with words (L. Zadeh)
A = “Cold”
Con(A) = “Very cold”
[Plot: membership grades of A = "Cold" and Con(A) = "Very cold" over U = 1, 3, …, 21]
LINGUISTIC MODIFIERS
Example
Dil(A) = “More or less cold”
[Plot: membership grades of A = "Cold" and Dil(A) = "More or less cold" over U = 1, 3, …, 21]
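The two modifiers correspond to Zadeh's standard membership transforms: concentration squares the grade, dilation takes its square root. A sketch with a hypothetical decreasing μ_cold (the ramp shape is my assumption, not the lecture's exact function):

```python
# Linguistic modifiers as membership-function transforms:
# concentration Con(A) = μ_A², dilation Dil(A) = √μ_A (Zadeh).
# μ_cold is hypothetical: fully cold at u <= 1, not cold at all at u >= 21.

def mu_cold(u):
    return max(0.0, min(1.0, (21 - u) / 20))

def con(mu):          # "very cold": sharpens the set
    return lambda u: mu(u) ** 2

def dil(mu):          # "more or less cold": broadens the set
    return lambda u: mu(u) ** 0.5

very_cold = con(mu_cold)
more_or_less_cold = dil(mu_cold)

for u in (1, 5, 11, 17):
    print(u, round(mu_cold(u), 2), round(very_cold(u), 2),
          round(more_or_less_cold(u), 2))
```

For every u, Con(A)(u) ≤ μ_A(u) ≤ Dil(A)(u), which is why "very cold" is a tighter set and "more or less cold" a looser one.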
The relation between FUZZY SETS and FUZZY LOGIC
Example
Assume John's height is 185 cm.
Someone has stated “John is tall”
What is the truth value of this proposition?
Assume the set of tall people is defined using µTALL(x)
[Plot: µTALL – the membership function defining the concept "TALL PEOPLE"; µTALL(185) = 0.5]
Then:
The truth value of the proposition "John is tall" equals the
degree of membership with which the element 185 belongs to
the fuzzy set of tall people:
µTALL(185) = 0.5
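The same computation in code, with a hypothetical piecewise-linear μTALL; the ramp endpoints 170 cm and 200 cm are my choice, fixed only so that μTALL(185) = 0.5 as on the slide:

```python
# Truth value of "John is tall" as a membership grade.
# μ_TALL is hypothetical: a linear ramp from 170 cm (grade 0.0)
# to 200 cm (grade 1.0), clamped outside that range.

def mu_tall(height_cm):
    return max(0.0, min(1.0, (height_cm - 170) / 30))

truth_value = mu_tall(185)   # truth of "John is tall" for a 185 cm John
print(truth_value)           # 0.5
```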
OPERATIONS on fuzzy sets
[Plots over the universe 0–30 illustrating operations on two fuzzy sets; only the axis ticks survived extraction]
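The figures themselves are not recoverable, but the standard Zadeh operations (max for union, min for intersection, 1 − μ for complement) can be sketched on the same discrete universe; the two sets A and B below are hypothetical, not the ones from the lost plots:

```python
# Standard (Zadeh) operations on fuzzy sets, with membership grades
# stored per element of the discrete universe 0, 5, ..., 30.

universe = [0, 5, 10, 15, 20, 25, 30]
A = {0: 1.0, 5: 0.8, 10: 0.6, 15: 0.4, 20: 0.2, 25: 0.0, 30: 0.0}
B = {0: 0.0, 5: 0.2, 10: 0.4, 15: 0.6, 20: 0.8, 25: 1.0, 30: 1.0}

union        = {u: max(A[u], B[u]) for u in universe}   # μ_A∪B = max(μ_A, μ_B)
intersection = {u: min(A[u], B[u]) for u in universe}   # μ_A∩B = min(μ_A, μ_B)
complement_A = {u: 1.0 - A[u] for u in universe}        # μ_~A  = 1 − μ_A

print(intersection[10], union[10], complement_A[10])  # 0.4 0.6 0.4
```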
Generalized modus ponens:
Premise: x is A1
Implication: If x is A then y is B
Conclusion: y is B1
REASONING IN FUZZY LOGIC
Example
The fuzzy relation "approximately equal" defined on the
universal sets V, W = {1, 2, 3}:
R={ ((1, 1), 1), ((1, 2), .8), ((1, 3), .3), ((2, 1), .8), ((2, 2), 1),
((2, 3), .8), ((3, 1), .3), ((3, 2), .8), ((3, 3), 1) }
In matrix form:
1 2 3
1 1 0.8 0.3
2 0.8 1 0.8
3 0.3 0.8 1
[3-D bar plot: the relation R as a surface over V × W]
Example
The implication "If Ivan is tall then Ivan has a large weight" is
defined by a fuzzy relation on HEIGHT × WEIGHT as follows:
given the set of "tall" people and the set of "heavy-weight"
people, the fuzzy relation of "tall and heavy-weight" people is,
according to the above definition, given by:
The relation (columns: weight in kg; second column: height in cm; leftmost column: membership grade in "tall people"):

          60   65   70   75   80   85   90   95   100
0    170:  0    0    0    0    0    0    0    0    0
0.3  175:  0   0.1  0.2  0.3  0.3  0.3  0.3  0.3  0.3
0.5  180:  0   0.1  0.2  0.3  0.5  0.5  0.5  0.5  0.5
0.8  185:  0   0.1  0.2  0.3  0.5  0.7  0.8  0.8  0.8
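The table's entries are consistent with the common min-based definition R(h, w) = min(μtall(h), μheavy(w)). The sketch below reconstructs it under that assumption; the μheavy values at 95 and 100 kg (0.9 and 1.0) are guesses, since the 185 cm row caps them at 0.8:

```python
# HEIGHT × WEIGHT relation, assuming R(h, w) = min(μ_tall(h), μ_heavy(w)).
# μ_tall comes from the table's left column; μ_heavy is read off the rows
# (the last two values are assumptions — the table only shows min(0.8, ·)).

heights = [170, 175, 180, 185]
weights = [60, 65, 70, 75, 80, 85, 90, 95, 100]
mu_tall  = {170: 0.0, 175: 0.3, 180: 0.5, 185: 0.8}
mu_heavy = dict(zip(weights, [0.0, 0.1, 0.2, 0.3, 0.5, 0.7, 0.8, 0.9, 1.0]))

R = {(h, w): min(mu_tall[h], mu_heavy[w]) for h in heights for w in weights}

print(R[(185, 85)])  # 0.7, matching the table
print(R[(175, 90)])  # 0.3, matching the table
```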
In summary:
Linguistic expressions ("small", "approx. equal")
→ Semantics: assigning meaning (fuzzy sets and relations)
→ Inference using generalized modus ponens (B = A ∘ R)
→ Linguistic approximation ("more or less small")
The first and last steps are subjective and context-dependent.
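The middle two steps can be sketched end-to-end with max-min composition, using the "approximately equal" relation R from the earlier example; the fuzzy set A = "small" on V = {1, 2, 3} is hypothetical:

```python
# Generalized modus ponens by max-min composition, B = A ∘ R:
# μ_B(w) = max over v of min(μ_A(v), μ_R(v, w)).

V = W = [1, 2, 3]
R = {(1, 1): 1.0, (1, 2): 0.8, (1, 3): 0.3,   # "approximately equal"
     (2, 1): 0.8, (2, 2): 1.0, (2, 3): 0.8,
     (3, 1): 0.3, (3, 2): 0.8, (3, 3): 1.0}
A = {1: 1.0, 2: 0.6, 3: 0.2}                  # hypothetical "small"

B = {w: max(min(A[v], R[(v, w)]) for v in V) for w in W}
print(B)  # {1: 1.0, 2: 0.8, 3: 0.6}
```

B is a broader set than A, i.e. a fuzzy set one could read back linguistically as "more or less small".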