
Module 4

Reasoning with uncertainty


• Traditional logic is monotonic
• The set of legal conclusions grows monotonically with the set of facts
appearing in our initial database
• When humans reason, we use defeasible logic
• Almost every conclusion we draw is subject to reversal
• If we find contradicting information later, we’ll want to retract earlier
inferences
• Nonmonotonic logic, or defeasible reasoning, allows a statement to be
retracted
• Solution: Truth Maintenance
• Keep explicit information about which facts/inferences support other
inferences
• If the foundation disappears, so must the conclusion
BMEE407L Module <4> 3
Will it rain tomorrow, given that it is cloudy today?
• Unknown or unsure things.
• Randomness is all around us.
• This is an unfair world.
• Chances are not equal.



• On the other hand, the problem might not be in the fact that
T/F values can change over time but rather that we are not certain
of the T/F value
• Agents almost never have access to the whole truth about their environment
• Agents must act in the presence of uncertainty
• Some information ascertained from facts
• Some information inferred from facts and knowledge about environment
• Some information based on assumptions made from experience



• Let action At = leave for airport t minutes before flight
• Will At get me there on time?
• Problems:
• Partial observability (road state, other drivers' plans, etc.)
• Noisy sensors (traffic reports)
• Uncertainty in action outcomes (flat tire, etc.)
• Complexity of modeling and predicting traffic
• Hence a purely logical approach either
• Risks falsehood: “A25 will get me there on time,” or
• Leads to conclusions that are too weak for decision making:
• A25 will get me there on time if there's no accident on the bridge and it doesn't rain and my tires remain intact, etc., etc.
• A1440 might reasonably be said to get me there on time but I'd have to stay overnight in the airport

Various types of reasoning strategies are used to handle uncertainty:
• Probabilistic Models
• Bayesian Networks
• Monte Carlo Methods
• Decision Theory
• Fuzzy Logic
• Quantitative Reasoning



Example of uncertain reasoning:
Toothache => Cavity (wrong)
Toothache => Cavity ∨ Gum disease ∨ etc.
Cavity => Toothache (wrong)

Probabilistic assertions summarize effects of:
• Laziness: failure to enumerate exceptions, qualifications, etc.
• Ignorance: lack of explicit theories, relevant facts, initial conditions, etc.
The agent’s knowledge can only provide a degree of belief and the tool for dealing with
degrees of belief is Probability Theory



• Probability theory is the mathematical
framework that allows us to analyse chance
events in a logically sound manner.



• Assign numbers to
the outcomes

• Heads = 1 Tails = 0

• Random variable.



The expected value is calculated by multiplying each of the
possible outcomes by the likelihood each outcome will
occur and then summing all of those values.

Essentially, the EV is the long-term average value of the variable.
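As a sketch, the expected-value calculation above can be written directly in Python; the coin coding (Heads = 1, Tails = 0, fair coin) is from the earlier slide, and the fair die is an extra illustration:

```python
# Expected value: multiply each outcome by its probability, then sum.

def expected_value(outcomes):
    """outcomes: list of (value, probability) pairs."""
    return sum(value * prob for value, prob in outcomes)

coin = [(1, 0.5), (0, 0.5)]              # Heads = 1, Tails = 0, fair coin
die = [(v, 1 / 6) for v in range(1, 7)]  # fair six-sided die

print(expected_value(coin))  # 0.5
print(expected_value(die))   # long-run average of a die roll, ~3.5
```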
• Suppose the agent believes the following:
P(A25 gets me there on time) = 0.04
P(A90 gets me there on time) = 0.70
P(A120 gets me there on time) = 0.95
P(A1440 gets me there on time) = 0.9999
Probability is the measure of the likelihood that the event will occur.
• Which action should the agent choose?
• Depends on preferences for missing flight vs. time spent waiting
• Encapsulated by a utility function
• The agent should choose the action that maximizes the expected utility:
P(At succeeds) * U(At succeeds) + P(At fails) * U(At fails)
• Utility theory is used to represent and reason with preferences
• Decision theory = probability theory + utility theory
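The maximize-expected-utility rule can be sketched as follows. The success probabilities are the slide's; the utility numbers (a large penalty for missing the flight, a small cost per minute of waiting) are made-up assumptions for illustration:

```python
# Choose the action maximizing
#   EU(A_t) = P(A_t succeeds) * U(success) + P(A_t fails) * U(failure)
# Success probabilities from the slide; utilities below are hypothetical.

P_success = {"A25": 0.04, "A90": 0.70, "A120": 0.95, "A1440": 0.9999}

# Assumed extra minutes spent waiting at the airport for each action,
# and an assumed large negative utility for missing the flight.
wait_cost = {"A25": 0, "A90": 65, "A120": 95, "A1440": 1415}
U_MISS = -1000

def expected_utility(action):
    p = P_success[action]
    u_success = -wait_cost[action]   # on time, but utility lost to waiting
    return p * u_success + (1 - p) * U_MISS

best = max(P_success, key=expected_utility)
print(best)  # A120: a good balance of reliability and waiting time
```

With these assumed utilities, leaving 120 minutes early wins; a different penalty for missing the flight would shift the answer.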
• P(event) is the probability in the absence of any additional information
• Probability depends on evidence.
• Before looking at dice: P(4) = 1/6
• After looking at dice: P(4) = 0 or 1, depending on what we see
• All probability statements must indicate the evidence with respect to which
the probability is being assessed.
• As new evidence is collected, probability calculations are updated.
• Before specific evidence is obtained, we refer to the prior or unconditional
probability of the event with respect to the evidence. After the evidence is
obtained, we refer to the posterior or conditional probability.
• We describe the (uncertain) state of the world using random variables
  • Denoted by capital letters
  • R: Is it raining?
  • W: What’s the weather?
  • D: What is the outcome of rolling two dice?
  • S: What is the speed of my car (in MPH)?
• Random variables take on values in a domain
  • Domain values must be mutually exclusive and exhaustive
  • R in {True, False}
  • W in {Sunny, Cloudy, Rainy, Snow}
  • D in {(1,1), (1,2), …, (6,6)}
  • S in [0, 200]

• Probabilistic statements are defined over events, or sets of world states
  • “It is raining”
  • “The weather is either cloudy or snowy”
  • “The sum of the two dice rolls is 11”
  • “My car is going between 30 and 50 miles per hour”
• Events are described using propositions:
  • R = True
  • W = “Cloudy” ∨ W = “Snowy”
  • D ∈ {(5,6), (6,5)}
  • 30 ≤ S ≤ 50
• Notation: P(A) is the probability of the set of world states in which proposition A holds
• P(X = x), or P(x) for short, is the probability that random variable X has taken on the value x



• Atomic event: a complete specification of the state of the world about which
the agent is uncertain
• E.g., if the world consists of only two Boolean variables Cavity and Toothache, then there
are 4 distinct atomic events:
Cavity = false ∧ Toothache = false
Cavity = false ∧ Toothache = true
Cavity = true ∧ Toothache = false
Cavity = true ∧ Toothache = true

• Atomic events are mutually exclusive and exhaustive



• For any propositions (events) A, B
  • 0 ≤ P(A) ≤ 1
  • P(True) = 1 and P(False) = 0
  • P(A ∨ B) = P(A) + P(B) − P(A ∧ B)



• Because events are rarely isolated from other events, we may
want to define a joint probability distribution, or P(X1, X2, …, Xn).
• Each P(Xi) is a vector of probabilities over the values of the variable Xi.
• The joint probability distribution is an n-dimensional array of
combinations of probabilities.

Wet ~Wet
Rain 0.6 0.1
~Rain 0.2 0.1



Probability distribution gives values for all possible assignments:
P(Weather) = <0.72,0.1,0.08,0.1> (normalized, i.e., sums to 1)

• Joint probability distribution for a set of random variables gives the


probability of every atomic event on those random variables
P(Weather, Cavity) = a 4 × 2 matrix of values:

                 sunny   rainy   cloudy   snow
Cavity = true    0.144   0.02    0.016    0.02
Cavity = false   0.576   0.08    0.064    0.08

• Every question about a domain can be answered by the joint distribution



• A joint distribution is an assignment of
probabilities to every possible atomic event

Atomic event                            P
Cavity = false ∧ Toothache = false      0.8
Cavity = false ∧ Toothache = true       0.1
Cavity = true  ∧ Toothache = false      0.05
Cavity = true  ∧ Toothache = true       0.05



• Suppose we have the joint distribution P(X,Y) and we
want to find the marginal distribution P(Y)

P(Cavity, Toothache)
Cavity = false ∧ Toothache = false      0.8
Cavity = false ∧ Toothache = true       0.1
Cavity = true  ∧ Toothache = false      0.05
Cavity = true  ∧ Toothache = true       0.05

P(Cavity)              P(Toothache)
Cavity = false  ?      Toothache = false  ?
Cavity = true   ?      Toothache = true   ?
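A minimal sketch of this marginalization in Python, using the joint table above: to get P(Y), sum P(x, Y) over all values x of the other variable.

```python
# Marginalize the joint P(Cavity, Toothache) down to P(Cavity) and
# P(Toothache) by summing out the other variable.

joint = {
    (False, False): 0.8,    # Cavity = false, Toothache = false
    (False, True):  0.1,
    (True,  False): 0.05,
    (True,  True):  0.05,
}

def marginal(joint, axis):
    """Keep the variable at position `axis`; sum out the rest."""
    out = {}
    for assignment, p in joint.items():
        out[assignment[axis]] = out.get(assignment[axis], 0.0) + p
    return out

print(marginal(joint, 0))  # P(Cavity):    false 0.9,  true 0.1
print(marginal(joint, 1))  # P(Toothache): false 0.85, true 0.15
```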



• Start with the joint probability distribution (the dentist example, with
variables Cavity, Toothache, Catch):

            toothache             ¬toothache
            catch    ¬catch       catch    ¬catch
 cavity     0.108    0.012        0.072    0.008
 ¬cavity    0.016    0.064        0.144    0.576

• For any proposition φ, sum the atomic events where it is true:
P(φ) = Σ_{ω : ω ⊨ φ} P(ω)
• P(toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.2
• P(toothache ∨ cavity) = 0.108 + 0.012 + 0.016 + 0.064 + 0.072 + 0.008 = 0.28
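The enumeration rule can be sketched in Python. The full joint below is the standard three-variable dentist distribution (Cavity, Toothache, Catch); the entries not quoted in the sums above are an assumption consistent with the distribution summing to 1:

```python
# Inference by enumeration: P(phi) = sum of P(w) over worlds w where
# phi holds, using the full joint over (Cavity, Toothache, Catch).

joint = {
    # (cavity, toothache, catch): probability
    (True,  True,  True):  0.108,
    (True,  True,  False): 0.012,
    (True,  False, True):  0.072,
    (True,  False, False): 0.008,
    (False, True,  True):  0.016,
    (False, True,  False): 0.064,
    (False, False, True):  0.144,
    (False, False, False): 0.576,
}

def P(phi):
    """Probability of proposition phi (a predicate over worlds)."""
    return sum(p for w, p in joint.items() if phi(w))

p_toothache = P(lambda w: w[1])                # 0.108+0.012+0.016+0.064
p_tooth_or_cavity = P(lambda w: w[1] or w[0])  # adds 0.072+0.008
print(p_toothache, p_tooth_or_cavity)          # ~0.2 and ~0.28
```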
Prior or unconditional probabilities of propositions
e.g., P(Cavity = true) = 0.1 and P(Weather = sunny) = 0.72 correspond to belief
prior to arrival of any (new) evidence

Conditional or posterior probabilities
e.g., P(cavity | toothache) = 0.8



• Probability of cavity given toothache:
P(Cavity = true | Toothache = true)

• For any two events A and B,
P(A | B) = P(A ∧ B) / P(B) = P(A, B) / P(B)



P(Cavity, Toothache)
Cavity = false ∧ Toothache = false      0.8
Cavity = false ∧ Toothache = true       0.1
Cavity = true  ∧ Toothache = false      0.05
Cavity = true  ∧ Toothache = true       0.05

P(Cavity)                P(Toothache)
Cavity = false  0.9      Toothache = false  0.85
Cavity = true   0.1      Toothache = true   0.15

• What is P(Cavity = true | Toothache = false)?
  0.05 / 0.85 ≈ 0.059
• What is P(Cavity = false | Toothache = true)?
  0.1 / 0.15 ≈ 0.667



• Start with the joint probability distribution:

• Can also compute conditional probabilities, e.g. the probability of no
cavity given a toothache:

P(¬cavity | toothache) = P(¬cavity ∧ toothache) / P(toothache)
                       = (0.016 + 0.064) / (0.108 + 0.012 + 0.016 + 0.064)
                       = 0.08 / 0.2 = 0.4



• You’re a contestant on a game show. You see three closed
doors, and behind one of them is a prize. You choose one door,
and the host opens one of the other doors and reveals that there
is no prize behind it. Then he offers you a chance to switch to
the remaining door. Should you take it?



• With probability 1/3, you picked the correct door,
and with probability 2/3, you picked the wrong door. If
you picked the correct door and then you switch,
you lose. If you picked the wrong door and then
you switch, you win the prize.
• So you should switch: switching wins with probability 2/3, staying only 1/3.
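A quick Monte Carlo simulation supports this argument. The host's choice of which losing door to open is made deterministically here, which does not change the win probabilities for switching or staying:

```python
# Monte Carlo check of the Monty Hall argument: switching wins ~2/3
# of the time, staying ~1/3.
import random

def play(switch, rng):
    prize = rng.randrange(3)
    pick = rng.randrange(3)
    # Host opens a door that is neither the pick nor the prize.
    opened = next(d for d in range(3) if d != pick and d != prize)
    if switch:
        # Switch to the one remaining closed door.
        pick = next(d for d in range(3) if d != pick and d != opened)
    return pick == prize

n = 100_000
rng = random.Random(0)
wins_switch = sum(play(True, rng) for _ in range(n)) / n
rng = random.Random(0)
wins_stay = sum(play(False, rng) for _ in range(n)) / n
print(wins_switch, wins_stay)  # roughly 0.667 vs 0.333
```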



• The product rule gives us two ways to factor a joint distribution:
P(A, B) = P(A | B) P(B) = P(B | A) P(A)
• Therefore, P(A | B) = P(B | A) P(A) / P(B)
(Bayes’ rule, named after Rev. Thomas Bayes, 1702–1761)
• Why is this useful?
• Can get diagnostic probability P(cavity | toothache) from causal
probability P(toothache | cavity)
• Can update our beliefs based on evidence
• Important tool for probabilistic inference
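A minimal sketch of this use of Bayes' rule, plugging in numbers from the P(Cavity, Toothache) table shown earlier: P(cavity) = 0.1, P(toothache) = 0.15, and the causal probability P(toothache | cavity) = 0.05 / 0.1 = 0.5.

```python
# Bayes' rule: diagnostic probability P(cause | effect) from the
# causal probability P(effect | cause).

def bayes(p_effect_given_cause, p_cause, p_effect):
    return p_effect_given_cause * p_cause / p_effect

# P(cavity | toothache) = P(toothache | cavity) P(cavity) / P(toothache)
p_cavity_given_toothache = bayes(0.5, 0.1, 0.15)
print(p_cavity_given_toothache)  # ~0.333, matching 0.05 / 0.15 directly
```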
Question 1: Three individuals A, B, and C have applied for a job at a private firm. Their chances of selection are in the
ratio 1:2:4. The probabilities that A, B, and C can bring about changes to boost the company's profits are 0.8, 0.5, and 0.3,
respectively. If the desired change doesn't occur, find the probability that it is due to the appointment of C.
Solution: Let's denote the following events:
E1: A gets selected, E2: B gets selected and E3: C gets selected
A: Changes were introduced, but no profit occurred
Now, we can calculate the following probabilities:
P(E1) = 1/(1+2+4) = 1/7, P(E2) = 2/7 and P(E3) = 4/7
P(A|E1) = P(Profit didn't occur due to changes introduced by A)
= 1 – P(Profit occurred due to changes introduced by A) = 1 – 0.8 = 0.2
P(A|E2) = P(Profit didn't occur due to changes introduced by B)
= 1 – P(Profit occurred due to changes introduced by B) = 1 – 0.5 = 0.5
Similarly, P(A|E3) = 1- 0.3 =0.7
We need P(E3 | A), the probability that the failure is due to the selection of C. By Bayes’ theorem:
P(E3 | A) = P(A | E3) P(E3) / [P(A | E1) P(E1) + P(A | E2) P(E2) + P(A | E3) P(E3)]
          = (0.7 × 4/7) / (0.2 × 1/7 + 0.5 × 2/7 + 0.7 × 4/7)
          = 0.4 / (4/7) = 0.7
Thus, the required probability is 0.7.
13-03-2024 BMEE407L 34
Question 2:
In a certain neighborhood, 90% of children fell ill due to the flu and 10% due to measles, with no other diseases reported.
The probability of observing rashes for measles is 0.95, and for the flu is 0.08. If a child develops rashes, find the
probability of the child having the flu.
Solution: Let's denote the following events:
F: Children with the flu, M: Children with measles, R: Children showing the symptoms of a rash
We can calculate the following probabilities:
P(F) = 90% = 0.9
P(M) = 10% = 0.1
P(R|F) = 0.08
P(R|M) = 0.95

By Bayes’ theorem:
P(F | R) = P(R | F) P(F) / [P(R | F) P(F) + P(R | M) P(M)]
         = (0.08 × 0.9) / (0.08 × 0.9 + 0.95 × 0.1)
         = 0.072 / 0.167 ≈ 0.43
Therefore, P(F | R) ≈ 0.43.


• Question 3:
• Marie is getting married tomorrow, at an outdoor ceremony in the desert. In recent years, it
has rained only 5 days each year (5/365 = 0.014). Unfortunately, the weatherman has
predicted rain for tomorrow. When it actually rains, the weatherman correctly forecasts
rain 90% of the time. When it doesn't rain, he incorrectly forecasts rain 10% of the time.
What is the probability that it will rain on Marie's wedding?
P(Rain | Predict) = P(Predict | Rain) P(Rain) / P(Predict)
                  = P(Predict | Rain) P(Rain) / [P(Predict | Rain) P(Rain) + P(Predict | ¬Rain) P(¬Rain)]
                  = (0.9 × 5/365) / (0.9 × 5/365 + 0.1 × 360/365)
                  = 4.5 / 40.5 ≈ 0.111



Question 4:
1% of women at age forty who participate in routine screening have cancer. 80%
of women with cancer will get positive mammographies. 9.6% of women without
cancer will also get positive mammographies. A woman in this age group had a
positive mammography in a routine screening. What is the probability that she
actually has cancer?

P(Cancer | Positive) = P(Positive | Cancer) P(Cancer) / P(Positive)
                     = P(Positive | Cancer) P(Cancer) / [P(Positive | Cancer) P(Cancer) + P(Positive | ¬Cancer) P(¬Cancer)]
                     = (0.8 × 0.01) / (0.8 × 0.01 + 0.096 × 0.99)
                     ≈ 0.0776
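The arithmetic of Questions 3 and 4 can be checked with one generic two-hypothesis Bayes helper. Question 3 uses the exact prior 5/365 rather than the rounded 0.014, which is where the quoted 0.111 comes from:

```python
# Generic Bayes' rule for a binary hypothesis H with evidence E:
#   P(H|E) = P(E|H) P(H) / (P(E|H) P(H) + P(E|~H) P(~H))

def posterior(p_e_given_h, p_h, p_e_given_not_h):
    num = p_e_given_h * p_h
    return num / (num + p_e_given_not_h * (1 - p_h))

rain = posterior(0.9, 5 / 365, 0.1)     # Question 3: rain given forecast
cancer = posterior(0.8, 0.01, 0.096)    # Question 4: cancer given positive
print(round(rain, 3), round(cancer, 4))  # 0.111 0.0776
```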



• Suppose Norman and Martin each toss separate coins.
• Let A represent the variable "Norman's toss outcome“
• B represent the variable "Martin's toss outcome".
• Both A and B have two possible values (Heads and Tails).
• We can assume that A and B are independent.
• Evidence about B will not change our belief in A.



• Now suppose both Martin and Norman toss the same coin.
• Let A represent the variable "Norman's toss outcome“
• B represent the variable "Martin's toss outcome".
• Assume also that there is a possibility that the coin is biased towards heads but we do not
know this for certain.
• In this case A and B are not independent. For example, observing that B is Heads causes
us to increase our belief in A being Heads (in other words P(a|b)>P(a) in the case when
a=Heads and b=Heads).
• Here the variables A and B are both dependent on a separate variable C, "the coin is
biased towards Heads" (which has the values True or False).
• Although A and B are not independent, it turns out that once we know for certain the
value of C then any evidence about B cannot change our belief about A. Specifically:
P(A|B,C)=P(A|C)
• In such case we say that A and B are conditionally independent given C.
• Two events A and B are independent if and only if
P(A ∧ B) = P(A) P(B)
• In other words, P(A | B) = P(A) and P(B | A) = P(B)
• This is an important simplifying assumption for modeling,
e.g., Toothache and Weather can be assumed to be
independent
• Are two mutually exclusive events independent?
• No: if both have nonzero probability, P(A ∧ B) = 0 ≠ P(A) P(B)

• Conditional independence: A and B are conditionally
independent given C iff P(A ∧ B | C) = P(A | C) P(B | C)
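A numeric sketch of the biased-coin story from the previous slide. The specific numbers (a 50% chance the coin is biased, with P(Heads) = 0.9 when biased, 0.5 when fair) are assumptions for illustration:

```python
# A = Norman's toss, B = Martin's toss, C = "coin is biased".
# A and B are dependent, but conditionally independent given C.
from itertools import product

P_BIAS = 0.5
P_HEADS = {True: 0.9, False: 0.5}   # P(Heads | C)

def p(a, b, c):
    """Joint probability of tosses a, b and bias c."""
    pc = P_BIAS if c else 1 - P_BIAS
    ph = P_HEADS[c]
    pa = ph if a == "H" else 1 - ph
    pb = ph if b == "H" else 1 - ph
    return pc * pa * pb             # tosses independent given C

def P(pred):
    return sum(p(a, b, c)
               for a, b, c in product("HT", "HT", [True, False])
               if pred(a, b, c))

p_a = P(lambda a, b, c: a == "H")
p_a_given_b = (P(lambda a, b, c: a == "H" and b == "H")
               / P(lambda a, b, c: b == "H"))
print(p_a, p_a_given_b)             # P(a|b) > P(a): A and B dependent

# But once C is known, B tells us nothing more about A:
p_a_given_bc = (P(lambda a, b, c: a == "H" and b == "H" and c)
                / P(lambda a, b, c: b == "H" and c))
p_a_given_c = P(lambda a, b, c: a == "H" and c) / P(lambda a, b, c: c)
print(p_a_given_bc, p_a_given_c)    # equal: P(A|B,C) = P(A|C)
```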



A Generative Learning Algorithm

• Learns from a training set (e.g., emails received over 3 months)
• Builds a probability model for each class (spam and non-spam)
• To classify a new email, it matches the email against the spam model and the
non-spam model and outputs the best match
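A minimal naive Bayes sketch of the classifier described above. The tiny training set is made up, and Laplace smoothing is an added standard detail; word independence given the class is the "naive" assumption:

```python
# Naive Bayes spam classifier: one word-count model per class, then
# pick the class with the higher log P(class) + sum log P(word | class).
import math
from collections import Counter

train = [
    ("win money now", "spam"),
    ("free money offer", "spam"),
    ("meeting schedule today", "ham"),
    ("project meeting notes", "ham"),
]

counts = {"spam": Counter(), "ham": Counter()}
class_totals = Counter()
for text, label in train:
    counts[label].update(text.split())
    class_totals[label] += 1

vocab = {w for c in counts.values() for w in c}

def log_score(text, label):
    # log P(class) plus Laplace-smoothed log P(word | class) per word
    score = math.log(class_totals[label] / sum(class_totals.values()))
    total = sum(counts[label].values())
    for w in text.split():
        score += math.log((counts[label][w] + 1) / (total + len(vocab)))
    return score

def classify(text):
    return max(counts, key=lambda label: log_score(text, label))

print(classify("free money"))     # spam
print(classify("meeting today"))  # ham
```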



• To automatically classify cancers as benign or malignant
• Algorithm to accept/reject a loan application
• The application of naive Bayes model averaging to predict
Alzheimer’s disease from data
• Classifying websites as interesting/not interesting based on past
browsing data



Application of Bayes’ Rule to predict a drug

