
Vidya Vikas Education Trust’s

Universal College of Engineering, Vasai(E)

AI
CH:05
Reasoning Under Uncertainty
• Handling Uncertain Knowledge,
• Random Variables,
• Prior and Posterior Probability,
• Inference using Full Joint Distribution
Uncertainty:

Till now, we have learned knowledge representation using first-order logic and
propositional logic with certainty, which means we were sure about the predicates.

With this knowledge representation, we might write A→B, which means if A is true
then B is true. But consider a situation where we are not sure whether A is true or
not; then we cannot express this statement. This situation is called uncertainty.

So to represent uncertain knowledge, where we are not sure about the predicates, we
need uncertain reasoning or probabilistic reasoning.

Uncertainty is the lack of correct or exact information needed to derive a correct conclusion.


• Uncertainty arises when we are not 100 percent sure about the
outcome of the decisions.
• This mostly happens in those cases where the conditions are
neither completely true nor completely false.
REASONS FOR UNCERTAINTY
• Partially observable environment
• Dynamic environment
• Incomplete knowledge of the agent
• Inaccessible areas in the environment
METHODS TO HANDLE UNCERTAINTY
• Fuzzy Logic
• Probabilistic Reasoning
• Hidden Markov Models
• Neural Networks
Causes of uncertainty:

Following are some leading causes of uncertainty in the real world.

1. Information obtained from unreliable sources.
2. Experimental errors.
3. Equipment faults.
4. Temperature variations.
5. Climate change.
Handling Uncertain Knowledge
• Probabilistic reasoning is a way of knowledge representation where we apply the
concept of probability to indicate the uncertainty in knowledge.
• In probabilistic reasoning, we combine probability theory with logic to handle the
uncertainty.
• In the real world, there are many scenarios where the certainty of something is
  not confirmed, such as
  – "It will rain today,"
  – the behavior of someone in a given situation,
  – the outcome of a match between two teams or two players.
• These are probable sentences: we can assume they will happen, but we cannot be
  sure of it, so here we use probabilistic reasoning.
Need of probabilistic reasoning in AI:

● When there are unpredictable outcomes.
● When specifications or possibilities of predicates become too large to handle.
● When an unknown error occurs during an experiment.

In probabilistic reasoning, there are two ways to solve problems with uncertain
knowledge:

● Bayes' rule
● Bayesian Statistics
Probability: Probability can be defined as the chance that an uncertain event will
occur. It is a numerical measure of the likelihood that an event will occur. The
value of a probability always lies between 0 and 1, the two extremes of
uncertainty:

1. 0 ≤ P(A) ≤ 1, where P(A) is the probability of an event A.
2. P(A) = 0 indicates total uncertainty in event A (A will certainly not occur).
3. P(A) = 1 indicates total certainty in event A (A will certainly occur).

● P(¬A) = probability of event A not happening.
● P(¬A) + P(A) = 1.

Event: Each possible outcome of a variable is called an event.

Sample space: The collection of all possible events is called sample space.

Random variables: Random variables are used to represent the events and objects in the real world.

Prior probability: The prior probability of an event is the probability computed before observing new
information.

Posterior Probability: The probability that is calculated after all evidence or information has been taken
into account. It is a combination of the prior probability and the new information.
Conditional Probability:

Conditional probability: P(A|B) = P(A ∩ B) / P(B)

where P(A ∩ B) = joint probability of A and B,

P(B) = marginal probability of B, with P(B) > 0
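
As a quick illustration (not part of the original slides), the following minimal
Python sketch applies this formula; the function name and example numbers are
invented for illustration:

    def conditional_probability(p_a_and_b, p_b):
        """Return P(A|B) = P(A and B) / P(B), requiring P(B) > 0."""
        if p_b <= 0:
            raise ValueError("P(B) must be positive")
        return p_a_and_b / p_b

    # Example: P(A and B) = 0.30 and P(B) = 0.80 give P(A|B) = 0.375.
    print(conditional_probability(0.30, 0.80))  # 0.375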
Example 1
In a class, 80% of the students like English and
30% of the students like both English and
Mathematics. What percentage of the students
who like English also like Mathematics?
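A worked solution (added here for completeness), using the conditional
probability formula above:

P(Mathematics | English) = P(English ∩ Mathematics) / P(English)
= 0.30 / 0.80 = 0.375

So about 37.5% of the students who like English also like Mathematics.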
Example 2
The table below shows the occurrence of
diabetes in 100 people. Let D and N be the
events that a randomly selected person "has
diabetes" and "is not overweight", respectively.
Then find P(D | N).
                     Diabetes (D)    No Diabetes (¬D)
Not overweight (N)        5                45
Overweight (¬N)          17                33
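A worked solution (added here for completeness): of the 100 people, 50 are not
overweight (5 + 45), and 5 of those 50 have diabetes, so

P(D | N) = P(D ∩ N) / P(N) = (5/100) / (50/100) = 5/50 = 0.1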
INFERENCE USING FULL JOINT DISTRIBUTIONS

• Probabilistic inference means computing posterior probabilities
  for query propositions from observed evidence. The knowledge base
  used to answer such queries is represented as a full joint
  distribution.
• The full joint probability distribution specifies the
  probability of each complete assignment of values to the
  random variables (see the dentist-example table reproduced
  after this list).
• Marginalization: the marginal probability of a variable is
  obtained by adding the entries in the corresponding rows or
  columns of the full joint distribution.
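The computations below refer to a table that does not appear on the slide. It is
the full joint distribution of the standard dentist example (variables Toothache,
Catch, Cavity) from Russell and Norvig's Artificial Intelligence: A Modern
Approach, reconstructed here from the figures used:

                toothache             ¬toothache
             catch    ¬catch       catch    ¬catch
cavity       0.108    0.012        0.072    0.008
¬cavity      0.016    0.064        0.144    0.576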
For example, P(cavity) = 0.108 + 0.012 + 0.072 + 0.008 = 0.2.
There are six atomic events for (cavity ∨ toothache):
P(cavity ∨ toothache) = 0.108 + 0.012 + 0.072 + 0.008 + 0.016 + 0.064 = 0.28.
A variant of marginalization is called conditioning.
Computing a conditional probability:
P(cavity | toothache) = P(cavity ∧ toothache) / P(toothache)
= (0.108 + 0.012) / (0.108 + 0.012 + 0.016 + 0.064)
= 0.12 / 0.2 = 0.6
Similarly, P(¬cavity | toothache) =
(0.016 + 0.064) / 0.2
= 0.4
In both cases, the factor
1/P(toothache) = 1/0.2 = 5 remains constant, no
matter which value of cavity we calculate.
It is a normalization constant (α) ensuring that the
distribution P(Cavity | toothache) adds up to 1.
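
To make the procedure concrete, here is a minimal Python sketch (not part of the
original slides) of inference with a full joint distribution, reproducing the
cavity/toothache numbers above; the variable and function names are invented for
this example:

    # Full joint distribution over (cavity, toothache, catch); entries sum to 1.
    joint = {
        (True,  True,  True):  0.108, (True,  True,  False): 0.012,
        (True,  False, True):  0.072, (True,  False, False): 0.008,
        (False, True,  True):  0.016, (False, True,  False): 0.064,
        (False, False, True):  0.144, (False, False, False): 0.576,
    }

    def prob(predicate):
        """Marginalize: sum the joint entries for which the predicate holds."""
        return sum(p for event, p in joint.items() if predicate(event))

    # Marginalization: P(cavity) = 0.108 + 0.012 + 0.072 + 0.008 = 0.2
    p_cavity = prob(lambda e: e[0])

    # Conditioning: P(cavity | toothache) = P(cavity ∧ toothache) / P(toothache)
    p_toothache = prob(lambda e: e[1])
    p_cavity_given_toothache = prob(lambda e: e[0] and e[1]) / p_toothache

    print(round(p_cavity, 3))                  # 0.2
    print(round(p_cavity_given_toothache, 3))  # 0.6

The normalization constant α from the slide corresponds to 1 / p_toothache in
this sketch.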
Thank you !
