AI Unit 5
∀Y party(Y) → not(study_hard(Y))
good_student(david)
• We now turn this set of propositions into a justification network.
o Each belief is associated with two other sets of beliefs.
P(success) = the number of successes / the number of possible outcomes

P(failure) = the number of failures / the number of possible outcomes

• If there are s ways in which success can occur and f ways in which failure can occur, then

P(success) = p = s / (s + f)

P(failure) = q = f / (s + f), and p + q = 1
• If we throw a coin, the probability of getting a head will
be equal to the probability of getting a tail. In a single
throw, s = f = 1, and therefore the probability of getting
a head (or a tail) is 0.5.
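The classical definition above can be sketched in a few lines of code. This is a minimal illustration, not part of the original notes; the function name and the simulation are assumptions added for clarity.

```python
import random

def prob_success(s, f):
    """Classical probability: s ways to succeed out of s + f equally likely outcomes."""
    return s / (s + f)

# A fair coin: one way to get a head, one way to get a tail.
p = prob_success(1, 1)   # probability of a head
q = 1 - p                # probability of a tail
print(p, q)              # 0.5 0.5, and p + q = 1

# Illustrative sanity check by simulation.
trials = 100_000
heads = sum(random.random() < 0.5 for _ in range(trials))
print(round(heads / trials, 1))
```

With s = f = 1 the function returns 0.5, matching the coin example.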
• The conditional probability of event A occurring given that event B has occurred equals

p(A|B) = the number of times A and B can occur / the number of times B can occur

• Hence

p(A|B) = p(A∩B) / p(B)
• Similarly, the conditional probability of event B occurring given that event A has occurred equals

p(B|A) = p(B∩A) / p(A)
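Conditional probability from counts can be illustrated directly. The deck-of-cards numbers below are a hypothetical example added for illustration, not taken from the notes.

```python
def conditional(n_a_and_b, n_b):
    """p(A|B): the number of times A and B occur together,
    divided by the number of times B occurs."""
    return n_a_and_b / n_b

# Hypothetical example with a 52-card deck:
# A = "the card is a king", B = "the card is a face card".
# There are 4 kings among the 12 face cards.
print(round(conditional(4, 12), 3))  # p(king | face card) = 1/3
```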
• Hence

p(B∩A) = p(B|A) × p(A)

• And, since p(A∩B) = p(B∩A),

p(A∩B) = p(B|A) × p(A)
• Substituting the last equation into the equation

p(A|B) = p(A∩B) / p(B)

• yields the Bayesian rule:

Bayesian Rule:

p(A|B) = p(B|A) × p(A) / p(B)
• where:
o p(A|B) is the conditional probability that event A
occurs given that event B has occurred;
o p(B|A) is the conditional probability of event B
occurring given that event A has occurred;
o p(A) is the probability of event A occurring;
o p(B) is the probability of event B occurring.
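The Bayesian rule above is one line of arithmetic. The sketch below uses hypothetical probabilities chosen only for illustration.

```python
def bayes(p_b_given_a, p_a, p_b):
    """Bayesian rule: p(A|B) = p(B|A) * p(A) / p(B)."""
    return p_b_given_a * p_a / p_b

# Hypothetical values: p(B|A) = 0.9, p(A) = 0.1, p(B) = 0.2.
print(round(bayes(0.9, 0.1, 0.2), 2))  # 0.45
```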
• If events B1, B2, ..., Bn are mutually exclusive and exhaustive, then

Σ (i=1..n) p(A∩Bi) = Σ (i=1..n) p(A|Bi) × p(Bi)

[Figure: event A intersecting a sample space partitioned into B1, B2, B3 and B4]
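The sum above (the total probability of A over a partition B1..Bn) can be sketched as follows; the four priors and likelihoods are assumed values for illustration only.

```python
def total_probability(likelihoods, priors):
    """p(A) = sum_i p(A|Bi) * p(Bi) over a mutually exclusive,
    exhaustive partition B1..Bn."""
    return sum(l * p for l, p in zip(likelihoods, priors))

# Hypothetical partition into four events B1..B4 (priors sum to 1).
priors = [0.4, 0.3, 0.2, 0.1]        # p(Bi), assumed
likelihoods = [0.5, 0.2, 0.1, 0.9]   # p(A|Bi), assumed
print(round(total_probability(likelihoods, priors), 2))  # 0.37
```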
p(A|B) = p(B|A) × p(A) / [p(B|A) × p(A) + p(B|¬A) × p(¬A)]
• In terms of a hypothesis H and evidence E, the rule becomes

p(H|E) = p(E|H) × p(H) / [p(E|H) × p(H) + p(E|¬H) × p(¬H)]

• where:
o p(H) is the prior probability of hypothesis H being true;
o p(E|H) is the probability that hypothesis H being true will result in evidence E;
o p(¬H) is the prior probability of hypothesis H being false;
o p(E|¬H) is the probability of finding evidence E even when hypothesis H is false.
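The two-hypothesis form can be sketched end to end. The diagnostic-style numbers below are hypothetical, chosen only to show how a small prior interacts with a reliable test.

```python
def posterior(p_e_given_h, p_h, p_e_given_not_h):
    """p(H|E) via the two-hypothesis form of the Bayesian rule:
    p(H|E) = p(E|H)p(H) / [p(E|H)p(H) + p(E|not H)p(not H)]."""
    p_not_h = 1 - p_h
    numer = p_e_given_h * p_h
    return numer / (numer + p_e_given_not_h * p_not_h)

# Hypothetical values: prior p(H) = 0.01, p(E|H) = 0.98, p(E|not H) = 0.05.
print(round(posterior(0.98, 0.01, 0.05), 3))  # ≈ 0.165
```

Even with strong evidence (p(E|H) = 0.98), the small prior keeps the posterior modest, which is the point of weighing p(H) against p(¬H) in the denominator.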