
Bayesian Nets

Introduction
• Many times, the only knowledge we have about a distribution is which variables are or are not dependent.
• Such dependencies can be represented efficiently using a Bayesian Belief Network (or Belief Net or Bayesian Net).
• Bayesian Nets allow us to represent a joint probability density p(x, y, z, …) efficiently using dependency relationships.

• A belief net is usually a Directed Acyclic Graph (DAG).
• Each node represents one of the system variables.
• Each variable can assume certain states (i.e., values).


• A link joining two nodes is directional and represents a causal influence (e.g., X depends on A, or A influences X).
• Influences could be direct or indirect (e.g., A influences X directly and A influences C indirectly through X).
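
For concreteness, such a structure can be written down as a set of nodes, their states, and their parents. Below is a minimal Python sketch using the fish-example variables that appear later in these slides (A = season, B = locale, X = fish type, C = lightness, D = thickness); the exact state lists are an assumption on my part, since the network figure itself is not reproduced here.

# A belief net's structure as plain dictionaries.
# Node and state names follow the fish example used later in these
# slides; the number of states per node is an assumption.
states = {
    "A": ["a1", "a2", "a3", "a4"],  # season (a3 = summer)
    "B": ["b1", "b2"],              # locale (b1 = North Atlantic, b2 = South Atlantic)
    "X": ["x1", "x2"],              # fish type (x2 = sea bass)
    "C": ["c1", "c2", "c3"],        # lightness (c1 = light, c3 = medium)
    "D": ["d1", "d2"],              # thickness (d2 = thin)
}

# Directed links run from parent to child: A and B influence X
# directly, and influence C and D only indirectly, through X.
parents = {
    "A": [],
    "B": [],
    "X": ["A", "B"],
    "C": ["X"],
    "D": ["X"],
}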


• Each variable is associated with a set of weights which represent prior or conditional probabilities (discrete or continuous).
• Within each such table, the probabilities sum to 1.
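
To make these weights concrete, here is a sketch of the tables as Python dictionaries. Only the handful of values quoted in the worked example below (P(a3) = 0.25, P(b1) = 0.6, P(x2 | a3, b1) = 0.4, P(c3 | x2) = 0.5, P(d2 | x2) = 0.4) come from the slides; every other number is a placeholder chosen only so that each row sums to 1.

# Prior tables for the root nodes A and B, and conditional tables for
# X, C, and D.  Placeholder values are marked as such.
priors = {
    "A": {"a1": 0.25, "a2": 0.25, "a3": 0.25, "a4": 0.25},  # P(a3) = 0.25 from the slides
    "B": {"b1": 0.6, "b2": 0.4},                            # P(b1) = 0.6 from the slides
}

# P(X | A, B): one distribution over {x1, x2} per parent configuration.
p_x_given_ab = {
    ("a3", "b1"): {"x1": 0.6, "x2": 0.4},   # P(x2 | a3, b1) = 0.4 from the slides
    # ... one such row for every other (a, b) pair ...
}

# P(C | X) and P(D | X): one distribution per value of X (placeholders
# except for the two entries quoted in the worked example).
p_c_given_x = {"x2": {"c1": 0.2, "c2": 0.3, "c3": 0.5}}
p_d_given_x = {"x2": {"d1": 0.6, "d2": 0.4}}

# Each row is a probability distribution, so it must sum to 1.
for row in [priors["A"], priors["B"], *p_x_given_ab.values(),
            *p_c_given_x.values(), *p_d_given_x.values()]:
    assert abs(sum(row.values()) - 1.0) < 1e-9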


• Using the chain rule, the joint probability of a set of variables x1, x2, …, xn is given as:

  $p(x_1, x_2, \ldots, x_n) = p(x_1 \mid x_2, \ldots, x_n)\, p(x_2 \mid x_3, \ldots, x_n) \cdots p(x_{n-1} \mid x_n)\, p(x_n)$

• The conditional independence relationships encoded in the Bayesian network state that a node xi is conditionally independent of its ancestors given its parents πi, so the joint simplifies to:

  $p(x_1, x_2, \ldots, x_n) = \prod_{i=1}^{n} p(x_i \mid \pi_i)$   (much simpler!)
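
A minimal sketch of how this factorization is used in code: the probability of one complete assignment is a product of one table lookup per node. The cpt[node][parent_values][value] layout is my own bookkeeping, not something prescribed by the slides.

def joint_probability(assignment, parents, cpt):
    """Evaluate p(x1, ..., xn) = prod_i p(xi | pi_i) for one full assignment.

    assignment: dict mapping each node to its value, e.g. {"A": "a3", ...}
    parents:    dict mapping each node to the list of its parent nodes
    cpt:        cpt[node][parent_values][value] = p(node = value | parents);
                root nodes use the empty tuple () as their parent configuration
    """
    p = 1.0
    for node, value in assignment.items():
        parent_values = tuple(assignment[q] for q in parents[node])
        p *= cpt[node][parent_values][value]
    return p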

• We can compute the probability of any configuration of variables in the joint density, e.g., the probability of catching a medium-lightness, thin sea bass from the North Atlantic in summer:

  P(a3, b1, x2, c3, d2) = P(a3) P(b1) P(x2 | a3, b1) P(c3 | x2) P(d2 | x2)
                        = 0.25 × 0.6 × 0.4 × 0.5 × 0.4 = 0.012
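
Checked directly below; the five factors are exactly the ones quoted above, and the joint_probability sketch from the previous slide returns the same value when given just these five table entries.

# Multiply one table entry per node for the configuration (a3, b1, x2, c3, d2).
p = 0.25 * 0.6 * 0.4 * 0.5 * 0.4   # P(a3) P(b1) P(x2 | a3, b1) P(c3 | x2) P(d2 | x2)
print(round(p, 3))                 # prints 0.012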


• e.g., determine the probability at D
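
The figure carrying out this computation is not reproduced here. A reconstruction of the step it performs, assuming the fish network above (where A and B act on D only through X), is:

  $P(d) = \sum_{x} P(d \mid x)\, P(x), \qquad \text{where } P(x) = \sum_{a} \sum_{b} P(x \mid a, b)\, P(a)\, P(b).$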


• e.g., determine the probability at H:


• Classify a fish given that the fish is light (c1) and was caught in the south Atlantic (b2); there is no evidence about the time of year the fish was caught or about its thickness.
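
The two slides of computation that follow in the original are figures and are not reproduced here; the reasoning they carry out is to sum the joint over the unobserved variables (season A and thickness D). Written out as a reconstruction:

  $P(x \mid c_1, b_2) \;\propto\; \sum_{a} \sum_{d} P(a)\, P(b_2)\, P(x \mid a, b_2)\, P(c_1 \mid x)\, P(d \mid x) \;=\; P(b_2)\, P(c_1 \mid x) \sum_{a} P(a)\, P(x \mid a, b_2),$

since $\sum_{d} P(d \mid x) = 1$. Evaluated with the network's tables, this gives an unnormalized value of about 0.18 for x1 and the 0.066 for x2 quoted on the next slide.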



• Similarly, P(x2 | c1, b2) = α · 0.066

• Normalize the probabilities (not strictly necessary if we only need the most probable class):

  P(x1 | c1, b2) + P(x2 | c1, b2) = 1, which gives α = 1/(0.18 + 0.066)

  P(x1 | c1, b2) = 0.73
  P(x2 | c1, b2) = 0.27
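
Spelling out that arithmetic (a reconstruction using the two unnormalized values above):

  $\alpha = \frac{1}{0.18 + 0.066} \approx 4.07, \qquad P(x_1 \mid c_1, b_2) \approx 0.18 \times 4.07 \approx 0.73, \qquad P(x_2 \mid c_1, b_2) \approx 0.066 \times 4.07 \approx 0.27.$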
