
Bayes’ Rule

Bayes’ Rule - Updating Probabilities


• Let A1,…,Ak be a set of events that partition a sample
space (mutually exclusive and exhaustive) such that:
– each set has known P(Ai) > 0 (each event can occur)
– for any 2 distinct sets Ai and Aj, P(Ai and Aj) = 0 (events are disjoint)
– P(A1) + … + P(Ak) = 1 (each outcome belongs to exactly one of the events)
• If C is an event such that
– 0 < P(C) < 1 (C can occur, but will not necessarily occur)
– we know the probability C will occur given each event Ai: P(C|Ai)
• Then we can compute the probability of Ai given that C occurred:
$$P(A_i \mid C) = \frac{P(C \mid A_i)\,P(A_i)}{P(C \mid A_1)\,P(A_1) + \cdots + P(C \mid A_k)\,P(A_k)} = \frac{P(A_i \text{ and } C)}{P(C)}$$
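As a minimal sketch, the update can be coded directly from this formula; the Python below uses illustrative priors and likelihoods (not values from the slides):

```python
# Bayes' Rule: posteriors P(Ai|C) from priors P(Ai) and likelihoods P(C|Ai).
# The numbers below are illustrative placeholders, not values from the slides.

def bayes_posteriors(priors, likelihoods):
    """Return P(Ai|C) for each event Ai in the partition."""
    # Law of total probability: P(C) = sum_i P(C|Ai) * P(Ai)
    p_c = sum(p * l for p, l in zip(priors, likelihoods))
    return [p * l / p_c for p, l in zip(priors, likelihoods)]

priors = [0.5, 0.3, 0.2]          # P(A1), P(A2), P(A3): must sum to 1
likelihoods = [0.10, 0.40, 0.70]  # P(C|A1), P(C|A2), P(C|A3)
print(bayes_posteriors(priors, likelihoods))
```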
Example - OJ Simpson Trial
• Given Information on Blood Test (T+/T-)
– Sensitivity: P(T+|Guilty)=1
– Specificity: P(T-|Innocent) = .9957 ⇒ P(T+|Innocent) = .0043
• Suppose you have a prior belief of guilt: P(G)=p*
• What is “posterior” probability of guilt after seeing
evidence that blood matches: P(G|T+)?

P(T )  P (T  G )  P (T  I )  P(G ) P (T  | G )  P( I ) P (T  | I ) 
 p * (1)  (1  p*)(.0043)
 P(T  G ) P(G ) P (T  | G ) p * (1) p*
P(G | T )    
P(T  ) P(T  ) p * (1)  (1  p*)(.0043) .9957 p * .0043

Source: B.Forst (1996). “Evidence, Probabilities and Legal Standards for Determination of Guilt: Beyond the OJ Trial”, in
Representing OJ: Murder, Criminal Justice, and the Mass Culture, ed. G. Barak pp. 22-28. Harrow and Heston, Guilderland, NY
OJ Simpson Posterior (to Positive Test) Probabilities
Prior probability of guilt: $P(G) = .10$

$$P(G \mid T^+) = \frac{.10(1)}{.10(1) + .90(.0043)} = \frac{.10}{.10387} = .9627$$
[Plot: P(G|T+) as a function of P(G); x-axis P(G), y-axis P(G|T+)]
Northern Army at Gettysburg
Regiment Label Initial # Casualties P(Ai) P(C|Ai) P(C|Ai)*P(Ai) P(Ai|C)
I Corps A1 10022 6059 0.1051 0.6046 0.0635 0.2630
II Corps A2 12884 4369 0.1351 0.3391 0.0458 0.1896
III Corps A3 11924 4211 0.1250 0.3532 0.0442 0.1828
V Corps A4 12509 2187 0.1312 0.1748 0.0229 0.0949
VI Corps A5 15555 242 0.1631 0.0156 0.0025 0.0105
XI Corps A6 9839 3801 0.1032 0.3863 0.0399 0.1650
XII Corps A7 8589 1082 0.0901 0.1260 0.0113 0.0470
Cav Corps A8 11501 852 0.1206 0.0741 0.0089 0.0370
Arty Reserve A9 2546 242 0.0267 0.0951 0.0025 0.0105
Sum 95369 23045 1 0.2416 (= P(C)) 1.0002

• Regiments partition the soldiers (A1,…,A9); a casualty is event C

• P(Ai) = (size of regiment) / (total soldiers) = (Col 3)/95369
• P(C|Ai) = (# casualties) / (regiment size) = (Col 4)/(Col 3)
• P(C|Ai) P(Ai) = P(Ai and C) = (Col 5)*(Col 6)
• P(C) = sum(Col 7)
• P(Ai|C) = P(Ai and C) / P(C) = (Col 7)/.2416
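A short Python sketch that reproduces the table's probability columns from the regiment sizes and casualty counts above:

```python
# Reproduce the Gettysburg table: P(Ai), P(C|Ai), P(Ai and C), P(C), P(Ai|C).
# Regiment sizes and casualty counts are taken from the table above.
regiments = {
    "I Corps":      (10022, 6059),
    "II Corps":     (12884, 4369),
    "III Corps":    (11924, 4211),
    "V Corps":      (12509, 2187),
    "VI Corps":     (15555, 242),
    "XI Corps":     (9839, 3801),
    "XII Corps":    (8589, 1082),
    "Cav Corps":    (11501, 852),
    "Arty Reserve": (2546, 242),
}

total = sum(size for size, _ in regiments.values())      # 95369 soldiers
p_c = sum(cas / total for _, cas in regiments.values())  # P(C) ~ 0.2416

for name, (size, cas) in regiments.items():
    p_ai = size / total               # P(Ai)
    p_c_given_ai = cas / size         # P(C|Ai)
    p_joint = p_ai * p_c_given_ai     # P(Ai and C)
    p_ai_given_c = p_joint / p_c      # P(Ai|C), by Bayes' rule
    print(f"{name:12s} {p_ai:.4f} {p_c_given_ai:.4f} {p_joint:.4f} {p_ai_given_c:.4f}")

print(f"P(C) = {p_c:.4f}")
```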
CRAPS
• Player rolls 2 Dice (“Come out roll”):
– 2,3,12 - Lose (Miss Out)
– 7,11 - Win (Pass)
– 4,5,6,8,9,10 - Makes point. Roll until point (Win) or 7 (Lose)
– Probability Distribution for first (any) roll:

Roll 2 3 4 5 6 7 8 9 10 11 12
Probability 1/36 2/36 3/36 4/36 5/36 6/36 5/36 4/36 3/36 2/36 1/36
Outcome Lose Lose Point Point Point Win Point Point Point Win Lose
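This distribution can be checked by enumerating the 36 equally likely dice pairs; a quick Python sketch:

```python
# Distribution of the come-out roll: enumerate all 36 ordered pairs of dice.
from collections import Counter

counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))

def outcome(total):
    if total in (2, 3, 12):
        return "Lose"
    if total in (7, 11):
        return "Win"
    return "Point"

for total in range(2, 13):
    print(f"{total:2d}: {counts[total]}/36  {outcome(total)}")
```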

After first roll:


• P(Win|2) = P(Win|3) = P(Win|12) = 0
• P(Win|7) = P(Win|11) = 1
• What about the other conditional probabilities if you make a point?
CRAPS
• Suppose you make a point: (4,5,6,8,9,10)
– You win if your point occurs before a 7; otherwise you lose and stop
– Let P mean you make your point on a roll
– Let C mean you continue rolling (neither point nor 7)
– You win for any of the mutually exclusive events:
• P, CP, CCP, …, CC…CP,…
• If your point is 4 or 10, P(P)=3/36, P(C)=27/36
• By independence and the multiplicative and additive rules:

$$P(P) = \frac{3}{36}, \qquad P(CP) = \left(\frac{27}{36}\right)\left(\frac{3}{36}\right), \qquad \ldots, \qquad P(\underbrace{C \cdots C}_{k}\,P) = \left(\frac{27}{36}\right)^{k}\left(\frac{3}{36}\right)$$

$$\text{Win} = P \cup CP \cup \cdots \cup \underbrace{C \cdots C}_{k}\,P \cup \cdots$$

$$P(\text{Win}) = \frac{3}{36} + \left(\frac{27}{36}\right)\frac{3}{36} + \cdots = \frac{3}{36}\sum_{i=0}^{\infty}\left(\frac{27}{36}\right)^{i} = \frac{3}{36}\cdot\frac{1}{1 - 27/36} = \frac{1}{12}(4) = \frac{1}{3}$$
CRAPS
• Similar patterns arise for points 5, 6, 8, and 9:
– For 5 and 9: P(P) = 4/36 P(C) = 26/36
– For 6 and 8: P(P) = 5/36 P(C) = 25/36

$$\text{Points 5 and 9: } P(\text{Win}) = \frac{4}{36}\sum_{i=0}^{\infty}\left(\frac{26}{36}\right)^{i} = \left(\frac{4}{36}\right)\frac{1}{1 - 26/36} = \left(\frac{4}{36}\right)\left(\frac{36}{10}\right) = \frac{2}{5}$$

$$\text{Points 6 and 8: } P(\text{Win}) = \frac{5}{36}\sum_{i=0}^{\infty}\left(\frac{25}{36}\right)^{i} = \left(\frac{5}{36}\right)\frac{1}{1 - 25/36} = \left(\frac{5}{36}\right)\left(\frac{36}{11}\right) = \frac{5}{11}$$

Finally, we can obtain the player’s probability of winning:


CRAPS - P(Winning)
Come Out Roll P(Roll) P(Win|Roll) P(Roll&Win) (fraction) P(Roll&Win) (decimal) P(Roll|Win)
2 1/36 0 0 0 0
3 2/36 0 0 0 0
4 3/36 1/3 1/36 0.02777778 0.05635246
5 4/36 2/5 2/45 0.04444444 0.09016393
6 5/36 5/11 25/396 0.06313131 0.12807377
7 6/36 1 1/6 0.16666667 0.33811475
8 5/36 5/11 25/396 0.06313131 0.12807377
9 4/36 2/5 2/45 0.04444444 0.09016393
10 3/36 1/3 1/36 0.02777778 0.05635246
11 2/36 1 1/18 0.05555556 0.11270492
12 1/36 0 0 0 0
Sum 1 0.49292929 (= P(Win)) 1

Note: in the previous slides we derived P(Win|Roll); we multiply those
by P(Roll) to obtain P(Roll&Win) and sum those to get P(Win). The last
column gives the probability of each come-out roll given that we won.
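A short Python sketch that reproduces the bottom line of this table using exact fractions (the WAYS mapping counts the ways to roll each total):

```python
# Overall P(Win) at craps: combine the come-out roll distribution with
# P(Win|Roll) from the previous slides, using exact fractions.
from fractions import Fraction

WAYS = {2: 1, 3: 2, 4: 3, 5: 4, 6: 5, 7: 6, 8: 5, 9: 4, 10: 3, 11: 2, 12: 1}

def p_win_given_roll(roll):
    if roll in (7, 11):        # natural: immediate win
        return Fraction(1)
    if roll in (2, 3, 12):     # craps: immediate loss
        return Fraction(0)
    # Point: win if the point occurs before a 7 (geometric-series result)
    return Fraction(WAYS[roll], WAYS[roll] + 6)

p_win = sum(Fraction(WAYS[r], 36) * p_win_given_roll(r) for r in WAYS)
print(p_win, float(p_win))     # 244/495, ~0.49292929
```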
