
PROBABILITY CHEAT SHEET

1. Introduction
Let there be three variables:
A with events A = {a1, a2, a3}, B with events B = {b1, b2}, and C with events C = {c1, c2}.
As a concrete example, if A describes the weather, then A contains all possible weather events, e.g.
a1 = sunny, a2 = cloudy and a3 = rainy. We assume that a variable can only take on one event at a time and
not two or more, so the weather cannot be sunny and cloudy at the same time.
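As a minimal sketch, the variables and their event sets could be represented as plain Python collections; the event names chosen for B (temperature) and C (wind) are hypothetical placeholders, not part of the original example:

# Event sets for the three variables; B and C carry made-up meanings for illustration.
A_events = ["sunny", "cloudy", "rainy"]   # a1, a2, a3 (weather)
B_events = ["high_temp", "low_temp"]      # b1, b2 (hypothetical: temperature)
C_events = ["windy", "calm"]              # c1, c2 (hypothetical: wind)

# A variable takes exactly one event at a time, so a single observation could look like:
observation = {"A": "sunny", "B": "high_temp", "C": "calm"}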

2. Probability principles
There are a number of basic probability principles.
• The probability of any event must lie between 0 and 1, e.g. for event a1 of A
0 ≤ P(A = a1) ≤ 1
• The probability of an event plus the probability of its complement must be 1
P(A = a1) + P(A = not a1) = 1
• As a direct consequence, the probabilities of all possible events of a variable must add to 1
P(A = a1) + P(A = a2) + P(A = a3) = ∑_{a∈A} P(A = a) = 1
where the latter expression with the summation sign ∑_{a∈A} is just a different notation, meaning that we
sum over all possible events of variable A, i.e. a1, a2, a3.
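A minimal Python sketch of these principles, using made-up probabilities for the weather variable A:

# Hypothetical distribution over the events of A (weather); the numbers are made up.
P_A = {"sunny": 0.5, "cloudy": 0.3, "rainy": 0.2}

# Every probability lies between 0 and 1.
assert all(0.0 <= p <= 1.0 for p in P_A.values())

# P(A = a1) + P(A = not a1) = 1: the complement collects all other events.
p_not_sunny = sum(p for a, p in P_A.items() if a != "sunny")
assert abs(P_A["sunny"] + p_not_sunny - 1.0) < 1e-12

# The probabilities of all events of A sum to 1.
assert abs(sum(P_A.values()) - 1.0) < 1e-12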

3. Conditional probabilities
The joint probability of two events, e.g. P(A = a1, B = b1), is the probability that both events a1 and b1 occur
jointly, i.e. if B describes the temperature with b1 = high, we observe sunny weather together with a high temperature.
The conditional probability of an event given a second event is defined as
P(A = a1 | B = b1) = P(A = a1, B = b1) / P(B = b1)
Reformulating this expression yields
P(A = a1) × P(B = b1 | A = a1) = P(A = a1, B = b1) = P(B = b1) × P(A = a1 | B = b1)
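A small Python sketch of the definition and the product rule, assuming a hypothetical joint table over A and B (with B read as temperature, b1 = high, b2 = low; all numbers are made up):

# Hypothetical joint distribution P(A, B); the numbers are made up but sum to 1.
P_AB = {
    ("sunny",  "high"): 0.35, ("sunny",  "low"): 0.15,
    ("cloudy", "high"): 0.10, ("cloudy", "low"): 0.20,
    ("rainy",  "high"): 0.05, ("rainy",  "low"): 0.15,
}

P_B_high = sum(p for (a, b), p in P_AB.items() if b == "high")   # P(B = b1)
P_sunny  = sum(p for (a, b), p in P_AB.items() if a == "sunny")  # P(A = a1)

# Definition: P(A = a1 | B = b1) = P(A = a1, B = b1) / P(B = b1)
P_sunny_given_high = P_AB[("sunny", "high")] / P_B_high

# Product rule: P(A = a1) × P(B = b1 | A = a1) = P(A = a1, B = b1)
P_high_given_sunny = P_AB[("sunny", "high")] / P_sunny
assert abs(P_sunny * P_high_given_sunny - P_AB[("sunny", "high")]) < 1e-12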
We have the following basic principles for joint and conditional probabilities.
• The probabilities of all events of a variable given some event must add to 1, e.g.
P(A = a1 | B = b1) + P(A = a2 | B = b1) + P(A = a3 | B = b1) = ∑_{a∈A} P(A = a | B = b1) = 1
• We can calculate the probability of an event by summing the joint probabilities over all events of a second
variable, or even over all event combinations of a second and a third variable
P(A = a1) = ∑_{b∈B} P(A = a1, B = b) = ∑_{b∈B, c∈C} P(A = a1, B = b, C = c)
= P(A = a1, B = b1, C = c1) + P(A = a1, B = b1, C = c2)
+ P(A = a1, B = b2, C = c1) + P(A = a1, B = b2, C = c2)
• Putting the above together, we can write
P(A = a1) = P(B = b1) × P(A = a1 | B = b1) + P(B = b2) × P(A = a1 | B = b2)
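The marginalization and total-probability formulas above can be checked the same way; the joint table and its numbers are again hypothetical:

# Hypothetical joint distribution P(A, B) (made-up numbers that sum to 1).
P_AB = {
    ("sunny",  "high"): 0.35, ("sunny",  "low"): 0.15,
    ("cloudy", "high"): 0.10, ("cloudy", "low"): 0.20,
    ("rainy",  "high"): 0.05, ("rainy",  "low"): 0.15,
}

# Marginalization: P(A = a1) = sum over b of P(A = a1, B = b)
P_sunny = sum(P_AB[("sunny", b)] for b in ("high", "low"))

# Total probability: P(A = a1) = P(B = b1) P(A = a1 | B = b1) + P(B = b2) P(A = a1 | B = b2)
P_B = {b: sum(p for (a, bb), p in P_AB.items() if bb == b) for b in ("high", "low")}
total = sum(P_B[b] * (P_AB[("sunny", b)] / P_B[b]) for b in ("high", "low"))
assert abs(total - P_sunny) < 1e-12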

4. Bayes' Theorem
Bayes' theorem now follows directly from the statements above:
P(A = a1 | B = b1) = P(A = a1, B = b1) / P(B = b1) = P(A = a1) × P(B = b1 | A = a1) / P(B = b1)
We use the following naming convention:
• P(A = a1) is called the prior probability
• P(B = b1 | A = a1) is called the likelihood
• P(B = b1) is called the evidence
• P(A = a1 | B = b1) is called the posterior probability
Conditioned on a third variable C, Bayes' theorem reads
P(A = a1 | B = b1, C = c1) = P(A = a1 | C = c1) × P(B = b1 | A = a1, C = c1) / P(B = b1 | C = c1)
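A minimal numeric sketch of Bayes' theorem as prior × likelihood / evidence; the three input numbers are made up for illustration:

# Hypothetical quantities; the numbers are made up and match the joint table used above.
prior      = 0.5    # P(A = a1),           e.g. P(weather = sunny)
likelihood = 0.7    # P(B = b1 | A = a1),  e.g. P(temp = high | sunny)
evidence   = 0.5    # P(B = b1),           e.g. P(temp = high)

# Posterior: P(A = a1 | B = b1) = P(A = a1) × P(B = b1 | A = a1) / P(B = b1)
posterior = prior * likelihood / evidence
print(posterior)    # 0.7 with these numbers
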
5. Conditional & Absolute independence
We call two variables A and B absolutely independent if for all possible events a ∈ A, b ∈ B we have
P(A = a | B = b) = P(A = a).
If two variables are independent, we can write P(A = a, B = b) = P(A = a) × P(B = b). We call two variables A and
B conditionally independent given a third variable C if for all possible events a ∈ A, b ∈ B, c ∈ C we have
P(A = a | B = b, C = c) = P(A = a | C = c).
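A sketch of how absolute independence could be checked numerically, using a joint table that is constructed to factorize; the numbers are made up:

# Hypothetical marginals; the joint table below factorizes by construction, P(A, B) = P(A) × P(B).
P_A = {"sunny": 0.5, "cloudy": 0.3, "rainy": 0.2}
P_B = {"high": 0.6, "low": 0.4}
P_AB = {(a, b): P_A[a] * P_B[b] for a in P_A for b in P_B}

# Absolute independence: P(A = a | B = b) == P(A = a) for every pair of events.
independent = all(
    abs(P_AB[(a, b)] / P_B[b] - P_A[a]) < 1e-12
    for a in P_A for b in P_B
)
print(independent)  # True for this constructed table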