
PKM 02-09-2021

The very basic concepts/properties of probability

Probability is a real-valued function P defined on the events (subsets) of a sample space S, and it
is characterized by the following three axioms.
Axiom 1. [Probability is non-negative.]
For any event A (⊆ S), P(A) ≥ 0.
Axiom 2. [Total probability is one.]
P(S) = 1.
Axiom 3. [Probability is additive.]
Let A1, A2, . . . be pairwise disjoint events, i.e., Ai ∩ Aj = ∅ for i ≠ j.
[Sometimes also referred to as the Ai's being mutually exclusive.] Then

P(A1 ∪ A2 ∪ · · · ∪ Am) = P(A1) + P(A2) + · · · + P(Am)

for any integer m as well as for m = ∞.
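For illustration, here is a minimal Python sketch, assuming equally likely outcomes of a fair
six-sided die (an assumed example, not taken from the notes); the probability of an event is then
simply the fraction of outcomes it contains, and the three axioms can be checked directly:

from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}                        # sample space of a fair die (assumed example)

def P(event):
    # equally likely outcomes: P(event) = |event| / |S|
    return Fraction(len(event), len(S))

A = {2, 4, 6}                                 # "the roll is even"
B = {1, 3}                                    # disjoint from A

assert P(A) >= 0                              # Axiom 1: non-negativity
assert P(S) == 1                              # Axiom 2: total probability is one
assert P(A | B) == P(A) + P(B)                # Axiom 3: additivity for disjoint events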

The following properties can be derived immediately from the axioms.


(1°) P(∅) = 0.
(2°) If A ⊆ B, then P(A) ≤ P(B).
(3°) For any event A, 0 ≤ P(A) ≤ 1.
(4°) For any event A, P(Aᶜ) = 1 − P(A).
(5°) For any two events A and B, P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
Note that if A and B are disjoint, i.e., A ∩ B = ∅, then P(A ∩ B) = P(∅) = 0, so (5°)
reduces to P(A ∪ B) = P(A) + P(B), which is Axiom 3 applied to two events.
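These properties can be verified numerically in the same fair-die sketch; for instance, with the
assumed events A = "even roll" and B = "roll at least 4", properties (4°) and (5°) hold exactly:

from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
P = lambda E: Fraction(len(E), len(S))        # equally likely outcomes

A = {2, 4, 6}                                 # "the roll is even"
B = {4, 5, 6}                                 # "the roll is at least 4"

assert P(S - A) == 1 - P(A)                   # (4°): P(A complement) = 1 - P(A)
assert P(A | B) == P(A) + P(B) - P(A & B)     # (5°): inclusion-exclusion for two events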

Definition [Conditional Probability]: Let A and B be two events with P(B) > 0. The
conditional probability of A given B is defined as

P(A|B) = P(A ∩ B) / P(B).

If one represents probabilities as areas in a Venn diagram, the conditional probability P(A|B)
can be interpreted as the proportion of the area of B that also lies within A.
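Continuing the fair-die sketch (an assumed example), the definition gives, for A = "even roll" and
B = "roll at least 4", P(A|B) = (2/6)/(3/6) = 2/3, i.e., two of the three outcomes in B are even:

from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
P = lambda E: Fraction(len(E), len(S))

A = {2, 4, 6}                                 # "the roll is even"
B = {4, 5, 6}                                 # "the roll is at least 4"

P_A_given_B = P(A & B) / P(B)                 # definition of conditional probability
assert P_A_given_B == Fraction(2, 3)          # two of the three outcomes in B lie in A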

From the definition of conditional probability, one can easily derive useful formulae for calculating
joint probabilities and for manipulating other probabilities.
(6°) P(A ∩ B) = P(A)P(B|A) = P(B)P(A|B)
(7°) P(A1 ∩ A2 ∩ · · · ∩ An) = P(A1) · P(A2|A1) · P(A3|A1 ∩ A2) · · · P(An|A1 ∩ A2 ∩ · · · ∩ An−1)
(8°) [Total probability law]: Suppose the events A1, A2, . . . , An form a partition of S, i.e.,
Ai ∩ Aj = ∅ for i ≠ j and S = A1 ∪ A2 ∪ · · · ∪ An. Suppose also that P(Ai) > 0 for all i.
Then for any event B,

P(B) = P(A1)P(B|A1) + P(A2)P(B|A2) + · · · + P(An)P(B|An).
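A small sketch of the total probability law (8°) with n = 3, using made-up values for P(Ai) and
P(B|Ai) chosen purely for illustration:

from fractions import Fraction

# Hypothetical partition A1, A2, A3 and event B; the numbers below are made up for illustration.
P_A = [Fraction(1, 2), Fraction(3, 10), Fraction(1, 5)]          # P(A1), P(A2), P(A3); they sum to 1
P_B_given_A = [Fraction(1, 10), Fraction(1, 5), Fraction(1, 2)]  # P(B|A1), P(B|A2), P(B|A3)

P_B = sum(p * q for p, q in zip(P_A, P_B_given_A))               # total probability law (8°)
assert P_B == Fraction(21, 100)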

Definition [Independence]: Two events A and B are said to be independent if their joint
probability is given by the product of the individual probabilities, i.e., P(A ∩ B) = P(A)P(B).
This is equivalent to the conditional probability of one event, say A, given the other, B, being
the same as the unconditional probability of the former, i.e.,

P(A|B) = P(A) and P(B|A) = P(B),

provided, of course, that the conditional probabilities exist, i.e., that the conditioning event has
positive probability.
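A sketch with two fair dice (an assumed example): the events A = "first die is even" and
B = "second die is at least 5" satisfy the definition, and equivalently P(A|B) = P(A):

from fractions import Fraction
from itertools import product

S = set(product(range(1, 7), repeat=2))       # two fair dice: 36 equally likely outcomes
P = lambda E: Fraction(len(E), len(S))

A = {s for s in S if s[0] % 2 == 0}           # "first die is even"
B = {s for s in S if s[1] >= 5}               # "second die is at least 5"

assert P(A & B) == P(A) * P(B)                # A and B are independent
assert P(A & B) / P(B) == P(A)                # equivalently, P(A|B) = P(A)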

The definition of independence for more than two events is a bit more complicated. Here the
condition is that the joint probability of any sub-collection of the events must be equal to the
product of the individual probabilities. Formally,
Definition [Independence]: Events A1, A2, . . . , An are said to be independent if for every k
(2 ≤ k ≤ n) and for every set of distinct indices i1, i2, . . . , ik from {1, 2, . . . , n}, we have
P(Ai1 ∩ Ai2 ∩ · · · ∩ Aik) = P(Ai1)P(Ai2) · · · P(Aik).
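A sketch that checks this definition directly by testing every sub-collection of size k ≥ 2; the three
events "die i shows an even number", i = 1, 2, 3, for three fair dice (an assumed example) pass
the check:

from fractions import Fraction
from itertools import combinations, product
from math import prod

def are_independent(events, S):
    # check every sub-collection of size 2 <= k <= n against the product formula
    P = lambda E: Fraction(len(E), len(S))
    for k in range(2, len(events) + 1):
        for sub in combinations(events, k):
            inter = set(S)
            for E in sub:
                inter &= E
            if P(inter) != prod(P(E) for E in sub):
                return False
    return True

S = set(product(range(1, 7), repeat=3))       # three fair dice: 216 equally likely outcomes
events = [{s for s in S if s[i] % 2 == 0} for i in range(3)]   # "die i shows an even number"
assert are_independent(events, S)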

Remark: The property of disjointness is a physical concept. The property of independence,
however, is a mathematical concept (and not a physical one). Sometimes one may derive
independence from physical properties, but one should be very careful about using intuitive
arguments to assume independence of events. Intuitive reasoning may lead to a mistaken
assumption of independence. Sometimes events may seem (physically) independent whereas,
according to the mathematical definition, they are not. And vice versa: some events may not
seem to be independent (physically), but (mathematically) they are.
