Lecture 2


Chapter 2: Probability

Shilpa G.

1 / 24
Thinking conditionally
• Probability is a language for expressing our degrees of belief or
uncertainties about events.
• Whenever we observe new evidence (i.e., obtain data), we
acquire information that may affect our uncertainties.
• Conditional probability is the concept that addresses this
fundamental question: how should we update our beliefs in
light of the evidence we observe?
• Conditional probability is essential for scientific, medical, and
legal reasoning, since it shows how to incorporate evidence
into our understanding of the world in a logical, coherent
manner.
• Most probabilities are conditional; whether or not it’s written
explicitly, there is always background knowledge (or
assumptions) built into every probability.
• Conditioning is the soul of statistics!
2 / 24
Conditional probability
Definition
Let A1 and A2 be events such that P[A1] ≠ 0. The conditional
probability of A2 given A1, denoted by P[A2|A1], is defined by

    P[A2|A1] := P[A1 ∩ A2] / P[A1].

Here A2 is the event whose uncertainty we want to update, and A1
is the evidence we observe (or want to treat as given). We call
P[A2] the prior probability of A2 and P[A2|A1] the posterior
probability of A2. Multiplying both sides of the definition by
P[A1] gives the multiplication rule

    P[A1 ∩ A2] = P[A2|A1] P[A1].

3 / 24
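The definition can be sanity-checked numerically. Below is a minimal sketch (not from the slides) that estimates a conditional probability by simulation, using an assumed toy experiment: roll two fair dice, with A1 = "the first die shows 6" and A2 = "the two dice sum to at least 10".

```python
import random

# Estimate P[A2|A1] and compare with the exact value 3/6 = 0.5.
# A1 = "first die shows 6", A2 = "sum of the two dice is at least 10".
trials = 100_000
count_a1 = 0
count_a1_and_a2 = 0
for _ in range(trials):
    d1 = random.randint(1, 6)
    d2 = random.randint(1, 6)
    if d1 == 6:
        count_a1 += 1                      # A1 occurred
        if d1 + d2 >= 10:
            count_a1_and_a2 += 1           # A1 ∩ A2 occurred

# Definition in action: P[A2|A1] = P[A1 ∩ A2] / P[A1],
# estimated here as (count of A1 ∩ A2) / (count of A1).
print("estimated P[A2|A1] =", count_a1_and_a2 / count_a1)
```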
Thinking ’independently’
• We have seen several examples where conditioning on one event
changes our beliefs about the probability of another event.
The situation where events provide no information about each
other is called independence.
• Alternatively, A and B are independent if learning that B
occurred gives us no information that would change our
probabilities for A occurring (and vice versa).
• Note that independence is a symmetric relation: if A is
independent of B, then B is independent of A.
• Independence is completely different from disjointness.

4 / 24
Thinking ’independently’

• If A provides no information about whether or not B occurred, then
it also provides no information about whether or not Bᶜ occurred.
• Events A and B with nonzero probabilities are independent iff
P(A|B) = P(A), which in turn holds iff P(B|A) = P(B).
Definition
Events A1 and A2 are independent if P[A1 ∩ A2] = P[A1] P[A2].
• Let C = {Ai | i = 1, . . . , n} be a finite collection of events. These
events are independent iff, for any subcollection A(1), . . . , A(m) of
elements of C,

    P[A(1) ∩ A(2) ∩ · · · ∩ A(m)] = ∏_{i=1}^{m} P[A(i)].

Proposition. If A and B are independent, then A and Bᶜ are independent,
Aᶜ and B are independent, and Aᶜ and Bᶜ are independent. (For the first
claim: P(A ∩ Bᶜ) = P(A) − P(A ∩ B) = P(A) − P(A)P(B) = P(A)P(Bᶜ); the
other two follow in the same way.)
5 / 24
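As a small illustration of why the definition for a collection requires the product rule for every subcollection, here is a sketch (not from the slides, using the standard two-fair-coin example): every pair of events satisfies the product rule, yet the three events together do not.

```python
from fractions import Fraction
from itertools import product

# Two fair coin flips; each of the 4 outcomes has probability 1/4.
outcomes = list(product('HT', repeat=2))
prob = Fraction(1, len(outcomes))

def P(event):
    """Probability of an event, given as a predicate on outcomes."""
    return sum(prob for w in outcomes if event(w))

A = lambda w: w[0] == 'H'          # first flip is heads
B = lambda w: w[1] == 'H'          # second flip is heads
C = lambda w: w[0] == w[1]         # the two flips agree

# Every pair satisfies the product rule ...
print(P(lambda w: A(w) and B(w)) == P(A) * P(B))          # True
print(P(lambda w: A(w) and C(w)) == P(A) * P(C))          # True
print(P(lambda w: B(w) and C(w)) == P(B) * P(C))          # True
# ... but the whole collection does not: P[A ∩ B ∩ C] = 1/4 ≠ 1/8.
print(P(lambda w: A(w) and B(w) and C(w)) == P(A) * P(B) * P(C))  # False
```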
Example

Problem
A survey of 1085 adults asked, “Do you enjoy shopping for
clothing for yourself?” The results (data extracted from “Split
Decision on Clothes Shopping,” USA Today, January 28, 2011, p.
1B) indicated that 51% of the females enjoyed shopping for
clothing for themselves, compared with 44% of the males. The
sample sizes of males and females were not provided. Suppose that
the results were as shown in the following table:
Enjoys shopping for clothing   Male   Female   Total
Yes                             238      276
No                              304      267
Total

7 / 24
a) Suppose that the respondent chosen is a female. What is the
probability that she does not enjoy shopping for clothing?
b) Suppose that the respondent chosen enjoys shopping for
clothing. What is the probability that the individual is a male?
c) Are enjoying shopping for clothing and the gender of the
individual independent? Justify.

8 / 24
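One way to check hand computations for the three parts is the short sketch below (not part of the slides); the totals used are obtained simply by adding the given cells of the table.

```python
# Counts read off the table; totals follow by addition (they sum to 1085).
yes_male, yes_female = 238, 276
no_male, no_female = 304, 267

total = yes_male + yes_female + no_male + no_female   # 1085
males = yes_male + no_male                             # 542
females = yes_female + no_female                       # 543
enjoys = yes_male + yes_female                         # 514

# a) P(does not enjoy | female) = P(No ∩ Female) / P(Female)
print(no_female / females)                 # ≈ 0.492

# b) P(male | enjoys) = P(Yes ∩ Male) / P(Yes)
print(yes_male / enjoys)                   # ≈ 0.463

# c) Independence check: compare P(Yes | Male) with P(Yes).
print(yes_male / males, enjoys / total)    # ≈ 0.439 vs ≈ 0.474 → not independent
```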
Example
Problem
A standard deck of cards is shuffled well. Two cards are drawn
randomly, one at a time without replacement. Let A be the event
that the first card is a heart, and B be the event that the second
card is red. Find P[A|B] and P[B|A].

9 / 24
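The exact answers can be cross-checked by simulation. Here is a minimal sketch (not from the slides), assuming 'H', 'D', 'C', 'S' encode the four suits.

```python
import random

# A = first card is a heart, B = second card is red (heart or diamond).
deck = ['H'] * 13 + ['D'] * 13 + ['C'] * 13 + ['S'] * 13
trials = 200_000
a_count = b_count = ab_count = 0
for _ in range(trials):
    first, second = random.sample(deck, 2)   # two cards without replacement
    a = first == 'H'
    b = second in ('H', 'D')
    a_count += a
    b_count += b
    ab_count += a and b

print("P[B|A] ~", ab_count / a_count)   # exact value: 25/51 ≈ 0.490
print("P[A|B] ~", ab_count / b_count)   # exact value: 25/102 ≈ 0.245
```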
Prove and apply for the problems...
• P(A|S) = P(A).
• If B ⊂ A and P(B) ≠ 0, then P(A|B) = 1.
• If A ∩ B = ∅ and P(B) ≠ 0, then P(A|B) = 0.
• If A ∩ B = ∅ and P(C) ≠ 0, then
  P((A ∪ B)|C) = P(A|C) + P(B|C).
• For any events A and B with positive probabilities,

  P(A ∩ B) = P(A|B)P(B) = P(B|A)P(A).

• In general, we have

  P[E1 ∩ E2 ∩ · · · ∩ En] =
  P[E1] P[E2|E1] P[E3|E1 ∩ E2] · · · P[En|E1 ∩ · · · ∩ En−1].

• If P(A) ≠ 0 and P(B) ≠ 0, then P(A|B) ≥ P(A) is
  equivalent to P(B|A) ≥ P(B).
10 / 24
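The general multiplication rule above can be illustrated with a small exhaustive check (a sketch, not from the slides): deal three cards without replacement and let Ei be the event that the i-th card dealt is a heart.

```python
from fractions import Fraction
from itertools import permutations

# Deck encoded by suit only; 'H' marks the 13 hearts.
deck = ['H'] * 13 + ['X'] * 39

# Left-hand side: P[E1 ∩ E2 ∩ E3], counted over all ordered 3-card deals.
favorable = sum(deck[i] == deck[j] == deck[k] == 'H'
                for i, j, k in permutations(range(52), 3))
lhs = Fraction(favorable, 52 * 51 * 50)

# Right-hand side of the chain rule: P[E1] P[E2|E1] P[E3|E1 ∩ E2].
rhs = Fraction(13, 52) * Fraction(12, 51) * Fraction(11, 50)

print(lhs == rhs, float(rhs))   # True, ≈ 0.0129
```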
