
Beyond finite sample space I

Presidency University

November 2021
Classical definition: Issues

- The classical definition of probability is applicable only to finite sample spaces.

- Many people criticize the classical definition on the ground that we cannot find the probability of picking a number from (0, 1) using this definition.

- However, this criticism is not valid, because the classical definition was never meant for infinite sample spaces: we need to extend the notion of probability to such cases.

- Another criticism of the classical definition concerns the phrase "equally likely", which is just another term for "equally probable". Thus it seems that we are using the notion of probability to define probability.
- However, in the classical definition we assume that the outcomes are equally likely, and we use the definition to find the probability of events (not outcomes).

- There is no point in asking the probability of a head while tossing an unbiased coin; we may ask the probability of obtaining two heads.

- Thus the classical definition of probability, rather than defining probability, is a tool for the algebra of probability.

- Nevertheless, we shall look to extend the concept of probability to infinite sample spaces.

- But before such an extension, we must decide when we can accept a number as the probability of an event.
Requirements of probability

- For any event A we want to associate a number P(A) which represents the probability of the event; that is, P(A) is the chance that the event A will occur.

- There are some natural demands to be satisfied (all of which follow from our intuition, which has been developed from the classical definition of probability).

- We shall treat P(A) as a probability if the following demands are satisfied:
  - P(∅) = 0
  - P(Ω) = 1
  - For disjoint events A1, A2, ..., P(∪ᵢ Aᵢ) = Σᵢ P(Aᵢ).
Classical definition: alternative formulation

- In order to extend the idea of probability, let us revisit the classical definition once again.

- For any event A we can write

      P(A) = m/n

  where n is the total number of outcomes of the random experiment and m is the number of outcomes favourable to A.

- More precisely, if we have Ω = {ω1, ω2, ..., ωn} and A = {ωi1, ωi2, ..., ωim}, then

      P(A) = m/n

  assuming the outcomes ωi's are equally likely.
- Let us look at this definition in an alternative way.

- For each outcome ωi of Ω we can assign a probability p(ωi) = 1/n.

- And then for any A ⊆ Ω, we can define

      P(A) = Σ_{ω∈A} p(ω) = Σ_{i=1}^{m} 1/n = m/n.

- This particular assignment of probabilities to each outcome ωi of Ω is called the equally likely assignment.

- This is just another way of saying equally probable.
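As a quick sketch of this alternative formulation (the sample space and event below are illustrative, not from the text), computing P(A) amounts to summing outcome probabilities p(ω) = 1/n over the outcomes in A:

```python
from fractions import Fraction

# Equally likely assignment on a finite sample space:
# every outcome gets p(ω) = 1/n, and P(A) = Σ_{ω∈A} p(ω) = m/n.
omega = ["HH", "HT", "TH", "TT"]          # two tosses of a coin
p = {w: Fraction(1, len(omega)) for w in omega}

def P(A):
    """Probability of an event A ⊆ Ω as a sum of outcome probabilities."""
    return sum(p[w] for w in A)

A = {"HH", "HT", "TH"}                    # the event "at least one head"
print(P(A))                               # 3/4
```

Summing p(ω) over A recovers exactly the classical m/n, here 3/4.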


Equally likely assignment is not possible for countable Ω

- Clearly, when Ω = {ω1, ω2, ...} is not finite, such an equally likely assignment is not possible.

- But the alternative formulation of the classical definition now shows us that we can look for other kinds of assignments, in particular "unequally likely" assignments.

- As an analogy, imagine dividing a pizza among a number of people. If there is a finite number of people, you may think of dividing the pizza equally among all. But if the number suddenly grows too large, there is no point in equal division, because then virtually everybody gets nothing. On the other hand, we can look for an unequal sharing of the pizza, so that some get major chunks while others remain empty-handed.
Extension I: Countable sample space

- Thus, as an extension of the idea of probability to countable Ω, we suggest the following procedure.

- Since Ω is countable, we can enlist Ω as Ω = {ω1, ω2, ...}.

- Now for each outcome ω ∈ Ω, let us attach a real number p(ω) such that

      p(ω) ≥ 0 and Σ_{ω∈Ω} p(ω) = 1.

- Then for any event A ⊆ Ω, we define the probability of A as

      P(A) = Σ_{ω∈A} p(ω).
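A minimal sketch of this procedure, using the assignment p(ωᵢ) = 1/2ⁱ (the geometric weights that appear in a later example). A computer cannot sum infinitely many terms, so the normalization Σ p(ωᵢ) = 1 is checked here by truncating the series:

```python
from fractions import Fraction

# Countable-Ω procedure: attach non-negative weights summing to 1,
# then P(A) is the sum of weights over the outcomes in A.
def p(i):
    return Fraction(1, 2**i)              # p(ω_i) = 1/2^i for i = 1, 2, ...

def P(A):
    """P(A) = Σ_{ω∈A} p(ω) for a finite event A ⊆ {1, 2, 3, ...}."""
    return sum(p(i) for i in A)

total = sum(p(i) for i in range(1, 60))   # partial sum of Σ_{i≥1} 1/2^i
print(float(total))                       # ≈ 1.0
print(P({1, 2}))                          # 1/2 + 1/4 = 3/4
```

Any other non-negative weights summing to 1 would serve equally well; the choice of p is what the notes later call a subjective assignment.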
P(.) is a probability

- Our next job is to show that P(.) defined above is indeed a probability.

- We note that P(∅) = 0 and

      P(Ω) = Σ_{ω∈Ω} p(ω) = 1.

- Moreover, if we consider two disjoint events A and B such that A = {ωi1, ωi2, ...} and B = {ωj1, ωj2, ...}, then

      A ∪ B = {ωi1, ωj1, ωi2, ωj2, ...}

  so that

      P(A ∪ B) = Σ_k p(ωik) + Σ_k p(ωjk) = P(A) + P(B).
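The additivity step can be sanity-checked numerically (this is an illustration, not a proof): for disjoint events the defining sum simply splits into two sums. The weights 1/2ⁱ below are an assumed example assignment:

```python
from fractions import Fraction

# For disjoint A and B, the sum defining P(A ∪ B) splits term by term,
# so P(A ∪ B) = P(A) + P(B).
def p(i):
    return Fraction(1, 2**i)

def P(A):
    return sum(p(i) for i in A)

A, B = {1, 3}, {2, 5}                     # two disjoint events
assert A.isdisjoint(B)
print(P(A | B) == P(A) + P(B))            # True
```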
Example

- Suppose Ω = {0, 1, 2, ...}. Such an Ω occurs when we count the frequencies of random events.

- For each i ∈ Ω, let us attach a real number

      p(i) = e^{-2} 2^i / i!.

- Then we note that p(i) ≥ 0 for each i and

      Σ_{i∈Ω} p(i) = e^{-2} Σ_{i=0}^{∞} 2^i/i! = e^{-2} × e^2 = 1.
- Hence for any A ⊆ Ω, we have

      P(A) = Σ_{i∈A} e^{-2} 2^i / i!.

- For example, if A = {0, 1} we have

      P(A) = e^{-2} 2^0/0! + e^{-2} 2^1/1! = 3e^{-2} ≈ 0.41.
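A short check of this worked example:

```python
import math

# With p(i) = e^{-2} 2^i / i!, the event A = {0, 1} has
# P(A) = p(0) + p(1) = 3 e^{-2} ≈ 0.406.
def p(i):
    return math.exp(-2) * 2**i / math.factorial(i)

print(round(p(0) + p(1), 2))                     # 0.41
print(round(sum(p(i) for i in range(100)), 6))   # partial sum of Σ p(i): 1.0
```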
Example

- Consider the random experiment where a coin is tossed until a head appears.

- Here the sample space is Ω = {H, TH, TTH, ...} = {ω1, ω2, ω3, ...}.

- For each i = 1, 2, ... we assign a real number

      p(ωi) = 1/2^i.

- Then we have p(ωi) ≥ 0 for each i and

      Σ_{ω∈Ω} p(ω) = Σ_{i=1}^{∞} 1/2^i = (1/2)/(1 − 1/2) = 1.

- Finally, for any event A ⊆ Ω, we define

      P(A) = Σ_{ω∈A} p(ω).
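To illustrate this example with a concrete event (the event chosen here is my own, not from the text), take A = "the first head appears within the first three tosses", i.e. A = {ω1, ω2, ω3}:

```python
from fractions import Fraction

# Toss-until-head example: outcome ω_i = "first head on toss i"
# gets p(ω_i) = 1/2^i.
def p(i):
    return Fraction(1, 2**i)

# A = {ω_1, ω_2, ω_3}: head appears within the first three tosses.
P_A = sum(p(i) for i in (1, 2, 3))
print(P_A)                                # 7/8
```

Here P(A) = 1/2 + 1/4 + 1/8 = 7/8, exactly as the definition prescribes.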
Subjective Probability

- How should we assign p(ω) for each ω ∈ Ω?

- We note that, for any A ⊆ Ω, the value of P(A) depends on the assignment of p(ω).

- This assignment is based on our subjective belief regarding the outcomes of the random experiment.

- Recall that the probability of any event A is the quantification of our subjective belief, and the selection of p(ω) is the way of exercising that belief.
- In the previous example, if we believe that the coin is fair, then that belief is reflected through the choice p(ωi) = 1/2^i.

- On the other hand, if we have reason to believe that the coin is biased with probability of head p, then we could have assigned

      p(ωi) = p(1 − p)^{i−1}.

- The intuition behind the selection of p(i) in the count-data example will be discussed later, but the bottom line is that we could have worked with alternative choices such as

      p(i) = e^{-λ} λ^i / i!

  for any λ > 0.
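A quick sketch showing that the biased-coin assignment is valid for any head-probability p in (0, 1): the weights p(1 − p)^{i−1} are non-negative and their (truncated) geometric series sums to approximately 1, so each choice of p encodes a different subjective belief.

```python
# Biased-coin assignment p(ω_i) = p (1 - p)^{i-1} on i = 1, 2, ...
# Truncating after many terms, the weights should total ≈ 1 for any p.
def weights(p, n_terms=1000):
    return [p * (1 - p) ** (i - 1) for i in range(1, n_terms + 1)]

for p in (0.5, 0.3, 0.9):
    w = weights(p)
    print(p, round(sum(w), 6))            # each truncated total ≈ 1.0
```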
- On the other hand, even if the sample space is finite, we can opt for an alternative assignment of p(ω) to the outcomes.

- The classical definition of probability considers the equally likely assignment, but that is not the only one.

- For example, consider Ω = {0, 1, 2, ..., n} where n is a natural number.

- If we follow the classical definition we should choose

      p(i) = 1/(n + 1) ∀ i.

- But now we can allow an unequal allocation like

      p(0) = p(1) = 1/2, p(i) = 0 ∀ i > 1.
- Alternatively we may choose

      p(i) = C(n, i) / 2^n ∀ i,

  so that p(i) ≥ 0 ∀ i and Σ p(i) = 1.

- More generally, one may choose

      p(i) = C(n, i) p^i (1 − p)^{n−i} ∀ i,

  for some number p ∈ (0, 1).
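A sketch of this general assignment on Ω = {0, 1, ..., n}: the weights C(n, i) pⁱ (1 − p)ⁿ⁻ⁱ are non-negative and sum to (p + (1 − p))ⁿ = 1 by the binomial theorem, so every p in (0, 1) yields a valid assignment.

```python
from math import comb

# Binomial assignment p(i) = C(n, i) p^i (1 - p)^(n-i) on Ω = {0, ..., n};
# the total is 1 by the binomial theorem.
def weights(n, p):
    return [comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(n + 1)]

w = weights(10, 0.3)                      # n = 10, p = 0.3 chosen arbitrarily
print(round(sum(w), 10))                  # 1.0
print(all(x >= 0 for x in w))             # True
```

Taking p = 1/2 recovers the earlier choice p(i) = C(n, i)/2^n as a special case.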
