BS 4
BS/IIFT/Harsh/4 2
Session Objective
Content
Why Learn Probability?
Nothing in life is certain. In everything we do, we gauge the chances
of successful outcomes, from business to medicine to the weather
Probability provides a quantitative description of the chances or
likelihoods associated with various outcomes
It provides a bridge between Descriptive and Inferential Statistics
[Diagram: Probability reasons from the Population to a Sample; Statistics reasons from the Sample back to the Population.]
Probabilistic vs. Statistical Reasoning
What is Probability?
Probability is the chance that something will happen, measured by "how often" it happens: in the long run, the relative frequency of an outcome over repeated trials (from sample to population) is its probability.
Basic Concepts
Experiments & Events
Sample Spaces
Basic Concepts
Example
The die toss:
Simple events: E1 = observe a 1, E2 = observe a 2, E3 = observe a 3, E4 = observe a 4, E5 = observe a 5, E6 = observe a 6
Sample space: S = {E1, E2, E3, E4, E5, E6}
[Venn diagram: the six simple events E1, ..., E6 shown as points inside S.]
Basic Concepts
Dependent or Independent Events
The Event of a Happy Face GIVEN it is Light Colored
•Tree Diagrams
Contingency Table
A Deck of 52 Cards

         Ace   Not an Ace   Total
Red       2        24         26
Black     2        24         26
Total     4        48         52

The table displays the sample space.
Tree Diagram
Event possibilities:

Full Deck of Cards
  Red Cards
    Ace
    Not an Ace
  Black Cards
    Ace
    Not an Ace
Basic Concepts
Mutually Exclusive Events
Exhaustive Events
Favorable Events
The number of cases favorable to an event in a trial is
the number of outcomes which entail the happening of
the event.
Example
In drawing a card from a pack of cards, the number of cases favorable to drawing an ace is 4, to drawing a spade is 13, and to drawing a red card is 26.
In throwing two dice, the number of cases favorable to getting a sum of 5 is 4: (1,4), (4,1), (2,3), (3,2).
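A quick way to check such counts is to enumerate the whole sample space; a minimal Python sketch:

```python
from itertools import product

# Enumerate all outcomes of throwing two dice and keep those
# favorable to the event "the sum is 5".
outcomes = list(product(range(1, 7), repeat=2))
favorable = [pair for pair in outcomes if sum(pair) == 5]

print(len(outcomes))   # 36 outcomes in total
print(favorable)       # [(1, 4), (2, 3), (3, 2), (4, 1)]
```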
Equally Likely Events
Independent Events
Events are said to be independent if the happening or non-happening of one event is not affected by supplementary knowledge concerning the occurrence of any number of the remaining events.
Examples
In tossing an unbiased coin, the event of getting a head on the first toss is independent of getting a head on the second, third, and subsequent tosses.
When a card is drawn from a well-shuffled pack and replaced before the next draw, the second draw is independent of the first. But if the card drawn first is not replaced, the second draw is dependent on the first.
The Probability of an Event
• Examples:
–Toss a fair coin. P(Head) = 1/2
– Suppose that 10% of the Indian population has
brown hair. Then for a person selected at random,
P(Brown hair) =0.10
Example 2
Example 1
Example 2
The sample space of throwing a pair of dice consists of the 36 equally likely ordered pairs (1,1), (1,2), ..., (6,6).
Example 3
Counting Rules
The mn Rule
If an experiment is performed in two stages, with m
ways to accomplish the first stage and n ways to
accomplish the second stage, then there are mn ways
to accomplish the experiment.
This rule is easily extended to k stages, with the number of ways equal to
n1 × n2 × n3 × … × nk

Example: Toss three coins. The total number of simple events is 2 × 2 × 2 = 8.
Example: Toss two dice. The total number of simple events is 6 × 6 = 36.
Example: Toss three dice. The total number of simple events is 6 × 6 × 6 = 216.
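The three examples can be confirmed by listing the stage-by-stage outcomes with Python's itertools (a sketch, not part of the original slides):

```python
from itertools import product

# mn rule: k stages with n1, n2, ..., nk ways each give
# n1 * n2 * ... * nk outcomes; verified by direct enumeration.
three_coins = list(product("HT", repeat=3))
two_dice = list(product(range(1, 7), repeat=2))
three_dice = list(product(range(1, 7), repeat=3))

print(len(three_coins))  # 2 * 2 * 2 = 8
print(len(two_dice))     # 6 * 6 = 36
print(len(three_dice))   # 6 * 6 * 6 = 216
```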
Classical Probability
Assigning probabilities based on the assumption of
equally likely outcomes.
Probability for an event to occur
= (number of outcomes where the event occurs) / (total number of possible outcomes)

P(H) = 1/2, the probability of getting a head.
P(3) = 1/6, the probability of getting a 3 on rolling a die, with sample space S = {1, 2, 3, 4, 5, 6}.
If an experiment has n possible outcomes, this method assigns a probability of 1/n to each outcome.
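As a small illustration of the classical assignment (Python's exact fractions avoid rounding):

```python
from fractions import Fraction

# Classical assignment: n equally likely outcomes, probability 1/n each.
S = [1, 2, 3, 4, 5, 6]                       # one roll of a fair die
p = {outcome: Fraction(1, len(S)) for outcome in S}

print(p[3])             # 1/6
print(sum(p.values()))  # 1: the probabilities sum to one
```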
Contd. - Shortcomings of the Classical Approach
"A priori probability" is another name for classical probability, since we can state the probability of an outcome in advance, without actually tossing the fair coin, rolling the unbiased die, or drawing a card from a standard deck.
A priori probability estimates are made prior to receiving new information.
Do we encounter such idealized situations in the real management world? No!
The classical approach to probability assumes a world that does not exist.
Relative frequency of occurrence
The relative frequency of occurrence is the proportion of times that an event occurs in the long run when conditions are stable; that is, the observed relative frequency of an event in a very large number of trials.
Probabilities are assigned based on experimentation or historical data.
By increasing the number of observations, we increase the accuracy of the estimate.
Example: Lucas Tool Rental
Relative Frequency Method
The probability assignments are given by dividing
the number-of-days frequencies by the total frequency.
Number of          Number
Polishers Rented   of Days   Probability
       0              4       4/40 = .10
       1              6       6/40 = .15
       2             18      18/40 = .45
       3             10      10/40 = .25
       4              2       2/40 = .05
Total                40              1.00
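The table's arithmetic can be reproduced in a few lines of Python (frequencies taken from the table above):

```python
# Relative frequency method: P(k polishers rented) is estimated by
# (days with k rentals) / (total days observed).
days = {0: 4, 1: 6, 2: 18, 3: 10, 4: 2}
total = sum(days.values())                 # 40 days observed
probs = {k: n / total for k, n in days.items()}

print(total)     # 40
print(probs[2])  # 0.45
```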
Subjective Probability
Subjective Method
Assigning probabilities based on the assigner’s judgment.
Probabilities need not be based solely on historical data: we can use any data available, as well as our experience and intuition, but ultimately the probability value should express our degree of belief that the experimental outcome will occur.
The best probability estimates are often obtained by combining estimates from the classical or relative frequency approaches with subjective estimates.
Example
Subjective probability plays a role when an event occurs only once, or at most a few times.
Suppose there are three candidates for a social-service position, and all three have an attractive appearance, a high level of energy, self-confidence, a record of past accomplishments, and a willingness to face challenges. What are the chances that each will relate to clients successfully? Choosing among the three requires assigning a subjective probability to each person's potential.
Or consider choosing a site for the construction of a nuclear power plant. With no past record of geological faults at the site, what is the probability of a major nuclear disaster? This, too, is a matter of subjective probability.
Permutations
The number of ways of arranging r objects chosen from n distinct objects, in order, is
P(n, r) = n! / (n − r)!
Example: How many 2-digit lock combinations can we make from the numbers 1, 2, 3, and 4?
Answer: P(4, 2) = 4!/2! = 12:
(1,2), (1,3), (1,4)
(2,1), (2,3), (2,4)
(3,1), (3,2), (3,4)
(4,1), (4,2), (4,3)
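These twelve codes can be generated directly; a short Python check:

```python
from itertools import permutations

# All ordered 2-digit codes from the digits 1-4: P(4, 2) = 4!/2! = 12.
codes = list(permutations([1, 2, 3, 4], 2))

print(len(codes))  # 12
print(codes[:3])   # [(1, 2), (1, 3), (1, 4)]
```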
Examples
By convention, 0! = 1, so P(n, n) = n!/0! = n!.
Combinations
The number of distinct combinations of n distinct objects that can be formed, taking them r at a time, is
C(n, r) = n! / (r! (n − r)!)
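Python's math module exposes both counts directly; a brief sketch contrasting ordered and unordered selections:

```python
from math import comb, perm

# C(n, r) counts unordered selections; P(n, r) counts ordered ones.
print(perm(4, 2))   # 12 ordered pairs from {1, 2, 3, 4}
print(comb(4, 2))   # 6 unordered pairs: each pair counted once
print(comb(52, 2))  # 1326 two-card hands from a full deck
```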
Event Relations
The union of two events, A and B, is the event that A or B or both occur. We write A ∪ B.
The intersection of two events, A and B, is the event that both A and B occur when the experiment is performed. We write A ∩ B.
Example
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
Example: Additive Rule
A: the dice add to 3. Outcomes (1,2), (2,1), so P(A) = 2/36.
B: the dice add to 6. Outcomes (1,5), (5,1), (2,4), (4,2), (3,3), so P(B) = 5/36.
Find P(A ∪ B). Since A and B are mutually exclusive,
P(A ∪ B) = P(A) + P(B) = 2/36 + 5/36 = 7/36.
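This example can be verified by enumeration; a minimal Python sketch using exact fractions:

```python
from fractions import Fraction
from itertools import product

# Additive rule on two dice: A = "sum is 3", B = "sum is 6".
S = list(product(range(1, 7), repeat=2))
A = {s for s in S if sum(s) == 3}
B = {s for s in S if sum(s) == 6}

def p(E):
    return Fraction(len(E), len(S))

print(p(A))      # 1/18 (= 2/36)
print(p(B))      # 5/36
print(p(A | B))  # 7/36: A and B are disjoint, so probabilities add
```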
Complement rule: P(Aᶜ) = 1 − P(A)
Example
Calculating Probabilities for Intersections
Joint Probability
The probability of two events occurring together or in
succession.
Marginal Probability
The unconditional probability of one event occurring or the
probability of a single event.
Joint Probability Using Contingency Table
Event    B1             B2             Total
A1       P(A1 and B1)   P(A1 and B2)   P(A1)
A2       P(A2 and B1)   P(A2 and B2)   P(A2)
Total    P(B1)          P(B2)          1
Compound Probability :Addition Rule
P(A1 or B1) = P(A1) + P(B1) − P(A1 and B1)

Event    B1             B2             Total
A1       P(A1 and B1)   P(A1 and B2)   P(A1)
A2       P(A2 and B1)   P(A2 and B2)   P(A2)
Total    P(B1)          P(B2)          1
Conditional Probabilities
The probability that A occurs, given that event
B has occurred is called the conditional
probability of A given B and is defined as
P(A | B) = P(A and B) / P(B),  provided P(B) > 0

The bar "|" is read "given".
Conditional Probability Using Contingency Table
Color
Type Red Black Total
Ace 2 2 4
Non-Ace 24 24 48
Total 26 26 52
P(Ace | Black) = P(Ace and Black) / P(Black) = (2/52) / (26/52) = 2/26
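The same calculation in a couple of lines of Python:

```python
from fractions import Fraction

# P(Ace | Black) = P(Ace and Black) / P(Black), from the card table.
p_ace_and_black = Fraction(2, 52)
p_black = Fraction(26, 52)

print(p_ace_and_black / p_black)  # 1/13 (= 2/26)
```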
Example 1
Toss a fair coin twice. Define
A: head on second toss
B: head on first toss
Each of the four outcomes HH, HT, TH, TT has probability 1/4.
P(A | B) = 1/2 and P(A | not B) = 1/2.
P(A) does not change whether B happens or not, so A and B are independent!
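The independence check can be reproduced by enumeration; a small Python sketch:

```python
from fractions import Fraction
from itertools import product

# Two fair coin tosses: A = "head on second toss", B = "head on first".
S = list(product("HT", repeat=2))
A = {s for s in S if s[1] == "H"}
B = {s for s in S if s[0] == "H"}

def p(E):
    return Fraction(len(E), len(S))

print(p(A))             # 1/2
print(p(A & B) / p(B))  # P(A|B) = 1/2: equals P(A), so A, B independent
```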
Example 2
Example 1
In a certain population, 10% of the people can be classified as being at high risk for a heart attack. Three people are randomly selected from this population. What is the probability that exactly one of the three is high risk?
Define H: high risk, N: not high risk.
P(H) = 0.1 and P(N) = 1 − P(H) = 1 − 0.1 = 0.9
P(exactly one high risk) = P(HNN) + P(NHN) + P(NNH) = 3 × (0.1)(0.9)² = 0.243
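This probability can be computed directly; a one-line Python check:

```python
from math import comb

# Exactly one of three independent selections is high risk:
# C(3,1) ways to choose which person, times 0.1 * 0.9 * 0.9.
p = comb(3, 1) * 0.1 * 0.9**2
print(round(p, 3))  # 0.243
```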
The Law of Total Probability
Let S1, S2, ..., Sk be mutually exclusive and exhaustive events (they partition the sample space S), and let A be any event. Then
P(A) = P(A and S1) + P(A and S2) + ... + P(A and Sk)
     = P(S1) P(A | S1) + P(S2) P(A | S2) + ... + P(Sk) P(A | Sk)
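A small numeric illustration of the law, with hypothetical numbers that are not from the slides (two suppliers with assumed shares and defect rates):

```python
# Law of total probability: P(A) = sum over i of P(Si) * P(A | Si).
# Hypothetical setup: suppliers S1, S2 provide 60% and 40% of parts,
# with assumed defect rates 2% and 5%; A = "a random part is defective".
p_s = [0.6, 0.4]             # P(S1), P(S2): a partition
p_a_given_s = [0.02, 0.05]   # P(A | Si)

p_a = sum(ps * pa for ps, pa in zip(p_s, p_a_given_s))
print(round(p_a, 3))  # 0.032
```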
Bayes’ Rule
Let S1 , S2 , S3 ,..., Sk be mutually exclusive and exhaustive
events with prior probabilities P(S1), P(S2),…,P(Sk). If an event
A occurs, the posterior probability of Si, given that A occurred
is
P(Si | A) = P(Si) P(A | Si) / [ P(S1) P(A | S1) + ... + P(Sk) P(A | Sk) ],  for i = 1, 2, ..., k

Proof:
By the multiplication rule, P(A and Si) = P(Si) P(A | Si).
By the law of total probability, P(A) = P(S1) P(A | S1) + ... + P(Sk) P(A | Sk).
Hence P(Si | A) = P(A and Si) / P(A) = P(Si) P(A | Si) / [ P(S1) P(A | S1) + ... + P(Sk) P(A | Sk) ].
Bayes’ Theorem
Often we begin probability analysis with initial or prior
probabilities.
Then, from a sample, special report, or a product test we
obtain some additional information.
Given this information, we calculate revised or posterior
probabilities.
Bayes’ theorem provides the means for revising the prior
probabilities.
Application
Prior New Posterior
of Bayes’
Probabilities Information Probabilities
Theorem
Bayes’ theorem is applicable when the events for which
we want to compute posterior probabilities are mutually
exclusive and their union is the entire sample space.
A Priori Probability: a probability estimate made prior to receiving new information. [P(Si), i = 1, 2, ..., n]
Posterior Probability: a probability that has been revised after information was obtained. [P(Si | A), i = 1, 2, ..., n]
Likelihoods: the probabilities [P(A | Si), i = 1, 2, ..., n] are called likelihoods because they indicate how likely the event A under consideration is to occur, given each a priori probability.
Bayes’ Theorem
P(Bi | A) = P(A | Bi) P(Bi) / [ P(A | B1) P(B1) + ... + P(A | Bk) P(Bk) ]
          = P(Bi and A) / P(A)

The numerator is the joint probability of the same event, Bi and A; the denominator, P(A), adds up the parts of A in all the B's.
Problem
The contents of three urns are as follows:
Urn 1: 1W, 2B, 3R
Urn 2: 2W, 1B, 1R
Urn 3: 4W, 5B, 3R
One urn is chosen at random and two balls are drawn. They happen to be white and red. What is the probability that they came from Urn 1, Urn 2, or Urn 3?
Solution
Urn 1: 1W, 2B, 3R;  Urn 2: 2W, 1B, 1R;  Urn 3: 4W, 5B, 3R
Let E1, E2, E3 be the events that Urn 1, Urn 2, Urn 3 is chosen, respectively.
Let A be the event that the two balls taken from the selected urn are white and red.
P(E1) = P(E2) = P(E3) = 1/3
P(A | E1) = (1 × 3) / 6C2 = 3/15 = 1/5
P(A | E2) = (2 × 1) / 4C2 = 2/6 = 1/3
P(A | E3) = (4 × 3) / 12C2 = 12/66 = 2/11
P(E2 | A) = P(E2) P(A | E2) / [ P(E1) P(A | E1) + P(E2) P(A | E2) + P(E3) P(A | E3) ]
          = (1/3)(1/3) / [ (1/3)(1/5) + (1/3)(1/3) + (1/3)(2/11) ]
          = 55/118 = 0.466
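The posterior can be recomputed mechanically; a short Python sketch using the priors and likelihoods above:

```python
from fractions import Fraction

# Bayes' theorem for the urn problem: equal priors 1/3 and the
# likelihoods 1/5, 1/3, 2/11 give the posterior for each urn.
priors = [Fraction(1, 3)] * 3
likelihoods = [Fraction(1, 5), Fraction(1, 3), Fraction(2, 11)]

total = sum(p * l for p, l in zip(priors, likelihoods))
posteriors = [p * l / total for p, l in zip(priors, likelihoods)]

print(posteriors[1])                   # 55/118
print(round(float(posteriors[1]), 3))  # 0.466
```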
Definitions and Learning
The probability of any outcome of a random phenomenon is the
proportion of times the outcome would occur in a very long series of
repetitions.
In the classical approach, probability is understood as a long-run relative frequency: if we repeat the experiment a large number of times (equivalently, if the sample size is large), the proportion of trials in which event A occurs, its relative frequency, will be very close to the true probability of that event.
Probability Rules:
Rule 1:
Let N(A) = Number of events favorable to event A.
N= Total number of events in the Sample Space S.
P(A) = N(A) / N
Probability Rules:
Probability Rules
Rule 5: Addition Rule
Two events A and B are disjoint if they have no outcomes in
common and so can never occur simultaneously. If A and B are
disjoint,
P(A or B) = P(A) + P(B)
Probability Rules:
Rule 6: General Addition Rule for Any Two Events
For any two events A and B,
P(A or B) = P(A) + P(B) − P(A and B)
Probability Rules:
Rule 7: Conditional Probability
When P(A) > 0, the conditional probability of B given A is,
P(B | A) = P(A and B) / P(A)
Probability Rules:
Rule 8: Multiplication Rule for Independent Events
Two events A and B are independent if knowing that one occurs
does not change the probability that the other occurs. If A and B are
independent,
P( A and B) = P(A)P(B)
Probability Rules:
Probability Rules:
Rule 10: General Multiplication Rule for Any Two Events
The probability that both of two events A and B happen together can
be found by
P( A and B) = P(A) P( B|A)
Here P (B|A) is the conditional probability that B occurs given the
information that A occurs.
Bayes’ Theorem
To find the posterior probability that event Bi
will occur given that event A has occurred we
apply Bayes’ theorem.
P(Bi | A) = P(A | Bi) P(Bi) / [ P(A | B1) P(B1) + ... + P(A | Bk) P(Bk) ]

For two events B1 and B2, this reduces to:
P(B1 | A) = P(A | B1) P(B1) / [ P(A | B1) P(B1) + P(A | B2) P(B2) ]
Thanks