
Probability Concepts and Applications

Introduction

• Life is uncertain; we are not sure what the future will bring.
• Probability is a numerical statement about the likelihood that an event will occur.
Fundamental Concepts

1. The probability, P, of any event or state of nature occurring is greater than or equal to 0 and less than or equal to 1. That is:

   0 ≤ P (event) ≤ 1

2. The sum of the simple probabilities for all possible outcomes of an activity must equal 1.
Topics that use Probability

Decision Analysis
Regression Models
Forecasting
Inventory Control Models
Project Management
Waiting Lines and Queuing Theory Models
Markov Analysis
Diversey Paint Example
• Demand for white latex paint at Diversey Paint and
Supply has always been either 0, 1, 2, 3, or 4 gallons per
day.
• Over the past 200 days, the owner has observed the
following frequencies of demand:

QUANTITY DEMANDED   NUMBER OF DAYS   PROBABILITY
0                   40               0.20 (= 40/200)
1                   80               0.40 (= 80/200)
2                   50               0.25 (= 50/200)
3                   20               0.10 (= 20/200)
4                   10               0.05 (= 10/200)
Total               200              1.00 (= 200/200)
Diversey Paint Example
• Demand for white latex paint at Diversey Paint and Supply has always been either 0, 1, 2, 3, or 4 gallons per day.
• Over the past 200 days, the owner has observed the frequencies of demand shown in the table above.
• Notice that the individual probabilities are all between 0 and 1:
  0 ≤ P (event) ≤ 1
• And the total of all event probabilities equals 1:
  ∑P (event) = 1.00
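These two checks can be done mechanically. Below is a minimal Python sketch (not from the slides) that converts the observed day counts into relative-frequency probabilities and verifies both fundamental concepts:

```python
# Observed demand over the past 200 days (gallons demanded -> number of days).
demand_days = {0: 40, 1: 80, 2: 50, 3: 20, 4: 10}
total_days = sum(demand_days.values())          # 200

# Relative-frequency probabilities: P(event) = occurrences / total trials.
probabilities = {d: n / total_days for d, n in demand_days.items()}

assert all(0.0 <= p <= 1.0 for p in probabilities.values())   # 0 <= P(event) <= 1
assert abs(sum(probabilities.values()) - 1.0) < 1e-9          # probabilities sum to 1
print(probabilities)   # {0: 0.2, 1: 0.4, 2: 0.25, 3: 0.1, 4: 0.05}
```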
Types of Probability
Determining objective probability:
• Relative frequency
  ▫ Typically based on historical data

    P (event) = Number of occurrences of the event / Total number of trials or outcomes

• Classical or logical method
  ▫ Logically determine probabilities without trials

    P (head) = Number of ways of getting a head / Number of possible outcomes (head or tail) = 1/2
Types of Probability

Subjective probability is based on the experience and judgment of the person making the estimate.
▫ Opinion polls
▫ Judgment of experts
▫ Delphi method
Mutually Exclusive Events

Events are said to be mutually exclusive if only one of the events can occur on any one trial.
◼ Tossing a coin will result in either a head or a tail.
◼ Rolling a die will result in only one of six possible outcomes.
Collectively Exhaustive Events

Events are said to be collectively exhaustive if the list of outcomes includes every possible outcome.
▫ Both heads and tails as possible outcomes of coin flips.
▫ All six possible outcomes of the roll of a die:

OUTCOME OF ROLL   PROBABILITY
1                 1/6
2                 1/6
3                 1/6
4                 1/6
5                 1/6
6                 1/6
Total             1
Adding Mutually Exclusive Events

We often want to know whether one or a second event will occur.
◼ When two events are mutually exclusive, the law of addition is:

P (event A or event B) = P (event A) + P (event B)


Adding Not Mutually Exclusive Events

The equation must be modified to account for double counting.
◼ The probability is reduced by subtracting the chance of both events occurring together.
P (event A or event B) = P (event A) + P (event B)
– P (event A and event B both occurring)

P (A or B) = P (A) + P (B) – P (A and B)
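As a quick worked illustration (an example added here, not taken from the slides), consider drawing one card from a standard 52-card deck. The events "draw a five" and "draw a diamond" are not mutually exclusive, because the five of diamonds belongs to both:

```python
# P(five or diamond) = P(five) + P(diamond) - P(five and diamond)
p_five = 4 / 52              # four fives in the deck
p_diamond = 13 / 52          # thirteen diamonds in the deck
p_five_and_diamond = 1 / 52  # the five of diamonds is counted in both events

p_five_or_diamond = p_five + p_diamond - p_five_and_diamond
print(p_five_or_diamond)     # 16/52, about 0.3077
```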


Venn Diagrams

[Venn diagrams: for mutually exclusive events, the circles P (A) and P (B) do not overlap; for events that are not mutually exclusive, the circles overlap in the region P (A and B).]

Events that are mutually exclusive:
P (A or B) = P (A) + P (B)

Events that are not mutually exclusive:
P (A or B) = P (A) + P (B) – P (A and B)
Statistically Independent Events

Events may be either independent or dependent.
• For independent events, the occurrence of one event has no effect on the probability of occurrence of the second event.
Which Sets of Events Are Independent?

(a) Your education
(b) Your income level
Dependent events

(a) Snow in Santiago, Chile
(b) Rain in Tel Aviv, Israel
Independent events
Three Types of Probabilities
• Marginal (or simple) probability is just the probability of a single event occurring.
  P (A)
◼ Joint probability is the probability of two or more events occurring; for independent events it is equal to the product of their marginal probabilities.
  P (AB) = P (A) x P (B)
◼ Conditional probability is the probability of event B given that event A has occurred. For independent events:
  P (B | A) = P (B)
◼ Similarly, the probability of event A given that event B has occurred is, for independent events:
  P (A | B) = P (A)
Joint Probability Example

The probability of tossing a 6 on the first


roll of the die and a 2 on the second roll:
P (6 on first and 2 on second)
= P (tossing a 6) x P (tossing a 2)
= 1/6 x 1/6 = 1/36 = 0.028
Independent Events
A bucket contains 3 black balls and 7 green balls.
◼ Draw a ball from the bucket, replace it, and
draw a second ball.

1. The probability of a black ball drawn on first draw is:


P (B) = 0.30 (a marginal probability)
2. The probability of two green balls drawn is:
P (GG) = P (G) x P (G) = 0.7 x 0.7 = 0.49
(a joint probability for two independent events)
Independent Events
A bucket contains 3 black balls and 7 green balls.
◼ Draw a ball from the bucket, replace it, and
draw a second ball.
3. The probability of a black ball drawn on the second draw if the
first draw is green is:
P (B | G) = P (B) = 0.30
(a conditional probability but equal to the marginal
because the two draws are independent events)
4. The probability of a green ball drawn on the second draw if the
first draw is green is:
P (G | G) = P (G) = 0.70
(a conditional probability as in event 3)
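The four probabilities above can be reproduced with a few lines of Python (a minimal sketch based on the 3-black / 7-green bucket described on this slide):

```python
# Draws are made WITH replacement, so the two draws are independent events.
p_black = 3 / 10   # P(B): marginal probability of black on any single draw
p_green = 7 / 10   # P(G): marginal probability of green on any single draw

p_two_green = p_green * p_green   # joint probability for independent events: 0.49
p_black_given_green = p_black     # P(B | G) = P(B) = 0.30, since the draws are independent
p_green_given_green = p_green     # P(G | G) = P(G) = 0.70

print(p_two_green, p_black_given_green, p_green_given_green)
```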
Statistically Dependent Events
The marginal probability of an event occurring is computed
in the same way:
P (A)
Calculating conditional probabilities is slightly more complicated. The probability of event A given that event B has occurred is:

P (A | B) = P (AB) / P (B)

The formula for the joint probability of two events is:

P (AB) = P (B | A) P (A)
When Events Are Dependent

Assume that we have an urn containing 10 balls of the following descriptions:
◼ 4 are white (W) and lettered (L)
◼ 2 are white (W) and numbered (N)
◼ 3 are yellow (Y) and lettered (L)
◼ 1 is yellow (Y) and numbered (N)

P (WL) = 4/10 = 0.4 P (YL) = 3/10 = 0.3


P (WN) = 2/10 = 0.2 P (YN) = 1/10 = 0.1
P (W) = 6/10 = 0.6 P (L) = 7/10 = 0.7
P (Y) = 4/10 = 0.4 P (N) = 3/10 = 0.3
When Events Are Dependent

The conditional probability that the ball drawn is lettered, given that it is yellow, is:

P (L | Y) = P (YL) / P (Y) = 0.3 / 0.4 = 0.75
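As a check, here is a small Python sketch that derives the same conditional probability directly from the urn counts given on the slide (the dictionary layout is just one way to organize them):

```python
# Urn contents: (color, marking) -> number of balls
urn = {("W", "L"): 4, ("W", "N"): 2, ("Y", "L"): 3, ("Y", "N"): 1}
total = sum(urn.values())                                             # 10 balls in all

p_yellow = sum(n for (c, m), n in urn.items() if c == "Y") / total    # P(Y)  = 0.4
p_yellow_lettered = urn[("Y", "L")] / total                           # P(YL) = 0.3

p_lettered_given_yellow = p_yellow_lettered / p_yellow                # P(L | Y) = 0.75
print(p_lettered_given_yellow)
```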
Joint Probabilities for Dependent Events

If the stock market reaches 12,500 points by January, there is a 70% probability that Tubeless Electronics will go up.
◼ You believe that there is only a 40% chance the stock market will reach 12,500.
◼ Let M represent the event of the stock market reaching 12,500 and let T be the event that Tubeless goes up in value.

P (MT) = P (T | M) x P (M) = (0.70)(0.40) = 0.28
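Expressed as a tiny Python sketch (the variable names are illustrative, not from the slides):

```python
p_market = 0.40                  # P(M): the market reaches 12,500 by January
p_tubeless_given_market = 0.70   # P(T | M): Tubeless goes up, given the market reaches 12,500

p_joint = p_tubeless_given_market * p_market   # P(MT) = P(T | M) P(M)
print(p_joint)                                 # 0.28
```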


Bayes’ Theorem

Suppose we have estimated prior probabilities for events we are concerned with, and then obtain new information. We would like to have a sound method to compute revised or posterior probabilities. Bayes’ theorem gives us a way to do this.
Revising Probabilities with Bayes’ Theorem

Bayes’ theorem is used to incorporate additional information and help create posterior probabilities.

[Flow diagram: Prior Probabilities and New Information feed into the Bayes’ Process, which produces Posterior Probabilities.]
General Form of Bayes’ Theorem

We can compute revised probabilities more directly by using:

P(A | B) = P(B | A) P(A) / [ P(B | A) P(A) + P(B | A′) P(A′) ]

where A′ = the complement of the event A.
Application of Bayes’ Theorem

• Consider a manufacturing firm that receives shipments of parts from two suppliers.
• Let A1 denote the event that a part is received from supplier 1, and let A2 denote the event that a part is received from supplier 2.
• We get 65 percent of our parts from supplier 1 and 35 percent from supplier 2. Thus:

P(A1) = .65 and P(A2) = .35
Quality levels differ between suppliers

             Percentage Good Parts   Percentage Bad Parts
Supplier 1   98                      2
Supplier 2   95                      5

Let G denote the event that a part is good and B denote the event that a part is bad. Thus we have the following conditional probabilities:

P(G | A1) = .98 and P(B | A1) = .02
P(G | A2) = .95 and P(B | A2) = .05


Tree Diagram for Two-Supplier Example
[Tree diagram: Step 1 selects the supplier (A1 or A2); Step 2 records the part’s condition (G or B), giving the four experimental outcomes (A1, G), (A1, B), (A2, G), and (A2, B).]

Each of the experimental outcomes is the intersection of two events. For example, the probability of selecting a part from supplier 1 that is good is given by:

P(A1, G) = P(A1) P(G | A1)
Probability Tree for Two-Supplier Example

[Probability tree: Step 1 branches on the supplier, Step 2 on the part’s condition.]

P(A1G) = P(A1) P(G | A1) = (.65)(.98) = .6370
P(A1B) = P(A1) P(B | A1) = (.65)(.02) = .0130
P(A2G) = P(A2) P(G | A2) = (.35)(.95) = .3325
P(A2B) = P(A2) P(B | A2) = (.35)(.05) = .0175
A bad part broke one of our machines, so we’re through for the day. What is the probability the part came from supplier 1?

We know from the law of conditional probability that:

P(A1 | B) = P(A1B) / P(B)

Observe from the probability tree that:

P(A1B) = P(A1) P(B | A1)
The probability of selecting a bad part is found by adding together the probability of selecting a bad part from supplier 1 and the probability of selecting a bad part from supplier 2. That is:

P(B) = P(A1B) + P(A2B)
     = P(A1) P(B | A1) + P(A2) P(B | A2)
Tabular Approach to Bayes’ Theorem

(1)       (2)             (3)             (4)             (5)
          Prior           Conditional     Joint           Posterior
Events    Probabilities   Probabilities   Probabilities   Probabilities
Ai        P(Ai)           P(B | Ai)       P(Ai B)         P(Ai | B)

A1        .65             .02             .0130           .0130/.0305 = .4262
A2        .35             .05             .0175           .0175/.0305 = .5738
          1.00                            P(B) = .0305    1.0000
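The tabular computation can be reproduced with a short Python sketch (using the supplier figures from the slides; the dictionary layout is just one way to organize the table):

```python
priors = {"A1": 0.65, "A2": 0.35}        # column (2): prior probabilities P(Ai)
p_bad_given = {"A1": 0.02, "A2": 0.05}   # column (3): conditional probabilities P(B | Ai)

joints = {a: priors[a] * p_bad_given[a] for a in priors}   # column (4): 0.0130 and 0.0175
p_bad = sum(joints.values())                               # P(B) = 0.0305

posteriors = {a: joints[a] / p_bad for a in priors}        # column (5): about 0.4262 and 0.5738
print(p_bad, posteriors)
```

The hypothetical `bayes_posterior` helper sketched earlier gives the same supplier-1 result: `bayes_posterior(0.65, 0.02, 0.05)` is approximately 0.4262.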


Random Variables

A random variable assigns a real number to every possible outcome or event in an experiment.

X = number of refrigerators sold during the day

Discrete random variables can assume only a finite or limited set of values.
Continuous random variables can assume any one of an infinite set of values.
Random Variables – Numbers
EXPERIMENT                                   OUTCOME                                               RANDOM VARIABLE   RANGE OF RANDOM VARIABLE
Stock 50 Christmas trees                     Number of Christmas trees sold                        X                 0, 1, 2, …, 50
Inspect 600 items                            Number of acceptable items                            Y                 0, 1, 2, …, 600
Send out 5,000 sales letters                 Number of people responding to the letters            Z                 0, 1, 2, …, 5,000
Build an apartment building                  Percent of building completed after 4 months          R                 0 ≤ R ≤ 100
Test the lifetime of a lightbulb (minutes)   Length of time the bulb lasts, up to 80,000 minutes   S                 0 ≤ S ≤ 80,000
Random Variables – Not Numbers

EXPERIMENT: Students respond to a questionnaire
  OUTCOME: Strongly agree (SA), Agree (A), Neutral (N), Disagree (D), Strongly disagree (SD)
  RANDOM VARIABLE: X = 5 if SA, 4 if A, 3 if N, 2 if D, 1 if SD
  RANGE OF RANDOM VARIABLE: 1, 2, 3, 4, 5

EXPERIMENT: One machine is inspected
  OUTCOME: Defective, Not defective
  RANDOM VARIABLE: Y = 0 if defective, 1 if not defective
  RANGE OF RANDOM VARIABLE: 0, 1

EXPERIMENT: Consumers respond to how they like a product
  OUTCOME: Good, Average, Poor
  RANDOM VARIABLE: Z = 3 if good, 2 if average, 1 if poor
  RANGE OF RANDOM VARIABLE: 1, 2, 3
Probability Distribution of a Discrete Random Variable

For discrete random variables a probability is assigned to each event.

The students in Pat Shannon’s statistics class have just completed a quiz of five algebra problems. The distribution of correct scores is given in the following table:
Probability Distribution of a Discrete Random Variable

RANDOM VARIABLE X (Score)   NUMBER RESPONDING   PROBABILITY P(X)
5                           10                  0.1 = 10/100
4                           20                  0.2 = 20/100
3                           30                  0.3 = 30/100
2                           30                  0.3 = 30/100
1                           10                  0.1 = 10/100
Total                       100                 1.0 = 100/100
Probability Distribution for Dr. Shannon’s Class

[Figure: bar chart of the probability distribution, with P(X) on the vertical axis (0 to 0.4) and X = 1 to 5 on the horizontal axis.]
The Binomial Distribution

• Many business experiments can be characterized by the Bernoulli process.
• The Bernoulli process is described by the binomial probability distribution.
1. Each trial has only two possible outcomes.
2. The probability of each outcome stays the same from one trial to the next.
3. The trials are statistically independent.
4. The number of trials is a positive integer.
Binomial Distribution Settings
• A manufacturing plant labels items as either
defective or acceptable
• A firm bidding for a contract will either get the
contract or not
• A marketing research firm receives survey
responses of “yes I will buy” or “no I will not”
• New job applicants either accept the offer or
reject it
The Binomial Distribution

The binomial distribution is used to find the probability of a specific number of successes in n trials.

We need to know:
n = number of trials
p = the probability of success on any single trial

We let
r = number of successes
q = 1 – p = the probability of a failure
The Binomial Distribution
The binomial formula is:

Probability of r successes in n trials = [n! / (r!(n – r)!)] p^r q^(n – r)

The symbol ! means factorial, and
n! = n(n – 1)(n – 2)…(1)
For example
4! = (4)(3)(2)(1) = 24
By definition
1! = 1 and 0! = 1
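A short Python sketch of this formula (the function name is my own; `math.comb(n, r)` computes n! / (r!(n - r)!)):

```python
import math

def binomial_probability(r: int, n: int, p: float) -> float:
    """Probability of exactly r successes in n independent trials, each with success probability p."""
    q = 1 - p                                    # probability of a failure on any single trial
    return math.comb(n, r) * (p ** r) * (q ** (n - r))
```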
The Binomial Distribution
Binomial Distribution for n = 5 and p = 0.50:

NUMBER OF HEADS (r)   Probability = [5! / (r!(5 – r)!)] (0.5)^r (0.5)^(5 – r)
0                     0.03125 = [5! / (0!(5 – 0)!)] (0.5)^0 (0.5)^(5 – 0)
1                     0.15625 = [5! / (1!(5 – 1)!)] (0.5)^1 (0.5)^(5 – 1)
2                     0.31250 = [5! / (2!(5 – 2)!)] (0.5)^2 (0.5)^(5 – 2)
3                     0.31250 = [5! / (3!(5 – 3)!)] (0.5)^3 (0.5)^(5 – 3)
4                     0.15625 = [5! / (4!(5 – 4)!)] (0.5)^4 (0.5)^(5 – 4)
5                     0.03125 = [5! / (5!(5 – 5)!)] (0.5)^5 (0.5)^(5 – 5)
Solving Problems with the Binomial Formula

We want to find the probability of 4 heads in 5 tosses.

n = 5, r = 4, p = 0.5, and q = 1 – 0.5 = 0.5

Thus
P(4 successes in 5 trials) = [5! / (4!(5 – 4)!)] (0.5)^4 (0.5)^(5 – 4)
                           = [(5)(4)(3)(2)(1) / ((4)(3)(2)(1)(1!))] (0.0625)(0.5)
                           = 0.15625
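The same result using the `binomial_probability` sketch shown earlier:

```python
print(binomial_probability(4, 5, 0.5))   # 0.15625
```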
Solving Problems with Binomial Tables

MSA Electronics is experimenting with the manufacture of a new transistor.
◼ Every hour a random sample of 5 transistors is taken.
◼ The probability of one transistor being defective is 0.15.

What is the probability of finding 3, 4, or 5 defective?
So n = 5, p = 0.15, and r = 3, 4, or 5

We could use the formula to solve this problem, but using the table is easier.
Solving Problems with Binomial Tables

               P
n    r    0.05     0.10     0.15
5    0    0.7738   0.5905   0.4437
     1    0.2036   0.3281   0.3915
     2    0.0214   0.0729   0.1382
     3    0.0011   0.0081   0.0244
     4    0.0000   0.0005   0.0022
     5    0.0000   0.0000   0.0001

We find the three probabilities in the table for n = 5, p = 0.15, and r = 3, 4, and 5, and add them together.
Solving Problems with Binomial Tables

Using the table values for n = 5 and p = 0.15:

P(3 or more defects) = P(3) + P(4) + P(5)
                     = 0.0244 + 0.0022 + 0.0001
                     = 0.0267
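Checking with the `binomial_probability` sketch from earlier (the table entries are rounded to four decimal places, so the direct computation differs slightly in the last digit):

```python
p_3_or_more = sum(binomial_probability(r, 5, 0.15) for r in (3, 4, 5))
print(round(p_3_or_more, 4))   # about 0.0266; the rounded table entries sum to 0.0267
```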
