
Lecture 5
Discrete Probability Distributions

Course Instructor: Rumana Hossain
Department of Physical Sciences; SECS

Topics:
• Random Variables
• Discrete Probability Distributions
• Expected Value and Variance
• Binomial Probability Distribution
• Poisson Probability Distribution
• Hypergeometric Probability Distribution

[Figure: bar chart of a discrete probability distribution, probabilities .10 to .40 over the values 0 through 4]

Random Variables

• A random variable is a numerical description of the outcome of an experiment.
• A random variable can be classified as being either discrete or continuous depending on the numerical values it assumes.
• A discrete random variable may assume either a finite number of values or an infinite sequence of values.
• A continuous random variable may assume any numerical value in an interval or collection of intervals.

Example: JSL Appliances

• Discrete random variable with a finite number of values
  Let x = number of TV sets sold at the store in one day, where x can take on 5 values (0, 1, 2, 3, 4).
• Discrete random variable with an infinite sequence of values
  Let x = number of customers arriving in one day, where x can take on the values 0, 1, 2, . . .
  We can count the customers arriving, but there is no finite upper limit on the number that might arrive.

Discrete Probability Distributions

• The probability distribution for a random variable describes how probabilities are distributed over the values of the random variable.
• The probability distribution is defined by a probability function, denoted by f(x), which provides the probability for each value of the random variable.
• The required conditions for a discrete probability function are:

    f(x) ≥ 0
    Σ f(x) = 1

• We can describe a discrete probability distribution with a table, graph, or equation.

Example: JSL Appliances

• Using past data on TV sales (below left), a tabular representation of the probability distribution for TV sales (below right) was developed.

    Units Sold   Number of Days        x     f(x)
        0              80              0      .40
        1              50              1      .25
        2              40              2      .20
        3              10              3      .05
        4              20              4      .10
                      200                    1.00
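
The same table can be built directly from the day counts. Here is a minimal Python sketch, not part of the original slides (the variable names are mine), that forms f(x) and checks the two required conditions:

    # Empirical probability function for the JSL Appliances TV-sales data.
    # Day counts are taken from the table above.
    days_per_units_sold = {0: 80, 1: 50, 2: 40, 3: 10, 4: 20}

    total_days = sum(days_per_units_sold.values())            # 200 days in total
    f = {x: n / total_days for x, n in days_per_units_sold.items()}

    # Required conditions for a discrete probability function
    assert all(p >= 0 for p in f.values())                    # f(x) >= 0
    assert abs(sum(f.values()) - 1.0) < 1e-9                  # sum of f(x) equals 1

    for x, p in sorted(f.items()):
        print(f"x = {x}: f(x) = {p:.2f}")                     # .40, .25, .20, .05, .10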

Example: JSL Appliances

• Graphical Representation of the Probability Distribution

  [Figure: bar chart of the distribution, with Probability (.10 to .50) on the vertical axis and Values of Random Variable x (TV sales), 0 through 4, on the horizontal axis]

Discrete Uniform Probability Distribution

• The discrete uniform probability distribution is the simplest example of a discrete probability distribution given by a formula.
• The discrete uniform probability function is

    f(x) = 1/n

  where:
    n = the number of values the random variable may assume

• Note that the values of the random variable are equally likely.
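
As a small illustration of the uniform case, here is a sketch in Python; the fair six-sided die is my example, not one from the slides:

    # Discrete uniform distribution: every value gets probability 1/n.
    # A fair six-sided die is used purely as an illustration.
    values = [1, 2, 3, 4, 5, 6]
    n = len(values)
    f = {x: 1 / n for x in values}            # f(x) = 1/6 for every face

    print(f)                                  # each value is equally likely
    print(round(sum(f.values()), 10))         # 1.0, as required
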
Expected Value and Variance

• The expected value, or mean, of a random variable is a measure of its central location.
  - Expected value of a discrete random variable:

      E(x) = μ = Σ x f(x)

• The variance summarizes the variability in the values of a random variable.
  - Variance of a discrete random variable:

      Var(x) = σ² = Σ (x - μ)² f(x)

• The standard deviation, σ, is defined as the positive square root of the variance.

Example: JSL Appliances

• Expected Value of a Discrete Random Variable

    x     f(x)    x f(x)
    0     .40       .00
    1     .25       .25
    2     .20       .40
    3     .05       .15
    4     .10       .40
               E(x) = 1.20

  The expected number of TV sets sold in a day is 1.2.
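
The expected-value column above can be reproduced with a short sketch (the names are mine, using the JSL probabilities):

    # Expected value of a discrete random variable: E(x) = sum of x * f(x).
    f = {0: 0.40, 1: 0.25, 2: 0.20, 3: 0.05, 4: 0.10}   # JSL TV-sales distribution

    expected_value = sum(x * p for x, p in f.items())
    print(round(expected_value, 2))                      # 1.2 TV sets per day, on average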

Example: JSL Appliances

• Variance and Standard Deviation of a Discrete Random Variable

    x    x - μ    (x - μ)²    f(x)    (x - μ)² f(x)
    0    -1.2      1.44       .40        .576
    1    -0.2      0.04       .25        .010
    2     0.8      0.64       .20        .128
    3     1.8      3.24       .05        .162
    4     2.8      7.84       .10        .784
                                    σ² = 1.660

  The variance of daily sales is 1.66 TV sets squared.
  The standard deviation of sales is 1.2884 TV sets.

Binomial Probability Distribution

• Properties of a Binomial Experiment
  - The experiment consists of a sequence of n identical trials.
  - Two outcomes, success and failure, are possible on each trial.
  - The probability of a success, denoted by p, does not change from trial to trial.
  - The trials are independent.
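
The variance table can likewise be checked programmatically; this is a sketch with my own variable names, not code from the lecture:

    # Variance and standard deviation of a discrete random variable:
    # Var(x) = sum of (x - mu)^2 * f(x), and sigma is the positive square root.
    from math import sqrt

    f = {0: 0.40, 1: 0.25, 2: 0.20, 3: 0.05, 4: 0.10}   # JSL TV-sales distribution
    mu = sum(x * p for x, p in f.items())                # 1.2

    variance = sum((x - mu) ** 2 * p for x, p in f.items())
    std_dev = sqrt(variance)

    print(round(variance, 3))                            # 1.66 (TV sets squared)
    print(round(std_dev, 4))                             # 1.2884 TV sets
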
Example: Evans Electronics

• Binomial Probability Distribution

  Evans is concerned about a low retention rate for employees. On the basis of past experience, management has seen a turnover of 10% of the hourly employees annually. Thus, for any hourly employee chosen at random, management estimates a probability of 0.1 that the person will not be with the company next year.

  Choosing 3 hourly employees at random, what is the probability that 1 of them will leave the company this year?

  Let: p = .10, n = 3, x = 1

Binomial Probability Distribution

• Binomial Probability Function

    f(x) = [n! / (x!(n - x)!)] p^x (1 - p)^(n - x)

  where:
    f(x) = the probability of x successes in n trials
    n = the number of trials
    p = the probability of success on any one trial

Example: Evans Electronics

• Using the Binomial Probability Function

    f(1) = [3! / (1!(3 - 1)!)] (0.1)^1 (0.9)^2 = (3)(0.1)(0.81) = .243

• Using the Tables of Binomial Probabilities

                                            p
    n   x    .10     .15     .20     .25     .30     .35     .40     .45     .50
    3   0   .7290   .6141   .5120   .4219   .3430   .2746   .2160   .1664   .1250
        1   .2430   .3251   .3840   .4219   .4410   .4436   .4320   .4084   .3750
        2   .0270   .0574   .0960   .1406   .1890   .2389   .2880   .3341   .3750
        3   .0010   .0034   .0080   .0156   .0270   .0429   .0640   .0911   .1250
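
A sketch of the binomial probability function in Python, checked against the Evans Electronics result and the n = 3, p = .10 column of the table (function and variable names are mine):

    # Binomial probability function: f(x) = n!/(x!(n-x)!) * p^x * (1-p)^(n-x)
    from math import comb

    def binomial_pmf(x: int, n: int, p: float) -> float:
        return comb(n, x) * p ** x * (1 - p) ** (n - x)

    # Evans Electronics: n = 3 employees, p = .10 that any one of them leaves
    print(f"{binomial_pmf(1, 3, 0.10):.4f}")             # 0.2430

    # Reproduce the whole n = 3, p = .10 column of the binomial table
    for x in range(4):
        print(x, f"{binomial_pmf(x, 3, 0.10):.4f}")      # .7290, .2430, .0270, .0010
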
Example: Evans Electronics

• Using a Tree Diagram (each worker Leaves with probability .1 or Stays with probability .9)

    First Worker   Second Worker   Third Worker   Value of x   Probability
    Leaves (.1)    Leaves (.1)     Leaves (.1)        3           .0010
    Leaves (.1)    Leaves (.1)     Stays (.9)         2           .0090
    Leaves (.1)    Stays (.9)      Leaves (.1)        2           .0090
    Leaves (.1)    Stays (.9)      Stays (.9)         1           .0810
    Stays (.9)     Leaves (.1)     Leaves (.1)        2           .0090
    Stays (.9)     Leaves (.1)     Stays (.9)         1           .0810
    Stays (.9)     Stays (.9)      Leaves (.1)        1           .0810
    Stays (.9)     Stays (.9)      Stays (.9)         0           .7290

Binomial Probability Distribution

• Expected Value

    E(x) = μ = np

• Variance

    Var(x) = σ² = np(1 - p)

• Standard Deviation

    SD(x) = σ = √(np(1 - p))
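
A brief sketch of these summary formulas, using the Evans Electronics values quoted on the next slide (n = 3, p = .1); the variable names are my own:

    # Binomial summary measures: E(x) = n*p, Var(x) = n*p*(1-p), SD(x) = sqrt(Var(x))
    from math import sqrt

    n, p = 3, 0.10                       # Evans Electronics: 3 employees, 10% turnover

    expected = n * p                     # 0.3 employees out of 3
    variance = n * p * (1 - p)           # 0.27
    std_dev = sqrt(variance)             # about 0.52 employees

    print(round(expected, 2), round(variance, 2), round(std_dev, 2))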

Example: Evans Electronics

• Binomial Probability Distribution
  - Expected Value: E(x) = μ = 3(.1) = .3 employees out of 3
  - Variance: Var(x) = σ² = 3(.1)(.9) = .27
  - Standard Deviation: SD(x) = σ = √(3(.1)(.9)) ≈ .52 employees

Geometric distribution

• The geometric random variable x is defined as the number of trials needed to obtain the first success. Therefore, if success is obtained in the nth trial, this means that the previous (n - 1) trials ended up in failure.

Geometric distribution

• If the probability of success is p, the probability of failure is (1 - p). For the sake of simplicity, the probability of failure is often denoted by q. Therefore the geometric probability distribution can be written as

    p(x = n) = (1 - p)^(n - 1) p = q^(n - 1) p

  or simply

    f(x) = q^(x - 1) p

• Mean: μ = E[x] = 1/p

• Variance: σ² = E[x²] - (E[x])² = q/p²

• Standard deviation: σ = √(q/p²) = √q / p
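
A minimal sketch of these geometric-distribution formulas (the helper names and the illustrative p in the last line are mine):

    # Geometric distribution: f(x) = q^(x-1) * p, with q = 1 - p.
    # Mean = 1/p, variance = q/p^2, standard deviation = sqrt(q)/p.
    from math import sqrt

    def geometric_pmf(x: int, p: float) -> float:
        return (1 - p) ** (x - 1) * p

    def geometric_mean(p: float) -> float:
        return 1 / p

    def geometric_sd(p: float) -> float:
        return sqrt(1 - p) / p

    print(geometric_pmf(3, 0.5))          # 0.125: two failures, then the first success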

Example

• The probability that a wafer contains a large particle of contamination is 0.015. If it is assumed that the wafers are independent, what is the probability that exactly 125 wafers need to be analyzed before a large particle is detected?

  Let X denote the number of samples analyzed until a large particle is detected. Then X is a geometric random variable with p = 0.015.

• The requested probability is

    p(X = 125) = (0.985)^124 (0.015) = 0.0023

• The average number of wafers that have to be tested before a contamination is found is

    μ = 1/p = 1/0.015 ≈ 67

• The standard deviation of the number of wafers that have to be tested is

    σ = √(q/p²) = √(0.985/0.015²) ≈ 66
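
A quick numerical check of this example, written as a self-contained sketch with the geometric formulas inline:

    # Geometric-distribution check for the wafer example: p = 0.015, q = 0.985.
    from math import sqrt

    p = 0.015
    q = 1 - p

    prob_125 = q ** 124 * p               # P(X = 125): 124 failures, then the first success
    mean = 1 / p                          # expected number of wafers tested
    sd = sqrt(q) / p                      # standard deviation

    print(round(prob_125, 4))             # about 0.0023
    print(round(mean), round(sd))         # about 67 and 66
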
Negative binomial distribution

• The negative binomial distribution can be thought of as the 'reverse' of the binomial distribution. In the binomial distribution, the number of trials, n, is fixed, and the number of successes, x, is a variable; we essentially look for the probability of x successes in n trials.
• In the negative binomial distribution, the number of successes is fixed, and we look for the number of trials needed to obtain that many successes. Therefore, the number of successes, r, is kept fixed, and the number of trials, x, is made a variable.
• The characteristics of the problem are therefore:
  - The experiment consists of a series of independent and identical trials, each with a probability of success p;
  - The trials are observed until exactly r successes are obtained, where r is fixed by the experimenter;
  - The random variable x is the number of trials needed to obtain r successes.

Negative binomial distribution

Several important observations can be made here.
• The experiment always ends with a success;
• There are exactly r - 1 successes in the previous x - 1 trials, in any combination;
• There are exactly x - r failures in the x - 1 trials.
• Therefore the probability distribution function can be written as

    f(x) = C(x - 1, r - 1) (1 - p)^(x - r) p^r

  where C(x - 1, r - 1) = (x - 1)! / [(r - 1)!(x - r)!] counts the ways to place the r - 1 successes among the first x - 1 trials.

• The mean: μ = r/p

• The variance: σ² = rq/p²
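
A sketch of the negative binomial probability function and its summary measures (function names are mine):

    # Negative binomial distribution: probability that the r-th success occurs
    # on trial x, i.e. f(x) = C(x-1, r-1) * (1-p)^(x-r) * p^r.
    from math import comb

    def neg_binomial_pmf(x: int, r: int, p: float) -> float:
        return comb(x - 1, r - 1) * (1 - p) ** (x - r) * p ** r

    def neg_binomial_mean(r: int, p: float) -> float:
        return r / p                      # mu = r/p

    def neg_binomial_variance(r: int, p: float) -> float:
        return r * (1 - p) / p ** 2       # sigma^2 = r*q/p^2
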
Example

• Cotton linters used in the production of solid rocket propellants are subjected to a nitration process that enables the cotton fibers to go into solution. The process is 90% effective, in that the materials produced can be shaped as desired in a later processing stage with probability 0.9. What is the probability that exactly 20 lots will be produced in order to obtain the third defective lot?

• Here 'success' is obtaining a defective lot, and hence p = 0.1 and r = 3. The probability that x = 20 is given by

    f(20) = C(19, 2) (0.9)^17 (0.1)^3 = 0.0285

• The expected value is 3/0.1 = 30, meaning that on average 30 trials would be required to produce the third defective lot.
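
A short sketch that verifies the numbers in this example:

    # Negative binomial check: probability that the 3rd defective lot (p = 0.1)
    # is obtained on exactly the 20th lot: f(20) = C(19, 2) * (0.9)^17 * (0.1)^3.
    from math import comb

    p, r, x = 0.1, 3, 20
    prob = comb(x - 1, r - 1) * (1 - p) ** (x - r) * p ** r

    print(round(prob, 4))                 # about 0.0285
    print(round(r / p))                   # expected number of lots: 30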

Poisson Probability Distribution

• Properties of a Poisson Experiment
  - The probability of an occurrence is the same for any two intervals of equal length.
  - The occurrence or nonoccurrence in any interval is independent of the occurrence or nonoccurrence in any other interval.

• Poisson Probability Function

    f(x) = μ^x e^(-μ) / x!

  where:
    f(x) = probability of x occurrences in an interval
    μ = mean number of occurrences in an interval
    e = 2.71828
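
A minimal sketch of the Poisson probability function (the function name is mine), with a sanity check that the probabilities sum to 1:

    # Poisson probability function: f(x) = mu^x * e^(-mu) / x!
    from math import exp, factorial

    def poisson_pmf(x: int, mu: float) -> float:
        return mu ** x * exp(-mu) / factorial(x)

    # Sanity check: the probabilities over all x sum to 1 (mu = 3 is illustrative).
    print(round(sum(poisson_pmf(x, 3.0) for x in range(50)), 6))   # 1.0
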
Example: Mercy Hospital

• Using the Poisson Probability Function

  Patients arrive at the emergency room of Mercy Hospital at the average rate of 6 per hour on weekend evenings. What is the probability of 4 arrivals in 30 minutes on a weekend evening?

    μ = 6/hour = 3/half-hour, x = 4

    f(4) = 3^4 (2.71828)^(-3) / 4! = .1680

• Using the Tables of Poisson Probabilities

                                               μ
    x     2.1     2.2     2.3     2.4     2.5     2.6     2.7     2.8     2.9     3.0
    0    .1225   .1108   .1003   .0907   .0821   .0743   .0672   .0608   .0550   .0498
    1    .2572   .2438   .2306   .2177   .2052   .1931   .1815   .1703   .1596   .1494
    2    .2700   .2681   .2652   .2613   .2565   .2510   .2450   .2384   .2314   .2240
    3    .1890   .1966   .2033   .2090   .2138   .2176   .2205   .2225   .2237   .2240
    4    .0992   .1082   .1169   .1254   .1336   .1414   .1488   .1557   .1622   .1680
    5    .0417   .0476   .0538   .0602   .0668   .0735   .0804   .0872   .0940   .1008
    6    .0146   .0174   .0206   .0241   .0278   .0319   .0362   .0407   .0455   .0504
    7    .0044   .0055   .0068   .0083   .0099   .0118   .0139   .0163   .0188   .0216
    8    .0011   .0015   .0019   .0025   .0031   .0038   .0047   .0057   .0068   .0081
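
A quick sketch verifying the Mercy Hospital calculation against the μ = 3.0 column of the table:

    # Poisson check for Mercy Hospital: mu = 3 arrivals per half-hour, x = 4.
    from math import exp, factorial

    mu = 3.0
    prob_4 = mu ** 4 * exp(-mu) / factorial(4)
    print(f"{prob_4:.4f}")                # 0.1680, matching the table entry for x = 4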

Hypergeometric Probability Distribution

• The hypergeometric distribution is closely related to the binomial distribution.
• With the hypergeometric distribution, the trials are not independent, and the probability of success changes from trial to trial.

• Hypergeometric Probability Function

    f(x) = C(r, x) C(N - r, n - x) / C(N, n)        for 0 ≤ x ≤ r

  where:
    f(x) = probability of x successes in n trials
    n = number of trials
    N = number of elements in the population
    r = number of elements in the population labeled success
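
A sketch of the hypergeometric probability function (the function name and the sample parameters in the check are mine):

    # Hypergeometric probability function: f(x) = C(r, x) * C(N-r, n-x) / C(N, n)
    from math import comb

    def hypergeometric_pmf(x: int, n: int, N: int, r: int) -> float:
        return comb(r, x) * comb(N - r, n - x) / comb(N, n)

    # Sanity check with illustrative values N = 10, r = 4, n = 3:
    # the probabilities over x = 0..3 sum to 1.
    print(round(sum(hypergeometric_pmf(x, 3, 10, 4) for x in range(4)), 6))   # 1.0
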
Example: Neveready

• Hypergeometric Probability Distribution

  Bob Neveready has removed two dead batteries from a flashlight and inadvertently mingled them with the two good batteries he intended as replacements. The four batteries look identical. Bob now randomly selects two of the four batteries. What is the probability he selects the two good batteries?

    f(x) = C(r, x) C(N - r, n - x) / C(N, n)
         = C(2, 2) C(2, 0) / C(4, 2)
         = [(2!/(2!0!)) (2!/(0!2!))] / (4!/(2!2!))
         = 1/6
         = .167

  where:
    x = 2 = number of good batteries selected
    n = 2 = number of batteries selected
    N = 4 = number of batteries in total
    r = 2 = number of good batteries in total
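
A short numerical check of the Neveready answer:

    # Hypergeometric check: 2 good batteries (r) among 4 (N), 2 selected (n),
    # probability that both selected batteries are good (x = 2).
    from math import comb

    x, n, N, r = 2, 2, 4, 2
    prob = comb(r, x) * comb(N - r, n - x) / comb(N, n)
    print(round(prob, 3))                 # 0.167, i.e. 1/6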

End of Lecture 5
