
Discrete Random Variables

Aria Nosratinia Probability and Statistics 3-1


Why Random Variables
So far, we have looked at experiments and outcomes.
In engineering, we are interested in the randomness of certain
classes of quantities:
Voltage
Current
Power
.......
Our outcomes of interest are numbers
Random variables arise when we assign probabilities to numbers.
Example: the throw of a die is a random variable.
Examples
Consider the experiment of buying semiconductor chips from
Whizywhiz corp., each of them having a 2% failure probability.
We have bought 10 chips. The number of bad chips in this set is
a random variable that can take values in the set:
{0, 1, 2, 3, 4, . . . , 10}
A photo-multiplier counts the number of photons received. This
device is used especially for the case where the incoming radiation
is weak, and every photon must be counted. The number of
incoming photons in a given minute is a random variable, which
can take any integer value greater than or equal to zero.
{0, 1, 2, 3, 4, . . .}
Definition
A random variable consists of an experiment with a probability
measure P(·) defined on a sample space S, and a function that
assigns a real number to each outcome of the sample space.
We use this more general definition to bring in experiments whose
outcomes are not numbers to begin with.
This increases the usefulness of the definition (and the scope of
our possibilities).
We denote random variables with upper-case letters, e.g., X.
An outcome is denoted with lower-case letters, e.g., x.
Example
Consider the sample space of a coin toss:
S = {heads, tails}
Assume the coin is crooked, with P(heads) = 0.7.
We can define a random variable on this experiment; call it X.
X(heads) = 0
X(tails) = 1
Note that this is a function from the sample space to {0, 1}
Now we can write:
P(X = 0) = 0.7 P(X = 1) = 0.3
or
P_X(0) = 0.7   P_X(1) = 0.3
Probability Mass Function
The probability mass function is a function that, for each value of
the random variable, gives its probability.
P_X(x) = Prob[X = x]
Example: the probability mass function (pmf) of the previous
example is P_X(0) = 0.7, P_X(1) = 0.3.
[Plot of P_X(x): mass 0.7 at x = 0 and mass 0.3 at x = 1.]
Example
A fair coin is tossed twice. We are interested in the number of
tails. Draw the probability mass function of this random
variable.
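The pmf can be checked by brute-force enumeration; the sketch below (Python, not part of the original slides) tallies the number of tails over the four equally likely outcomes.

```python
from itertools import product

# Enumerate the sample space of two fair coin tosses; each of the
# four outcomes has probability 1/4. Tally the number of tails.
pmf = {}
for outcome in product("HT", repeat=2):
    tails = outcome.count("T")
    pmf[tails] = pmf.get(tails, 0) + 0.25

print(pmf)  # {0: 0.25, 1: 0.5, 2: 0.25}
```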
Bernoulli Random Variable
Definition: X is a Bernoulli random variable if the pmf has the
form:
P_X(x) = { 1 - p,  x = 0
         { p,      x = 1
         { 0,      otherwise
The Bernoulli is the simplest (nontrivial) random variable.
It describes any binary event, and is the building block for many
other random variables.
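A minimal simulation sketch (Python; the seed and sample size are arbitrary choices): draw many Bernoulli-p samples and check that the fraction of 1s is close to p.

```python
import random

def bernoulli(p, rng):
    """One Bernoulli trial: returns 1 with probability p, else 0."""
    return 1 if rng.random() < p else 0

rng = random.Random(0)
samples = [bernoulli(0.7, rng) for _ in range(100_000)]
print(sum(samples) / len(samples))  # close to 0.7
```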
Meet the Bernoullis!
Jakob
(1654-1705)
Bernoulli Distribution
Johann
(1667-1748)
L'Hospital's rule
Daniel
(1700-1782)
Fluid Mechanics
More Bernoullis!
Repeated Trials until Success
Consider the following experiment: we flip a coin until we get a
tail. The coin comes up tails with probability p.
How many times until success?
Let's calculate the probability that it will take exactly 3 coin flips
until success:
P(success at 3) = (1 - p) · (1 - p) · p
In general, we can describe the probability of success at flip n:
P_X(n) = p(1 - p)^(n-1)
Geometric Random Variable
Definition: X is a geometric random variable if the pmf has the form:
P_X(x) = { p(1 - p)^(x-1),  x = 1, 2, . . .
         { 0,               otherwise
[Plot of P_X(x): geometrically decaying mass points at x = 1, 2, . . .]
The geometric distribution is motivated by repeated Bernoulli trials
until success.
Geometric RV Example
An urn contains 5 white balls and 50 black balls. We draw one ball at
random from the urn, observe the color, and return it to the urn. We
repeat the experiment until a white ball is observed.
What is the probability that we will see a white ball on the first
draw?
What is the probability that we will need exactly 6 draws to see
the first white ball?
Probability of k Successes out of n
We return to the coin toss that gives tails with probability p.
We throw the coin 10 times. What is the probability of getting
tails exactly 3 times?
P(3 out of 10) = (10 choose 3) p^3 (1 - p)^7
In general, the probability of getting k out of n is:
P(k out of n) = (n choose k) p^k (1 - p)^(n-k)
Binomial Random Variable
Definition: X is a binomial random variable if the pmf has the
form:
P_X(x) = { (n choose x) p^x (1 - p)^(n-x),  x = 0, 1, 2, . . . , n
         { 0,                               otherwise
The binomial distribution is motivated by the number of Bernoulli
successes in a pre-determined number of trials.
Binomial RV Example
An urn contains 5 white balls and 50 black balls. We draw one ball at
random from the urn, observe the color, and return it to the urn. We
repeat the experiment 12 times.
What is the probability of observing 9 black balls?
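Each draw is black with probability 50/55 = 10/11, so the count of black balls in 12 draws is binomial. A sketch of the computation (Python):

```python
from math import comb

n, k = 12, 9
p_black = 50 / 55
prob = comb(n, k) * p_black**k * (1 - p_black) ** (n - k)
print(round(prob, 4))  # about 0.07
```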
Uniform Random Variable
Definition: X is a (discrete) uniform random variable if the pmf
has the form:
P_X(x) = { 1/(l - k + 1),  x = k, k + 1, . . . , l
         { 0,              otherwise
[Plot of P_X(x): equal mass points at x = k through x = l.]
The uniform distribution is used in equiprobable situations.
Uniform RV Example
The throw of a die is a uniformly distributed random variable over
the support {1, 2, 3, 4, 5, 6}.
A Bernoulli-1/2 random variable is the simplest uniform random
variable. However, it is often referred to as Bernoulli, not uniform.
Poisson Random Variable
Definition: X is a Poisson random variable if its pmf has the
form:
P_X(x) = { α^x e^(-α) / x!,  x = 0, 1, 2, . . .
         { 0,                otherwise
The Poisson distribution describes random arrivals.
Consider that customers arrive at a bank at the average rate of λ
per minute. What is the probability of exactly x customers
arriving in a time interval of T minutes?
ANSWER: Poisson distribution with parameter α = λT.
Some Examples for Poisson Distributions
Poisson distribution has a very large range of applications. Some
examples include:
The number of brush fires per year in California
The number of deaths of life insurance policy-holders of a given
insurance company, per year.
The number of electrons emitted from the cathode of a vacuum
tube per microsecond.
The number of wrong telephone numbers dialed in a day in the
Dallas-Fort Worth metroplex.
Poisson Example
Suppose the number of typographical errors on a given page has a
Poisson distribution with parameter α = 0.5. Then what is the
probability that there is at least one error on a given page?
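The complement trick gives P(at least one error) = 1 - P_X(0) = 1 - e^(-0.5). A numeric check (Python):

```python
from math import exp

alpha = 0.5
p_none = exp(-alpha)          # P(X = 0) = alpha^0 e^(-alpha) / 0!
p_at_least_one = 1 - p_none
print(round(p_at_least_one, 4))  # about 0.3935
```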
Poisson Approximates Binomial
Wimtel chips have a failure rate of 0.02%. Out of a batch of
100,000, what is the probability that two will fail?
(100000 choose 2) (0.0002)^2 (0.9998)^99998 = ?
Very large n and very small p, but α = np is a moderate number.
lim_{n→∞} (n choose k) p^k (1 - p)^(n-k)
  = lim_{n→∞} [n! / ((n-k)! k!)] (α/n)^k (1 - α/n)^(n-k)
  = (α^k / k!) lim_{n→∞} [n! / ((n-k)! n^k)] (1 - α/n)^n (1 - α/n)^(-k)
where n!/((n-k)! n^k) → 1, (1 - α/n)^n → e^(-α), and (1 - α/n)^(-k) → 1, so
  = (α^k / k!) e^(-α)
We can use the Poisson to approximate the binomial.
Example
Suppose that the probability that an item produced by a certain
machine is defective is 0.1. Find the probability that a sample of 10
items will have at most 1 defective item. Find the desired probability
directly, as well as with the Poisson approximation.
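A sketch comparing the exact binomial answer with the Poisson approximation using α = np = 1 (Python):

```python
from math import comb, exp, factorial

n, p = 10, 0.1
alpha = n * p  # Poisson parameter for the approximation

# P(X <= 1) computed exactly (binomial) and approximately (Poisson)
direct = sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(2))
approx = sum(alpha**k * exp(-alpha) / factorial(k) for k in range(2))

print(round(direct, 4), round(approx, 4))
```

The two numbers agree to about two decimal places even for n as small as 10.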
Derivation of Poisson pmf
Assume:
Exactly one event happens in an interval of length Δt with probability
λΔt + o(Δt).
Two or more events happen in Δt with probability o(Δt).
The numbers of events in non-overlapping intervals are independent.
Let's call N(t) the number of events in [0, t]. Divide [0, t] into n
subintervals.
Now show that the probability of having two events in any subinterval
goes to zero as n → ∞.
Then show that the probability of exactly k subintervals each having one
event is:
(n choose k) (λt/n + o(t/n))^k (1 - λt/n - o(t/n))^(n-k)
Show that this is, in the limit, the Poisson probability.
Pascal Random Variable
The Pascal pmf gives the probability of the number of Bernoulli trials
until the k-th success (RECALL: geometric until the first success).
Also known as the Negative Binomial pmf.
Definition: A Pascal (k, p) random variable has the pmf:
P_X(x) = { (x-1 choose k-1) p^k (1 - p)^(x-k),  x = k, k + 1, . . .
         { 0,                                   otherwise
Reasoning: achieving the k-th success at trial x is equivalent to:
Exactly k - 1 successes in x - 1 tries, and ...
Exactly one success on the x-th try.
These events are independent, so we multiply the probabilities.
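A small consistency check (Python; the helper pascal_pmf is ours): with k = 1 the Pascal pmf should reduce to the geometric pmf, since the first success is then the k-th success.

```python
from math import comb

def pascal_pmf(x, k, p):
    """P(k-th Bernoulli-p success occurs exactly on trial x)."""
    if x < k:
        return 0.0
    return comb(x - 1, k - 1) * p**k * (1 - p) ** (x - k)

p = 0.3
geometric = [p * (1 - p) ** (x - 1) for x in range(1, 6)]
pascal1 = [pascal_pmf(x, 1, p) for x in range(1, 6)]
print(all(abs(a - b) < 1e-12 for a, b in zip(geometric, pascal1)))  # True
```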
Example: The Banach Match Problem
A pipe-smoking mathematician carries at all times 2 matchboxes, one
in his left-hand pocket, one in his right-hand pocket. Each time he
needs a match, he is equally likely to take it from either pocket.
Consider the moment the mathematician discovers an empty
matchbox. At that time, what is the probability that there are exactly
k matches in the other box? Both boxes started off with N matches.
Cumulative Distribution Function (CDF)
Definition: The cdf of a random variable X, denoted F_X(·), is
defined as
F_X(x) = Prob(X ≤ x)
Example: for the crooked-coin pmf with P_X(0) = 0.7 and P_X(1) = 0.3,
the cdf is a staircase:
F_X(x) = 0 for x < 0,  F_X(x) = 0.7 for 0 ≤ x < 1,  F_X(x) = 1 for x ≥ 1.
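The staircase can be generated mechanically from a pmf; a sketch (Python; the helper cdf is ours):

```python
def cdf(pmf, x):
    """F_X(x) = P(X <= x) for a discrete pmf given as {value: probability}."""
    return sum(p for v, p in pmf.items() if v <= x)

pmf = {0: 0.7, 1: 0.3}  # the crooked-coin random variable
print(cdf(pmf, -1), cdf(pmf, 0), cdf(pmf, 0.5), cdf(pmf, 1))
```

Evaluating just below, at, and between the mass points traces out the flat segments of the staircase.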
CDF Example 2
[Plot: pmf with P_X(0) = 0.25, P_X(1) = 0.5, P_X(2) = 0.25.
Its cdf steps from 0 to 0.25 at x = 0, to 0.75 at x = 1, and to 1 at x = 2.]
CDF Exercise
Find the cdf of the random variable with pmf:
P_X(-4) = 0.3,  P_X(1) = 0.1,  P_X(9) = 0.4,  P_X(11) = 0.2
CDF Motivation and Properties
We shall see that there are also continuous r.v.s and mixed r.v.s.
CDFs are the common bond between all types of r.v. They provide
an umbrella for a common understanding.
The CDF is the most mathematically rigorous way of presenting
random variables. (We won't be able to see the full extent of that
in this introductory course.)
Properties:
lim_{x→-∞} F_X(x) = 0
lim_{x→+∞} F_X(x) = 1
F_X(x) is right-continuous.
Expected Value
Definition: The expected value (a.k.a. mean) of a random
variable is defined as:
E[X] = μ_X = Σ_i x_i P_X(x_i)
Other interpretations:
Weighted average
Center of gravity
Examples:
[Plots: the pmf with P_X(0) = 0.25, P_X(1) = 0.5, P_X(2) = 0.25,
and the pmf with P_X(0) = 0.7, P_X(1) = 0.3, whose mean is 0.3.]
Expected Value Exercise
Calculate the expected value of the following random variable:
P_X(x) = { 0.3,  x = -4
         { 0.1,  x = 1
         { 0.4,  x = 9
         { 0.2,  x = 11
         { 0,    otherwise
Some Expected Values
Bernoulli:
E[X] = 0 · (1 - p) + 1 · p = p
Geometric: setting q = 1 - p,
E[X] = Σ_{x=1}^∞ x P_X(x) = Σ_{x=1}^∞ x p q^(x-1)
     = (p/q) Σ_x x q^x = (p/q) · q/(1 - q)^2 = 1/p
Binomial: E[X] = np
Pascal (k, p): E[X] = k/p
Uniform (k, l): E[X] = (k + l)/2
Poisson: E[X] = α
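The geometric mean 1/p can be sanity-checked with a truncated sum (Python; the cutoff 2000 is arbitrary, chosen so the discarded tail is negligible):

```python
p = 0.25
# Partial sum of x * p * (1-p)^(x-1); the tail beyond x = 2000 is negligible.
mean = sum(x * p * (1 - p) ** (x - 1) for x in range(1, 2000))
print(abs(mean - 1 / p) < 1e-6)  # True
```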
Notes about Expected Values
Did you notice that the mean of binomial is n times the mean of
Bernoulli?
Did you notice that the mean of Pascal is k times the mean of
geometric?
Is this a coincidence? If not, what is the reason behind it?
Exercise: Derive the mean of the Binomial, Pascal, and Poisson
random variables.
Hint: For the binomial, use the identity:
i (n choose i) = n (n-1 choose i-1)
Derived Random Variable
Assume X is a random variable. Then we can define another r.v.
Y = g(X)
For a discrete random variable X, the pmf of Y = g(X) is:
P_Y(y) = Σ_{x: g(x)=y} P_X(x)
Example
If X is the r.v. of a fair die throw, draw the pmf of X + 2, X^2, and
|X - 3.5|
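A sketch of how such pmfs can be computed by grouping outcomes with equal g(x) (Python, using exact fractions; the helper derived_pmf is ours):

```python
from fractions import Fraction

die = {x: Fraction(1, 6) for x in range(1, 7)}  # fair die pmf

def derived_pmf(pmf, g):
    """pmf of Y = g(X): add up P_X(x) over every x with g(x) = y."""
    out = {}
    for x, px in pmf.items():
        y = g(x)
        out[y] = out.get(y, Fraction(0)) + px
    return out

print(derived_pmf(die, lambda x: abs(x - 3.5)))
```

For |X - 3.5| the six faces collapse into three values (0.5, 1.5, 2.5), each with probability 1/3.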
Expected Value of Derived RV
Theorem: The expected value of Y = g(X) is:
E[Y] = Σ_{x∈S_X} g(x) P_X(x)
Proof:
E[Y] = Σ_{y∈S_Y} y P_Y(y)
     = Σ_{y∈S_Y} y Σ_{x: g(x)=y} P_X(x)
     = Σ_{y∈S_Y} Σ_{x: g(x)=y} y P_X(x)
     = Σ_{x∈S_X} g(x) P_X(x)
Expected Value of Derived RV
Basic idea: multiply values by their probabilities.
This makes it easier to calculate some expected values. Essentially
we are saying:
E_Y[Y] = E_X[g(X)]
In other words, it is not necessary to calculate P_Y if all we want is
just E[Y]. We can still use P_X in the manner mentioned above.
Example
A voltage signal X can take integer values in [-4, 4] with
probabilities P_X(x) = x^2/60. Find the average (mean) power
dissipated by this voltage on a 5 ohm resistor.
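The mean power E[X^2/5] can be computed directly from P_X via the derived-r.v. theorem, without finding the pmf of the power. A sketch (Python):

```python
support = range(-4, 5)
pmf = {x: x**2 / 60 for x in support}  # sum of x^2 over [-4, 4] is 60

# Power on a 5 ohm resistor is g(X) = X^2 / 5; apply E[g(X)] = sum g(x) P_X(x)
mean_power = sum((x**2 / 5) * px for x, px in pmf.items())
print(round(mean_power, 2))  # 2.36 watts
```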
Example
A fax transmission contains 1, 2, or 3 pages with probability 0.2
(each), and 4, 5, 6, or 7 pages with probability 0.1 (each). If the
cost of transmitting a fax is 10 cents for one or two pages, 20
cents for 3 or 4 pages, and 30 cents for more than 4 pages,
calculate the average (mean) cost. Draw the pmf of the cost.
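A sketch of both computations (Python; the cost function encodes the tariff above, in cents):

```python
page_pmf = {1: 0.2, 2: 0.2, 3: 0.2, 4: 0.1, 5: 0.1, 6: 0.1, 7: 0.1}

def cost(pages):
    """Transmission cost in cents for a fax of the given length."""
    if pages <= 2:
        return 10
    if pages <= 4:
        return 20
    return 30

# Mean cost directly from the page pmf (derived-r.v. theorem)
mean_cost = sum(cost(n) * p for n, p in page_pmf.items())

# pmf of the derived r.v. "cost": group pages with equal cost
cost_pmf = {}
for n, p in page_pmf.items():
    cost_pmf[cost(n)] = cost_pmf.get(cost(n), 0) + p

print(round(mean_cost, 1), cost_pmf)
```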
Linearity of Expected Value
Theorem: For any two constants a, b and random variable X:
E[aX + b] = aE[X] + b
Proof: Follows directly from the expected value of a derived r.v.
This is the most important property of expected value.
From it, we can deduce, for example, that:
E[X - μ_X] = E[X] - μ_X = 0
Variance
Definition: The variance of a random variable is defined as:
Var(X) = E[(X - μ_X)^2]
Variance is a measure of how spread out a random variable is.
In other words, variance tells us how far X typically gets from
its mean.
[Plots: two pmfs, one concentrated (lower variance), one spread out
(higher variance).]
Calculating the Variance
Theorem:
Var(X) = E[X^2] - (E[X])^2 = E[X^2] - μ_X^2
Proof:
Var(X) = E[(X - μ_X)^2]
       = E[X^2 - 2μ_X X + μ_X^2]
       = E[X^2] - E[2μ_X X] + E[μ_X^2]
       = E[X^2] - 2μ_X E[X] + μ_X^2
       = E[X^2] - μ_X^2
Example
Calculate the variance of a Bernoulli-p random variable.
Var(X) = E[X^2] - μ_X^2
       = ((1 - p) · 0^2 + p · 1^2) - p^2
       = p - p^2
       = p(1 - p)
Example
Find the variance of the following random variable:
X = { -1 with prob. 1/2
    { +1 with prob. 1/2
Now calculate the variance of the following random variable:
Y = { -2 with prob. 1/2
    { +2 with prob. 1/2
Variance is a measure of the spread of a random variable.
Is the variance proportional to the spread?
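A quick numerical check (Python; the helper var is ours) shows the variance quadrupling when the spread doubles:

```python
def var(pmf):
    """Variance of a discrete pmf given as {value: probability}."""
    mean = sum(x * p for x, p in pmf.items())
    return sum((x - mean) ** 2 * p for x, p in pmf.items())

X = {-1: 0.5, 1: 0.5}
Y = {-2: 0.5, 2: 0.5}
print(var(X), var(Y))  # 1.0 4.0 -- variance scales with the square of the spread
```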
Properties of the Variance
Variance is always non-negative: Var(X) ≥ 0
Variance is invariant to a shift: Var(X + b) = Var(X)
Variance is a quadratic function: Var(aX) = a^2 Var(X)
Together we have:
Var(aX + b) = a^2 Var(X)
Example Means and Variances
Random Variable   Mean        Variance
Bernoulli         p           p(1 - p)
Geometric         1/p         (1 - p)/p^2
Binomial          np          np(1 - p)
Pascal (k, p)     k/p         k(1 - p)/p^2
Poisson           α           α
Uniform (k, l)    (k + l)/2   (l - k)(l - k + 2)/12
Standard Deviation
Definition: Standard deviation is the square root of the variance.
σ_X = √Var(X)
Calculate the standard deviation of the random variables in the
previous examples.
Moments
Definition: For a random variable X:
The n-th moment is E[X^n]
The central n-th moment is E[(X - μ_X)^n]
The first moment is the mean; the second central moment is the
variance.
Exercise: Compute the third moment of a Bernoulli random
variable.
Conditional PMF
Recall the basic definition of conditional probability:
P(A|B) = P(A, B) / P(B)
Now consider the event A = {X = x} and rewrite:
P{X = x | B} = P({X = x} ∩ B) / P(B)
This is known as the conditional pmf, and denoted by P_{X|B}(x).
Clearly, if x ∉ B, then P_{X|B}(x) = 0.
If x ∈ B, then P_{X|B}(x) = P_X(x) / P(B) (why?).
Example
Consider the random variable arising from the throw of a fair die.
Now consider the event:
B = { 1 did NOT happen }
Find the conditional pmf.
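A sketch of the computation (Python, exact fractions): restrict the fair-die pmf to B and renormalize by P(B).

```python
from fractions import Fraction

die = {x: Fraction(1, 6) for x in range(1, 7)}
B = {2, 3, 4, 5, 6}                 # the event "1 did NOT happen"
pB = sum(die[x] for x in B)         # P(B) = 5/6
cond = {x: die[x] / pB for x in B}  # P_{X|B}(x) = P_X(x) / P(B) on B
print(cond[2])  # 1/5
```

Each of the five remaining faces ends up with conditional probability 1/5.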
Example
Consider the random variable X which denotes the number of
tosses of a fair coin until we observe the first heads. Draw the
pmf of this random variable.
Now consider the event:
B = { the first toss came tails }
Calculate and draw the conditional pmf given B.
Properties of Conditional PMF
For any x ∈ B, P_{X|B}(x) ≥ 0
Σ_{x∈B} P_{X|B}(x) = 1
For any C ⊆ B, we have
Prob{C|B} = Σ_{x∈C} P_{X|B}(x)
Conditional Moments
The conditional expected value is defined as:
E[X|B] = Σ_{x∈B} x P_{X|B}(x)
The conditional expected value of Y = g(X) is:
E[Y|B] = E[g(X)|B] = Σ_{x∈B} g(x) P_{X|B}(x)
Conditional variance is given by:
Var(X|B) = E[(X - E[X|B])^2 | B] = E[X^2|B] - (E[X|B])^2