Moment Generating Functions
In the discrete case $m_X(t)$ is equal to $\sum_x e^{tx} p(x)$, and in the continuous case it is $\int_{-\infty}^{\infty} e^{tx} f(x)\, dx$.
Let us compute the moment generating function for some of the distributions we have been
working with.
If $X \sim \mathrm{Binom}(n, p)$, then
$$m_X(t) = \sum_{k=0}^{n} e^{tk} \binom{n}{k} p^k (1-p)^{n-k} = \sum_{k=0}^{n} \binom{n}{k} \left(pe^t\right)^k (1-p)^{n-k} = \left(pe^t + (1-p)\right)^n$$
by the Binomial formula.
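As a quick numerical sanity check (a Python sketch, not part of the original text), we can compare the defining sum with the closed form $(pe^t + (1-p))^n$; the particular values of $n$, $p$, and $t$ below are arbitrary:

```python
import math

def binom_mgf_sum(n, p, t):
    # m_X(t) = sum_{k=0}^n e^{tk} C(n,k) p^k (1-p)^(n-k)
    return sum(math.exp(t * k) * math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1))

def binom_mgf_closed(n, p, t):
    # closed form (p e^t + (1-p))^n obtained from the Binomial formula
    return (p * math.exp(t) + (1 - p))**n

# the two expressions agree for any choice of n, p, t
print(binom_mgf_sum(10, 0.3, 0.7), binom_mgf_closed(10, 0.3, 0.7))
```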
Proposition 13.1
Suppose $X_1, \ldots, X_n$ are $n$ independent random variables, and the random variable $Y$ is defined by
$$Y = X_1 + \cdots + X_n.$$
Then
$$m_Y(t) = m_{X_1}(t) \cdots m_{X_n}(t).$$
Proof: since $X_1, \ldots, X_n$ are independent, so are $e^{tX_1}, \ldots, e^{tX_n}$, and therefore
$$m_Y(t) = E\left[e^{tX_1} \cdots e^{tX_n}\right] = E e^{tX_1} \cdots E e^{tX_n} = m_{X_1}(t) \cdots m_{X_n}(t).$$
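Proposition 13.1 can be illustrated numerically (a Python sketch with two hypothetical discrete distributions): the MGF of the sum, computed from the convolved PMF, matches the product of the individual MGFs.

```python
import math
from itertools import product

# PMFs of two independent discrete random variables (hypothetical example)
pmf1 = {0: 0.5, 1: 0.5}
pmf2 = {0: 1/3, 1: 1/3, 2: 1/3}

def mgf(pmf, t):
    # m_X(t) = sum_x e^{tx} p(x)
    return sum(math.exp(t * x) * p for x, p in pmf.items())

# PMF of Y = X1 + X2, using independence (convolution of the PMFs)
pmf_sum = {}
for (x1, p1), (x2, p2) in product(pmf1.items(), pmf2.items()):
    pmf_sum[x1 + x2] = pmf_sum.get(x1 + x2, 0.0) + p1 * p2

t = 0.4
print(mgf(pmf_sum, t), mgf(pmf1, t) * mgf(pmf2, t))  # the two agree
```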
Proposition 13.2
Suppose that for two random variables $X$ and $Y$ we have $m_X(t) = m_Y(t) < \infty$ for all $t$ in an interval. Then $X$ and $Y$ have the same distribution.
We will not prove this, but this statement is essentially the uniqueness of the Laplace transform $\mathcal{L}$. Recall that the Laplace transform of a function $f(x)$ defined for all positive real numbers is, for $s > 0$,
$$(\mathcal{L}f)(s) := \int_0^{\infty} f(x) e^{-sx}\, dx.$$
Thus if $X$ is a continuous random variable with a PDF such that $f_X(x) = 0$ for $x < 0$, then
$$m_X(t) = \int_0^{\infty} e^{tx} f_X(x)\, dx = (\mathcal{L}f_X)(-t).$$
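The relation $m_X(t) = (\mathcal{L}f_X)(-t)$ can be checked with a computer algebra system. The following Python/sympy sketch uses an $\mathrm{Exp}(1)$ density as a hypothetical example of a PDF that vanishes for $x < 0$:

```python
import sympy as sp

x, s, t = sp.symbols('x s t', positive=True)
fX = sp.exp(-x)  # PDF of an Exp(1) random variable, taken as 0 for x < 0

# Laplace transform: (Lf_X)(s) = 1/(s + 1)
L = sp.laplace_transform(fX, x, s, noconds=True)

# m_X(t) = (Lf_X)(-t) = 1/(1 - t), the known MGF of Exp(1) for t < 1
print(sp.simplify(L.subs(s, -t)))
```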
Proposition 13.1 allows us to show some of the properties of sums of independent random
variables that we proved or stated before.
One problem with the moment generating function is that it might be infinite. One way to get around this, at the cost of considerable work, is to use the characteristic function $\varphi_X(t) = E e^{itX}$, where $i = \sqrt{-1}$. This is always finite, and is the analogue of the Fourier transform.
Definition (Joint MGF)
The joint moment generating function of $X$ and $Y$ is $m_{X,Y}(s, t) := E e^{sX + tY}$.
Solution: we can match this MGF to a known MGF of one of the distributions we considered, and then apply Proposition 13.2. Observe that $m(t) = e^{3(e^t - 1)} = e^{\lambda(e^t - 1)}$, where $\lambda = 3$. Thus $X \sim \mathrm{Poisson}(3)$, and therefore
$$P(X = 0) = e^{-\lambda} \frac{\lambda^0}{0!} = e^{-3}.$$
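To see numerically that $e^{\lambda(e^t - 1)}$ really is the Poisson MGF, one can truncate the defining series $\sum_k e^{tk} e^{-\lambda}\lambda^k/k!$ (a Python sketch; the parameter values are arbitrary):

```python
import math

lam, t = 3.0, 0.5
# truncated Poisson series: sum_k e^{tk} e^{-lam} lam^k / k!
series = sum(math.exp(t * k) * math.exp(-lam) * lam**k / math.factorial(k)
             for k in range(60))
closed = math.exp(lam * (math.exp(t) - 1))
print(series, closed)  # the two agree to high precision
```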
This example illustrates why $m_X(t)$ is called the moment generating function. Namely, we can use it to find all the moments of $X$ by differentiating $m(t)$ and then evaluating at $t = 0$. Note that
$$m'(t) = \frac{d}{dt} E e^{tX} = E\left[\frac{d}{dt} e^{tX}\right] = E\left[X e^{tX}\right].$$
Now evaluate at $t = 0$ to get $m'(0) = E\left[X e^0\right] = EX$. Similarly
$$m''(t) = \frac{d}{dt} E\left[X e^{tX}\right] = E\left[X^2 e^{tX}\right],$$
so that
$$m''(0) = E\left[X^2 e^0\right] = E\left[X^2\right].$$
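These computations are easy to replicate with a computer algebra system. A Python/sympy sketch, using the Poisson(3) MGF from the example above:

```python
import sympy as sp

t = sp.symbols('t')
# MGF of a Poisson(3) random variable, as in the example above
m = sp.exp(3 * (sp.exp(t) - 1))

EX = sp.diff(m, t).subs(t, 0)       # m'(0)  = E[X]
EX2 = sp.diff(m, t, 2).subs(t, 0)   # m''(0) = E[X^2]
print(EX, EX2, EX2 - EX**2)         # mean 3, second moment 12, variance 3
```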
Example 13.10. Suppose $X$ is a discrete random variable and has the MGF
$$m_X(t) = \frac{1}{7} e^{2t} + \frac{3}{7} e^{3t} + \frac{2}{7} e^{5t} + \frac{1}{7} e^{8t}.$$
What is the PMF of $X$? Find $EX$.
Solution: this does not match any of the known MGFs directly. Reading off from the MGF we guess
$$\frac{1}{7} e^{2t} + \frac{3}{7} e^{3t} + \frac{2}{7} e^{5t} + \frac{1}{7} e^{8t} = \sum_{i=1}^{4} e^{t x_i} p(x_i),$$
so that $X$ takes the values $2, 3, 5, 8$ with probabilities $\frac{1}{7}, \frac{3}{7}, \frac{2}{7}, \frac{1}{7}$, respectively.
© Copyright 2017 Phanuel Mariano, Patricia Alonso Ruiz, 2020 Masha Gordina.
To find $E[X]$ we can use Proposition 13.3 by taking the derivative of the moment generating function as follows:
$$m'(t) = \frac{2}{7} e^{2t} + \frac{9}{7} e^{3t} + \frac{10}{7} e^{5t} + \frac{8}{7} e^{8t},$$
so that
$$E[X] = m'(0) = \frac{2}{7} + \frac{9}{7} + \frac{10}{7} + \frac{8}{7} = \frac{29}{7}.$$
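As a check on the arithmetic, the same value follows directly from the PMF read off above; a small Python sketch using exact fractions:

```python
from fractions import Fraction as F

# PMF read off from the MGF in Example 13.10
pmf = {2: F(1, 7), 3: F(3, 7), 5: F(2, 7), 8: F(1, 7)}

# E[X] = sum_i x_i p(x_i), which is exactly m'(0)
EX = sum(x * p for x, p in pmf.items())
print(EX)  # 29/7
```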
13.3. Exercises
Exercise 13.1. Suppose that you have a fair 4-sided die, and let $X$ be the random variable representing the value of the number rolled. Find the moment generating function $m_X(t)$, and use it to compute $EX$ and $EX^2$.
Exercise 13.2. Let $X$ be a random variable whose probability density function is given by
$$f_X(x) = \begin{cases} e^{-2x} + \frac{1}{2} e^{-x} & x > 0 \\ 0 & \text{otherwise.} \end{cases}$$
Find the moment generating function $m_X(t)$.
Exercise 13.3. Suppose that a mathematician determines that the revenue the UConn Dairy Bar makes in a week is a random variable, $X$, with moment generating function
$$m_X(t) = \frac{1}{(1 - 2500t)^4}.$$
Find the standard deviation of the revenue the UConn Dairy Bar makes in a week.
Exercise 13.4. Let $X$ and $Y$ be two independent random variables with respective moment generating functions
$$m_X(t) = \frac{1}{1 - 5t}, \ \text{if } t < \frac{1}{5}, \qquad m_Y(t) = \frac{1}{(1 - 5t)^2}, \ \text{if } t < \frac{1}{5}.$$
Find $E\left[(X + Y)^2\right]$.
Exercise 13.5. Suppose $X$ and $Y$ are independent Poisson random variables with parameters $\lambda_x, \lambda_y$, respectively. Find the distribution of $X + Y$.
Exercise 13.6. True or False? If $X \sim \mathrm{Exp}(\lambda_x)$ and $Y \sim \mathrm{Exp}(\lambda_y)$, then $X + Y \sim \mathrm{Exp}(\lambda_x + \lambda_y)$. Justify your answer.
13.4. Selected Solutions
Solution to Exercise 13.1: the MGF is
$$m_X(t) = \frac{1}{4}\left(e^t + e^{2t} + e^{3t} + e^{4t}\right),$$
so that
$$EX = m_X'(0) = \frac{1}{4}(1 + 2 + 3 + 4) = \frac{5}{2}$$
and
$$EX^2 = m_X''(0) = \frac{1}{4}(1 + 4 + 9 + 16) = \frac{15}{2}.$$
Solution to Exercise 13.2:
$$m_X(t) = E e^{tX} = \int_0^{\infty} e^{tx} \left(e^{-2x} + \frac{1}{2} e^{-x}\right) dx$$
$$= \left[\frac{1}{t - 2} e^{(t-2)x} + \frac{1}{2(t - 1)} e^{(t-1)x}\right]_{x=0}^{x=\infty}$$
$$= 0 - \frac{1}{t - 2} + 0 - \frac{1}{2(t - 1)}$$
$$= \frac{1}{2 - t} + \frac{1}{2(1 - t)} = \frac{4 - 3t}{2(2 - t)(1 - t)}, \quad \text{for } t < 1.$$
Solution to Exercise 13.4: first let $W = X + Y$. Since $X$ and $Y$ are independent, Proposition 13.1 gives
$$m_W(t) = m_{X+Y}(t) = m_X(t) m_Y(t) = \frac{1}{(1 - 5t)^3}.$$
Recall that $E\left[W^2\right] = m_W''(0)$, which we can find from
$$m_W'(t) = \frac{15}{(1 - 5t)^4}, \qquad m_W''(t) = \frac{300}{(1 - 5t)^5},$$
thus
$$E\left[W^2\right] = m_W''(0) = \frac{300}{(1 - 0)^5} = 300.$$
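A one-line sympy check of the differentiation (sketch):

```python
import sympy as sp

t = sp.symbols('t')
mW = 1 / (1 - 5 * t)**3   # MGF of W = X + Y, the product of the two MGFs

EW2 = sp.diff(mW, t, 2).subs(t, 0)  # E[W^2] = m_W''(0)
print(EW2)  # 300
```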
Solution to Exercise 13.5: since $X \sim \mathrm{Pois}(\lambda_x)$ and $Y \sim \mathrm{Pois}(\lambda_y)$ are independent, Proposition 13.1 gives
$$m_{X+Y}(t) = m_X(t) m_Y(t) = e^{\lambda_x(e^t - 1)} e^{\lambda_y(e^t - 1)} = e^{(\lambda_x + \lambda_y)(e^t - 1)}.$$
This is the MGF of a Poisson random variable with parameter $\lambda_x + \lambda_y$, so by Proposition 13.2 we conclude that $X + Y \sim \mathrm{Pois}(\lambda_x + \lambda_y)$.
Solution to Exercise 13.6: recall that the MGF of an exponential random variable with parameter $\lambda$ is $\frac{\lambda}{\lambda - t}$ for $t < \lambda$. If the statement were true, then by Proposition 13.1 the product $m_X(t) m_Y(t)$ would have to equal the MGF of $\mathrm{Exp}(\lambda_x + \lambda_y)$, but
$$\frac{\lambda_x + \lambda_y}{\lambda_x + \lambda_y - t} \neq \frac{\lambda_x}{\lambda_x - t} \cdot \frac{\lambda_y}{\lambda_y - t},$$
and hence the statement is false.
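The inequality can be confirmed symbolically; the following Python/sympy sketch evaluates both sides at the (arbitrary) point $\lambda_x = \lambda_y = 1$, $t = 1/2$ and shows the difference is nonzero:

```python
import sympy as sp

t, lx, ly = sp.symbols('t lambda_x lambda_y', positive=True)

prod_mgf = (lx / (lx - t)) * (ly / (ly - t))   # MGF of X + Y (product rule)
exp_sum_mgf = (lx + ly) / (lx + ly - t)        # MGF of Exp(lambda_x + lambda_y)

# a single counterexample point suffices: lambda_x = lambda_y = 1, t = 1/2
difference = (prod_mgf - exp_sum_mgf).subs({lx: 1, ly: 1, t: sp.Rational(1, 2)})
print(difference)  # 8/3, so the two MGFs differ
```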