
CHAPTER 13

Moment generating functions

13.1. Definition and examples


Definition (Moment generating function)
The moment generating function (MGF) of a random variable X is the function m_X(t)
defined by

m_X(t) = E e^{tX},

provided the expectation is finite.

In the discrete case m_X(t) = \sum_x e^{tx} p(x), and in the continuous case
m_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx.
Let us compute the moment generating function for some of the distributions we have been
working with.

Example 13.1 (Bernoulli).

m_X(t) = e^{0 \cdot t}(1 - p) + e^{1 \cdot t} p = p e^t + 1 - p.
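The definition can be checked numerically: for a fixed t, E e^{tX} is an ordinary mean, so averaging e^{tX} over simulated draws should reproduce the closed form. Below is a minimal sketch in Python (assuming NumPy is available; the values of p and t, the sample size, and the seed are arbitrary choices).

import numpy as np

rng = np.random.default_rng(0)
p, t, n = 0.3, 0.7, 200_000

x = rng.binomial(1, p, size=n)      # n draws of a Bernoulli(p) random variable
mc = np.exp(t * x).mean()           # Monte Carlo estimate of E[e^{tX}]
exact = p * np.exp(t) + 1 - p       # the closed form p e^t + 1 - p
print(mc, exact)                    # the two should agree to a few decimals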

Example 13.2 (Binomial). Write Y = X_1 + \dots + X_n, where the X_i are independent
Bernoulli random variables. Using independence,

E e^{t \sum_i X_i} = E \prod_i e^{t X_i} = \prod_i E e^{t X_i} = (p e^t + (1 - p))^n.

Equivalently, computing directly from the PMF,

\sum_{k=0}^{n} e^{tk} \binom{n}{k} p^k (1 - p)^{n-k} = \sum_{k=0}^{n} \binom{n}{k} (p e^t)^k (1 - p)^{n-k} = (p e^t + (1 - p))^n

by the binomial formula.

Example 13.3 (Poisson).

E e^{tX} = \sum_{k=0}^{\infty} e^{tk} \frac{e^{-\lambda} \lambda^k}{k!} = e^{-\lambda} \sum_{k=0}^{\infty} \frac{(\lambda e^t)^k}{k!} = e^{-\lambda} e^{\lambda e^t} = e^{\lambda(e^t - 1)}.

Example 13.4 (Exponential).

E e^{tX} = \int_0^{\infty} e^{tx} \lambda e^{-\lambda x}\, dx = \frac{\lambda}{\lambda - t}

if t < \lambda, and \infty if t \ge \lambda.

Example 13.5 (Standard normal). Suppose Z \sim N(0, 1). Then, completing the square in the exponent,

m_Z(t) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{tx} e^{-x^2/2}\, dx = e^{t^2/2} \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-(x-t)^2/2}\, dx = e^{t^2/2},

since the last integrand is the density of N(t, 1) and so integrates to 1.

Example 13.6 (General normal). Suppose X \sim N(\mu, \sigma^2). Then we can write X = \mu + \sigma Z
with Z \sim N(0, 1), and therefore

m_X(t) = E e^{tX} = e^{t\mu} E e^{t\sigma Z} = e^{t\mu} m_Z(t\sigma) = e^{t\mu} e^{(t\sigma)^2/2} = e^{t\mu + t^2\sigma^2/2}.
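The completing-the-square step can also be delegated to a computer algebra system. Here is a minimal sketch with SymPy (assuming it is installed; declaring the symbols real lets the Gaussian integral be evaluated symbolically):

import sympy as sp

x, t = sp.symbols('x t', real=True)

# E[e^{tZ}] for Z ~ N(0,1): integrate e^{tx} against the standard normal density
mgf = sp.integrate(sp.exp(t*x - x**2/2), (x, -sp.oo, sp.oo)) / sp.sqrt(2*sp.pi)

print(sp.simplify(mgf))   # expected: exp(t**2/2)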

Proposition 13.1
Suppose X_1, \dots, X_n are n independent random variables, and the random variable Y
is defined by
Y = X_1 + \dots + X_n.
Then
m_Y(t) = m_{X_1}(t) \cdot \dots \cdot m_{X_n}(t).

Proof. By independence of X_1, \dots, X_n and Proposition 12.1 we have

m_Y(t) = E\left[e^{tX_1} \cdot \dots \cdot e^{tX_n}\right] = E e^{tX_1} \cdot \dots \cdot E e^{tX_n} = m_{X_1}(t) \cdot \dots \cdot m_{X_n}(t).


Proposition 13.2
Suppose that for two random variables X and Y we have m_X(t) = m_Y(t) < \infty for all t
in an interval. Then X and Y have the same distribution.

We will not prove this, but the statement is essentially the uniqueness of the Laplace
transform L. Recall that the Laplace transform of a function f(x) defined for all positive
real numbers x is

(Lf)(s) := \int_0^{\infty} f(x) e^{-sx}\, dx.

Thus if X is a continuous random variable with a PDF such that f_X(x) = 0 for x < 0, then

E e^{tX} = \int_0^{\infty} e^{tx} f_X(x)\, dx = (Lf_X)(-t).

Proposition 13.1 allows us to show some of the properties of sums of independent random
variables that we proved or stated before.

Example 13.7 (Sums of independent normal random variables). If X \sim N(a, b^2) and
Y \sim N(c, d^2), and X and Y are independent, then by Proposition 13.1

m_{X+Y}(t) = e^{at + b^2 t^2/2}\, e^{ct + d^2 t^2/2} = e^{(a+c)t + (b^2 + d^2) t^2/2},

which is the moment generating function for N(a + c, b^2 + d^2). Therefore Proposition 13.2
implies that X + Y \sim N(a + c, b^2 + d^2).

Example 13.8 (Sums of independent Poisson random variables). Similarly, if X and Y
are independent Poisson random variables with parameters a and b, respectively, then

m_{X+Y}(t) = m_X(t) m_Y(t) = e^{a(e^t - 1)} e^{b(e^t - 1)} = e^{(a+b)(e^t - 1)},

which is the moment generating function of a Poisson with parameter a + b. Therefore
X + Y is a Poisson random variable with parameter a + b.
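This closure property is easy to confirm by simulation. The sketch below (assuming NumPy and SciPy are available; the parameters a = 2 and b = 3, the sample size, and the seed are arbitrary choices) compares the empirical distribution of X + Y with the Poisson(a + b) PMF.

import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)
a, b, n = 2.0, 3.0, 500_000

s = rng.poisson(a, n) + rng.poisson(b, n)   # X + Y with X, Y independent Poissons
for k in range(5):
    # empirical frequency of {X + Y = k} vs. the Poisson(a + b) prediction
    print(k, np.mean(s == k), poisson.pmf(k, a + b))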

One problem with the moment generating function is that it might be infinite. One way
to get around this, at the cost of considerable work, is to use the characteristic function
\varphi_X(t) = E e^{itX}, where i = \sqrt{-1}. This is always finite, and is the analogue of the
Fourier transform.
Definition (Joint MGF)
The joint moment generating function of X and Y is

m_{X,Y}(s, t) = E e^{sX + tY}.

If X and Y are independent, then

m_{X,Y}(s, t) = m_X(s) m_Y(t)

by the same argument as in the proof of Proposition 13.1. We will not prove it here, but the
converse is also true: if m_{X,Y}(s, t) = m_X(s) m_Y(t) for all s and t, then X and Y are
independent.

13.2. Further examples and applications


Example 13.9. Suppose that the MGF of X is given by m(t) = e^{3(e^t - 1)}. Find P(X = 0).

Solution: We can match this MGF to a known MGF of one of the distributions we have
considered and then apply Proposition 13.2. Observe that m(t) = e^{3(e^t - 1)} = e^{\lambda(e^t - 1)}
with \lambda = 3. Thus X \sim \mathrm{Poisson}(3), and therefore

P(X = 0) = e^{-\lambda} \frac{\lambda^0}{0!} = e^{-3}.
This example illustrates why m_X(t) is called the moment generating function. Namely, we
can use it to find all the moments of X by differentiating m(t) and then evaluating at t = 0.
Note that

m'(t) = \frac{d}{dt} E\left[e^{tX}\right] = E\left[\frac{d}{dt} e^{tX}\right] = E\left[X e^{tX}\right],

where the interchange of the derivative and the expectation can be justified when the MGF is
finite in a neighborhood of t. Now evaluate at t = 0 to get

m'(0) = E\left[X e^{0 \cdot X}\right] = E[X].


 

Similarly,

m''(t) = \frac{d}{dt} E\left[X e^{tX}\right] = E\left[X^2 e^{tX}\right],

so that

m''(0) = E\left[X^2 e^0\right] = E\left[X^2\right].

Continuing to differentiate the MGF, we have the following proposition.

Proposition 13.3 (Moments from MGF)

For all n > 0 we have

E[X^n] = m^{(n)}(0).
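Proposition 13.3 turns moment computations into routine differentiation, which a computer algebra system does happily. A minimal sketch with SymPy (assuming it is installed), applied to the exponential MGF \lambda/(\lambda - t) from Example 13.4:

import sympy as sp

t = sp.symbols('t')
lam = sp.symbols('lambda', positive=True)

m = lam / (lam - t)                      # MGF of Exp(lambda), valid for t < lambda
for n in (1, 2, 3):
    moment = sp.diff(m, t, n).subs(t, 0) # E[X^n] = m^{(n)}(0)
    print(n, sp.simplify(moment))        # expected: n!/lambda**n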

Example 13.10. Suppose X is a discrete random variable and has the MGF

m_X(t) = \frac{1}{7} e^{2t} + \frac{3}{7} e^{3t} + \frac{2}{7} e^{5t} + \frac{1}{7} e^{8t}.

What is the PMF of X? Find EX.

Solution: this does not match any of the known MGFs directly. Reading off from the MGF
we guess

\frac{1}{7} e^{2t} + \frac{3}{7} e^{3t} + \frac{2}{7} e^{5t} + \frac{1}{7} e^{8t} = \sum_{i=1}^{4} e^{t x_i} p(x_i),


then we can take p(2) = \frac{1}{7}, p(3) = \frac{3}{7}, p(5) = \frac{2}{7}, and p(8) = \frac{1}{7}. Note that these add up to 1, so
this is indeed a PMF.

To find E[X] we can use Proposition 13.3 by taking the derivative of the moment generating
function as follows:

m'(t) = \frac{2}{7} e^{2t} + \frac{9}{7} e^{3t} + \frac{10}{7} e^{5t} + \frac{8}{7} e^{8t},

so that

E[X] = m'(0) = \frac{2}{7} + \frac{9}{7} + \frac{10}{7} + \frac{8}{7} = \frac{29}{7}.
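As a cross-check, m'(0) should equal \sum_i x_i p(x_i) computed straight from the PMF; the snippet below is plain Python arithmetic.

support = [2, 3, 5, 8]
probs = [1/7, 3/7, 2/7, 1/7]

mean_from_pmf = sum(x * p for x, p in zip(support, probs))
print(mean_from_pmf, 29/7)   # both are approximately 4.142857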

Example 13.11. Suppose X has the MGF

m_X(t) = (1 - 2t)^{-1/2} \quad \text{for } t < \frac{1}{2}.

Find the first and second moments of X.

Solution: We have

m'_X(t) = -\frac{1}{2} (1 - 2t)^{-3/2} (-2) = (1 - 2t)^{-3/2},

m''_X(t) = -\frac{3}{2} (1 - 2t)^{-5/2} (-2) = 3 (1 - 2t)^{-5/2},

so that

EX = m'_X(0) = (1 - 2 \cdot 0)^{-3/2} = 1,

EX^2 = m''_X(0) = 3 (1 - 2 \cdot 0)^{-5/2} = 3.

13.3. Exercises
Exercise 13.1. Suppose that you have a fair 4-sided die, and let X be the random variable
representing the value of the number rolled.

(a) Write down the moment generating function for X.


(b) Use this moment generating function to compute the rst and second moments of X.

Exercise 13.2. Let X be a random variable whose probability density function is given by

f_X(x) = \begin{cases} e^{-2x} + \frac{1}{2} e^{-x} & x > 0, \\ 0 & \text{otherwise}. \end{cases}

(a) Write down the moment generating function for X.


(b) Use this moment generating function to compute the rst and second moments of X.

Exercise 13.3. Suppose that a mathematician determines that the revenue the UConn
Dairy Bar makes in a week is a random variable, X, with moment generating function

m_X(t) = \frac{1}{(1 - 2500t)^4}.

Find the standard deviation of the revenue the UConn Dairy Bar makes in a week.

Exercise 13.4. Let X and Y be two independent random variables with respective moment
generating functions

m_X(t) = \frac{1}{1 - 5t} \quad \text{if } t < \frac{1}{5}, \qquad m_Y(t) = \frac{1}{(1 - 5t)^2} \quad \text{if } t < \frac{1}{5}.

Find E\left[(X + Y)^2\right].

Exercise 13.5. Suppose X and Y are independent Poisson random variables with parameters
\lambda_x and \lambda_y, respectively. Find the distribution of X + Y.

Exercise 13.6. True or false? If X \sim \mathrm{Exp}(\lambda_x) and Y \sim \mathrm{Exp}(\lambda_y), then
X + Y \sim \mathrm{Exp}(\lambda_x + \lambda_y). Justify your answer.

13.4. Selected solutions


Solution to Exercise 13.1(A):

m_X(t) = E\left[e^{tX}\right] = \frac{1}{4} e^{1 \cdot t} + \frac{1}{4} e^{2 \cdot t} + \frac{1}{4} e^{3 \cdot t} + \frac{1}{4} e^{4 \cdot t} = \frac{1}{4}\left(e^{t} + e^{2t} + e^{3t} + e^{4t}\right).

Solution to Exercise 13.1(B): We have

m'_X(t) = \frac{1}{4}\left(e^{t} + 2e^{2t} + 3e^{3t} + 4e^{4t}\right),

m''_X(t) = \frac{1}{4}\left(e^{t} + 4e^{2t} + 9e^{3t} + 16e^{4t}\right),

so

EX = m'_X(0) = \frac{1}{4}(1 + 2 + 3 + 4) = \frac{5}{2}

and

EX^2 = m''_X(0) = \frac{1}{4}(1 + 4 + 9 + 16) = \frac{15}{2}.

Solution to Exercise 13.2(A): for t < 1 we have

m_X(t) = E\left[e^{tX}\right] = \int_0^{\infty} e^{tx}\left(e^{-2x} + \frac{1}{2} e^{-x}\right) dx
= \left[\frac{1}{t-2} e^{(t-2)x} + \frac{1}{2(t-1)} e^{(t-1)x}\right]_{x=0}^{x=\infty}
= 0 - \frac{1}{t-2} + 0 - \frac{1}{2(t-1)}
= \frac{1}{2-t} + \frac{1}{2(1-t)} = \frac{4 - 3t}{2(2-t)(1-t)}.

Solution to Exercise 13.2(B): We have

m'_X(t) = \frac{1}{(2-t)^2} + \frac{1}{2(1-t)^2},

m''_X(t) = \frac{2}{(2-t)^3} + \frac{1}{(1-t)^3},

and so EX = m'_X(0) = \frac{3}{4} and EX^2 = m''_X(0) = \frac{5}{4}.

Solution to Exercise 13.3: We have SD(X) = \sqrt{Var(X)} and Var(X) = EX^2 - (EX)^2.
Therefore we can compute

m'(t) = 4(2500)(1 - 2500t)^{-5},

m''(t) = 20(2500)^2 (1 - 2500t)^{-6},

EX = m'(0) = 10,000,

EX^2 = m''(0) = 125,000,000,

Var(X) = 125,000,000 - 10,000^2 = 25,000,000,

SD(X) = \sqrt{25,000,000} = 5,000.
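These derivatives are easy to get wrong by a constant factor, so a quick symbolic check is worthwhile; here is a sketch with SymPy (assuming it is installed):

import sympy as sp

t = sp.symbols('t')
m = 1 / (1 - 2500*t)**4                  # the given MGF

ex1 = sp.diff(m, t, 1).subs(t, 0)        # E[X]   -> 10000
ex2 = sp.diff(m, t, 2).subs(t, 0)        # E[X^2] -> 125000000
print(ex1, ex2, sp.sqrt(ex2 - ex1**2))   # prints 10000 125000000 5000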

Solution to Exercise 13.4: First let W = X + Y. Using that X and Y are independent, we
see that

m_W(t) = m_{X+Y}(t) = m_X(t) m_Y(t) = \frac{1}{(1 - 5t)^3}.

Recall that E[W^2] = m''_W(0), which we can find from

m'_W(t) = \frac{15}{(1 - 5t)^4},

m''_W(t) = \frac{300}{(1 - 5t)^5},

thus

E[W^2] = m''_W(0) = \frac{300}{(1 - 0)^5} = 300.
Solution to Exercise 13.5: Since X \sim \mathrm{Pois}(\lambda_x) and Y \sim \mathrm{Pois}(\lambda_y), we have

m_X(t) = e^{\lambda_x (e^t - 1)}, \qquad m_Y(t) = e^{\lambda_y (e^t - 1)}.

Then, by independence,

m_{X+Y}(t) = m_X(t) m_Y(t) = e^{\lambda_x (e^t - 1)} e^{\lambda_y (e^t - 1)} = e^{(\lambda_x + \lambda_y)(e^t - 1)}.

Thus X + Y \sim \mathrm{Pois}(\lambda_x + \lambda_y).
Solution to Exercise 13.6: We will use Proposition 13.2. Namely, we first find the MGF
of X + Y and compare it to the MGF of a random variable V \sim \mathrm{Exp}(\lambda_x + \lambda_y). The MGF
of V is

m_V(t) = \frac{\lambda_x + \lambda_y}{\lambda_x + \lambda_y - t} \quad \text{for } t < \lambda_x + \lambda_y.

By independence of X and Y,

m_{X+Y}(t) = m_X(t) m_Y(t) = \frac{\lambda_x}{\lambda_x - t} \cdot \frac{\lambda_y}{\lambda_y - t},

but

\frac{\lambda_x + \lambda_y}{\lambda_x + \lambda_y - t} \neq \frac{\lambda_x}{\lambda_x - t} \cdot \frac{\lambda_y}{\lambda_y - t},

and hence the statement is false.
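A single numerical point suffices to exhibit the mismatch; with the arbitrary choices \lambda_x = 1, \lambda_y = 2, and t = 0.5, a few lines of Python give:

lx, ly, t = 1.0, 2.0, 0.5

lhs = (lx + ly) / (lx + ly - t)           # MGF of Exp(lx + ly) at t
rhs = (lx / (lx - t)) * (ly / (ly - t))   # MGF of X + Y at t
print(lhs, rhs)                           # 1.2 vs 2.666..., so they differ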
