Moment generating functions
of the probability mass function f(x) = P(X = x) as

m(t) = E(e^{tX}) = Σ_x e^{tx} f(x).

For continuous distributions, the moment generating function can be expressed in terms of the probability density function f as

m(t) = E(e^{tX}) = ∫ e^{tx} f_X(x) dx.

Properties of moment generating functions.

Translation. If Y = X + a, then

m_Y(t) = e^{at} m_X(t).

Proof: m_Y(t) = E(e^{Yt}) = E(e^{(X+a)t}) = E(e^{Xt} e^{at}) = e^{at} E(e^{Xt}) = e^{at} m_X(t). q.e.d.

Scaling. If Y = bX, then

m_Y(t) = m_X(bt).

Proof: m_Y(t) = E(e^{Yt}) = E(e^{(bX)t}) = E(e^{X(bt)}) = m_X(bt). q.e.d.

Standardizing. From the last two properties, if

X* = (X − μ)/σ

is the standardized random variable for X, then

m_{X*}(t) = e^{−μt/σ} m_X(t/σ).

Proof: First translate by −μ to get

m_{X−μ}(t) = e^{−μt} m_X(t).

Then scale that by a factor of 1/σ to get

m_{(X−μ)/σ}(t) = m_{X−μ}(t/σ) = e^{−μt/σ} m_X(t/σ). q.e.d.

Convolution. If X and Y are independent variables, and Z = X + Y, then

m_Z(t) = m_X(t) m_Y(t).

Proof: m_Z(t) = E(e^{Zt}) = E(e^{(X+Y)t}) = E(e^{Xt} e^{Yt}). Now, since X and Y are independent, so are e^{Xt} and e^{Yt}. Therefore, E(e^{Xt} e^{Yt}) = E(e^{Xt}) E(e^{Yt}) = m_X(t) m_Y(t). q.e.d.

Note that this property of convolution on moment generating functions implies that for a sample sum S_n = X_1 + X_2 + ⋯ + X_n, the moment generating function is

m_{S_n}(t) = (m_X(t))^n.

We can couple that with the standardizing property to determine the moment generating function for the standardized sum

S_n* = (S_n − nμ)/(σ√n).

Since the mean of S_n is nμ and its standard deviation is σ√n, when it is standardized we get

m_{S_n*}(t) = e^{−nμt/(σ√n)} m_{S_n}(t/(σ√n))
            = e^{−√n μt/σ} m_{S_n}(t/(σ√n))
            = e^{−√n μt/σ} (m_X(t/(σ√n)))^n.

We'll use this result when we prove the central limit theorem.

Some examples. The same definitions for these apply to discrete distributions and continuous ones. That is, if X is any random variable, then its nth moment is

μ_n = E(X^n)

and its moment generating function is

m_X(t) = E(e^{tX}) = Σ_{k=0}^∞ (μ_k/k!) t^k.

The only difference is that when you compute them, you use sums for discrete distributions but integrals for continuous ones. That's because expectation is defined in terms of sums or integrals in the two cases. Thus, in the continuous case

μ_n = E(X^n) = ∫ x^n f_X(x) dx
and

m(t) = E(e^{tX}) = ∫ e^{tx} f_X(x) dx.

We'll look at one example of a moment generating function for a discrete distribution and three for continuous distributions.

The moment generating function for a geometric distribution. Recall that when independent Bernoulli trials are repeated, each with probability p of success, the time X it takes to get the first success has a geometric distribution:

P(X = j) = q^{j−1} p, for j = 1, 2, ....

Let's compute the generating function for the geometric distribution.

m(t) = Σ_{j=1}^∞ e^{tj} q^{j−1} p = (p/q) Σ_{j=1}^∞ e^{tj} q^j

The series Σ_{j=1}^∞ e^{tj} q^j = Σ_{j=1}^∞ (e^t q)^j is a geometric series with sum e^t q / (1 − e^t q), provided e^t q < 1. Therefore,

m(t) = (p/q) · e^t q / (1 − e^t q) = p e^t / (1 − q e^t).

The moment generating function for a uniform distribution on [0, 1]. Let X be uniform on [0, 1] so that the probability density function f_X has the value 1 on [0, 1] and 0 outside this interval. Let's first compute the moments.

μ_n = E(X^n) = ∫ x^n f_X(x) dx = ∫_0^1 x^n dx = [x^{n+1}/(n+1)]_0^1 = 1/(n+1)

Next, let's compute the moment generating function.

m(t) = ∫ e^{tx} f_X(x) dx = ∫_0^1 e^{tx} dx = [e^{tx}/t]_0^1 = (e^t − 1)/t

Note that the expression for m(t) does not allow t = 0 since there is a t in the denominator. Still, m(0) can be evaluated by using power series or L'Hôpital's rule.
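Both closed forms above, p e^t / (1 − q e^t) for the geometric distribution and (e^t − 1)/t for the uniform distribution, can be checked numerically. A short Python sketch (not part of the original notes; the cutoffs and tolerances are arbitrary choices):

```python
import math

# Sketch (not from the notes): check both closed forms numerically.
# Cutoffs and tolerances below are arbitrary choices.

# Geometric: m(t) = p e^t / (1 - q e^t), valid when q e^t < 1.
p, t = 0.4, 0.2
q = 1 - p
assert q * math.exp(t) < 1                  # inside region of convergence
series = sum(math.exp(t * j) * q**(j - 1) * p for j in range(1, 2000))
closed = p * math.exp(t) / (1 - q * math.exp(t))
assert abs(series - closed) < 1e-9

# Uniform on [0, 1]: m(t) = (e^t - 1)/t, checked with a midpoint sum.
t = 1.5
N = 100000
riemann = sum(math.exp(t * (i + 0.5) / N) for i in range(N)) / N
assert abs(riemann - (math.exp(t) - 1) / t) < 1e-6
```

The truncated series converges quickly because (q e^t)^j decays geometrically, which is exactly the convergence condition noted above.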
The moment generating function for an exponential distribution. Let X have an exponential distribution with parameter λ, so that its probability density function is f_X(x) = λe^{−λx} for x ≥ 0. Let's compute its moment generating function.

m(t) = ∫ e^{tx} f_X(x) dx
     = ∫_0^∞ e^{tx} λe^{−λx} dx
     = λ ∫_0^∞ e^{(t−λ)x} dx
     = λ [e^{(t−λ)x}/(t−λ)]_0^∞
     = λ (lim_{x→∞} e^{(t−λ)x}/(t−λ) − e^0/(t−λ))

Now if t < λ, then the limit in the last line is 0, so in that case

m(t) = λ/(λ − t).

This is a minor, yet important point. The moment generating function doesn't have to be defined for all t. We only need it to be defined for t near 0 because we're only interested in its derivatives evaluated at 0.

The moment generating function for the standard normal distribution. Let Z be a random variable with a standard normal distribution. Its probability density function is

f_Z(x) = (1/√(2π)) e^{−x²/2}.

Its moments can be computed from the definition, but it takes repeated applications of integration by parts to compute

μ_n = (1/√(2π)) ∫ x^n e^{−x²/2} dx.

The odd moments are all 0, and the even moments turn out to be

μ_{2m} = (2m)!/(2^m m!).

From these values of all the moments, we can compute the moment generating function.

m(t) = Σ_{n=0}^∞ (μ_n/n!) t^n
     = Σ_{m=0}^∞ (μ_{2m}/(2m)!) t^{2m}
     = Σ_{m=0}^∞ ((2m)!/(2^m m!)) (1/(2m)!) t^{2m}
     = Σ_{m=0}^∞ (1/(2^m m!)) t^{2m}
     = Σ_{m=0}^∞ (t²/2)^m / m!
     = e^{t²/2}

Thus, the moment generating function for the standard normal distribution Z is

m_Z(t) = e^{t²/2}.

More generally, if X = σZ + μ is a normal distribution with mean μ and variance σ², then the moment generating function is

m_X(t) = exp(μt + σ²t²/2).

Math 217 Home Page at
http://math.clarku.edu/~djoyce/ma217/
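As a numerical appendix (not part of the original notes), the two remaining closed forms, λ/(λ − t) for the exponential distribution and e^{t²/2} for the standard normal, can be checked by integrating over a truncated range. The cutoffs and tolerances here are arbitrary choices, so this is a sanity check, not a proof.

```python
import math

# Sketch (not from the notes): check m(t) = λ/(λ - t) for the
# exponential distribution and m_Z(t) = e^{t²/2} for the standard
# normal by integrating over a truncated range.

def integrate(f, a, b, n=100000):
    """Composite midpoint rule on [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

lam, t = 2.0, 0.5                            # requires t < λ
expo = integrate(lambda x: math.exp(t * x) * lam * math.exp(-lam * x), 0, 50)
assert abs(expo - lam / (lam - t)) < 1e-6

t = 0.7
normal = integrate(lambda x: math.exp(t * x) * math.exp(-x * x / 2)
                   / math.sqrt(2 * math.pi), -12, 12)
assert abs(normal - math.exp(t * t / 2)) < 1e-6
```

The truncation works because both integrands decay rapidly: the exponential integrand like e^{(t−λ)x} for t < λ, and the normal integrand like e^{−x²/2}.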