Classical Method of Moments (KTH)


2. Classical Method of Moments

Let θ be an m-vector of parameters that characterize the distribution of the random variable y. The kth moment (provided it exists) is defined as

    μ_k(θ) = E[y^k].

Suppose we have a sample of size T from y with observations y_1, y_2, ..., y_T (considered as T independent random variables). The corresponding sample moments are

    μ̂_k = (1/T) Σ_{t=1}^T y_t^k.

With the method of moments one estimates the components of θ by simply equating the first m population moments μ_k(θ) with the corresponding sample moments μ̂_k, and solving for the components of the parameter vector θ.
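The moment-matching recipe above is easy to sketch numerically. The helper `sample_moment` below is illustrative, not a function from the text:

```python
def sample_moment(ys, k):
    """k-th sample moment: (1/T) * sum_{t=1}^{T} y_t^k."""
    T = len(ys)
    return sum(y ** k for y in ys) / T

# A tiny hypothetical sample, just to show the computation:
ys = [1.0, 2.0, 3.0, 4.0]
m1 = sample_moment(ys, 1)  # first sample moment: 2.5
m2 = sample_moment(ys, 2)  # second sample moment: 7.5
```

Equating m1, m2, ... with the first m population moments μ_k(θ) then gives m equations in the m unknown components of θ.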

Example 2.1. Normal distribution: y_1, ..., y_T a sample from N(μ, σ²). Then θ = (μ, σ²),

    μ_1(θ) = E[y] = μ,

and, from

    σ² = Var[y] = E[(y − μ)²] = E[y²] − μ²,

we have

    μ_2(θ) = E[y²] = σ² + μ².

Equating these with the sample moments, we have

    μ_1(θ̂) = (1/T) Σ_{t=1}^T y_t   and   μ_2(θ̂) = (1/T) Σ_{t=1}^T y_t².

That is,

    μ̂ = (1/T) Σ_{t=1}^T y_t = ȳ   and   μ̂² + σ̂² = (1/T) Σ_{t=1}^T y_t².

Rearranging terms, we get for σ̂²

    σ̂² = (1/T) Σ_{t=1}^T y_t² − ȳ².

Recalling that Σ_{t=1}^T (y_t − ȳ)² = Σ_{t=1}^T y_t² − T ȳ², we finally get

    μ̂ = ȳ   and   σ̂² = (1/T) Σ_{t=1}^T (y_t − ȳ)²

as the MM estimators of μ and σ².
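Example 2.1 can be checked with a short numerical sketch (hypothetical data; note the divisor T rather than the usual T − 1 of the unbiased variance estimator):

```python
def mm_normal(ys):
    """MM estimators for N(mu, sigma^2): the sample mean, and the
    (1/T)-scaled sum of squared deviations (divisor T, not T-1)."""
    T = len(ys)
    mu_hat = sum(ys) / T
    sigma2_hat = sum((y - mu_hat) ** 2 for y in ys) / T
    return mu_hat, sigma2_hat

ys = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # hypothetical sample
mu_hat, sigma2_hat = mm_normal(ys)  # (5.0, 4.0)

# The identity used in the derivation:
# sum (y_t - ybar)^2 = sum y_t^2 - T * ybar^2
T = len(ys)
lhs = sum((y - mu_hat) ** 2 for y in ys)
rhs = sum(y * y for y in ys) - T * mu_hat ** 2
assert abs(lhs - rhs) < 1e-9
```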

Example 2.2. The t-distribution. Suppose y is a random variable following a t-distribution with ν degrees of freedom. The density is then

    f(y_t; ν) = Γ((ν + 1)/2) / (√(νπ) Γ(ν/2)) · [1 + y_t²/ν]^(−(ν + 1)/2),

where

    Γ(x) = ∫_0^∞ t^(x − 1) e^(−t) dt

is the gamma function, with the property that, if n is an integer (≥ 0), then Γ(n + 1) = n! = n(n − 1) ⋯ 2 · 1; note that by definition 0! = 1.

From the sample y_1, ..., y_T, the MM estimator is obtained as follows. Provided that ν > 2, the expected value of a t-distributed random variable is zero (E[Y_t] = 0) and its variance is

    σ² = E[(Y_t − E[Y_t])²] = E[Y_t²] = ν / (ν − 2),

which at the same time is the second (population) moment of the distribution. Equate this with the second sample moment σ̂² = (1/T) Σ_{t=1}^T y_t² and solve for ν to obtain

    ν̂ = 2σ̂² / (σ̂² − 1),

provided that σ̂² > 1. Otherwise the estimate does not exist.
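The solved expression for ν̂ translates directly into code. A minimal sketch, with a hypothetical sample chosen so that σ̂² = 3:

```python
def mm_t_df(ys):
    """MM estimate of the t degrees of freedom:
    nu_hat = 2*s2 / (s2 - 1), where s2 = (1/T) * sum y_t^2.
    The estimate exists only when s2 > 1; return None otherwise."""
    T = len(ys)
    s2 = sum(y * y for y in ys) / T
    if s2 <= 1.0:
        return None  # estimate does not exist
    return 2.0 * s2 / (s2 - 1.0)

ys = [1.0, -1.0, 2.0, -2.0, 2.0, -2.0]  # hypothetical sample, s2 = 3.0
nu_hat = mm_t_df(ys)  # 2*3/(3-1) = 3.0
```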

The moment estimators are functions of averages of random variables. Thus the Law of Large Numbers implies that the sample moments converge to their theoretical counterparts, implying consistency. Furthermore, in many cases the CLT implies that the asymptotic distributions of MM estimators are normal.
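The consistency claim can be illustrated by simulation (a sketch using the standard library's `random.gauss` with a fixed seed and hypothetical true parameters, not part of the text): as T grows, the MM estimates for a normal sample settle near the true values.

```python
import random

random.seed(0)
true_mu, true_sigma2 = 2.0, 9.0  # hypothetical true parameters

for T in (100, 100_000):
    ys = [random.gauss(true_mu, true_sigma2 ** 0.5) for _ in range(T)]
    mu_hat = sum(ys) / T
    sigma2_hat = sum((y - mu_hat) ** 2 for y in ys) / T
    print(T, round(mu_hat, 3), round(sigma2_hat, 3))
# For T = 100_000 the estimates cluster tightly around (2.0, 9.0),
# consistent with the LLN argument above.
```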

