
Extremum Estimators

Walter Sosa-Escudero
Econ 507. Econometric Analysis. Spring 2009

May 5, 2009


Motivation

A general class of estimators that includes all cases studied in this course.

Structure: an estimator is defined, its asymptotic properties are studied (consistency, asymptotic normality, efficiency). An asymptotic variance is obtained together with asymptotic normality. A consistent estimator for it is proposed.

Reference: Newey and McFadden (1994) provide a detailed and more general treatment.


Extremum estimators

Definition: $\hat{\theta}$ is an extremum estimator if there is an objective function $\hat{Q}_n(\theta)$ such that
$$\hat{\theta} = \operatorname*{argmax}_{\theta \in \Theta} \hat{Q}_n(\theta)$$

The function $\hat{Q}_n(\theta)$ depends on a sample of size $n$. $\Theta$ is the parameter space.


Particular cases

1) Maximum likelihood

$z_1, \ldots, z_n$ an iid sample with density function $f(z; \theta_0)$. Take
$$\hat{Q}_n(\theta) = \frac{1}{n} \sum_{i=1}^{n} \ln f(z_i; \theta)$$
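A minimal numerical sketch, not from the slides: it treats the MLE of a normal mean as an extremum estimator by maximizing $\hat{Q}_n$ over a bounded interval; the simulated sample and the SciPy optimizer are assumptions made for illustration.

```python
# Minimal sketch (assumed setup): MLE of mu for z ~ N(mu, 1) as an extremum estimator.
# We maximize Q_n(mu) = (1/n) sum_i log f(z_i; mu) by minimizing its negative.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(0)
z = rng.normal(loc=2.0, scale=1.0, size=500)      # sample with mu_0 = 2 (assumed DGP)

def Q_n(mu):
    return np.mean(norm.logpdf(z, loc=mu, scale=1.0))   # average log-likelihood

res = minimize_scalar(lambda mu: -Q_n(mu), bounds=(-10, 10), method="bounded")
print(res.x, z.mean())   # the maximizer coincides with the sample mean, the analytic MLE
```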


2) Non-linear least squares and OLS

Let $z_i = (y_i, x_i)$ and $E(y|x) = h(x; \theta_0)$. Then take
$$\hat{Q}_n(\theta) = -\frac{1}{n} \sum_{i=1}^{n} \left(y_i - h(x_i; \theta)\right)^2$$

This is the NLS estimator. Obviously the OLS estimator corresponds to the case $h(x, \theta_0) = x'\theta_0$.
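A hedged sketch, not part of the slides: with $h(x; \theta) = x'\theta$ the NLS criterion is the OLS sum of squares, so a numerical minimizer should reproduce the OLS estimate; the simulated design below is an assumption.

```python
# Minimal sketch (assumed design): NLS with h(x; theta) = x'theta reduces to OLS.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
theta_true = np.array([1.0, 2.0])
y = X @ theta_true + rng.normal(size=n)

def sum_of_squares(theta):
    # n times the negative of Q_n(theta): the NLS criterion to be minimized
    return np.sum((y - X @ theta) ** 2)

theta_nls = minimize(sum_of_squares, x0=np.zeros(2)).x
theta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
print(theta_nls, theta_ols)   # equal up to optimizer tolerance
```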


GMM and TSLS


3) GMM, IV and TSLS

Suppose there exists a vector of functions $g(z; \theta)$ such that $E(g(z; \theta_0)) = 0$, and a psd matrix $\hat{W}$. Then, take
$$\hat{Q}_n(\theta) = -\left[\frac{1}{n}\sum_{i=1}^{n} g(z_i; \theta)\right]' \hat{W} \left[\frac{1}{n}\sum_{i=1}^{n} g(z_i; \theta)\right]$$

This is the GMM estimator. For the IV case in the linear model, take
$$g(z, \theta) = z(y - x'\theta)$$
where $z$ is the vector of instruments. The TSLS estimator corresponds to $\hat{W} = \left(\frac{1}{n}\sum_{i=1}^{n} z_i z_i'\right)^{-1}$.
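As a hedged illustration, not taken from the slides: the sketch below forms the linear IV moment $g(z, \theta) = z(y - x'\theta)$ and minimizes the quadratic form with the TSLS weight matrix; the data-generating process is an assumption.

```python
# Minimal sketch (assumed DGP): linear GMM with instruments z and TSLS weight matrix.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n = 1000
z = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])    # instruments (p = 3)
v = rng.normal(size=n)
x = np.column_stack([np.ones(n), z[:, 1] + z[:, 2] + v])      # endogenous regressor (K = 2)
y = x @ np.array([1.0, 0.5]) + v + rng.normal(size=n)         # error correlated with x

W = np.linalg.inv(z.T @ z / n)                                # TSLS weight matrix

def gmm_criterion(theta):
    gbar = z.T @ (y - x @ theta) / n                          # sample moment (1/n) sum g(z_i; theta)
    return gbar @ W @ gbar                                    # minimizing this maximizes Q_n = -gbar' W gbar

theta_gmm = minimize(gmm_criterion, x0=np.zeros(2)).x
print(theta_gmm)                                              # close to (1, 0.5) for large n
```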


Consistency

A general consistency result: if there is a function $Q_0(\theta)$ such that

1. $Q_0(\theta)$ is uniquely maximized at $\theta_0$,
2. $\Theta$ is compact,
3. $Q_0(\theta)$ is continuous,
4. $\hat{Q}_n(\theta)$ converges uniformly in probability to $Q_0(\theta)$,

then $\hat{\theta} \xrightarrow{p} \theta_0$.

Proof: we did it when we proved consistency of the MLE estimator (the last step).


Consistency: the movie

$z \sim N(\mu, 1)$, $\mu_0 = 2$
$$Q_0(\mu) = E(l(\mu)) \propto -0.5\,(5 - 4\mu + \mu^2)$$
$$\hat{Q}_n(\mu) \propto -0.5\left(n^{-1}\sum z_i^2 - 2\bar{z}\mu + \mu^2\right)$$
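A small simulation sketch of the "movie", added for illustration only: it overlays $\hat{Q}_n(\mu)$ for growing $n$ on the limit $Q_0(\mu)$; grid, seed, and plotting choices are assumptions.

```python
# Minimal sketch (assumed setup): uniform convergence of Q_n(mu) to Q_0(mu), mu_0 = 2.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)
mu = np.linspace(-1, 5, 200)
Q0 = -0.5 * (5 - 4 * mu + mu**2)                       # limit objective (constant dropped)

for n in (50, 100, 200, 2500):
    z = rng.normal(loc=2.0, scale=1.0, size=n)
    Qn = -0.5 * (np.mean(z**2) - 2 * z.mean() * mu + mu**2)
    plt.plot(mu, Qn, label=f"n = {n}")

plt.plot(mu, Q0, "k--", label="Q_0")
plt.axvline(2.0, color="gray", linestyle=":")          # true value mu_0 = 2
plt.legend()
plt.show()
```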


[Figures: $\hat{Q}_n(\mu)$ plotted against $Q_0(\mu)$ for n = 50, 100, 200, 2500, approaching the limit as n grows.]

Discussion
1. (Unique maximizer at true value) This is usually an identification assumption.

2. (Compactness) This implies a bounded parameter space; it is a restrictive assumption.

3. (Continuity) Usually a consequence of 4).

4. (Uniform convergence) Typically implies imposing primitive conditions to use a uniform LLN (moment existence, sampling).


Identification
MLE: information inequality: if $\theta \neq \theta_0$ implies $f(\theta) \neq f(\theta_0)$, then $E(l(\theta))$ is uniquely maximized by $\theta_0$.

NLS: the limiting function is $E\left[(y - h(x, \theta))^2\right]$. By the properties of conditional expectations, this is minimized by the conditional expectation, in this case $h(x, \theta_0)$; then for identification in NLS we need
$$\theta \neq \theta_0 \ \text{implies}\ h(x, \theta) \neq h(x, \theta_0)$$

IV/GMM in the linear model: the rank condition guarantees identification: $E(z_i x_i') = \Sigma_{zx}$ is a $p \times K$ matrix that exists, is finite, and has full column rank.

Non-linear GMM: recall the global vs. local identification discussion.
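To spell out the NLS identification claim (a standard step, added here for clarity): decompose the limiting criterion around the true regression function,
$$E\left[(y - h(x,\theta))^2\right] = E\left[(y - h(x,\theta_0))^2\right] + E\left[(h(x,\theta_0) - h(x,\theta))^2\right],$$
where the cross term vanishes by the law of iterated expectations because $E\left[y - h(x,\theta_0) \mid x\right] = 0$. The first term does not depend on $\theta$, so the minimum is attained at $\theta_0$ and it is unique precisely when $\theta \neq \theta_0$ implies $h(x,\theta) \neq h(x,\theta_0)$ with positive probability.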


Uniform convergence and continuity

A general uniform LLN: if the data are i.i.d., $\Theta$ is compact, $a(z_i, \theta)$ is a continuous function of $\theta$ at each $\theta \in \Theta$ w.p.1, and there is $d(z)$ with $\|a(z, \theta)\| \leq d(z)$ for all $\theta \in \Theta$ and $E[d(z)] < \infty$, then $E[a(z, \theta)]$ is continuous and $n^{-1}\sum_{i=1}^{n} a(z_i, \theta)$ converges uniformly in probability to $E(a(z, \theta))$.


Example: Consistency of MLE: Under 1) $Z_i$, $i = 1, \ldots, n$, iid with density $f(z_i; \theta_0)$, 2) $\theta \neq \theta_0 \Rightarrow f(z_i; \theta) \neq f(z_i; \theta_0)$, 3) $\Theta$ a compact set, 4) $\ln f(z_i; \theta)$ continuous at each $\theta \in \Theta$ w.p.1, and 5) $E\left[\sup_{\theta \in \Theta} |\ln f(z; \theta)|\right] < \infty$, we have $\hat{\theta} \xrightarrow{p} \theta_0$.

Detect which of these conditions are required for identification and which ones for uniform convergence and continuity. Perform a similar check for the non-linear GMM framework.


Asymptotic Normality
Assume the conditions for consistency hold, and add the following conditions:

1. $\theta_0$ is an interior point of $\Theta$.
2. $\hat{Q}_n(\theta)$ is twice continuously differentiable in a neighborhood $N$ of $\theta_0$.
3. $\sqrt{n}\,\nabla_\theta \hat{Q}_n(\theta_0) \xrightarrow{d} N(0, \Sigma)$.
4. There is $H(\theta)$ continuous at $\theta_0$ such that $\nabla_{\theta\theta} \hat{Q}_n(\theta)$ converges uniformly in probability to $H(\theta)$.
5. $H \equiv H(\theta_0)$ is non-singular.

Then
$$\sqrt{n}\,(\hat{\theta} - \theta_0) \xrightarrow{d} N\left(0, H^{-1} \Sigma H^{-1}\right)$$
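A familiar special case, worked out here for illustration (standard MLE algebra, not an extra slide): for maximum likelihood under correct specification, the information matrix equality gives $\Sigma = E\left[s(z;\theta_0)s(z;\theta_0)'\right] = I(\theta_0)$ and $H = E\left[\nabla_{\theta\theta} \ln f(z;\theta_0)\right] = -I(\theta_0)$, so the sandwich collapses:
$$H^{-1}\Sigma H^{-1} = \left(-I(\theta_0)\right)^{-1} I(\theta_0) \left(-I(\theta_0)\right)^{-1} = I(\theta_0)^{-1},$$
recovering the usual result $\sqrt{n}\,(\hat{\theta} - \theta_0) \xrightarrow{d} N\left(0, I(\theta_0)^{-1}\right)$.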


Proof:
Conditions 1-3 imply that $\hat{\theta}$ satisfies the FOCs
$$\nabla_\theta \hat{Q}_n(\hat{\theta}) = 0$$
Take a mean value expansion around $\theta_0$ and solve to get
$$\sqrt{n}\,(\hat{\theta} - \theta_0) = -\bar{H}(\bar{\theta})^{-1}\,\sqrt{n}\,\nabla_\theta \hat{Q}_n(\theta_0)$$
with $\bar{H}(\theta) \equiv \nabla_{\theta\theta} \hat{Q}_n(\theta)$ and $\bar{\theta}$ between $\hat{\theta}$ and $\theta_0$. Since $\bar{H}(\theta)$ converges uniformly in probability to $H(\theta)$, and since $\bar{\theta} \xrightarrow{p} \theta_0$, then
$$\bar{H}(\bar{\theta}) \xrightarrow{p} H(\theta_0)$$
and by continuity of matrix inversion $\bar{H}(\bar{\theta})^{-1} \xrightarrow{p} H(\theta_0)^{-1}$. The result follows from 3) and Slutzky's Theorem.
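For completeness, the mean value step spelled out (a standard intermediate equation, added here): expanding the FOCs element by element around $\theta_0$,
$$0 = \nabla_\theta \hat{Q}_n(\hat{\theta}) = \nabla_\theta \hat{Q}_n(\theta_0) + \bar{H}(\bar{\theta})\,(\hat{\theta} - \theta_0),$$
with $\bar{\theta}$ between $\hat{\theta}$ and $\theta_0$; multiplying by $\sqrt{n}$ and solving, using that $\bar{H}(\bar{\theta})$ is non-singular with probability approaching one, yields the expression above.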


Discussion
AN is driven mostly by AN of the first derivatives.

The result starts by linearizing the FOCs and then solving for the normalized estimator.

This produces two factors, one that does not explode (related to the inverse of the second derivatives) and another that is asymptotically normal (the first derivatives).

Slutzky's theorem implies normality with a sandwich-type asymptotic variance.

