Estimadores Extremos: Cristine Campos de Xavier Pinto


Estimadores Extremos
GMM
Cristine Campos de Xavier Pinto
CEDEPLAR/UFMG
May 2010
In most econometric methods, we impose the restriction that the parameter vector $\theta_0 \in \Theta \subseteq \mathbb{R}^p$ satisfies the unconditional moment restrictions

$$g_0(\theta_0) = E[g(Z_i, \theta_0)] = 0 \qquad (1)$$

where $g(Z_i, \theta)$ is an $r \times 1$ vector of functions, $\theta_0$ is a $p \times 1$ vector of parameters, and $\theta_0$ is the unique solution of this system of equations.
The method of moments estimator $\hat{\theta}_{GMM}$ solves the sample analog of this system of equations.

To get the sample analog, replace the population expectation by a sample average.

Order condition: one of the conditions needed for a unique $\theta_0$ is $r \geq p$.

We have two cases:
1. just-identified case: $r = p$
2. overidentified case: $r > p$
When $\theta_0$ is identified and $r = p$, the method of moments estimator is the solution of the system of $p$ sample moment equations:

$$g_N(\hat{\theta}_{MM}) = \frac{1}{N}\sum_{i=1}^{N} g(Z_i, \hat{\theta}_{MM}) = 0$$

Assumption GMM 1: $\{Z_i\}_{i=1}^{N}$ is a random sample that satisfies the moment conditions above.

Most methods in econometrics are examples of the GMM estimator.
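The sketch below is a minimal numerical illustration of solving the $p$ sample moment equations in the just-identified case; the gamma data, the mean/variance moment functions, and the use of scipy.optimize.root are my own assumptions, not part of the slides.

```python
import numpy as np
from scipy import optimize

# A minimal sketch (toy model, not from the slides): just-identified method of
# moments with r = p = 2, using g(Z_i, theta) = ( Z_i - mu , (Z_i - mu)^2 - sigma2 )'.

rng = np.random.default_rng(0)
Z = rng.gamma(shape=2.0, scale=1.5, size=1_000)

def g_N(theta):
    """Sample moments (1/N) sum_i g(Z_i, theta)."""
    mu, sigma2 = theta
    return np.array([np.mean(Z - mu), np.mean((Z - mu) ** 2 - sigma2)])

theta_mm = optimize.root(g_N, x0=np.array([1.0, 1.0])).x  # solves g_N(theta) = 0
print(theta_mm)  # matches the sample mean and the (uncorrected) sample variance
```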
Example 1: Maximum Likelihood Estimator

$\{X_i, Y_i\}_{i=1}^{N}$ is a random sample with conditional density $f(y \mid x, \theta)$.

Likelihood function:
$$L(\theta \mid Z) = \prod_{i=1}^{N} f(y_i \mid x_i; \theta)$$

Log-likelihood function:
$$l(\theta \mid Z) = \sum_{i=1}^{N} \log f(y_i \mid x_i; \theta)$$

MLE: maximizes the log-likelihood function.
In the population, the MLE satisfies the "score identity"
$$E[g(z_i, \theta_0)] = 0, \qquad g(z_i, \theta) = \frac{\partial \log f(z_i \mid \theta)}{\partial \theta}.$$

The sample analog is
$$\frac{1}{N}\sum_{i=1}^{N} \frac{\partial \log f(z_i \mid \theta)}{\partial \theta}\bigg|_{\theta = \hat{\theta}_{ML}} = 0$$
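As a minimal sketch of the score identity as a moment condition, the example below assumes an exponential model $f(y \mid \theta) = \theta e^{-\theta y}$ (my own choice, not from the slides), whose score is $1/\theta - y$; the root of the sample score is $1/\bar{y}$.

```python
import numpy as np
from scipy import optimize

# A minimal sketch (assumed exponential model): the MLE as the root of the
# sample score (1/N) sum_i (1/theta - y_i) = 0, solved by theta = 1 / ybar.

rng = np.random.default_rng(1)
y = rng.exponential(scale=0.5, size=1_000)   # true theta_0 = 2

def sample_score(theta):
    return np.mean(1.0 / theta - y)

theta_ml = optimize.brentq(sample_score, 1e-6, 1e3)  # root of the sample score
print(theta_ml, 1.0 / y.mean())                      # the two coincide
```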
Example 2: Ordinary Least Squares Estimator

Random sample $\{Y_i, X_i\}_{i=1}^{N}$ that follows the linear population model
$$Y_i = X_i'\beta + \varepsilon_i$$

We assume that
$$E[\varepsilon_i X_i] = E[(Y_i - X_i'\beta) X_i] = 0$$
and that $E[X_i X_i']$ has full rank.

Sample analog:
$$\frac{1}{N}\sum_{i=1}^{N} (Y_i - X_i'\hat{\beta}) X_i = 0$$
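A minimal sketch of this sample analog follows; the simulated design and true coefficients are my own assumptions. The $p$ moment equations reduce to the familiar closed form $\hat{\beta} = (X'X)^{-1}X'Y$.

```python
import numpy as np

# A minimal sketch (simulated data): the OLS estimator as the solution of
#   (1/N) sum_i (Y_i - X_i' beta) X_i = 0   <=>   beta_hat = (X'X)^{-1} X'Y.

rng = np.random.default_rng(2)
N, p = 500, 3
X = rng.normal(size=(N, p))
beta_true = np.array([1.0, -0.5, 2.0])
Y = X @ beta_true + rng.normal(size=N)

beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)  # solves the p sample moment equations
print(beta_hat)
```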
Example 3: Instrumental Variables Estimator

Random sample $\{Y_i, X_i, W_i\}_{i=1}^{N}$ that follows the linear model
$$Y_i = X_i'\beta + \varepsilon_i$$
with
$$E[\varepsilon_i W_i] = E[(Y_i - X_i'\beta) W_i] = 0$$
where $W_i \in \mathbb{R}^r$ and $E[W_i X_i']$ has full rank.

Sample analog:
$$\frac{1}{N}\sum_{i=1}^{N} (Y_i - X_i'\hat{\beta}_{IV}) W_i = 0
\;\Longrightarrow\;
\hat{\beta}_{IV} = \left(\frac{1}{N}\sum_{i=1}^{N} W_i X_i'\right)^{-1}\left(\frac{1}{N}\sum_{i=1}^{N} W_i Y_i\right)$$
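Below is a minimal sketch of this just-identified IV formula; the simulated endogenous design (one regressor, one instrument) is my own assumption.

```python
import numpy as np

# A minimal sketch (simulated data): just-identified IV from the sample moment
# condition (1/N) sum_i (Y_i - X_i' beta) W_i = 0  =>  beta_iv = (W'X)^{-1} W'Y.

rng = np.random.default_rng(3)
N = 1_000
W = rng.normal(size=(N, 1))            # one instrument
u = rng.normal(size=N)
X = 0.8 * W + u.reshape(N, 1)          # endogenous regressor
eps = 0.5 * u + rng.normal(size=N)     # correlated with X but not with W
Y = 1.5 * X[:, 0] + eps

beta_iv = np.linalg.solve(W.T @ X, W.T @ Y)
print(beta_iv)                         # close to the true coefficient 1.5
```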
When the system of equations is overidentified, $r > p$, in general no solution to the sample analog of (1) exists.

We use an analog estimator that makes $g_N(\theta)$ as close to zero as possible:
$$\hat{\theta}_{GMM} = \arg\min_{\theta} Q_N(\theta) \qquad (2)$$
where
$$Q_N(\theta) = [g_N(\theta)]' \hat{W}_N\, g_N(\theta)$$
and $\hat{W}_N$ is a positive semidefinite matrix such that $\hat{W}_N \xrightarrow{p} W_0$.

All the large-sample properties of $\hat{\theta}_{GMM}$ depend on the limiting weight matrix $W_0$.

What is $W_0$?
We can use a nonstochastic weight matrix, such as $\hat{W}_N = I_r$, the $r \times r$ identity matrix, or we can use a stochastic weight matrix.

The optimal weight matrix is the one that minimizes the asymptotic variance.
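As a minimal sketch, the code below minimizes the GMM objective $Q_N(\beta)$ numerically with the nonstochastic weight $\hat{W}_N = I_r$ for a simulated overidentified IV model; the data-generating process and the use of scipy.optimize.minimize are my own assumptions.

```python
import numpy as np
from scipy import optimize

# A minimal sketch (simulated overidentified IV data): minimize
# Q_N(beta) = g_N(beta)' W_N g_N(beta) with W_N = I_r.

rng = np.random.default_rng(4)
N, r = 1_000, 3
W = rng.normal(size=(N, r))                               # r instruments
u = rng.normal(size=N)
X = (W @ np.array([0.6, 0.3, 0.2]) + u).reshape(N, 1)      # one endogenous regressor
Y = 1.5 * X[:, 0] + 0.7 * u + rng.normal(size=N)

def g_N(beta):
    """Sample moments (1/N) sum_i (Y_i - X_i' beta) W_i (an r-vector)."""
    return W.T @ (Y - X @ beta) / N

def Q_N(beta, W_N):
    g = g_N(beta)
    return g @ W_N @ g

result = optimize.minimize(Q_N, x0=np.zeros(1), args=(np.eye(r),))
print(result.x)   # close to the true coefficient 1.5
```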
Idea: by the Law of Large Numbers,
$$g_N(\theta) \xrightarrow{p} g_0(\theta) = E[g(Z_i, \theta)]
\quad\text{and}\quad
\hat{W}_N \xrightarrow{p} W_0.$$

By continuity of multiplication,
$$Q_N(\theta) \xrightarrow{p} Q_0(\theta) = g_0(\theta)'\, W_0\, g_0(\theta).$$

This function has a minimum of zero at $\theta_0$, so $\theta_0$ will be identified if $Q_0(\theta)$ is strictly positive for $\theta \neq \theta_0$.

The identification condition is $W_0\, g_0(\theta) \neq 0$ for $\theta \neq \theta_0$.
Identification: If
(i) $W_0$ is positive semidefinite;
(ii) $g_0(\theta) = E[g(Z_i, \theta)]$ exists and $g_0(\theta_0) = 0$;
(iii) $W_0\, g_0(\theta) \neq 0$ for $\theta \neq \theta_0$;
then $Q_0(\theta) = g_0(\theta)' W_0\, g_0(\theta)$ has a unique minimum at $\theta_0$.

A necessary order condition for identification is $r \geq p$.

If the moment functions are linear, $g(z, \theta) = g(z) + G(z)\theta$, then a necessary and sufficient condition for identification is that the rank of $W_0 E[G(z)]$ equals the number of columns, $p$.
Example (Linear Moments): Instrumental Variables (Overidentified Case)

Random sample $\{Y_i, X_i, W_i\}_{i=1}^{N}$ that follows the linear model
$$Y_i = X_i'\beta + \varepsilon_i$$
with
$$E[\varepsilon_i W_i] = E[(Y_i - X_i'\beta) W_i] = 0$$
where $W_i \in \mathbb{R}^r$ and $r > p$.

The two-stage least squares estimator of $\beta$ is a GMM estimator with
$$\hat{W} = \left(\frac{1}{N}\sum_{i=1}^{N} W_i W_i'\right)^{-1}$$

Suppose that $E[W_i W_i']$ exists and is nonsingular, so that $W_0 = \left(E[W_i W_i']\right)^{-1}$ exists.

The rank condition is that $E[W_i X_i']$ has full rank.
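A minimal sketch of 2SLS as a GMM estimator with this weight matrix follows; the simulated data are my own assumption, and the closed form used is $\hat{\beta}_{2SLS} = [X'W(W'W)^{-1}W'X]^{-1} X'W(W'W)^{-1}W'Y$.

```python
import numpy as np

# A minimal sketch (simulated data): 2SLS as GMM with weight (W'W/N)^{-1}.

rng = np.random.default_rng(5)
N, r = 2_000, 3
W = rng.normal(size=(N, r))
u = rng.normal(size=N)
X = (W @ np.array([0.6, 0.3, 0.2]) + u).reshape(N, 1)   # endogenous regressor
Y = 1.5 * X[:, 0] + 0.7 * u + rng.normal(size=N)

PX = W @ np.linalg.solve(W.T @ W, W.T @ X)               # projection of X on W
beta_2sls = np.linalg.solve(PX.T @ X, PX.T @ Y)
print(beta_2sls)                                         # close to 1.5
```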
To get consistency, we need to show uniform convergence of $Q_N(\theta)$ to its probability limit $Q_0(\theta)$,
$$\sup_{\theta \in \Theta} \left| Q_N(\theta) - Q_0(\theta) \right| \xrightarrow{p} 0,$$
and then establish that the limit $Q_0(\theta)$ is uniquely minimized at $\theta = \theta_0$.

Consistency: Suppose that $\{Z_i\}_{i=1}^{N}$ are i.i.d., $\hat{W} \xrightarrow{p} W_0$, and
(i) $W_0$ is positive semidefinite and $W_0 E[g(Z_i, \theta)] \neq 0$ for $\theta \neq \theta_0$;
(ii) $\theta_0 \in \Theta$, and $\Theta$ is compact;
(iii) $g(z, \theta)$ is continuous at each $\theta \in \Theta$ with probability one;
(iv) $E\left[\sup_{\theta \in \Theta} \|g(z, \theta)\|\right] < \infty$.
Then $\hat{\theta}_{GMM} \xrightarrow{p} \theta_0$.
If (i) holds, then $Q_0(\theta)$ is uniquely minimized at $\theta_0$.

(ii) is the assumption that the parameter space is compact.

If (iii) holds, then $Q_0(\theta)$ is continuous.

Because $\Theta$ is compact and $g_0(\theta)$ is bounded on $\Theta$ (by (iv)), we can use the triangle and Cauchy-Schwarz inequalities to show uniform convergence.
Assuming that $\theta_0 \in \mathrm{int}(\Theta)$ and that $g(z, \theta)$ is continuously differentiable in a neighborhood $\mathcal{N}$ of $\theta_0$ with probability one, the first-order condition is
$$0 = \frac{\partial Q_N(\hat{\theta}_{GMM})}{\partial \theta}
= 2\left[\frac{1}{N}\sum_{i=1}^{N} \nabla_\theta\, g(Z_i, \hat{\theta}_{GMM})\right]' \hat{W}_N\, g_N(\hat{\theta}_{GMM})$$

If the derivative of the moment function converges uniformly in probability to its expected value on the neighborhood $\mathcal{N}$ of $\theta_0$, and if $\hat{\theta} \xrightarrow{p} \theta_0$, define
$$\hat{G}_N(\hat{\theta}_{GMM}) = \frac{1}{N}\sum_{i=1}^{N} \nabla_\theta\, g(Z_i, \hat{\theta}_{GMM})
\quad\text{and}\quad
G_0 = E\left[\nabla_\theta\, g(Z, \theta_0)\right].$$
Doing a mean value expansion of $g_N(\hat{\theta}_{GMM})$ around $\theta_0$,
$$g_N(\hat{\theta}_{GMM}) = g_N(\theta_0) + \left[\nabla_\theta\, g_N(\bar{\theta})\right]\left(\hat{\theta}_{GMM} - \theta_0\right) \qquad (3)$$

Plugging (3) into the first-order condition,
$$0 = \left[\frac{1}{N}\sum_{i=1}^{N} \nabla_\theta\, g(Z_i, \hat{\theta}_{GMM})\right]' \hat{W}_N\, g_N(\theta_0)
+ \left[\frac{1}{N}\sum_{i=1}^{N} \nabla_\theta\, g(Z_i, \hat{\theta}_{GMM})\right]' \hat{W}_N \left[\frac{1}{N}\sum_{i=1}^{N} \nabla_\theta\, g(Z_i, \bar{\theta})\right]\left(\hat{\theta}_{GMM} - \theta_0\right)$$

where $\bar{\theta}$ is a mean value between $\hat{\theta}_{GMM}$ and $\theta_0$.
Rewriting this expression,
$$\sqrt{N}\left(\hat{\theta}_{GMM} - \theta_0\right)
= -\left[\hat{G}_N(\hat{\theta}_{GMM})'\, \hat{W}_N\, \hat{G}_N(\bar{\theta})\right]^{-1}
\hat{G}_N(\hat{\theta}_{GMM})'\, \hat{W}_N\, \sqrt{N}\, g_N(\theta_0)$$

Under the assumption that $E\left[\sup_{\theta \in \mathcal{N}} \|\nabla_\theta\, g(Z, \theta)\|\right] < \infty$, and since $\hat{\theta} \xrightarrow{p} \theta_0$,
$$\hat{G}_N(\hat{\theta}_{GMM}) \xrightarrow{p} G_0
\quad\text{and}\quad
\hat{G}_N(\bar{\theta}) \xrightarrow{p} G_0.$$
Assuming that $G_0' W_0 G_0$ is nonsingular,
$$\left[\hat{G}_N(\hat{\theta}_{GMM})'\, \hat{W}_N\, \hat{G}_N(\bar{\theta})\right]^{-1}\hat{G}_N(\hat{\theta}_{GMM})'\, \hat{W}_N
\;\xrightarrow{p}\; \left(G_0' W_0 G_0\right)^{-1} G_0' W_0$$

At the end,
$$\sqrt{N}\left(\hat{\theta}_{GMM} - \theta_0\right)
= -\left(G_0' W_0 G_0\right)^{-1} G_0' W_0\, \frac{1}{\sqrt{N}}\sum_{i=1}^{N} g(Z_i, \theta_0) + o_p(1)$$
where $E[g(Z_i, \theta_0)] = 0$ and
$$\mathrm{Var}[g(Z_i, \theta_0)] = E\left[g(Z_i, \theta_0)\, g(Z_i, \theta_0)'\right] = \Omega < \infty.$$

Using the CLT,
$$\sqrt{N}\left(\hat{\theta}_{GMM} - \theta_0\right)
\xrightarrow{d} \mathcal{N}\!\left(0,\; \left(G_0' W_0 G_0\right)^{-1} G_0' W_0\, \Omega\, W_0 G_0 \left(G_0' W_0 G_0\right)^{-1}\right)$$
Asymptotic Normality: Suppose that the hypotheses needed for consistency are satisfied, $\hat{W} \xrightarrow{p} W_0$, and
(i) $\theta_0 \in \mathrm{int}(\Theta)$;
(ii) $g(Z, \theta)$ is continuously differentiable in a neighborhood $\mathcal{N}$ of $\theta_0$ with probability approaching one;
(iii) $E[g(Z, \theta_0)] = 0$ and $E\left[\|g(Z, \theta_0)\|^2\right] < \infty$;
(iv) $E\left[\sup_{\theta \in \mathcal{N}} \|\nabla_\theta\, g(Z, \theta)\|\right] < \infty$;
(v) $G_0' W_0 G_0$ is nonsingular, for $G_0 = E\left[\nabla_\theta\, g(Z, \theta_0)\right]$.
Then, for $\Omega = E\left[g(Z, \theta_0)\, g(Z, \theta_0)'\right]$,
$$\sqrt{N}\left(\hat{\theta}_{GMM} - \theta_0\right) \xrightarrow{d} \mathcal{N}(0, V_0)$$
where
$$V_0 = \left(G_0' W_0 G_0\right)^{-1} G_0' W_0\, \Omega\, W_0 G_0 \left(G_0' W_0 G_0\right)^{-1}.$$
To estimate the asymptotic variance of the GMM estimator, we just need consistent estimators of $G_0$ and $\Omega$.

Using a weight matrix $\hat{W}$ such that $\hat{W} \xrightarrow{p} W_0$, if $g(Z, \theta)$ is continuously differentiable at $\theta_0$ with probability approaching one and $E\left[\sup_{\theta \in \mathcal{N}} \|g(Z, \theta)\|^2\right] < \infty$, set
$$\hat{G} = \frac{1}{N}\sum_{i=1}^{N} \nabla_\theta\, g(Z_i, \hat{\theta}),
\qquad
\hat{\Omega} = \frac{1}{N}\sum_{i=1}^{N} g(Z_i, \hat{\theta})\, g(Z_i, \hat{\theta})'.$$

The estimator of the asymptotic variance is
$$\hat{V} = \left(\hat{G}' \hat{W} \hat{G}\right)^{-1} \hat{G}' \hat{W}\, \hat{\Omega}\, \hat{W} \hat{G} \left(\hat{G}' \hat{W} \hat{G}\right)^{-1}$$
and $\hat{V} \xrightarrow{p} V_0$.
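A minimal sketch of this sandwich estimator follows; the helper function, its argument layout, and its name are hypothetical and my own assumptions, not part of the slides.

```python
import numpy as np

# A minimal sketch (hypothetical helper): given the N x r matrix of moment
# contributions g_i = g(Z_i, theta_hat), an N x r x p array of their derivatives,
# and a weight matrix W_hat, form G_hat, Omega_hat and the sandwich estimator V_hat.

def gmm_sandwich(g_i, grad_g_i, W_hat):
    N = g_i.shape[0]
    G_hat = grad_g_i.mean(axis=0)                     # r x p, estimates G_0
    Omega_hat = g_i.T @ g_i / N                       # r x r, estimates Omega
    A = np.linalg.inv(G_hat.T @ W_hat @ G_hat)        # (G'WG)^{-1}
    B = G_hat.T @ W_hat @ Omega_hat @ W_hat @ G_hat   # G'W Omega W G
    return A @ B @ A                                  # V_hat, sandwich form
```

For the linear IV moments used in this lecture, for instance, $g_i = (Y_i - X_i'\hat{\beta})W_i$ and the derivative of each moment contribution is $-W_i X_i'$.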
What is the optimal $W_0$?

Since $W_0$ is positive semidefinite,
$$\left(G_0' W_0 G_0\right)^{-1} G_0' W_0\, \Omega\, W_0 G_0 \left(G_0' W_0 G_0\right)^{-1}
\;\geq\; \left(G_0' \Omega^{-1} G_0\right)^{-1}$$
in the positive semidefinite sense.

The optimal $W_0$ is
$$W_0^{+} = \Omega^{-1} = \left[\mathrm{Var}[g(Z_i, \theta_0)]\right]^{-1}$$

The optimal GMM estimator is obtained when $\hat{W}_N \xrightarrow{p} W_0^{+}$, and then
$$\sqrt{N}\left(\hat{\theta}_{GMM} - \theta_0\right)
\xrightarrow{d} \mathcal{N}\!\left(0, \left(G_0' \Omega^{-1} G_0\right)^{-1}\right)$$
A feasible optimal GMM estimator requires an estimator of $\Omega$. We can obtain the feasible optimal GMM estimator in two steps.

Step 1: Get a consistent estimator of $W_0^{+}$. Obtain a (non-optimal) GMM estimator $\tilde{\theta}$ using some sequence $\hat{W}_N$, such that $\tilde{\theta}$ is a consistent estimator of $\theta_0$, and compute
$$\hat{\Omega} = \frac{1}{N}\sum_{i=1}^{N} g(Z_i, \tilde{\theta})\, g(Z_i, \tilde{\theta})'.$$
Step 2: Using $\hat{\Omega}$, obtain the optimal GMM estimator
$$\hat{\theta}^{+} = \arg\min_{\theta}\; [g_N(\theta)]'\, \hat{\Omega}^{-1}\, g_N(\theta)$$

A consistent estimator for the asymptotic variance of the optimal GMM estimator is
$$\hat{V}_{GMM} = \left(\hat{G}'\, \hat{\Omega}^{-1}\, \hat{G}\right)^{-1},
\qquad
\hat{G} = \frac{1}{N}\sum_{i=1}^{N} \frac{\partial g(Z_i, \hat{\theta}^{+})}{\partial \theta}.$$
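The sketch below runs the two steps for the overidentified linear IV model; the simulated data, the closed-form helper, and the reported standard error are my own assumptions, intended only to illustrate the procedure.

```python
import numpy as np

# A minimal sketch (simulated data): two-step feasible optimal GMM for the
# overidentified linear IV model with moments g(Z_i, b) = (Y_i - X_i'b) W_i.

rng = np.random.default_rng(6)
N, r = 2_000, 3
W = rng.normal(size=(N, r))
u = rng.normal(size=N)
X = (W @ np.array([0.6, 0.3, 0.2]) + u).reshape(N, 1)
Y = 1.5 * X[:, 0] + 0.7 * u + rng.normal(size=N)

def linear_gmm(Y, X, W, W_N):
    # Closed-form minimizer of [W'(Y - Xb)/N]' W_N [W'(Y - Xb)/N] in b
    A = X.T @ W @ W_N @ W.T @ X
    c = X.T @ W @ W_N @ W.T @ Y
    return np.linalg.solve(A, c)

# Step 1: preliminary (non-optimal) estimator with the 2SLS weight (W'W/N)^{-1}
beta_tilde = linear_gmm(Y, X, W, np.linalg.inv(W.T @ W / N))

# Omega_hat from the first-step residuals: (1/N) sum_i e_i^2 W_i W_i'
e = Y - X @ beta_tilde
Omega_hat = (W * (e[:, None] ** 2)).T @ W / N

# Step 2: optimal GMM with weight Omega_hat^{-1}
beta_plus = linear_gmm(Y, X, W, np.linalg.inv(Omega_hat))

# Estimated asymptotic variance (G' Omega^{-1} G)^{-1}, with G_hat = (1/N) sum_i W_i X_i'
G_hat = W.T @ X / N
V_hat = np.linalg.inv(G_hat.T @ np.linalg.solve(Omega_hat, G_hat))
print(beta_plus, np.sqrt(np.diag(V_hat) / N))   # point estimate and standard error
```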
Example: Instrumental Variables. $X_i \in \mathbb{R}$ is the endogenous variable and $W_i \in \mathbb{R}^r$ is a vector of instruments, $r > p$:
$$Y_i = X_i'\beta + \varepsilon_i,
\qquad
E[\varepsilon_i W_i] = E[(Y_i - X_i'\beta) W_i] = 0$$

Step 1: Using a consistent estimator for $W_0$,
$$\hat{W}_N = \left(\frac{1}{N}\sum_{i=1}^{N} W_i W_i'\right)^{-1}$$
$$\tilde{\beta} = \arg\min_{\beta}\;
\left[\frac{1}{N}\sum_{i=1}^{N} (Y_i - X_i'\beta) W_i\right]'
\left(\frac{1}{N}\sum_{i=1}^{N} W_i W_i'\right)^{-1}
\left[\frac{1}{N}\sum_{i=1}^{N} (Y_i - X_i'\beta) W_i\right]$$
Solving the minimization problem, the first-order condition is
$$-2\left[\frac{1}{N}\sum_{i=1}^{N} X_i W_i'\right]
\left(\frac{1}{N}\sum_{i=1}^{N} W_i W_i'\right)^{-1}
\left[\frac{1}{N}\sum_{i=1}^{N} (Y_i - X_i'\tilde{\beta}) W_i\right] = 0$$

$$\Longrightarrow\;
\tilde{\beta} =
\left[\left(\frac{1}{N}\sum_{i=1}^{N} X_i W_i'\right)
\left(\frac{1}{N}\sum_{i=1}^{N} W_i W_i'\right)^{-1}
\left(\frac{1}{N}\sum_{i=1}^{N} W_i X_i'\right)\right]^{-1}
\left[\left(\frac{1}{N}\sum_{i=1}^{N} X_i W_i'\right)
\left(\frac{1}{N}\sum_{i=1}^{N} W_i W_i'\right)^{-1}
\left(\frac{1}{N}\sum_{i=1}^{N} W_i Y_i\right)\right]$$
GMM estimator (Step 2):
$$\hat{\beta}^{+}_{GMM} = \arg\min_{\beta}\;
\left[\frac{1}{N}\sum_{i=1}^{N} (Y_i - X_i'\beta) W_i\right]'
\hat{W}_N
\left[\frac{1}{N}\sum_{i=1}^{N} (Y_i - X_i'\beta) W_i\right]$$
where
$$\hat{W}_N = \left[\frac{1}{N}\sum_{i=1}^{N} (Y_i - X_i'\tilde{\beta})^2\, W_i W_i'\right]^{-1}.$$

$$\hat{\beta}^{+}_{GMM} =
\left[\left(\frac{1}{N}\sum_{i=1}^{N} X_i W_i'\right)
\left(\frac{1}{N}\sum_{i=1}^{N} (Y_i - X_i'\tilde{\beta})^2\, W_i W_i'\right)^{-1}
\left(\frac{1}{N}\sum_{i=1}^{N} W_i X_i'\right)\right]^{-1}
\left[\left(\frac{1}{N}\sum_{i=1}^{N} X_i W_i'\right)
\left(\frac{1}{N}\sum_{i=1}^{N} (Y_i - X_i'\tilde{\beta})^2\, W_i W_i'\right)^{-1}
\left(\frac{1}{N}\sum_{i=1}^{N} W_i Y_i\right)\right]$$
Asymptotic Distribution:
$$\sqrt{N}\left(\hat{\beta}^{+}_{GMM} - \beta_0\right)
\xrightarrow{d} \mathcal{N}\!\left(0, \left(G_0' \Omega^{-1} G_0\right)^{-1}\right)$$
and the asymptotic variance is estimated by $\left(\hat{G}'\, \hat{\Omega}^{-1}\, \hat{G}\right)^{-1}$, where
$$\hat{G} = \frac{1}{N}\sum_{i=1}^{N} W_i X_i',
\qquad
\hat{\Omega}^{-1} = \left[\frac{1}{N}\sum_{i=1}^{N} (Y_i - X_i'\tilde{\beta})^2\, W_i W_i'\right]^{-1}.$$
If the error term is homoskedastic,
$$E[\varepsilon_i^2 \mid W_i] = \sigma^2,$$
then
$$\Omega_0 = \sigma^2 E[W_i W_i'],$$
so the optimal weight matrix is proportional to $\left(E[W_i W_i']\right)^{-1}$ and the optimal GMM estimator has the same asymptotic variance as the two-stage least squares estimator.
We can compare asymptotically normal estimators based on their asymptotic variances.

Asymptotic efficiency: an estimator is asymptotically efficient relative to another if it has at least as small an asymptotic variance for all possible true parameter values.

If $\tilde{\theta}$ is asymptotically efficient relative to $\hat{\theta}$, then for all constants $K$,
$$\Pr\!\left(\|\tilde{\theta} - \theta_0\| \leq K/\sqrt{N}\right)
\;\geq\;
\Pr\!\left(\|\hat{\theta} - \theta_0\| \leq K/\sqrt{N}\right)$$
for $N$ large enough.
All the estimators we have studied are $\sqrt{N}$-asymptotically normal, with variance matrices of the form
$$V = H^{-1}\, E\left[s(Z)\, s(Z)'\right] H^{-1}$$
where
$s(Z)$ is the score of the objective function (evaluated at $\theta_0$);
$H$ is the expected value of the Jacobian of the score, evaluated at $\theta_0$.
Relative Efficiency: Let $\tilde{\theta}$ and $\hat{\theta}$ be two $\sqrt{N}$-asymptotically normal estimators of the $p \times 1$ parameter vector $\theta_0$, with asymptotic variances of the form
$$V_{\theta} = H(\theta)^{-1}\, E\left[s(Z; \theta)\, s(Z; \theta)'\right] H(\theta)^{-1}.$$
If for some $\rho > 0$
$$E\left[s(Z, \tilde{\theta})\, s(Z, \tilde{\theta})'\right] = \rho\, H(\tilde{\theta}) \qquad (4)$$
$$E\left[s(Z, \hat{\theta})\, s(Z, \tilde{\theta})'\right] = \rho\, H(\hat{\theta}) \qquad (5)$$
then $V_{\hat{\theta}} - V_{\tilde{\theta}}$ is positive semidefinite.
Condition (4) is the generalized information matrix equality that holds for the MLE.

In this case, $H(\tilde{\theta})$ is symmetric and positive definite.

Note that if both estimators are jointly asymptotically normally distributed, assumptions (4) and (5) imply that the asymptotic covariance between $\sqrt{N}(\hat{\theta} - \theta_0)$ and $\sqrt{N}(\tilde{\theta} - \theta_0)$ is
$$H(\hat{\theta})^{-1}\, E\left[s(Z, \hat{\theta})\, s(Z, \tilde{\theta})'\right] H(\tilde{\theta})^{-1}
= H(\hat{\theta})^{-1}\, \rho\, H(\hat{\theta})\, H(\tilde{\theta})^{-1}
= \rho\, H(\tilde{\theta})^{-1}
= \mathrm{Avar}\!\left[\sqrt{N}\left(\tilde{\theta} - \theta_0\right)\right]$$
Under these conditions, the asymptotic covariance between the estimators equals the asymptotic variance of the efficient estimator.

This theorem can be used to define efficiency of an estimator within an entire class of estimators.

For each index $\tau$ in an index set $\mathcal{T}$, the estimator $\hat{\theta}_\tau$ has an associated score $s(Z; \hat{\theta}_\tau)$ and expected Jacobian $H(\hat{\theta}_\tau)$ such that the variance of $\sqrt{N}(\hat{\theta}_\tau - \theta_0)$ has the sandwich form above.

The index $\tau$ is used to distinguish different $\sqrt{N}$-asymptotically normal estimators of $\theta_0$.

Example 1: $\mathcal{T}$ consists of objective functions $g(\cdot, \cdot)$ such that $\theta_0$ uniquely maximizes $E[g(Z, \theta)]$ over $\Theta$ and $g$ satisfies the twice-continuously-differentiable and bounded-moments assumptions needed for asymptotic normality.
Efficiency in a class of estimators: Let $\{\hat{\theta}_\tau : \tau \in \mathcal{T}\}$ be a class of $\sqrt{N}$-asymptotically normal estimators with variance matrices of the form
$$V_\tau = H(\tau)^{-1}\, E\left[s(Z; \tau)\, s(Z; \tau)'\right] H(\tau)^{-1}.$$
If for some $\tau^{+} \in \mathcal{T}$ and $\rho > 0$,
$$E\left[s(Z; \tau)\, s(Z; \tau^{+})'\right] = \rho\, H(\tau) \quad \text{for all } \tau \in \mathcal{T},$$
then $\hat{\theta}_{\tau^{+}}$ is asymptotically efficient in the class $\{\hat{\theta}_\tau : \tau \in \mathcal{T}\}$.

We can only compare $\hat{\theta}_{\tau^{+}}$ with the estimators in this class.

A special case of this theorem is the Gauss-Markov theorem.
Let us look at the efficiency of the MLE.

Consider the class of estimators solving the first-order condition
$$\frac{1}{N}\sum_{i=1}^{N} g(Z_i, \hat{\theta}) = 0 \qquad (6)$$
where the $p \times 1$ functions $g(Z_i, \theta)$ satisfy $E[g(Z, \theta) \mid x] = 0$ for all $x \in \mathcal{X}$ and $\theta \in \Theta$, and satisfy the generalized conditional information matrix equality
$$E\left[\nabla_\theta\, g(Z, \theta) \mid x\right] = -E\left[g(Z, \theta)\, s(Z, \theta)' \mid x\right] \qquad (7)$$
where $s(Z, \theta)$ is the score of $\log f(y \mid x; \theta)$.
Taking expectations of both sides with respect to $x$, and setting $\rho = 1$,
$$E\left[\nabla_\theta\, g(Z, \theta)\right] = -E\left[g(Z, \theta)\, s(Z, \theta)'\right],$$
which is the cross condition of the efficiency theorem with $s(Z; \tau^{+}) = s(Z, \theta)$, the conditional score, and $H(\tau)$ the expected Jacobian of $g$ (up to a sign normalization that does not affect the sandwich variance).

Under some regularity conditions, the conditional MLE is efficient in this class of estimators.

The asymptotic variance of the CMLE, $\left(E\left[s(Z, \theta_0)\, s(Z, \theta_0)'\right]\right)^{-1}$, is the efficiency bound: in this class of estimators, no estimator can have an asymptotic variance smaller than $\left(E\left[s(Z, \theta_0)\, s(Z, \theta_0)'\right]\right)^{-1}$.
In this class of estimators, we only require the regularity conditions used to derive the asymptotic distribution of the estimators; the distribution of $X$ is unrestricted and can depend on $\theta_0$.

The conditional MLE ignores information on $\theta_0$ that might be contained in the distribution of $X$; the estimators in this class do the same.

The unconditional MLE is efficient in the class of estimators that satisfy (6) and (7) with $E[g(Z, \theta)] = 0$ for all $\theta \in \Theta$.

This is a very broad class of estimators, including the previous one (why?).
The unconditional MLE is more efficient than the conditional MLE.

However, in this class of estimators we need to model the joint distribution of $(Y, X)$ instead of only the conditional distribution of $Y$ given $X$: we need information on the density of $X$.

If the marginal distribution of $X$ depends on a parameter that is not related to $\theta$, the CMLE is identical to the unconditional MLE.
References

Amemiya: Chapter 4
Wooldridge: Chapters 8 and 14
Ruud: Chapters 14 and 15
Newey, W. and D. McFadden (1994). "Large Sample Estimation and Hypothesis Testing", Handbook of Econometrics, Volume IV, Chapter 36.