ie-Slides06

Lecture Plan
Motivation
Consistency
Asymptotic Normality
Inference
Wald test
LM test
Efficiency
Summary
The strategy does work for the OLS estimators under MLR1-5.
Consistency

$\hat{\beta}_j$ is consistent for $\beta_j$ if and only if
$$P\left(\left|\hat{\beta}_j - \beta_j\right| > \epsilon\right) \to 0 \quad \text{as } n \to \infty, \text{ for every } \epsilon > 0$$
(see also Appendix C).
Consistency comes from the LLN (law of large numbers).

Dr. Rachida Ouysse
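The definition above can be illustrated by simulation. This is a minimal sketch, assuming an invented data-generating process ($y = 2 + 0.5x + u$ with exogenous errors) that is not from the slides: the frequency with which $\hat{\beta}_1$ lands more than $\epsilon$ away from $\beta_1$ shrinks toward zero as $n$ grows.

```python
import numpy as np

# Illustrative DGP (not from the slides): y = 2 + 0.5*x + u, Cov(u, x) = 0.
rng = np.random.default_rng(0)
beta1 = 0.5
eps = 0.1  # the epsilon in P(|b1_hat - beta1| > eps)

def freq_outside(n, reps=2000):
    """Monte Carlo frequency of |b1_hat - beta1| > eps at sample size n."""
    count = 0
    for _ in range(reps):
        x = rng.normal(size=n)
        u = rng.normal(size=n)  # exogenous errors, so OLS is consistent
        y = 2.0 + beta1 * x + u
        b1 = np.cov(x, y, bias=True)[0, 1] / np.var(x)  # OLS slope
        if abs(b1 - beta1) > eps:
            count += 1
    return count / reps

# The frequency falls toward 0 as n grows, as the definition requires.
freqs = [freq_outside(n) for n in (25, 100, 400, 1600)]
print(freqs)
```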
$$\hat{\beta}_1 = \beta_1 + \frac{\sum_{i=1}^{n}(u_i - \bar{u})(x_i - \bar{x})}{\sum_{i=1}^{n}(x_i - \bar{x})^2} \qquad (1)$$

Equation (1) indicates that

$$\operatorname{plim} \hat{\beta}_1 = \beta_1 + \frac{\operatorname{plim} \sum_{i=1}^{n}(u_i - \bar{u})(x_i - \bar{x})/n}{\operatorname{plim} \sum_{i=1}^{n}(x_i - \bar{x})^2/n} \qquad (2)$$

$$\operatorname{plim} \hat{\beta}_1 = \beta_1 + \frac{\operatorname{Cov}(u_i, x_i)}{\operatorname{Var}(x_i)} \quad \text{as } n \to \infty. \qquad (3)$$

When $\operatorname{Cov}(u_i, x_i) = 0$, we have consistency: $\operatorname{plim} \hat{\beta}_1 = \beta_1$.
Inconsistency results from $\operatorname{Cov}(u, x_1) \neq 0$.
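Equation (3) can be sketched numerically. Assuming an invented data-generating process with $\operatorname{Cov}(u, x) = 0.6$ and $\operatorname{Var}(x) = 1$ (numbers chosen purely for illustration), the OLS slope settles near $\beta_1 + \operatorname{Cov}(u, x)/\operatorname{Var}(x)$ rather than $\beta_1$.

```python
import numpy as np

# Illustrative DGP: beta1 = 0.5, Cov(u, x) = 0.6, Var(x) = 1, so the
# asymptotic bias in equation (3) is 0.6 and plim b1_hat = 1.1.
rng = np.random.default_rng(1)
n = 200_000
beta1 = 0.5

z = rng.normal(size=n)
u = 0.6 * z + 0.8 * rng.normal(size=n)  # Var(u) = 0.36 + 0.64 = 1
x = z                                    # Cov(u, x) = 0.6, Var(x) = 1
y = 1.0 + beta1 * x + u

b1 = np.cov(x, y, bias=True)[0, 1] / np.var(x)  # OLS slope
asym_bias = 0.6 / 1.0                            # Cov(u, x) / Var(x)
print(b1, beta1 + asym_bias)  # b1 is close to 1.1, not to 0.5
```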
The estimated variance of $\hat{\beta}_j$ is
$$\frac{\hat{\sigma}^2}{SST_j\,(1 - R_j^2)}.$$
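The displayed variance formula can be checked against the usual matrix formula $\hat{\sigma}^2\,[(X'X)^{-1}]_{jj}$; the two agree exactly. The data below are simulated purely for illustration (coefficients and sample size are invented).

```python
import numpy as np

# Illustrative simulated regression with k = 3 regressors plus intercept.
rng = np.random.default_rng(2)
n, k = 500, 3
X = rng.normal(size=(n, k))
y = 1.0 + X @ np.array([0.5, -1.0, 2.0]) + rng.normal(size=n)

Xc = np.column_stack([np.ones(n), X])  # add intercept
beta_hat, *_ = np.linalg.lstsq(Xc, y, rcond=None)
resid = y - Xc @ beta_hat
sigma2_hat = resid @ resid / (n - k - 1)

j = 1                    # look at beta_1 (first slope, column 1 of Xc)
xj = X[:, j - 1]
sst_j = np.sum((xj - xj.mean()) ** 2)
# R_j^2: regress x_j on the other regressors (plus intercept)
others = np.column_stack([np.ones(n), np.delete(X, j - 1, axis=1)])
gamma, *_ = np.linalg.lstsq(others, xj, rcond=None)
r2_j = 1 - np.sum((xj - others @ gamma) ** 2) / sst_j

var_formula = sigma2_hat / (sst_j * (1 - r2_j))         # slide's formula
var_matrix = sigma2_hat * np.linalg.inv(Xc.T @ Xc)[j, j]  # matrix formula
print(var_formula, var_matrix)  # identical up to rounding
```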
Asymptotic analysis: Inference

Under the CLM assumptions (MLR1-6), we did inference with exact $t$ and $F$ distributions.
With only MLR1-5, Theorem 5.2 indicates
$$\frac{\hat{\beta}_j - \beta_j}{se(\hat{\beta}_j)} \stackrel{a}{\sim} \operatorname{Normal}(0, 1).$$
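A small Monte Carlo sketch of Theorem 5.2, under an illustrative DGP with deliberately non-normal (uniform) errors: the studentized slope behaves like a standard normal, so normal critical values give roughly correct rejection rates.

```python
import numpy as np

# Illustrative DGP: uniform errors rescaled to mean 0 and variance 1,
# so MLR6 (normality) fails but MLR1-5 hold.
rng = np.random.default_rng(3)
n, reps, beta1 = 300, 2000, 0.5

t_stats = []
for _ in range(reps):
    x = rng.normal(size=n)
    u = rng.uniform(-np.sqrt(3), np.sqrt(3), size=n)  # mean 0, variance 1
    y = 1.0 + beta1 * x + u
    X = np.column_stack([np.ones(n), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    sigma2 = resid @ resid / (n - 2)
    se_b1 = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    t_stats.append((b[1] - beta1) / se_b1)

t_stats = np.array(t_stats)
# Mean near 0, spread near 1, and about 5% of |t| exceed 1.96.
print(t_stats.mean(), t_stats.std(), np.mean(np.abs(t_stats) > 1.96))
```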
Wald test

$$H_0: \beta_{k-q+1} = 0, \ldots, \beta_k = 0$$
The unrestricted model:
$$y = \beta_0 + \beta_1 x_1 + \cdots + \beta_k x_k + u$$
The restricted model:
$$y = \beta_0 + \beta_1 x_1 + \cdots + \beta_{k-q} x_{k-q} + u^{(r)}$$
Test statistic:
$$\frac{SSR_r - SSR_{ur}}{SSR_{ur}/(n - k - 1)} \stackrel{a}{\sim} \chi^2_q \quad \text{under } H_0. \qquad (4)$$
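Statistic (4) requires only the two SSRs. The sketch below tests an exclusion restriction ($q = 2$) on simulated data where $H_0$ is true; the DGP and sample size are invented for illustration.

```python
import numpy as np

# Illustrative DGP where H0: beta2 = beta3 = 0 actually holds.
rng = np.random.default_rng(4)
n, k, q = 400, 3, 2

X = rng.normal(size=(n, k))
y = 1.0 + 0.8 * X[:, 0] + rng.normal(size=n)  # x2, x3 truly irrelevant

def ssr(Z, y):
    """Sum of squared residuals from an OLS fit of y on Z."""
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    e = y - Z @ b
    return e @ e

ones = np.ones((n, 1))
ssr_ur = ssr(np.column_stack([ones, X]), y)        # unrestricted: all k regressors
ssr_r = ssr(np.column_stack([ones, X[:, :1]]), y)  # restricted: drop x2, x3

# Statistic (4); compare with the chi^2_2 5% critical value, 5.99.
wald = (ssr_r - ssr_ur) / (ssr_ur / (n - k - 1))
print(wald)
```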
LM test

The LM test has many uses (for example, testing for heteroskedasticity).
We introduce it here as a test of linear restrictions.
a) Run the restricted regression and save the residuals, $\hat{u}^{(r)}$.
b) Regress the residuals $\hat{u}^{(r)}$ on all $x$ variables (known as the auxiliary regression) and save the $R^2_{\hat{u}^{(r)}}$.
c) $LM = n R^2_{\hat{u}^{(r)}} = \text{(sample size)} \times (R^2 \text{ of the auxiliary regression})$. Approximately, it has a $\chi^2_q$ distribution under $H_0$.
d) Reject the restrictions (i.e., $H_0$) if $LM > c$, the $\chi^2_q$ critical value, where $q$ is the number of restrictions.
LM is based on the restricted model, while Wald is based on the unrestricted model.
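Steps a)-d) take only a few lines. This is a minimal sketch; the simulated data and the particular exclusion restriction ($q = 2$) below are invented for illustration.

```python
import numpy as np

# Illustrative DGP where the tested restrictions (beta2 = beta3 = 0) hold.
rng = np.random.default_rng(5)
n = 400
X = rng.normal(size=(n, 3))
y = 1.0 + 0.8 * X[:, 0] + rng.normal(size=n)

ones = np.ones((n, 1))

# a) restricted regression (only x1), save the residuals
Zr = np.column_stack([ones, X[:, :1]])
br, *_ = np.linalg.lstsq(Zr, y, rcond=None)
u_r = y - Zr @ br

# b) auxiliary regression of u_r on ALL x variables, save its R^2
Za = np.column_stack([ones, X])
ba, *_ = np.linalg.lstsq(Za, u_r, rcond=None)
e = u_r - Za @ ba
r2_aux = 1 - (e @ e) / np.sum((u_r - u_r.mean()) ** 2)

# c) LM = n * R^2, approximately chi^2_q under H0
lm = n * r2_aux
# d) reject H0 if lm exceeds the chi^2_2 critical value (5.99 at 5%)
print(lm)
```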
LM statistic: Example

Take the example of the housing price regression model:
$$H_0: \beta_1 = 1, \ \beta_2 = 0, \ \beta_3 = 0, \ \beta_4 = 0.$$
Implementing the LM test:
a) Regress $[\log(price) - \log(assess)]$ on a constant and save the residuals, $\hat{u}^{(r)}$.
b) Regress the residuals $\hat{u}^{(r)}$ on $\{\log(assess), \log(lotsize), \log(sqrft), bdrms\}$ and save the R-squared, $R^2_{\hat{u}^{(r)}}$.
c) As in step c) on the previous page: $LM = n R^2_{\hat{u}^{(r)}}$ is approximately distributed as a $\chi^2_4$.
d) As in step d) on the previous page.

Dr. Rachida Ouysse
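Under the stated $H_0$, the restricted model reduces to regressing $\log(price) - \log(assess)$ on a constant. The sketch below replicates steps a)-d) on simulated data; all variable values (and the sample size) are invented, since the original dataset is not included here.

```python
import numpy as np

# Simulated stand-in for the housing data, constructed so that H0 holds:
# log(price) = log(assess) + noise. Every number here is illustrative.
rng = np.random.default_rng(6)
n = 88

log_assess = rng.normal(12.0, 0.3, size=n)
log_lotsize = rng.normal(9.0, 0.5, size=n)
log_sqrft = rng.normal(7.5, 0.3, size=n)
bdrms = rng.integers(2, 6, size=n).astype(float)
log_price = log_assess + rng.normal(0.0, 0.1, size=n)  # H0 true

# a) regress [log(price) - log(assess)] on a constant:
#    the residuals are just deviations from the mean
dep = log_price - log_assess
u_r = dep - dep.mean()

# b) auxiliary regression of the residuals on the four regressors
Z = np.column_stack([np.ones(n), log_assess, log_lotsize, log_sqrft, bdrms])
g, *_ = np.linalg.lstsq(Z, u_r, rcond=None)
e = u_r - Z @ g
r2 = 1 - (e @ e) / (u_r @ u_r)  # u_r has mean zero, so SST = u_r @ u_r

# c) LM = n * R^2, approximately chi^2 with 4 degrees of freedom
lm = n * r2
# d) compare with the chi^2_4 critical value (9.49 at 5%)
print(lm)
```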
Efficiency of OLS
Summary

Inference under MLR1-5 relies on large-sample approximations.
Wald and LM tests (pay attention to critical values).