
Chapter 05

Multiple Regression Analysis

y = β0 + β1x1 + β2x2 + . . . + βkxk + u

3. Asymptotic Properties


Ch.5 Multiple Regression: Asymptotic

1. Consistency
2. Asymptotic Normality & Large Sample Inference
3. Asymptotic Efficiency of OLS

* The material in this chapter is based closely on Appendix C.3.


5.1 Consistency

Under the Gauss-Markov assumptions the OLSE is BLUE, but in other cases it won't always be possible to find unbiased estimators. In those cases, we may settle for estimators that are consistent, meaning the distribution of the estimator becomes more tightly distributed around βj as the sample size grows.
- As n → ∞, the distribution of the estimator collapses to the parameter value.


Sampling Distributions as n grows

[Figure: sampling distributions of the estimator for sample sizes n1 < n2 < n3, each more tightly concentrated around β1 as n increases.]

Consistency of OLS

Under Assumptions MLR.1-4, the OLS estimator is consistent (and unbiased).
- Consistency can be proved in a manner similar to the proof of unbiasedness.

β̂1 = β1 + [n⁻¹ Σ (xi1 − x̄1) ui] / [n⁻¹ Σ (xi1 − x̄1)²]   (5.2)

plim β̂1 = β1 + Cov(x1, u) / Var(x1) = β1   (5.3)

- By the law of large numbers, the sample averages in (5.2) converge in probability to their population counterparts, Cov(x1, u) and Var(x1), which gives (5.3).


A Weaker Assumption (MLR.4')

For unbiasedness, we assumed a zero conditional mean:

E(u|x1, x2, …, xk) = 0

For consistency, the weaker assumption of zero mean and zero correlation is enough:

E(u) = 0 & Cov(xj, u) = 0, for j = 1, 2, …, k

- Without this assumption, OLS will be biased and inconsistent.
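A short simulation helps make (5.2)-(5.3) concrete. The sketch below is not from the slides: the parameter values, the normal draws for x1 and u, and the 1,000 replications are all assumptions chosen for illustration. It shows the Monte Carlo spread of β̂1 tightening around the true β1 as n grows, the same picture as the "Sampling Distributions as n grows" figure.

# Minimal sketch (hypothetical parameters): the OLS slope from (5.2)
# concentrates around the true beta1 as the sample size grows.
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1 = 1.0, 0.5                       # assumed true values

for n in (50, 500, 5000):
    draws = []
    for _ in range(1000):                     # 1000 Monte Carlo replications
        x1 = rng.normal(size=n)
        u = rng.normal(size=n)                # E(u) = 0, Cov(x1, u) = 0
        y = beta0 + beta1 * x1 + u
        # sample analogue of Cov(x1, y) / Var(x1), as in (5.2)-(5.3)
        draws.append(np.cov(x1, y, bias=True)[0, 1] / np.var(x1))
    print(n, round(np.mean(draws), 4), round(np.std(draws), 4))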


Deriving the Inconsistency

Just as we could derive the omitted variable bias earlier, now we want to think about the inconsistency, or asymptotic bias, in this case:

True model: y = β0 + β1x1 + β2x2 + v
You think:  y = β0 + β1x1 + u,
so that u = β2x2 + v and

plim β̃1 = β1 + β2δ   (5.5)

where δ = Cov(x1, x2) / Var(x1)   (5.6)


Asymptotic Bias

The direction of the asymptotic bias is the same as that of the bias for an omitted variable.
- The main difference is that asymptotic bias uses the population variance and covariance, while bias uses the sample counterparts.
The inconsistency problem doesn't go away as the sample size grows.
Even if only x1 is correlated with u, all the OLSEs are generally inconsistent.
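Equations (5.5)-(5.6) can also be checked numerically. In the sketch below (hypothetical β values and a hypothetical linear dependence of x2 on x1), the slope from the short regression of y on x1 alone settles near β1 + β2δ rather than β1, even with a very large sample.

# Sketch (hypothetical values): omitted-variable inconsistency does not
# vanish as n grows; the short-regression slope approaches beta1 + beta2*delta.
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1, beta2 = 1.0, 0.5, 2.0
n = 200_000                                   # deliberately large sample

x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)            # Cov(x1, x2) != 0 by construction
v = rng.normal(size=n)
y = beta0 + beta1 * x1 + beta2 * x2 + v       # true model

b1_short = np.cov(x1, y, bias=True)[0, 1] / np.var(x1)   # y regressed on x1 only
delta = np.cov(x1, x2, bias=True)[0, 1] / np.var(x1)     # eq. (5.6)
print(b1_short, beta1 + beta2 * delta)                   # approximately equal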

5.2 Asymptotic Normality & Inference

Recall that under the CLM assumptions, the sampling distributions are normal, so we could derive t and F distributions for testing.
- This exact normality was due to assuming the population error distribution was normal.
- This assumption of normal errors implied that the distribution of y, given the x's, was normal as well.
- But any clearly skewed variable, like wages, arrests, savings, etc., can't be normal (see fig. 5.2).


Central Limit Theorem (CLT)

Based on the Gauss-Markov assumptions, we can show that the OLS estimators are asymptotically normal.
The CLT states that the standardized average of any population with mean μY and variance σY² converges in distribution to N(0, 1), or

Z = (Ȳ − μY) / (σY / √n) →d N(0, 1)
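A quick simulation, not part of the slides, illustrates the CLT: sample means from a clearly skewed population (an exponential with mean 1 and standard deviation 1, an assumed choice) behave like a standard normal once standardized.

# Sketch: standardized means from a skewed population are roughly N(0, 1)
# for large n, i.e. Z = (Ybar - mu) / (sigma / sqrt(n)) ~ N(0, 1).
import numpy as np

rng = np.random.default_rng(2)
mu_y, sigma_y = 1.0, 1.0                      # exponential(1): mean 1, sd 1
n, reps = 500, 10_000

samples = rng.exponential(scale=1.0, size=(reps, n))
z = (samples.mean(axis=1) - mu_y) / (sigma_y / np.sqrt(n))
print(z.mean(), z.std())                      # close to 0 and 1
print(np.mean(np.abs(z) > 1.96))              # close to 0.05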

Asymptotic Normality 1

OLSE in the simple regression case.
Under the Gauss-Markov assumptions,

(i) √n (β̂1 − β1) ~ᵃ Normal(0, σ² / Var(x)),
(ii) σ̂² is a consistent estimator of σ²,
(iii) (β̂1 − β1) / se(β̂1) ~ᵃ Normal(0, 1)


Asymptotic Normality 2

OLSE in the multiple regression case.
Under the Gauss-Markov assumptions,

(i) √n (β̂j − βj) ~ᵃ Normal(0, σ² / a²j), where a²j = plim(n⁻¹ Σ r̂²ij) and the r̂ij are the residuals from regressing xj on the other regressors,
(ii) σ̂² is a consistent estimator of σ²,
(iii) (β̂j − βj) / se(β̂j) ~ᵃ Normal(0, 1)
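The practical content of (iii) is that the usual standardized estimate is approximately N(0, 1) even when u is not normal. The sketch below checks this by simulation for the simple regression case; the parameter values and the skewed (centered exponential) error distribution are assumptions made purely for the illustration.

# Sketch (hypothetical setup): with skewed errors, the standardized OLS slope
# (beta1_hat - beta1) / se(beta1_hat) still behaves like N(0, 1) for large n.
import numpy as np

rng = np.random.default_rng(3)
beta0, beta1, n, reps = 1.0, 0.5, 300, 5000
t_ratios = []

for _ in range(reps):
    x1 = rng.normal(size=n)
    u = rng.exponential(scale=1.0, size=n) - 1.0   # skewed, but E(u) = 0
    y = beta0 + beta1 * x1 + u
    X = np.column_stack([np.ones(n), x1])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    se_b1 = np.sqrt(resid @ resid / (n - 2) / np.sum((x1 - x1.mean()) ** 2))
    t_ratios.append((coef[1] - beta1) / se_b1)

t_ratios = np.array(t_ratios)
print(t_ratios.mean(), t_ratios.std())             # roughly 0 and 1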


Cont. Asymptotic Normality 2

Because the t distribution approaches the normal distribution for large df, we can also say that

(β̂j − βj) / se(β̂j) ~ᵃ t(n−k−1)   (5.8)

Note that while we no longer need to assume normality with a large sample, we do still need homoskedasticity.


Asymptotic Standard Errors

If u is not normally distributed, we sometimes will refer to the standard error as an asymptotic standard error, since

se(β̂j) = √[ σ̂² / (SSTj (1 − R²j)) ]   (5.9)'

se(β̂j) ≈ cj / √n   (5.10)

So, we can expect standard errors to shrink at a rate proportional to the inverse of the square root of n.
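Equation (5.10) is easy to see in simulated data. In the sketch below (assumed parameters and normal errors), se(β̂1) is computed as in (5.9)' for a few sample sizes; each fourfold increase in n roughly halves the standard error, consistent with the 1/√n rate.

# Sketch (simulated data): se(beta1_hat) shrinks at roughly the 1/sqrt(n)
# rate of eq. (5.10): quadrupling n about halves it.
import numpy as np

rng = np.random.default_rng(4)
beta0, beta1 = 1.0, 0.5

for n in (400, 1600, 6400):
    x1 = rng.normal(size=n)
    u = rng.normal(size=n)
    y = beta0 + beta1 * x1 + u
    X = np.column_stack([np.ones(n), x1])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    sigma2_hat = resid @ resid / (n - 2)               # consistent for sigma^2
    se_b1 = np.sqrt(sigma2_hat / np.sum((x1 - x1.mean()) ** 2))
    print(n, se_b1)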

Large Sample Tests: LM statistic

Once we are using large samples and relying on asymptotic normality for inference, we can use more than t and F stats.
The Lagrange multiplier (LM) statistic is an alternative for testing multiple exclusion restrictions.
Because the LM statistic uses an auxiliary regression, it's sometimes called an nR² stat.


Cont. LM Statistic

Suppose we have a standard model,

y = β0 + β1x1 + β2x2 + . . . + βkxk + u

and our null hypothesis is

H0: βk−q+1 = 0, ..., βk = 0

First, we just run the restricted model

y = β̃0 + β̃1x1 + ... + β̃k−q xk−q + ũ   (5.13)

Now take the residuals, ũ, and regress

ũ on x1, x2, ..., xk (i.e. all the variables)   (5.14)

LM = nR²u, where R²u is the R-squared from this auxiliary regression.
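The nR² recipe in (5.13)-(5.14) is straightforward to code. The sketch below uses simulated data and a hypothetical exclusion of two regressors (q = 2); it runs the restricted regression, regresses its residuals on all the regressors, forms LM = nR²u, and compares it with the χ²q distribution discussed on the next slide.

# Sketch (hypothetical data): LM test of H0: beta2 = beta3 = 0 via steps
# (5.13)-(5.14); LM = n * R^2 from the auxiliary residual regression.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 400
x1, x2, x3 = rng.normal(size=(3, n))
y = 1.0 + 0.5 * x1 + rng.normal(size=n)          # x2, x3 truly irrelevant here

# Step 1: restricted model (drop x2, x3) and its residuals u_tilde
Xr = np.column_stack([np.ones(n), x1])
u_tilde = y - Xr @ np.linalg.lstsq(Xr, y, rcond=None)[0]

# Step 2: auxiliary regression of u_tilde on ALL regressors; keep its R^2
Xf = np.column_stack([np.ones(n), x1, x2, x3])
fitted = Xf @ np.linalg.lstsq(Xf, u_tilde, rcond=None)[0]
ss_res = np.sum((u_tilde - fitted) ** 2)
ss_tot = np.sum((u_tilde - u_tilde.mean()) ** 2)
r2_u = 1.0 - ss_res / ss_tot

# Step 3: LM = n * R^2_u, compared with chi-square with q = 2 df
LM = n * r2_u
print(LM, stats.chi2.sf(LM, df=2))               # LM statistic and its p-value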

Cont. LM Statistic

LM ~ᵃ χ²q, so we can choose a critical value, c, from a χ²q distribution, or just calculate a p-value for the LM stat.
- With a large sample, the results from an F test and from an LM test should be similar.
- Unlike the F test and t test for one exclusion restriction, the LM test and F test will not be identical.


5.3 Asymptotic Efficiency of OLSE

Estimators besides OLS will be consistent. However, under the Gauss-Markov assumptions, the OLS estimators will have the smallest asymptotic variances.
- We say that OLS is asymptotically efficient.
- It is important to remember our assumptions, though: if the errors are not homoskedastic, this result no longer holds.

