
Today we are going to review the following topics:

Idea of OLS estimation


Properties of estimators: unbiasedness, efficiency, and consistency.

1 Idea of OLS estimation


1.1 Linear Population Model
Let the true population model be
$$y_i = \alpha + \beta x_i + u_i, \qquad E[u_i \mid x_i] = 0.$$

For instance, think of

$y_i$: score on the midterm of student $i$
$x_i$: time spent preparing for the midterm by student $i$
$u_i$: IQ of student $i$, age of student $i$, etc., which are unobservable

Role of the assumption: $E[u_i \mid x_i] = 0$ implies that $\mathrm{Cov}(u_i, x_i) = 0$.
We assume that this holds and proceed to estimate the true parameters $\alpha$ and $\beta$.
(Q) Is this assumption innocuous? What if students with high IQ are more likely to spend their time preparing for the midterm?
Consequence of violating $E[u \mid x] = 0$: If $E[u \mid x] \neq 0$ in truth, then our OLS estimator will be biased, since we wrongly assume that $E[u \mid x] = 0$ when computing our estimates.
For instance, suppose that in truth students with higher IQ spend more time preparing for the midterm, so there is a positive correlation between $u$ and $x$. Then our OLS estimator (constructed under the assumption that $x$ and $u$ are uncorrelated) will overestimate the true relationship between $x$ and $y$; i.e., the OLS estimator is biased upward.
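The upward bias can be seen in a small simulation. This is a sketch with made-up numbers: IQ (unobserved, so it lives in $u$) is assumed to raise study time $x$, creating the positive correlation described above.

```python
import numpy as np

# Hypothetical data-generating process: IQ is unobserved, raises both
# study time x and the error u, so Cov(x, u) > 0.
rng = np.random.default_rng(0)
n = 100_000
alpha, beta = 1.0, 2.0               # assumed true parameters
iq = rng.normal(size=n)              # unobservable, part of u
x = 0.5 * iq + rng.normal(size=n)    # high-IQ students study more
u = iq + rng.normal(size=n)
y = alpha + beta * x + u

# OLS slope from the usual formula
beta_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
print(beta_hat)  # roughly 2.4: biased upward relative to the true beta = 2
```

Here the bias has a closed form, $\mathrm{Cov}(x,u)/V(x) = 0.5/1.25 = 0.4$, which matches what the simulation recovers.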

1.2 OLS estimation


We want to estimate the parameters $\alpha$ and $\beta$ by estimators $\hat\alpha$ and $\hat\beta$. How can one obtain such estimators?
The idea of the estimation method called OLS is to minimize the sum of squared errors:
$$(\hat\alpha, \hat\beta) = \operatorname*{argmin}_{a,\,b} \sum_{i=1}^{n} \hat u_i^2 = \operatorname*{argmin}_{a,\,b} \sum_{i=1}^{n} (y_i - a - b x_i)^2$$

Solving the two first-order conditions, you will get extremely familiar formulas:
$$\hat\beta = \frac{\sum_{i=1}^{n} (x_i - \bar x)(y_i - \bar y)}{\sum_{i=1}^{n} (x_i - \bar x)^2}, \qquad \hat\alpha = \bar y - \hat\beta \bar x,$$
which are just numbers.
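The closed-form formulas translate directly into code. A minimal sketch, with simulated data whose true values ($\alpha = 1$, $\beta = 2$) are assumptions chosen for illustration:

```python
import numpy as np

def ols(x, y):
    """OLS slope and intercept from the closed-form formulas."""
    x_bar, y_bar = x.mean(), y.mean()
    beta_hat = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
    alpha_hat = y_bar - beta_hat * x_bar
    return alpha_hat, beta_hat

# Simulated data where E[u|x] = 0 holds by construction.
rng = np.random.default_rng(1)
x = rng.normal(size=1_000)
y = 1.0 + 2.0 * x + rng.normal(size=1_000)   # true alpha = 1, beta = 2
alpha_hat, beta_hat = ols(x, y)
print(alpha_hat, beta_hat)  # close to (1, 2)
```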


Example

Suppose your model is
$$y_i = \alpha + u_i, \qquad E[u_i] = 0.$$
We want to estimate the parameter $\alpha$. The idea of OLS is to minimize the sum of squared errors: $\min_a \sum_{i=1}^{n} \hat u_i^2$, where $\hat u_i = y_i - a$.

The first-order condition gives $\hat\alpha$:
$$\frac{d}{da} \sum_i (y_i - a)^2 = -2 \sum_i (y_i - a) = 0.$$
Therefore $\hat\alpha = \sum_i y_i / n$, or equivalently $\hat\alpha = \bar y$.
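The claim that the sample mean minimizes the sum of squared errors can be checked numerically on a toy sample (the data values below are arbitrary):

```python
import numpy as np

# Toy sample; its mean is 4.0.
y = np.array([2.0, 4.0, 6.0])
ssr = lambda a: np.sum((y - a) ** 2)   # sum of squared errors at candidate a

# Evaluate the SSR on a grid and find its minimizer.
grid = np.linspace(0.0, 8.0, 33)       # step 0.25, so 4.0 is on the grid
a_star = grid[np.argmin([ssr(a) for a in grid])]
print(a_star, y.mean())  # both 4.0
```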

2 Properties of Estimators
Let $\hat\theta$ be an estimator for the parameter $\theta$.

Unbiasedness: $\hat\theta$ is an unbiased estimator if $E[\hat\theta] = \theta$.

Efficiency: Suppose there are two estimators $\hat\theta_1$ and $\hat\theta_2$ for the parameter $\theta$. We would prefer an estimator with small variance. $\hat\theta_1$ is said to be more efficient than $\hat\theta_2$ if $V(\hat\theta_1) < V(\hat\theta_2)$.

Consistency: Theorem: $\hat\theta$ is a consistent estimator for $\theta$ if it is unbiased and $V(\hat\theta) \to 0$ as $n \to \infty$.

Example: We have shown that $\hat\alpha = \bar y$ in the regression model $y_i = \alpha + u_i$. It is easy to show that $\hat\alpha$ is unbiased and that $V(\hat\alpha) \to 0$.
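Both properties can be illustrated with a Monte Carlo sketch: draw many samples, compute $\hat\alpha = \bar y$ in each, and look at the mean and variance of the estimates as $n$ grows (the true $\alpha = 3$ is an assumed value for the simulation):

```python
import numpy as np

# Monte Carlo check of unbiasedness and consistency of alpha_hat = y_bar
# in the model y_i = alpha + u_i, with alpha = 3 assumed.
rng = np.random.default_rng(2)
alpha, reps = 3.0, 5_000

for n in (10, 100, 1_000):
    y = alpha + rng.normal(size=(reps, n))   # u_i ~ N(0, 1), so E[u_i] = 0
    alpha_hats = y.mean(axis=1)              # one estimate per simulated sample
    print(n, alpha_hats.mean(), alpha_hats.var())
# The average of alpha_hat stays near 3 for every n (unbiasedness), while
# its variance shrinks like 1/n, consistent with V(alpha_hat) -> 0.
```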
Example: Review problem 2
Example: Review problem 4
