Advanced Econometrics I: Tesfaye Chofana (PhD)


ECSU

Advanced Econometrics I
Tesfaye Chofana (PhD)
March, 2020

Tesfaye C. (ECSU)
Chapter 3-2 outline
Autocorrelation
 Definition and concepts of AC
 Causes of AC
 Properties of OLS estimator
 Testing for autocorrelation
 Correcting autocorrelation
 Application using STATA

Definition and concepts
Suppose a linear regression model
𝑌𝑡 = 𝛽1 + 𝛽2 𝑋2𝑡 + ⋯ + 𝛽𝑘 𝑋𝑘𝑡 + 𝑢𝑡
where t is time and t= 1, 2, …, n
No autocorrelation
 cov(u_t, u_{t+s}) = 0 for s ≠ 0

Autocorrelation (AC) or serial correlation
 cov(u_t, u_{t+s}) ≠ 0 for some s ≠ 0

Non-autocorrelation
• Consider a model y = Xβ + u
• The assumptions of no AC and homoskedasticity are expressed by the following covariance matrix:
 cov(u) = σ²I_n
• The off-diagonal elements of this covariance matrix are all zero

Autocorrelation
The autocorrelation of order k is the ratio of the autocovariance at lag k to the variance of the random error:
 ρ_k = cov(u_t, u_{t−k}) / var(u_t)
In the presence of autocorrelation, the variance–covariance matrix of the errors is a full matrix: the diagonal elements are the variances var(ε_t), and the off-diagonal elements cov(ε_t, ε_s), t ≠ s, are nonzero.
Autocorrelation definition …
Autocorrelation is the persistence of the effects of excluded variables
For cross-sectional data:
 AC arises when the expenditure behavior of one household affects the expenditure of another household
For time series and panel data:
 AC exists when the error term at time t is correlated with the error term at any other time
Autocorrelation definition …
Disturbances occurring in one time period carry over into another period.
 The effect of a machine breakdown in one month may affect the current month's output and subsequent months' output
 If output is a function of K and L, a labor strike this quarter may affect output in the next quarter

Autocorrelation definition …
If by chance a correlation is observed in cross-sectional units, it is called spatial autocorrelation, that is, correlation in space rather than over time (serial correlation).
E.g., a consumption increase with income for one family may affect another family's consumption
Causes of autocorrelation
Misspecification of the model
 Omitted variables (or exclusion of relevant variables)
 y_t = β1 + β2X_2t + β3X_3t + β4X_4t + u_t (correct model)
 where y_t is quantity of beef demanded, X_2 price of beef, X_3 consumer income, X_4 price of pork, t time
 y_t = β1 + β2X_2t + β3X_3t + v_t (misspecified model), with v_t = β4X_4t + u_t
 The error or disturbance term v_t will reflect a systematic pattern, as the price of pork affects beef consumption

Causes of autocorrelation …
 Incorrect functional forms
  E.g., the correct specification of the production function is translog while Cobb-Douglas was specified
 An inadequate dynamic specification of the model, e.g., an inappropriate lag length
Lags:
 cons_t = β1 + β2 cons_{t−1} + β3 inc_{t−1} + u_t

Causes of autocorrelation …
 The reason for the lag is that consumers do not change their consumption habits readily, for psychological, technological and institutional reasons
 If we neglect the lagged term, the resulting error term will reflect a systematic pattern due to the influence of lagged consumption on current consumption.

Causes of autocorrelation …
Cobweb phenomenon: supply reacts to price with a lag of one time period
 Supply_t = β1 + β2P_{t−1} + u_t
 u_t is not random but systematic, as farmers plan and adjust this year's output given last year's output price
Inertia, or sluggishness: GNP, inflation, employment, etc. move in cycles
First order autocorrelation
First-order autocorrelation exists when the observed error tends to be influenced by the observed error that immediately precedes it, i.e., the error in the previous time period
This is the first-order autoregressive process, AR(1)
Suppose a model: y_t = βx_t + ε_t, t = 1, 2, …, n
First order autocorrelation
In this case the error term is assumed to depend upon its predecessor as follows:
 ε_t = ρε_{t−1} + v_t
where E(v_t) = 0, var(v_t) = σ_v², cov(x_t, v_t) = 0 and cov(v_t, v_s) = 0 for t ≠ s,
while cov(ε_t, ε_s) ≠ 0 for t ≠ s and var(ε_t) = σ_ε² ≠ σ_v²
First order autocorrelation

Alternatively, v_t is independent over time
v_t fulfills the Gauss-Markov assumptions (i.e., CLRM assumptions 2-5 in Chapter 2)
ε_t = ρε_{t−1} + v_t assumes that the value of the error term in any observation is equal to ρ times its value in the previous observation plus a fresh component v_t
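A minimal numpy sketch of this AR(1) error process; the values of ρ, σ_v and n are illustrative, not taken from the slides:

```python
import numpy as np

# Simulate eps_t = rho * eps_{t-1} + v_t with illustrative parameter values.
rng = np.random.default_rng(0)
rho, sigma_v, n = 0.7, 1.0, 200_000

v = rng.normal(0.0, sigma_v, size=n)   # fresh component v_t
eps = np.empty(n)
eps[0] = v[0]
for t in range(1, n):
    eps[t] = rho * eps[t - 1] + v[t]

# The sample lag-1 autocorrelation of eps should be close to rho.
r1 = np.corrcoef(eps[1:], eps[:-1])[0, 1]
print(r1)
```

In a large sample the lag-1 correlation of the simulated errors recovers ρ, which is exactly the relationship the slide describes.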

First order autocorrelation
where ρ is the autocorrelation coefficient,
ε_t is the error from a regression in the current time period, and
ε_{t−1} is the error from the preceding time period
If ρ = 0.25, then on average each error will tend to be one fourth of the value of the preceding error

First order autocorrelation
Let's derive the covariance matrix of ε_t
Assume |ρ| < 1
When |ρ| < 1 holds, the AR(1) process is stationary
 A stationary process is one in which the mean and variance of ε_t do not change over time, while the covariances of ε_t do not depend on the time at which they occur
 E.g., var(ε_t) = var(ε_{t−2})
Imposing stationarity, it easily follows that
 E(ε_t) = ρE(ε_{t−1}) + E(v_t) = 0 ⇒ E(ε_t) = 0

First order autocorrelation
var(ε_t) = var(ρε_{t−1} + v_t)
 = ρ²var(ε_{t−1}) + σ_v²
⇒ var(ε_t) − ρ²var(ε_{t−1}) = σ_v²
From the stationarity assumption, var(ε_t) = var(ε_{t−1})
⇒ var(ε_t) − ρ²var(ε_t) = σ_v²
⇒ var(ε_t)(1 − ρ²) = σ_v²
⇒ var(ε_t) = σ_v²/(1 − ρ²), the diagonal element
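The stationary-variance result above can be checked by simulation; the parameter values below are illustrative:

```python
import numpy as np

# Simulation check of var(eps_t) = sigma_v^2 / (1 - rho^2).
rng = np.random.default_rng(1)
rho, sigma_v, n = 0.5, 2.0, 400_000

v = rng.normal(0.0, sigma_v, size=n)
eps = np.empty(n)
eps[0] = v[0] / np.sqrt(1.0 - rho**2)  # start from the stationary distribution
for t in range(1, n):
    eps[t] = rho * eps[t - 1] + v[t]

theoretical = sigma_v**2 / (1.0 - rho**2)
print(eps.var(), theoretical)
```

The sample variance of the simulated errors matches σ_v²/(1 − ρ²) closely for a long series.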
First order autocorrelation
The off-diagonal elements of the variance-covariance matrix of ε are
cov(ε_t, ε_{t−1}) = E(ε_t ε_{t−1})
 = E[(ρε_{t−1} + v_t)ε_{t−1}]
 = E(ρε²_{t−1} + v_t ε_{t−1})
 = ρE(ε²_{t−1}) + E(v_t ε_{t−1})
 = ρ var(ε_{t−1})
⇒ cov(ε_t, ε_{t−1}) = ρσ_ε² = ρ σ_v²/(1 − ρ²)
First order autocorrelation
Similarly, the covariance between error terms 2 or 3 periods apart is
cov(ε_t, ε_{t−2}) = E(ε_t ε_{t−2}) = ρ²σ_ε²
cov(ε_t, ε_{t−3}) = E(ε_t ε_{t−3}) = ρ³σ_ε²
Or, in general,
cov(ε_t, ε_{t−s}) = E(ε_t ε_{t−s}) = ρ^s σ_ε²
Equivalently, we have
cov(ε_t, ε_s) = E(ε_t ε_s) = ρ^{|s−t|} σ_ε²
Thus, the relationship between disturbances depends on the value of the parameter ρ.
First order autocorrelation
cov(ε_t, ε_{t−s}) = ρ^s σ_v²/(1 − ρ²) shows that
All elements in the error vector ε are mutually correlated, with a covariance that decreases as the distance in time gets larger (i.e., as s gets large, since |ρ| < 1)
The covariance matrix of ε is thus a full matrix (a matrix without zero elements)

First order autocorrelation
From the above relation:
cov(ε_t, ε_{t−1}) = ρσ_ε²
⇒ ρ = cov(ε_t, ε_{t−1})/σ_ε²
⇒ ρ = cov(ε_t, ε_{t−1}) / √(var(ε_t) var(ε_{t−1})), as var(ε_t) = var(ε_{t−1}) = σ_ε²
ρ is the correlation coefficient between ε_t and ε_{t−1}
That is, the correlation coefficient is the ratio of covariance to variance.

First order autocorrelation
cor(ε_t, ε_{t−1}) = ρ, cor(ε_t, ε_{t−2}) = ρ², and so on
Covariance is a measure of association between two variables
cov(y, x) is a statistic that summarizes the association with a single number
cov(y, x) = (1/n) Σ_{i=1}^n (x_i − x̄)(y_i − ȳ) for two variables
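The sample covariance formula above can be written out directly; the data here are made up for illustration:

```python
import numpy as np

# Direct implementation of cov(y, x) = (1/n) * sum((x_i - xbar)(y_i - ybar)).
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 5.0, 9.0])

cov_xy = np.mean((x - x.mean()) * (y - y.mean()))
print(cov_xy)  # agrees with np.cov(x, y, bias=True)[0, 1]
```

Dividing by n (rather than n − 1) gives the population form used on the slide; numpy's `np.cov` reproduces it with `bias=True`.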

Interpretation of 𝜌
The correlation coefficient for two random variables x and y is
 corr = cov(x, y) / √(var(x) var(y)) = ρ_xy
where −1 ≤ ρ ≤ 1
 ρ = −1 ⇒ perfect negative association between x and y
 ρ = 1 ⇒ perfect positive association between x and y
 ρ = 0 ⇒ no association between x and y
Interpretation of 𝜌
The value of ρ must fall between −1 and 1.
If |ρ| > 1, each error tends to be larger than the one before it, a rare case
As the errors get larger and larger, the regression becomes unstable, i.e., explodes

Interpretation of 𝜌
If ρ = 1 or −1, the effect of one error on the next would not die out over time.
For this reason we require |ρ| < 1
If ρ = 0, then one error has nothing to do with the next, so there is no AC
If ρ is positive, the errors tend to have the same sign from one period to the next

Interpretation of 𝜌
If ρ is negative, the errors tend to alternate signs.
Under negative autocorrelation, a positive observed error term is usually followed by a negative one, which is usually followed by a positive one, and so on.

Positive versus negative AC
Fig.: Correlogram of the residuals against lag; bars below zero indicate negative autocorrelation
• For most lags the AC of the residuals is insignificant
• Lags 7, 8, and 17 have significant AC
Interpretation of 𝜌
Negative autocorrelation is less common than positive autocorrelation
If seasonal (quarterly) data are being used, the error e_t could depend on the error from the same season a year ago, e_{t−4}, instead of e_{t−1}.
The seasonal analogue of AR(1) is given by
 e_t = ρe_{t−4} + v_t

Properties of OLS estimators
OLS estimators under autocorrelation are unbiased and consistent
The simple linear model in deviation form is
 y_i = βx_i + ε_i, i = 1, 2, …, n
 ε_t = ρε_{t−1} + v_t, |ρ| < 1,
 v_t satisfies the Gauss-Markov assumptions
The OLS estimator of β is: β̂ = Σ_{i=1}^n x_i y_i / Σ_{i=1}^n x_i²

Properties of OLS estimators
⇒ β̂ = β + Σ_{i=1}^n x_i ε_i / Σ_{i=1}^n x_i²
E(β̂) = β + Σ_{i=1}^n x_i E(ρε_{t−1} + v_t) / Σ_{i=1}^n x_i²
⇒ E(β̂) = β
β̂ is still an unbiased estimator of β
Recall that the variance of the OLS estimator without autocorrelation is
 var(β̂) = σ_ε² / Σ_{i=1}^n x_i²

Properties of OLS estimators
Variance, AR(1)
Under the AR(1) scheme, the variance of the OLS estimator is:
var(β̂)_{AR(1)} = E(β̂ − β)² = E[(Σ x_t ε_t / Σ x_t²)²]
 = (1/(Σ x_t²)²) E[Σ x_t² ε_t² + 2 Σ_{t≠s} x_t ε_t x_s ε_s]
 = (1/(Σ x_t²)²) [Σ x_t² E(ε_t²) + 2 Σ_{t≠s} x_t x_s E(ε_t ε_s)]
But E(ε_t ε_s) = ρ^{|s−t|} σ_ε²

Properties of OLS estimators
var(β̂)_{AR(1)} = σ_ε² Σ x_t² / (Σ x_t²)² + 2σ_ε² Σ_{t≠s} ρ^{|s−t|} x_t x_s / (Σ x_t²)²
 = σ_ε² / Σ x_t² + 2σ_ε² Σ_{t≠s} ρ^{|s−t|} x_t x_s / (Σ x_t²)²
 = var(β̂) + 2σ_ε² Σ_{t≠s} ρ^{|s−t|} x_t x_s / (Σ x_t²)²

Properties of OLS estimators
Therefore, if ρ > 0 and x_t is positively correlated with x_s, the second term on the right-hand side is positive and we have var(β̂)_{AR(1)} > var(β̂)
Note: ρ > 0 together with Σ_{t≠s} x_t x_s > 0 is common in economic time series data
 ⇒ e.g., high consumption in period 1 leads to high consumption in period 2

Consequences of AC
OLS estimators are still linear and unbiased
OLS estimators are consistent (i.e., their variance approaches zero as the sample size gets larger and larger)
OLS estimators are no longer efficient, i.e., no longer minimum variance
The estimated variances of the OLS estimators are biased and, as a consequence, the conventional confidence intervals and tests of significance are not valid.

Consequences of AC
When AC is present, using OLS results in
 Variances of the regression coefficients being underestimated, leading to
  Narrow confidence intervals
  A high value of R² (i.e., R² = 1 − Σε̂_i² / Σy_i²)
  Inflated t ratios

Testing for AR(1) AC
Graphical method
Plot the estimated residuals against time
 ε̂_t = y_t − ŷ_t against time
If we see clustering of neighboring residuals on one or the other side of the line ε̂_t = 0, then such clustering is a sign that the errors are autocorrelated (see the next figure)

Graphical Method

T-test for AR(1)
Want to be able to test whether the errors are serially correlated or not
Want to test the null that ρ = 0 in ε_t = ρε_{t−1} + v_t, t = 2, …, n, where ε_t is the model error term and v_t fulfills the Gauss-Markov conditions
With strictly exogenous regressors, the test is very straightforward: simply regress the residuals on lagged residuals and use a t-test

Durbin-Watson test for AR(1)
An alternative is the Durbin-Watson (DW) statistic, which is calculated by many packages
The DW test statistic is computed as:
 d = Σ_{t=2}^T (ε̂_t − ε̂_{t−1})² / Σ_{t=1}^T ε̂_t²
We compare the test statistic with the Durbin-Watson lower d_L(α) and upper d_U(α) bounds (critical values).
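The statistic is easy to compute by hand; the residual series below are simulated for illustration, not taken from any model in the slides:

```python
import numpy as np

def durbin_watson(e):
    # d = sum_{t=2}^T (e_t - e_{t-1})^2 / sum_{t=1}^T e_t^2
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

rng = np.random.default_rng(3)
v = rng.normal(size=10_000)

e_indep = v                     # no autocorrelation: d should be near 2
e_pos = np.empty(v.size)        # strong positive AC (rho = 0.9): d well below 2
e_pos[0] = v[0]
for t in range(1, v.size):
    e_pos[t] = 0.9 * e_pos[t - 1] + v[t]

print(durbin_watson(e_indep), durbin_watson(e_pos))
```

Since d ≈ 2(1 − ρ̂), independent errors give d near 2, while strongly positively autocorrelated errors push d toward 0.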
Durbin-Watson test for AR(1)
Decision rule:
 Reject H0 (no positive AC) if d < d_L(α); do not reject H0 if d > d_U(α)
 The test is inconclusive if d_L(α) < d < d_U(α)

Limitations of DW test
There are critical regions where the test is inconclusive
The test is valid only when there is an intercept term in the model
The test is invalid when lagged values of the dependent variable appear as regressors
The test is valid for the AR(1) error scheme only.
Testing for AR(1) Serial
Correlation (continued)
If the regressors are not strictly exogenous, then neither the t-test nor the DW test will work
Regress the residual (or y) on the lagged residual and all of the x's
The inclusion of the x's allows each x_jt to be correlated with ε_{t−1}, so we don't need the assumption of strict exogeneity

Testing for Higher Order AC
Can test for AR(q) serial correlation in the same basic manner as AR(1)
Just include q lags of the residuals in the regression and test for joint significance
Can use an F test or an LM test, where the LM version is called a Breusch-Godfrey (BG) test and equals nR², using the R² from the residual regression
Can also test for seasonal forms
Breusch-Godfrey test steps
Assume that the error term follows the autoregressive scheme of order p, AR(p), given by:
 ε_t = ρ_1ε_{t−1} + ρ_2ε_{t−2} + ⋯ + ρ_pε_{t−p} + v_t
 v_t satisfies the Gauss-Markov conditions
Test the hypothesis H0: ρ_1 = ρ_2 = ⋯ = ρ_p = 0

Breusch-Godfrey test steps
1) Estimate the model Y_t = α + βX_t + ε_t, t = 1, 2, …, T
2) Predict the residuals, ε̂_t, from step 1
3) Regress ε̂_t on X_t, ε̂_{t−1}, ε̂_{t−2}, …, ε̂_{t−p}
 ⇒ ε̂_t = α + βX_t + ρ_1ε̂_{t−1} + ρ_2ε̂_{t−2} + ⋯ + ρ_pε̂_{t−p} + v_t

Breusch-Godfrey test steps
4) Obtain R², the coefficient of determination, from the auxiliary regression
5) If the sample size T is large, Breusch and Godfrey have shown that (T − p)R² follows the chi-square (χ²) distribution with p degrees of freedom.

Breusch-Godfrey test steps
Decision rule:
 Reject the null hypothesis of no AC if
(𝑇 − 𝑝)𝑅2 exceeds the critical value from the
𝜒 2 distribution with p degrees of freedom for a
given level of significance, 𝛼.
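The BG steps above can be sketched on simulated data; the data-generating values (ρ = 0.6, β = 0.5, T = 500) are illustrative, and the χ² critical value for p = 2 at α = 0.05 (5.991) is taken from standard tables:

```python
import numpy as np

rng = np.random.default_rng(4)
T, p = 500, 2

# Simulate y = 1 + 0.5*x + eps with AR(1) errors, so H0 should be rejected.
x = rng.normal(size=T)
v = rng.normal(size=T)
eps = np.empty(T)
eps[0] = v[0]
for t in range(1, T):
    eps[t] = 0.6 * eps[t - 1] + v[t]
y = 1.0 + 0.5 * x + eps

def ols(X, y):
    """OLS residuals and R^2; X must include a constant column."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    r2 = 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
    return resid, r2

# Steps 1-2: estimate the model and predict the residuals.
ehat, _ = ols(np.column_stack([np.ones(T), x]), y)

# Step 3: auxiliary regression of e_t on X_t and p lagged residuals (t = p+1, ..., T).
lags = np.column_stack([ehat[p - j : T - j] for j in range(1, p + 1)])
_, r2_aux = ols(np.column_stack([np.ones(T - p), x[p:], lags]), ehat[p:])

# Steps 4-5: (T - p) * R^2 is chi-square with p df under H0 of no AC.
lm = (T - p) * r2_aux
print(lm, lm > 5.991)
```

With strongly autocorrelated errors, the LM statistic far exceeds the 5% critical value and the null of no AC is rejected.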

Advantages of the BG test
a) The test is always conclusive.
b) The test is valid when lagged values of
the dependent variable appear as
regressors.
c) The test is valid for higher order AR
schemes (not just for AR(1) error scheme).

PACF test
Test based on the partial autocorrelation
function (PACF) of OLS residuals
We plot the PACF of the OLS residuals.
If the function at lag one extends beyond the
95% upper or lower confidence limits, then
this is an indication that the errors follow the
AR(1) process.
Higher order error processes can be detected
similarly.
PACF test
Fig.: Correlogram for residuals from AR(1)
Conclusion:
• The first three autocorrelations are significant
• The errors are serially correlated
• More lags are needed to improve the forecasting specification
• The least squares standard errors given are invalid
Correcting for AC
Assume the errors follow AR(1), so ε_t = ρε_{t−1} + v_t, t = 2, …, n, and |ρ| < 1
var(ε_t) = σ_v²/(1 − ρ²)
where v_t satisfies the Gauss-Markov conditions
We need to transform the equation so that we have no serial correlation (AC) in the errors, i.e., v_t = ε_t − ρε_{t−1}
Correcting for AC (continued)
Consider that since y_t = b0 + b1x_t + ε_t, then y_{t−1} = b0 + b1x_{t−1} + ε_{t−1}
If you multiply the second equation by ρ and subtract it from the first, you get
 y_t − ρy_{t−1} = (1 − ρ)b0 + b1(x_t − ρx_{t−1}) + v_t, since v_t = ε_t − ρε_{t−1}
Or, in the other notation,
 Y_t − ρY_{t−1} = α(1 − ρ) + β(X_t − ρX_{t−1}) + (ε_t − ρε_{t−1})
 i.e., Y_t* = α* + βX_t* + v_t

Correcting for AC (continued)
Y_t* = α* + βX_t* + v_t
This quasi-differencing results in a model without AC
v_t = ε_t − ρε_{t−1} satisfies the Gauss-Markov conditions
We can apply OLS to Y_t* = α* + βX_t* + v_t to get estimators which are BLUE.
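A minimal sketch of the quasi-differencing transform, using a known ρ (in practice ρ must be estimated, as the feasible GLS discussion explains); all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
T, rho, alpha, beta = 400, 0.7, 2.0, 1.5

# Simulate y = alpha + beta*x + eps with AR(1) errors.
x = rng.normal(size=T)
v = rng.normal(size=T)
eps = np.empty(T)
eps[0] = v[0]
for t in range(1, T):
    eps[t] = rho * eps[t - 1] + v[t]
y = alpha + beta * x + eps

# Quasi-difference: Y*_t = Y_t - rho*Y_{t-1}, X*_t = X_t - rho*X_{t-1}, t = 2, ..., T
y_star = y[1:] - rho * y[:-1]
x_star = x[1:] - rho * x[:-1]

# OLS on the transformed model; the intercept estimates alpha*(1 - rho).
X = np.column_stack([np.ones(T - 1), x_star])
coef, *_ = np.linalg.lstsq(X, y_star, rcond=None)
alpha_hat, beta_hat = coef[0] / (1.0 - rho), coef[1]
print(alpha_hat, beta_hat)
```

OLS on the transformed variables recovers β directly and α after dividing the estimated intercept by (1 − ρ).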

Feasible GLS Estimation
The problem with this method is that we don't know ρ, so we need to get an estimate first
a) From OLS residuals
Can just use the estimate obtained from regressing residuals on lagged residuals:
 ε̂_t = ρε̂_{t−1} + v_t, |ρ| < 1
 v_t satisfies the Gauss-Markov conditions

Feasible GLS Estimation
b) From Durbin's method
Run the following regression:
 Y_t = α + ρY_{t−1} + β_0X_t + β_1X_{t−1} + β_2X_{t−2} + ⋯ + β_pX_{t−p} + v_t
An estimator of ρ is the estimated coefficient on Y_{t−1}

Feasible GLS Estimation
The resulting OLS estimator of ρ is given by
 ρ̂ = (ε̂_{−1}′ ε̂_{−1})⁻¹ ε̂_{−1}′ ε̂
where ε̂_{−1} is the vector of lagged residuals
Depending on how we deal with the first observation, this is either called Cochrane-Orcutt or Prais-Winsten estimation

Feasible GLS (continued)
Often both Cochrane-Orcutt and Prais-
Winsten are implemented iteratively
This basic method can be extended to allow
for higher order serial correlation, AR(q)
Most statistical packages will automatically
allow for estimation of AR models without
having to do the quasi-differencing by hand

