Econometrics I: Nicolás Corona Juárez, Ph.D. 4.11.2020
The variance of $u_i$ is assumed to be constant (homoscedasticity).
Heteroscedasticity
$Y_i = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i} + u_i$
• Heteroscedasticity: What are its consequences?
• Heteroscedasticity: How do we detect it?
• Heteroscedasticity: How do we correct it?
Heteroscedasticity
[Figure: savings ($Y$) plotted against income ($X$) around the regression line $\beta_1 + \beta_2 X_i$. The variance of $u_i$, that is $\sigma_i^2$, increases with the income level: higher income brings more savings and more possibilities to make use of those savings.]
The problem of heteroscedasticity is more common in cross-sectional data than in time-series data.
• Research question: Is the wage per worker explained by the number of employees?
$Y_i = \beta_1 + \beta_2 X_i + u_i$
OLS estimation under heteroscedasticity
OLS estimation under heteroscedasticity
What happens to the OLS estimators and their variances if there is heteroscedasticity but all other OLS assumptions hold?

Under homoscedasticity: $\operatorname{Var}(\hat\beta_2) = \dfrac{\sigma^2}{\sum x_i^2}$

Under heteroscedasticity: $\operatorname{Var}(\hat\beta_2) = \dfrac{\sum x_i^2 \sigma_i^2}{\left(\sum x_i^2\right)^2}$
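As a sketch of this result, the following numpy simulation (hypothetical data, not from the lecture) checks that the heteroscedasticity-correct formula $\sum x_i^2\sigma_i^2 / (\sum x_i^2)^2$ matches the Monte Carlo sampling variance of $\hat\beta_2$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = rng.uniform(1.0, 10.0, n)
x = X - X.mean()                 # deviations from the mean
sigma_i = 0.5 * X                # error std grows with X: heteroscedastic

# Correct sampling variance of beta2-hat under heteroscedasticity
var_correct = np.sum(x**2 * sigma_i**2) / np.sum(x**2)**2

# Monte Carlo: re-estimate beta2 many times and take its empirical variance
beta1, beta2 = 1.0, 2.0
reps = 5000
b2 = np.empty(reps)
for r in range(reps):
    u = rng.normal(0.0, sigma_i)             # heteroscedastic errors
    Y = beta1 + beta2 * X + u
    b2[r] = np.sum(x * (Y - Y.mean())) / np.sum(x**2)   # OLS slope

print(var_correct, b2.var())     # the two figures should be close
```

The empirical variance of the simulated slopes agrees with the corrected formula, while the homoscedastic formula $\sigma^2/\sum x_i^2$ would not apply here because there is no single $\sigma^2$.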
OLS estimation under heteroscedasticity
With heteroscedasticity, the OLS estimators remain linear and unbiased, but they are no longer efficient (they do not have minimum variance), and the usual OLS standard errors are biased, so the conventional t and F tests can be misleading.
Generalized Least Squares (GLS)
$Y_i = \beta_1 X_{0i} + \beta_2 X_i + u_i$  (Equation 1)

where $X_{0i} = 1$ for each $i$. Dividing through by the known $\sigma_i$:

$\dfrac{Y_i}{\sigma_i} = \beta_1 \dfrac{X_{0i}}{\sigma_i} + \beta_2 \dfrac{X_i}{\sigma_i} + \dfrac{u_i}{\sigma_i}$  (Equation 2)
Generalized Least Squares (GLS)
$\operatorname{var}(u_i^*) = E(u_i^{*2}) = \dfrac{1}{\sigma_i^2} E(u_i^2)$  because $\sigma_i^2$ is known.

$\operatorname{var}(u_i^*) = E(u_i^{*2}) = \dfrac{\sigma_i^2}{\sigma_i^2} = 1$  a constant.

The variance of the transformed error is a constant.
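A quick numerical check of this transformation (simulated data, assumed known $\sigma_i$): dividing each heteroscedastic error by its own standard deviation yields errors with unit variance.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
sigma_i = rng.uniform(0.5, 3.0, n)   # known, observation-specific std devs
u = rng.normal(0.0, sigma_i)         # heteroscedastic errors
u_star = u / sigma_i                 # transformed errors u_i / sigma_i

print(u_star.var())                  # close to 1, as the derivation predicts
```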
Generalized Least Squares (GLS)
• $\hat\beta_1^*$ and $\hat\beta_2^*$ are the Generalized Least Squares (GLS) estimators.
• Note that the ordinary $\hat\beta_1$ and $\hat\beta_2$ are not BLUE (they are not efficient, i.e. they do not have minimum variance).
Generalized Least Squares (GLS): How do we estimate 𝜷∗𝒌 ?
$\sum \hat u_i^{*2} = \sum \left( Y_i^* - \hat\beta_1^* X_{0i}^* - \hat\beta_2^* X_i^* \right)^2$

This is the same as:

$\sum \left( \dfrac{\hat u_i}{\sigma_i} \right)^2 = \sum \left( \dfrac{Y_i}{\sigma_i} - \hat\beta_1^* \dfrac{X_{0i}}{\sigma_i} - \hat\beta_2^* \dfrac{X_i}{\sigma_i} \right)^2$
Generalized Least Squares (GLS): How do we estimate 𝜷∗𝒌 ?
This means that GLS minimizes the weighted sum of squared residuals:

$\sum w_i \hat u_i^2 = \sum w_i \left( Y_i - \hat\beta_1^* - \hat\beta_2^* X_i \right)^2$

with weights $w_i = \dfrac{1}{\sigma_i^2}$.
Generalized Least Squares (GLS): How do we estimate 𝜷∗𝒌 ?
$\dfrac{\partial \sum w_i \hat u_i^2}{\partial \hat\beta_1^*} = 2 \sum w_i \left( Y_i - \hat\beta_1^* - \hat\beta_2^* X_i \right)(-1)$

$\dfrac{\partial \sum w_i \hat u_i^2}{\partial \hat\beta_2^*} = 2 \sum w_i \left( Y_i - \hat\beta_1^* - \hat\beta_2^* X_i \right)(-X_i)$
Generalized Least Squares (GLS): How do we estimate 𝜷∗𝒌 ?
Setting the derivative with respect to $\hat\beta_1^*$ to zero:

$2 \sum w_i \left( Y_i - \hat\beta_1^* - \hat\beta_2^* X_i \right)(-1) = 0$
Generalized Least Squares (GLS): How do we estimate 𝜷∗𝒌 ?
$\sum w_i Y_i = \hat\beta_1^* \sum w_i + \hat\beta_2^* \sum w_i X_i$
Generalized Least Squares (GLS): How do we estimate 𝜷∗𝒌 ?
$\hat\beta_1^* = \bar Y^* - \hat\beta_2^* \bar X^*$

$\hat\beta_2^* = \dfrac{\sum w_i \sum w_i X_i Y_i - \sum w_i X_i \sum w_i Y_i}{\sum w_i \sum w_i X_i^2 - \left( \sum w_i X_i \right)^2}$

where $\bar Y^* = \sum w_i Y_i / \sum w_i$ and $\bar X^* = \sum w_i X_i / \sum w_i$ are the weighted means.
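These closed-form GLS/WLS estimators can be computed directly. A minimal numpy sketch, on simulated data with $\sigma_i$ assumed known (true parameters $\beta_1 = 1$, $\beta_2 = 2$ are my own illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
X = rng.uniform(1.0, 10.0, n)
sigma_i = 0.3 * X                        # assumed-known error std devs
w = 1.0 / sigma_i**2                     # GLS weights w_i = 1 / sigma_i^2
Y = 1.0 + 2.0 * X + rng.normal(0.0, sigma_i)

# Closed-form GLS slope from the formula above
den = np.sum(w) * np.sum(w * X**2) - np.sum(w * X)**2
b2_star = (np.sum(w) * np.sum(w * X * Y) - np.sum(w * X) * np.sum(w * Y)) / den

# Intercept via the weighted means
Ybar_w = np.sum(w * Y) / np.sum(w)
Xbar_w = np.sum(w * X) / np.sum(w)
b1_star = Ybar_w - b2_star * Xbar_w

# Variance of the GLS slope
var_b2_star = np.sum(w) / den

print(b1_star, b2_star)   # should be close to the true values 1 and 2
```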
Generalized Least Squares (GLS): How do we estimate 𝜷∗𝒌 ?
$\operatorname{var}(\hat\beta_2^*) = \dfrac{\sum w_i}{\sum w_i \sum w_i X_i^2 - \left( \sum w_i X_i \right)^2}$

where $w_i = \dfrac{1}{\sigma_i^2}$.
OLS and GLS: Difference
In OLS we minimize the sum of squared residuals, with every observation given the same weight:

$\sum \hat u_i^2 = \sum \left( Y_i - \hat\beta_1 - \hat\beta_2 X_i \right)^2$
OLS and GLS: Difference
In OLS each $\hat u_i^2$ belonging to the points A, B and C is given the same weight when minimizing the SSR. In GLS the observation C will be given a lower weight than A and B when minimizing the SSR.
OLS and GLS: Difference
Note that this is a weighted RSS; the procedure is therefore also known as Weighted Least Squares (WLS).
OLS and Heteroscedasticity: What happens if we estimate the model?
How to detect the presence of Heteroscedasticity?
How to detect Heteroscedasticity: Informal methods
• Plot the estimated residuals vs the fitted values or the estimated residuals vs
some of the control variables in the model.
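The informal check described above can be carried out numerically as well as visually. A sketch on simulated data (my own example, not the lecture's): fit OLS, then see whether the squared residuals trend with the fitted values.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
X = rng.uniform(1.0, 10.0, n)
Y = 1.0 + 2.0 * X + rng.normal(0.0, 0.5 * X)   # error variance grows with X

# Bivariate OLS fit
x = X - X.mean()
b2 = np.sum(x * (Y - Y.mean())) / np.sum(x**2)
b1 = Y.mean() - b2 * X.mean()
resid = Y - b1 - b2 * X
fitted = b1 + b2 * X

# Informal check: do the squared residuals trend with the fitted values?
corr = np.corrcoef(fitted, resid**2)[0, 1]
print(corr)   # clearly positive here, suggesting heteroscedasticity
```

In practice one would plot `resid**2` against `fitted` (e.g. with matplotlib) and look for a systematic fan or funnel shape.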
Plot $\hat u^2$ vs. $\hat Y$

In panel (a) there is no systematic pattern.
Plot $\hat u^2$ vs. $X$

In panel (a) there is no systematic pattern.
How to detect Heteroscedasticity: Formal methods
Park test
Consider the model:
𝑌𝑖 = 𝛽1 + 𝛽2 𝑋𝑖 + 𝑢𝑖
How to detect Heteroscedasticity: Formal methods
Park (1966) suggests that $\sigma_i^2$ is some function of the control variable $X_i$:

$\sigma_i^2 = \sigma^2 X_i^{\beta} e^{v_i}$

or, in logs,

$\ln \sigma_i^2 = \ln \sigma^2 + \beta \ln X_i + v_i$

Since $\sigma_i^2$ is not observed, Park proposes using $\hat u_i^2$ in its place:

$\ln \hat u_i^2 = \alpha + \beta \ln X_i + v_i$
How to detect Heteroscedasticity: Formal methods
𝑌𝑖 = 𝛽1 + 𝛽2 𝑋𝑖 + 𝑢𝑖
How to detect Heteroscedasticity: Formal methods
$\ln \hat u_i^2 = \beta_1 + \beta_2 X_i + v_i$

We want to test $H_0\colon \beta_2 = 0$. If $\beta_2$ turns out to be statistically significant, there is evidence of heteroscedasticity.
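A minimal implementation of the Park test in numpy, on simulated data where $\sigma_i \propto X_i$ (so the true log-variance slope is 2); this is an illustrative sketch, not the lecture's dataset:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 400
X = rng.uniform(1.0, 10.0, n)
Y = 1.0 + 2.0 * X + rng.normal(0.0, 0.4 * X)   # sigma_i proportional to X

def ols(x, y):
    """Bivariate OLS: returns intercept, slope, slope std error, residuals."""
    xc = x - x.mean()
    b = np.sum(xc * (y - y.mean())) / np.sum(xc**2)
    a = y.mean() - b * x.mean()
    resid = y - a - b * x
    s2 = np.sum(resid**2) / (len(y) - 2)
    se_b = np.sqrt(s2 / np.sum(xc**2))
    return a, b, se_b, resid

# Step 1: OLS on the original model, keep the residuals
_, _, _, u_hat = ols(X, Y)

# Step 2: Park auxiliary regression of ln(u_hat^2) on ln(X)
_, beta, se, _ = ols(np.log(X), np.log(u_hat**2))
t_stat = beta / se
print(beta, t_stat)   # a large t-statistic points to heteroscedasticity
```

Here the log-log form from the previous slide is used; the test decision is the usual t-test on the auxiliary slope.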
Breusch-Pagan-Godfrey Test
𝑌𝑖 = 𝛽1 + 𝛽2 𝑋2𝑖 + ⋯ + 𝛽𝑘 𝑋𝑘𝑖 + 𝑢𝑖
Breusch-Pagan-Godfrey Test
Assume the error variance depends on variables $Z_{2i}, \dots, Z_{mi}$ (which may be some or all of the $X$'s): $\sigma_i^2 = f(\alpha_1 + \alpha_2 Z_{2i} + \dots + \alpha_m Z_{mi})$. The null hypothesis of homoscedasticity is

$H_0\colon \alpha_2 = \alpha_3 = \dots = \alpha_m = 0$
Breusch-Pagan-Godfrey Test: Steps
Step 1. Estimate the model by OLS and obtain the residuals $\hat u_i$.

Step 2. Obtain $\tilde\sigma^2 = \dfrac{\sum \hat u_i^2}{n}$.

Step 3. Construct the variables $p_i$ defined as $p_i = \dfrac{\hat u_i^2}{\tilde\sigma^2}$.
Breusch-Pagan-Godfrey Test: Steps
Step 4. Regress $p_i$ on the $Z$'s and obtain the explained sum of squares (ESS). Then compute

$\Phi = \tfrac{1}{2}\,\mathrm{ESS}$

We have that, asymptotically,

$\Phi \overset{\text{asym}}{\sim} \chi^2_{m-1}$
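The steps above can be sketched in numpy. Simulated data (my own example) with one variance driver $Z = X$, so $m - 1 = 1$ degree of freedom:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 400
X = rng.uniform(1.0, 10.0, n)
Y = 1.0 + 2.0 * X + rng.normal(0.0, 0.4 * X)   # heteroscedastic errors

# Step 1: OLS residuals
x = X - X.mean()
b2 = np.sum(x * (Y - Y.mean())) / np.sum(x**2)
b1 = Y.mean() - b2 * X.mean()
u_hat = Y - b1 - b2 * X

# Step 2: maximum-likelihood variance estimate
sig2_tilde = np.sum(u_hat**2) / n

# Step 3: standardized squared residuals
p = u_hat**2 / sig2_tilde

# Step 4: regress p on Z (= X here) and take the explained sum of squares
c2 = np.sum(x * (p - p.mean())) / np.sum(x**2)
c1 = p.mean() - c2 * X.mean()
p_fit = c1 + c2 * X
ess = np.sum((p_fit - p.mean())**2)
phi = ess / 2.0        # asymptotically chi-square with m - 1 = 1 df

print(phi)             # compare with the 5% critical value chi2(1) = 3.84
```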
Breusch-Pagan-Godfrey Test: Steps
Step 5. If $\Phi_{\text{estimated}} > \Phi_{\text{critical}}$ at the chosen significance level, then reject $H_0\colon \alpha_2 = \alpha_3 = \dots = \alpha_m = 0$.
White Test
$Y_i = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i} + u_i$

Step 1. Estimate the model with OLS and obtain the residuals $\hat u_i$.

Step 2. Estimate the auxiliary regression on the levels, squares and cross product of the regressors:

$\hat u_i^2 = \alpha_1 + \alpha_2 X_{2i} + \alpha_3 X_{3i} + \alpha_4 X_{2i}^2 + \alpha_5 X_{3i}^2 + \alpha_6 X_{2i} X_{3i} + v_i$
White Test
$H_0\colon \alpha_2 = \alpha_3 = \dots = \alpha_6 = 0$

Under the null the auxiliary regression reduces to $\hat u_i^2 = \alpha_1$, i.e. the error variance is constant.
White Test
Step 3. Calculate:
$n R^2 \overset{\text{asym}}{\sim} \chi^2_{df}$

In this example $df = 5$, given that we have 5 control variables in the auxiliary regression.
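The White test statistic for the two-regressor model above can be computed directly with numpy (simulated data, my own illustrative example):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 500
X2 = rng.uniform(1.0, 10.0, n)
X3 = rng.uniform(1.0, 10.0, n)
u = rng.normal(0.0, 0.4 * X2)          # error variance depends on X2
Y = 1.0 + 2.0 * X2 + 0.5 * X3 + u

# Step 1: OLS residuals
Z = np.column_stack([np.ones(n), X2, X3])
beta, *_ = np.linalg.lstsq(Z, Y, rcond=None)
u_hat = Y - Z @ beta

# Step 2: auxiliary regression of u_hat^2 on levels, squares, cross product
A = np.column_stack([np.ones(n), X2, X3, X2**2, X3**2, X2 * X3])
gamma, *_ = np.linalg.lstsq(A, u_hat**2, rcond=None)
y_aux = u_hat**2
fit = A @ gamma
r2 = 1.0 - np.sum((y_aux - fit)**2) / np.sum((y_aux - y_aux.mean())**2)

# Step 3: nR^2 is asymptotically chi-square with 5 df here
stat = n * r2
print(stat)   # compare with the 5% critical value chi2(5) = 11.07
```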
White Test
Step 4. If $\chi^2_{\text{estimated}} > \chi^2_{\text{critical}}$ at the chosen significance level, then reject $H_0\colon \alpha_2 = \alpha_3 = \dots = \alpha_6 = 0$.
Heteroscedasticity: Corrective measures
Heteroscedasticity: Corrective measures
• This means that once the problem of heteroscedasticity has been corrected, the
significance of a variable can disappear.
Heteroscedasticity: Corrective measures
You can have a look at Appendix 11A.4 for a general description of the method. The standard errors corrected using the White method are called heteroscedasticity-consistent (or heteroscedasticity-robust) standard errors.
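A minimal sketch of White's correction in numpy (simulated data, HC0 variant of the sandwich estimator): the coefficient estimates are unchanged, only the standard errors are recomputed.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
X = rng.uniform(1.0, 10.0, n)
Y = 1.0 + 2.0 * X + rng.normal(0.0, 0.5 * X)   # heteroscedastic errors

Xmat = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(Xmat, Y, rcond=None)
u_hat = Y - Xmat @ beta

# Conventional OLS covariance: s^2 (X'X)^-1
XtX_inv = np.linalg.inv(Xmat.T @ Xmat)
s2 = np.sum(u_hat**2) / (n - 2)
se_ols = np.sqrt(np.diag(s2 * XtX_inv))

# White (HC0) sandwich: (X'X)^-1  X' diag(u_hat^2) X  (X'X)^-1
meat = Xmat.T @ (Xmat * u_hat[:, None]**2)
cov_hc0 = XtX_inv @ meat @ XtX_inv
se_hc0 = np.sqrt(np.diag(cov_hc0))

print(se_ols, se_hc0)   # the robust and conventional SEs differ here
```

In applied work one would typically call a library routine (e.g. a `cov_type="HC0"`-style option in a regression package) rather than build the sandwich by hand; the sketch shows what such an option computes.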