Introduction to Econometrics, 5th Edition: Chapter 3: Multiple Regression Analysis

Dougherty

Introduction to Econometrics,
5th edition
Chapter 3: Multiple Regression
Analysis

© Christopher Dougherty, 2016. All rights reserved.


MULTIPLE REGRESSION WITH TWO EXPLANATORY VARIABLES: EXAMPLE

EARNINGS = b1 + b2S + b3EXP + u

[Figure: three-dimensional diagram with EARNINGS on the vertical axis and S and EXP on the horizontal axes; the intercept b1 is marked]

This sequence provides a geometrical interpretation of a multiple regression model with two explanatory variables.

Specifically, we will look at an earnings function model where hourly earnings, EARNINGS,
depend on years of schooling (highest grade completed), S, and years of work experience,
EXP.

The model has three dimensions, one each for EARNINGS, S, and EXP. The starting point
for investigating the determination of EARNINGS is the intercept, b1.

Literally the intercept gives EARNINGS for those respondents who have no schooling and
no work experience. However, there were no respondents with less than 6 years of
schooling. Hence a literal interpretation of b1 would be unwise.

The next term on the right side of the equation gives the effect of variations in S. A one
year increase in S causes EARNINGS to increase by b2 dollars, holding EXP constant.

Similarly, the third term gives the effect of variations in EXP. A one year increase in EXP
causes earnings to increase by b3 dollars, holding S constant.

Different combinations of S and EXP give rise to values of EARNINGS which lie on the
plane shown in the diagram, defined by the equation EARNINGS = b1 + b2S + b3EXP. This is
the nonstochastic (nonrandom) component of the model.

The final element of the model is the disturbance term, u. This causes the actual values of
EARNINGS to deviate from the plane. In this observation, u happens to have a positive
value.

A sample consists of a number of observations generated in this way. Note that the
interpretation of the model does not depend on whether S and EXP are correlated or not.

However we do assume that the effects of S and EXP on EARNINGS are additive. The
impact of a difference in S on EARNINGS is not affected by the value of EXP, or vice versa.

True model:   Y = β1 + β2X2 + β3X3 + u
Fitted model: Ŷ = b1 + b2X2 + b3X3

The regression coefficients are derived using the same least squares principle used in
simple regression analysis. The fitted value of Y in observation i depends on our choice of
b1, b2, and b3.

ûi = Yi − Ŷi = Yi − b1 − b2X2i − b3X3i

The residual ûi in observation i is the difference between the actual and fitted values of Y.

RSS = Σ ûi² = Σ (Yi − b1 − b2X2i − b3X3i)²

We define RSS, the sum of the squares of the residuals, and choose b1, b2, and b3 so as to
minimize it.
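As a concrete illustration, RSS can be computed directly from its definition. The numbers below are made up for the sketch; they are not from the textbook's data set.

```python
import numpy as np

# Hypothetical data: not the textbook's Data Set 21.
Y  = np.array([10.0, 12.0, 15.0, 9.0, 20.0])   # dependent variable
X2 = np.array([10.0, 12.0, 14.0, 9.0, 16.0])   # first regressor
X3 = np.array([2.0, 5.0, 3.0, 1.0, 8.0])       # second regressor

def rss(b1, b2, b3):
    """RSS for a candidate (b1, b2, b3): sum of squared residuals."""
    residuals = Y - b1 - b2 * X2 - b3 * X3     # u-hat_i for each observation
    return np.sum(residuals ** 2)

print(rss(0.0, 1.0, 0.5))   # 7.75
```

OLS chooses the (b1, b2, b3) that makes this quantity as small as possible.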

RSS = Σ (Yi² + b1² + b2²X2i² + b3²X3i² − 2b1Yi − 2b2X2iYi − 2b3X3iYi
          + 2b1b2X2i + 2b1b3X3i + 2b2b3X2iX3i)

    = Σ Yi² + n b1² + b2² Σ X2i² + b3² Σ X3i² − 2b1 Σ Yi − 2b2 Σ X2iYi − 2b3 Σ X3iYi
          + 2b1b2 Σ X2i + 2b1b3 Σ X3i + 2b2b3 Σ X2iX3i

First, we expand RSS as shown.

∂RSS/∂b1 = 0     ∂RSS/∂b2 = 0     ∂RSS/∂b3 = 0

Then we use the first-order conditions for minimizing it.

b2 = [ Σ(X2i − X̄2)(Yi − Ȳ) · Σ(X3i − X̄3)² − Σ(X3i − X̄3)(Yi − Ȳ) · Σ(X2i − X̄2)(X3i − X̄3) ]
   / [ Σ(X2i − X̄2)² · Σ(X3i − X̄3)² − ( Σ(X2i − X̄2)(X3i − X̄3) )² ]

b1 = Ȳ − b2X̄2 − b3X̄3

We thus obtain three equations in three unknowns. Solving these equations, we obtain expressions for the values of b1, b2, and b3 that satisfy the OLS criterion. (The expression for b3 is the same as that for b2, with the subscripts 2 and 3 interchanged everywhere.)
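These closed-form expressions can be implemented directly. The sketch below uses made-up data (not the textbook's data set) and cross-checks the result against NumPy's least squares solver.

```python
import numpy as np

# Hypothetical data for illustration only.
Y  = np.array([11.0, 14.0, 13.0, 18.0, 17.0, 22.0])
X2 = np.array([10.0, 12.0, 11.0, 14.0, 13.0, 16.0])
X3 = np.array([1.0, 3.0, 5.0, 2.0, 6.0, 4.0])

y  = Y - Y.mean()    # deviations from sample means
x2 = X2 - X2.mean()
x3 = X3 - X3.mean()

# Closed-form OLS slope estimates for two regressors.
denom = np.sum(x2**2) * np.sum(x3**2) - np.sum(x2 * x3)**2
b2 = (np.sum(x2 * y) * np.sum(x3**2) - np.sum(x3 * y) * np.sum(x2 * x3)) / denom
b3 = (np.sum(x3 * y) * np.sum(x2**2) - np.sum(x2 * y) * np.sum(x2 * x3)) / denom
b1 = Y.mean() - b2 * X2.mean() - b3 * X3.mean()

# Cross-check against NumPy's least squares solver.
X = np.column_stack([np.ones_like(X2), X2, X3])
check, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(b1, b2, b3)
print(check)
```

Note how the formula for b3 is obtained from that for b2 simply by swapping the roles of x2 and x3, as the text states.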

The expression for b1 is a straightforward extension of the corresponding expression in simple regression analysis.

However, the expressions for the slope coefficients are considerably more complex than
that for the slope coefficient in simple regression analysis.

For the general case when there are many explanatory variables, ordinary algebra is
inadequate. It is necessary to switch to matrix algebra.
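In matrix notation the first-order conditions become the normal equations (X′X)b = X′y, which handle any number of regressors at once. A minimal sketch, again with made-up data:

```python
import numpy as np

# Matrix-algebra form of the OLS estimator: solve (X'X) b = X'y.
# Data are hypothetical, for illustration only.
y = np.array([11.0, 14.0, 13.0, 18.0, 17.0, 22.0])
X = np.column_stack([
    np.ones(6),                              # column of ones for the intercept
    [10.0, 12.0, 11.0, 14.0, 13.0, 16.0],    # X2
    [1.0, 3.0, 5.0, 2.0, 6.0, 4.0],          # X3
])

b = np.linalg.solve(X.T @ X, X.T @ y)        # normal equations
print(b)                                     # [b1, b2, b3]
```

Adding a fourth or fifth regressor only means adding columns to X; the formula is unchanged.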

. reg EARNINGS S EXP


----------------------------------------------------------------------------
Source | SS df MS Number of obs = 500
-----------+------------------------------ F( 2, 497) = 35.24
Model | 8735.42401 2 4367.712 Prob > F = 0.0000
Residual | 61593.5422 497 123.930668 R-squared = 0.1242
-----------+------------------------------ Adj R-squared = 0.1207
Total | 70328.9662 499 140.939812 Root MSE = 11.132
----------------------------------------------------------------------------
EARNINGS | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-----------+----------------------------------------------------------------
S | 1.877563 .2237434 8.39 0.000 1.437964 2.317163
EXP | .9833436 .2098457 4.69 0.000 .5710495 1.395638
_cons | -14.66833 4.288375 -3.42 0.001 -23.09391 -6.242752
----------------------------------------------------------------------------

EARNINGS-hat = −14.67 + 1.88 S + 0.98 EXP

Here is the regression output for the wage equation using Data Set 21.

It indicates that hourly earnings increase by $1.88 for every extra year of schooling and by
$0.98 for every extra year of work experience.
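This interpretation can be checked directly from the fitted equation: holding EXP constant, a one-year increase in S changes fitted earnings by exactly the coefficient on S. A small sketch, with the coefficients rounded as in the slide:

```python
# Fitted equation from the regression output above (rounded coefficients).
def fitted_earnings(S, EXP):
    return -14.67 + 1.88 * S + 0.98 * EXP

# Holding EXP constant, one extra year of schooling adds the S coefficient.
diff = fitted_earnings(13, 5) - fitted_earnings(12, 5)
print(round(diff, 2))   # 1.88
```

The same exercise with EXP varying and S fixed recovers the 0.98 coefficient on experience.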

Literally, the intercept indicates that an individual who had no schooling or work experience
would have hourly earnings of –$14.67.

Obviously, this is impossible. The lowest value of S in the sample was 8. We have obtained
a nonsense estimate because we have extrapolated too far from the data range.

Copyright Christopher Dougherty 2016.

These slideshows may be downloaded by anyone, anywhere for personal use.


Subject to respect for copyright and, where appropriate, attribution, they may be
used as a resource for teaching an econometrics course. There is no need to
refer to the author.

The content of this slideshow comes from Section 3.2 of C. Dougherty, Introduction to Econometrics, fifth edition, 2016, Oxford University Press. Additional (free) resources for both students and instructors may be downloaded from the OUP Online Resource Centre:
www.oxfordtextbooks.co.uk/orc/dougherty5e/.

Individuals studying econometrics on their own who feel that they might benefit
from participation in a formal course should consider the London School of
Economics summer school course
EC212 Introduction to Econometrics
http://www2.lse.ac.uk/study/summerSchools/summerSchool/Home.aspx
or the University of London International Programmes distance learning course
EC2020 Elements of Econometrics
www.londoninternational.ac.uk/lse.

2016.04.28
