Meeting 1: Multiple Regression
Chapter 12
Multiple Regression
Copyright © 2010 Pearson Education, Inc. Publishing as Prentice Hall Ch. 12-1
Chapter Goals
After completing this chapter, you should be able to:
Apply multiple regression analysis to business decision-making situations
Analyze and interpret the computer output for a multiple regression model
Perform a hypothesis test for all regression coefficients or for a subset of coefficients
Fit and interpret nonlinear regression models
Incorporate qualitative variables into the regression model by using dummy variables
Discuss model specification and analyze residuals
12.1
The Multiple Regression Model
Y = β0 + β1X1 + β2X2 + … + βkXk + ε
Multiple Regression Equation
ŷi = b0 + b1x1i + b2x2i + … + bkxki
In this chapter we will always use a computer to obtain the
regression slope coefficients and other regression
summary measures.
Multiple Regression Equation
(continued)
Two variable model
[Figure: the fitted plane ŷ = b0 + b1x1 + b2x2 in (x1, x2, y) space; b1 is the slope for variable x1 and b2 is the slope for variable x2]
Multiple Regression Model
Two variable model
[Figure: a sample observation yi plotted above the fitted plane ŷ = b0 + b1x1 + b2x2 at the point (x1i, x2i); the residual is ei = (yi - ŷi)]
The best-fit equation, ŷ, is found by minimizing the sum of squared errors, Σe²
Standard Multiple Regression
Assumptions
The x values are fixed, or are independent of the error terms εi
The error terms are random variables with mean 0 and a constant variance: E[εi] = 0 and E[εi²] = σ² for i = 1, …, n
The random error terms εi are not correlated with one another: E[εi εj] = 0 for all i ≠ j
It is not possible to find a set of numbers c0, c1, …, cK such that c0 + c1x1i + … + cKxKi = 0 for every observation (no exact linear dependence among the X variables)
Pie Sales Example
Multiple regression equation: Sales = b0 + b1 (Price) + b2 (Advertising)

Week  Pie Sales  Price ($)  Advertising ($100s)
1     350        5.50       3.3
2     460        7.50       3.3
3     350        8.00       3.0
4     430        8.00       4.5
5     350        6.80       3.0
6     380        7.50       4.0
7     430        4.50       3.0
8     470        6.40       3.7
9     450        7.00       3.5
10    490        5.00       4.0
11    340        7.20       3.5
12    300        7.90       3.2
13    440        5.90       4.0
14    450        5.00       3.5
15    300        7.00       2.7
12.2
Estimating a Multiple Linear
Regression Equation
Excel will be used to generate the coefficients and
measures of goodness of fit for multiple regression
Data / Data Analysis / Regression
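If Excel is not at hand, the same coefficients can be reproduced with NumPy's least-squares solver. This is a sketch, not part of the original slides; it uses the pie sales data from the example.

```python
import numpy as np

# Pie sales data from the example (15 weeks)
sales = np.array([350, 460, 350, 430, 350, 380, 430, 470, 450, 490,
                  340, 300, 440, 450, 300], dtype=float)
price = np.array([5.50, 7.50, 8.00, 8.00, 6.80, 7.50, 4.50, 6.40,
                  7.00, 5.00, 7.20, 7.90, 5.90, 5.00, 7.00])
advertising = np.array([3.3, 3.3, 3.0, 4.5, 3.0, 4.0, 3.0, 3.7,
                        3.5, 4.0, 3.5, 3.2, 4.0, 3.5, 2.7])

# Design matrix with a column of ones for the intercept b0
X = np.column_stack([np.ones(len(sales)), price, advertising])

# Ordinary least squares: minimizes the sum of squared errors
b, _, _, _ = np.linalg.lstsq(X, sales, rcond=None)
print(b)  # approximately [306.526, -24.975, 74.131]
```

The solution matches the Excel regression output shown on the next slide.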
Multiple Regression Output
Regression Statistics
Multiple R 0.72213
R Square 0.52148
Adjusted R Square 0.44172
Standard Error 47.46341
Sales = 306.526 - 24.975(Price) + 74.131(Advertising)
Observations 15
ANOVA df SS MS F Significance F
Regression 2 29460.027 14730.013 6.53861 0.01201
Residual 12 27033.306 2252.776
Total 14 56493.333
The Multiple Regression Equation
Sales = 306.526 - 24.975(Price) + 74.131(Advertising)
b1 = -24.975: sales will decrease, on average, by 24.975 pies per week for each $1 increase in selling price, net of the effects of advertising
b2 = 74.131: sales will increase, on average, by 74.131 pies per week for each $100 increase in advertising, net of the effects of price
Coefficient of Determination, R2
(continued)
Regression Statistics
Multiple R 0.72213
R Square 0.52148
Adjusted R Square 0.44172
Standard Error 47.46341
Observations 15
R² = SSR / SST = 29460.0 / 56493.3 = .52148
52.1% of the variation in pie sales is explained by the variation in price and advertising
ANOVA df SS MS F Significance F
Regression 2 29460.027 14730.013 6.53861 0.01201
Residual 12 27033.306 2252.776
Total 14 56493.333
Estimation of Error Variance
Consider the population regression model
Yi = β0 + β1x1i + β2x2i + … + βKxKi + εi
The estimate of the error variance is
s²e = Σe²i / (n - K - 1) = SSE / (n - K - 1)
where ei = yi - ŷi
The square root of the variance, se , is called the
standard error of the estimate
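Plugging the ANOVA numbers from the pie sales output into this formula recovers the reported standard error. A quick check using only the standard library (not part of the slides):

```python
import math

SSE = 27033.306   # residual sum of squares from the ANOVA table
n, K = 15, 2      # 15 observations, 2 independent variables

s2e = SSE / (n - K - 1)   # error variance estimate, df = 12
se = math.sqrt(s2e)       # standard error of the estimate
print(round(se, 5))       # 47.46341
```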
Standard Error, se
Regression Statistics
Multiple R 0.72213
R Square 0.52148
Adjusted R Square 0.44172
Standard Error 47.46341
Observations 15
se = 47.463
The magnitude of this value can be compared to the average y value
ANOVA df SS MS F Significance F
Regression 2 29460.027 14730.013 6.53861 0.01201
Residual 12 27033.306 2252.776
Total 14 56493.333
Adjusted Coefficient of
Determination, R̄²
Used to correct for the fact that adding non-relevant independent variables will still reduce the error sum of squares
What is the net effect of adding a new variable? We lose a degree of freedom when a new X variable is added. Did the new X variable add enough explanatory power to offset the loss of that degree of freedom?
R̄² = 1 - [SSE / (n - K - 1)] / [SST / (n - 1)]
(where n = sample size, K = number of independent variables)
Regression Statistics
Multiple R 0.72213
R Square 0.52148
Adjusted R Square 0.44172
Standard Error 47.46341
Observations 15
R̄² = .44172
44.2% of the variation in pie sales is explained by the variation in price and advertising, taking into account the sample size and number of independent variables
ANOVA df SS MS F Significance F
Regression 2 29460.027 14730.013 6.53861 0.01201
Residual 12 27033.306 2252.776
Total 14 56493.333
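Applying the adjusted-R² formula to the ANOVA sums of squares above reproduces both values in the output. A plain-Python check (not part of the slides):

```python
SSE, SST = 27033.306, 56493.333   # from the ANOVA table
n, K = 15, 2

r2 = 1 - SSE / SST                                   # plain R-squared
adj_r2 = 1 - (SSE / (n - K - 1)) / (SST / (n - 1))   # adjusted R-squared
print(round(r2, 5), round(adj_r2, 5))  # 0.52148 0.44172
```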
Coefficient of Multiple
Correlation
The coefficient of multiple correlation is the correlation
between the predicted value and the observed value of
the dependent variable
R = r(ŷ, y) = √R²
Is the square root of the multiple coefficient of
determination
Used as another measure of the strength of the linear
relationship between the dependent variable and the
independent variables
Comparable to the correlation between Y and X in
simple regression
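For the pie sales output, R can be recovered from the ANOVA sums of squares (a quick stdlib check, not part of the slides):

```python
import math

SSR, SST = 29460.027, 56493.333   # from the ANOVA table
R = math.sqrt(SSR / SST)          # coefficient of multiple correlation = sqrt(R²)
print(round(R, 5))                # 0.72213, the "Multiple R" in the output
```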
12.4
Evaluating Individual
Regression Coefficients
Evaluating Individual
Regression Coefficients
(continued)
Test Statistic:
t = (bj - 0) / s(bj)   (df = n - K - 1)
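For the Price coefficient in the pie sales output, the statistic can be checked directly (values taken from the Excel coefficient table; stdlib only):

```python
b_price, sb_price = -24.97509, 10.83213   # coefficient and its standard error
t = (b_price - 0) / sb_price              # test statistic, df = 15 - 2 - 1 = 12
print(t)  # approximately -2.30565, matching the "t Stat" column
```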
Evaluating Individual
Regression Coefficients
(continued)
Regression Statistics
Multiple R 0.72213
R Square 0.52148
Adjusted R Square 0.44172
Standard Error 47.46341
Observations 15
t-value for Price is t = -2.306, with p-value .0398
t-value for Advertising is t = 2.855, with p-value .0145
ANOVA df SS MS F Significance F
Regression 2 29460.027 14730.013 6.53861 0.01201
Residual 12 27033.306 2252.776
Total 14 56493.333
Example: Evaluating Individual
Regression Coefficients
From Excel output:
Coefficients  Standard Error  t Stat    P-value
Price        -24.97509        10.83213  -2.30565  0.03979
Advertising   74.13096        25.96732   2.85478  0.01449
H0: βj = 0
H1: βj ≠ 0
d.f. = 15 - 2 - 1 = 12
α = .05
t12, .025 = 2.1788
The test statistic for each variable falls in the rejection region (p-values < .05)
Decision:
Reject H0 for each variable, since each |t| exceeds tα/2 = t12, .025 = 2.1788 (two-tailed test with α/2 = .025 in each tail)
Conclusion:
There is evidence that both Price and Advertising affect pie sales at α = .05
Confidence Interval Estimate
for the Slope
Confidence interval limits for the population slope βj:
bj ± t(n-K-1, α/2) s(bj)
12.5
Test on All Coefficients
F-Test for Overall Significance of the Model
Shows if there is a linear relationship between all
of the X variables considered together and Y
Use F test statistic
Hypotheses:
H0: β1 = β2 = … = βk = 0 (no linear relationship)
H1: at least one βi ≠ 0 (at least one independent
variable affects Y)
F-Test for Overall Significance
Test statistic:
F = MSR / s²e = [SSR / K] / [SSE / (n - K - 1)]
ANOVA df SS MS F Significance F
Regression 2 29460.027 14730.013 6.53861 0.01201
Residual 12 27033.306 2252.776
Total 14 56493.333
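The F statistic in the ANOVA table can be reproduced from SSR and SSE (a plain-Python check, not part of the slides):

```python
SSR, SSE = 29460.027, 27033.306   # from the ANOVA table
n, K = 15, 2

MSR = SSR / K                # regression mean square
s2e = SSE / (n - K - 1)      # error variance (residual mean square)
F = MSR / s2e
print(F)  # approximately 6.53861, matching the ANOVA output
```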
Tests on a Subset of
Regression Coefficients
H0: α1 = α2 = … = αr = 0
H1: at least one αj ≠ 0 (j = 1, …, r)
Tests on a Subset of
Regression Coefficients
(continued)
Goal: compare the error sum of squares for the
complete model with the error sum of squares for the
restricted model
First run a regression for the complete model and obtain SSE
Next run a restricted regression that excludes the z variables
(the number of variables excluded is r) and obtain the
restricted error sum of squares SSE(r)
Compute the F statistic and apply the decision rule for a significance level α:
Reject H0 if F = [( SSE(r) - SSE ) / r] / s²e > F(r, n-K-r-1, α)
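The steps above can be sketched as a small helper. The function name and the SSE numbers below are hypothetical, not from the slides:

```python
def partial_f(sse_restricted, sse_full, r, n, K):
    """F statistic for testing a subset of r coefficients.

    n = sample size; K = number of X variables in the full model,
    excluding the r z variables being tested.
    """
    s2e = sse_full / (n - K - r - 1)          # error variance of the full model
    return ((sse_restricted - sse_full) / r) / s2e

# Hypothetical illustration: dropping 2 variables raises SSE from 1200 to 1500
F = partial_f(1500.0, 1200.0, r=2, n=30, K=3)
print(F)  # (300/2) / (1200/24) = 3.0
```

Compare the resulting F to the F(r, n-K-r-1, α) critical value to decide whether the excluded variables matter.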
12.6
Prediction
Given a population regression model
yi = β0 + β1x1i + β2x2i + … + βKxKi + εi   (i = 1, 2, …, n)
It is risky to forecast for new X values outside the range of the data used
to estimate the model coefficients, because we do not have data to
support that the linear model extends beyond the observed range.
Using The Equation to Make
Predictions
Predict sales for a week in which the selling price is $5.50 and advertising is $350:
Predicted Sales = 306.526 - 24.975(5.50) + 74.131(3.5) = 428.62
Note that Advertising is in $100s, so $350 means x2 = 3.5
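A quick check of this prediction, using the fitted coefficients from the output (stdlib only):

```python
b0, b1, b2 = 306.526, -24.975, 74.131   # fitted pie sales coefficients
price = 5.50          # dollars
advertising = 3.5     # in $100s, i.e. $350

predicted_sales = b0 + b1 * price + b2 * advertising
print(round(predicted_sales, 2))  # 428.62 pies per week
```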
Quadratic Regression Model
Model form:
yi = β0 + β1x1i + β2x1i² + εi
where:
β0 = Y intercept
β1 = regression coefficient for linear effect of X on Y
β2 = regression coefficient for quadratic effect on Y
εi = random error in Y for observation i
Linear vs. Nonlinear Fit
[Figure: Y vs. X1 curves for the four sign combinations of the coefficients (β1 < 0 or β1 > 0, paired with β2 > 0 or β2 < 0), each shown with its residual plot]
β1 = the coefficient of the linear term
β2 = the coefficient of the squared term
Testing for Significance:
Quadratic Effect
Testing the Quadratic Effect
Compare the linear regression estimate
ŷ = b0 + b1x1
with the quadratic regression estimate
ŷ = b0 + b1x1 + b2x1²
Hypotheses
H0: β2 = 0 (The quadratic term does not improve the model)
H1: β2 ≠ 0 (The quadratic term improves the model)
Example: Quadratic Model
Time  Purity
1     3
2     7
3     8
5     15
7     22
8     33
10    40
12    54
13    67
14    70
15    78
15    85
16    87
17    99

Purity increases as filter time increases
[Figure: scatter plot of Purity vs. Time]
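The linear-versus-quadratic comparison for this example can be reproduced with NumPy. This is a sketch, not part of the original slides; np.polyfit stands in for Excel's regression tool:

```python
import numpy as np

# Purity/time data from the example (14 observations)
time   = np.array([1, 2, 3, 5, 7, 8, 10, 12, 13, 14, 15, 15, 16, 17], float)
purity = np.array([3, 7, 8, 15, 22, 33, 40, 54, 67, 70, 78, 85, 87, 99], float)

lin  = np.polyfit(time, purity, 1)   # [slope, intercept] of the linear fit
quad = np.polyfit(time, purity, 2)   # [b2, b1, b0] of the quadratic fit

# Error sums of squares for each fit
sse_lin  = np.sum((purity - np.polyval(lin,  time)) ** 2)
sse_quad = np.sum((purity - np.polyval(quad, time)) ** 2)
print(lin)                   # slope approximately 5.985, as in the output below
print(sse_quad < sse_lin)   # the quadratic term reduces the error sum of squares
```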
Example: Quadratic Model
(continued)
Simple regression results:
ŷ = -11.283 + 5.985 Time

Coefficients  Standard Error  t Stat    P-value
Intercept    -11.28267        3.46805   -3.25332  0.00691
Time           5.98520        0.30966   19.32819  2.078E-10

[Figure: Time residual plot for the linear fit shows a curved, nonrandom pattern]

Quadratic regression results:

Coefficients  Standard Error  t Stat   P-value
Time           1.56496        0.60179  2.60052  0.02467
Time-squared   0.24516        0.03258  7.52406  1.165E-05

Regression Statistics
R Square 0.99494
Adjusted R Square 0.99402
Standard Error 2.59513
F 1080.7330   Significance F 2.368E-13

[Figure: Time-squared residual plot shows no pattern]

The quadratic term is significant and improves the model: R² is higher, se is lower, and the residuals are now random
The Log Transformation
The multiplicative model:
Y = β0 X1^β1 X2^β2 ε
Interpretation of coefficients
For the multiplicative model:
log Yi = log β0 + β1 log X1i + β2 log X2i + log εi
When both the dependent and independent variables are in logs, the coefficient βj is an elasticity: the estimated percentage change in Y for a 1% change in Xj, holding the other variables constant
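A sketch of why the logged regression recovers the exponents: the data below are hypothetical, generated exactly as Y = 2 * X^0.5 (so β0 = 2 and β1 = 0.5), and a plain-Python OLS on the logs recovers both:

```python
import math

# Hypothetical exact power-law data: Y = 2 * X^0.5
xs = [1.0, 2.0, 4.0, 8.0, 16.0]
ys = [2.0 * x ** 0.5 for x in xs]

# Simple OLS on (log X, log Y): slope = beta1, intercept = log(beta0)
lx = [math.log(x) for x in xs]
ly = [math.log(y) for y in ys]
n = len(xs)
mx, my = sum(lx) / n, sum(ly) / n
b1 = sum((a - mx) * (b - my) for a, b in zip(lx, ly)) / sum((a - mx) ** 2 for a in lx)
b0 = math.exp(my - b1 * mx)   # back-transform the intercept

print(round(b1, 6), round(b0, 6))  # 0.5 2.0
```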
12.8
Dummy Variables
A dummy variable is a categorical independent
variable with two levels:
yes or no, on or off, male or female
recorded as 0 or 1
Regression intercepts are different if the
variable is significant
Assumes equal slopes for other variables
If more than two levels, the number of dummy
variables needed is (number of levels - 1)
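The (number of levels - 1) rule can be sketched in a few lines of Python; the category names below are hypothetical:

```python
# A 3-level categorical variable needs 3 - 1 = 2 dummy variables
levels = ["spring", "summer", "winter"]   # hypothetical; first level is the baseline

def encode(value):
    # one 0/1 dummy for each non-baseline level
    return [1 if value == lvl else 0 for lvl in levels[1:]]

rows = ["spring", "summer", "winter", "summer"]
dummies = [encode(v) for v in rows]
print(dummies)  # [[0, 0], [1, 0], [0, 1], [1, 0]]
```

The baseline level is the row of all zeros; its effect is absorbed into the intercept.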
Dummy Variable Example
yˆ b0 b1x1 b 2 x 2
Let:
y = Pie Sales
x1 = Price
x2 = Holiday (x2 = 1 if a holiday occurred during the week; x2 = 0 if there was no holiday that week)
Dummy Variable Example
(continued)
[Figure: two parallel sales-vs-price lines with the same slope b1 but different intercepts: b0 + b2 for Holiday weeks (x2 = 1) and b0 for No Holiday weeks (x2 = 0)]
If H0: β2 = 0 is rejected, then "Holiday" has a significant effect on pie sales
Interaction Between
Explanatory Variables
Hypothesizes interaction between pairs of x
variables
Response to one x variable may vary at different
levels of another x variable
ŷ = b0 + b1x1 + b2x2 + b3x3
  = b0 + b1x1 + b2x2 + b3(x1x2), where x3 = x1x2
Effect of Interaction
Given:
Y = β0 + β1X1 + β2X2 + β3X1X2
  = β0 + β2X2 + (β1 + β3X2)X1
Without interaction term, effect of X1 on Y is
measured by β1
With interaction term, effect of X1 on Y is
measured by β1 + β3 X2
Effect changes as X2 changes
Interaction Example
Suppose x2 is a dummy variable and the estimated regression equation is
ŷ = 1 + 2x1 + 3x2 + 4x1x2
x2 = 1: ŷ = 1 + 2x1 + 3(1) + 4x1(1) = 4 + 6x1
x2 = 0: ŷ = 1 + 2x1 + 3(0) + 4x1(0) = 1 + 2x1
[Figure: the two lines plotted for x1 between 0 and 1.5]
Slopes are different if the effect of x1 on y depends on x2 value
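The two slopes on this slide can be checked numerically; a stdlib sketch of the example's estimated equation:

```python
# Estimated equation from the example: y-hat = 1 + 2*x1 + 3*x2 + 4*x1*x2
def yhat(x1, x2):
    return 1 + 2 * x1 + 3 * x2 + 4 * x1 * x2

# The slope with respect to x1 is b1 + b3*x2, so it depends on the level of x2
slope_when_x2_is_1 = yhat(1, 1) - yhat(0, 1)
slope_when_x2_is_0 = yhat(1, 0) - yhat(0, 0)
print(slope_when_x2_is_1, slope_when_x2_is_0)  # 6 2
```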
Significance of Interaction Term
The coefficient b3 is an estimate of the difference
in the coefficient of x1 when x2 = 1 compared to
when x2 = 0
The t statistic for b3 can be used to test the
hypothesis
H0: β3 = 0 | β1 ≠ 0, β2 ≠ 0
H1: β3 ≠ 0 | β1 ≠ 0, β2 ≠ 0
Multiple Regression Assumptions
Residual: ei = (yi - ŷi)
Assumptions:
The errors are normally distributed
Analysis of Residuals
in Multiple Regression
These residual plots are used in multiple
regression:
Residuals vs. ŷi
Residuals vs. x1i
Residuals vs. x2i
Residuals vs. time (if time series data)
Use the residual plots to check for
violations of regression assumptions
Chapter Summary
Developed the multiple regression model
Tested the significance of the multiple regression model
Discussed adjusted R² (R̄²)
Tested individual regression coefficients
Tested portions of the regression model
Used quadratic terms and log transformations in
regression models
Explained dummy variables
Evaluated interaction effects
Discussed using residual plots to check model
assumptions