
Module 3

Quantitative Demand
Analysis
© 2017 by McGraw-Hill Education. All Rights Reserved. 1
Learning Outcomes
5. Show how to determine elasticities from linear and log-linear demand functions.

6. Explain how regression analysis may be used to estimate demand functions, and how to interpret and use the output of a regression.

© 2017 by McGraw-Hill Education. All Rights Reserved. 2
Obtaining Elasticities From Demand Functions

Elasticities for Linear Demand Functions

• From a linear demand function, we can easily compute various elasticities.
• Given a linear demand function:

  Q_X^d = α_0 + α_X P_X + α_Y P_Y + α_M M + α_H H

  – Own price elasticity: E_{Q_X, P_X} = α_X (P_X / Q_X).
  – Cross-price elasticity: E_{Q_X, P_Y} = α_Y (P_Y / Q_X).
  – Income elasticity: E_{Q_X, M} = α_M (M / Q_X).

© 2017 by McGraw-Hill Education. All Rights Reserved. 3-3
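As an illustration of how these formulas can be applied, here is a minimal Python sketch; the helper functions and the coefficient values in the example call are hypothetical, not taken from the slides.

# Minimal sketch: point elasticities of a linear demand function
# Q = a0 + aX*PX + aY*PY + aM*M (all coefficient values below are made up)

def linear_demand(a0, aX, aY, aM, PX, PY, M):
    """Quantity demanded from a linear demand function."""
    return a0 + aX * PX + aY * PY + aM * M

def elasticities(a0, aX, aY, aM, PX, PY, M):
    """Own-price, cross-price, and income elasticities at the point (PX, PY, M)."""
    Q = linear_demand(a0, aX, aY, aM, PX, PY, M)
    return {
        "own_price": aX * PX / Q,    # E(QX, PX) = aX * PX / Q
        "cross_price": aY * PY / Q,  # E(QX, PY) = aY * PY / Q
        "income": aM * M / Q,        # E(QX, M)  = aM * M / Q
    }

# Hypothetical example call:
print(elasticities(a0=120, aX=-2, aY=3, aM=0.005, PX=10, PY=15, M=10000))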


Obtaining Elasticities From Demand Functions

Elasticities for Linear Demand Functions In Action

• The daily demand for Invigorated PED shoes is estimated to be:

  Q_X^d = 100 - 3P_X + 4P_Y - 0.01M + 2A_X

  Suppose good X sells at $25 a pair, good Y sells at $35, the company utilizes 50 units of advertising, and average consumer income is $20,000. Calculate the own price, cross-price and income elasticities of demand.

  – Q_X^d = 100 - 3(25) + 4(35) - 0.01(20,000) + 2(50) = 65 units.

© 2017 by McGraw-Hill Education. All Rights Reserved. 3-4


Obtaining Elasticities From Demand Functions

Elasticities for Linear Demand


Functions In Action
•The
  daily demand for Invigorated PED shoes is
estimated to be:

Suppose good X sells at $25 a pair, good Y sells


at $35, the company utilizes 50 units of
advertising, and average consumer income is
$20,000. Calculate the own price, cross-price
and income elasticities of demand.
– units.
– Own price elasticity: .
© 2017 by McGraw-Hill Education. All Rights Reserved. 3-5
Obtaining Elasticities From Demand Functions

Elasticities for Linear Demand Functions In Action

• The daily demand for Invigorated PED shoes is estimated to be:

  Q_X^d = 100 - 3P_X + 4P_Y - 0.01M + 2A_X

  Suppose good X sells at $25 a pair, good Y sells at $35, the company utilizes 50 units of advertising, and average consumer income is $20,000. Calculate the own price, cross-price and income elasticities of demand.

  – Q_X^d = 100 - 3(25) + 4(35) - 0.01(20,000) + 2(50) = 65 units.
  – Own price elasticity: -3(25/65) = -1.15.
  – Cross-price elasticity: 4(35/65) = 2.15.

© 2017 by McGraw-Hill Education. All Rights Reserved. 3-6
Obtaining Elasticities From Demand Functions

Elasticities for Linear Demand Functions In Action

• The daily demand for Invigorated PED shoes is estimated to be:

  Q_X^d = 100 - 3P_X + 4P_Y - 0.01M + 2A_X

  Suppose good X sells at $25 a pair, good Y sells at $35, the company utilizes 50 units of advertising, and average consumer income is $20,000. Calculate the own price, cross-price and income elasticities of demand.

  – Q_X^d = 100 - 3(25) + 4(35) - 0.01(20,000) + 2(50) = 65 units.
  – Own price elasticity: -3(25/65) = -1.15.
  – Cross-price elasticity: 4(35/65) = 2.15.
  – Income elasticity: -0.01(20,000/65) = -3.08.

© 2017 by McGraw-Hill Education. All Rights Reserved. 3-7
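For a quick arithmetic check of the slide above, the short Python sketch below plugs the stated prices, advertising, and income into the demand function as reconstructed here (so the coefficients 100, -3, 4, -0.01 and 2 carry the same uncertainty as that reconstruction).

# Sketch: verify the Invigorated PED calculations above
PX, PY, M, AX = 25, 35, 20000, 50
Q = 100 - 3*PX + 4*PY - 0.01*M + 2*AX        # 65 units
own_price   = -3 * PX / Q                    # about -1.15
cross_price =  4 * PY / Q                    # about  2.15
income      = -0.01 * M / Q                  # about -3.08
print(Q, round(own_price, 2), round(cross_price, 2), round(income, 2))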
Obtaining Elasticities From Demand Functions

Elasticities for Nonlinear Demand Functions

• One nonlinear demand function is the log-linear demand function:

  ln Q_X^d = β_0 + β_X ln P_X + β_Y ln P_Y + β_M ln M + β_H ln H

  – Own price elasticity: β_X.
  – Cross-price elasticity: β_Y.
  – Income elasticity: β_M.

© 2017 by McGraw-Hill Education. All Rights Reserved. 3-8
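Because the coefficients of a log-linear demand function are themselves elasticities, fitting a regression of ln Q on ln P recovers the own price elasticity directly. A minimal sketch on simulated data (all numbers hypothetical):

# Sketch: the slope of ln Q on ln P estimates the own-price elasticity
import numpy as np

rng = np.random.default_rng(0)
P = rng.uniform(10, 50, size=200)                        # hypothetical prices
lnQ = 5.0 - 1.8 * np.log(P) + rng.normal(0, 0.05, 200)   # true elasticity = -1.8

slope, intercept = np.polyfit(np.log(P), lnQ, 1)
print(round(slope, 2))   # estimated own-price elasticity, close to -1.8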


Obtaining Elasticities From Demand Functions

Elasticities for Nonlinear Demand Functions In Action

• An analyst for a major apparel company estimates that the demand for its raincoats is given by

  ln Q_X^d = 10 - 1.2 ln P_X + 3 ln R - 2 ln A_Y

  where R denotes the daily amount of rainfall and A_Y the level of advertising on good Y. What would be the impact on demand of a 10 percent increase in the daily amount of rainfall?

  – The rainfall elasticity of demand is the coefficient on ln R: E_{Q_X, R} = 3. So, %ΔQ_X^d = 3 × 10% = 30%.

  A 10 percent increase in rainfall will lead to a 30 percent increase in the demand for raincoats.

© 2017 by McGraw-Hill Education. All Rights Reserved. 3-9
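A one-line check of the calculation above: with log-linear demand, the percentage change in quantity is the relevant elasticity times the percentage change in the variable.

# Sketch: %change in quantity = (rainfall elasticity) x (%change in rainfall)
rain_elasticity = 3          # coefficient on ln(R) in the demand function above
pct_change_rain = 10         # percent
pct_change_quantity = rain_elasticity * pct_change_rain
print(pct_change_quantity)   # 30 percent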
Regression Analysis

Regression Analysis
• How does one obtain information on the
demand function?
– Published studies
– Hire consultant
– Statistical technique called regression analysis
using data on quantity, price, income and other
important variables.

© 2017 by McGraw-Hill Education. All Rights Reserved. 3-10


Regression Analysis

Regression Line and Least Squares Regression

• True (or population) regression model:

  Y = a + bX + e

  – a: unknown population intercept parameter.
  – b: unknown population slope parameter.
  – e: random error term with mean zero and standard deviation σ.

• Least squares regression line:

  Ŷ = â + b̂X

  – â: least squares estimate of the unknown parameter a.
  – b̂: least squares estimate of the unknown parameter b.

• The parameter estimates â and b̂ represent the values of a and b that result in the smallest sum of squared errors between a line and the actual data.

© 2017 by McGraw-Hill Education. All Rights Reserved. 3-12
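For the one-variable case the least squares estimates have a simple closed form, b̂ = Σ(X - X̄)(Y - Ȳ) / Σ(X - X̄)² and â = Ȳ - b̂X̄. A minimal numpy sketch with hypothetical data:

# Sketch: closed-form least squares estimates for Y = a + bX + e
import numpy as np

X = np.array([10.0, 20.0, 30.0, 40.0, 50.0])   # hypothetical prices
Y = np.array([95.0, 88.0, 76.0, 71.0, 60.0])   # hypothetical quantities

b_hat = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
a_hat = Y.mean() - b_hat * X.mean()
print(a_hat, b_hat)   # the values that minimize the sum of squared errors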
Regression Analysis

Excel and Least Squares Estimates

SUMMARY OUTPUT

Regression Statistics
Multiple R           0.87
R Square             0.75
Adjusted R Square    0.72
Standard Error       112.22
Observations         10.00

Estimated demand: Q̂ = 1631.47 - 2.60P   (â = 1631.47, b̂ = -2.60)

ANOVA
             df   SS          MS          F       Significance F
Regression   1    301470.89   301470.89   23.94   0.0012
Residual     8    100751.61   12593.95
Total        9    402222.50

             Coefficients   Standard Error   t Stat   P-value   Lower 95%   Upper 95%
Intercept    1631.47        243.97           6.69     0.0002    1068.87     2194.07
Price        -2.60          0.53             -4.89    0.0012    -3.82       -1.37

© 2017 by McGraw-Hill Education. All Rights Reserved. 3-14
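A short sketch showing how the estimated demand from the output above might be used for prediction and for evaluating the own price elasticity at a point; the $200 price is a hypothetical illustration, not from the slides.

# Sketch: use the estimated demand Q = 1631.47 - 2.60*P from the output above
a_hat, b_hat = 1631.47, -2.60

P = 200.0                                   # hypothetical price
Q_hat = a_hat + b_hat * P                   # predicted quantity: 1111.47
own_price_elasticity = b_hat * P / Q_hat    # about -0.47 at this price
print(Q_hat, round(own_price_elasticity, 2))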


Regression Analysis

Evaluating Statistical Significance

• Standard error
  – A measure of how much an estimated coefficient would vary across regressions based on the same true demand model but using different data.

• 95 percent confidence interval rule of thumb:

  â ± 2 se(â) and b̂ ± 2 se(b̂)

• t-statistic rule of thumb:

  t_b̂ = b̂ / se(b̂)

  – When |t| > 2, we are 95 percent confident the true parameter in the regression is not zero.

© 2017 by McGraw-Hill Education. All Rights Reserved. 3-15
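These rules of thumb are easy to apply to any reported coefficient and standard error; a minimal sketch using the price coefficient from the output above:

# Sketch: 95% confidence interval and t-statistic rules of thumb
def rough_ci_and_t(estimate, std_error):
    """Approximate 95% CI (estimate +/- 2*se) and the t-statistic."""
    ci = (estimate - 2 * std_error, estimate + 2 * std_error)
    t = estimate / std_error
    return ci, t

ci, t = rough_ci_and_t(-2.60, 0.53)   # price coefficient from the output above
print(ci)                             # about (-3.66, -1.54): excludes zero
print(abs(t) > 2)                     # True; t is about -4.9 (table shows -4.89 from unrounded inputs)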
Regression Analysis

Excel and Least Squares Estimates

SUMMARY OUTPUT

Regression Statistics
Multiple R           0.87
R Square             0.75
Adjusted R Square    0.72
Standard Error       112.22
Observations         10.00

se(â) = 243.97
se(b̂) = 0.53

â ± 2 se(â) = 1631.47 ± 2(243.97) = [1143.53, 2119.41]; zero is not in this interval, so the intercept is different from zero.
b̂ ± 2 se(b̂) = -2.60 ± 2(0.53) = [-3.66, -1.54]; zero is not in this interval, so the price coefficient is different from zero.

ANOVA
             df   SS          MS          F       Significance F
Regression   1    301470.89   301470.89   23.94   0.0012
Residual     8    100751.61   12593.95
Total        9    402222.50

             Coefficients   Standard Error   t Stat   P-value   Lower 95%   Upper 95%
Intercept    1631.47        243.97           6.69     0.0002    1068.87     2194.07
Price        -2.60          0.53             -4.89    0.0012    -3.82       -1.37

© 2017 by McGraw-Hill Education. All Rights Reserved. 3-16


• When the t-statistic is large in absolute value, the standard error is small relative to the absolute value of the parameter estimate, so we are confident that the true parameter is not zero.
• For the intercept, |t| = 6.69 > 2, so the intercept is statistically different from zero.
• For the price coefficient, |t| = 4.89 > 2, so the price coefficient is statistically different from zero.

© 2017 by McGraw-Hill Education. All Rights Reserved. 17


• P-values are a much more precise measure of statistical significance.
• A P-value of .0012 means there is only a 12 in 10,000 chance of getting an estimate at least as large as 2.6 in absolute value if the true coefficient is actually zero.
• A P-value below .05 means the estimated coefficient is statistically significant at the 5 percent level.

© 2017 by McGraw-Hill Education. All Rights Reserved. 18
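The reported P-value can be reproduced from the t-statistic and the residual degrees of freedom (here n - k = 10 - 2 = 8); a sketch using scipy:

# Sketch: two-sided p-value from a t-statistic with n - k degrees of freedom
from scipy import stats

t_stat = -4.89             # price coefficient t-stat from the output
df = 10 - 2                # observations minus estimated coefficients
p_value = 2 * stats.t.sf(abs(t_stat), df)
print(round(p_value, 4))   # about 0.0012, matching the Excel output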


Regression Analysis

Evaluating the Overall Fit of the Regression Line

• R-square
  – Also called the coefficient of determination.
  – The fraction of the total variation in the dependent variable that is explained by the regression:

    R² = Explained Variation / Total Variation = SS_Regression / SS_Total

  – Ranges between 0 and 1.
    • Values closer to 1 indicate a "better" fit.

© 2017 by McGraw-Hill Education. All Rights Reserved. 3-19
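R-square can be recomputed directly from the ANOVA portion of the output shown earlier in this module:

# Sketch: R-square = explained variation / total variation
ss_regression = 301470.89
ss_total = 402222.50
r_square = ss_regression / ss_total
print(round(r_square, 2))   # 0.75, as reported in the summary output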


Regression Analysis

Evaluating the Overall Fit of the Regression Line

• Adjusted R-square
  – A version of the R-square that penalizes researchers for having few degrees of freedom:

    Adjusted R² = 1 - (1 - R²) × (n - 1) / (n - k)

  – n is the total number of observations.
  – k is the number of estimated coefficients.
  – n - k is the degrees of freedom for the regression.

© 2017 by McGraw-Hill Education. All Rights Reserved. 3-20
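Applying the adjusted R-square formula to the earlier output (n = 10 observations, k = 2 estimated coefficients):

# Sketch: adjusted R-square = 1 - (1 - R^2) * (n - 1) / (n - k)
r_square, n, k = 0.75, 10, 2
adj_r_square = 1 - (1 - r_square) * (n - 1) / (n - k)
print(round(adj_r_square, 2))   # about 0.72, as reported in the summary output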


Regression Analysis

Evaluating the Overall Fit of the Regression Line

• The F-statistic
  – A measure of the total variation explained by the regression relative to the total unexplained variation.
    • The greater the F-statistic, the better the overall regression fit.
  – Equivalently, the P-value associated with the F-statistic (the "Significance F" in the output) is another measure of overall fit.
    • Lower P-values are associated with a better overall regression fit.

© 2017 by McGraw-Hill Education. All Rights Reserved. 3-21
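The F-statistic and its significance level can likewise be recovered from the ANOVA table of the earlier output; a sketch using scipy:

# Sketch: F-statistic = MS(regression) / MS(residual), with its p-value
from scipy import stats

ms_regression, ms_residual = 301470.89, 12593.95
df_regression, df_residual = 1, 8

f_stat = ms_regression / ms_residual               # about 23.94
sig_f = stats.f.sf(f_stat, df_regression, df_residual)
print(round(f_stat, 2), round(sig_f, 4))           # 23.94 and about 0.0012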
Regression Analysis

Excel and Least Squares Estimates

SUMMARY OUTPUT

Regression Statistics
Multiple R           0.87
R Square             0.75
Adjusted R Square    0.72
Standard Error       112.22
Observations         10.00

ANOVA
             df   SS          MS          F       Significance F
Regression   1    301470.89   301470.89   23.94   0.0012
Residual     8    100751.61   12593.95
Total        9    402222.50

             Coefficients   Standard Error   t Stat   P-value   Lower 95%   Upper 95%
Intercept    1631.47        243.97           6.69     0.0002    1068.87     2194.07
Price        -2.60          0.53             -4.89    0.0012    -3.82       -1.37

© 2017 by McGraw-Hill Education. All Rights Reserved. 3-22


Regression Analysis

Regression for Nonlinear Functions and Multiple Regression

• Regression techniques can also be applied to the following settings:
  – Nonlinear functional relationships:
    • Nonlinear regression example: ln Q = β_0 + β_P ln P + e
  – Functional relationships with multiple variables:
    • Multiple regression examples:

      Q_X^d = α_0 + α_X P_X + α_Y P_Y + α_M M + α_H H + e

      or

      ln Q_X^d = β_0 + β_X ln P_X + β_Y ln P_Y + β_M ln M + β_H ln H + e

© 2017 by McGraw-Hill Education. All Rights Reserved. 3-23
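A sketch of how a multiple, log-linear demand regression might be estimated in Python with statsmodels; the data, sample size, and coefficient values are hypothetical, not from the slides.

# Sketch: multiple log-linear demand regression with statsmodels
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 100
P = rng.uniform(5, 20, n)            # hypothetical own price
M = rng.uniform(2e4, 6e4, n)         # hypothetical income
lnQ = 4.0 - 1.5*np.log(P) + 0.8*np.log(M) + rng.normal(0, 0.1, n)

X = sm.add_constant(np.column_stack([np.log(P), np.log(M)]))
results = sm.OLS(lnQ, X).fit()
print(results.params)     # [intercept, own-price elasticity, income elasticity]
print(results.rsquared)   # overall fit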


Regression Analysis

Excel and Least Squares Estimates

SUMMARY OUTPUT

Regression Statistics
Multiple R           0.89
R Square             0.79
Adjusted R Square    0.69
Standard Error       9.18
Observations         10.00

ANOVA
             df   SS        MS       F      Significance F
Regression   3    1920.99   640.33   7.59   0.0182
Residual     6    505.91    84.32
Total        9    2426.90

              Coefficients   Standard Error   t Stat   P-value   Lower 95%   Upper 95%
Intercept     135.15         20.65            6.54     0.0006    84.61       185.68
Price         -0.14          0.06             -2.41    0.0500    -0.29       0.00
Advertising   0.54           0.64             0.85     0.4296    -1.02       2.09
Distance      -5.78          1.26             -4.61    0.0037    -8.86       -2.71

© 2017 by McGraw-Hill Education. All Rights Reserved. 3-24


dr exp (X)   insurance (Y)
5            64
2            87
12           50
9            71
15           44
6            56
25           42
16           60
Sum: 90      Sum: 474

SUMMARY OUTPUT

Regression Statistics
Multiple R           0.77
R Square             0.59
Adjusted R Square    0.52
Standard Error       10.32
Observations         8

ANOVA
             df   SS        MS       F      Significance F
Regression   1    918.49    918.49   8.62   0.0261
Residual     6    639.01    106.50
Total        7    1557.50

             Coefficients   Standard Error   t Stat   P-value    Lower 95%   Upper 95%
Intercept    76.66          6.96             11.01    3.33E-05   59.63       93.69
dr exp       -1.55          0.53             -2.94    0.0261     -2.84       -0.26

© 2017 by McGraw-Hill Education. All Rights Reserved. 25
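For reference, a short numpy sketch that reproduces the regression above from the eight data points, recovering an intercept of about 76.66 and a slope of about -1.55:

# Sketch: reproduce the insurance-vs-driving-experience regression above
import numpy as np

dr_exp    = np.array([5, 2, 12, 9, 15, 6, 25, 16], dtype=float)
insurance = np.array([64, 87, 50, 71, 44, 56, 42, 60], dtype=float)

slope, intercept = np.polyfit(dr_exp, insurance, 1)
print(round(intercept, 2), round(slope, 2))   # about 76.66 and -1.55, as in the output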


Cholesterol   Age
189           58
235           69
193           43
177           39
154           63
191           52
213           47
165           31
198           74
181           36

SUMMARY OUTPUT

Regression Statistics
Multiple R           0.41
R Square             0.17
Adjusted R Square    0.06
Standard Error       14.05
Observations         10

ANOVA
             df   SS        MS       F      Significance F
Regression   1    316.22    316.22   1.60   0.2413
Residual     8    1579.38   197.42
Total        9    1895.60

              Coefficients   Standard Error   t Stat   P-value   Lower 95%   Upper 95%
Intercept     2.53           38.71            0.07     0.9495    -86.75      91.80
Cholesterol   0.26           0.20             1.27     0.2413    -0.21       0.72

© 2017 by McGraw-Hill Education. All Rights Reserved. 26
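The same approach applied to the cholesterol and age data above; the fitted coefficients match the output, and the P-value shows the relationship is not statistically significant:

# Sketch: age regressed on cholesterol, matching the output above
import numpy as np
import statsmodels.api as sm

cholesterol = np.array([189, 235, 193, 177, 154, 191, 213, 165, 198, 181], dtype=float)
age = np.array([58, 69, 43, 39, 63, 52, 47, 31, 74, 36], dtype=float)

results = sm.OLS(age, sm.add_constant(cholesterol)).fit()
print(results.params)      # about [2.53, 0.26]
print(results.pvalues[1])  # about 0.24: the cholesterol coefficient is not significant
print(results.rsquared)    # about 0.17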
