
Basic Estimation Techniques
Agus Rizal Firdaus 175020218113005
Muhammad Hanief Al Kautsar 175020218113008
Ellya Ramadhianingrum 175020218113015
Learning Objectives
• Set up and interpret simple linear regression equations.
• Estimate intercept and slope parameters of a regression line using the
method of least-squares.
• Determine statistical significance using either t-tests or p-values
associated with parameter estimates.
• Evaluate the “fit” of a regression equation to the data using the R²
statistic and test for statistical significance of the whole regression
equation using an F-test.
• Set up and interpret multiple regression models that use more than one
explanatory variable.
• Use linear regression techniques to estimate the parameters of two
common nonlinear models: quadratic and log-linear regression models.
Basic Estimation
Parameters
The coefficients in an equation that determine
the exact mathematical relation among the
variables.

Parameter Estimation
The process of finding estimates of the
numerical values of the parameters of an
equation.
Regression Analysis
Regression Analysis
A statistical technique for estimating the parameters
of an equation and testing for statistical significance.

Dependent Variable
Variable whose variation is to be explained.

Explanatory Variables
Variables that are thought to cause the
dependent variable to take on different values.
Simple Linear Regression
• True regression line relates dependent variable Y to
one explanatory (or independent) variable X:

Y = a + bX

• Intercept parameter (a) gives the value of Y where the regression
line crosses the Y-axis (the value of Y when X is zero).
• Slope parameter (b) gives the change in Y associated with a
one-unit change in X:

b = ΔY/ΔX
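As a quick illustration of the slope interpretation, the Python sketch below uses hypothetical parameter values (a = 10, b = 5); the numbers are not taken from the slides.

# Hypothetical parameters for Y = a + bX (illustration only)
a, b = 10.0, 5.0

# Predicted Y at two values of X that are one unit apart
y_at_x3 = a + b * 3
y_at_x4 = a + b * 4

print(y_at_x4 - y_at_x3)  # 5.0 = b, the change in Y per one-unit change in X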
Simple Linear Regression
• Regression line shows the average or expected value
of Y for each level of X.
• True (or actual) underlying relation between Y and X
is unknown to the researcher but is to be discovered
by analyzing the sample data.
• Random error term accounts for random effects that cause
observed values of Y to deviate from the true regression line.
The True Regression Line: Relating Sales
and Advertising Expenditures
The Impact of Random Effects on
January Sales:
Fitting A Regression Line
The purpose of regression analysis is twofold:
1. To estimate the parameters (a and b) of the true
regression line.
2. To test whether the estimated values of the
parameters are statistically significant.
Fitting A Regression Line
Time Series
• Data collected over time for a specific firm
(or a specific industry).

Cross-Sectional
• Data collected from several different firms
or industries at a given time.

Method of Least-Squares
• A method of estimating the parameters of a linear
regression equation by finding the line that minimizes
the sum of the squared distances from each sample
data point to the sample regression line.
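A minimal Python sketch of the least-squares formulas, using made-up advertising (X) and sales (Y) figures rather than the sample of seven travel agencies shown on the next slide:

import numpy as np

# Hypothetical advertising expenditures (X) and sales (Y) for seven agencies
X = np.array([2.0, 3.0, 4.5, 5.5, 7.0, 8.0, 9.0])
Y = np.array([15.0, 22.0, 31.0, 35.0, 46.0, 50.0, 58.0])

# Least-squares estimates: b_hat minimizes the sum of squared residuals
b_hat = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
a_hat = Y.mean() - b_hat * X.mean()

print(f"Sample regression line: Y = {a_hat:.2f} + {b_hat:.2f} X")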
Sales and Advertising Expenditures
for a Sample of Seven Travel
Agencies:
The Sample Regression Line:
Relating Sales and Advertising
Expenditures
Statistical Significance
Statistical Significance
• There is sufficient evidence from the sample to
indicate that the true value of the coefficient is not 0.

Hypothesis Testing
• A statistical technique for making a probabilistic
statement about the true value of a parameter.
Statistical Significance
• Must determine if there is sufficient statistical
evidence to indicate that Y is truly related to X (i.e., b ≠ 0).
• Even if b = 0, it is possible that the sample will produce an
estimate that is different from zero.
• Test for statistical significance using t-tests or p-values.
The Relative Frequency Distribution for b̂ :
Relative Frequency Distribution for b̂ when b = 5
The concept of a t-Ratio:

t = b̂ / S_b̂

where b̂ is the least-squares estimate of b and S_b̂ is the
standard error of the estimate, both of which are calculated
by the computer.
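A minimal sketch of the calculation, with a made-up estimate and standard error standing in for the computer output:

b_hat = 4.97     # hypothetical least-squares estimate of b
s_b_hat = 1.23   # hypothetical standard error of the estimate

t_ratio = b_hat / s_b_hat
print(f"t = {t_ratio:.2f}")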
Performing a t-Test for Statistical Significance
• Determine the level of significance.
- Probability of finding a parameter estimate to be statistically
different from zero when, in fact, it is zero.
- Probability of a Type I Error.
• (1 - Level of Significance = Level of Confidence)
- Level of confidence is the probability of correctly failing to
reject the true hypothesis that b = 0.
• Compare the absolute value of the t-ratio with the critical t-value
(n - k degrees of freedom); if |t| exceeds it, the estimate is
statistically significant (see the sketch below).
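A sketch of the comparison step in Python, assuming a hypothetical t-ratio, a sample of n = 7 with k = 2 estimated parameters, and a 5 percent significance level:

from scipy import stats

n, k = 7, 2          # hypothetical sample size and number of parameters
alpha = 0.05         # chosen level of significance
t_ratio = 4.04       # hypothetical t-ratio for the estimate of b

# Two-tailed critical t-value with n - k degrees of freedom
t_critical = stats.t.ppf(1 - alpha / 2, df=n - k)

if abs(t_ratio) > t_critical:
    print(f"|t| = {abs(t_ratio):.2f} > {t_critical:.2f}: b is statistically significant")
else:
    print(f"|t| = {abs(t_ratio):.2f} <= {t_critical:.2f}: cannot reject b = 0")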
Using p-Values to Determine Statistical Significance
• Treat as statistically significant only those parameter
estimates with p-values smaller than the maximum
acceptable significance level.
• p-value gives the exact level of significance.
- Also the probability of finding significance when none
exists.
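A small sketch of how a two-tailed p-value can be computed from a t-ratio, using the same hypothetical numbers as above:

from scipy import stats

t_ratio, df = 4.04, 5   # hypothetical t-ratio and degrees of freedom (n - k)
alpha = 0.05            # maximum acceptable significance level

# Probability of a t-ratio at least this extreme if the true b were zero
p_value = 2 * stats.t.sf(abs(t_ratio), df)
print(f"p-value = {p_value:.4f}, significant: {p_value < alpha}")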
Evaluation of The Regression Equation
The Coefficient of Determination (R²)
• R² measures the fraction of total variation in the
dependent variable (Y) that is explained by the
variation in X.
- Ranges from 0 to 1.
- A high R² indicates Y and X are highly correlated, but it
does not prove that Y and X are causally related.
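A minimal sketch of the R² calculation from hypothetical observed and fitted values of Y:

import numpy as np

# Hypothetical observed values and fitted values from the sample regression line
Y = np.array([15.0, 22.0, 31.0, 35.0, 46.0, 50.0, 58.0])
Y_hat = np.array([16.1, 21.3, 29.1, 34.3, 42.1, 47.3, 52.5])

# R^2 = 1 - (unexplained variation / total variation)
ss_total = np.sum((Y - Y.mean()) ** 2)
ss_residual = np.sum((Y - Y_hat) ** 2)
r_squared = 1 - ss_residual / ss_total

print(f"R^2 = {r_squared:.3f}")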
Evaluation of The Regression Equation
The F-Statistic
• Used to test for significance of the overall regression equation.
• Compare the F-statistic to the critical F-value from the F-table.
- Two degrees of freedom: k – 1 (numerator) and n – k (denominator).
- Level of significance.
• If the F-statistic exceeds the critical F, the regression equation
overall is statistically significant at the specified level of
significance.
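A sketch of the F-test, assuming a hypothetical R² of 0.92 from a regression with n = 7 observations and k = 2 parameters:

from scipy import stats

n, k = 7, 2          # hypothetical sample size and number of parameters
alpha = 0.05
r_squared = 0.92     # hypothetical R^2 from the estimated equation

# F-statistic for the overall significance of the regression
f_stat = (r_squared / (k - 1)) / ((1 - r_squared) / (n - k))
f_critical = stats.f.ppf(1 - alpha, dfn=k - 1, dfd=n - k)

print(f"F = {f_stat:.2f}, critical F = {f_critical:.2f}, significant: {f_stat > f_critical}")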
Multiple Regression
• Regression models that use more than one explanatory
variable to explain the variation in the dependent variable.
• Coefficient for each explanatory variable measures
the change in the dependent variable associated with a
one-unit change in that explanatory variable, all else constant.
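A minimal sketch of a multiple regression fit using statsmodels, with made-up sales, advertising, and price data; the variable names and numbers are illustrative only:

import numpy as np
import statsmodels.api as sm

# Hypothetical data: sales explained by advertising and price
advertising = np.array([2.0, 3.0, 4.5, 5.5, 7.0, 8.0, 9.0])
price = np.array([10.0, 9.5, 9.0, 9.2, 8.5, 8.0, 7.8])
sales = np.array([15.0, 22.0, 31.0, 35.0, 46.0, 50.0, 58.0])

# Build the design matrix with an intercept and fit by least squares
X = sm.add_constant(np.column_stack([advertising, price]))
results = sm.OLS(sales, X).fit()

print(results.params)    # intercept and one slope coefficient per explanatory variable
print(results.pvalues)   # p-values for testing each parameter against zero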
Nonlinear Regression Analysis
Quadratic Regression Models
• Use when the scatter plot of the data is U-shaped or
∩-shaped:

Y = a + bX + cX²

- For a linear transformation, compute the new variable Z = X².
- Estimate Y = a + bX + cZ.
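A sketch of the linear transformation for a quadratic model, again with made-up data (for example, a U-shaped average-cost curve):

import numpy as np
import statsmodels.api as sm

# Hypothetical U-shaped data
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
Y = np.array([20.0, 14.0, 11.0, 10.0, 11.5, 14.0, 19.0])

# Linear transformation: create Z = X^2, then estimate Y = a + bX + cZ by OLS
Z = X ** 2
design = sm.add_constant(np.column_stack([X, Z]))
results = sm.OLS(Y, design).fit()

a_hat, b_hat, c_hat = results.params
print(f"a = {a_hat:.2f}, b = {b_hat:.2f}, c = {c_hat:.2f}")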
Nonlinear Regression Analysis
Log-Linear Regression Models
• Use when the relation takes the form: Y = aX^b Z^c
• Transform by taking natural logarithms:

ln Y = ln a + b ln X + c ln Z

- b and c are elasticities.


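A sketch of the log-linear estimation with made-up data; the variable names and numbers are illustrative, not from the slides:

import numpy as np
import statsmodels.api as sm

# Hypothetical data for Y = a * X^b * Z^c
X = np.array([10.0, 12.0, 15.0, 18.0, 20.0, 25.0, 30.0])
Z = np.array([50.0, 55.0, 60.0, 62.0, 70.0, 75.0, 80.0])
Y = np.array([120.0, 105.0, 98.0, 90.0, 95.0, 85.0, 80.0])

# Take natural logs so the model is linear in the parameters: lnY = ln a + b lnX + c lnZ
design = sm.add_constant(np.column_stack([np.log(X), np.log(Z)]))
results = sm.OLS(np.log(Y), design).fit()

ln_a_hat, b_hat, c_hat = results.params
print(f"b (elasticity of Y with respect to X) = {b_hat:.2f}")
print(f"c (elasticity of Y with respect to Z) = {c_hat:.2f}")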
Thank You
