Basic Regression Analysis: Two Techniques to Identify a Closer Approximation (SRF) of the PRF


Basic Regression Analysis

Two Techniques to Identify a Closer Approximation (SRF) of the PRF

• The Ordinary Least Squares Method (OLS)

• The Maximum Likelihood Method (ML)

• Ordinary Least Squares (OLS) – Carl Friedrich Gauss

• Attractive statistical properties

• The most popular method in regression analysis


ORDINARY LEAST SQUARES TECHNIQUE

• The Ordinary Least Squares Method (OLS)

• Based on the PRINCIPLE of Least Squares

• Recall the PRF:

Yi = β1 + β2Xi + ui

As the PRF is unobservable, we estimate it from the SRF:

Yi = β̂1 + β̂2Xi + ûi

Yi = Ŷi + ûi

where Ŷi is the estimated value (conditional mean) of Yi (see the sketch below)
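
A minimal Python sketch of this decomposition, assuming made-up parameter values (β1 = 2, β2 = 0.5) and guessed estimates purely for illustration:

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical PRF: Yi = beta1 + beta2*Xi + ui (parameter values are assumptions)
beta1, beta2 = 2.0, 0.5
X = np.linspace(1.0, 10.0, 20)
Y = beta1 + beta2 * X + rng.normal(scale=1.0, size=X.size)

# An arbitrary candidate SRF with guessed estimates b1_hat, b2_hat
b1_hat, b2_hat = 1.8, 0.55
Y_hat = b1_hat + b2_hat * X     # fitted values (estimated conditional means)
u_hat = Y - Y_hat               # residuals

# The decomposition Yi = Y_hat_i + u_hat_i holds by construction
assert np.allclose(Y, Y_hat + u_hat)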


ORDINARY LEAST SQUARES TECHNIQUE
How is the SRF determined?

• We need to determine the SRF in such a manner that Ŷi is as close as possible to Yi

• We know that

• ûi = Yi − Ŷi

• ûi = Yi − (β̂1 + β̂2Xi)

• which gives ûi as the difference between the actual and the estimated value

• And thus, if Yi = Ŷi, then ûi = 0

We define a Criterion:

• wherein we choose the SRF in such a way that

Σ ûi = Σ (Yi − Ŷi)

is as small as possible

ORDINARY LEAST SQUARES TECHNIQUE

There is a problem with this Criterion!

• It gives equal weight to all ûi (û1 + û2 + û3 + û4 + ……)

• Some ûi may be large while others may lie close to the SRF

• Consequence:
Even when the individual ûi are scattered very widely around the SRF, we may still end up with
Σ ûi = 0

• Hence this criterion does not yield a unique SRF (a numerical sketch follows below)
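
To see the problem concretely, here is a minimal sketch on made-up data: any candidate SRF forced through the point of means (X̄, Ȳ) makes the residuals sum to zero, so wildly different lines all satisfy Σ ûi = 0:

import numpy as np

# Made-up illustrative data
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 2.9, 4.2, 4.8, 6.0])
x_bar, y_bar = X.mean(), Y.mean()

# Two very different candidate SRFs, both passing through (x_bar, y_bar)
for b2_hat in (0.0, 5.0):                # wildly different slopes
    b1_hat = y_bar - b2_hat * x_bar      # intercept chosen so the line hits the means
    u_hat = Y - (b1_hat + b2_hat * X)    # residuals
    print(b2_hat, round(u_hat.sum(), 12))  # sum of residuals is 0 in both cases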


ORDINARY LEAST SQUARES TECHNIQUE

• An Alternative Criterion

• The SRF can be determined in such a way that

Σ ûi² = Σ (Yi − Ŷi)² = Σ (Yi − β̂1 − β̂2Xi)²

is as small as possible


ORDINARY LEAST SQUARES TECHNIQUE

Min Σ ûi² = Σ (Yi − Ŷi)² = Σ (Yi − β̂1 − β̂2Xi)²

Advantages:

• A different weight is given to each ûi according to its magnitude, in a more appropriate manner

• Here, for ûi closely scattered around the SRF, we will have a lower Σ ûi²

• And for ûi widely scattered, we will have a higher Σ ûi²

• Further, the estimators from this Least Squares technique have some desirable properties
ORDINARY LEAST SQUARES TECHNIQUE
Choosing the best (β̂1 and β̂2)

• It is obvious that Σ ûi² = f(β̂1, β̂2)

implying that the Sum of Squared Deviations is some function of the estimators

• Thus, for a given data set, different values of the estimators (different SRFs) will generate different Σ ûi²

• Which pair of estimators should be chosen? (see the sketch below)
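
A minimal sketch of this dependence, reusing the made-up data from above: each candidate pair (β̂1, β̂2) defines a different SRF and yields a different Σ ûi²:

import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 2.9, 4.2, 4.8, 6.0])

def ssr(b1_hat, b2_hat):
    # Sum of squared residuals for the candidate SRF: Y_hat = b1_hat + b2_hat*X
    u_hat = Y - (b1_hat + b2_hat * X)
    return (u_hat ** 2).sum()

# Different estimator pairs (different SRFs) give different sums of squared residuals
for pair in [(1.0, 1.0), (1.2, 0.9), (0.0, 1.5)]:
    print(pair, round(ssr(*pair), 3))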


ORDINARY LEAST SQUARES TECHNIQUE

In general we follow:

Two Steps – OLS

I. Consider all possible values of the estimators (β̂1, β̂2)

II. Identify the pair of estimators which minimizes the sum of squared residuals Σ ûi²

• Trial and Error method

• Possible, but time-consuming

• Use of Calculus

• Less time-consuming and more accurate (both approaches are sketched below)
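
A minimal sketch of the trial-and-error approach on the same made-up data: a brute-force grid search over candidate (β̂1, β̂2) pairs. It works, but it is slow and only as accurate as the grid is fine; calculus delivers the exact minimizer directly (see the final snippet):

import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 2.9, 4.2, 4.8, 6.0])

# Trial and error: evaluate the sum of squared residuals over a coarse grid
best = (None, None, np.inf)
for b1_hat in np.linspace(-2.0, 4.0, 301):
    for b2_hat in np.linspace(-2.0, 4.0, 301):
        ssr = ((Y - b1_hat - b2_hat * X) ** 2).sum()
        if ssr < best[2]:
            best = (b1_hat, b2_hat, ssr)
print("grid search minimizer:", best)   # approximate, limited by grid spacing
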
ORDINARY LEAST SQUARES TECHNIQUE

Calculating the Least Squares Estimators β̂1 and β̂2

We know that OLS is based on minimization of Σ ûi² with respect to β̂1 and β̂2

Σ ûi² = Σ (Yi − Ŷi)²

or Σ ûi² = Σ (Yi − β̂1 − β̂2Xi)²

• Use of calculus gives the following two partial derivatives, which are then set equal to zero:

∂(Σ ûi²)/∂β̂1 = −2 Σ (Yi − β̂1 − β̂2Xi) = 0

∂(Σ ûi²)/∂β̂2 = −2 Σ (Yi − β̂1 − β̂2Xi) Xi = 0


ORDINARY LEAST SQUARES TECHNIQUE
Calculating the Least Squares Estimators β̂1 and β̂2
Rearrangement of the First Order Conditions gives the following two equations for estimating β̂1 and β̂2:

Σ Yi = n β̂1 + β̂2 Σ Xi

Σ XiYi = β̂1 Σ Xi + β̂2 Σ Xi²

where n is the sample size

Solving them simultaneously, we get

β̂2 = Σ (Xi − X̄)(Yi − Ȳ) / Σ (Xi − X̄)²

β̂1 = Ȳ − β̂2 X̄

where X̄ and Ȳ are the sample means (a worked implementation follows below)
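
A minimal sketch of the calculus-based solution on the same made-up data, cross-checked against NumPy's built-in degree-1 least-squares fit:

import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 2.9, 4.2, 4.8, 6.0])

# Closed-form OLS estimators from the normal equations
x_bar, y_bar = X.mean(), Y.mean()
b2_hat = ((X - x_bar) * (Y - y_bar)).sum() / ((X - x_bar) ** 2).sum()
b1_hat = y_bar - b2_hat * x_bar
print("beta1_hat:", b1_hat, "beta2_hat:", b2_hat)

# Cross-check: np.polyfit returns [slope, intercept] for degree 1
b2_np, b1_np = np.polyfit(X, Y, 1)
assert np.allclose([b1_hat, b2_hat], [b1_np, b2_np])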
