
Least Squares Regression Analysis

Least Squares Regression Analysis (1 of 2)

The model that best fits the data is the one that minimizes the
sum of squared error. This model is called the least squares
regression line, and the procedure for finding it is called the
method of least squares.
Definition of Least Squares Regression Line
For a set of points

(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)

the least squares regression line is the linear function

f(x) = a_0 + a_1 x

that minimizes the sum of squared error

[y_1 - f(x_1)]^2 + [y_2 - f(x_2)]^2 + \cdots + [y_n - f(x_n)]^2
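
As an aside (not part of the original text), the definition translates directly into a short computation; the data points and candidate coefficients below are illustrative placeholders:

```python
# Minimal sketch of the definition above: the sum of squared error of a
# candidate line f(x) = a0 + a1*x over a set of points. The data and the
# candidate coefficients here are illustrative, not prescribed by the text.
def sum_of_squared_error(points, a0, a1):
    """Return [y1 - f(x1)]^2 + ... + [yn - f(xn)]^2 for f(x) = a0 + a1*x."""
    return sum((y - (a0 + a1 * x)) ** 2 for x, y in points)

points = [(1, 2), (2, 3), (3, 5)]
print(sum_of_squared_error(points, 1.0, 1.5))  # 0.25 + 1.0 + 0.25 = 1.5
```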

Least Squares Regression Analysis (2 of 2)

Matrix Form for Linear Regression


For the regression model Y = XA + E, the coefficients of
the least squares regression line are given by the matrix
equation

A = (X^T X)^{-1} X^T Y

and the sum of squared error is

E^T E
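
As a computational aside (not from the text), the matrix equation can be evaluated with NumPy; the helper name and arguments below are illustrative:

```python
# Sketch of the matrix equation A = (X^T X)^(-1) X^T Y, assuming NumPy.
# The helper name least_squares_line is illustrative, not from the text.
import numpy as np

def least_squares_line(xs, ys):
    """Return (a0, a1) for the least squares regression line f(x) = a0 + a1*x."""
    xs = np.asarray(xs, dtype=float)
    X = np.column_stack([np.ones_like(xs), xs])   # column of 1s pairs with a0
    Y = np.asarray(ys, dtype=float).reshape(-1, 1)
    # Solving the normal equations (X^T X) A = X^T Y is equivalent to applying
    # the inverse in the formula above, and is the usual numerical approach.
    A = np.linalg.solve(X.T @ X, X.T @ Y)
    return A[0, 0], A[1, 0]
```

Solving the linear system rather than forming (X^T X)^{-1} explicitly gives the same coefficients with better numerical behavior.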

Example 7 – Finding the Least Squares Regression Line

Find the least squares regression line for the points (1, 1),
(2, 2), (3, 4), (4, 4), and (5, 6).

Solution:
The matrices X and Y are

X = \begin{bmatrix} 1 & 1 \\ 1 & 2 \\ 1 & 3 \\ 1 & 4 \\ 1 & 5 \end{bmatrix}, \quad Y = \begin{bmatrix} 1 \\ 2 \\ 4 \\ 4 \\ 6 \end{bmatrix}
Example 7 – Solution (1 of 3)
This means that

X^T X = \begin{bmatrix} 5 & 15 \\ 15 & 55 \end{bmatrix}

and

X^T Y = \begin{bmatrix} 17 \\ 63 \end{bmatrix}
Example 7 – Solution (2 of 3)
Now, using A = (X^T X)^{-1} X^T Y to find the coefficient matrix A, you have

A = (X^T X)^{-1} X^T Y = \frac{1}{50} \begin{bmatrix} 55 & -15 \\ -15 & 5 \end{bmatrix} \begin{bmatrix} 17 \\ 63 \end{bmatrix} = \begin{bmatrix} -0.2 \\ 1.2 \end{bmatrix}
Example 7 – Solution (3 of 3)
So, the least squares regression line is
y = −0.2 + 1.2x
as shown in Figure 2.8.

The sum of squared error for this line is 0.8 (verify this),
which means that this line fits the data better than either of
the two experimental linear models determined earlier.
Figure 2.8
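
The result can be checked numerically; the following sketch (assuming NumPy) builds the matrices from Example 7 and reproduces the coefficients and the sum of squared error stated above:

```python
# Numerical check of Example 7, assuming NumPy.
import numpy as np

X = np.array([[1, 1],
              [1, 2],
              [1, 3],
              [1, 4],
              [1, 5]], dtype=float)
Y = np.array([[1], [2], [4], [4], [6]], dtype=float)

A = np.linalg.inv(X.T @ X) @ (X.T @ Y)   # A = (X^T X)^(-1) X^T Y
E = Y - X @ A                            # residuals E = Y - XA
print(A.ravel())         # approximately [-0.2, 1.2], i.e. y = -0.2 + 1.2x
print((E.T @ E).item())  # approximately 0.8, the sum of squared error
```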
