Lecture 5 CAE Curve Fitting


Computing Applications for Engineers (ENG60104)

Curve Fitting

Learning Outcome
LO2 - Apply numerical methods to find the roots of
algebraic equations, perform interpolation, curve fitting,
numerical differentiation and integration

Lesson Learning Outcome


At the end of the lesson, students should be able to:
• apply different curve-fitting methods to perform calculations
• understand how to find the best-fit curve

Curve Fitting

• Linear Regression
• Polynomial Regression
• Goodness of fit
Linear Regression

Linear Regression

• Linear regression is also known as least-squares regression.
• It fits a straight line to a set of paired observations: (x1, y1), (x2, y2), …, (xn, yn).

[Figure: two scatter plots of y versus x]
Linear Regression

Y = a0 + a1x + e

a0 - Intercept
a1 - slope
e - error, or residual between the model and the observations

Linear Regression
Criteria for a 'best' fit

The strategy is to minimise the sum of the squares of the residuals between the measured y (from the given data) and the y calculated with the linear model (the regression equation):

$S_r = \sum_{i=1}^{n} \left( y_{i,\text{measured}} - y_{i,\text{model}} \right)^2$

A low $S_r$ means a good fit; $S_r$ is the sum of the squares of the residuals.
Linear Regression

Substituting the model $y_{i,\text{model}} = a_0 + a_1 x_i$ into $S_r$ gives

$S_r = \sum_{i=1}^{n} \left( y_{i,\text{measured}} - y_{i,\text{model}} \right)^2 = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i \right)^2$
Linear Regression

Setting the derivative of $S_r$ with respect to $a_0$ to zero:

$\frac{\partial S_r}{\partial a_0} = \sum 2\left( y_i - a_0 - a_1 x_i \right)(-1) = 0$

$\sum \left( y_i - a_0 - a_1 x_i \right) = 0$

$\sum y_i - \sum a_0 - a_1 \sum x_i = 0$

$\sum y_i - n\,a_0 - a_1 \sum x_i = 0$
Linear Regression

Rearranging the first normal equation to solve for $a_0$:

$\sum y_i - a_1 \sum x_i = n\,a_0$

$\frac{\sum y_i}{n} - a_1 \frac{\sum x_i}{n} = a_0$

$a_0 = \bar{y} - a_1 \bar{x}$
Linear Regression

Setting the derivative of $S_r$ with respect to $a_1$ to zero:

$\frac{\partial S_r}{\partial a_1} = \sum 2\left( y_i - a_0 - a_1 x_i \right)(-x_i) = 0$

$\sum \left( y_i - a_0 - a_1 x_i \right) x_i = 0$

$\sum x_i y_i - a_0 \sum x_i - a_1 \sum x_i^2 = 0$
Linear Regression

Solving the two normal equations simultaneously gives

$a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left( \sum x_i \right)^2}, \qquad a_0 = \bar{y} - a_1 \bar{x}$

and the fitted model is $Y = a_0 + a_1 x$.
Example
Fit a straight line through the x and y values in the following table.

xi    yi     xi·yi    xi²
1     0.5    0.5      1
2     2.5    5.0      4
3     2.0    6.0      9
4     4.0    16.0     16
5     3.5    17.5     25
6     6.0    36.0     36
7     5.5    38.5     49
Σ:    28     24       119.5    140

$\bar{x} = \frac{28}{7} = 4, \qquad \bar{y} = \frac{24}{7} = 3.428571$
Using the column sums ($n = 7$):

$\sum x_i = 28, \quad \sum y_i = 24, \quad \sum x_i y_i = 119.5, \quad \sum x_i^2 = 140$

$\bar{x} = \frac{28}{7} = 4, \qquad \bar{y} = \frac{24}{7} = 3.428571$

$a_1 = \frac{7(119.5) - 28(24)}{7(140) - 28^2} = 0.8392857$

$a_0 = 3.428571 - 0.8392857(4) = 0.07142857$
Equation of model
Y = 0.07142857 + 0.8392857 x

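To reproduce these hand calculations, here is a minimal MATLAB sketch (not part of the original slides); the variable names a0 and a1 follow the slide notation, and the data are the seven points of the example.

% Least-squares straight-line fit of the example data
x = [1 2 3 4 5 6 7];
y = [0.5 2.5 2.0 4.0 3.5 6.0 5.5];
n   = length(x);
sx  = sum(x);                            % 28
sy  = sum(y);                            % 24
sxy = sum(x.*y);                         % 119.5
sx2 = sum(x.^2);                         % 140
a1 = (n*sxy - sx*sy) / (n*sx2 - sx^2);   % slope     = 0.8392857
a0 = sy/n - a1*sx/n;                     % intercept = 0.07142857
fprintf('Y = %.8f + %.7f x\n', a0, a1);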
Goodness of fit

r² = coefficient of determination
r = correlation coefficient
A value of r² close to 1 means a good fit.

St = total sum of the squares of the residuals between the data points and the mean
Sr = sum of the squares of the residuals between the data points and the regression line
Goodness of fit

$S_t = \sum_{i=1}^{n} \left( y_{i,\text{measured}} - \bar{y} \right)^2$

$S_r = \sum_{i=1}^{n} \left( y_{i,\text{measured}} - y_{i,\text{model}} \right)^2$
$r^2 = \frac{S_t - S_r}{S_t}$ quantifies the improvement or error reduction due to describing the data in terms of a straight line rather than as an average value.
Example 1
Determine the correlation coefficient of the following data using the linear regression line.

xi    yi
1     0.5
2     2.5
3     2.0
4     4.0
5     3.5
6     6.0
7     5.5
Example 1 - Solution

xi    yi
1     0.5
2     2.5
3     2.0
4     4.0
5     3.5
6     6.0
7     5.5
Σ:    28    24
Example 1 - Solution

$\bar{y} = \frac{24}{7} = 3.428571$

For the first data point, x1 = 1 and y1 = 0.5, so

$\left( y_1 - \bar{y} \right)^2 = (0.5 - 3.428571)^2 = 8.5765$
Example 1 - Solution

xi    yi     (yi − ȳ)²
1     0.5    8.5765
2     2.5    0.8622
3     2.0    2.0408
4     4.0    0.3265
5     3.5    0.0051
6     6.0    6.6122
7     5.5    4.2908
Σ:    28     24     St = 22.7143
Recall the equation of the model:

Y = 0.07142857 + 0.8392857 x

For the first data point, x1 = 1 and y1 = 0.5. Substituting x1 = 1 into the model gives Y = 0.9107, so

$\left( y_1 - Y \right)^2 = (0.5 - 0.9107)^2 = 0.1687$
Example 1 - Solution

xi    yi     (yi − ȳ)²    (yi − Y)²
1     0.5    8.5765       0.1687
2     2.5    0.8622       0.5625
3     2.0    2.0408       0.3473
4     4.0    0.3265       0.3265
5     3.5    0.0051       0.5896
6     6.0    6.6122       0.7972
7     5.5    4.2908       0.1993
Σ:    28     24     St = 22.7143    Sr = 2.9911

$r^2 = \frac{S_t - S_r}{S_t} = \frac{22.7143 - 2.9911}{22.7143} = 0.868, \qquad r = 0.932$
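The goodness-of-fit numbers above can be reproduced with a short MATLAB sketch (not part of the original slides); it reuses the model coefficients found earlier.

% Goodness of fit for Example 1
x = [1 2 3 4 5 6 7];
y = [0.5 2.5 2.0 4.0 3.5 6.0 5.5];
a0 = 0.07142857;  a1 = 0.8392857;        % model from the earlier slides
ymodel = a0 + a1*x;
St = sum((y - mean(y)).^2);              % 22.7143
Sr = sum((y - ymodel).^2);               % 2.9911
r2 = (St - Sr)/St;                       % 0.868
r  = sqrt(r2);                           % 0.932
fprintf('St = %.4f, Sr = %.4f, r = %.3f\n', St, Sr, r);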
Goodness of fit

An alternative formula computes r directly from the sums:

$r = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{\sqrt{n \sum x_i^2 - \left( \sum x_i \right)^2}\;\sqrt{n \sum y_i^2 - \left( \sum y_i \right)^2}}$
Algorithm for linear regression (MATLAB)

x = ...;                                 % x data points
y = ...;                                 % y data points
n   = length(x);
sx  = sum(x);
sy  = sum(y);
sx2 = sum(x.*x);
sy2 = sum(y.*y);
sxy = sum(x.*y);

% slope, intercept and correlation coefficient
a1 = (n*sxy - sx*sy) / (n*sx2 - sx^2);
a0 = sy/n - a1*sx/n;
r  = (n*sxy - sx*sy) / (sqrt(n*sx2 - sx^2) * sqrt(n*sy2 - sy^2));

% Plot the graph using the data points available; in the same
% graph, plot the best-fit line across the data points
plot(x, y, 'o', x, a0 + a1*x, '-');
Example 2

The following data is given:

x:    5     8     13
y:    8     11    14

Apply linear regression (linear least-squares regression) to find the equation of the model.
Example 2 - Solution

x:    5     8     13     Σ = 26
y:    8     11    14     Σ = 33
xy:   40    88    182    Σ = 310
x²:   25    64    169    Σ = 258
Example 2 - Solution

With n = 3:

$a_1 = \frac{3(310) - 26(33)}{3(258) - 26^2} = \frac{72}{98} = 0.7347$

$a_0 = \frac{33}{3} - 0.7347\left(\frac{26}{3}\right) = 4.6327$

$Y = 4.6327 + 0.7347\,x$
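As a quick cross-check (not in the original slides), MATLAB's built-in polyfit returns the same coefficients; note that polyfit lists them in descending order, so p(2) is the intercept and p(1) is the slope.

% Check Example 2 with the built-in least-squares fit
x = [5 8 13];
y = [8 11 14];
p = polyfit(x, y, 1);                    % p = [0.7347  4.6327]
fprintf('Y = %.4f + %.4f x\n', p(2), p(1));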
Polynomial Regression

Polynomial Regression

Consider a 2nd-order (quadratic) polynomial model:

$y = a_0 + a_1 x + a_2 x^2 + e$

The residual between the model and a data point is

$e_i = y_i - a_0 - a_1 x_i - a_2 x_i^2$
Polynomial Regression

The sum of the squares of the residuals is

$S_r = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i - a_2 x_i^2 \right)^2$
Derivation of the 2nd-order polynomial fit

Taking the derivative of $S_r$ with respect to each coefficient $a_0$, $a_1$, $a_2$, setting each derivative to zero, and simplifying the equations gives three linear (normal) equations in the three unknown coefficients.
Summarised into matrix form:

$\begin{bmatrix} n & \sum x_i & \sum x_i^2 \\ \sum x_i & \sum x_i^2 & \sum x_i^3 \\ \sum x_i^2 & \sum x_i^3 & \sum x_i^4 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum y_i \\ \sum x_i y_i \\ \sum x_i^2 y_i \end{bmatrix}$
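In the style of the earlier linear-regression algorithm slide, the normal equations can be assembled and solved in MATLAB as sketched below (a template, not from the slides): the data vectors x and y must be filled in, and the backslash operator solves the 3 x 3 linear system.

% 2nd-order polynomial regression via the normal equations
x = ...;                                 % row vector of x data
y = ...;                                 % row vector of y data
n = length(x);
A = [ n          sum(x)      sum(x.^2)
      sum(x)     sum(x.^2)   sum(x.^3)
      sum(x.^2)  sum(x.^3)   sum(x.^4) ];
b = [ sum(y);  sum(x.*y);  sum((x.^2).*y) ];
a = A \ b;                               % a(1) = a0, a(2) = a1, a(3) = a2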
Polynomial Regression

The 2nd-order fit is obtained by solving this 3 × 3 system for $a_0$, $a_1$ and $a_2$.
Polynomial Regression

Coefficient of determination (same definition as for the straight-line fit):

$r^2 = \frac{S_t - S_r}{S_t}$
Example
Determine the correlation coefficient of the following data using a second-order polynomial regression line.

xi    yi
0     2.1
1     7.7
2     13.6
3     27.2
4     40.9
5     61.1
Example - Solution

xi    yi      xi²    xi³    xi⁴    xi·yi    xi²·yi
0     2.1     0      0      0      0        0
1     7.7     1      1      1      7.7      7.7
2     13.6    4      8      16     27.2     54.4
3     27.2    9      27     81     81.6     244.8
4     40.9    16     64     256    163.6    654.4
5     61.1    25     125    625    305.5    1527.5
Σ:    15    152.6    55     225    979    585.6    2488.8
Example - Solution

Substituting the column sums into the normal equations and solving the 3 × 3 system gives

$a_0 = 2.4786, \quad a_1 = 2.3593, \quad a_2 = 1.8607$

so the fitted model is $Y = 2.4786 + 2.3593\,x + 1.8607\,x^2$.
Example - Solution

xi    yi      (yi − ȳ)²    (yi − Y)²
0     2.1     544.44       0.1433
1     7.7     314.47       1.0029
2     13.6    140.03       1.0816
3     27.2    3.12         0.8049
4     40.9    239.22       0.6195
5     61.1    1272.11      0.0944
Σ:    15    152.6    St = 2513.39    Sr = 3.7466
Example - Solution

$r^2 = \frac{S_t - S_r}{S_t} = \frac{2513.39 - 3.7466}{2513.39} = 0.99851, \qquad r = 0.99925$
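As a cross-check (not part of the original slides), the same fit and goodness of fit can be reproduced with MATLAB's built-in polyfit and polyval; polyfit returns the coefficients in descending powers of x.

% Verify the 2nd-order fit and its goodness of fit
x = [0 1 2 3 4 5];
y = [2.1 7.7 13.6 27.2 40.9 61.1];
p = polyfit(x, y, 2);                    % [a2 a1 a0] = [1.8607  2.3593  2.4786]
ymodel = polyval(p, x);
St = sum((y - mean(y)).^2);              % 2513.39
Sr = sum((y - ymodel).^2);               % 3.7466
r  = sqrt((St - Sr)/St);                 % 0.99925
fprintf('r = %.5f\n', r);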
Thank You!
