

1 Theory of Least Square Fitting


Linear least square fitting is a technique for finding the best-fit line that minimizes the sum of the squared residuals between the observed data points and the values predicted by the line. Given a set of data points {(xi, yi)}, i = 1, …, n, the goal is to find the line of the form y = mx + c that best fits the data. Linear curve:

y = mx + c

The slope and intercept are given by the least square normal equations:

Slope(m) = (n·Σxy − (Σx)·(Σy)) / (n·Σx² − (Σx)²)

Intercept(c) = ȳ − m·x̄

Theory of Linear Least Square Fitting

The best-fit line is then given by y = mx + c, where m and c are the values obtained from solving the normal
equations.
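
As a quick sanity check, the closed-form slope and intercept can be compared against NumPy's built-in degree-1 polynomial fit. The following is a minimal sketch with made-up sample data (not the data set used in the next section); np.polyfit serves only as an independent cross-check:

# Cross-check: closed-form least square slope/intercept vs. np.polyfit.
# The sample data here is illustrative only.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 4.2, 5.9, 8.1])

n = len(x)
m = (n*np.sum(x*y) - np.sum(x)*np.sum(y)) / (n*np.sum(x*x) - np.sum(x)**2)
c = np.mean(y) - m*np.mean(x)

m_np, c_np = np.polyfit(x, y, 1)   # degree-1 fit returns [slope, intercept]
print(m, c)        # closed-form result
print(m_np, c_np)  # should agree to floating-point precision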

Code and output of linear least square fitting

# considering the equation to be y = mx + c
import matplotlib.pyplot as plt
import numpy as np

x = np.array([i for i in range(1, 7)])
y = np.array([1.5, 2.5, 3.5, 4.5, 13.6, 9.9])

xy = x * y
xx = x * x
n = len(x)

sxy = sum(xy)
sxx = sum(xx)
sx = sum(x)
sy = sum(y)

# m = {n.sum(x.y) - sum(x).sum(y)} / {n.sum(x^2) - sum(x).sum(x)}
m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
# c = {sum(y) - m.sum(x)} / n
c = (sy - m * sx) / n

# fitted values on the line y = mx + c
z = []
for i in x:
    z.append(m * i + c)

print("Data points : ")
for i, j in zip(x, y):
    print(f'({i},{j})', end='')
print()
print("Fitted points : ")
for i, j in zip(x, z):
    print(f'({i},{j})', end='')

plt.scatter(x, y, color='black', label='Data Points')
plt.plot(x, y, color='blue', label='Data Curve')
plt.plot(x, z, color='red', label='Fitted Line')
plt.xlabel("X axis")
plt.ylabel("Y axis")
plt.title("Linear Least Square Fitting")
plt.legend()
plt.grid()
plt.show()

Figure 1: Linear Function (output plot)
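
Since least square fitting minimizes the sum of squared residuals, it is worth printing that quantity for the fit above. A minimal follow-up sketch, assuming it is appended to the script above so that x, y, m and c are already in scope:

# Sum of squared residuals (SSE) of the fitted line.
# Assumes x, y, m, c are defined by the preceding script.
residuals = y - (m*x + c)
sse = np.sum(residuals**2)
print("Sum of squared residuals :", sse)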

Theory of least square fitting of a power function

By taking the logarithm of both sides of the power function, you can transform it into a linear equation that can be fitted using linear regression. Here is how:

• Start with the power function: y = ax^b.

• Take the logarithm (loge) of both sides of the equation: loge(y) = loge(ax^b).

• Apply the properties of logarithms: loge(y) = loge(a) + b·loge(x).

• Now fit the curve exactly as in linear regression, Y = MX + C {M and C are the slope and intercept}, with Y = loge(y), X = loge(x), M = b, C = loge(a) (see the sketch after this list).
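
The whole linearization fits in a few lines. A minimal sketch using np.polyfit on the log-transformed data; the sample values are illustrative and chosen to lie near y = 2x^3:

# Sketch: fit y = a*x**b by a linear fit on (log x, log y).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 16.0, 54.0, 128.0, 250.0])   # ~ y = 2*x**3

b, logc = np.polyfit(np.log(x), np.log(y), 1)   # slope M = b, intercept C = loge(a)
a = np.exp(logc)
print(a, b)   # expect roughly a = 2, b = 3 for this sample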

Code and output of least square fitting of a power function

# considering the power function to be y = a*x**b
# log(y) = log(a) + b*log(x)
# comparing with the linear equation Y = mX + c :
# Y = log(y), X = log(x), m = b, c = log(a)
import matplotlib.pyplot as plt
import numpy as np

x = np.array([i for i in range(1, 6)])
y = np.array([2, 16, 54, 128, 251])

X = np.log(x)
Y = np.log(y)

X_mean = np.mean(X)
Y_mean = np.mean(Y)
n = len(x)

sxy = np.sum(X * Y) - n * (X_mean * Y_mean)
sxx = np.sum(X * X) - n * (X_mean * X_mean)

# m = b
m = sxy / sxx
c = Y_mean - m * X_mean
a = np.exp(c)   # a = e**c, since c = log(a)

print("Value of the slope = ", m)
print("Value of the intercept = ", c)
print()
print("The value of a in the function ", a)
b = m
print("The value of b in the function ", b)

# fitted values y = a*x**b at the data points
z = []
for i in x:
    z.append(a * i**b)

print("Data points : ")
for i, j in zip(x, y):
    print(f'({i},{j})', end='')
print()
print("Fitted points : ")
for i, j in zip(x, z):
    print(f'({i},{j})')

# smooth curve for plotting the fitted power function
xx = np.linspace(1, 6, 100)
yfit = a * np.power(xx, b)

plt.scatter(x, y, color='black', label='Data Points')
plt.plot(x, y, color='blue', label='Data Curve')
plt.plot(xx, yfit, color='red', label='Fitted Curve')
plt.xlabel("X axis")
plt.ylabel("Y axis")
plt.title("Power-function Least Square Fitting")
plt.legend()
plt.grid()
plt.show()

Figure 2: Power-function fit (output plot)
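
For this particular data set the recovered parameters should come out close to a = 2 and b = 3, since the sample points lie almost exactly on y = 2x^3 (2·5^3 = 250 against the observed 251).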

Theory of least square fitting of an exponential function

By taking the logarithm of both sides of the exponential equation, you can transform it into a linear equation that can be fitted using linear regression. Here is how:

• Start with the exponential equation: y = ae^(bx).

• Take the logarithm (loge) of both sides of the equation: loge(y) = loge(ae^(bx)).

• Apply the properties of logarithms: loge(y) = loge(a) + b·x.

• Now fit the curve exactly as in linear regression, Y = MX + C {M and C are the slope and intercept}, with Y = loge(y), X = x, M = b, C = loge(a) (see the sketch after this list).
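
Again the linearization can be written compactly. A minimal sketch using np.polyfit on (x, log y); the sample values are illustrative and chosen to lie near y = e^x:

# Sketch: fit y = a*e**(b*x) by a linear fit on (x, log y).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.7, 7.4, 20.1, 54.6, 148.4])   # ~ y = e**x

b, logc = np.polyfit(x, np.log(y), 1)   # slope M = b, intercept C = loge(a)
a = np.exp(logc)
print(a, b)   # expect roughly a = 1, b = 1 for this sample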


Code and output of least square fitting of an exponential function

# considering the exponential equation to be y = a*e**(b*x)
# log(y) = log(a) + b*x
# comparing with the linear equation Y = mX + c :
# Y = log(y), X = x, m = b, c = log(a)
import matplotlib.pyplot as plt
import numpy as np

x = np.array([i for i in range(1, 7)])
y = np.array([2, 5, 11, 29, 78, 94])

X = x
Y = np.log(y)

X_mean = np.mean(X)
Y_mean = np.mean(Y)
n = len(x)

sxy = np.sum(X * Y) - n * (X_mean * Y_mean)
sxx = np.sum(X * X) - n * (X_mean * X_mean)

# m = b
m = sxy / sxx
c = Y_mean - m * X_mean
a = np.exp(c)   # a = e**c, since c = log(a)

print("Value of the slope = ", m)
print("Value of the intercept = ", c)
print()
print("The value of a in the function ", a)
b = m
print("The value of b in the function ", b)

# fitted values y = a*e**(b*x) at the data points
z = []
for i in x:
    z.append(a * np.exp(b * i))

print("Data points : ")
for i, j in zip(x, y):
    print(f'({i},{j})', end='')
print()
print("Fitted points : ")
for i, j in zip(x, z):
    print(f'({i},{j})')

# smooth curve for plotting the fitted exponential
xx = np.linspace(1, 6, 100)
yfit = a * np.exp(b * xx)

plt.scatter(x, y, color='black', label='Data Points')
plt.plot(x, y, color='blue', label='Data Curve')
plt.plot(xx, yfit, color='red', label='Fitted Curve')
plt.scatter(x, z, color='red', label='Fitted Points')
plt.xlabel("X axis")
plt.ylabel("Y axis")
plt.title("Exponential Least Square Fitting")
plt.legend()
plt.grid()
plt.show()

Figure 3: Exponential fit (output plot)
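
One caveat on the design: fitting in log space minimizes squared errors in loge(y), which weights relative rather than absolute errors, so the parameters can differ from a direct nonlinear least square fit. If SciPy is available, such a direct fit is short; a minimal sketch, assuming x, y, a and b from the script above are still in scope:

# Optional cross-check with a direct nonlinear least square fit.
# Assumes x, y, a, b are defined by the preceding script.
from scipy.optimize import curve_fit

popt, pcov = curve_fit(lambda t, A, B: A*np.exp(B*t), x, y, p0=(a, b))
print("Direct nonlinear fit :", popt)   # may differ slightly from (a, b)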
