
TOPIC 4: Supervised Learning
Understand Regression and Classification

Prepared by Nima Dema

CSA203: Artificial Intelligence and Machine Learning
Regression
Multiple Linear Regression

Multiple Linear Regression

Previously: Simple Linear Regression, which uses a single feature x to predict the target y:

    f_w,b(x) = w x + b

Multiple Linear Regression
Notation (example with n = 4 features, so j = 1…4):

    x_j      = the j-th feature
    n        = the number of features
    x^(i)    = the features of the i-th training example
    x_j^(i)  = the value of feature j in the i-th training example

Example: x_3^(2) = 2, the value of feature 3 in the 2nd training example.
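To make the notation concrete, here is a minimal sketch using a made-up NumPy feature matrix X_train (the variable name and values are assumptions, chosen so that x_3^(2) = 2 as in the example above). NumPy indexing is 0-based, so the i-th training example is row i-1 and feature j is column j-1.

    import numpy as np

    # Made-up data: 3 training examples, n = 4 features each.
    # Row (i-1) holds the i-th training example; column (j-1) holds feature j.
    X_train = np.array([
        [2104, 5, 1, 45],   # 1st training example
        [1416, 3, 2, 40],   # 2nd training example
        [852,  2, 1, 35],   # 3rd training example
    ])

    n = X_train.shape[1]       # number of features -> 4
    x_2 = X_train[1]           # x^(2): the features of the 2nd training example
    x_3_2 = X_train[1, 2]      # x_3^(2): feature 3 of the 2nd example -> 2
    print(n, x_2, x_3_2)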
Multiple Linear Regression
Model

Previously (one feature):

    f_w,b(x) = w x + b

With four features:

    f_w,b(x) = w_1 x_1 + w_2 x_2 + w_3 x_3 + w_4 x_4 + b

Example:

    f_w,b(x) = 0.1 x_1 + 4 x_2 + 10 x_3 - 2 x_4 + 80
    where x_1 = size, x_2 = # bedrooms, x_3 = # floors, x_4 = age (years)
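As a quick check of the example model, here is a minimal sketch that plugs in one hypothetical house (the feature values below are assumptions):

    # Hypothetical house: 1416 sq ft, 3 bedrooms, 2 floors, 40 years old.
    x1, x2, x3, x4 = 1416, 3, 2, 40

    f = 0.1 * x1 + 4 * x2 + 10 * x3 + (-2) * x4 + 80
    print(f)   # -> 173.6 (in whatever price units the model was trained in)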
Multiple Linear Regression

In general, with n features:

    f_w,b(x) = w_1 x_1 + w_2 x_2 + … + w_n x_n + b

    w = [w_1, w_2, w_3, …, w_n]   vector of parameters of the model
    b                             a number
    x = [x_1, x_2, x_3, …, x_n]   vector of features

Written compactly with the dot product, this is Multiple Linear Regression:

    f_w,b(x) = w · x + b


Regression
Vectorization

Multiple Linear Regression
Without Vectorization

Parameters and features (n = 3):

    w = [w_1 w_2 w_3]
    b is a number
    x = [x_1 x_2 x_3]

Writing out every term by hand:

    f = w[0] * x[0] + w[1] * x[1] + w[2] * x[2] + b
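A minimal runnable sketch of the hand-written version, with assumed values for w, b, and x; plain Python lists are enough when every term is spelled out:

    w = [1.0, 2.5, -3.3]
    b = 4.0
    x = [10.0, 20.0, 30.0]

    # Every term written out by hand (only practical when n is small and fixed).
    f = w[0] * x[0] + w[1] * x[1] + w[2] * x[2] + b
    print(f)   # 1.0*10 + 2.5*20 + (-3.3)*30 + 4 = -35.0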

Multiple Linear Regression
Without Vectorization (loop):

    f = 0
    for j in range(0, n):
        f = f + w[j] * x[j]
    f = f + b

With Vectorization:

    f = np.dot(w, x) + b
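A minimal self-contained sketch comparing the two implementations on the same assumed values; both produce the same prediction:

    import numpy as np

    w = np.array([1.0, 2.5, -3.3])
    x = np.array([10.0, 20.0, 30.0])
    b = 4.0
    n = w.shape[0]

    # Without vectorization: accumulate one term at a time in a loop.
    f_loop = 0.0
    for j in range(0, n):
        f_loop = f_loop + w[j] * x[j]
    f_loop = f_loop + b

    # With vectorization: a single NumPy dot product.
    f_vec = np.dot(w, x) + b

    print(f_loop, f_vec)   # both print -35.0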

Multiple Linear Regression
Without Vectorization vs. With Vectorization
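The practical payoff of vectorization is speed: np.dot runs in optimized native code and can use parallel hardware, while the Python loop handles one term at a time. A rough benchmark sketch (the array size is an illustrative assumption; exact timings depend on the machine):

    import time
    import numpy as np

    n = 100_000
    rng = np.random.default_rng(0)
    w = rng.random(n)
    x = rng.random(n)
    b = 4.0

    # Time the explicit Python loop.
    start = time.time()
    f_loop = 0.0
    for j in range(n):
        f_loop = f_loop + w[j] * x[j]
    f_loop = f_loop + b
    loop_seconds = time.time() - start

    # Time the vectorized dot product.
    start = time.time()
    f_vec = np.dot(w, x) + b
    vec_seconds = time.time() - start

    print(loop_seconds, vec_seconds)   # the vectorized version is typically far faster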

Normal Equation to learn parameters
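The normal equation is the closed-form alternative to gradient descent for linear regression: it solves for all parameters in one step, with no learning rate and no iterations, but it applies only to linear regression and becomes slow when the number of features is large. A minimal sketch, assuming the standard formulation theta = (X^T X)^(-1) X^T y with the bias b folded in as an extra all-ones column (the data values below are made up):

    import numpy as np

    # Made-up training data: 5 examples, 2 features, and their targets.
    X = np.array([[1.0, 2.0],
                  [2.0, 0.0],
                  [3.0, 1.0],
                  [4.0, 3.0],
                  [5.0, 2.0]])
    y = np.array([6.0, 5.0, 9.0, 14.0, 15.0])

    # Fold the bias b into the parameter vector by appending a column of ones.
    X_b = np.hstack([X, np.ones((X.shape[0], 1))])

    # Normal equation: theta = (X^T X)^(-1) X^T y.
    # (In practice np.linalg.lstsq or np.linalg.pinv is numerically safer than an explicit inverse.)
    theta = np.linalg.inv(X_b.T @ X_b) @ X_b.T @ y
    w, b = theta[:-1], theta[-1]
    print(w, b)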

THANK YOU ☺

