Developing A Machine Learning Model From Start To Finish
Dr. A. OBULESU
Assoc. Professor
Learn by Doing…
A little about me
1982 – 2020 . . .
Agenda
④ Clarifications on Models
Machine Learning is…
"Machine learning is about predicting the future based on the past." -- Hal Daume III
[Diagram: past data → train → model/predictor; future data → test → predictions]
AI Can Achieve
• Robotics
• Machine Learning
• Deep Learning (DL)
How Do They Generalize?
• Instance-based
• Model-based
Supervised Learning
• Feed the labeled dataset to the model… (e.g., images labeled "apple" or "banana")
In My Words: Supervised Learning is Learning By a Teacher…
[Diagram: labeled training data → learn → model/predictor]
Test Data
Classification
Classification: predicting a discrete value, i.e., apple or banana, 0 or 1
Applications
• Face recognition
• Character recognition
• Spam detection
• Medical diagnosis: from symptoms to illnesses
• Biometrics: recognition/authentication using physical and/or behavioral characteristics: face, iris, signature, etc.
Regression Applications
• Economics/Finance: predict the value of a stock
• Epidemiology
Unsupervised Learning
In My Words: Unsupervised Learning is Learning Without a Teacher…
• Learn clusters/groups without any labels
• Example application: image compression
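Learning groups without labels can be sketched with a minimal k-means loop. This is an illustrative sketch, not from the slides: the toy points, the choice of k = 2, and the iteration count are all assumptions.

```python
# A minimal k-means sketch (toy data; k and the points are illustrative).
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    # Start from k distinct training points as initial centers.
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center.
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(axis=2), axis=1)
        # Move each center to the mean of its assigned points.
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

X = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.2, 4.9]])
labels, centers = kmeans(X, k=2)
```

With two well-separated groups like these, the two nearby points end up sharing a cluster label, and the two far points share the other.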
Reinforcement Learning
• Given a sequence of examples/states and a reward after completing that sequence, learn to predict the action to take for an individual example/state
In My Words: a Feedback System… (sequences of moves ending in WIN! or LOSE!)
Let us understand with some use cases.
• Collect the data automatically
• Self-driving car on the road
• Accurate results in Google search
2. Collecting Data
This is the first real step towards the development of a machine learning model.
The more and better data we get, the better the model we can design.
There are many data sources, including web scraping and public repositories like:
Kaggle
UCI Machine Learning Repository: Created as an ftp archive in 1987 by
David Aha and fellow graduate students at UC Irvine
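Once a dataset is downloaded from a repository like Kaggle or UCI, it typically arrives as CSV. A minimal loading sketch with Python's standard library follows; the column names and values here are hypothetical stand-ins for a real download.

```python
# Sketch of parsing a downloaded CSV (hypothetical columns and values).
import csv
import io

raw = "sepal_length,sepal_width\n5.1,3.5\n4.9,3.0\n"  # stands in for a downloaded file
rows = list(csv.DictReader(io.StringIO(raw)))
features = [[float(r["sepal_length"]), float(r["sepal_width"])] for r in rows]
```

For a real file, `io.StringIO(raw)` would be replaced by `open("features.csv")`.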
2. Case Study:
Files taken from Kaggle:
• features.csv : feature names
• target.csv : target names
Learning Model
• Overfitting: a model that has learned patterns that are too specific to the training data will not generalize.
• Getting more data is usually the best solution; a model trained on more data generalizes better.
Regularization
• L1 regularization: the cost is proportional to the absolute value of the weights.
• L2 regularization: the cost is proportional to the square of the value of the weights.
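L2 regularization has a convenient closed form for linear regression, w = (XᵀX + λI)⁻¹Xᵀy, which makes the shrinkage effect easy to see. A sketch with illustrative data and an assumed λ:

```python
# Ridge (L2) regression via the closed-form solution; data and lambda
# are illustrative, not from the slides.
import numpy as np

X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([1.0, 2.0, 3.0])
lam = 0.5  # regularization strength (assumed)

# Penalized solve: (X^T X + lam*I) w = X^T y
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)
# Ordinary least squares for comparison
w_ols = np.linalg.lstsq(X, y, rcond=None)[0]
```

The L2 penalty pulls the weight vector toward zero, so `w_ridge` has a smaller norm than `w_ols`.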
• To develop a benchmark model that serves as a baseline
Regression
• Predict future scores on Y based on measured scores on X
Predictions are based on a correlation from a sample where
both X and Y were measured.
Equation is linear:
y = bx + a
y = predicted score on y
x = measured score on x
b = slope
a = y-intercept
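The slope b and intercept a above can be computed directly by least squares. A sketch with illustrative sample points (chosen to lie exactly on y = 2x + 1):

```python
# Least-squares fit of y = b*x + a, matching the notation above.
# The sample points are illustrative.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])   # measured scores on x
y = np.array([3.0, 5.0, 7.0, 9.0])   # measured scores on y (exactly 2x + 1)

# slope: covariance of x and y over variance of x
b = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
# intercept: the fitted line passes through the point of means
a = y.mean() - b * x.mean()
```

For this data the fit recovers b = 2 and a = 1.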
Yᵢ = β₀ + β₁Xᵢ + εᵢ
Yᵢ : dependent (response) variable (e.g., COVID-19)
Xᵢ : independent (explanatory) variable (e.g., age)
Logistic Regression
• The probability cannot be negative, so we introduce an exponential term, e^(b₀+b₁x).
• Since the probability can never be greater than 1, we divide the exponential by one plus itself: p = e^(b₀+b₁x) / (1 + e^(b₀+b₁x)).
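This construction yields the sigmoid function, p = e^z / (1 + e^z). A quick numeric sketch; the coefficients b0 and b1 are illustrative, not fitted values:

```python
# The sigmoid keeps every predicted probability strictly between 0 and 1.
# b0 and b1 are illustrative coefficients, not fitted values.
import math

def predict_proba(x, b0=-1.0, b1=2.0):
    z = b0 + b1 * x
    return math.exp(z) / (1.0 + math.exp(z))
```

However large or small x is, the output stays in (0, 1), which is exactly why the exponential-over-one-plus-itself form is used.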
kNN Classifier
• Data set:
  • Training (labeled) data: T = {(xᵢ, yᵢ)}, xᵢ ∈ ℝᵖ
  • Test (unlabeled) data: x₀ ∈ ℝᵖ
• Tasks:
  • Classification: yᵢ ∈ {1, . . . , J}
  • Regression: yᵢ ∈ ℝ
  • Given new x₀, predict y₀
• Methods:
  • Model-based
  • Memory-based
[Figure: classification vs. regression examples]
kNN Classifier
• 1-NN
  • Predict the same value/class as the nearest instance in the training set
• k-NN
  • Find the k closest training points (smallest ‖xᵢ − x₀‖ according to some metric, e.g., Euclidean, Manhattan, etc.)
  • Predicted class: majority vote
  • Predicted value: average weighted by inverse distance
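The majority-vote rule can be sketched in a few lines. The toy dataset (two clusters labeled "apple" and "banana") and the choice of k = 3 are illustrative:

```python
# Minimal k-NN classification: find the k nearest training points
# (Euclidean metric) and take a majority vote. Toy data is illustrative.
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x0, k=3):
    dists = np.linalg.norm(X_train - x0, axis=1)  # ||x_i - x0||
    nearest = np.argsort(dists)[:k]               # indices of k closest points
    return Counter(y_train[i] for i in nearest).most_common(1)[0][0]

X_train = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0],
                    [5.0, 5.0], [5.0, 6.0], [6.0, 5.0]])
y_train = ["apple", "apple", "apple", "banana", "banana", "banana"]
```

A query near the origin gets "apple"; one near (5, 5) gets "banana".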
k NN Example
[Figures: kNN classification examples]
• Choice of k
Geometric Margin
Definition: The margin of example 𝑥 w.r.t. a linear separator 𝑤 is the distance
from 𝑥 to the plane 𝑤 ⋅ 𝑥 = 0.
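That distance is |w · x| / ‖w‖, which is easy to check numerically. The vectors below are illustrative:

```python
# Distance from point x to the plane w.x = 0 is |w.x| / ||w||.
# The vectors are illustrative examples.
import numpy as np

def margin(w, x):
    return abs(w @ x) / np.linalg.norm(w)

w = np.array([0.0, 1.0])   # separator normal
x = np.array([3.0, 4.0])   # example point
```

For w = (0, 1) and x = (3, 4) the margin is simply the y-coordinate, 4; for w = (3, 4) and x = (1, 0) it is 3/5.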
Thank you !
obuleshcse@cvsr.ac.in