SVM Presentation
Support Vector Machines
Here is where the presentation begins
INDEX.HTML
“If you possess a restricted amount of information for
solving some problem, try to solve the problem directly and
never solve a more general one as an intermediate step. It is
possible that the available information is sufficient for a
direct solution but it is insufficient for solving a more
general intermediate problem.”
<WHOA!>
Members:
Mehyar MLAWEH
Sami FARHAT
Ghada SOLTANI
Fide BEN SLIMEN
Youssef MOUSSA
/Summary
/01
/INTRODUCTION
Let's Discover the SVM
/INTRODUCTION
SVM is a supervised learning algorithm developed by Vladimir Vapnik; it was first introduced in 1992.
Before we continue with SVM, let's first understand this term.
/SUPERVISED ALGORITHMS
In supervised learning, the training data provided to the machine acts as a supervisor that teaches the machine to predict the output correctly. It applies the same concept as a student learning under the supervision of a teacher.
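The supervision idea above can be sketched in Python. This is a minimal illustration (scikit-learn and the tiny data set are assumptions for the example, not part of the deck): the labeled training set plays the role of the supervisor.

```python
# A minimal supervised-learning sketch; the data and feature meanings are
# invented for illustration (hours studied, hours slept -> pass/fail).
from sklearn.svm import SVC

X_train = [[1, 4], [2, 5], [8, 7], [9, 6]]  # inputs (features)
y_train = [0, 0, 1, 1]                      # labels act as the "supervisor"

model = SVC(kernel="linear")
model.fit(X_train, y_train)   # the machine learns from supervised examples

print(model.predict([[7, 6]]))  # predict the output for an unseen input
```

The key point is that learning happens only because each training input comes paired with its correct output.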
/BASIC IDEA
The algorithm creates a line or a hyperplane which separates the data into classes.
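That separating line can be inspected directly. A sketch, assuming scikit-learn and a toy 2-D data set (both illustrative): a linear SVM learns a hyperplane w·x + b = 0, and points on either side of it get opposite signs.

```python
# Sketch: the learned hyperplane w.x + b = 0 separates the two classes.
# Data points are invented for illustration.
import numpy as np
from sklearn.svm import SVC

X = np.array([[0.0, 0.0], [1.0, 1.0], [4.0, 4.0], [5.0, 5.0]])
y = np.array([0, 0, 1, 1])

clf = SVC(kernel="linear").fit(X, y)
w, b = clf.coef_[0], clf.intercept_[0]   # parameters of the separating line

# Points of class 0 and class 1 fall on opposite sides of w.x + b = 0.
print(np.sign(X @ w + b))
```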
/02
/INPUTS
What is the proper format for input data?
/INPUTS & OUTPUTS
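As a concrete illustration of the input/output format (the numbers and scikit-learn API are assumptions for the example): the input is a 2-D numeric array of shape (n_samples, n_features), and the output is one class label per sample.

```python
# Sketch of the expected input format; values are illustrative.
import numpy as np
from sklearn.svm import SVC

X = np.array([[5.1, 3.5],   # one row = one sample
              [4.9, 3.0],   # one column = one feature
              [6.2, 3.4],
              [5.9, 3.0]])
y = np.array([0, 0, 1, 1])  # output: one class label per sample

clf = SVC().fit(X, y)       # the classifier accepts exactly this (X, y) pair
```

Categorical or text features must first be encoded as numbers before they can be fed to the algorithm.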
/03
/Principles of Operation
HOW DOES IT WORK?
/CASES?
/LINEAR PROBLEM
SVM is used for linearly separable data, meaning a dataset that can be classified into two classes using a single straight line.
/NON-LINEAR PROBLEM
Nonlinear problems are mapped into a high-dimensional feature space, in which linear classification is possible.
/How does SVM work? (LINEAR PROBLEM)
/SVM's way to find the best line
According to the SVM algorithm, we find the points closest to the line from both classes. These points are called support vectors. We then compute the distance between the line and the support vectors; this distance is called the margin. Our goal is to maximize the margin. The hyperplane for which the margin is maximum is the optimal hyperplane.
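The steps above can be sketched in code (scikit-learn and the toy points are assumptions for the illustration): fit a linear SVM, read off the support vectors, and compute the margin as 2 / ||w||.

```python
# Sketch: support vectors and margin of a (near) hard-margin linear SVM.
# Data points are invented for illustration.
import numpy as np
from sklearn.svm import SVC

X = np.array([[1.0, 1.0], [2.0, 1.0], [5.0, 4.0], [6.0, 4.0]])
y = np.array([0, 0, 1, 1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)   # very large C ~ hard margin

w = clf.coef_[0]
margin = 2.0 / np.linalg.norm(w)   # distance between the two margin lines

print("support vectors:", clf.support_vectors_)  # the closest points
print("margin:", margin)
```

Only the support vectors determine the optimal hyperplane; moving any other point (without crossing the margin) leaves the solution unchanged.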
/How does SVM work? (NON-LINEAR PROBLEM)
/KERNEL TRICK
[Diagram: kernel functions map the data to a higher-dimensional space, then back to the original space]
The idea is to transform the input data representation space into a higher-dimensional space using the kernel trick*, where a linear classifier can be used and achieve good performance.
*Kernel trick: knowing K(x, y) makes it possible to compute the scalar product ⟨Φ(x), Φ(y)⟩ without knowing the expression of Φ itself. However, K(x, y) must satisfy certain conditions (Mercer's conditions) for the corresponding Φ to exist.
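The kernel trick can be seen in action on data that no straight line can separate. A sketch (scikit-learn's `make_circles` toy data set is an assumption for the example): two concentric circles defeat a linear SVM, but an RBF kernel K(x, y) = exp(-γ‖x − y‖²) separates them.

```python
# Sketch: linear vs. RBF kernel on data that is not linearly separable
# (two concentric circles, a standard toy example).
from sklearn.datasets import make_circles
from sklearn.svm import SVC

X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

linear = SVC(kernel="linear").fit(X, y)
rbf = SVC(kernel="rbf").fit(X, y)   # kernel trick: implicit high-dim mapping

print("linear accuracy:", linear.score(X, y))
print("rbf accuracy:", rbf.score(X, y))
```

No explicit Φ is ever computed: the RBF kernel evaluates the scalar product in the (infinite-dimensional) feature space directly.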
/PROCESS
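One possible end-to-end process, sketched in code (the library, data set, and split parameters are assumptions for illustration, not the only workflow): split the data, scale the features, fit an SVM, then evaluate on held-out data.

```python
# Sketch of a typical SVM workflow: split -> scale -> fit -> evaluate.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Scaling matters for SVMs: the margin is distance-based.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X_tr, y_tr)                        # train
print("test accuracy:", model.score(X_te, y_te))  # evaluate on unseen data
```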
/04
/USE CASE
Fields where SVM is used
<SVM IS WIDELY USED>
/USES
/Face detection: SVMs classify parts of an image as face or non-face and draw a square boundary around the face.
/Text and hypertext categorization: SVMs allow text and hypertext categorization for both inductive and transductive models.
/Classification of images: SVMs provide better search accuracy for image classification.
/Handwriting recognition: SVMs are widely used to recognize handwritten characters.
/Bioinformatics: this includes protein classification and cancer classification.
/Generalized predictive control (GPC): SVM-based GPC is used to control chaotic dynamics with useful parameters.
/THANKS!
/DO YOU HAVE ANY QUESTIONS?