Lecture 3 Part 2: Classifiers (Support Vector Machines, Decision Trees, Nearest Neighbor Classification)
n-fold cross-validation
• The available data is partitioned into n equal-size disjoint subsets.
• Use each subset as the test set and combine the remaining n-1 subsets as the training set to learn a classifier.
• 10-fold and 5-fold cross-validation are commonly used.
• This method is used when the available data is not large.
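The procedure above can be sketched in plain Python. This is an illustrative implementation, not from the slides; the function names and the toy majority-vote "classifier" are assumptions made for the example.

```python
# A minimal sketch of n-fold cross-validation using only the standard library.
import random

def n_fold_cross_validation(data, n, train_and_score):
    """Split `data` into n equal-size disjoint folds; use each fold as the
    test set and the remaining n-1 folds as the training set."""
    data = list(data)
    random.shuffle(data)
    fold_size = len(data) // n
    folds = [data[i * fold_size:(i + 1) * fold_size] for i in range(n)]
    scores = []
    for i in range(n):
        test_set = folds[i]
        train_set = [x for j, f in enumerate(folds) if j != i for x in f]
        scores.append(train_and_score(train_set, test_set))
    return sum(scores) / n  # average score over the n folds

# Usage: a toy "classifier" that predicts the majority label of the training set.
labeled = [(x, int(x > 5)) for x in range(10)]

def majority_score(train, test):
    majority = round(sum(y for _, y in train) / len(train))
    return sum(1 for _, y in test if y == majority) / len(test)

avg = n_fold_cross_validation(labeled, 5, majority_score)  # average accuracy
```

Each of the n subsets serves as the test set exactly once, so every data point contributes to both training and evaluation.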
Outlier data points
Support Vector Machine (SVM)
Main Ideas
• We can do better in choosing the threshold such that the margin, i.e. the distance from the threshold to the nearest observation of each class, is as large as possible.
Main Ideas behind
Support Vector Machine
Mathematics behind Support
Vector Machine
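The slide content for this section was not recoverable from the extraction. As a sketch, the standard hard-margin linear SVM formulation usually presented under this heading is:

```latex
% Hard-margin linear SVM (primal): find the separating hyperplane
% w \cdot x + b = 0 with the largest margin.
\min_{w,\,b}\ \tfrac{1}{2}\|w\|^2
\quad\text{subject to}\quad
y_i\,(w \cdot x_i + b) \ge 1, \qquad i = 1,\dots,l.

% The distance between the two supporting hyperplanes
% w \cdot x + b = \pm 1 is 2/\|w\|, so minimizing \|w\|
% maximizes the margin.
```

The constraint forces every training point onto the correct side of the hyperplane with margin at least 1/||w||.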
Tennis example
[Figure: scatter plot over the axes Temperature and Humidity; two point markers distinguish "play tennis" from "do not play tennis".]
Linear Support Vector Machines
Data: (x_i, y_i), i = 1, ..., l, where x_i ∈ R^d and y_i ∈ {-1, +1}.
[Figure: training points in the (x1, x2) plane, labeled +1 and -1.]
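Data of this form can be fit with a linear SVM. The sketch below trains one by subgradient descent on the soft-margin hinge loss; this solver, its hyperparameters, and the toy data are illustrative assumptions, not the method the lecture presents.

```python
# Minimal linear SVM via subgradient descent on the hinge loss.
# Model: f(x) = sign(w . x + b), labels y_i in {-1, +1}.
def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    d = len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            score = sum(wj * xj for wj, xj in zip(w, xi)) + b
            if yi * score < 1:  # point violates the margin: push w toward it
                w = [wj + lr * (yi * xj - lam * wj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:               # margin satisfied: only apply regularization
                w = [wj - lr * lam * wj for wj in w]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Two linearly separable clusters in R^2 (toy data):
X = [(1.0, 1.0), (1.5, 0.5), (4.0, 4.0), (4.5, 3.5)]
y = [-1, -1, +1, +1]
w, b = train_linear_svm(X, y)
preds = [predict(w, b, xi) for xi in X]
```

The margin check `yi * score < 1` is what distinguishes this from a plain perceptron: points that are correctly classified but inside the margin still trigger an update.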
Linear SVM
[Figure: linear decision function f(x); the regions on the two sides of the separating hyperplane are labeled +1 and -1.]
Constrained Optimization Problem: the Dual Problem
Characteristics:
[Figure: input points labeled -1 and +1 mapped into feature space by Φ: x → φ(x).]
Examples of Kernel Functions
• Polynomial kernel with degree d
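The degree-d polynomial kernel is commonly written K(x, z) = (x · z + 1)^d. A sketch verifying, for d = 2 in two dimensions, that the kernel equals an ordinary inner product in an expanded feature space (a standard identity; the feature map below is shown for illustration):

```python
# Polynomial kernel of degree d: K(x, z) = (x . z + 1) ** d.
# For d = 2 in R^2 this equals <phi(x), phi(z)> with the explicit
# 6-dimensional feature map phi below.
import math

def poly_kernel(x, z, d=2):
    return (sum(a * b for a, b in zip(x, z)) + 1) ** d

def phi(x):
    """Explicit feature map matching the degree-2 kernel in R^2."""
    x1, x2 = x
    return [1.0,
            math.sqrt(2) * x1, math.sqrt(2) * x2,
            x1 * x1, x2 * x2,
            math.sqrt(2) * x1 * x2]

x, z = (1.0, 2.0), (3.0, 0.5)
k = poly_kernel(x, z)                             # (1*3 + 2*0.5 + 1)^2 = 25
dot = sum(a * b for a, b in zip(phi(x), phi(z)))  # same value as k
```

The point of the kernel trick is that `poly_kernel` computes this inner product without ever constructing `phi(x)` explicitly, which matters when the expanded space is high- or infinite-dimensional.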