
Each question is annotated as [CO, BT level, marks, unit].

1. What is machine learning? Why is it needed? [CO1, L1, 5, Unit 1]
2. Explain four examples of machine learning in detail. [CO1, L2, 5, Unit 1]
3. Consider the problem of sorting 'n' numbers. Is it wise to apply machine learning to solve this problem? Justify. [CO1, L3, 5, Unit 1]
4. What are supervised and unsupervised learning? Explain with examples. [CO1, L2, 5, Unit 1]
5. Explain the structure of machine learning. [CO1, L1, 5, Unit 1]
6. Explain the components (ingredients) of learning. [CO1, L2, 5, Unit 1]
7. Explain learning versus designing. [CO1, L2, 5, Unit 1]
8. Explain training versus testing. [CO1, L2, 5, Unit 1]
9. Explain the bias-variance trade-off. [CO1, L3, 5, Unit 1]
10. Explain and differentiate predictive and descriptive learning tasks. [CO1, L2, 5, Unit 1]
11. Explain geometric models in detail with an example. [CO1, L2, 5, Unit 1]
12. Explain logical models in detail with an example. [CO1, L2, 5, Unit 1]
13. Explain probabilistic models in detail with an example. [CO1, L2, 5, Unit 1]
14. What is meant by features? What are the different properties of features? Explain the types of features. [CO1, L2, 5, Unit 1]
15. What is meant by feature transformation and feature construction? [CO1, L2, 5, Unit 1]
16. Explain feature selection in detail. [CO1, L2, 5, Unit 1]
17. Write a short note on reinforcement learning. [CO1, L2, 5, Unit 1]
18. What is a hypothesis in machine learning? [CO1, L2, 5, Unit 1]
19. What is the F1-score and how is it used? [CO1, L2, 5, Unit 1]
20. Explain how the ROC curve works. [CO1, L2, 5, Unit 1]
21. Explain the different types of predictive machine learning tasks. [CO1, L2, 5, Unit 1]
22. State and explain the different levels of measurement with examples. [CO1, L2, 5, Unit 1]
23. Explain the different stages of data preparation. [CO1, L2, 5, Unit 1]
24. Define and explain the following terms with examples: [CO2, L2, 5, Unit 2]
    (i) Label (ii) Label space
    (iii) Output space (iv) Classification problem
    (v) Regression problem (vi) Ranking and scoring problem
    (vii) Probability estimation problem
    (viii) VC dimension
    (ix) Decision boundary (x) Decision region
25. Explain with diagrams: [CO2, L2, 5, Unit 2]
    1. Univariate binary classification
    2. Bivariate binary classification
    3. Multivariate binary classification
26. Prove that: [CO2, L2, 5, Unit 2]
    (i) FPR = 1 - TNR (ii) TNR = 1 - FPR
    (iii) FNR = 1 - TPR (iv) TPR = 1 - FNR
    (v) Accuracy = 1 - Error Rate (vi) Error Rate = 1 - Accuracy
27. Define and explain the following terms: [CO2, L2, 5, Unit 2]
    1. True positive 2. True negative
    3. False positive 4. False negative
    5. TPR 6. TNR
    7. FPR 8. FNR
    9. Sensitivity 10. Recall
    11. Specificity 12. Fallout
    13. Miss rate
28. Explain the construction of a multi-class classifier: [CO2, L2, 5, Unit 2]
    1. One-vs-all approach
    2. One-vs-one approach
    3. Error-correcting output codes approach
29. Explain the confusion matrix for a multi-class classifier. Write formulae for the following measures used for performance evaluation of multi-class classification: [CO2, L2, 5, Unit 2]
    (i) Accuracy of a multi-class classifier
    (ii) Error rate of a multi-class classifier
    (iii) Precision of a multi-class classifier
    (iv) Recall of a multi-class classifier
30. Write the output code matrix for the symmetric one-vs-one scheme, where k = 4. [CO2, L3, 5, Unit 2]
31. Write a short note on the following two examples of machine learning applications: [CO1, L2, 5, Unit 2]
    1. Learning association
    2. Reinforcement learning
32. Explain with an example the forward and backward selection methods of subset selection. [CO1, L2, 5, Unit 2]
33. Explain K-fold cross-validation with an example. [CO1, L2, 5, Unit 2]
34. Derive and explain the output code matrices for the asymmetric one-vs-one and ordered one-vs-rest schemes for construction of a multi-class classifier (for 3 classes). [CO2, L2, 5, Unit 2]
35. Write a short note on the Gram matrix and explain with an example. [CO2, L2, 5, Unit 2]
36. Consider the following decision tree (positive class: Fast Learner): [CO2, L3, 5, Unit 2]
    1. Find the contingency table.
    2. Find the recall.
    3. Find the precision.
    4. Find the negative recall.
    5. Find the false positive rate.
37. Why is SVM an example of a large-margin classifier? [CO2, L2, 5, Unit 2]
38. What is the intuition behind a large-margin classifier? [CO2, L3, 5, Unit 2]
39. What is a kernel in SVM? Why do we use kernels in SVM? [CO2, L3, 5, Unit 2]
40. Can we apply the kernel trick to logistic regression? Why is it not used in practice then? [CO2, L3, 5, Unit 2]
41. What is the difference between logistic regression and SVM? [CO2, L3, 5, Unit 2]
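As a study aid for the confusion-matrix identities asked about above (FPR = 1 - TNR, TPR = 1 - FNR, Accuracy = 1 - Error Rate), the rates can be checked numerically. The sketch below uses hypothetical counts, not values from any question in this bank:

```python
# Illustrative binary-classifier counts (hypothetical).
tp, fn, fp, tn = 40, 10, 5, 45

tpr = tp / (tp + fn)                        # sensitivity / recall
fnr = fn / (tp + fn)                        # miss rate
tnr = tn / (tn + fp)                        # specificity
fpr = fp / (tn + fp)                        # fallout
accuracy = (tp + tn) / (tp + fn + fp + tn)
error_rate = (fp + fn) / (tp + fn + fp + tn)

# The identities hold because each pair of rates shares the same denominator
# and its numerators sum to that denominator.
assert abs(tpr - (1 - fnr)) < 1e-12
assert abs(tnr - (1 - fpr)) < 1e-12
assert abs(accuracy - (1 - error_rate)) < 1e-12
```

The same denominator argument is the core of the formal proofs: TPR + FNR = (TP + FN) / (TP + FN) = 1, and similarly for the other pairs.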

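For the K-fold cross-validation question, the fold-splitting mechanics can be sketched in plain Python (no libraries; the function name and sizes are illustrative, not from the syllabus). Each sample lands in exactly one test fold, and train/test folds never overlap:

```python
def k_fold_splits(n_samples, k):
    """Yield (train_indices, test_indices) for each of the k folds."""
    indices = list(range(n_samples))
    # Distribute samples as evenly as possible: the first n_samples % k
    # folds get one extra sample.
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = indices[start:start + size]
        train = indices[:start] + indices[start + size:]
        yield train, test
        start += size

# Example: 10 samples, 5 folds -> five (train, test) pairs,
# each test fold holding 2 samples.
folds = list(k_fold_splits(10, 5))
```

In practice the data would be shuffled first and a model trained on each train split and scored on the matching test split; the k scores are then averaged.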