
NPTEL Online Certification Courses

Indian Institute of Technology Kharagpur

Deep Learning
Assignment- Week 3
TYPE OF QUESTION: MCQ/MSQ
Number of questions: 10    Total marks: 10 × 1 = 10
______________________________________________________________________________

QUESTION 1:
What is the shape of the loss landscape during optimization of an SVM?

a. Linear
b. Paraboloid
c. Ellipsoidal
d. Non-convex with multiple possible local minima
Correct Answer: b

Detailed Solution:

In an SVM, the objective is to find the maximum-margin hyperplane (W, b) such that

Wᵀx + b ≥ 1 for class +1 and Wᵀx + b ≤ −1 for class −1.

To satisfy the max-margin condition, we minimize ||W||.

The above optimization is a quadratic optimization, so the loss landscape is a paraboloid.
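As a quick numeric illustration (a sketch, not part of the original solution), a one-dimensional slice of the objective ½||W||² traces a parabola, confirming the convex, single-minimum landscape:

```python
# A 1-D slice of the SVM objective 0.5 * ||W||^2 is a parabola:
# convex, with a single global minimum (here at w = 0).
ws = [-3, -2, -1, 0, 1, 2, 3]
objective = [0.5 * w * w for w in ws]
print(objective)  # [4.5, 2.0, 0.5, 0.0, 0.5, 2.0, 4.5]
```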

QUESTION 2:
For a 2-class problem, what is the minimum possible number of support vectors? Assume there are more than 4 examples from each class.

a. 4
b. 1
c. 2
d. 8
Correct Answer: c

Detailed Solution:

To determine the separating hyperplane, we need at least one example from each of the two classes; these examples become the support vectors.

QUESTION 3:
Choose the correct option regarding classification using SVM for two classes

Statement i: While designing an SVM for two classes, the constraint yᵢ(aᵀxᵢ + b) ≥ 1 over the training vectors is used to choose the separating plane.
Statement ii: During inference, for an unknown vector xⱼ, if yⱼ(aᵀxⱼ + b) ≥ 0, then the vector can be assigned class 1.
Statement iii: During inference, for an unknown vector xⱼ, if (aᵀxⱼ + b) > 0, then the vector can be assigned class 1.

a. Only Statement i is true
b. Both Statements i and iii are true
c. Both Statements i and ii are true
d. Both Statements ii and iii are true

Correct Answer: b

Detailed Solution:

Statement ii is false because the label yⱼ is unknown at inference time; classification uses only the sign of (aᵀxⱼ + b), as in Statement iii. Statement i correctly describes the training constraint.

QUESTION 4:
Find the scalar projection of vector b = ⟨−4, 1⟩ onto vector a = ⟨1, 2⟩.

a. 0
b. −2/√5
c. 2/√7
d. −2/5

Correct Answer: b

Detailed Solution:
Scalar projection of b onto vector a is given by the scalar value (b · a)/|a| = (−4 × 1 + 1 × 2)/√(1² + 2²) = −2/√5
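The computation can be checked with a few lines of Python (a sketch; the function name is ours):

```python
import math

def scalar_projection(b, a):
    """Scalar projection of b onto a: (b . a) / |a|."""
    dot = sum(bi * ai for bi, ai in zip(b, a))
    return dot / math.sqrt(sum(ai * ai for ai in a))

print(scalar_projection([-4, 1], [1, 2]))  # -2/sqrt(5) ≈ -0.8944
```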

QUESTION 5:
The distance of the feature vector [2, 3, −2] from the separating plane x₁ + 2x₂ + 2x₃ + 5 = 0 is:

a. 5

b. 5/3

c. 3

d. 13

Correct Answer: c

Detailed Solution:
The distance of a vector [y₁, y₂, y₃] from the plane ax₁ + bx₂ + cx₃ + d = 0 is

|ay₁ + by₂ + cy₃ + d| / √(a² + b² + c²) = |1 × 2 + 2 × 3 + 2 × (−2) + 5| / √(1² + 2² + 2²) = 9/3 = 3
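The same arithmetic as a small Python check (a sketch; the helper name is ours):

```python
import math

def point_plane_distance(y, coeffs, d):
    """Distance of point y from the plane coeffs . x + d = 0."""
    num = abs(sum(ci * yi for ci, yi in zip(coeffs, y)) + d)
    return num / math.sqrt(sum(ci * ci for ci in coeffs))

print(point_plane_distance([2, 3, -2], [1, 2, 2], 5))  # 3.0
```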

QUESTION 6:
Suppose we have the below set of points with their respective classes as shown in the table.
Answer the following question based on the table.

 X    Y   Class Label
 1    0   +1
-1    0   -1
 2    1   +1
-1   -1   -1
 2    0   +1

Suppose we train an SVM based on these points. What will happen to maximum margin if we
remove the point (2,1) from the training set?

a. Maximum margin will decrease
b. Maximum margin will increase
c. Maximum margin will remain the same
d. Cannot decide

Correct Answer: c

Detailed Solution:

Here the point (2,1) is not a support vector, so removing it does not change the maximum margin.
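A small check (a sketch, taking the boundary x = 0 from Question 8 as given): the support vectors are the points at minimum distance from the boundary, and (2, 1) is not among them.

```python
# Distance of each training point to the boundary x = 0 is just |x|.
points = [(1, 0), (-1, 0), (2, 1), (-1, -1), (2, 0)]
dists = [abs(x) for x, _ in points]
on_margin = [p for p, d in zip(points, dists) if d == min(dists)]
print(on_margin)  # (2, 1) is absent: it lies at distance 2 > 1
```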

QUESTION 7:
Suppose we have the below set of points with their respective classes as shown in the table.
Answer the following question based on the table.

 X    Y   Class Label
 1    0   +1
-1    0   -1
 2    1   +1
-1   -1   -1
 2    0   +1

What will happen to maximum margin if we remove the point (-1,0) from the training set?

a. Maximum margin will decrease
b. Maximum margin will increase
c. Maximum margin will remain the same
d. Cannot decide

Correct Answer: b

Detailed Solution:

Here the point (-1,0) is a support vector, so removing it allows a wider separation: the maximum margin will increase.
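This can be verified numerically (a sketch assuming scikit-learn is available; `margin_width` is our helper, and 2/||w|| is the margin width of a linear SVM):

```python
import numpy as np
from sklearn.svm import SVC

def margin_width(X, y):
    """Margin width 2 / ||w|| of a (nearly) hard-margin linear SVM."""
    clf = SVC(kernel="linear", C=1e6).fit(X, y)
    return 2.0 / np.linalg.norm(clf.coef_)

X = np.array([[1, 0], [-1, 0], [2, 1], [-1, -1], [2, 0]])
y = np.array([1, -1, 1, -1, 1])

full = margin_width(X, y)                                    # ≈ 2.0
reduced = margin_width(np.delete(X, 1, 0), np.delete(y, 1))  # wider
print(full, reduced)
```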

QUESTION 8:
Suppose we have the below set of points with their respective classes as shown in the table.
Answer the following question based on the table.

 X    Y   Class Label
 1    0   +1
-1    0   -1
 2    1   +1
-1   -1   -1
 2    0   +1

What can be a possible decision boundary of the SVM for the given points?

a. y = 0
b. x = 0
c. x = y
d. x + y = 1

Correct Answer: b

Detailed Solution:

Plot the points: every +1 point has x ≥ 1 and every −1 point has x ≤ −1, so x = 0 is a maximum-margin decision boundary.
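A one-line check (a sketch) that x = 0 separates the two classes:

```python
points = [(1, 0, +1), (-1, 0, -1), (2, 1, +1), (-1, -1, -1), (2, 0, +1)]
# Every +1 point lies at x > 0 and every -1 point at x < 0.
print(all((x > 0) == (label > 0) for x, _, label in points))  # True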



QUESTION 9:
Suppose we have the below set of points with their respective classes as shown in the table.
Answer the following question based on the table.

 X    Y   Class Label
 1    0   +1
-1    0   -1
 2    1   +1
-1   -1   -1
 2    0   +1

Find the decision boundary of the SVM trained on these points and choose which of the
following statements are true based on the decision boundary.
i) The point (-1,-2) is classified as -1
ii) The point (1,-2) is classified as -1
iii) The point (-1,-2) is classified as +1
iv) The point (1,-2) is classified as +1

a. Only statement ii is true


b. Both statements i and iii are true
c. Both statements i and iv are true
d. Both statements ii and iii are true

Correct Answer: c

Detailed Solution:

The decision boundary is x = 0. For the point (−1, −2), x = −1 < 0, so the point is classified as −1. Similarly, for the point (1, −2), x = 1 > 0, so the point is classified as +1.
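The same rule in code (a sketch), using the boundary x = 0:

```python
def classify(p):
    """Class from the decision boundary x = 0: sign of the x-coordinate."""
    return +1 if p[0] > 0 else -1

print(classify((-1, -2)))  # -1
print(classify((1, -2)))   # +1
```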

QUESTION 10:

Which one of the following is a valid representation of hinge loss (of margin = 1) for a two-class
problem?

y = class label (+1 or −1).

p = predicted value for a class (not normalized to denote any probability).

a. L(y, p) = max(0, 1- yp)


b. L(y, p) = min(0, 1- yp)
c. L(y, p) = max(0, 1 + yp)
d. None of the above

Correct Answer: a

Detailed Solution:

Hinge loss yields a value of 0 when the predicted output p has the same sign as the class label y and satisfies the margin condition yp ≥ 1. Otherwise the loss 1 − yp grows linearly as yp decreases, penalizing both margin violations and misclassifications. Option (a) satisfies these criteria.
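Option (a) in code (a sketch):

```python
def hinge_loss(y, p):
    """Hinge loss with margin 1: zero when y*p >= 1, linear otherwise."""
    return max(0.0, 1.0 - y * p)

print(hinge_loss(+1, 2.0))   # 0.0  correct side, margin satisfied
print(hinge_loss(+1, 0.5))   # 0.5  correct side but inside the margin
print(hinge_loss(+1, -1.0))  # 2.0  wrong side, loss grows linearly
```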

______________________________________________________________________________

************END*******
