SS ZG568 EC 2R SECOND SEM 2020 2021 Solution
Mid-Semester Test
(EC-2 Regular)
SOLUTION
Q.2 Set (A) You are training a logistic regression classifier to classify the following data:
Input x  Class
-0.5     1
0.5      0
The logistic regression function is given by y = 1/(1 + exp(-w0 - w1*x)). Assume that initially, at t = 0, (w0, w1) = (0, 0). What will be the values of w0 and w1 after one iteration with learning rate = 1? [3 + 3 = 6]
y = 1/(1 + exp(-w0 - w1*x))
Initially, at t = 0, (w0, w1) = (0, 0), so the prediction is y_hat = 0.5 for both training points.
Using the mean cross-entropy loss over the two samples, the gradients are:
dE/dw0 = (1/2) * [(0.5 - 1) + (0.5 - 0)] = 0
dE/dw1 = (1/2) * [(0.5 - 1)(-0.5) + (0.5 - 0)(0.5)] = 0.25
With learning rate = 1: w0(t=1) = 0 - 1*0 = 0, w1(t=1) = 0 - 1*0.25 = -0.25
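The update can be checked numerically; a minimal sketch, assuming the mean cross-entropy loss (as implied by the stated answer of -0.25 rather than -0.5):

```python
import math

# One batch-gradient-descent step for the two-point dataset above,
# using the mean cross-entropy loss over the samples.
data = [(-0.5, 1.0), (0.5, 0.0)]   # (x, class) pairs from the question
w0, w1 = 0.0, 0.0
lr = 1.0

preds = [1.0 / (1.0 + math.exp(-(w0 + w1 * x))) for x, _ in data]        # both 0.5
grad_w0 = sum(p - y for p, (_, y) in zip(preds, data)) / len(data)       # = 0
grad_w1 = sum((p - y) * x for p, (x, y) in zip(preds, data)) / len(data) # = 0.25
w0 -= lr * grad_w0
w1 -= lr * grad_w1
print(w0, w1)   # 0.0 -0.25
```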
Q.2 Set (B) Design a logistic-regression-based classifier that gives the best accuracy on the training data described by the following dataset. Class Y = 1 if the classifier output > 0.5; Y = 0 if the classifier output < 0.5. Note: multiple solutions are possible. [4 + 2 = 6]
(a) What is the logistic regression equation (specify all parameters) for the best possible classifier
(zero classification error) with least chance of overfitting?
h(x1, x2) = 1/(1 + exp(w0 - w1*x1^2 - w2*x2^2))
w0, w1, w2 must be chosen such that all points with output class Y = 0 lie inside the ellipse given by w1*x1^2 + w2*x2^2 = w0. For example, w1 = w2 = 1 and 1.0 < w0 < 2 achieve zero classification error.
(b) What is the equation of the corresponding decision surface? Draw the decision surface. Note:
classifier output =0.5 for points on the decision surface.
w1*x1^2 + w2*x2^2 = w0 is the corresponding decision surface. It is elliptical in shape, centered at (0, 0).
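The classifier can be illustrated numerically; a minimal sketch with w1 = w2 = 1 and w0 = 1.5 (the sample points below are illustrative, since the exam's dataset figure is not reproduced here):

```python
import math

# Elliptical logistic classifier from Q.2 Set (B), with w0 = 1.5, w1 = w2 = 1.
# Output < 0.5 inside the ellipse x1^2 + x2^2 = w0 (class 0), > 0.5 outside (class 1).
def h(x1, x2, w0=1.5, w1=1.0, w2=1.0):
    return 1.0 / (1.0 + math.exp(w0 - w1 * x1**2 - w2 * x2**2))

print(h(0.0, 0.0))               # inside the ellipse  -> below 0.5 -> class 0
print(h(2.0, 0.0))               # outside the ellipse -> above 0.5 -> class 1
print(h(math.sqrt(1.5), 0.0))    # on the decision surface -> 0.5
```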
Q.2 Set (C) You are training a logistic regression classifier to classify the following data:
Input x  Class
-0.5     1
0.5      0
Q.3 Set (A) The first five documents in the following figure are used to train a Naive Bayes classifier. Calculate Prob(+ | Test) and Prob(- | Test). If needed, use Laplace smoothing. Which class does the Test document belong to? [6]
Q.3 Set (B) The first five documents in the following figure are used to train a Naive Bayes classifier. Calculate Prob(+ | Test) and Prob(- | Test). If needed, use Laplace smoothing. Which class does the Test document belong to? [6]
Q.3 Set (C) The dataset given below contains 10 training samples for a binary classification problem with attributes color, type, and origin of the car, and the class label theft assigned as yes/no. Predict the probability of theft for a Test record = <Red, Domestic, SUV>. If needed, use Laplace smoothing. [6]
P(theft = Yes) = 0.5, P(theft = No) = 0.5
P(Yes | Test) = P(Test | Yes) P(Yes) / P(Test)
= P(Red | Yes) P(Domestic | Yes) P(SUV | Yes) P(Yes) / P(Test)
= (3/5)(2/5)(1/5)(0.5) / P(Test) = 0.024 / P(Test)
P(No | Test) = P(Red | No) P(Domestic | No) P(SUV | No) P(No) / P(Test)
= (2/5)(3/5)(3/5)(0.5) / P(Test) = 0.072 / P(Test)
Since 0.072 > 0.024, the Test record is classified as theft = No. No zero counts appear, so Laplace smoothing is not needed.
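The comparison can be checked numerically; a minimal sketch using the likelihoods quoted in the solution:

```python
# Naive Bayes score comparison for the Test record <Red, Domestic, SUV>.
# The un-normalized posteriors share the factor 1/P(Test), so it can be dropped.
p_yes = (3/5) * (2/5) * (1/5) * 0.5   # P(Red|Y) P(Domestic|Y) P(SUV|Y) P(Y)
p_no  = (2/5) * (3/5) * (3/5) * 0.5   # P(Red|N) P(Domestic|N) P(SUV|N) P(N)

print(p_yes, p_no)                      # 0.024 0.072
print("Yes" if p_yes > p_no else "No")  # predicted class: No
# Normalized posterior: P(No | Test) = 0.072 / (0.024 + 0.072) = 0.75
```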
Q.4 Set (A) Consider the input-output pairs <x, y> of the training data given as <1,1>, <1,2>, <2,2>, <3,3>, <5,3>, <7,8>, <6,4>, <7,5>, <6,7> and <4,4>. Let h(x) = w*x be the hypothesis in one parameter w (lines passing through the origin) used for line fitting on the given data, where w is the slope of the line represented by the hypothesis. [2 + 4 = 6]
(a) What is the equation of the sum-of-squared-error (loss) function E(w)?
(b) What is the shape of E(w)? Calculate the optimal w = w_optimal and the minimum value of the total squared error E(w_optimal).
(a) E(w) = Σ_i (w*x_i - y_i)^2 = w^2 Σx_i^2 - 2w Σx_i*y_i + Σy_i^2
(b) E(w) is a parabola (convex) in w. At the global minimum, dE/dw = 0, so
w_optimal = Σx_i*y_i / Σx_i^2 = 204/226 ≈ 0.903
E(w_optimal) = Σy_i^2 - (Σx_i*y_i)^2 / Σx_i^2 = 197 - 204^2/226 ≈ 12.86
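The closed-form optimum w_optimal = Σx_i*y_i / Σx_i^2 for a line through the origin can be verified numerically:

```python
# Minimize E(w) = sum (w*x_i - y_i)^2 over the Q.4 Set (A) training pairs.
xs = [1, 1, 2, 3, 5, 7, 6, 7, 6, 4]
ys = [1, 2, 2, 3, 3, 8, 4, 5, 7, 4]

sxy = sum(x * y for x, y in zip(xs, ys))   # sum x_i*y_i = 204
sxx = sum(x * x for x in xs)               # sum x_i^2   = 226
w_opt = sxy / sxx                          # ≈ 0.903
e_min = sum((w_opt * x - y) ** 2 for x, y in zip(xs, ys))
print(w_opt, e_min)                        # ≈ 0.9027  ≈ 12.86
```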
(a) Using regression, determine a and b in y = a × 10^(bx) for the table given below.
Check Q4 SetB.pdf
(b) Find the optimal sum of squares error.
Input x Output y
0 1.4
2 16
4 160
6 1400
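Part (a) can be solved by linearizing: taking log10 of both sides gives log10(y) = log10(a) + b*x, an ordinary least-squares problem in x. A minimal sketch:

```python
import math

# Fit y = a * 10^(b*x) by least squares on (x, log10 y) for the table above.
xs = [0, 2, 4, 6]
ys = [1.4, 16, 160, 1400]
ls = [math.log10(y) for y in ys]

n = len(xs)
xbar = sum(xs) / n
lbar = sum(ls) / n
b = sum((x - xbar) * (l - lbar) for x, l in zip(xs, ls)) \
    / sum((x - xbar) ** 2 for x in xs)
a = 10 ** (lbar - b * xbar)
print(a, b)   # ≈ 1.50, 0.50
```

This gives b ≈ 0.5 and a ≈ 1.5, i.e. y ≈ 1.5 × 10^(0.5x), which reproduces the table values closely (1.5, 15, 150, 1500 against 1.4, 16, 160, 1400).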
Q.4 Set (C)
Input x  Output y
1        exp(2)
2        exp(4)
3        exp(6.3)
4        exp(9.2)
Fit ln(y) = α*x. The least-squares objective is
J(α) = (α - 2)^2 + (2α - 4)^2 + (3α - 6.3)^2 + (4α - 9.2)^2
Setting dJ/dα = 0 gives α = Σ x_i*ln(y_i) / Σ x_i^2 = 65.7/30 = 2.19.
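The minimizer of J(α) follows in closed form from dJ/dα = 0 and can be checked numerically:

```python
# Minimize J(alpha) = sum (alpha*x_i - ln y_i)^2 for the exp-table above.
xs = [1, 2, 3, 4]
lny = [2.0, 4.0, 6.3, 9.2]   # ln(y_i), read directly off the table

# Setting dJ/dalpha = 0 gives alpha = sum x_i*ln(y_i) / sum x_i^2.
alpha = sum(x * l for x, l in zip(xs, lny)) / sum(x * x for x in xs)
print(alpha)   # 65.7 / 30 = 2.19
```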
Q.5 Set (A) [3 + 1 + 1 + 1 = 6]
(a) Find the equation of the maximum-margin SVM classifier for the OR logic gate with the following truth table:
x1  x2  y
0   0   0
0   1   1
1   0   1
1   1   1
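No worked solution appears for this set in the copy. As a hedged sketch: by symmetry, the maximum-margin separator for OR is x1 + x2 = 0.5, and scaling so the support vectors have functional margin 1 gives w = (2, 2), b = -1 (outputs 0/1 mapped to labels ±1). The check below verifies the margin constraints:

```python
import math

# Candidate maximum-margin separator for OR: f(x) = 2*x1 + 2*x2 - 1.
points = [((0, 0), -1), ((0, 1), +1), ((1, 0), +1), ((1, 1), +1)]
w, b = (2.0, 2.0), -1.0

for (x1, x2), y in points:
    margin = y * (w[0] * x1 + w[1] * x2 + b)
    assert margin >= 1.0          # every point lies on or outside its margin line
    print((x1, x2), y, margin)    # margin == 1 marks a support vector

print(1.0 / math.hypot(*w))       # geometric margin = 1/||w|| = 1/(2*sqrt(2)) ≈ 0.354
```

Support vectors here are (0,0), (0,1) and (1,0); the point (1,1) sits strictly outside the margin.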
Q.5 Set (B) [3 + 1 + 1 + 1 = 6]
(a) Find the equation of the maximum-margin SVM classifier for the AND logic gate with the following truth table:
x1  x2  y
0   0   0
0   1   0
1   0   0
1   1   1
Q.5 Set (C) [3 + 1 + 1 + 1 = 6]
(a) Find the equation of the hyperplane using the linear Support Vector Machine method. Positive class data points: (x1, x2) = {(3, 2), (4, 3), (2, 3), (3, -1)}. Negative class data points: {(1, 0), (1, -1), (0, 2), (-1, 2)}.
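No worked solution appears for this set in the copy. As a hedged sketch: the closest points between the two class hulls are the negative point (1, 0) and the positive-hull edge through (2, 3) and (3, -1) (the line 4*x1 + x2 = 11), which suggests the candidate hyperplane 4*x1 + x2 = 7.5. Scaled so the support vectors have functional margin 1, this is w = (8/7, 2/7), b = -15/7. The check below verifies that the candidate separates all eight points with margin:

```python
import math

# Verify the candidate hyperplane f(x) = (8*x1 + 2*x2 - 15) / 7 for Q.5 Set (C).
pos = [(3, 2), (4, 3), (2, 3), (3, -1)]
neg = [(1, 0), (1, -1), (0, 2), (-1, 2)]
w, b = (8 / 7, 2 / 7), -15 / 7

def f(p):
    return w[0] * p[0] + w[1] * p[1] + b

for p in pos:
    assert f(p) >= 1 - 1e-9   # (2,3) and (3,-1) hit exactly +1: support vectors
for p in neg:
    assert f(p) <= -1 + 1e-9  # (1,0) hits exactly -1: support vector

print(1.0 / math.hypot(*w))   # geometric margin = 7/(2*sqrt(17)) ≈ 0.849
```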
*******