
ML: Question Bank for Unit 3

1. Write a short note on Linear Regression. 6M
2. Explain Regression. Write a short note on Non-Linear Regression. 6M
3. The weights of a calf taken at weekly intervals are given below. Fit a straight
line using the method of least squares and calculate the average rate of
growth per week. 8M

Age (x)     1     2     3     4     5     6     7     8     9     10
Weight (y)  52.5  58.7  65.0  70.2  75.4  81.1  87.2  95.5  102.2 106.4
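A quick way to check the answer to Q.3: fit y = a + b·x with NumPy's least-squares polynomial fitter. This is a sketch for self-checking only, not the required hand calculation.

```python
import numpy as np

# Weekly calf weights from Q.3
x = np.arange(1, 11)                      # age in weeks
y = np.array([52.5, 58.7, 65.0, 70.2, 75.4,
              81.1, 87.2, 95.5, 102.2, 106.4])

# polyfit(..., 1) solves the least-squares problem for a degree-1 polynomial
b, a = np.polyfit(x, y, 1)                # y ~ a + b*x

print(f"fitted line: y = {a:.3f} + {b:.3f} x")
print(f"average growth rate: {b:.2f} weight units per week")
```

The slope b is the average rate of growth per week (about 6.05 for this data).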

4. The table below gives the temperature T (in °C) and length l (in mm) of a
heated rod. If l = a0 + a1T, find the values of a0 and a1 using linear least
squares. 4M

T 40 50 60 70 80
l 600.5 600.6 600.8 600.9 601.0
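The 2x2 normal equations of Q.4 can be solved directly with a1 = S_Tl / S_TT and a0 = mean(l) - a1*mean(T); a plain-Python sketch to verify a hand computation:

```python
# Data from Q.4: temperature T (degrees C) and rod length l (mm)
T = [40, 50, 60, 70, 80]
l = [600.5, 600.6, 600.8, 600.9, 601.0]
n = len(T)

Tbar = sum(T) / n
lbar = sum(l) / n

# Least-squares slope and intercept from the normal equations
S_Tl = sum((t - Tbar) * (y - lbar) for t, y in zip(T, l))
S_TT = sum((t - Tbar) ** 2 for t in T)

a1 = S_Tl / S_TT
a0 = lbar - a1 * Tbar
print(f"l = {a0:.2f} + {a1:.4f} T")
```

For this data the fit works out to a1 = 0.013 mm/°C and a0 = 599.98 mm.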

5. How do we reduce the problem of fitting the curve y = ae^(bx) to that of
finding a least-squares straight line through the given data?
6. How do we reduce the problem of fitting a power function y = ax^c to that
of finding a least-squares straight line through the given data?
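Both reductions in Q.5 and Q.6 work by taking logarithms: y = ae^(bx) becomes ln y = ln a + b·x (linear in x), and y = ax^c becomes ln y = ln a + c·ln x (linear in ln x). A sketch with synthetic data (the values of a, b, c below are made up for illustration):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])

# --- Q.5: y = a * e^(b x): fit ln y against x ---
y_exp = 2.0 * np.exp(0.5 * x)               # synthetic data with a=2, b=0.5
b, ln_a = np.polyfit(x, np.log(y_exp), 1)   # straight-line fit in (x, ln y)
a = np.exp(ln_a)
print(f"recovered a={a:.3f}, b={b:.3f}")

# --- Q.6: y = a * x^c: fit ln y against ln x ---
y_pow = 3.0 * x ** 1.5                      # synthetic data with a=3, c=1.5
c, ln_a2 = np.polyfit(np.log(x), np.log(y_pow), 1)
a2 = np.exp(ln_a2)
print(f"recovered a={a2:.3f}, c={c:.3f}")
```

Because the synthetic data is noise-free, the fits recover the generating parameters exactly.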
7. What is the use of the method of least squares?
8. Use the method of least squares to fit the straight line y = a + bx to the data
given below.
X 0 1 2 3 4
Y 1 2.9 4.8 6.7 8.6

9. Explain how the least square method is used to estimate the parameters of a
linear model.
10. Write down the statistical assumptions of the linear regression model.
11. In a linear regression model, obtain the least square estimates of the
parameters.
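For Q.9 and Q.11, the least-squares estimates have the closed form B1 = Sxy/Sxx and B0 = ȳ − B1·x̄. A minimal plain-Python sketch, checked against the data of Q.8 (which happens to lie exactly on y = 1 + 1.9x):

```python
def least_squares_line(xs, ys):
    """Return (B0, B1) minimizing the sum of squared errors for y = B0 + B1*x."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    s_xy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    s_xx = sum((x - xbar) ** 2 for x in xs)
    b1 = s_xy / s_xx           # slope
    b0 = ybar - b1 * xbar      # intercept
    return b0, b1

# Data from Q.8
b0, b1 = least_squares_line([0, 1, 2, 3, 4], [1, 2.9, 4.8, 6.7, 8.6])
print(f"y = {b0:.2f} + {b1:.2f} x")
```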
12. Explain probability distribution with a real-life example.
13. Show the design of a two-layer perceptron to solve the XOR problem in a
2-D input feature space. 8M
14. Explain how a perceptron with J hidden units maps an I-dimensional input
space onto the vertices of a hypercube formed by J hyperplanes. 8M
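For Q.13, one standard construction uses two hidden threshold units (computing OR and NAND of the inputs) and an output unit computing their AND; the hidden layer maps the four input points onto vertices of a square where the classes become linearly separable. A sketch with one possible choice of weights (these particular thresholds are an assumption; many choices work):

```python
def step(z):
    # Heaviside threshold activation
    return 1 if z >= 0 else 0

def xor_net(x1, x2):
    h1 = step(x1 + x2 - 0.5)        # hidden unit 1: OR(x1, x2)
    h2 = step(-x1 - x2 + 1.5)       # hidden unit 2: NAND(x1, x2)
    return step(h1 + h2 - 1.5)      # output unit: AND(h1, h2) = XOR(x1, x2)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", xor_net(x1, x2))
```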
15. With multilayer networks, what is the limitation of the least-squares cost
function? Suggest an alternative cost function, with appropriate equations,
that is better suited for pattern-recognition tasks, and indicate its advantages.
16. For a support vector machine, how is the dependency on the weight vector
in the primal space eliminated by recasting the optimization problem in the
dual space? Explain the method of finding the optimal hyperplane
corresponding to the optimal weight vector.

Q. 1. Short note on linear models.


Q. 2. What do you mean by SSE in regression ? Explain its
significance in the least square method.
Q. 3. What do you mean by the least square method ? Explain the least
square method in the context of linear regression.
Q. 4. Prove that in the least square method for linear regression,
B1 = Sxy / Sxx
Where,
Sxy = co-variance of x and y
Sxx = co-variance of x and x (i.e. the variance of x)
Q. 5. Explain how to interpret the regression equation.
Q. 6. Explain the characteristics of a regression line.
Q. 7. Prove that the estimates of B0 and B1 obtained by the least square
method are unbiased. OR
Prove that the B0 and B1 produced by the least square method are unbiased.
Q. 8. Prove that the B1 calculated by the least square method is an unbiased
estimator of B1.
Q. 9. Prove that the B0 calculated by the least square method is an unbiased
estimator of B0.
Q. 10. Consider the following data, where
Xi = rating for the movie "Bahubali - part 1" by the ith person,
Yi = rating for the movie "Bahubali - part 2" by the ith person,
and ratings are given on a scale of 1 to 5, with 1 the lowest rating and 5
the highest rating.
(a) Find the values of B0 and B1 for the linear regression model which best fits
the given data.
(b) Interpret and explain the equation of the regression line.
(c) If a new person rates "Bahubali part - 1" as 3, predict the rating of the
same person for "Bahubali part - 2".
(d) Can this problem be treated as classification ? Justify. If it is a regression
problem, justify that as well.

Person   Xi = Rating for movie        Yi = Rating for movie
         "Bahubali - part 1"          "Bahubali - part 2"
         by the ith person            by the ith person
1st      4                            3
2nd      2                            4
3rd      3                            2
4th      5                            5
5th      1                            3
6th      3                            1
Training Data for Question 10.
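A sketch for checking parts (a)-(c) of Q.10; for this data the least-squares line works out to yhat = 2.1 + 0.3x:

```python
# Ratings data from Q.10
X = [4, 2, 3, 5, 1, 3]    # "Bahubali - part 1" ratings
Y = [3, 4, 2, 5, 3, 1]    # "Bahubali - part 2" ratings
n = len(X)

xbar = sum(X) / n
ybar = sum(Y) / n
s_xy = sum((x - xbar) * (y - ybar) for x, y in zip(X, Y))
s_xx = sum((x - xbar) ** 2 for x in X)

B1 = s_xy / s_xx           # slope
B0 = ybar - B1 * xbar      # intercept
print(f"regression line: yhat = {B0:.1f} + {B1:.1f} x")

# (c) predicted part-2 rating for a person who rated part 1 as 3
print("prediction for x = 3:", B0 + B1 * 3)
```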
Q. 11. In the context of regression, define and explain the following terms:
(1) SST
(2) SSE
(3) MSE
(4) SSR : Sum of squares of errors due to regression
(5) SST = SSE + SSR
Q. 12. Find SST, SSE, SSR for regression problem in Q. 10.
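A sketch for Q.12, reusing the fitted line yhat = 2.1 + 0.3x from Q.10; for this data SST = 10, SSE = 9.1 and SSR = 0.9, confirming the identity SST = SSE + SSR of Q.11:

```python
# Data from Q.10 and its fitted line yhat = 2.1 + 0.3*x
X = [4, 2, 3, 5, 1, 3]
Y = [3, 4, 2, 5, 3, 1]
B0, B1 = 2.1, 0.3

ybar = sum(Y) / len(Y)
yhat = [B0 + B1 * x for x in X]

SST = sum((y - ybar) ** 2 for y in Y)               # total variation
SSE = sum((y - yh) ** 2 for y, yh in zip(Y, yhat))  # residual (unexplained)
SSR = sum((yh - ybar) ** 2 for yh in yhat)          # explained by regression

print(f"SST = {SST:.2f}, SSE = {SSE:.2f}, SSR = {SSR:.2f}")
```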
Q. 13. What do you mean by the coefficient of regression ? Explain its
significance. Calculate the coefficient of regression for the data in Question
10.
Q. 14. What are the properties of a regression line ?
Q. 15. Short note on multivariate regression.
Q. 16. Explain the difference between multivariate and univariate
regression.
Q. 17. What is multivariate regression ?
Q. 18. With respect to multivariate regression, explain the following terms
with examples
(i) Response vector
(ii) Prediction vector
(iii) Design matrix
(iv) Slope vector
(v) Error vector
Q. 19. In multivariate regression, prove that
SSE = y^T y - 2 B^T X^T y + (XB)^T (XB)
Q. 20. Applying the least square method in multivariate regression, prove
that
(i) (X^T X) B = X^T y (the normal equations)
(ii) B = (X^T X)^(-1) X^T y
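For Q.20, the least-squares solution is B = (X^T X)^(-1) X^T y. A NumPy sketch with a small synthetic design matrix (the data below is made up, with a column of ones for the intercept), also checking the SSE expression of Q.19:

```python
import numpy as np

# Synthetic multivariate data: y = 1 + 2*x1 + 3*x2 (exact, no noise)
x1 = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
x2 = np.array([1.0, 0.0, 2.0, 1.0, 3.0])
y = 1 + 2 * x1 + 3 * x2

# Design matrix with an intercept column
X = np.column_stack([np.ones_like(x1), x1, x2])

# Solve the normal equations (X^T X) B = X^T y
B = np.linalg.solve(X.T @ X, X.T @ y)
print("B =", B)                        # recovers [1, 2, 3]

# SSE via the expansion of Q.19: y^T y - 2 B^T X^T y + (XB)^T (XB)
SSE = y @ y - 2 * B @ (X.T @ y) + (X @ B) @ (X @ B)
print("SSE =", SSE)                    # ~0 for exact data
```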
Q. 21. What do you mean by collinearity and multicollinearity ?
Explain the effect of multicollinearity on multivariate regression.
Q. 22. In the context of multivariate regression, define and explain the
following terms
(1) SST : sum of squared total errors
(2) SSE : sum of squares of errors
(3) SSR : sum of squares of errors due to regression
(4) MSST : mean of sum of squares of total errors
(5) MSE : mean of sum of squares of errors
(6) MSR : mean of sum of squares of errors due to
regression
Q. 23. Define the F-statistic and PEV. Explain their role in deciding the
quality of a multivariate regression. (This question is for understanding and
practically comparing the results of multivariate regressions in R.)
Q. 24. What do you mean by Regularized Regression ?
Q. 25. What is the need for regularized regression ?
Q. 26. Explain ridge regression.
Q. 27. Explain lasso regression.
Q. 28. Compare ridge and lasso regression.
Q. 29. Lasso regression produces sparse solutions. Justify.
Q. 30. Ridge regression models are difficult to interpret if the number of
features in the feature vector is large. Justify.
Q. 31. Lasso regression models are easier to understand and interpret.
Justify.
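For Q.26-Q.31: ridge regression replaces the normal equations with B = (X^T X + lam*I)^(-1) X^T y, and increasing the penalty lam shrinks the coefficients toward zero; lasso instead uses an L1 penalty, which can drive some coefficients exactly to zero and hence yields sparse, easier-to-interpret models. A NumPy sketch of the closed-form ridge estimator (the data here is randomly generated for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.normal(size=20)

def ridge(X, y, lam):
    """Closed-form ridge estimate: B = (X^T X + lam*I)^(-1) X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

for lam in (0.0, 1.0, 100.0):
    B = ridge(X, y, lam)
    print(f"lambda = {lam:6.1f}   ||B|| = {np.linalg.norm(B):.3f}")
# The coefficient norm shrinks as lambda grows.
```

Note that lam = 0 recovers ordinary least squares; lasso has no such closed form and needs an iterative solver.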
Q. 32. What do you mean by a binary classification problem ? Illustrate
with an example of binary classification where
the number of classes is 2 and
the number of features is 5.
Q. 33. Can we achieve classification by using the least square method ?
Justify.
Q. 34. What is the difference between a classification and a regression
problem ?
Q. 35. Explain the least square method for binary classification.
Q. 36. Explain the drawbacks of the least square method for binary
classification.
Q. 37. With respect to binary classification, comment on
(i) s = w · x
(ii) s = w · x - t
(iii) y = sign(w · x - t)
Q. 38. What do you mean by linearly separable data and non-linearly
separable data ?
Q. 39. What is a perceptron ? Explain how linearly separable data is
classified by a perceptron in binary classification.
Q. 40. What is the condition for error detection in perceptron learning ?
Q. 41. What is the perceptron learning algorithm ?
Q. 42. Explain the calculation of the change in weights (the model parameters)
in perceptron learning.
Q. 43. Write and explain the perceptron learning algorithm.
Q. 44. What is the difference between perceptron learning and dual
perceptron learning ?
Q. 45. Write and explain the perceptron learning algorithm and the dual
perceptron learning algorithm.
Q. 46. Explain how the perceptron learning algorithm for classification
can be modified into a perceptron learning algorithm for regression.
Q. 47. Can we use the perceptron learning algorithm for classification ? If
yes, write and explain the perceptron learning algorithm for classification. If
not, justify.
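For Q.41-Q.43, the perceptron algorithm cycles through the training set and, whenever an example is misclassified (y_i * (w · x_i) <= 0 for labels y_i in {-1, +1}), updates w <- w + eta * y_i * x_i. A minimal sketch for linearly separable data, with the bias folded in as an extra always-1 feature (the toy data is logical AND, chosen for illustration):

```python
def perceptron(data, labels, eta=1.0, max_epochs=100):
    """Train a perceptron; each data row ends with a 1 acting as the bias input."""
    w = [0.0] * len(data[0])
    for _ in range(max_epochs):
        errors = 0
        for x, y in zip(data, labels):
            activation = sum(wi * xi for wi, xi in zip(w, x))
            if y * activation <= 0:          # misclassified (or on the boundary)
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
                errors += 1
        if errors == 0:                      # converged: a full pass with no errors
            break
    return w

# Linearly separable toy data (logical AND); last feature is the bias input
data = [[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]]
labels = [-1, -1, -1, 1]
w = perceptron(data, labels)
print("learned weights:", w)
```

By the perceptron convergence theorem, the loop terminates on any linearly separable training set.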
Q. 48. What do you mean by a margin classifier ? What is the need of a
margin classifier ?
Q. 49. What do you mean by support vectors and support vector
machines ?
Q. 50. "A support vector machine is a maximum margin classifier." Comment
on and justify the correctness or incorrectness of this statement.
Q. 51. Short note on : Support vector machine.
Q. 52. Explain Geometry of support vector machine.
Q. 53. Explain the mathematical formulation of the SVM objective function
and the constraints for it.
Q. 54. Derive the criterion to detect misclassification of any instance X
by an SVM.
Q. 55. w.r.t. support vector machine, define
- positive margin (m+)
- negative margin (m-)
- margin
- maximum margin
- m+ = m- = m
- Lagrange coefficients
Q. 56. It is expected that an SVM should not produce any classification
error. Comment on how the SVM ensures this.
Q. 57. Write and explain the objective function and constraints of the
dual optimization problem of the SVM.
Q. 58. The optimization problem of the SVM is a quadratic optimization
problem. Justify.
Q. 59. What do you mean by hard margin and soft margin ? Illustrate the
difference between them.
Q. 60. Compare SVM and soft margin SVM.
Q. 61. What are advantages of dual optimization problem in SVM ?
Q. 62. What do you mean by slack variables in soft margin SVM ?
Explain their role in soft margin SVM.
Q. 63. Write and explain optimization problem for
(i) SVM
(ii) Soft margin SVM.
Q. 64. Show that
L(w, t, ξi, αi, βi) = L(w, t, αi) + Σi (C - αi - βi) ξi
(Hint : Soft margin SVM)
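A derivation sketch for Q.64, assuming the standard soft-margin formulation (minimize ½‖w‖² + C·Σξᵢ subject to yᵢ(w·xᵢ − t) ≥ 1 − ξᵢ and ξᵢ ≥ 0, with multipliers αᵢ for the margin constraints and βᵢ for ξᵢ ≥ 0):

```latex
\begin{aligned}
L(\mathbf{w}, t, \xi_i, \alpha_i, \beta_i)
  &= \tfrac{1}{2}\lVert\mathbf{w}\rVert^2 + C\sum_i \xi_i
     - \sum_i \alpha_i\bigl(y_i(\mathbf{w}\cdot\mathbf{x}_i - t) - 1 + \xi_i\bigr)
     - \sum_i \beta_i \xi_i \\
  &= \underbrace{\tfrac{1}{2}\lVert\mathbf{w}\rVert^2
     - \sum_i \alpha_i\bigl(y_i(\mathbf{w}\cdot\mathbf{x}_i - t) - 1\bigr)}_{L(\mathbf{w},\,t,\,\alpha_i)}
     \;+\; \sum_i (C - \alpha_i - \beta_i)\,\xi_i
\end{aligned}
```

Grouping the ξᵢ terms exposes the hard-margin Lagrangian plus the extra Σᵢ(C − αᵢ − βᵢ)ξᵢ term, which vanishes at the stationary point where C − αᵢ − βᵢ = 0.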
Q. 65. An SVM may suffer from overfitting; a soft margin SVM avoids
overfitting. Comment.
(Hint : justify the use of the parameter C in soft margin SVM.)
Q. 66. Write and explain the conditions obtained from the dual optimization
problem in soft margin SVM to conclude that an instance
(i) Xi is on the margin,
(ii) Xi is either inside or on the margin,
(iii) Xi is either outside or on the margin.
Q. 67. Explain properties and applications of SVM.
Q. 68. Explain the role of kernel methods in handling linearly non-separable
data.
Q. 69. What do you mean by a kernel method ? What is the need for kernel
methods ?
Q. 70. Explain following types of kernel methods
(i) Polynomial kernels
(ii) Gaussian kernels
Q. 71. Explain with an example the relationship between the number of
dimensions of the feature space and of the input space.
Q. 72. Write perceptron learning algorithm with polynomial kernels.
Q. 73. With example illustrate, polynomial kernels.
Q. 74. What are characteristics of kernel methods ?
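For Q.70 and Q.73: a degree-2 polynomial kernel κ(x, z) = (x · z + 1)² computes the same value as an explicit dot product in a higher-dimensional feature space, without ever constructing that space, while the Gaussian kernel κ(x, z) = exp(−‖x − z‖²/(2σ²)) corresponds to an infinite-dimensional feature space. A sketch (the explicit degree-2 feature map below is one standard choice):

```python
import math

def poly_kernel(x, z, degree=2):
    """Polynomial kernel: (x . z + 1)^degree."""
    return (sum(a * b for a, b in zip(x, z)) + 1) ** degree

def gaussian_kernel(x, z, sigma=1.0):
    """Gaussian (RBF) kernel: exp(-||x - z||^2 / (2 sigma^2))."""
    sq = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-sq / (2 * sigma ** 2))

def phi2(x):
    """Explicit feature map for the degree-2 polynomial kernel in 2-D."""
    x1, x2 = x
    r2 = math.sqrt(2)
    return [x1 * x1, x2 * x2, r2 * x1 * x2, r2 * x1, r2 * x2, 1.0]

x, z = (1.0, 2.0), (3.0, -1.0)
lhs = poly_kernel(x, z)                              # kernel trick: stays in 2-D
rhs = sum(a * b for a, b in zip(phi2(x), phi2(z)))   # explicit 6-D dot product
print(lhs, rhs)                                      # the two values agree
```

This agreement is exactly what lets the kernel perceptron (Q.72) and kernel SVM work in the input space while implicitly separating data in the feature space.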
Q. 75. Explain the procedure for obtaining class probabilities from linear
classification.
Q. 76. Explain the process of logistic calibration.
Q. 77. What is the isotonic calibration process ? OR The isotonic calibration
process constructs a piecewise-linear calibration function; illustrate.
Q. 78. The logistic calibration process constructs a sigmoid function.
Illustrate.
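For Q.76 and Q.78: logistic calibration maps a classifier's raw score s to a probability through a sigmoid p = 1/(1 + e^(−(a·s + b))); fitting a and b yields a smooth S-shaped calibration curve, whereas isotonic calibration fits a non-decreasing piecewise function. A minimal sketch of the mapping itself (the a and b here are illustrative defaults, not fitted values):

```python
import math

def logistic_calibrate(score, a=1.0, b=0.0):
    """Map a raw classifier score to a probability via a sigmoid."""
    return 1.0 / (1.0 + math.exp(-(a * score + b)))

for s in (-2.0, 0.0, 2.0):
    print(f"score {s:+.1f} -> P(positive) = {logistic_calibrate(s):.3f}")
# A score of 0 maps to 0.5; larger scores approach 1, smaller approach 0.
```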
Q. 79. Explain kernel methods which are suitable for perceptrons.
Q. 80. Explain kernel methods which are suitable for SVM ?
