AIML Question Bank-ML-17Jan2022


Sri Taralabalu Jagadguru Education Society®, Sirigere

S T J INSTITUTE OF TECHNOLOGY, RANEBENNUR – 581 115


DEPARTMENT OF COMPUTER SCIENCE & ENGINEERING

Question Bank-ML

1) Define concept learning, and discuss the terminology used in concept learning
problems.
2) For the learning example given by the following table, illustrate the general-to-specific
ordering of hypotheses in concept learning.
Example  SKY    Air Temp.  Humidity  Wind    Water  Forecast  EnjoySport
1        Sunny  Warm       Normal    Strong  Warm   Same      Yes
2        Sunny  Warm       High      Strong  Warm   Same      Yes
3        Rain   Cold       High      Strong  Warm   Change    No
4        Sunny  Warm       High      Strong  Cool   Change    Yes

3) Write the FIND-S algorithm, and hence illustrate the algorithm for the learning example given
by the above table.
4) Discuss the limitations of the FIND-S algorithm.
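For reference (Questions 3-5), a minimal FIND-S sketch in Python; the tuple encoding of hypotheses, the '0'/'?' conventions, and the function name are illustrative choices, and the data is the EnjoySport table from Question 2.

```python
# Minimal FIND-S sketch (illustrative): start with the most specific hypothesis
# and minimally generalize it on every positive example. '0' denotes the empty
# (most specific) value and '?' denotes "any value accepted".

def find_s(examples):
    n = len(examples[0][0])
    h = ['0'] * n                      # most specific hypothesis
    for attrs, label in examples:
        if label != 'Yes':             # FIND-S ignores negative examples
            continue
        for i, value in enumerate(attrs):
            if h[i] == '0':
                h[i] = value           # first positive example: copy the value
            elif h[i] != value:
                h[i] = '?'             # conflicting values: generalize to '?'
    return h

# EnjoySport training data from the table in Question 2.
data = [
    (('Sunny', 'Warm', 'Normal', 'Strong', 'Warm', 'Same'),   'Yes'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Warm', 'Same'),   'Yes'),
    (('Rain',  'Cold', 'High',   'Strong', 'Warm', 'Change'), 'No'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Cool', 'Change'), 'Yes'),
]

print(find_s(data))   # ['Sunny', 'Warm', '?', 'Strong', '?', '?']
```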
5) Trace the running of the FIND-S algorithm on the following training data. The attributes have the
following values: Study: Intense, Moderate, None; Difficulty: Easy, Hard; Sleepy: Very, Somewhat;
Attendance: Frequent, Rare; Hungry: Yes, No; Thirsty: Yes, No. The hypothesis space contains
6-tuples in which each attribute is either ?, ∅, or a single required value (e.g., Normal).

Example  Study    Difficulty  Sleepy     Attendance  Hungry  Thirsty  PassTest
1        Intense  Normal      Extremely  Frequent    No      No       Yes
2        Intense  Normal      Slightly   Frequent    No      No       Yes
3        None     High        Slightly   Frequent    No      Yes      No
4        Intense  Normal      Slightly   Frequent    Yes     Yes      Yes

6) Write the Candidate-Elimination algorithm, and hence illustrate the algorithm for the learning
example given by the following table.

Example  SKY    Air Temp.  Humidity  Wind    Water  Forecast  EnjoySport
1        Sunny  Warm       Normal    Strong  Warm   Same      Yes
2        Sunny  Warm       High      Strong  Warm   Same      Yes
3        Rain   Cold       High      Strong  Warm   Change    No
4        Sunny  Warm       High      Strong  Cool   Change    Yes
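A compact Candidate-Elimination sketch for Question 6, again on the EnjoySport data; it assumes conjunctive hypotheses over the six attributes, keeps the specific boundary S as a single hypothesis, and takes each attribute's domain from the values appearing in the table. Helper names and the tuple encoding are illustrative, not part of the question.

```python
# Candidate-Elimination sketch (illustrative): maintain a specific boundary S
# (a single maximally specific hypothesis here) and a general boundary G.

def covers(h, x):
    """True if hypothesis h classifies instance x as positive."""
    return all(hv in ('?', xv) for hv, xv in zip(h, x))

def more_general_or_equal(h1, h2):
    """True if h1 is at least as general as h2."""
    return all(a == '?' or a == b for a, b in zip(h1, h2))

def candidate_elimination(examples, domains):
    n = len(domains)
    S = None                                   # specific boundary (not initialised yet)
    G = [tuple('?' for _ in range(n))]         # general boundary
    for x, label in examples:
        if label == 'Yes':
            G = [g for g in G if covers(g, x)]                 # drop inconsistent g
            if S is None:
                S = x                                          # first positive example
            else:
                S = tuple(sv if sv == xv else '?' for sv, xv in zip(S, x))
        else:
            new_G = []
            for g in G:
                if not covers(g, x):
                    new_G.append(g)
                    continue
                # Minimally specialise g so that it excludes the negative example.
                for i in range(n):
                    if g[i] != '?':
                        continue
                    for v in domains[i]:
                        if v != x[i]:
                            h = g[:i] + (v,) + g[i + 1:]
                            if S is None or more_general_or_equal(h, S):
                                new_G.append(h)
            # Keep only the maximally general members of G.
            G = [g for g in new_G
                 if not any(h != g and more_general_or_equal(h, g) for h in new_G)]
    return S, G

data = [
    (('Sunny', 'Warm', 'Normal', 'Strong', 'Warm', 'Same'),   'Yes'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Warm', 'Same'),   'Yes'),
    (('Rain',  'Cold', 'High',   'Strong', 'Warm', 'Change'), 'No'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Cool', 'Change'), 'Yes'),
]
domains = [sorted({x[i] for x, _ in data}) for i in range(6)]
S, G = candidate_elimination(data, domains)
print('S:', S)   # ('Sunny', 'Warm', '?', 'Strong', '?', '?')
print('G:', G)   # [('Sunny', '?', '?', '?', '?', '?'), ('?', 'Warm', '?', '?', '?', '?')]
```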

7) Write a note on the inductive learning hypothesis and the unbiased learner with respect to concept
learning algorithms.

8) Consider the following set of training examples:


Instance Classification a1 a2
1 + T T
2 + T T
3 - T F
4 + F F
5 - F T
6 - F T

(a) What is the entropy of this collection of training examples with respect to the target function
classification?
(b) What is the information gain of a2 relative to these training examples?
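A short Python check for Question 8, computing the collection entropy and the information gain of a2 directly from the six examples above, using Entropy(S) = -Σ p log2 p and Gain(S, A) = Entropy(S) - Σ_v (|S_v|/|S|) Entropy(S_v). The data encoding is an illustrative choice.

```python
from math import log2

def entropy(labels):
    """Entropy of a collection of +/- labels, using log base 2."""
    total = len(labels)
    result = 0.0
    for c in set(labels):
        p = labels.count(c) / total
        result -= p * log2(p)
    return result

def information_gain(examples, attr):
    """Gain(S, A) = Entropy(S) - sum_v |S_v|/|S| * Entropy(S_v)."""
    labels = [label for label, _ in examples]
    gain = entropy(labels)
    for v in {attrs[attr] for _, attrs in examples}:
        subset = [label for label, attrs in examples if attrs[attr] == v]
        gain -= len(subset) / len(examples) * entropy(subset)
    return gain

# (classification, {a1, a2}) for the six instances in Question 8.
examples = [
    ('+', {'a1': 'T', 'a2': 'T'}),
    ('+', {'a1': 'T', 'a2': 'T'}),
    ('-', {'a1': 'T', 'a2': 'F'}),
    ('+', {'a1': 'F', 'a2': 'F'}),
    ('-', {'a1': 'F', 'a2': 'T'}),
    ('-', {'a1': 'F', 'a2': 'T'}),
]

print(entropy([label for label, _ in examples]))   # 1.0 (3 positive, 3 negative)
print(information_gain(examples, 'a2'))            # ~0.0 (up to floating point): a2 carries no information here
```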
9) Write the steps of the ID3 algorithm.
10) Describe the hypothesis space search in ID3 and contrast it with the Candidate-Elimination algorithm.
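A minimal recursive ID3 sketch for Questions 9-10, reusing the entropy/gain idea from the previous snippet. The dictionary-based tree encoding and the two-attribute toy data are assumptions for illustration; refinements such as continuous attributes or missing values are omitted.

```python
from math import log2
from collections import Counter

def entropy(labels):
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in Counter(labels).values())

def gain(examples, attr):
    labels = [y for _, y in examples]
    g = entropy(labels)
    for v in {x[attr] for x, _ in examples}:
        subset = [y for x, y in examples if x[attr] == v]
        g -= len(subset) / len(examples) * entropy(subset)
    return g

def id3(examples, attributes):
    """Return a decision tree: either a class label or (attribute, {value: subtree})."""
    labels = [y for _, y in examples]
    if len(set(labels)) == 1:                  # all examples agree: leaf node
        return labels[0]
    if not attributes:                         # no attributes left: majority vote
        return Counter(labels).most_common(1)[0][0]
    best = max(attributes, key=lambda a: gain(examples, a))
    branches = {}
    for v in {x[best] for x, _ in examples}:
        subset = [(x, y) for x, y in examples if x[best] == v]
        branches[v] = id3(subset, [a for a in attributes if a != best])
    return (best, branches)

# Toy data (assumed): dicts of attribute values plus a class label.
data = [
    ({'Sky': 'Sunny', 'Wind': 'Strong'}, 'Yes'),
    ({'Sky': 'Sunny', 'Wind': 'Weak'},   'Yes'),
    ({'Sky': 'Rain',  'Wind': 'Strong'}, 'No'),
    ({'Sky': 'Rain',  'Wind': 'Weak'},   'No'),
]
print(id3(data, ['Sky', 'Wind']))   # e.g. ('Sky', {'Sunny': 'Yes', 'Rain': 'No'})
```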
11) Illustrate Occam's razor and explain its importance with respect to the ID3 algorithm.
12) Discuss the effect of reduced-error pruning in decision tree learning.
13) What are the capabilities and limitations of the ID3 algorithm?
14) Explain rule post-pruning.
15) What is a perceptron? Explain.
16) Explain the problems appropriate for ANN learning and their characteristics.
17) Discuss the representational power of perceptron.
18) Explain the perceptron training rule and the delta rule.
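An illustrative perceptron training rule sketch for Question 18, using the update w_i ← w_i + η(t − o)x_i with a thresholded output; learning the AND function is an assumed toy task.

```python
# Perceptron training rule sketch (illustrative): w_i <- w_i + eta * (t - o) * x_i,
# where o is the thresholded output. Trained here on the linearly separable AND function.

def predict(w, x):
    # x includes a constant 1 as the bias input.
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0

def train_perceptron(data, eta=0.1, epochs=20):
    w = [0.0] * len(data[0][0])
    for _ in range(epochs):
        for x, t in data:
            o = predict(w, x)
            w = [wi + eta * (t - o) * xi for wi, xi in zip(w, x)]
    return w

# (x0=1 bias, x1, x2) -> AND(x1, x2)
data = [((1, 0, 0), 0), ((1, 0, 1), 0), ((1, 1, 0), 0), ((1, 1, 1), 1)]
w = train_perceptron(data)
print([predict(w, x) for x, _ in data])   # [0, 0, 0, 1]
```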
19) Describe the derivation of the Gradient descent rule.
20) Write the Gradient descent rule algorithm and explain.
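A small batch gradient descent sketch for Questions 19-20, for a linear unit with squared error E(w) = ½ Σ_d (t_d − o_d)²; the weight update Δw_i = η Σ_d (t_d − o_d) x_id follows from ∂E/∂w_i. The toy target function and learning constants are assumptions.

```python
# Batch gradient descent for a linear unit (the delta rule), illustrative sketch.
# E(w) = 1/2 * sum_d (t_d - o_d)^2 with o_d = w . x_d, so
# dE/dw_i = -sum_d (t_d - o_d) * x_id and Delta w_i = eta * sum_d (t_d - o_d) * x_id.

def gradient_descent(data, eta=0.1, epochs=500):
    n = len(data[0][0])
    w = [0.0] * n
    for _ in range(epochs):
        delta = [0.0] * n
        for x, t in data:
            o = sum(wi * xi for wi, xi in zip(w, x))   # linear (unthresholded) output
            for i in range(n):
                delta[i] += eta * (t - o) * x[i]       # accumulate over all examples
        w = [wi + di for wi, di in zip(w, delta)]      # one batch update per epoch
    return w

# Toy data generated (by assumption) from t = 1 + 2*x1 - x2; x0 = 1 is the bias input.
data = [((1, 0, 0), 1), ((1, 1, 0), 3), ((1, 0, 1), 0), ((1, 1, 1), 2)]
print([round(wi, 3) for wi in gradient_descent(data)])   # [1.0, 2.0, -1.0], the assumed target weights
```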
21) Give the steps of the backpropagation algorithm used to train a multilayer network and explain.
22) Briefly explain the effect of adding momentum to the weight update rule in the backpropagation
algorithm.
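A compact stochastic backpropagation sketch for Questions 21-22: one hidden layer of sigmoid units, the weight update Δw = η δ x, plus a momentum term α · (previous update). The XOR task, layer size, learning constants, and random initialisation are assumptions for illustration, so convergence is typical rather than guaranteed.

```python
import numpy as np

# Backpropagation sketch (illustrative): one hidden layer of sigmoid units trained on XOR,
# with the momentum term alpha * previous_update added to each weight change.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_xor(eta=0.5, alpha=0.9, epochs=5000, hidden=4, seed=0):
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)
    W1 = rng.uniform(-0.5, 0.5, (2, hidden));  b1 = np.zeros(hidden)
    W2 = rng.uniform(-0.5, 0.5, (hidden, 1));  b2 = np.zeros(1)
    dW1_prev = np.zeros_like(W1); dW2_prev = np.zeros_like(W2)
    for _ in range(epochs):
        for x, t in zip(X, T):
            h = sigmoid(x @ W1 + b1)                    # forward pass
            o = sigmoid(h @ W2 + b2)
            delta_o = (t - o) * o * (1 - o)             # output unit error term
            delta_h = h * (1 - h) * (W2 @ delta_o)      # hidden unit error terms
            dW2 = eta * np.outer(h, delta_o) + alpha * dW2_prev   # momentum on weights
            dW1 = eta * np.outer(x, delta_h) + alpha * dW1_prev
            W2 += dW2; b2 += eta * delta_o
            W1 += dW1; b1 += eta * delta_h
            dW1_prev, dW2_prev = dW1, dW2
    return W1, b1, W2, b2

W1, b1, W2, b2 = train_xor()
out = sigmoid(sigmoid(np.array([[0, 0], [0, 1], [1, 0], [1, 1]]) @ W1 + b1) @ W2 + b2)
print(out.round(2).ravel())   # typically close to [0, 1, 1, 0] once training has converged
```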
23) Discuss the derivation of the backpropagation rule used in learning over directed acyclic graphs,
considering the different cases.
24) Explain the features of Bayesian learning methods.
25) Define Bayes theorem and MAP hypothesis.
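A tiny numeric illustration for Questions 25-26: Bayes' theorem P(h|D) = P(D|h)P(h)/P(D) applied to two candidate hypotheses, with the MAP hypothesis chosen as argmax_h P(D|h)P(h). All prior and likelihood numbers below are made up for the example.

```python
# Bayes theorem / MAP sketch with made-up numbers: two hypotheses h1, h2 and observed data D.
priors      = {'h1': 0.3, 'h2': 0.7}          # P(h)    (assumed)
likelihoods = {'h1': 0.8, 'h2': 0.2}          # P(D|h)  (assumed)

# P(D) = sum_h P(D|h) P(h) = 0.8*0.3 + 0.2*0.7 = 0.38
p_data = sum(likelihoods[h] * priors[h] for h in priors)

posteriors = {h: likelihoods[h] * priors[h] / p_data for h in priors}
h_map = max(posteriors, key=posteriors.get)   # MAP hypothesis: argmax_h P(D|h) P(h)

print(posteriors)   # {'h1': 0.631..., 'h2': 0.368...}
print(h_map)        # 'h1'
```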
26) Explain Brute force MAP hypothesis learner.
27) Explain the Maximum likelihood and Least-squared error hypothesis.
28) Write a note on Maximum likelihood hypothesis for predicting probabilities.
29) What is the minimum description length (MDL) principle? Explain.
30) Discuss the Naïve Bayes Classifier.
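A small Naive Bayes sketch for Question 30, classifying a new day on the EnjoySport data used earlier: v_NB = argmax_v P(v) Π_i P(a_i|v), with probabilities estimated by frequency counts. On such a tiny data set some conditional probabilities are zero, which is exactly what m-estimate smoothing is meant to avoid; smoothing is omitted here to keep the sketch short.

```python
from collections import Counter

def naive_bayes(examples, x):
    """v_NB = argmax_v P(v) * prod_i P(a_i | v), estimated by simple frequency counts."""
    labels = [y for _, y in examples]
    label_counts = Counter(labels)
    best_label, best_score = None, -1.0
    for v, count in label_counts.items():
        score = count / len(examples)                         # P(v)
        for i, a in enumerate(x):
            matches = sum(1 for attrs, y in examples if y == v and attrs[i] == a)
            score *= matches / count                          # P(a_i | v), unsmoothed
        if score > best_score:
            best_label, best_score = v, score
    return best_label, best_score

data = [
    (('Sunny', 'Warm', 'Normal', 'Strong', 'Warm', 'Same'),   'Yes'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Warm', 'Same'),   'Yes'),
    (('Rain',  'Cold', 'High',   'Strong', 'Warm', 'Change'), 'No'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Cool', 'Change'), 'Yes'),
]
print(naive_bayes(data, ('Sunny', 'Warm', 'High', 'Strong', 'Cool', 'Same')))
# -> ('Yes', ...): the 'No' class gets probability 0 because Sunny never occurs with 'No'
```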
31) Explain Bayesian belief network and conditional independence with example.
32) Explain EM Algorithm.
33) Discuss the derivation of the k-Means algorithm.
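A minimal k-Means sketch for Question 33: assign each point to its nearest centre, then recompute each centre as the mean of its assigned points, repeating until the centres stop changing. The one-dimensional data, k = 2, and the hand-picked initial centres are illustrative assumptions.

```python
# k-Means sketch (illustrative): alternate between assigning points to their nearest
# centre and moving each centre to the mean of the points assigned to it.

def k_means(points, centres, max_iters=100):
    clusters = [[] for _ in centres]
    for _ in range(max_iters):
        clusters = [[] for _ in centres]
        for p in points:
            nearest = min(range(len(centres)), key=lambda j: abs(p - centres[j]))
            clusters[nearest].append(p)
        new_centres = [sum(c) / len(c) if c else centres[j]
                       for j, c in enumerate(clusters)]
        if new_centres == centres:          # converged: centres no longer move
            break
        centres = new_centres
    return centres, clusters

# One-dimensional toy data with two obvious groups; initial centres chosen by hand.
points = [1.0, 1.5, 2.0, 8.0, 9.0, 10.0]
centres, clusters = k_means(points, centres=[1.0, 9.0])
print(centres)    # [1.5, 9.0]
print(clusters)   # [[1.0, 1.5, 2.0], [8.0, 9.0, 10.0]]
```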
34) Define True Error and Sample Error. What are they used for?
35) What is the importance of the Binomial and Normal distributions?
36) Define the mean, variance, and standard deviation of a random variable.
37) Write a note on Confidence intervals.
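A short numeric sketch for Questions 34-37: for a hypothetical sample of n = 40 test examples on which a hypothesis makes r = 12 mistakes, the sample error is errorS(h) = r/n, and the usual approximate N% two-sided confidence interval for the true error errorD(h) is errorS(h) ± z_N * sqrt(errorS(h)(1 − errorS(h))/n). The counts are made up.

```python
from math import sqrt

# Sample error and approximate confidence interval sketch (made-up counts):
# hypothesis h misclassifies r of n independently drawn test examples.
n, r = 40, 12
error_s = r / n                                   # sample error errorS(h) = 0.30
z_95 = 1.96                                       # z value for a 95% two-sided interval
margin = z_95 * sqrt(error_s * (1 - error_s) / n)

print(error_s)                                    # 0.3
print((round(error_s - margin, 3), round(error_s + margin, 3)))   # about (0.158, 0.442)
```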
38) How do you estimate the difference in error between two hypotheses using error_D(h) and error_S(h)?
39) Write a procedure to estimate the difference in error between two learning methods.
40) Describe the k-nearest neighbor algorithm. Why is it called instance-based learning?
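A short k-nearest neighbor sketch for Question 40; the 2-D points and labels are made up. Nothing is learned at training time beyond storing the examples, which is why the method is called instance-based (or lazy) learning.

```python
from collections import Counter
from math import dist    # Euclidean distance (Python 3.8+)

def knn_classify(training, query, k=3):
    """Classify query by majority vote among the k nearest stored training examples."""
    neighbors = sorted(training, key=lambda ex: dist(ex[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# "Training" is just storing the examples (lazy / instance-based learning).
training = [
    ((1.0, 1.0), 'A'), ((1.5, 2.0), 'A'), ((2.0, 1.0), 'A'),
    ((6.0, 6.0), 'B'), ((7.0, 7.5), 'B'), ((6.5, 6.0), 'B'),
]
print(knn_classify(training, (2.0, 2.0)))   # 'A'
print(knn_classify(training, (6.5, 7.0)))   # 'B'
```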
41) Write a note on
a. Locally weighted linear regression
b. Radial basis functions
c. Case-based Reasoning
42) Briefly explain the Reinforcement learning method.
43) Write the Q-learning algorithm and explain.
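A minimal tabular Q-learning sketch for Questions 42-43 on an assumed 4-state corridor world (move left/right, reward only for entering the rightmost state); the deterministic-world update Q(s, a) ← r + γ max_a' Q(s', a') is used, and the environment itself is made up for illustration.

```python
import random

# Tabular Q-learning sketch on a made-up 4-state corridor: states 0..3, actions
# 'left'/'right', reward 100 for entering the goal state 3, 0 otherwise.

GOAL, GAMMA = 3, 0.9
ACTIONS = ('left', 'right')

def step(s, a):
    s2 = max(0, s - 1) if a == 'left' else min(GOAL, s + 1)
    reward = 100 if s2 == GOAL else 0
    return s2, reward

def q_learning(episodes=500, seed=0):
    random.seed(seed)
    Q = {(s, a): 0.0 for s in range(GOAL) for a in ACTIONS}
    for _ in range(episodes):
        s = 0
        while s != GOAL:
            a = random.choice(ACTIONS)             # explore with a random policy
            s2, r = step(s, a)
            future = 0.0 if s2 == GOAL else max(Q[(s2, a2)] for a2 in ACTIONS)
            Q[(s, a)] = r + GAMMA * future         # deterministic Q-learning update
            s = s2
    return Q

Q = q_learning()
print({s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(GOAL)})
# learned greedy policy: {0: 'right', 1: 'right', 2: 'right'}
print(round(Q[(2, 'right')], 1))   # 100.0; Q decays by a factor of gamma per step away from the goal
```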
