
Course: CSC14003 – Introduction to Artificial Intelligence

Class 19CLC – Term III/2020-2021

Homework 04
Submission Notices:
• Conduct your homework by filling answers into the placeholders given in this file (in Microsoft Word
format). Questions are shown in black, instructions/hints are shown in italic blue, and your content
should use any color that is different from those.
• After completing your homework, prepare the file for submission by exporting the Word file (filled with
answers) to a PDF file whose filename follows this format,
<StudentID-1>_<StudentID-2>_HW04.pdf (Student IDs are sorted in ascending order)
E.g., 1952001_1952002_HW04.pdf
and then submit the file to Moodle directly WITHOUT any kind of compression (.zip, .rar, .tar, etc.).
• Note that you will get zero credit for any careless mistake, including, but not limited to, the following:
1. Wrong file/filename format, e.g., not a PDF file, using “-” instead of “_” as the separator, etc.
2. Disordered format of problems and answers
3. Homework not written in English
4. Cheating, i.e., copying other students’ work or letting other student(s) copy your work.

Problem 1. (1.5pts) Column A presents objective measures that are commonly used in decision tree
learning, while Column B lists definitions corresponding to the measures in Column A. Match each
entry of Column A with the most appropriate definition in Column B.
Column A
I.   Gini Index
II.  Likelihood-Ratio Chi-Squared Statistics
III. DKM Criterion
IV.  Gain Ratio
V.   Twoing Criterion
VI.  Information Gain

Column B
1. measures the statistical significance of the information gain criterion
2. is the splitting criterion of ID3
3. measures the divergence between the probability distributions of the target attribute’s values
4. requires smaller trees for obtaining a certain error than the Gini Index
5. is the splitting criterion of C4.5
6. is the splitting criterion of CART

Please fill your answer in the table below

I:
II:
III:
IV:
V:
VI:
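To make the Gini Index and Information Gain entries above concrete, here is a small Python sketch that computes both impurity measures for a list of class labels; the function names and the sample label list are illustrative choices only, not part of the homework.

```python
from collections import Counter
from math import log2

def gini(labels):
    """Gini index: 1 minus the sum of squared class proportions (used by CART)."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def entropy(labels):
    """Shannon entropy in bits: -sum p * log2(p) (the basis of Information Gain in ID3)."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

# Illustrative labels: 4 Bus, 3 Car, 3 Train (the target column of Problem 3)
labels = ["Bus"] * 4 + ["Car"] * 3 + ["Train"] * 3
print(gini(labels))     # ~0.66
print(entropy(labels))  # ~1.571 bits
```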

Problem 2. (2pts) Please write your answers in the table below.


Questions                                                     Answers
1. What is the purpose of the activation function?            The purpose of the activation function is to…
2. What are the advantages of the sigmoid function over a
   hard-limit function (e.g., step or sign)?
3. List the names of 3 different types of neural networks.
4. What is Gradient Descent?                                   Gradient Descent is …
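As background for question 4, a minimal Python sketch of gradient descent on a one-variable function; the quadratic, the starting point, and the learning rate are illustrative choices only.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step against the gradient: x <- x - lr * grad(x)."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2*(x - 3); the minimum is at x = 3.
x_min = gradient_descent(grad=lambda x: 2 * (x - 3), x0=0.0, lr=0.1, steps=100)
print(x_min)  # close to 3.0
```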

Problem 3. (2pts) Consider the following training dataset, in which Transportation is the target
attribute. Show calculations to choose an attribute for the root node of the ID3 decision tree.
Gender Car Ownership Travel Cost Income Level Transportation
Male 0 Cheap Low Bus
Male 1 Cheap Medium Bus
Female 1 Cheap Medium Train
Female 0 Cheap Low Bus
Male 1 Cheap Medium Bus
Male 0 Standard Medium Train
Female 1 Standard Medium Train
Female 1 Expensive High Car
Male 2 Expensive Medium Car
Female 2 Expensive High Car

Please fill your answer in the table below

                                   Counts                 Metric values
Attribute              Values      Bus    Car    Train    H      AE     IG
Whole                              4      3      3
Gender (0.5pt)         Female      1      2      2
                       Male        3      1      1
Car Ownership (0.5pt)  0           2      0      1
                       1           2      1      2
                       2           0      2      0
Travel Cost (0.5pt)    Cheap       4      0      1
                       Expensive   0      3      0
                       Standard    0      0      2
Income Level (0.5pt)   Low         2      0      0
                       Medium      2      1      3
                       High        0      0      2
(H = entropy, AE = weighted average entropy after the split, IG = information gain)
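A minimal Python sketch of how the H, AE, and IG columns can be computed for the dataset above, assuming H is the entropy of the target attribute, AE the weighted average entropy of the subsets produced by the split, and IG their difference; the helper and variable names are my own.

```python
from collections import Counter
from math import log2

# Training dataset from Problem 3: (Gender, Car Ownership, Travel Cost, Income Level, Transportation)
data = [
    ("Male",   0, "Cheap",     "Low",    "Bus"),
    ("Male",   1, "Cheap",     "Medium", "Bus"),
    ("Female", 1, "Cheap",     "Medium", "Train"),
    ("Female", 0, "Cheap",     "Low",    "Bus"),
    ("Male",   1, "Cheap",     "Medium", "Bus"),
    ("Male",   0, "Standard",  "Medium", "Train"),
    ("Female", 1, "Standard",  "Medium", "Train"),
    ("Female", 1, "Expensive", "High",   "Car"),
    ("Male",   2, "Expensive", "Medium", "Car"),
    ("Female", 2, "Expensive", "High",   "Car"),
]
attributes = {"Gender": 0, "Car Ownership": 1, "Travel Cost": 2, "Income Level": 3}
TARGET = 4  # index of the Transportation column

def entropy(rows):
    """H: Shannon entropy of the target attribute over the given rows."""
    n = len(rows)
    counts = Counter(r[TARGET] for r in rows)
    return -sum((c / n) * log2(c / n) for c in counts.values())

H_whole = entropy(data)  # entropy of the whole dataset

for name, idx in attributes.items():
    values = set(r[idx] for r in data)
    subsets = [[r for r in data if r[idx] == v] for v in values]
    # AE: weighted average entropy of the subsets induced by splitting on this attribute
    ae = sum((len(s) / len(data)) * entropy(s) for s in subsets)
    ig = H_whole - ae  # IG: information gain of the split
    print(f"{name}: H(whole)={H_whole:.3f}  AE={ae:.3f}  IG={ig:.3f}")

# ID3 chooses the attribute with the highest IG as the root node.
```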

Problem 4. (3pts) Given a neural network with two inputs, two hidden neurons, and two output
neurons, as shown below. Additionally, the hidden and output layers each include a bias neuron
that has a constant output value of 1.
• Learning rate: 0.5
• Input values: i1 = 0.05, i2 = 0.10
• Target values: t1 = 0.01, t2 = 0.99
• Bias values: b1 = 0.35, b2 = 0.60
• Initial weights: w1 = 0.15, w2 = 0.20, w3 = 0.25, w4 = 0.30,
  w5 = 0.40, w6 = 0.45, w7 = 0.50, w8 = 0.55
Present all calculations required to perform backpropagation once (i.e., one forward pass and one
backward pass) on the given neural network in the following cases:
a) Ignore all biases
Forward pass
h1 h2 o1 o2
Sum
Sigmoid

Backward pass
w1 w2 w3 w4 w5 w6 w7 w8

b) Take into account all biases
Forward pass
h1 h2 o1 o2
Sum
Sigmoid

Backward pass
w1 w2 w3 w4 w5 w6 w7 w8 b1 b2
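A minimal Python sketch of one forward and one backward pass with the given numbers. Because the network diagram is not reproduced here, the wiring below (w1, w2 into h1; w3, w4 into h2; w5, w6 into o1; w7, w8 into o2; b1 feeding the hidden layer and b2 the output layer) and the squared-error loss E = ½(t1−o1)² + ½(t2−o2)² are assumptions, not necessarily the intended figure.

```python
from math import exp

def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))

# Given values (assumed wiring: w1,w2 -> h1; w3,w4 -> h2; w5,w6 -> o1; w7,w8 -> o2)
i1, i2 = 0.05, 0.10
t1, t2 = 0.01, 0.99
b1, b2 = 0.35, 0.60          # set b1 = b2 = 0 for case (a), ignoring all biases
lr = 0.5
w1, w2, w3, w4 = 0.15, 0.20, 0.25, 0.30
w5, w6, w7, w8 = 0.40, 0.45, 0.50, 0.55

# Forward pass
net_h1 = w1 * i1 + w2 * i2 + b1
net_h2 = w3 * i1 + w4 * i2 + b1
h1, h2 = sigmoid(net_h1), sigmoid(net_h2)
net_o1 = w5 * h1 + w6 * h2 + b2
net_o2 = w7 * h1 + w8 * h2 + b2
o1, o2 = sigmoid(net_o1), sigmoid(net_o2)

# Backward pass with assumed loss E = 0.5*(t1-o1)^2 + 0.5*(t2-o2)^2
d_o1 = (o1 - t1) * o1 * (1 - o1)                  # dE/dnet_o1
d_o2 = (o2 - t2) * o2 * (1 - o2)                  # dE/dnet_o2
d_h1 = (d_o1 * w5 + d_o2 * w7) * h1 * (1 - h1)    # dE/dnet_h1 (uses old w5, w7)
d_h2 = (d_o1 * w6 + d_o2 * w8) * h2 * (1 - h2)    # dE/dnet_h2 (uses old w6, w8)

# Gradient-descent updates: w <- w - lr * dE/dw
w5, w6 = w5 - lr * d_o1 * h1, w6 - lr * d_o1 * h2
w7, w8 = w7 - lr * d_o2 * h1, w8 - lr * d_o2 * h2
w1, w2 = w1 - lr * d_h1 * i1, w2 - lr * d_h1 * i2
w3, w4 = w3 - lr * d_h2 * i1, w4 - lr * d_h2 * i2
b2 = b2 - lr * (d_o1 + d_o2)   # assumed: one shared bias weight per layer
b1 = b1 - lr * (d_h1 + d_h2)

print([round(w, 5) for w in (w1, w2, w3, w4, w5, w6, w7, w8)])
```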

Problem 5. (1.5pts) Label the following as Y (= yes) or N (= no) depending on whether a linear
classifier with a “hard” decision boundary (= a perceptron with a step transfer function) can
correctly classify the examples shown. If your answer is Y (= yes), fill in a set of weights that
correctly classifies them. Use w0 as the threshold and wi as the weight for input xi. All perceptrons
have three Boolean inputs, x1, x2, and x3, and a “dummy” input, x0, which is always equal to one.
They all compute the decision function ∑ wi·xi > 0. You may not transform the input space, i.e.,
they operate on the stated inputs. Recall that the three Boolean inputs map each possible
example to a corner of a three-dimensional cube. The first one is done for you as an example.

a) “At least two inputs are 1.”


List the named set S, represented as 3-bit Boolean numbers: S = {011,101,110,111}.
Correctly classifiable? Y
If yes, weights are w0 = -1.5 ; w1 = 1 ; w2 = 1 ; w3 = 1
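The example answer can be checked by brute force over all eight corners of the Boolean cube; here is a small Python sketch, with the concept function and names chosen for illustration.

```python
from itertools import product

def perceptron(weights, x):
    """Decision function: sum_i w_i * x_i > 0, with x0 = 1 as the dummy input."""
    w0, w1, w2, w3 = weights
    x1, x2, x3 = x
    return w0 * 1 + w1 * x1 + w2 * x2 + w3 * x3 > 0

def at_least_two(x):
    """Target concept from part (a): at least two inputs are 1."""
    return sum(x) >= 2

weights = (-1.5, 1, 1, 1)
print(all(perceptron(weights, x) == at_least_two(x)
          for x in product((0, 1), repeat=3)))  # True: all 8 corners classified correctly
```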

c) (0.5 pts) “Exactly two inputs are 1.”


List the named set S, represented as 3-bit Boolean numbers: S = ………………..
Correctly classifiable? ...............
If yes, weights are w0 = …………… ; w1 = …………… ; w2 = …………. ; w3 = ………..

d) (0.5 pts) “At most two inputs are 1.”


List the named set S, represented as 3-bit Boolean numbers: S = ………………………...
Correctly classifiable? …………………
If yes, weights are w0 = …………… ; w1 = …………… ; w2 = …………. ; w3 = ………..

e) (0.5 pts) “Input x1 = 1, input x2 = 0, input x3 = anything.”


List the named set S, represented as 3-bit Boolean numbers: S = …………………...
Correctly classifiable? ...................
If yes, weights are w0 = …………… ; w1 = …………… ; w2 = …………. ; w3 = ………..
