EC3606
2. Discuss the learning algorithm of the single-layer perceptron and establish its convergence.
Also explain the limitations of the single-layer perceptron that motivate the multi-layer
perceptron. [5]
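A minimal sketch of the perceptron learning rule on the linearly separable AND problem (the choice of AND, bipolar labels, and learning rate are illustrative assumptions). For separable data the number of updates is bounded (Novikoff's convergence theorem), so the loop below terminates:

```python
import numpy as np

# Single-layer perceptron on AND. Labels are in {-1, +1}; the bias is
# folded in as a constant +1 input. The update w <- w + eta*y*x is
# applied only on misclassified samples.
X = np.array([[0, 0, 1],   # last column is the bias input
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1], dtype=float)  # AND with bipolar labels

w = np.zeros(3)
eta = 1.0
for epoch in range(100):
    errors = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:      # misclassified (or on the boundary)
            w += eta * yi * xi      # perceptron update
            errors += 1
    if errors == 0:                 # converged: all points correct
        break

print(np.sign(X @ w))  # matches y once converged
```

The same loop run on XOR never reaches `errors == 0`, which is the classic limitation motivating the multi-layer perceptron.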
𝐽(𝑤) = ‖𝑦 − 𝑋𝑤‖² + 𝜆‖𝑤‖²
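This regularized least-squares (ridge) cost admits a closed-form minimizer; a minimal numerical sketch, assuming the squared-norm form of the cost (the data and λ below are illustrative):

```python
import numpy as np

# The ridge cost J(w) = ||y - Xw||^2 + lam*||w||^2 has the minimizer
# w* = (X^T X + lam*I)^{-1} X^T y, obtained by setting
# grad J = -2 X^T (y - Xw) + 2*lam*w to zero.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=20)

lam = 0.1
w_star = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)

# Optimality check: the gradient at w_star is (numerically) zero.
grad = -2 * X.T @ (y - X @ w_star) + 2 * lam * w_star
print(np.allclose(grad, 0))  # True
```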
4. Develop an optimal Bayesian classifier for the data {0, 1}, whose generation probabilities
are 𝑝(0) = 0.4 and 𝑝(1) = 0.6, in the presence of Gaussian noise 𝒩(0, 𝜎²). Show that, as the
distance between the points 0 and 1 increases, the probability of misclassification
decreases. [5]
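A sketch of the claimed behaviour, placing the two symbols at 0 and d so the distance can vary (σ = 1 is an illustrative assumption). The MAP rule compares 𝑝(1)𝒩(r; d, σ²) with 𝑝(0)𝒩(r; 0, σ²), giving the threshold t = d/2 + (σ²/d)·ln(p₀/p₁):

```python
from math import erf, log, sqrt

# Misclassification probability of the optimal Bayes classifier for
# symbols 0 and d in Gaussian noise N(0, sigma^2), priors p0=0.4, p1=0.6:
#   P_e = p0*Q(t/sigma) + p1*Q((d - t)/sigma),
# where Q is the Gaussian tail and t the MAP threshold.

def Q(x):                       # Gaussian tail probability
    return 0.5 * (1.0 - erf(x / sqrt(2.0)))

def p_error(d, sigma=1.0, p0=0.4, p1=0.6):
    t = d / 2 + (sigma**2 / d) * log(p0 / p1)   # MAP threshold
    return p0 * Q(t / sigma) + p1 * Q((d - t) / sigma)

errs = [p_error(d) for d in (0.5, 1.0, 2.0, 4.0)]
print(errs)  # strictly decreasing as the symbol distance grows
```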
5. Develop the learning algorithm for a multilayer perceptron of type [2 × 4 × 1]. In this
algorithm, explain the parameter initialization, the activation function, and the forward
and backward passes. [5]
6. Develop the backpropagation algorithm for the following architecture. Use kernel variables
𝒲⁽¹⁾ and 𝒲⁽²⁾ for the first and second convolution layers. The hidden layer of the fully
connected part has 𝑁ₕ nodes. The input tensor is a vector of dimension 𝑁. There is
one neuron in the output layer of the fully connected part. You may choose any activation
function. Assume the least-squares error cost function for classification.
[5]
[Figure: 𝒙 ∈ ℝ^(𝑀×1) → Conv. → ReLU → Conv. → Fully Connected Classifier]
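A sketch of the requested backward pass for a 1-D input passed through conv(𝒲⁽¹⁾) → ReLU → conv(𝒲⁽²⁾) → a fully connected hidden layer → one sigmoid output with squared-error cost. The "valid" convolutions, kernel sizes, and sigmoid choice are illustrative assumptions; the kernel gradient is again checked numerically:

```python
import numpy as np

rng = np.random.default_rng(2)
M, k1, k2, Nh = 12, 3, 3, 5
x = rng.normal(size=M); t = 1.0
W1 = rng.normal(scale=0.3, size=k1)        # first conv kernel
W2 = rng.normal(scale=0.3, size=k2)        # second conv kernel
L2 = M - k1 + 1 - k2 + 1                   # length after two valid convs
V = rng.normal(scale=0.3, size=(Nh, L2))   # FC hidden weights
u = rng.normal(scale=0.3, size=Nh)         # FC output weights

sig = lambda z: 1 / (1 + np.exp(-z))
conv = lambda s, w: np.correlate(s, w, mode='valid')  # sliding dot product

def forward(W1, W2, V, u):
    c1 = conv(x, W1); r1 = np.maximum(c1, 0)  # conv -> ReLU
    c2 = conv(r1, W2)                          # second conv (linear)
    h = sig(V @ c2); yhat = sig(u @ h)         # fully connected classifier
    return c1, r1, c2, h, yhat

c1, r1, c2, h, yhat = forward(W1, W2, V, u)
loss = 0.5 * (yhat - t) ** 2

# Backward pass, layer by layer.
d_out = (yhat - t) * yhat * (1 - yhat)         # output delta
g_u = d_out * h
d_h = d_out * u * h * (1 - h)                  # hidden-layer delta
g_V = np.outer(d_h, c2)
d_c2 = V.T @ d_h                               # delta at conv-2 output
g_W2 = np.correlate(r1, d_c2, mode='valid')    # kernel-2 gradient
d_r1 = np.convolve(d_c2, W2, mode='full')      # delta pushed through conv-2
d_c1 = d_r1 * (c1 > 0)                         # ReLU gate
g_W1 = np.correlate(x, d_c1, mode='valid')     # kernel-1 gradient

# Finite-difference check on one kernel weight, W1[0].
eps = 1e-6
Wp = W1.copy(); Wp[0] += eps
Wm = W1.copy(); Wm[0] -= eps
lp = 0.5 * (forward(Wp, W2, V, u)[4] - t) ** 2
lm = 0.5 * (forward(Wm, W2, V, u)[4] - t) ** 2
num = (lp - lm) / (2 * eps)
print(abs(num - g_W1[0]) < 1e-6)  # backprop gradient matches numerics
```

Note the pattern: a "valid" correlation computes each kernel gradient from the layer input and the incoming delta, while a "full" convolution with the kernel propagates the delta back to the previous layer.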