
1. The eigenvalues 𝜆 are given as the solutions of the equation
det(𝐀 − 𝜆𝐈) = 0
Ans b
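
As a quick cross-check (not part of the original answer key), the sketch below uses NumPy to confirm that det(𝐀 − 𝜆𝐈) is numerically zero at each eigenvalue; the 2×2 matrix is an arbitrary illustrative choice.

import numpy as np

# Arbitrary 2x2 example (illustration only); its eigenvalues are 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Each eigenvalue lambda should make det(A - lambda*I) numerically zero.
for lam in np.linalg.eigvals(A):
    residual = np.linalg.det(A - lam * np.eye(2))
    print(f"lambda = {lam:.4f}, det(A - lambda*I) = {residual:.2e}")
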
2. The eigenvalues of a Hermitian symmetric matrix are real but not necessarily positive.
Ans c
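
A minimal NumPy sketch of this point, with an arbitrarily chosen Hermitian matrix: the eigenvalues come out real, and one of them is negative.

import numpy as np

# Hermitian (conjugate-symmetric) example chosen for illustration only.
H = np.array([[1.0,        2.0 + 1.0j],
              [2.0 - 1.0j, -3.0      ]])

assert np.allclose(H, H.conj().T)   # Hermitian: H equals its conjugate transpose

eigs = np.linalg.eigvalsh(H)        # eigvalsh returns real eigenvalues for Hermitian input
print(eigs)                         # real values (here 2 and -4), one of them negative
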
3. The eigenvalues of the matrix below can be found as shown
𝐀 − 𝜆𝐈 = [2 − 𝜆, 3; 2, 1 − 𝜆] ⇒ |𝐀 − 𝜆𝐈| = (2 − 𝜆)(1 − 𝜆) − 6 = 0
⇒ 𝜆² − 3𝜆 − 4 = 0 ⇒ (𝜆 − 4)(𝜆 + 1) = 0 ⇒ 𝜆 = 4, −1
Ans a
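
The result can be cross-checked numerically; the sketch below compares the roots of the characteristic polynomial 𝜆² − 3𝜆 − 4 with NumPy's eigenvalue routine.

import numpy as np

A = np.array([[2.0, 3.0],
              [2.0, 1.0]])

# Roots of the characteristic polynomial lambda^2 - 3*lambda - 4.
poly_roots = np.roots([1.0, -3.0, -4.0])

# Direct eigenvalue computation.
eig_vals = np.linalg.eigvals(A)

print(np.sort(poly_roots))   # [-1.  4.]
print(np.sort(eig_vals))     # [-1.  4.]
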
4. The pseudo-inverse of the matrix 𝐀 can be evaluated as below
(𝐀ᵀ𝐀)⁻¹ = ([1, 1, 1, 1; 1, −1, −1, 1] [1, 1; 1, −1; 1, −1; 1, 1])⁻¹ = ([4, 0; 0, 4])⁻¹ = (1/4) [1, 0; 0, 1]
(𝐀ᵀ𝐀)⁻¹𝐀ᵀ = (1/4) [1, 0; 0, 1] [1, 1, 1, 1; 1, −1, −1, 1] = (1/4) [1, 1, 1, 1; 1, −1, −1, 1]
Ans b
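
A short NumPy check of the working above, assuming 𝐀 is the 4×2 matrix read off from the factors shown: the closed-form left pseudo-inverse (𝐀ᵀ𝐀)⁻¹𝐀ᵀ matches np.linalg.pinv and acts as a left inverse.

import numpy as np

# The 4x2 matrix A as read off from the working above.
A = np.array([[1.0,  1.0],
              [1.0, -1.0],
              [1.0, -1.0],
              [1.0,  1.0]])

# Left pseudo-inverse for a tall, full-column-rank matrix: (A^T A)^{-1} A^T.
left_pinv = np.linalg.inv(A.T @ A) @ A.T

print(left_pinv)                                   # equals (1/4) * A^T here
print(np.allclose(left_pinv, np.linalg.pinv(A)))   # True
print(np.allclose(left_pinv @ A, np.eye(2)))       # True: it is a left inverse
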
5. The eigenvalues 𝜆_𝑖 of a unitary matrix 𝐔 satisfy the property |𝜆_𝑖| = 1
Ans c
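
An illustrative NumPy sketch (the matrix is an arbitrary assumption): build a unitary matrix from a QR decomposition and confirm that every eigenvalue has unit magnitude.

import numpy as np

rng = np.random.default_rng(0)

# Build an arbitrary unitary matrix from the QR factor of a random complex matrix.
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, _ = np.linalg.qr(M)

assert np.allclose(U.conj().T @ U, np.eye(4))   # U is unitary

# Every eigenvalue of a unitary matrix lies on the unit circle.
print(np.abs(np.linalg.eigvals(U)))             # all entries ~1.0
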
6. Given ℎ₁ = 1 − 𝑗, ℎ₂ = −1 − 𝑗, the effective channel matrix for this system is given
as
[ℎ₁, ℎ₂; ℎ₂*, −ℎ₁*] = [1 − 𝑗, −1 − 𝑗; −1 + 𝑗, −1 − 𝑗]
Ans d
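
A brief NumPy sketch of the matrix above, here called H for illustration. The orthogonal-column property it exhibits (Hᴴ H proportional to 𝐈) is characteristic of Alamouti-type space-time structures; that interpretation is an assumption, since the original question's system description is not reproduced here.

import numpy as np

h1 = 1 - 1j
h2 = -1 - 1j

# Effective channel matrix from the answer above.
H = np.array([[h1,           h2          ],
              [np.conj(h2), -np.conj(h1)]])

print(H)
# Its columns are orthogonal: H^H H = (|h1|^2 + |h2|^2) * I.
print(H.conj().T @ H)    # 4 * identity for these channel values
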
7. The picture shown corresponds to a Regressor
Ans a
8. Principal Component Analysis (PCA) is used in machine learning for dimensionality
reduction.
Ans a
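
A minimal NumPy sketch of the idea, using synthetic 2-D data and an eigendecomposition of the sample covariance to project onto the leading principal component; the data, dimensions, and seed are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)

# Synthetic 2-D data with most of its variance along one axis (illustration only).
X = rng.standard_normal((200, 2)) @ np.array([[3.0, 0.0],
                                              [0.0, 0.5]])

# PCA: eigendecomposition of the sample covariance of the centered data.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eig_vals, eig_vecs = np.linalg.eigh(cov)     # eigenvalues in ascending order

# Keep the principal component with the largest eigenvalue: 2-D -> 1-D.
top_pc = eig_vecs[:, -1:]
X_reduced = Xc @ top_pc

print(X.shape, "->", X_reduced.shape)        # (200, 2) -> (200, 1)
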
9. The pseudo-inverse of the matrix 𝐀 is
𝐀ᵀ(𝐀𝐀ᵀ)⁻¹ = [1, 1; 1, −1; −1, 1; −1, −1] ([1, 1, −1, −1; 1, −1, 1, −1] [1, 1; 1, −1; −1, 1; −1, −1])⁻¹
= [1, 1; 1, −1; −1, 1; −1, −1] ([4, 0; 0, 4])⁻¹ = (1/4) [1, 1; 1, −1; −1, 1; −1, −1]
Ans d
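
A NumPy check of the working above, assuming 𝐀 is the 2×4 matrix read off from the factors shown: the closed-form right pseudo-inverse 𝐀ᵀ(𝐀𝐀ᵀ)⁻¹ matches np.linalg.pinv and acts as a right inverse.

import numpy as np

# The 2x4 matrix A as read off from the working above.
A = np.array([[1.0,  1.0, -1.0, -1.0],
              [1.0, -1.0,  1.0, -1.0]])

# Right pseudo-inverse for a wide, full-row-rank matrix: A^T (A A^T)^{-1}.
right_pinv = A.T @ np.linalg.inv(A @ A.T)

print(right_pinv)                                   # equals (1/4) * A^T here
print(np.allclose(right_pinv, np.linalg.pinv(A)))   # True
print(np.allclose(A @ right_pinv, np.eye(2)))       # True: it is a right inverse
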
10. Given a 2-dimensional random column vector 𝐱̅ that has the multi-variate Gaussian
distribution with mean 𝛍̅ = [−3; 2] and covariance matrix 𝚺 = [2, 0; 0, 4], and given the
vector
𝐲̅ = 𝐀𝐱̅ + 𝐛̅ = [−2, 1; −3, 2] 𝐱̅ + [1; −2],
𝐲̅ is Gaussian with mean and covariance as follows:
𝛍̅_𝑦 = 𝐀𝛍̅ + 𝐛̅ = [−2, 1; −3, 2] [−3; 2] + [1; −2] = [9; 11]
𝚺_𝑦 = 𝐀𝚺𝐀ᵀ = [−2, 1; −3, 2] [2, 0; 0, 4] [−2, −3; 1, 2]
= [−2, 1; −3, 2] [−4, −6; 4, 8] = [12, 20; 20, 34]
Ans c
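
The closed-form mean and covariance can be cross-checked by Monte-Carlo simulation; the sketch below transforms samples of 𝐱̅ and compares sample statistics with the values derived above (sample size and seed are arbitrary choices).

import numpy as np

rng = np.random.default_rng(2)

mu    = np.array([-3.0, 2.0])
Sigma = np.array([[2.0, 0.0],
                  [0.0, 4.0]])
A     = np.array([[-2.0, 1.0],
                  [-3.0, 2.0]])
b     = np.array([1.0, -2.0])

# Closed-form mean and covariance of y = A x + b.
mu_y    = A @ mu + b           # [ 9. 11.]
Sigma_y = A @ Sigma @ A.T      # [[12. 20.] [20. 34.]]

# Monte-Carlo cross-check by transforming samples of x.
x = rng.multivariate_normal(mu, Sigma, size=200_000)
y = x @ A.T + b

print(mu_y, Sigma_y)
print(y.mean(axis=0))           # close to [9, 11]
print(np.cov(y, rowvar=False))  # close to [[12, 20], [20, 34]]
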
