
Mathematical Foundations for Machine Learning
WiSe 22/23

Machine Learning Group
Faculty IV, Technische Universität Berlin
Prof. Dr. Klaus-Robert Müller
Email: klaus-robert.mueller@tu-berlin.de

Week 2: Matrices

Homework
Solutions must be submitted on ISIS by Tuesday, Nov 15, at 10 am.

Exercise 1 (6 Points)
1. The rank of the matrix $\begin{pmatrix} 4 & 4 & 4 \\ 4 & 4 & 4 \\ 4 & 4 & 4 \end{pmatrix}$ is:

   [ ] 1     [ ] 3     [ ] 4

2. For any symmetric, invertible matrix $A \in \mathbb{R}^{n \times n}$ and vectors $v, w \in \mathbb{R}^n$ it holds that:

   [ ] $\langle Av, w \rangle = \langle v, Aw \rangle$
   [ ] $\langle Av, Aw \rangle = \langle v, w \rangle$
   [ ] $\langle Av, w \rangle = \langle v, A^{-1} w \rangle$
3. Which of the following statements is true? For any square $n \times n$ matrix $A$ it holds that:

   [ ] $\det A = 0 \;\Rightarrow\; \operatorname{rank} A = 0$
   [ ] $\det A = 0 \;\Leftrightarrow\; \operatorname{rank} A < n$
   [ ] $\operatorname{rank} A = n \;\Rightarrow\; \det A = n$
4. Which of the following matrices is orthogonal?

   [ ] $\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$     [ ] $\begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}$     [ ] $\begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix}$

5. Let $\langle \cdot, \cdot \rangle$ be the standard scalar product on $\mathbb{R}^n$ and let $A \in \mathbb{R}^{n \times n}$ be an arbitrary square matrix with full rank. Which of the following mappings from $\mathbb{R}^n \times \mathbb{R}^n$ to $\mathbb{R}$ defines a scalar product on $\mathbb{R}^n$?

   [ ] $f(x, y) := \langle Ax, y \rangle$ for $x, y \in \mathbb{R}^n$
   [ ] $g(x, y) := \langle Ax, Ay \rangle$ for $x, y \in \mathbb{R}^n$
   [ ] $h(x, y) := \langle Ax, A^\top y \rangle$ for $x, y \in \mathbb{R}^n$
6. Which of the following sets of $n \times n$ matrices, together with matrix addition and scalar multiplication, do not form a real vector space?

   [ ] The set of symmetric matrices $\{A \in \mathbb{R}^{n \times n} \mid A^\top = A\}$
   [ ] The set of orthogonal matrices $\{A \in \mathbb{R}^{n \times n} \mid AA^\top = A^\top A = I_n\}$
   [ ] The set of upper triangular matrices $\{A \in \mathbb{R}^{n \times n} \mid A_{ij} = 0 \text{ for } i > j\}$
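The statements above can be sanity-checked numerically on concrete cases. The following NumPy sketch is not part of the original sheet; it checks the rank of the all-fours matrix from question 1, the symmetry identity from question 2 on a randomly generated symmetric matrix, and tests the three candidate matrices from question 4 for orthogonality. The random seed and the dimension 4 are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Question 1: rank of the 3x3 matrix whose entries are all 4.
M = np.full((3, 3), 4.0)
print(np.linalg.matrix_rank(M))          # 1 -- every column is a multiple of (1, 1, 1)

# Question 2: for a symmetric matrix A, <Av, w> = <v, Aw>.
A = rng.standard_normal((4, 4))
A = A + A.T                              # symmetrize
v = rng.standard_normal(4)
w = rng.standard_normal(4)
print(np.isclose(np.dot(A @ v, w), np.dot(v, A @ w)))   # True

# Question 4: Q is orthogonal iff Q^T Q = I.
candidates = [np.array([[0, 1], [1, 0]]),
              np.array([[1, 1], [1, 1]]),
              np.array([[1, 1], [1, 0]])]
for Q in candidates:
    print(np.allclose(Q.T @ Q, np.eye(2)))               # True, False, False
```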

Exercise 2 (6 Points)
1. What transformation is described by the following matrix? What is the determinant?

   $\frac{1}{\sqrt{2}} \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}$

2. Show: Orthogonal matrices have determinant $1$ or $-1$.

   Hint: Show that
   $\det R = (\det R)^{-1}$
   for any orthogonal matrix $R \in \mathbb{R}^{n \times n}$.
Solution:

1. The matrix describes a rotation by 45°; its determinant is 1.
2. Let $R \in \mathbb{R}^{n \times n}$ be orthogonal, i.e. $R^\top R = R R^\top = I$. Then
   $\det(R)\,\det(R^\top) = \det(R R^\top) = \det(I) = 1$,
   and since $\det(R^\top) = \det(R)$, it follows that $(\det R)^2 = 1$, hence $\det R = \pm 1$.
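As a numerical companion to the solution above (not part of the original sheet), the sketch below evaluates the determinant of the rotation matrix as reconstructed in part 1 and of a few random orthogonal matrices obtained from QR decompositions; all determinants come out as $\pm 1$. The seed and the matrix size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# Exercise 2.1: the (reconstructed) 45-degree rotation matrix.
R45 = np.array([[1.0, -1.0],
                [1.0,  1.0]]) / np.sqrt(2)
print(np.linalg.det(R45))                    # ~1.0
print(np.allclose(R45.T @ R45, np.eye(2)))   # True -- it is orthogonal

# Exercise 2.2: random orthogonal matrices (Q factor of a QR decomposition)
# always have determinant +1 or -1.
for _ in range(3):
    Q, _r = np.linalg.qr(rng.standard_normal((5, 5)))
    print(round(np.linalg.det(Q), 6))        # +/-1.0
```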


Exercise 3 (8 Points)

Let $\mathcal{U}$ be an $r$-dimensional vector subspace of $\mathbb{R}^n$ with basis $\{u_1, \ldots, u_r\} \subset \mathbb{R}^n$. Let $U := (u_1, \ldots, u_r) \in \mathbb{R}^{n \times r}$ be the matrix whose columns are the $u_i$. The matrix of the orthogonal projection of $\mathbb{R}^n$ onto $\mathcal{U}$ is given by $P := U (U^\top U)^{-1} U^\top$.

1. Compute the product $P \cdot P$. What does the result mean intuitively?

2. Show: If $r = n$, then $P$ is the identity matrix, i.e., $P = I$.
3. What is the $3 \times 3$ matrix that describes the orthogonal projection onto the vector subspace spanned by the vectors $\begin{pmatrix} 3 \\ 4 \\ 0 \end{pmatrix}$ and $\begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}$?

Solution:

1. $P \cdot P = U (U^\top U)^{-1} U^\top \, U (U^\top U)^{-1} U^\top = U (U^\top U)^{-1} (U^\top U) (U^\top U)^{-1} U^\top = U (U^\top U)^{-1} U^\top = P$.
   Intuitively, this means that the projection of a vector that already lies in $\mathcal{U}$ onto $\mathcal{U}$ is the vector itself: applying the projection twice has the same effect as applying it once.

2. If $r = n$, then $U$ is a square $n \times n$ matrix whose columns form a basis of $\mathbb{R}^n$, so $U$ (and hence $U^\top$) is invertible. Therefore
   $P = U (U^\top U)^{-1} U^\top = U\, U^{-1} (U^\top)^{-1} U^\top = I$.

3. With $U = \begin{pmatrix} 3 & 1 \\ 4 & 0 \\ 0 & 0 \end{pmatrix}$ the formula gives
   $P = U (U^\top U)^{-1} U^\top = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}$,
   the orthogonal projection onto the $x_1 x_2$-plane.
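A brief NumPy check of the solution above (not part of the original sheet): it evaluates $P = U (U^\top U)^{-1} U^\top$ for the basis from part 3, confirms the idempotence from part 1, and illustrates part 2 with an arbitrary invertible $2 \times 2$ matrix. Using np.linalg.solve instead of an explicit inverse is purely a numerical convenience.

```python
import numpy as np

# Basis vectors from Exercise 3.3 as the columns of U.
U = np.array([[3.0, 1.0],
              [4.0, 0.0],
              [0.0, 0.0]])

# P = U (U^T U)^{-1} U^T, written with a linear solve instead of an explicit inverse.
P = U @ np.linalg.solve(U.T @ U, U.T)

print(np.round(P, 6))
# [[1. 0. 0.]
#  [0. 1. 0.]
#  [0. 0. 0.]]   -- projection onto the x1-x2 plane

print(np.allclose(P @ P, P))   # True: P is idempotent (Exercise 3.1)

# With r = n (square, invertible U) the formula collapses to the identity (Exercise 3.2).
V = np.array([[2.0, 1.0],
              [0.0, 3.0]])
print(np.allclose(V @ np.linalg.solve(V.T @ V, V.T), np.eye(2)))   # True
```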
