PCA Answer Q3.2

Answer:

The data points:

$$
x_1 = \begin{bmatrix} 4 \\ 1 \\ 2 \end{bmatrix}, \quad
x_2 = \begin{bmatrix} 3 \\ 2 \\ 8 \end{bmatrix} \quad \text{and} \quad
x_3 = \begin{bmatrix} 2 \\ 4 \\ 3 \end{bmatrix}
$$

The data matrix:

$$
X = \begin{bmatrix} x_1 & x_2 & x_3 \end{bmatrix} =
\begin{bmatrix} 4 & 3 & 2 \\ 1 & 2 & 4 \\ 2 & 8 & 3 \end{bmatrix}
$$

The mean vector:

$$
m = \frac{x_1 + x_2 + x_3}{3} =
\begin{bmatrix} 3.0000 \\ 2.3333 \\ 4.3333 \end{bmatrix}
$$

The normalized (i.e. mean-centered) data matrix, obtained by subtracting $m$ from each column:

$$
X - m = \begin{bmatrix} x_1 - m & x_2 - m & x_3 - m \end{bmatrix} =
\begin{bmatrix}
 1.0000 & 0       & -1.0000 \\
-1.3333 & -0.3333 &  1.6667 \\
-2.3333 &  3.6667 & -1.3333
\end{bmatrix}
$$
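As a quick numerical check, the mean vector and the centered matrix can be reproduced with NumPy. This is only a minimal sketch; the names `X`, `m` and `Xc` are chosen here to mirror the notation above.

```python
import numpy as np

# Data matrix with x1, x2, x3 as columns
X = np.array([[4., 3., 2.],
              [1., 2., 4.],
              [2., 8., 3.]])

# Mean vector m: average over the columns (axis=1)
m = X.mean(axis=1, keepdims=True)   # [[3.0000], [2.3333], [4.3333]]

# Mean-centered data matrix: subtract m from every column
Xc = X - m
print(m)
print(Xc)
```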

The covariance matrix (using the population normalization, i.e. dividing by $n = 3$):

$$
C = \frac{(X - m)(X - m)^T}{3} =
\begin{bmatrix}
 0.6667 & -1.0000 & -0.3333 \\
-1.0000 &  1.5556 & -0.1111 \\
-0.3333 & -0.1111 &  6.8889
\end{bmatrix}
$$
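The same covariance matrix can be checked numerically. Note the division by $n = 3$ (population covariance), which matches NumPy's `np.cov` with `bias=True`; again, this is just a sketch using the array names from the previous snippet.

```python
import numpy as np

X = np.array([[4., 3., 2.],
              [1., 2., 4.],
              [2., 8., 3.]])
Xc = X - X.mean(axis=1, keepdims=True)        # mean-centered columns

# Population covariance: divide by n = 3 rather than n - 1
C = (Xc @ Xc.T) / X.shape[1]
print(C)                                      # ~[[0.6667, -1.0, -0.3333], ...]
print(np.allclose(C, np.cov(X, bias=True)))   # True: same normalization
```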
The eigenvalue decomposition of $C$ gives eigenvalues ($\lambda_1$, $\lambda_2$ and $\lambda_3$) and eigenvectors ($v_1$, $v_2$ and $v_3$):

$$
\lambda_1 = 6.9073, \quad \lambda_2 = 2.2038 \quad \text{and} \quad \lambda_3 = 0
$$

$$
v_1 = \begin{bmatrix} -0.0516 \\ -0.0111 \\ 0.9986 \end{bmatrix}, \quad
v_2 = \begin{bmatrix} -0.5424 \\ 0.8399 \\ -0.0187 \end{bmatrix} \quad \text{and} \quad
v_3 = \begin{bmatrix} -0.8385 \\ -0.5426 \\ -0.0493 \end{bmatrix}
$$

Note: the eigenvalues are arranged such that $\lambda_1 > \lambda_2 > \lambda_3$, and the eigenvectors are arranged accordingly.
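A sketch of this step: `np.linalg.eigh` (intended for symmetric matrices such as $C$) returns eigenvalues in ascending order, so they are re-sorted here to match the descending convention above. The sign of each eigenvector is arbitrary, so the columns may differ from $v_1$, $v_2$, $v_3$ by a factor of $-1$.

```python
import numpy as np

X = np.array([[4., 3., 2.],
              [1., 2., 4.],
              [2., 8., 3.]])
Xc = X - X.mean(axis=1, keepdims=True)
C = (Xc @ Xc.T) / X.shape[1]

# eigh handles symmetric matrices; eigenvalues come back in ascending order
eigvals, eigvecs = np.linalg.eigh(C)

# Re-sort so that lambda1 >= lambda2 >= lambda3
order = np.argsort(eigvals)[::-1]
lam = eigvals[order]          # ~[6.9073, 2.2038, 0]
V = eigvecs[:, order]         # columns are v1, v2, v3 (up to sign)
print(lam)
print(V)
```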

The projected matrix (keeping the redundant third dimension, whose eigenvalue is $\lambda_3 = 0$):

$$
X_p = (X - m)^T V = (X - m)^T \begin{bmatrix} v_1 & v_2 & v_3 \end{bmatrix} =
\begin{bmatrix}
-2.3668 & -1.6187 & 0 \\
 3.6653 & -0.3484 & 0 \\
-1.2984 &  1.9671 & 0
\end{bmatrix}
$$

Extra:

In the case of reduced dimension (keeping only the first $L = 2$ eigenvectors):

$$
X_{p,L} = (X - m)^T V_L = (X - m)^T \begin{bmatrix} v_1 & v_2 \end{bmatrix} =
\begin{bmatrix}
-2.3668 & -1.6187 \\
 3.6653 & -0.3484 \\
-1.2984 &  1.9671
\end{bmatrix}
$$
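Both projections can be reproduced with the same ingredients. This is again only a sketch; because eigenvector signs are arbitrary, individual columns of the result may come out negated relative to the values listed above.

```python
import numpy as np

X = np.array([[4., 3., 2.],
              [1., 2., 4.],
              [2., 8., 3.]])
Xc = X - X.mean(axis=1, keepdims=True)
C = (Xc @ Xc.T) / X.shape[1]

eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]
V = eigvecs[:, order]            # columns v1, v2, v3 (up to sign)

# Full projection: one row per data point, one column per principal component
Xp = Xc.T @ V                    # third column is ~0 because lambda3 = 0

# Reduced projection: keep only the first L = 2 principal components
L = 2
Xp_L = Xc.T @ V[:, :L]
print(Xp)
print(Xp_L)
```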
