Multivariate Statistical Methods: Abiyot Negash (Asst. Prof.)
Department of Statistics
Introduction
When |c| > 1, the vector x is expanded; when |c| < 1, it is contracted. When
|c| = 1, its length is unchanged. If c < 0, the direction of x is reversed.
Choosing a = Lx⁻¹, we obtain the unit vector Lx⁻¹ x, which has length 1 and
lies in the direction of x.
Example: If n = 2, consider the vector x = (x1, x2)′. The length of x is
Lx = √(x1² + x2²).
Geometrically, the length of a vector in two dimensions can be viewed as
the hypotenuse of a right triangle.
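The length and unit-vector calculations above can be checked numerically. A minimal sketch using NumPy; the vector (3, 4)′ is an arbitrary illustration, not taken from the slides:

```python
import numpy as np

x = np.array([3.0, 4.0])         # example vector with n = 2
L_x = np.sqrt(np.sum(x**2))      # length: sqrt(x1^2 + x2^2)
unit = x / L_x                   # multiplying by 1/L_x gives a unit vector

print(L_x)                       # 5.0 (the 3-4-5 right triangle)
print(np.linalg.norm(unit))      # 1.0
```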
α1 x1 + α2 x2 = 0 ⇒
3α1 + 2α2 = 0
4α1 + α2 = 0,
which holds only if α1 = α2 = 0.
This confirms that x1 and x2 are linearly independent.
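The independence check can be verified numerically. Reading the coefficients off the two equations gives x1 = (3, 4)′ and x2 = (2, 1)′ (inferred, since the slides do not restate the vectors); the homogeneous system has only the trivial solution iff the matrix with these columns is nonsingular:

```python
import numpy as np

X = np.array([[3.0, 2.0],
              [4.0, 1.0]])       # columns are x1 and x2

# alpha1*x1 + alpha2*x2 = 0 has only the trivial solution iff det(X) != 0
print(np.linalg.det(X))          # -5.0, nonzero -> linearly independent
print(np.linalg.matrix_rank(X))  # 2 (full column rank)
```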
Abiyot. (JU) Multivariate statistical Methods 8
Introduction and Matrix Algebra: Matrix Characteristics
|A⁻¹| = |A|⁻¹. If A and B are the same size and nonsingular, then the inverse
of their product is the product of their inverses in reverse order:
(AB)⁻¹ = B⁻¹A⁻¹
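Both facts can be checked numerically. A minimal NumPy sketch; the random matrices are illustrative and almost surely nonsingular:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))      # random 3x3 matrices; nonsingular
B = rng.normal(size=(3, 3))      # with probability 1

# |A^{-1}| = |A|^{-1}
assert np.isclose(np.linalg.det(np.linalg.inv(A)), 1.0 / np.linalg.det(A))

# (AB)^{-1} = B^{-1} A^{-1}: inverses multiply in reverse order
lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)
assert np.allclose(lhs, rhs)
```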
Kronecker product properties:
(A ⊗ B) ⊗ C = A ⊗ (B ⊗ C)
(A ⊗ B)(C ⊗ D) = (AC) ⊗ (BD)
(A + B) ⊗ C = (A ⊗ C) + (B ⊗ C)
tr (A ⊗ B) = tr (A)tr (B)
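The four properties above can be verified numerically with `np.kron`. A sketch with small random matrices chosen so every product is conformable:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(2, 2))
B = rng.normal(size=(3, 3))
C = rng.normal(size=(2, 2))
D = rng.normal(size=(3, 3))

# associativity: (A (x) B) (x) C = A (x) (B (x) C)
assert np.allclose(np.kron(np.kron(A, B), C), np.kron(A, np.kron(B, C)))
# mixed-product rule: (A (x) B)(C (x) D) = (AC) (x) (BD)
assert np.allclose(np.kron(A, B) @ np.kron(C, D), np.kron(A @ C, B @ D))
# distributivity: (A + C) (x) B = (A (x) B) + (C (x) B)
assert np.allclose(np.kron(A + C, B), np.kron(A, B) + np.kron(C, B))
# trace: tr(A (x) B) = tr(A) tr(B)
assert np.isclose(np.trace(np.kron(A, B)), np.trace(A) * np.trace(B))
```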
A = T′T
Ax = λx
(A − λI)x = 0 ⇒ |A − λI| = 0
(A − λi I)xi = 0 ⇔ Axi = λi xi
For the matrix A = [1  2; −1  4],
|A − λI| = |[1 − λ   2; −1   4 − λ]| = (1 − λ)(4 − λ) + 2 = 0
λ² − 5λ + 6 = (λ − 3)(λ − 2) = 0
from which λ1 = 3 and λ2 = 2. To find the eigenvector corresponding to
λ1 = 3 we use the equation (A − λI)x = 0
[1 − 3   2; −1   4 − 3] (x1, x2)′ = (0, 0)′
−2x1 + 2x2 = 0
−x1 + x2 = 0
The two equations are redundant and reduce to a single equation in two
unknowns, x1 = x2. The solution vector can be written with an arbitrary
constant c:
(x1, x2)′ = x1 (1, 1)′ = c (1, 1)′
If c is set equal to 1/√2 to normalize the eigenvector, we obtain
x1 = (1/√2, 1/√2)′
|A| = Π_{i=1}^{n} λi, i.e., the determinant of A equals the product of its eigenvalues.
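The eigenvalue calculation and the determinant-product identity can be checked numerically for the example matrix A = [1 2; −1 4]:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [-1.0, 4.0]])

lam, V = np.linalg.eig(A)        # eigenvalues 3 and 2 (order may vary)
assert np.allclose(sorted(lam), [2.0, 3.0])

# |A| = product of the eigenvalues: 6 = 3 * 2
assert np.isclose(np.linalg.det(A), np.prod(lam))

# each column of V satisfies A v = lambda v
for i in range(2):
    assert np.allclose(A @ V[:, i], lam[i] * V[:, i])
```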
Ax1 = λ1 x1 ⇔ [1  2; 2  −2] (x11, x21)′ = 2 (x11, x21)′
⇒ x21 = (1/2) x11 ⇒ x1 = (2, 1)′
The normalized eigenvector corresponding to λ1 = 2 is e1 = (2/√5, 1/√5)′. For
λ2 = −3 the corresponding normalized eigenvector is e2 = (1/√5, −2/√5)′.
We need to show A = λ1 e1 e1′ + λ2 e2 e2′
[1  2; 2  −2] = 2 (2/√5, 1/√5)′ (2/√5, 1/√5) − 3 (1/√5, −2/√5)′ (1/√5, −2/√5)
The matrix is thus written as a function of its eigenvalues and normalized
eigenvectors. In matrix form, the spectral decomposition of A is:
A = PΛP′
where P = (e1, e2, · · · , en) and Λ = diag(λ1, λ2, · · · , λn).
Note here that P′P = PP′ = I (P is orthogonal, P⁻¹ = P′). In the
above example,
P = (e1, e2) = [2/√5   1/√5; 1/√5   −2/√5],   Λ = [2   0; 0   −3]
⇒ A = PΛP′
Again, using the spectral decomposition, for a positive definite matrix A,
A⁻¹ = PΛ⁻¹P′
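Both identities can be verified numerically for the example matrix A = [1 2; 2 −2]. Note this A is symmetric but not positive definite (λ2 = −3); the inverse formula still applies because no eigenvalue is zero:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, -2.0]])

s5 = np.sqrt(5.0)
P = np.array([[2/s5, 1/s5],
              [1/s5, -2/s5]])       # columns are e1 and e2
Lam = np.diag([2.0, -3.0])

# spectral decomposition: A = P Lam P'
assert np.allclose(A, P @ Lam @ P.T)
# P is orthogonal: P'P = I
assert np.allclose(P.T @ P, np.eye(2))
# A^{-1} = P Lam^{-1} P' (holds for any symmetric A with no zero eigenvalue)
assert np.allclose(np.linalg.inv(A), P @ np.diag([1/2.0, -1/3.0]) @ P.T)
```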
Singular value decomposition: an n × p matrix A of rank k can be written as
A = UDV′
where U is n × k, D is k × k, and V is p × k.
The diagonal elements of the nonsingular diagonal matrix
D = diag(λ1, λ2, · · · , λk) are the positive square roots of
λ1², λ2², · · · , λk², which are the nonzero eigenvalues of A′A or of AA′.
The values λ1, λ2, · · · , λk are called the singular values of A.
The k columns of U are the normalized eigenvectors of AA′
corresponding to the eigenvalues λ1², λ2², · · · , λk².
The k columns of V are the normalized eigenvectors of A′A
corresponding to the eigenvalues λ1², λ2², · · · , λk².
Example:
A = [3  1  1; −1  3  1]

AA′ = [3  1  1; −1  3  1] [3  −1; 1  3; 1  1] = [11  1; 1  11]   and

A′A = [3  −1; 1  3; 1  1] [3  1  1; −1  3  1] = [10  0  2; 0  10  4; 2  4  2]

The eigenvalues of AA′ are λ1² = 12 and λ2² = 10; A′A has the same nonzero
eigenvalues together with λ3² = 0.

For λ1² = 12, letting x11 = 1 gives x21 = 1, so the normalized eigenvector of
AA′ is u1′ = (1/√2, 1/√2). Similarly, the eigenvector corresponding to
λ2² = 10 is u2′ = (1/√2, −1/√2).
Eigenvector of A′A corresponding to λ1² = 12:
A′A x1 = 12 x1 ⇒ [10  0  2; 0  10  4; 2  4  2] (x11, x21, x31)′ = 12 (x11, x21, x31)′
which gives x31 = x11 and x21 = 2x11; normalizing, v1′ = (1/√6, 2/√6, 1/√6).
Similarly, the normalized eigenvector of A′A corresponding to λ2² = 10 is
v2′ = (2/√5, −1/√5, 0).
Then
A = UDV′ = [1/√2   1/√2; 1/√2   −1/√2] [√12   0; 0   √10] [1/√6   2/√6   1/√6; 2/√5   −1/√5   0] = [3  1  1; −1  3  1]
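The decomposition above can be verified numerically, both by multiplying the factors back together and by cross-checking the singular values against NumPy's SVD routine:

```python
import numpy as np

A = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])

s2, s5, s6 = np.sqrt(2.0), np.sqrt(5.0), np.sqrt(6.0)
U = np.array([[1/s2, 1/s2],
              [1/s2, -1/s2]])             # eigenvectors of AA'
D = np.diag([np.sqrt(12.0), np.sqrt(10.0)])  # singular values
Vt = np.array([[1/s6, 2/s6, 1/s6],
               [2/s5, -1/s5, 0.0]])       # eigenvectors of A'A, as rows

# A = U D V'
assert np.allclose(A, U @ D @ Vt)

# numpy returns singular values in descending order: sqrt(12), sqrt(10)
sv = np.linalg.svd(A, compute_uv=False)
assert np.allclose(sv, [np.sqrt(12.0), np.sqrt(10.0)])
```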
A = λ1 e1 e1′ + λ2 e2 e2′
Hence
x′Ax = λ1 (x′e1)² + λ2 (x′e2)²
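The quadratic-form identity can be checked numerically using the spectral-decomposition example A = [1 2; 2 −2] with λ1 = 2, λ2 = −3; the test vector x is arbitrary:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, -2.0]])
s5 = np.sqrt(5.0)
e1 = np.array([2/s5, 1/s5])      # normalized eigenvector for lambda1 = 2
e2 = np.array([1/s5, -2/s5])     # normalized eigenvector for lambda2 = -3

x = np.array([0.7, -1.3])        # arbitrary test vector
lhs = x @ A @ x                  # x'Ax
rhs = 2.0 * (x @ e1)**2 + (-3.0) * (x @ e2)**2
assert np.isclose(lhs, rhs)
```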
LU (LR) decomposition: