
The Jordan Form of a Nilpotent Matrix

DEFINITION. An n×n matrix (or linear transformation) N is said to be nilpotent if N^k = 0 for some integer k. The smallest number k with N^k = 0 is said to be the order of N.

PROPOSITION. An n×n matrix N is nilpotent if and only if the spectrum σ(N) = {0}. Here σ(N) denotes the spectrum of N, i.e., its set of eigenvalues.

PROOF. Suppose N is nilpotent of order k. If λ ∈ σ(N), then λ^k ∈ σ(N^k) by the Spectral Mapping Theorem. But σ(N^k) = σ(0) = {0}, and so λ^k = 0 and λ = 0. Thus we get that σ(N) = {0}. Conversely, suppose that σ(N) = {0}. Then the characteristic polynomial of N is p_N(λ) = λ^n. However, every matrix satisfies its own characteristic polynomial by the Cayley–Hamilton theorem, so we get N^n = p_N(N) = 0. Q.E.D.

Now let N be an n×n nilpotent matrix over the complex numbers of order k. We find a basis of ℂ^n of the following form

  N^{k(1)−1}q_1, N^{k(1)−2}q_1, …, q_1,  N^{k(2)−1}q_2, N^{k(2)−2}q_2, …, q_2,  …,  N^{k(s)−1}q_s, …, q_s

with

  k = k(1) ≥ k(2) ≥ … ≥ k(s)  and  N^{k(j)}q_j = 0 for all 1 ≤ j ≤ s.

Using this basis, the matrix (or linear transformation) N has the block diagonal form, or Jordan normal (or canonical) form, diag(J_{k(1)}(0), J_{k(2)}(0), …, J_{k(s)}(0)), i.e.,

  C^{−1}NC = diag(J_{k(1)}(0), J_{k(2)}(0), …, J_{k(s)}(0)),

where C is the matrix whose columns are the basis vectors

  N^{k(1)−1}q_1, N^{k(1)−2}q_1, …, q_1,  N^{k(2)−1}q_2, …, q_2,  …,  N^{k(s)−1}q_s, …, q_s,

and for arbitrary λ the matrix J_k(λ) is the k×k matrix

  J_k(λ) = [ λ 1 0 … 0 0 ]
           [ 0 λ 1 … 0 0 ]
           [ 0 0 λ … 0 0 ]
           [ ⋮ ⋮ ⋮ ⋱ ⋮ ⋮ ]
           [ 0 0 0 … λ 1 ]
           [ 0 0 0 … 0 λ ].

First we want to verify that this basis exists, and then we want to see how to calculate it. We also want to decompose a general n×n matrix into a direct sum of Jordan blocks and show that the decomposition is essentially unique. In order to verify the existence of the basis it is convenient


to use induction. Since linear transformations are better suited to induction, we state everything in terms of linear transformations on vector spaces V, which will be subspaces of ℂ^n with the usual hermitian inner product ⟨·,·⟩. All the preceding definitions make sense for linear transformations N. Recall that N^j is the composition of N with itself j times.

DEFINITION. Let T be a linear transformation on the vector space V. A subspace S of V is said to be invariant under T if T(S) ⊆ S.

DEFINITION. Let T be a linear transformation on the vector space V. A subspace S of V is said to be cyclic under T if there is a vector x in S such that {T^j x | j = 0, 1, 2, …} spans S. Note that every cyclic subspace of T is invariant under T.

PROPOSITION. Let T be a linear transformation of the complex vector space V with hermitian inner product ⟨·,·⟩. Then there is a unique linear transformation T* of V with ⟨Tx, y⟩ = ⟨x, T*y⟩ for every x and y in V.

PROPOSITION. Let T be a linear transformation of the vector space V with hermitian inner product ⟨·,·⟩. If S is a subspace of V invariant under T, then S^⊥ is invariant under T*.

PROOF. Let x ∈ S^⊥. We must show that T*x ∈ S^⊥. But given y in S, we have that Ty is in S, and so ⟨T*x, y⟩ = ⟨x, Ty⟩ = 0. So we get that T*(S^⊥) ⊆ S^⊥. Q.E.D.

PROPOSITION. Let K_j and R_j be the subspaces K_j = ker N^j and R_j = range (N*)^j. Then K_j = R_j^⊥ for every j and

  (0) = K_0 ⊂ K_1 ⊂ … ⊂ K_{k−1} ⊂ K_k = ℂ^n,

or equivalently,

  (0) = R_k ⊂ R_{k−1} ⊂ … ⊂ R_1 ⊂ R_0 = ℂ^n,

where the inclusion symbol ⊂ here means strict inclusion.

PROOF. We have in general for any square matrix A that ker A = (ran A*)^⊥, since ⟨A*x, y⟩ = 0 for every x if and only if ⟨x, Ay⟩ = 0 for every x. This means that y is perpendicular to the range of A* if and only if Ay = 0.


Now we have that ker N^j ⊆ ker N^{j+1}, since N^j x = 0 implies N^{j+1}x = 0. Suppose that K_j = K_{j+1} for some 1 ≤ j ≤ k − 1. Then we have that K_j = K_{j+1} = … = K_k = ℂ^n. We just show the first step, K_{j+1} = K_{j+2}. Since K_{j+1} ⊆ K_{j+2}, we just need to show the opposite inclusion. But N^{j+2}x = 0 implies N^{j+1}(Nx) = 0. Then Nx ∈ K_{j+1} = K_j, so N^{j+1}x = N^j(Nx) = 0, i.e., x ∈ K_{j+1}. So we get K_{j+1} = K_{j+2}, and repeating the argument gives K_j = K_k = ℂ^n. But K_j = ℂ^n means N^j = 0 with j < k, contradicting the fact that N has order k. So we get strict inclusion.

Q.E.D.
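As a quick numerical illustration of the chain of kernels (a numpy check; the matrix N below is my own example, not from the notes):

```python
import numpy as np

# N = J_3(0) (+) J_1(0): nilpotent of order k = 3 on C^4.
N = np.array([[0., 1., 0., 0.],
              [0., 0., 1., 0.],
              [0., 0., 0., 0.],
              [0., 0., 0., 0.]])

# sigma(N) = {0}, as the first proposition requires of a nilpotent matrix.
print(np.allclose(np.linalg.eigvals(N), 0))   # True

# dim K_j = 4 - rank(N^j) increases strictly until it reaches n = 4.
dims = [4 - np.linalg.matrix_rank(np.linalg.matrix_power(N, j)) for j in range(4)]
print(dims)  # [0, 2, 3, 4]
```

The strictly increasing dimensions 0 < 2 < 3 < 4 are exactly the chain (0) = K_0 ⊂ K_1 ⊂ K_2 ⊂ K_3 = ℂ^4.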

DEFINITION. The subspace K_j is called the generalized eigenspace of N of order j. A vector x in K_j but not in K_{j−1} is called a generalized eigenvector of N of order j.

The induction step is contained in the next proposition.

PROPOSITION. Let N be a nilpotent linear transformation of order k on the complex vector space V with inner product ⟨·,·⟩, let x be a vector with N^{k−1}x ≠ 0, and let x* be a vector with ⟨N^{k−1}x, x*⟩ ≠ 0. Then each of the sets B = {x, Nx, …, N^{k−1}x} and B* = {x*, N*x*, …, (N*)^{k−1}x*} is linearly independent in V, and if S and S* are the subspaces of V generated by B and B* respectively, then S and (S*)^⊥ are N-invariant and S ⊕ (S*)^⊥ = V.

PROOF. First we see that there is a vector x with N^{k−1}x ≠ 0, since N^{k−1} ≠ 0. We may choose x* = N^{k−1}x if we like, since ⟨v, v⟩ = 0 for any vector v implies v = 0. Now we show that B is a linearly independent set of vectors. Suppose that α_0 x + α_1 Nx + … + α_{k−1}N^{k−1}x = 0. Then 0 = N^{k−1}(α_0 x + α_1 Nx + … + α_{k−1}N^{k−1}x) = α_0 N^{k−1}x, since N^j = 0 for j ≥ k. Since N^{k−1}x ≠ 0, we get α_0 = 0. This means that α_1 Nx + … + α_{k−1}N^{k−1}x = 0, and we can now show that α_1, α_2, … are 0 using N^{k−2}, N^{k−3}, …. So B is a linearly independent set. But now the situation with B* is exactly the same, since N* is nilpotent of order k with (N*)^{k−1} ≠ 0, due to the fact that ⟨x, (N*)^{k−1}x*⟩ = ⟨N^{k−1}x, x*⟩ ≠ 0. So B* is also a linearly independent set of vectors. As we have already remarked, cyclic subspaces are invariant. So S is invariant under N and S*

is invariant under N*. This means that (S*)^⊥ is invariant under N** = N. Now we show that S ⊕ (S*)^⊥ = V. First we show that S ∩ (S*)^⊥ = (0). For this let y ∈ S ∩ (S*)^⊥. The proof that y = 0 resembles the one previously given. Since y is in S, we can write y as y = β_0 x + β_1 Nx + … + β_{k−1}N^{k−1}x. But we have that β_0⟨N^{k−1}x, x*⟩ = ⟨β_0 N^{k−1}x, x*⟩ = ⟨N^{k−1}y, x*⟩ = ⟨y, (N*)^{k−1}x*⟩ = 0, since (N*)^{k−1}x* lies in S*. Since ⟨N^{k−1}x, x*⟩ ≠ 0, we get β_0 = 0. As before, we can work our way through the entire list and show that β_0 = … = β_{k−1} = 0. Consequently, we get y = 0 and S ∩ (S*)^⊥ = (0). Now we show that S + (S*)^⊥ = V. We derive this from the statement S^⊥ ∩ S* = (0), which can be proved in exactly the same way as the statement S ∩ (S*)^⊥ = (0). But if S^⊥ ∩ S* = (0), we get V = (0)^⊥ = (S^⊥ ∩ S*)^⊥ = S^⊥⊥ + (S*)^⊥ = S + (S*)^⊥. So we have shown that V = S ⊕ (S*)^⊥ and that both S and (S*)^⊥ are N-invariant. Q.E.D.

We now decompose the nilpotent matrix N into the direct sum of Jordan blocks. We use induction.

THEOREM. Let N be a nilpotent linear transformation on the vector space V over the complex numbers ℂ. Then there is a basis of V such that the matrix of N with respect to the basis is in Jordan normal form.

PROOF. We use induction on the dimension of V. For dim V = 1, the result is clear, since any nilpotent linear transformation N must then be 0. In fact, if x is a nonzero vector in V, then Nx = αx for some scalar α. Since N^k x = α^k x = 0, we see α^k = 0 and α = 0. Now suppose that the Theorem is true for all vector spaces of dimension less than n, and let N be a nilpotent linear transformation of order k on a vector space V of dimension n. We may assume that V has an inner product, since one may always be defined on a complex vector space. Then there is a vector q_1 in V with N^{k−1}q_1 ≠ 0. Using the notation of the previous Proposition, we can find an N-invariant subspace (S*)^⊥ of V such that S ⊕ (S*)^⊥ = V, where S is the N-invariant subspace with basis {q_1, Nq_1, …, N^{k−1}q_1}.
The space (S*)^⊥ has dimension less than n, since dim S + dim (S*)^⊥ = dim V = n.


The restriction of N to the N-invariant subspace (S*)^⊥ is still nilpotent, of some order k(2) ≤ k = k(1). Using the induction hypothesis, there is a basis of (S*)^⊥ of the form

  N^{k(2)−1}q_2, …, q_2,  …,  N^{k(s)−1}q_s, …, q_s.

Adjoining the basis N^{k(1)−1}q_1, …, Nq_1, q_1 of S gives a basis of V of the form

  N^{k(1)−1}q_1, …, q_1,  N^{k(2)−1}q_2, …, q_2,  …,  N^{k(s)−1}q_s, …, q_s

as desired. Q.E.D.

Jordan Form of an Arbitrary n×n Matrix

We sketch the steps necessary to decompose a general n×n complex matrix A into Jordan normal form. If the spectrum σ(A) consists of the distinct numbers σ(A) = {λ_1, …, λ_m}, we decompose the space ℂ^n into a direct sum of A-invariant subspaces, ℂ^n = S_1 ⊕ … ⊕ S_m, such that, for each i, the matrix A − λ_i is nilpotent on S_i. The amalgamation of the cyclic bases of the S_i for the restrictions of A − λ_i to S_i will then produce the Jordan normal form for A.

PROPOSITION. Let p_1(t), …, p_m(t) be polynomials. Then Σ_i ker p_i(A) = ker p(A), where p(t) is the least common multiple of the p_i(t).

PROOF. First let x ∈ Σ_i ker p_i(A). We may write x = Σ_i x_i, where x_i ∈ ker p_i(A) for each i. For each i, there is a polynomial d_i(t) such that p(t) = p_i(t)d_i(t). So we have that p(A)x = Σ_i p(A)x_i = Σ_i d_i(A)p_i(A)x_i = Σ_i d_i(A)(p_i(A)x_i) = 0. This shows that x ∈ ker p(A) and that Σ_i ker p_i(A) ⊆ ker p(A). We prove the reverse inclusion by induction on the number of polynomials p_i. The essence of the proof is demonstrated for two polynomials p_1 and p_2. Let x be in the kernel of p(A). We have to find x_1 and x_2 in the kernels of p_1(A) and p_2(A) respectively with x_1 + x_2 = x. As before let d_i be a polynomial with p_i d_i = p for i = 1, 2. Then the polynomials d_1 and d_2 are relatively prime, and so there are polynomials q_1 and q_2 with d_1q_1 + d_2q_2 = 1. Let x_1 = d_1(A)q_1(A)x and x_2 = d_2(A)q_2(A)x. Then we get x_1 + x_2 = (d_1(A)q_1(A) + d_2(A)q_2(A))x = x and p_1(A)x_1 = q_1(A)(p_1(A)d_1(A))x = q_1(A)p(A)x = 0, and similarly,


p_2(A)x_2 = 0. So we get that x is in ker p_1(A) + ker p_2(A) and that ker p(A) ⊆ ker p_1(A) + ker p_2(A). Thus, we have verified the Proposition for two polynomials p_1 and p_2. Now suppose that we have verified Σ_i ker p_i(A) = ker p(A) for m − 1 polynomials p_1(t), …, p_{m−1}(t). We verify the Proposition for m polynomials p_1(t), …, p_m(t). We have that lcm(lcm(p_1, …, p_{m−1}), p_m) = lcm(p_1, …, p_m). So we have that

  ker lcm(p_1, …, p_m)(A) = ker lcm(lcm(p_1, …, p_{m−1}), p_m)(A) = ker lcm(p_1, …, p_{m−1})(A) + ker p_m(A) = ker p_1(A) + … + ker p_m(A). Q.E.D.

COROLLARY. If the polynomials p_1(t), …, p_m(t) are pairwise relatively prime and if p = ∏ p_i satisfies p(A) = 0, then ℂ^n = ⊕_i ker p_i(A).

PROOF. We have that p = lcm(p_1, …, p_m) = p_1p_2⋯p_m. So we have that

  ℂ^n = ker p(A) = Σ_i ker p_i(A).

We complete the proof by establishing that the sum Σ_i ker p_i(A) is really a direct sum. We must show that ker p_i(A) ∩ Σ_{j≠i} ker p_j(A) = (0) for every i. With no loss of generality, we may assume that i = 1. Then we have that q = lcm(p_2, …, p_m) = p_2⋯p_m and that p_1 and q are relatively prime. There are polynomials q_1 and q_2 with q_1p_1 + q_2q = 1. Now let x ∈ ker p_1(A) ∩ Σ_{j≠1} ker p_j(A) = ker p_1(A) ∩ ker lcm(p_2, …, p_m)(A) = ker p_1(A) ∩ ker q(A). We have that x = (q_1(A)p_1(A) + q_2(A)q(A))x = q_1(A)p_1(A)x + q_2(A)q(A)x = 0. So we get that ker p_1(A) ∩ Σ_{j≠1} ker p_j(A) = (0). Q.E.D.

Now let p_A(t) be the characteristic polynomial of A. Write p_A(t) = (t − λ_1)^{a_1}⋯(t − λ_m)^{a_m}, where all the λ_i are distinct. The number a_i is the so-called algebraic multiplicity of the eigenvalue λ_i. The minimal polynomial m_A(t) divides p_A(t) and so must have the form m_A(t) = (t − λ_1)^{k_1}⋯(t − λ_m)^{k_m}, where each k_i satisfies 1 ≤ k_i ≤ a_i. To see that k_1 ≠ 0, and likewise that all the k_i ≠ 0, let v be an eigenvector corresponding to λ_1. Assuming that k_1 = 0, we get


0 = m_A(A)v = {(A − λ_2)^{k_2}⋯(A − λ_m)^{k_m}}v = {(λ_1 − λ_2)^{k_2}⋯(λ_1 − λ_m)^{k_m}}v ≠ 0. So we have a contradiction, and we must conclude that k_i ≥ 1.

PROPOSITION. Let m_A(t) = (t − λ_1)^{k_1}⋯(t − λ_m)^{k_m} be the minimal polynomial of A. Then ker (A − λ_i)^{k_i} = range ∏_{j≠i} (A − λ_j)^{k_j} and ℂ^n = ⊕_i ker (A − λ_i)^{k_i}.

PROOF. The polynomials p_i(t) = (t − λ_i)^{k_i} are pairwise relatively prime with p_1(A)⋯p_m(A) = m_A(A) = 0. So ℂ^n = ⊕_i ker (A − λ_i)^{k_i} due to the earlier Corollary. Now we show that ker (A − λ_i)^{k_i} = range ∏_{j≠i} (A − λ_j)^{k_j}. With no loss of generality, we may assume that i = 1. Let q_1 = p_1 and q_2 = p_2⋯p_m. Then q_1 and q_2 are relatively prime. There exist polynomials d_1 and d_2 such that q_1d_1 + q_2d_2 = 1. Now let x ∈ ker q_1(A). We have q_2(A)(d_2(A)x) = (d_1(A)q_1(A) + q_2(A)d_2(A))x = x, and so x ∈ range q_2(A). Conversely, let x ∈ range q_2(A). There is a y with q_2(A)y = x. Then we get q_1(A)x = q_1(A)q_2(A)y = m_A(A)y = 0 and x ∈ ker q_1(A). So we get that ker q_1(A) = range q_2(A). Q.E.D.

Now we are ready for the direct sum decomposition.

THEOREM. Let A be an n×n complex matrix. Let m_A(t) = (t − λ_1)^{k_1}⋯(t − λ_m)^{k_m} be the minimal polynomial of A. Then ℂ^n can be written as the direct sum of the A-invariant subspaces ker (A − λ_i)^{k_i}, viz.,

  ℂ^n = ker (A − λ_1)^{k_1} ⊕ ker (A − λ_2)^{k_2} ⊕ … ⊕ ker (A − λ_m)^{k_m},

and the minimal polynomial of A on each invariant subspace ker (A − λ_i)^{k_i} is (t − λ_i)^{k_i}.

PROOF. We have already obtained the direct sum decomposition. It is clear that each subspace ker (A − λ_i)^{k_i} is A-invariant. We need to show that the minimal polynomial of the restriction A_i of A to the subspace S_i = ker (A − λ_i)^{k_i} is p_i(t) = (t − λ_i)^{k_i}. Again there is no loss of generality in assuming that i = 1. We have that p_1(A)S_1 = (0). This means that the minimal polynomial m_1(t) of A_1 on S_1 must divide p_1. So we get that m_1(t) = (t − λ_1)^{h_1} for some 1 ≤ h_1 ≤ k_1. Suppose that h_1 is actually strictly less than k_1. Then (A − λ_1)^{h_1} would

annihilate S_1, and (A − λ_j)^{k_j} would annihilate S_j for j ≠ 1. So (A − λ_1)^{h_1}(A − λ_2)^{k_2}⋯(A − λ_m)^{k_m} would annihilate ℂ^n = ⊕_i S_i, contradicting the fact that m_A(t) = (t − λ_1)^{k_1}⋯(t − λ_m)^{k_m} is the minimal polynomial of A. So we must have that h_1 = k_1. Q.E.D.

Now we can compute the Jordan normal form of A restricted to one of the subspaces S = S_i. Suppose that λ = λ_i. Then B = A − λ is nilpotent on S of order k = k_i. This means that there is a basis of S, giving a transformation matrix C, such that C^{−1}BC = diag(J_{k(1)}(0), …, J_{k(s)}(0)), where k = k(1) ≥ k(2) ≥ … ≥ k(s). Then A on the subspace has the form C^{−1}AC = C^{−1}(B + λ)C = C^{−1}BC + λ = diag(J_{k(1)}(λ), …, J_{k(s)}(λ)).

We can read off some information from the Jordan normal form of a matrix A. Using an alternative and common notation for the sum of diagonal blocks, let the Jordan normal form of A be given by

  B = C^{−1}AC = J_{k(1,1)}(λ_1) ⊕ … ⊕ J_{k(1,s(1))}(λ_1) ⊕ … ⊕ J_{k(m,1)}(λ_m) ⊕ … ⊕ J_{k(m,s(m))}(λ_m).

Then we see that the first basis vector for each Jordan block J_{k(i,j)}(λ_i) is an eigenvector of B for the eigenvalue λ_i. So we see that the number of Jordan blocks with λ_i is the geometric multiplicity of the eigenvalue λ_i for B. We also have that the sum of all the orders of the Jordan blocks corresponding to λ_i is the multiplicity of the factor t − λ_i in the characteristic polynomial of B, i.e., the algebraic multiplicity of λ_i. We also have that Σ_{i,j} k(i,j) = n. Finally, if m_A(t) = (t − λ_1)^{k_1}⋯(t − λ_m)^{k_m} is the minimal polynomial of A, then for each λ_i there must be at least one Jordan block J_{k(i,1)}(λ_i) with k(i,1) = k_i, and the orders of the other Jordan blocks for λ_i satisfy k_i = k(i,1) ≥ … ≥ k(i,s(i)). Exactly the same statements apply to A itself: since the geometric multiplicity and the algebraic multiplicity do not change under similarity, the information we got from the Jordan normal form applies to A as well.
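These multiplicity relations can be spot-checked on a small example (sympy for exact arithmetic; the 4×4 matrix and its entries are made up for illustration):

```python
import sympy as sp

t = sp.symbols('t')

# A has eigenvalue 2 with blocks J_2(2) and J_1(2), and eigenvalue 3 with block J_1(3).
A = sp.Matrix([[2, 1, 0, 0],
               [0, 2, 0, 0],
               [0, 0, 2, 0],
               [0, 0, 0, 3]])

# Algebraic multiplicity of 2 = multiplicity of (t - 2) in the characteristic polynomial.
print(sp.factor(A.charpoly(t).as_expr()))    # factors as (t - 2)**3 * (t - 3)

# Geometric multiplicity of 2 = dim ker(A - 2I) = number of Jordan blocks for 2.
print(4 - (A - 2 * sp.eye(4)).rank())        # 2

# jordan_form returns C and J with C^{-1} A C = J (blocks in some order).
C, J = A.jordan_form()
print(C.inv() * A * C == J)                  # True
```

So the eigenvalue 2 has algebraic multiplicity 3 but geometric multiplicity 2, which already forces the block structure J_2(2) ⊕ J_1(2).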
Summarizing, we have: σ(A) = {λ_1, …, λ_m}, the λ's appearing in the Jordan blocks; geometric multiplicity of λ_i = s(i); algebraic multiplicity of λ_i = k(i,1) + … + k(i,s(i)); and Σ_{i,j} k(i,j) = n. Using these facts it is sometimes possible to find the Jordan form simply from the minimal polynomial and some deductive reasoning.

Uniqueness of the Jordan Normal Form

As before, the first step is to treat the nilpotent matrices.

PROPOSITION. Let A = J_{n(1)}(0) ⊕ … ⊕ J_{n(s)}(0) and


B = J_{m(1)}(0) ⊕ … ⊕ J_{m(t)}(0) be similar Jordan matrices with n(1) ≥ … ≥ n(s) and m(1) ≥ … ≥ m(t). Then s = t and n(1) = m(1), …, n(s) = m(s).

PROOF. We must have that n(1) = m(1), since n(1) is the smallest number ν with A^ν = 0, m(1) is the smallest with B^ν = 0, and A^ν = 0 if and only if B^ν = 0. Now suppose that the n-list {n(1), …, n(s)} differs from the m-list {m(1), …, m(t)}. Let k be the first index where the lists differ. There is no loss of generality in the assumption that n(1) = m(1), …, n(k−1) = m(k−1), n(k) > m(k). Let m = m(k). Then we have that

  A^m = J_{n(1)}(0)^m ⊕ … ⊕ J_{n(k−1)}(0)^m ⊕ J_{n(k)}(0)^m ⊕ …

and

  B^m = J_{n(1)}(0)^m ⊕ … ⊕ J_{n(k−1)}(0)^m ⊕ J_{m(k)}(0)^m ⊕ … = J_{n(1)}(0)^m ⊕ … ⊕ J_{n(k−1)}(0)^m ⊕ 0 ⊕ … ⊕ 0,

since m(j) ≤ m(k) = m for j ≥ k. We have that J_{n(k)}(0)^m ≠ 0, since m < n(k) and t^{n(k)} is the minimal polynomial of J_{n(k)}(0). So we see that rank A^m > rank B^m. Since A^m and B^m are similar, we must have rank A^m = rank B^m. So we have reached a contradiction, and the n-list is equal to the m-list. Q.E.D.

THEOREM. Suppose that the Jordan matrices

  A = J_{n(1,1)}(λ_1) ⊕ … ⊕ J_{n(1,t(1))}(λ_1) ⊕ … ⊕ J_{n(v,1)}(λ_v) ⊕ … ⊕ J_{n(v,t(v))}(λ_v)

and

  B = J_{m(1,1)}(μ_1) ⊕ … ⊕ J_{m(1,s(1))}(μ_1) ⊕ … ⊕ J_{m(u,1)}(μ_u) ⊕ … ⊕ J_{m(u,s(u))}(μ_u)

are similar, with all the λ_i distinct and all the μ_i distinct. Then u = v and there is a permutation π of {1, …, u} such that λ_i = μ_{π(i)} for all i. Furthermore, if the n(i,j) and m(i,j) are ordered so that

  n(i,1) ≥ … ≥ n(i,t(i))  and  m(i,1) ≥ … ≥ m(i,s(i))

for every i, then s(i) = t(i) for every i = 1, …, u and n(i,j) = m(i,j) for every 1 ≤ i ≤ u and every 1 ≤ j ≤ s(i).
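The rank comparison used in the preceding proof can be turned into a recipe: for a nilpotent Jordan matrix A, rank A^{m−1} − rank A^m counts the Jordan blocks of size at least m, so the ranks of the powers pin down the block sizes. A small numpy check (the helper nilpotent_jordan is my own, not from the notes):

```python
import numpy as np

def nilpotent_jordan(sizes):
    """Direct sum of nilpotent Jordan blocks J_{sizes[0]}(0), J_{sizes[1]}(0), ...
    (illustrative helper)."""
    n = sum(sizes)
    A = np.zeros((n, n))
    pos = 0
    for s in sizes:
        A[pos:pos + s, pos:pos + s] = np.eye(s, k=1)  # 1's on the superdiagonal
        pos += s
    return A

A = nilpotent_jordan([3, 2, 2, 1])
n = A.shape[0]
ranks = [np.linalg.matrix_rank(np.linalg.matrix_power(A, m)) for m in range(n + 1)]
# rank A^{m-1} - rank A^m = number of Jordan blocks of size >= m:
blocks = [ranks[m - 1] - ranks[m] for m in range(1, n + 1)]
print(blocks)  # [4, 3, 1, 0, 0, 0, 0, 0]
```

Reading off the differences: 4 blocks of size ≥ 1, 3 of size ≥ 2, 1 of size ≥ 3 — exactly the sizes 3, 2, 2, 1 we built in, and no other block list produces the same rank sequence.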


PROOF. Note that {λ_1, …, λ_v} = σ(A) = σ(B) = {μ_1, …, μ_u}. Since all the λ_i (respectively, all the μ_i) are assumed to be distinct, we get v = u and λ_i = μ_{π(i)} for all i, where π is a permutation of {1, …, u}. To avoid carrying the π along, we simply identify the λ_i with the μ_i. Now we use induction on the size of the Jordan matrix. We write

  A′ = J_{n(1,1)}(λ_1) ⊕ … ⊕ J_{n(1,t(1))}(λ_1),
  A″ = J_{n(2,1)}(λ_2) ⊕ … ⊕ J_{n(2,t(2))}(λ_2) ⊕ … ⊕ J_{n(v,1)}(λ_v) ⊕ … ⊕ J_{n(v,t(v))}(λ_v)

and

  B′ = J_{m(1,1)}(λ_1) ⊕ … ⊕ J_{m(1,s(1))}(λ_1),
  B″ = J_{m(2,1)}(λ_2) ⊕ … ⊕ J_{m(2,s(2))}(λ_2) ⊕ … ⊕ J_{m(v,1)}(λ_v) ⊕ … ⊕ J_{m(v,s(v))}(λ_v).

We then have that A = A′ ⊕ A″ and B = B′ ⊕ B″. We show that A′ ~ B′ and A″ ~ B″. The induction will then show that A′ = B′ and A″ = B″. Note that the first stage of the induction is contained in the previous Proposition. To prove the similarity, notice that the sizes of the blocks A′ and B′ are the same and equal to the algebraic multiplicity a_1 of the eigenvalue λ_1. So the blocks A″ and B″ must also have the same size. So we can write the transform matrix C that implements the similarity between A and B, viz., C^{−1}AC = B, in terms of blocks as

  C = [ C_11  C_12 ]
      [ C_21  C_22 ].

We complete the proof by showing that C_12 = 0 and C_21 = 0, so that the square matrices C_11 and C_22 are invertible with C_11^{−1}A′C_11 = B′ and C_22^{−1}A″C_22 = B″, which will allow the induction hypothesis to work. Note that A − λ_1 = (A′ − λ_1) ⊕ (A″ − λ_1) and that

  (A − λ_1)^{n(1,1)} = (A′ − λ_1)^{n(1,1)} ⊕ (A″ − λ_1)^{n(1,1)} = 0 ⊕ (A″ − λ_1)^{n(1,1)}

has rank n − a_1, since (A″ − λ_1)^{n(1,1)} is invertible. But (B − λ_1)^{n(1,1)} = (B′ − λ_1)^{n(1,1)} ⊕ (B″ − λ_1)^{n(1,1)} must also have the same rank n − a_1, and since (B″ − λ_1)^{n(1,1)} is invertible, this is possible only if (B′ − λ_1)^{n(1,1)} = 0. So we get that m(1,1) ≤ n(1,1), and by reversing the roles of A and B we see that n(1,1) = m(1,1). Setting A‴ = (A″ − λ_1)^{n(1,1)}

and B‴ = (B″ − λ_1)^{n(1,1)}, we obtain invertible matrices A‴ and B‴ satisfying

  [ 0  0  ] [ C_11  C_12 ]   [ C_11  C_12 ] [ 0  0  ]
  [ 0  A‴ ] [ C_21  C_22 ] = [ C_21  C_22 ] [ 0  B‴ ].

So we get that

  A‴C_21 = 0  and  C_12B‴ = 0.

Since A‴ and B‴ are invertible, this means that C_21 and C_12 are both 0 matrices and that C has the form

  C = [ C_11   0   ]
      [  0    C_22 ].

Then both C_11 and C_22 are invertible; C_11 implements the similarity A′ ~ B′, while C_22 implements the similarity A″ ~ B″. Using the induction, we can now assert that the Jordan form is unique. Q.E.D.

Calculation of the Jordan Form of a Nilpotent Matrix

Let A be an n×n nilpotent matrix. We know that the only eigenvalue of A is 0, so the eigenvectors of A are the nonzero vectors in the kernel of A. To calculate the Jordan form of A we find a basis of eigenvectors, or equivalently a basis of the kernel of A. Then we use the following result.

PROPOSITION. Let A be a nilpotent matrix and let S(i) (1 ≤ i ≤ p) be sets of vectors given by S(i) = {A^{m(i)}x_i, …, Ax_i, x_i}, where 0 ≤ m(i) ≤ n. Assume that A^{m(1)}x_1, …, A^{m(p)}x_p are linearly independent eigenvectors of A. Then the set S = ∪_i S(i) is linearly independent.

PROOF. We may assume that m(1) ≥ m(2) ≥ … ≥ m(p) ≥ 0. We note that all the vectors in the set S are nonzero, since the top-level vectors A^{m(i)}x_i are eigenvectors, which are nonzero. Now suppose that
  y = Σ_{i=1}^{p} Σ_{j=0}^{m(i)} α_{ij} A^j x_i = 0.

We have to show that all the α_{ij} = 0. We derive a contradiction from the assumption that some α_{ij} is nonzero. Let


μ = max {m(i) − j | α_{ij} ≠ 0}, and let (s, t) be a pair with m(s) − t = μ and α_{st} ≠ 0. Apply A^μ to y and consider the resulting terms α_{ij}A^{μ+j}x_i. If μ + j < m(i), that is, μ < m(i) − j, then α_{ij} = 0 by the maximality of μ. If μ + j > m(i), then A^{μ+j}x_i = A^{μ+j−m(i)−1}(A·A^{m(i)}x_i) = 0, since A^{m(i)}x_i is an eigenvector of A and the only eigenvalue of A is 0. So the only terms that can survive are those with μ + j = m(i); for each such i, let k(i) be the unique number with m(i) − k(i) = μ. But now we have that

  0 = A^μ y = Σ_i α_{i,k(i)} A^{m(i)} x_i,

where the sum runs over those i with 0 ≤ k(i) ≤ m(i), and the coefficient α_{s,k(s)} = α_{st} is nonzero. This is a vanishing linear combination of a subset of the eigenvectors A^{m(i)}x_i with at least one nonzero coefficient, contradicting the linear independence of the eigenvectors A^{m(i)}x_i in the list. So the set S is a linearly independent set. Q.E.D.

PROPOSITION. In the cyclic subspace Span[x_i, Ax_i, …, A^{m(i)}x_i], the vector A^{m(i)}x_i is, up to scalar multiples of itself, the only eigenvector of the nilpotent matrix A.

PROOF. If


v = α_0 x_i + α_1 Ax_i + … + α_{m(i)} A^{m(i)} x_i is an eigenvector of A, then

  0 = Av = α_0 Ax_i + α_1 A^2 x_i + … + α_{m(i)−1} A^{m(i)} x_i,

so α_0 = … = α_{m(i)−1} = 0 by the linear independence of Ax_i, …, A^{m(i)}x_i, and hence v = α_{m(i)} A^{m(i)} x_i. Q.E.D.

Now, to find a basis that puts the nilpotent matrix A into Jordan form, we find a basis {x_1, …, x_p} of the kernel of A. Then for each i we find a maximal chain of solutions of the recurrence x_{i0} = x_i, x_{i0} = Ax_{i1}, x_{i1} = Ax_{i2}, …. The process of finding solutions stops when the cyclic subspace is filled, i.e., when the equation x_{ij} = Ax_{i,j+1} has no solution. The union S = ∪_i S_i of these chains is the basis that sends A to Jordan form: if C is the matrix with the basis S in its columns, then J = C^{−1}AC.
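The recipe just described can be sketched numerically (a naive numpy sketch, not from the notes: the function jordan_chains and its tolerance handling are my own, and it silently assumes the computed kernel basis is compatible with the chain structure, which a robust implementation would have to arrange when several blocks are present):

```python
import numpy as np

def jordan_chains(A, tol=1e-8):
    """For a nilpotent A: start from a basis of ker A (the eigenvectors) and
    repeatedly solve x_prev = A @ x_next to extend each chain, stopping when
    the equation becomes inconsistent.  Naive floating-point sketch."""
    # Orthonormal basis of ker A: rows of Vh belonging to tiny singular values.
    _, s, Vh = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    chains = []
    for x in Vh[rank:]:
        chain = [x]
        while True:
            # Least-squares candidate for A y = chain[-1]; a large residual
            # means chain[-1] is not in the range of A, so the chain is maximal.
            y, *_ = np.linalg.lstsq(A, chain[-1], rcond=None)
            if np.linalg.norm(A @ y - chain[-1]) > tol:
                break
            chain.append(y)
        chains.append(chain)
    return chains

# One chain of length 4 for the single nilpotent block J_4(0).
A = np.eye(4, k=1)
print([len(c) for c in jordan_chains(A)])  # [4]
```

For exact answers, a computer algebra system (e.g. sympy's Matrix.jordan_form) is the safer route.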


Complements on Jordan Form

PROPOSITION. The Jordan form of the r×r lower triangular Toeplitz matrix

  A = [ α_0      0       0    …   0   0  ]
      [ α_1     α_0      0    …   0   0  ]
      [ α_2     α_1     α_0   …   0   0  ]
      [ ⋮        ⋮       ⋮    ⋱   ⋮   ⋮  ]
      [ α_{r−2} α_{r−3} α_{r−4} … α_0 0  ]
      [ α_{r−1} α_{r−2} α_{r−3} … α_1 α_0 ]

with α_1 ≠ 0 is J_r(α_0).

PROOF. Homework 38.

PROPOSITION. The Jordan form of the matrix A in the preceding proposition with α_1 = … = α_{t−1} = 0 and α_t ≠ 0 is

  J = J_{p+1}(α_0) ⊕ … ⊕ J_{p+1}(α_0) ⊕ J_p(α_0) ⊕ … ⊕ J_p(α_0)

with q blocks of the form J_{p+1}(α_0) and t − q blocks of the form J_p(α_0), where r = pt + q with 0 ≤ q < t.

PROOF. Homework 38.
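Both propositions can be spot-checked with sympy (the helper toeplitz_lower and the particular numbers are mine, chosen for illustration):

```python
import sympy as sp

def toeplitz_lower(coeffs):
    """r x r lower triangular Toeplitz matrix with coeffs[d] on the d-th
    subdiagonal (illustrative helper)."""
    r = len(coeffs)
    return sp.Matrix(r, r, lambda i, j: coeffs[i - j] if i >= j else 0)

# First proposition: alpha_1 != 0 forces the single block J_4(2).
A = toeplitz_lower([2, 3, 5, 7])          # alpha_0 = 2, alpha_1 = 3, r = 4
_, J = A.jordan_form()
print(J)                                   # the single 4x4 block J_4(2)

# Second proposition: alpha_1 = alpha_2 = 0, alpha_3 != 0 give t = 3; with
# r = 7 = 2*3 + 1 we get p = 2, q = 1: one block J_3(2) and two blocks J_2(2).
B = toeplitz_lower([2, 0, 0, 4, 0, 0, 0])
_, J2 = B.jordan_form()
print(7 - sum(1 for i in range(6) if J2[i, i + 1] == 1))   # 3 (the block count)
```

Counting the blocks as 7 minus the number of superdiagonal 1's works regardless of the order in which sympy lists the blocks.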

