CHAPTER 20

Linear Operators on Inner Product Spaces

This chapter investigates the space A(V) of linear operators T on an inner product space V. [See Chapter 14.] Thus the base field K is either the real field R or the complex field C; in fact, different terminology will be used for the real case and for the complex case. We also use the fact that the inner product on Euclidean space R^n may be defined by ⟨u, v⟩ = u^T v, and that the inner product on complex Euclidean space C^n may be defined by ⟨u, v⟩ = u^T v̄, where u and v are column vectors.

20.1 ADJOINT OPERATORS

20.1 Define the adjoint operator.

A linear operator T on an inner product space V is said to have an adjoint operator T* on V if ⟨T(u), v⟩ = ⟨u, T*(v)⟩ for every u, v ∈ V.

20.2 Let A be a real n-square matrix viewed as a linear operator on R^n. Show that A^T is the adjoint of A.

For every u, v ∈ R^n, ⟨Au, v⟩ = (Au)^T v = u^T A^T v = ⟨u, A^T v⟩. Thus A^T is the adjoint of A.

20.3 Let B be a complex n-square matrix viewed as a linear operator on C^n. Show that B* is the adjoint of B [where B* is the conjugate transpose of B].

For every u, v ∈ C^n, ⟨Bu, v⟩ = (Bu)^T v̄ = u^T B^T v̄ = u^T (B*v)‾ = ⟨u, B*v⟩. Thus B* is the adjoint of B.

Remark: The notation B* is used to denote the adjoint of B and, previously, to denote the conjugate transpose of B. Problem 20.3 shows that both give the same result.

Problems 20.4–20.6 refer to three given matrices: complex matrices A and B and a real matrix C.

20.4 Find the adjoint A* of A.

Take the conjugate transpose of A to get A* = Ā^T.

20.5 Find the adjoint B* of B.

Again the conjugate transpose gives the adjoint: B* = B̄^T.

20.6 Find the adjoint C* of C.

Since C is real, the adjoint C* is simply the transpose of C. Thus C* = C^T.

Theorem 20.1: Let T be a linear operator on a finite-dimensional inner product space V over K. Then:
(i) There exists a unique linear operator T* on V such that ⟨T(u), v⟩ = ⟨u, T*(v)⟩ for every u, v ∈ V. [That is, T has an adjoint T*.]
(ii) If A is the matrix representation of T with respect to an orthonormal basis S = {e_i} of V, then the matrix representation of T* in the basis S is the conjugate transpose A* of A [or the transpose A^T of A when K is real].

Theorem 20.1, proved in Problems 20.12–20.13, is the main result in this section.

20.7 Let T be the linear operator on C^3 defined by T(x, y, z) = (2x + iy, y − 5iz, x + (1 − i)y + 3z). Find T*(x, y, z).

First find the matrix A representing T in the usual basis of C^3:

    A = [ 2    i     0  ]
        [ 0    1   −5i  ]
        [ 1   1−i    3  ]

Recall that the usual basis is orthonormal. Thus by Theorem 20.1, the matrix of T* in this basis is the conjugate transpose A* of A:

    A* = [  2    0     1  ]
         [ −i    1   1+i  ]
         [  0   5i     3  ]

Accordingly, T*(x, y, z) = (2x + z, −ix + y + (1 + i)z, 5iy + 3z).

20.8 Let F: R^3 → R^3 be defined by F(x, y, z) = (3x + 4y − 5z, 2x − 6y + 7z, 5x − 9y + z). Find F*(x, y, z).

First find the matrix A representing F in the usual basis of R^3. [Recall the rows of A are the coefficients of x, y, z.] Thus

    A = [ 3    4   −5 ]
        [ 2   −6    7 ]
        [ 5   −9    1 ]

Since the base field is R, the adjoint F* is represented by the transpose A^T of A:

    A^T = [  3    2    5 ]
          [  4   −6   −9 ]
          [ −5    7    1 ]

Then F*(x, y, z) = (3x + 2y + 5z, 4x − 6y − 9z, −5x + 7y + z).

20.9 Let T be the linear operator on C^3 defined by T(x, y, z) = (2x + (1 − i)y, (3 + 2i)x − 4iz, 2ix + (4 − 3i)y − 3z). Find T*(x, y, z).

First find the matrix A representing T in the usual basis of C^3:

    A = [   2     1−i     0  ]
        [ 3+2i     0    −4i  ]
        [  2i     4−3i   −3  ]

Form the conjugate transpose A* of A:

    A* = [  2    3−2i   −2i  ]
         [ 1+i    0    4+3i  ]
         [  0     4i    −3   ]

Thus T*(x, y, z) = (2x + (3 − 2i)y − 2iz, (1 + i)x + (4 + 3i)z, 4iy − 3z).

20.10 Let V be an inner product space. Each u ∈ V determines a mapping û: V → K defined by û(v) = ⟨v, u⟩. Show that û is linear. [Thus û belongs to the dual space V*.]

For any a, b ∈ K and any v₁, v₂ ∈ V, û(av₁ + bv₂) = ⟨av₁ + bv₂, u⟩ = a⟨v₁, u⟩ + b⟨v₂, u⟩ = aû(v₁) + bû(v₂). Thus û is linear; in other words, û is a linear functional on V.

Theorem 20.2: Let φ be a linear functional on a finite-dimensional inner product space V.
Then there exists a unique vector u ∈ V such that φ(v) = ⟨v, u⟩ for every v ∈ V.

20.11 Prove Theorem 20.2, which is the converse of Problem 20.10 and which need not be true for vector spaces of infinite dimension.

Let {e₁, ..., e_n} be an orthonormal basis of V. Set u = φ(e₁)‾e₁ + φ(e₂)‾e₂ + ⋯ + φ(e_n)‾e_n [the bars denoting complex conjugation]. Let û be the linear functional on V defined by û(v) = ⟨v, u⟩ for every v ∈ V. Then, for i = 1, ..., n,

    û(e_i) = ⟨e_i, u⟩ = ⟨e_i, φ(e₁)‾e₁ + ⋯ + φ(e_n)‾e_n⟩ = φ(e_i).

Since û and φ agree on each basis vector, û = φ.

Now suppose u′ is another vector in V for which φ(v) = ⟨v, u′⟩ for every v ∈ V. Then ⟨v, u⟩ = ⟨v, u′⟩, or ⟨v, u − u′⟩ = 0. In particular this is true for v = u − u′, and so ⟨u − u′, u − u′⟩ = 0. This yields u − u′ = 0 and u = u′. Thus such a vector u is unique, as claimed.

20.12 Prove (i) of Theorem 20.1.

We first define the mapping T*. Let v be an arbitrary but fixed element of V. The map u ↦ ⟨T(u), v⟩ is a linear functional on V. Hence by Theorem 20.2 there exists a unique element v′ ∈ V such that ⟨T(u), v⟩ = ⟨u, v′⟩ for every u ∈ V. We define T*: V → V by T*(v) = v′. Then ⟨T(u), v⟩ = ⟨u, T*(v)⟩ for every u, v ∈ V.

We next show that T* is linear. For any v₁, v₂ ∈ V and any a, b ∈ K,

    ⟨u, T*(av₁ + bv₂)⟩ = ⟨T(u), av₁ + bv₂⟩ = ā⟨T(u), v₁⟩ + b̄⟨T(u), v₂⟩ = ā⟨u, T*(v₁)⟩ + b̄⟨u, T*(v₂)⟩ = ⟨u, aT*(v₁) + bT*(v₂)⟩.

But this is true for every u ∈ V; hence T*(av₁ + bv₂) = aT*(v₁) + bT*(v₂). Thus T* is linear.

20.13 Prove (ii) of Theorem 20.1.

The matrices A = (a_ij) and B = (b_ij) representing T and T*, respectively, in the basis {e_i} are given by a_ij = ⟨T(e_j), e_i⟩ and b_ij = ⟨T*(e_j), e_i⟩. Hence b_ij = ⟨T*(e_j), e_i⟩, which is the conjugate of ⟨e_i, T*(e_j)⟩ = ⟨T(e_i), e_j⟩ = a_ji. Thus b_ij = ā_ji, so B = A*, as claimed.

Problems 20.14–20.17 prove Theorem 20.3, which summarizes some of the properties of the adjoint.

Theorem 20.3: Let S and T be linear operators on V and let k ∈ K. Then:
(i) (S + T)* = S* + T*
(ii) (kT)* = k̄T*
(iii) (ST)* = T*S*
(iv) (T*)* = T

20.14 Prove (i) of Theorem 20.3.

For any u, v ∈ V,

    ⟨(S + T)(u), v⟩ = ⟨S(u) + T(u), v⟩ = ⟨S(u), v⟩ + ⟨T(u), v⟩ = ⟨u, S*(v)⟩ + ⟨u, T*(v)⟩ = ⟨u, S*(v) + T*(v)⟩ = ⟨u, (S* + T*)(v)⟩.
The uniqueness of the adjoint implies (S + T)* = S* + T*.

20.15 Prove (ii) of Theorem 20.3.

For any u, v ∈ V, ⟨(kT)(u), v⟩ = ⟨kT(u), v⟩ = k⟨T(u), v⟩ = k⟨u, T*(v)⟩ = ⟨u, k̄T*(v)⟩ = ⟨u, (k̄T*)(v)⟩. The uniqueness of the adjoint implies (kT)* = k̄T*.

20.16 Prove (iii) of Theorem 20.3.

For every u, v ∈ V, ⟨(ST)(u), v⟩ = ⟨S(T(u)), v⟩ = ⟨T(u), S*(v)⟩ = ⟨u, T*(S*(v))⟩ = ⟨u, (T*S*)(v)⟩. The uniqueness of the adjoint implies (ST)* = T*S*.

20.17 Prove (iv) of Theorem 20.3.

For every u, v ∈ V, ⟨T*(u), v⟩ is the conjugate of ⟨v, T*(u)⟩ = ⟨T(v), u⟩; hence ⟨T*(u), v⟩ = ⟨u, T(v)⟩. The uniqueness of the adjoint implies (T*)* = T.

20.18 Let T be a linear operator on V, and let W be a T-invariant subspace of V. Show that W⊥ is invariant under T*.

Let u ∈ W⊥. If w ∈ W, then T(w) ∈ W and so ⟨w, T*(u)⟩ = ⟨T(w), u⟩ = 0. Thus T*(u) ∈ W⊥, since it is orthogonal to every w ∈ W. Hence W⊥ is invariant under T*.

20.19 Use the definition of the adjoint to show I* = I.

For every u, v ∈ V, ⟨I(u), v⟩ = ⟨u, v⟩ = ⟨u, I(v)⟩; hence I* = I.

20.20 Use the definition of the adjoint to show 0* = 0.

For every u, v ∈ V, ⟨0(u), v⟩ = ⟨0, v⟩ = 0 = ⟨u, 0⟩ = ⟨u, 0(v)⟩; hence 0* = 0.

20.21 Suppose T is invertible. Show that (T⁻¹)* = (T*)⁻¹.

I = I* = (TT⁻¹)* = (T⁻¹)*T*; hence (T⁻¹)* = (T*)⁻¹.

20.22 Suppose ⟨T(u), v⟩ = 0 for every u, v ∈ V. Show that T = 0.

Set v = T(u). Then ⟨T(u), T(u)⟩ = 0 and hence T(u) = 0, for every u ∈ V. Accordingly, T = 0.

20.23 Suppose V is a complex inner product space and ⟨T(u), u⟩ = 0 for every u ∈ V. Show that T = 0.

By hypothesis, ⟨T(v + w), v + w⟩ = 0 for any v, w ∈ V. Expanding and using ⟨T(v), v⟩ = 0 and ⟨T(w), w⟩ = 0, we obtain

    ⟨T(v), w⟩ + ⟨T(w), v⟩ = 0     (1)

Note w is arbitrary in (1). Substituting iw for w, and using ⟨T(v), iw⟩ = ī⟨T(v), w⟩ = −i⟨T(v), w⟩ and ⟨T(iw), v⟩ = ⟨iT(w), v⟩ = i⟨T(w), v⟩, we get −i⟨T(v), w⟩ + i⟨T(w), v⟩ = 0. Dividing through by i and adding to (1), we obtain 2⟨T(w), v⟩ = 0, that is, ⟨T(w), v⟩ = 0 for any v, w ∈ V. Accordingly, T = 0 [by Problem 20.22].

20.24 Show that Problem 20.23 does not hold for a real space V; i.e., give an example of an operator T on a real space V for which ⟨T(u), u⟩ = 0 for every u ∈ V but T ≠ 0.

Let T be the linear operator on R² defined by T(x, y) = (y, −x).
Then, for every u = (x, y) ∈ V, ⟨T(u), u⟩ = ⟨(y, −x), (x, y)⟩ = yx − xy = 0, but T ≠ 0.

Remark: Let A(V) denote the algebra of all linear operators on an inner product space V of finite dimension. The adjoint mapping T ↦ T* on A(V) is analogous to the conjugation mapping z ↦ z̄ on the complex field C, as given in Table 20.1. Observe that the table identifies certain classes of operators T ∈ A(V) whose behavior under the adjoint map imitates the behavior under conjugation of familiar classes of complex numbers. The next few sections investigate these classes of operators. In particular, the analogy between these classes of operators T and complex numbers z is reflected in Theorem 20.4, whose parts are proved later.

Table 20.1

  Class of complex numbers    Behavior under conjugation   Class of operators in A(V)                                  Behavior under the adjoint map
  Real axis                   z̄ = z                        Self-adjoint operators                                      T* = T
                                                             [also called symmetric (real case),
                                                              Hermitian (complex case)]
  Unit circle (|z| = 1)       z̄ = z⁻¹                      Orthogonal operators (real case),                           T* = T⁻¹
                                                             unitary operators (complex case)
  Imaginary axis              z̄ = −z                       Skew-adjoint operators                                      T* = −T
                                                             [also called skew-symmetric (real case),
                                                              skew-Hermitian (complex case)]
  Positive half axis (0, ∞)   z = w̄w, w ≠ 0                Positive definite operators                                 T = S*S with S nonsingular

Theorem 20.4: Let λ be an eigenvalue of a linear operator T on V.
(i) If T* = T, then λ is real.
(ii) If T* = T⁻¹, then |λ| = 1.
(iii) If T* = −T, then λ is pure imaginary.
(iv) If T = S*S with S nonsingular, then λ is real and positive.

20.2 SELF-ADJOINT OPERATORS, SYMMETRIC OPERATORS

20.25 Define a self-adjoint operator.

An operator T on V is said to be self-adjoint if T* = T. The terms symmetric and Hermitian are also used for self-adjoint operators on V when the base fields are R and C, respectively.

Theorems 20.5–20.8 are the main content of this section.

Theorem 20.5: Suppose T is a self-adjoint operator on V, i.e.,
suppose T* = T, and let λ be an eigenvalue of T. Then λ is real.

Theorem 20.6: Suppose T is self-adjoint, i.e., T* = T. Then eigenvectors of T belonging to distinct eigenvalues are orthogonal.

Theorem 20.7: Let T be a symmetric [self-adjoint] operator on a real finite-dimensional inner product space V. Then there exists an orthonormal basis of V consisting of eigenvectors of T; that is, T can be represented by a diagonal matrix relative to an orthonormal basis.

Theorem 20.8 [Alternate Form of Theorem 20.7]: Let A be a real symmetric matrix. Then there exists an orthogonal matrix P such that B = P⁻¹AP = PᵀAP is diagonal.

20.26 Prove Theorem 20.5.

Let v be a nonzero eigenvector of T belonging to λ, that is, T(v) = λv with v ≠ 0; hence ⟨v, v⟩ is positive. We show that λ⟨v, v⟩ = λ̄⟨v, v⟩:

    λ⟨v, v⟩ = ⟨λv, v⟩ = ⟨T(v), v⟩ = ⟨v, T*(v)⟩ = ⟨v, T(v)⟩ = ⟨v, λv⟩ = λ̄⟨v, v⟩.

But ⟨v, v⟩ ≠ 0; hence λ = λ̄, and so λ is real.

20.27 Prove Theorem 20.6.

Suppose T(v) = λv and T(w) = μw, where λ ≠ μ. We show that λ⟨v, w⟩ = μ⟨v, w⟩:

    λ⟨v, w⟩ = ⟨λv, w⟩ = ⟨T(v), w⟩ = ⟨v, T(w)⟩ = ⟨v, μw⟩ = μ̄⟨v, w⟩ = μ⟨v, w⟩.

[The last step uses the fact that μ is real by Theorem 20.5, so μ̄ = μ.] But λ ≠ μ; hence ⟨v, w⟩ = 0, as claimed.

20.28 Let T be a symmetric operator on a real space V of finite dimension. Show that (a) the characteristic polynomial Δ(t) of T is a product of linear factors [over R]; (b) T has a nonzero eigenvector.

(a) Let A be a matrix representing T relative to an orthonormal basis of V; then A = Aᵀ. Let Δ(t) be the characteristic polynomial of A. Viewing A as a complex self-adjoint operator, the matrix A has only real eigenvalues. Thus Δ(t) = (t − λ₁)(t − λ₂)⋯(t − λₙ), where the λᵢ are all real. In other words, Δ(t) is a product of linear polynomials over R.

(b) By (a), T has at least one [real] eigenvalue. Hence T has a nonzero eigenvector.

20.29 Prove Theorem 20.7.

The proof is by induction on the dimension of V. If dim V = 1, the theorem trivially holds. Now suppose dim V = n > 1. By the preceding problem, there exists a nonzero eigenvector v₁ of T.
Let W be the subspace spanned by v₁, and let u₁ be a unit vector in W, e.g., let u₁ = v₁/‖v₁‖.

Since v₁ is an eigenvector of T, the subspace W of V is invariant under T. Hence W⊥ is invariant under T* = T. Thus the restriction T̂ of T to W⊥ is a symmetric operator. We have dim W⊥ = n − 1, since dim W = 1. By induction, there exists an orthonormal basis {u₂, ..., uₙ} of W⊥ consisting of eigenvectors of T̂ and hence of T. But ⟨u₁, uᵢ⟩ = 0 for i = 2, ..., n, because uᵢ ∈ W⊥. Accordingly {u₁, u₂, ..., uₙ} is an orthonormal set and consists of eigenvectors of T. Thus the theorem is proved.

20.30 Let A = [ 3 −2; −2 3 ]. Find a (real) orthogonal matrix P for which PᵀAP is diagonal.

The characteristic polynomial Δ(t) of A is

    Δ(t) = |tI − A| = t² − 6t + 5 = (t − 5)(t − 1),

and thus the eigenvalues of A are 5 and 1. Substitute t = 5 into the matrix tI − A to obtain the corresponding homogeneous system of linear equations 2x + 2y = 0, 2x + 2y = 0. A nonzero solution is v₁ = (1, −1). Normalize v₁ to find the unit solution u₁ = (1/√2, −1/√2).

Next substitute t = 1 into the matrix tI − A to obtain the corresponding homogeneous system −2x + 2y = 0, 2x − 2y = 0. A nonzero solution is v₂ = (1, 1). Normalize v₂ to find the unit solution u₂ = (1/√2, 1/√2). Finally let P be the matrix whose columns are u₁ and u₂, respectively; then

    P = [  1/√2   1/√2 ]       and       PᵀAP = [ 5   0 ]
        [ −1/√2   1/√2 ]                        [ 0   1 ]

As expected, the diagonal entries of PᵀAP are the eigenvalues of A.

20.31 Let B = [ 5 3; 3 −3 ]. Find a (real) orthogonal matrix P such that PᵀBP is diagonal.

The characteristic polynomial of B is Δ(t) = |tI − B| = t² − tr(B)t + |B| = t² − 2t − 24 = (t − 6)(t + 4). Thus the eigenvalues are λ = 6 and λ = −4. Hence

    PᵀBP = [ 6    0 ]
           [ 0   −4 ]

To find the change-of-basis matrix P, we need to find the corresponding eigenvectors. Subtract λ = 6 down the diagonal of B to obtain the homogeneous system −x + 3y = 0, 3x − 9y = 0. A nonzero solution is v₁ = (3, 1). Subtract λ = −4 down the diagonal of B to obtain the homogeneous system 9x + 3y = 0, 3x + y = 0. A nonzero solution is v₂ = (−1, 3).
[As expected from Theorem 20.6, the vectors v₁ and v₂ are orthogonal.] Normalize v₁ and v₂ to obtain the orthonormal basis u₁ = (3/√10, 1/√10), u₂ = (−1/√10, 3/√10). Then P is the matrix whose columns are u₁ and u₂:

    P = [ 3/√10   −1/√10 ]
        [ 1/√10    3/√10 ]

Problems 20.32–20.38 diagonalize the symmetric matrix

    C = [ 11   −8    4 ]
        [ −8   −1   −2 ]
        [  4   −2   −4 ]

20.32 Find the characteristic polynomial Δ(t) of C.

Δ(t) = t³ − tr(C)t² + (C₁₁ + C₂₂ + C₃₃)t − |C| = t³ − 6t² − 135t − 400. [Here Cᵢᵢ is the cofactor of cᵢᵢ in C = (cᵢⱼ).]

20.33 Find the eigenvalues of C or, in other words, the roots of Δ(t).

If Δ(t) has a rational root, it must divide 400. Testing t = −5 by synthetic division, we get

    −5 |  1   −6   −135   −400
       |      −5     55    400
       ------------------------
          1  −11    −80      0

Thus t + 5 is a factor of Δ(t), and Δ(t) = (t + 5)(t² − 11t − 80) = (t + 5)²(t − 16). Accordingly, the eigenvalues of C are λ = −5 [with multiplicity two] and λ = 16 [with multiplicity one].

20.34 Find orthogonal eigenvectors belonging to the eigenvalue λ = −5.

Subtract λ = −5 down the diagonal of C to obtain the homogeneous system 16x − 8y + 4z = 0, −8x + 4y − 2z = 0, 4x − 2y + z = 0; that is, the single equation 4x − 2y + z = 0. The system has two independent solutions. One solution is v₁ = (0, 1, 2). We seek a second solution v₂ = (a, b, c) which is orthogonal to v₁, i.e., such that 4a − 2b + c = 0 and also b + 2c = 0. One such solution is v₂ = (−5, −8, 4).

20.35 Find an eigenvector v₃ belonging to the eigenvalue λ = 16.

Subtract λ = 16 down the diagonal of C to obtain the homogeneous system −5x − 8y + 4z = 0, −8x − 17y − 2z = 0, 4x − 2y − 20z = 0. This system yields a nonzero solution v₃ = (4, −2, 1). [As expected from Theorem 20.6, the eigenvector v₃ is orthogonal to v₁ and v₂.]

20.36 Find an orthogonal matrix P such that P⁻¹CP is diagonal.

Normalize v₁, v₂, v₃ to obtain the orthonormal basis u₁ = (0, 1/√5, 2/√5), u₂ = (−5/√105, −8/√105, 4/√105), u₃ = (4/√21, −2/√21, 1/√21). Then P is the matrix whose columns are u₁, u₂, u₃. Thus

    P = [   0     −5/√105    4/√21 ]                      [ −5    0    0 ]
        [ 1/√5    −8/√105   −2/√21 ]     and     P⁻¹CP =  [  0   −5    0 ]
        [ 2/√5     4/√105    1/√21 ]                      [  0    0   16 ]

20.37 Consider the quadratic form q(x, y, z) = 11x² − 16xy − y² + 8xz − 4yz − 4z². Find an orthogonal change of coordinates which diagonalizes q.
Since C is the matrix which represents q, use the above matrix P to obtain the required change of coordinates:

    x = −(5/√105)y′ + (4/√21)z′
    y = (1/√5)x′ − (8/√105)y′ − (2/√21)z′
    z = (2/√5)x′ + (4/√105)y′ + (1/√21)z′

Under this change of coordinates, q is transformed into the diagonal form

    q(x′, y′, z′) = −5x′² − 5y′² + 16z′².

20.38 Find the signature of q.

Since there are two negative diagonal entries and one positive diagonal entry, N = 2 and P = 1. Thus sig(q) = P − N = 1 − 2 = −1.

20.39 Suppose T is self-adjoint and ⟨T(u), u⟩ = 0 for every u ∈ V. Show that T = 0.

By Problem 20.23, the result holds for the complex case; hence we need only consider the real case. Expanding ⟨T(v + w), v + w⟩ = 0, we obtain

    ⟨T(v), w⟩ + ⟨T(w), v⟩ = 0     (1)

Since T is self-adjoint and since it is a real space, we have ⟨T(w), v⟩ = ⟨w, T(v)⟩ = ⟨T(v), w⟩. Substituting this into (1), we obtain ⟨T(v), w⟩ = 0 for any v, w ∈ V. Thus T = 0 [by Problem 20.22].

20.40 Suppose T is self-adjoint and T² = 0. Show that T = 0.

For any v ∈ V, we have ‖T(v)‖² = ⟨T(v), T(v)⟩ = ⟨T²(v), v⟩ = ⟨0, v⟩ = 0. Thus ‖T(v)‖ = 0 and hence T(v) = 0. Since T(v) = 0 for every v ∈ V, we have T = 0.

20.41 Show that T*T and TT* are self-adjoint for any operator T on V.

(T*T)* = T*T** = T*T, and hence T*T is self-adjoint. Also, (TT*)* = T**T* = TT*, and hence TT* is self-adjoint.

20.42 Show that T + T* is self-adjoint for any operator T on V.

(T + T*)* = T* + T** = T* + T = T + T*; hence T + T* is self-adjoint.

20.43 Define a skew-adjoint operator.

A linear operator T on V is said to be skew-adjoint if T* = −T.

20.44 Suppose T is skew-adjoint, i.e., suppose T* = −T. Let λ be an eigenvalue of T. Show that λ is pure imaginary, i.e., λ̄ = −λ.

Let v be a nonzero eigenvector of T belonging to λ, that is, T(v) = λv with v ≠ 0; hence ⟨v, v⟩ ≠ 0. We show that λ⟨v, v⟩ = −λ̄⟨v, v⟩:

    λ⟨v, v⟩ = ⟨λv, v⟩ = ⟨T(v), v⟩ = ⟨v, T*(v)⟩ = ⟨v, −T(v)⟩ = ⟨v, −λv⟩ = −λ̄⟨v, v⟩.

But ⟨v, v⟩ ≠ 0; hence λ = −λ̄, and so λ is pure imaginary.

20.45 Show that T − T* is skew-adjoint for any linear operator T on V.

(T − T*)* = T* − T** = T* − T = −(T − T*); hence T − T* is skew-adjoint.
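The adjoint identities of Problems 20.41–20.45 are easy to spot-check numerically, since by Problem 20.3 the adjoint of a complex matrix is its conjugate transpose. The sketch below uses NumPy; the matrices T and S2 are arbitrary, chosen only for illustration.

```python
import numpy as np

# An arbitrary complex matrix standing in for an operator T on C^2;
# its adjoint T* is the conjugate transpose (Problem 20.3).
T = np.array([[2 + 3j, 1 - 1j],
              [5j,     4 + 0j]])
T_adj = T.conj().T

# T*T and TT* are self-adjoint (Problem 20.41).
assert np.allclose((T_adj @ T).conj().T, T_adj @ T)
assert np.allclose((T @ T_adj).conj().T, T @ T_adj)

# T + T* is self-adjoint (Problem 20.42); T - T* is skew-adjoint (Problem 20.45).
S = T + T_adj
U = T - T_adj
assert np.allclose(S.conj().T, S)
assert np.allclose(U.conj().T, -U)

# The adjoint reverses products: (ST)* = T*S* (Problem 20.16).
S2 = np.array([[1j, 2 + 0j], [3 + 0j, 1 - 2j]])
assert np.allclose((S2 @ T).conj().T, T_adj @ S2.conj().T)

print("all adjoint identities verified")
```

The same checks work for real matrices with `.T` in place of `.conj().T`, matching the real/complex terminology used throughout the chapter.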
20.46 Show that any operator T is the sum of a self-adjoint operator and a skew-adjoint operator.

Set S = ½(T + T*) and U = ½(T − T*). Then T = S + U, where S* = [½(T + T*)]* = ½(T* + T**) = ½(T* + T) = S and U* = [½(T − T*)]* = ½(T* − T) = −½(T − T*) = −U; that is, S is self-adjoint and U is skew-adjoint.

20.3 ORTHOGONAL AND UNITARY OPERATORS

20.47 Define a unitary operator and an orthogonal operator.

Let U be an invertible operator on V such that U* = U⁻¹, or equivalently UU* = U*U = I. Then U is said to be orthogonal or unitary according as the underlying field is real or complex. Theorem 20.9, proved in Problem 20.55, gives alternative characterizations of these operators.

Theorem 20.9: The following conditions on an operator U are equivalent:
(i) U* = U⁻¹, that is, UU* = U*U = I [or: U is unitary (orthogonal)].
(ii) U preserves inner products, i.e., for every v, w ∈ V, ⟨U(v), U(w)⟩ = ⟨v, w⟩.
(iii) U preserves lengths, i.e., for every v ∈ V, ‖U(v)‖ = ‖v‖.

An orthogonal operator T need not be symmetric, and so it may not be represented by a diagonal matrix relative to an orthonormal basis. However, such an operator T does have a simple canonical representation, as described in Theorem 20.10, which is proved in Problem 20.58.

Theorem 20.10: Let T be an orthogonal operator on a real inner product space V. Then there exists an orthonormal basis B of V such that the matrix representation of T in the basis B has the form of the matrix in Fig. 20-1. [The reader may recognize that the 2×2 diagonal blocks in Fig. 20-1 represent rotations in the corresponding two-dimensional subspaces.]

[Fig. 20-1: a block-diagonal matrix with entries ±1 and 2×2 rotation blocks [cos θᵢ, −sin θᵢ; sin θᵢ, cos θᵢ] on the diagonal.]

20.48 Suppose T: R³ → R³ is the linear operator which rotates each vector v about the z axis by a fixed angle θ; i.e., suppose T(x, y, z) = (x cos θ − y sin θ, x sin θ + y cos θ, z). Is T orthogonal?

As pictured in Fig. 20-2 [a rotation of v about the z axis], the length (distance from the origin) of v is preserved under the rotation T. Thus T is an orthogonal operator.

20.49 Let F: R³ → R³
be the linear operator which reflects each vector v through the xy plane; i.e., let F(x, y, z) = (x, y, −z). Is F orthogonal?

As pictured in Fig. 20-3 [a reflection of v through the xy plane], the length of v is preserved under the reflection F. Thus F is orthogonal.

20.50 Give an example of a vector space V of infinite dimension and a linear map T: V → V for which Theorem 20.9 does not hold.

Let V be the l₂-space of infinite sequences v = (a₁, a₂, ...) of real numbers aᵢ such that Σaᵢ² < ∞ ...

[Recall that an operator T is said to be positive if T = S*S for some operator S, and positive definite if S may also be taken nonsingular; equivalently, T is positive (positive definite) if T is self-adjoint and ⟨T(u), u⟩ ≥ 0 for every u ∈ V (⟨T(u), u⟩ > 0 for every u ≠ 0 in V).]

Theorem 20.15: A complex 2×2 matrix A = [ a b; c d ] represents a positive [positive definite] operator if and only if A is self-adjoint [that is, A* = A in the complex case and Aᵀ = A in the real case] and a, d, and |A| = ad − bc are nonnegative [positive] real numbers.

Problems 20.61–20.66 apply Theorem 20.15 to six given 2×2 matrices A, B, C, D, E, F, each written in the form [ a b; c d ].

20.61 Is A positive definite? Positive?

Since |A| = 0, A is not positive definite. However, A is positive, since a, d, and |A| are nonnegative.

20.62 Is B positive definite? Positive?

Since a, d, and |B| = 8 are positive, B is positive definite [and hence positive].

20.63 Is C positive definite? Positive?

Since C is not self-adjoint, i.e., C* ≠ C, C is neither positive definite nor positive.

20.64 Is D positive definite? Positive?

Since a = 2, d = 2, and |D| = 3 are positive, D is positive definite [and hence positive].

20.65 Is E positive definite? Positive?

Since |E| = 0, E is not positive definite. However, E is positive, since a, d, and |E| are nonnegative.

20.66 Is F positive definite? Positive?

Since |F| = −3, F is neither positive definite nor positive.

20.67 Suppose T is positive. Let λ be an eigenvalue of T. Show that λ is real and nonnegative.

Since T is positive, T is self-adjoint; hence λ is real. Let v be a nonzero eigenvector of T belonging to λ, that is, T(v) = λv with v ≠ 0; hence ⟨v, v⟩ is positive. Since T is positive, T = S*S for some operator S.
We show that λ⟨v, v⟩ = ⟨S(v), S(v)⟩:

    λ⟨v, v⟩ = ⟨λv, v⟩ = ⟨T(v), v⟩ = ⟨S*S(v), v⟩ = ⟨S(v), S(v)⟩.

Since ⟨S(v), S(v)⟩ is nonnegative and ⟨v, v⟩ is positive, λ is nonnegative, as required.

20.68 Suppose T is positive definite. Let λ be an eigenvalue of T. Show that λ is real and positive.

Since T is positive definite, T is self-adjoint; hence λ is real. Let v be a nonzero eigenvector of T belonging to λ, that is, T(v) = λv with v ≠ 0; hence ⟨v, v⟩ is positive. Since T is positive definite, T = S*S for some nonsingular operator S. Thus S(v) ≠ 0, and hence ⟨S(v), S(v)⟩ is positive. We show that λ⟨v, v⟩ = ⟨S(v), S(v)⟩:

    λ⟨v, v⟩ = ⟨λv, v⟩ = ⟨T(v), v⟩ = ⟨S*S(v), v⟩ = ⟨S(v), S(v)⟩.

Since ⟨v, v⟩ and ⟨S(v), S(v)⟩ are both positive, λ is positive, as required.

20.69 Prove Theorem 20.13. [The theorem asserts that the following conditions on an operator P are equivalent: (i) P = T² for some self-adjoint operator T; (ii) P = S*S for some operator S; (iii) P is self-adjoint and ⟨P(u), u⟩ ≥ 0 for every u ∈ V.]

Suppose (i) holds, i.e., P = T² where T = T*. Then P = TT = T*T, and so (i) implies (ii). Now suppose (ii) holds. Then P* = (S*S)* = S*S** = S*S = P, and so P is self-adjoint. Furthermore, ⟨P(u), u⟩ = ⟨S*S(u), u⟩ = ⟨S(u), S(u)⟩ ≥ 0. Thus (ii) implies (iii), and so it remains to prove that (iii) implies (i).

Now suppose (iii) holds. Since P is self-adjoint, there exists an orthonormal basis {u₁, ..., uₙ} of V consisting of eigenvectors of P; say, P(uᵢ) = λᵢuᵢ. By Problem 20.67, the λᵢ are nonnegative real numbers. Thus √λᵢ is a real number. Let T be the linear operator defined by T(uᵢ) = √λᵢ uᵢ for i = 1, ..., n. Since T is represented by a real diagonal matrix relative to the orthonormal basis {uᵢ}, T is self-adjoint. Moreover, for each i,

    T²(uᵢ) = T(√λᵢ uᵢ) = √λᵢ T(uᵢ) = √λᵢ √λᵢ uᵢ = λᵢuᵢ = P(uᵢ).

Since T² and P agree on a basis of V, P = T². Thus the theorem is proved.
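The construction in the proof of Theorem 20.13, diagonalizing the positive operator P and taking square roots of its nonnegative eigenvalues, can be carried out numerically. A minimal NumPy sketch follows; the matrix S is arbitrary (chosen invertible for illustration), so P = SᵀS is a positive operator as in condition (ii).

```python
import numpy as np

# Build a positive operator P = S*S from an arbitrary invertible real S.
S = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])
P = S.T @ S

# P is self-adjoint, so eigh returns real eigenvalues and an orthonormal
# eigenbasis (Theorem 20.7); positivity makes the eigenvalues nonnegative
# (Problem 20.67), up to floating-point round-off.
eigvals, Q = np.linalg.eigh(P)
assert np.all(eigvals >= -1e-12)

# Define T on the eigenbasis by T(u_i) = sqrt(lambda_i) u_i, as in the proof.
T = Q @ np.diag(np.sqrt(np.clip(eigvals, 0.0, None))) @ Q.T

# T is self-adjoint and T^2 = P, i.e., T is a square root of P.
assert np.allclose(T, T.T)
assert np.allclose(T @ T, P)

print("square root of P verified")
```

Because the eigenvalues of a positive operator are nonnegative, the square roots are real, which is exactly what makes T self-adjoint in the proof above.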
