CHAPTER 1

Review of Some Facts about Matrices, Quadratic Forms, and the Multivariate Normal Distribution

Let $A$ be an $n \times n$ matrix. Denote the determinant of $A$ by $\det(A) = |A|$ and the trace of $A$, that is, the sum of the diagonal elements of $A$, by $\mathrm{tr}(A)$. Then we know that the following conditions are equivalent: (i) $|A| \neq 0$, (ii) $\mathrm{rank}(A) = n$, and (iii) $A^{-1}$ exists. If $A$ is a diagonal matrix or a triangular matrix, then $|A|$ is equal to the product of the diagonal elements of $A$. Also, if $B$ is another $n \times n$ matrix, then $|AB| = |A||B|$ and $\mathrm{tr}(AB) = \mathrm{tr}(BA)$.

1. Orthogonal Matrices

An $n \times n$ matrix $P$ is called an orthogonal matrix if $PP' = P'P = I$, where $P'$ denotes the transpose of the matrix $P$. Then $|PP'| = |P||P'| = |P|^2 = |I| = 1$, so that $|P| = \pm 1$, and $P' = P^{-1}$.

2. Eigenvalues and Eigenvectors

For any $n \times n$ matrix $A$, the roots of the $n$th-degree polynomial equation in $\lambda$, $|\lambda I - A| = 0$, denoted by $\lambda_1, \lambda_2, \ldots, \lambda_n$, are called the eigenvalues of the matrix $A$. The collection $\{\lambda_1, \ldots, \lambda_n\}$ of eigenvalues of $A$ is called the spectrum of $A$. Any nonzero $n \times 1$ vector $x \neq 0$ such that $Ax = \lambda_i x$ is called an eigenvector of $A$ corresponding to the eigenvalue $\lambda_i$. For any diagonal matrix $D = \mathrm{diag}(d_1, \ldots, d_n)$, $|\lambda I - D| = \prod_{i=1}^n (\lambda - d_i) = 0$, so its eigenvalues are $d_1, \ldots, d_n$. Note that for any orthogonal matrix $P$, since $|\lambda I - P'AP| = |P'(\lambda I - A)P| = |P'|\,|\lambda I - A|\,|P| = |\lambda I - A|$, the matrices $A$ and $P'AP$ have the same eigenvalues.

3. Diagonalization of a Symmetric Matrix

For any $n \times n$ symmetric matrix $A$, that is, $A' = A$, there exists an orthogonal matrix $P$ such that $P'AP = \Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$, where the $\lambda_i$ are the eigenvalues of $A$. The corresponding eigenvectors of $A$ are the column vectors of the matrix $P$. To see this, let us denote the $n \times 1$ unit vectors by $e_i$, $i = 1, \ldots, n$, where $e_i$ has a one in the $i$th position and zeros elsewhere. Then $P'AP = \Lambda$ implies that $P'APe_i = \Lambda e_i = \lambda_i e_i$, so that $A(Pe_i) = \lambda_i (Pe_i)$. Hence $p_i = Pe_i$, $i = 1, \ldots, n$, the columns of the matrix $P$, are the eigenvectors of $A$ corresponding to the eigenvalues $\lambda_i$.

4. Spectral Decomposition

Note that from the above relation $P'AP = \Lambda$ we also have that $A = P\Lambda P' = \sum_{i=1}^n \lambda_i p_i p_i'$, with $PP' = \sum_{i=1}^n p_i p_i' = I$. This representation of $A$ is known as the spectral decomposition of the symmetric matrix $A$.
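The diagonalization and spectral decomposition above can be checked numerically. The following is a minimal sketch using numpy (the matrix $A$ is an illustrative example, not taken from the text):

```python
import numpy as np

# A small symmetric matrix A (illustrative values only).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# np.linalg.eigh returns the eigenvalues lam (ascending) and an
# orthogonal matrix P whose columns are the corresponding eigenvectors.
lam, P = np.linalg.eigh(A)

# P'AP = Lambda (diagonalization).
Lambda = P.T @ A @ P

# A = sum_i lam_i p_i p_i' (spectral decomposition).
A_rebuilt = sum(lam[i] * np.outer(P[:, i], P[:, i]) for i in range(2))

print(np.allclose(Lambda, np.diag(lam)))   # diagonalization holds
print(np.allclose(A_rebuilt, A))           # spectral decomposition holds
print(np.allclose(P @ P.T, np.eye(2)))     # P is orthogonal
```

The same code also confirms $\mathrm{tr}(A) = \sum_i \lambda_i$ and $|A| = \prod_i \lambda_i$, since those identities follow directly from $A = P\Lambda P'$.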
Note that from $P'AP = \Lambda$ and previous results, we have that $\mathrm{tr}(A) = \mathrm{tr}(\Lambda) = \sum_{i=1}^n \lambda_i$ and $|A| = |\Lambda| = \prod_{i=1}^n \lambda_i$.

5. Quadratic Forms

For an $n \times n$ constant (symmetric) matrix $A = \{a_{ij}\}$, the quadratic function of the $n$ variables $x$, where $x = (x_1, \ldots, x_n)'$ denotes an $n \times 1$ vector, defined by $Q(x) = x'Ax = \sum_{i=1}^n \sum_{j=1}^n a_{ij} x_i x_j$, is called a quadratic form with matrix $A$. Without loss of generality, we can always take the matrix $A$ of the quadratic form to be symmetric (if $A$ were not symmetric, we could always obtain an equivalent expression with $A$ replaced by the symmetric matrix $\frac{1}{2}(A + A')$ in the quadratic form).

DEFINITION 1.1. The quadratic form $Q(x) = x'Ax$ is called positive definite, and the matrix $A$ is also said to be positive definite, if $Q(x) = x'Ax > 0$ for all $x \neq 0$; and it is positive semidefinite if $x'Ax \geq 0$ for all $x$. (Similar definitions hold for negative definite and negative semidefinite.)

6. Transformation of Quadratic Forms

Let $B$ be a nonsingular $n \times n$ matrix, and consider the nonsingular linear transformation $y = B^{-1}x$, or $x = By$. Then the quadratic form in $x$ can be expressed as a quadratic form in the transformed variables $y$, since $Q(x) = x'Ax = y'B'ABy = Q^*(y)$, where $Q^*(y)$ is a quadratic form in $y$ with matrix $B'AB$. Then we have the following results:

(i) If $B$ is nonsingular, then $Q(x) = x'Ax$ is positive definite (positive semidefinite) if and only if $Q^*(y) = y'(B'AB)y$ is positive definite (positive semidefinite). That is, the definiteness of a quadratic form is invariant under nonsingular linear transformations.

(ii) For any quadratic form $Q(x) = x'Ax$, there exists an orthogonal matrix $P$ and corresponding transformation $y = P'x$ such that $Q(x) = x'Ax = y'\Lambda y = \sum_{i=1}^n \lambda_i y_i^2$, where $\Lambda = P'AP = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$ and the $\lambda_i$ are the eigenvalues of the matrix $A$. This follows from the diagonalization of the symmetric matrix $A$, $P'AP = \Lambda$. It follows from (i) that $A$ is positive definite (positive semidefinite) if and only if all the eigenvalues of $A$ satisfy $\lambda_i > 0$ ($\lambda_i \geq 0$). Also note that $\mathrm{rank}(A) = \mathrm{rank}(\Lambda)$, so that $\mathrm{rank}(A)$ equals the number of nonzero eigenvalues of $A$, and $A$ is nonsingular if and only if $A$ has all nonzero eigenvalues.

(iii) For any quadratic form $Q(x) = x'Ax$, there exists a nonsingular matrix $B$ and corresponding linear transformation $y = B^{-1}x$ such that $Q(x) = x'Ax = y'(B'AB)y = \sum_{i=1}^n d_i y_i^2$, where $d_1, \ldots, d_n$ denote the signs of the eigenvalues of $A$. This is obtained by ordering the eigenvalues of $A$ so that $\lambda_1, \ldots, \lambda_{r_1} > 0$, $\lambda_{r_1+1}, \ldots, \lambda_r < 0$, and $\lambda_{r+1} = \cdots = \lambda_n = 0$, where $r = \mathrm{rank}(A)$. Then with $P$ such that $P'AP = \Lambda$, let $D = \mathrm{diag}(|\lambda_1|, \ldots, |\lambda_r|, 1, \ldots, 1)$ and set $B = PD^{-1/2}$, and so from (ii) we have $Q(x) = x'Ax = y'D^{-1/2}P'APD^{-1/2}y = y'D^*y$, where $D^{-1/2}P'APD^{-1/2} = B'AB = D^* = \mathrm{diag}(d_1, \ldots, d_n)$, with $d_i = 1$ for $i = 1, \ldots, r_1$, $d_i = -1$ for $i = r_1 + 1, \ldots, r$, and $d_i = 0$ for $i = r + 1, \ldots, n$.

Note from (iii) that if $A$ is positive definite, so that all eigenvalues of $A$ are positive, then we have $B'AB = I$, so that any positive definite matrix $A$ can be expressed as $A = CC'$, where $C$ ($= (B')^{-1}$) is an $n \times n$ nonsingular matrix. More generally, if $A$ is positive semidefinite with $\mathrm{rank}(A) = r$, then $A = CC'$ for some $n \times r$ matrix $C$ of rank $r$.

7. Random Vectors

For a random vector $X$ with mean vector $E(X)$ and covariance matrix $\mathrm{Cov}(X)$, and constant matrices $A$ and $B$ and constant vector $b$, the random vector $Z = AX + b$ has mean vector $E(Z) = AE(X) + b$ and covariance matrix $\mathrm{Cov}(Z) = A\,\mathrm{Cov}(X)A'$. More generally, the covariance matrix between the random vectors $AX$ and $BY$ is $A\,\mathrm{Cov}(X, Y)B' = A\Sigma_{XY}B'$.

8. Covariance Matrix Decompositions

The covariance matrix $\Sigma_X$ of a random vector $X$ is a symmetric and positive semidefinite matrix, since $\mathrm{Var}(\ell'X) = \ell'\Sigma_X\ell \geq 0$ for all constant vectors $\ell$. Therefore, $\Sigma_X$ can always be expressed in the form $\Sigma_X = P\Lambda P'$, where $P$ is an orthogonal matrix whose columns are an orthonormal set of eigenvectors of $\Sigma_X$, and $\Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_k)$ is a diagonal matrix whose diagonal elements are the corresponding eigenvalues of $\Sigma_X$. The matrix $\Sigma_X$ also possesses the Cholesky decomposition in the form $\Sigma_X = TDT'$, where $T$ is a lower triangular matrix with ones on the diagonal and $D$ is a diagonal matrix with non-negative diagonal elements. Therefore, by the previous result, the random vector $Z = T^{-1}X$ has diagonal covariance matrix $\mathrm{Cov}(Z) = T^{-1}\Sigma_X (T^{-1})' = D$.

9. The Multivariate Normal Distribution

Let $\Sigma$ denote a $k \times k$ real symmetric positive definite matrix with $(i,j)$th element $\sigma_{ij}$, and let $\mu = (\mu_1, \ldots, \mu_k)'$ be a $k \times 1$ vector. Then the $k$-dimensional random vector $X = (X_1, X_2, \ldots, X_k)'$ is said to have a (nonsingular) multivariate normal distribution with mean vector $\mu$ and (positive definite) covariance matrix $\Sigma = (\sigma_{ij})$, and we say $X$ is distributed as $N(\mu, \Sigma)$, if $X$ has joint p.d.f. of the form

$$f(x) = (2\pi)^{-k/2} |\Sigma|^{-1/2} \exp\{-(x - \mu)'\Sigma^{-1}(x - \mu)/2\} \tag{1.1}$$

for all $x = (x_1, \ldots, x_k)' \in \mathbb{R}^k$, where $|\Sigma|$ denotes the determinant of $\Sigma$. First we want to verify that the function $f(x)$ given in (1.1) is a legitimate p.d.f., and that if $X$ has this p.d.f., then $E(X) = \mu$ and $\mathrm{Cov}(X) = E[(X - \mu)(X - \mu)'] = \Sigma$ are actually the mean vector and covariance matrix of $X$.
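The density (1.1) can be coded directly from the formula. As a quick consistency check (a sketch with illustrative values, not data from the text), note that when $\Sigma$ is diagonal the joint density must factor into a product of univariate normal densities:

```python
import numpy as np

# Direct implementation of the multivariate normal density (1.1).
def mvn_pdf(x, mu, Sigma):
    k = len(mu)
    dev = x - mu
    quad = dev @ np.linalg.solve(Sigma, dev)   # (x-mu)' Sigma^{-1} (x-mu)
    return (2 * np.pi) ** (-k / 2) * np.linalg.det(Sigma) ** (-0.5) * np.exp(-quad / 2)

# Univariate normal density with mean m and variance v.
def norm_pdf(x, m, v):
    return np.exp(-(x - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

# For diagonal Sigma, f(x) = prod_i f_i(x_i): a sanity check of the formula.
mu = np.array([1.0, -2.0])
Sigma = np.diag([4.0, 9.0])
x = np.array([0.5, -1.0])

joint = mvn_pdf(x, mu, Sigma)
product = norm_pdf(x[0], mu[0], 4.0) * norm_pdf(x[1], mu[1], 9.0)
print(joint, product)  # the two values agree
```

This factorization for diagonal $\Sigma$ is exactly the independence structure exploited in the change-of-variables argument that follows.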
To show these facts, let $t = (t_1, \ldots, t_k)'$ denote an arbitrary $k \times 1$ vector of real numbers. We will evaluate the integral (which is recognized as the m.g.f. of $X$ with p.d.f. $f(x)$)

$$\int_{-\infty}^{\infty}\!\!\cdots\!\int_{-\infty}^{\infty} e^{t'x} f(x)\,dx = (2\pi)^{-k/2}|\Sigma|^{-1/2}\int\!\!\cdots\!\int \exp[t'x - (x-\mu)'\Sigma^{-1}(x-\mu)/2]\,dx, \tag{1.2}$$

and then we can set $t = (0, \ldots, 0)' = 0$ to establish that the integral of the function given in (1.1) is equal to 1. By evaluating the integral in (1.2) we will also be obtaining the m.g.f. of $X$, $M(t) = E(e^{t'X})$.

To evaluate the integral in (1.2), first recall that since $\Sigma$ is positive definite, the eigenvalues $\lambda_1, \ldots, \lambda_k$ of $\Sigma$ are positive and there is an orthogonal matrix $P$ such that $P'\Sigma P = \Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_k)$. Hence, $\Sigma^{-1} = (P\Lambda P')^{-1} = P\Lambda^{-1}P'$, with $\Lambda^{-1} = \mathrm{diag}(\lambda_1^{-1}, \ldots, \lambda_k^{-1})$.

Now for the integral in (1.2) we make the change of variables given by the linear transformation $y = P'(x - \mu)$, with $x = Py + \mu$, where $y = (y_1, \ldots, y_k)'$. The Jacobian of this transformation is $J = |P|$, which has absolute value equal to 1 since $P$ is orthogonal. The integral in (1.2) then becomes

$$\exp(t'\mu)(2\pi)^{-k/2}|\Sigma|^{-1/2}\int_{-\infty}^{\infty}\!\!\cdots\!\int_{-\infty}^{\infty} \exp[t'Py - y'P'\Sigma^{-1}Py/2]\,dy = \exp(t'\mu)\prod_{i=1}^k (2\pi\lambda_i)^{-1/2}\int\!\!\cdots\!\int \exp\Big[\sum_{i=1}^k \big(w_i y_i - y_i^2/(2\lambda_i)\big)\Big]\,dy, \tag{1.3}$$

where we have set $w = P't = (w_1, \ldots, w_k)'$ for convenience. Also, we know that $|\Sigma| = |P'|\,|\Sigma|\,|P| = |P'\Sigma P| = |\Lambda| = \prod_{i=1}^k \lambda_i$. Hence the integral in (1.3) can be written as the product of integrals

$$\exp(t'\mu)\prod_{i=1}^k (2\pi\lambda_i)^{-1/2}\int_{-\infty}^{\infty} \exp[w_i y_i - y_i^2/(2\lambda_i)]\,dy_i, \tag{1.4}$$

and each of the integrals in the product can be seen to be equal to the m.g.f. $m_i(w_i) = E(e^{w_i Y_i})$ of a normal r.v. $Y_i$ with mean 0 and variance $\lambda_i$. Hence each integral is equal to $\exp(\lambda_i w_i^2/2)$. Thus the expression in (1.4) equals

$$\exp(t'\mu)\prod_{i=1}^k \exp[\lambda_i w_i^2/2] = \exp(t'\mu)\exp\Big[\sum_{i=1}^k \lambda_i w_i^2/2\Big] = \exp[t'\mu + w'\Lambda w/2] = \exp[t'\mu + t'P\Lambda P't/2] = \exp(t'\mu + t'\Sigma t/2).$$

Hence when we set $t = 0$ we obtain that the integral of the function $f(x)$ in (1.1) is equal to 1. Moreover, we have that the joint m.g.f. of $X = (X_1, \ldots, X_k)'$ is

$$M(t) = E(e^{t'X}) = \exp(t'\mu + t'\Sigma t/2), \tag{1.5}$$

for all $t$. The m.g.f. of each $X_i$ is obtained from (1.5) by setting all $t_j = 0$ except $t_i$, to obtain $M_i(t_i) = E(e^{t_i X_i}) = \exp(t_i\mu_i + t_i^2\sigma_{ii}/2)$. Thus each $X_i$ is $N(\mu_i, \sigma_{ii})$, and it follows directly from the m.g.f. in (1.5) that we have $E(X) = \mu$ and $\mathrm{Cov}(X) = \Sigma$.
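The m.g.f. formula (1.5) can be checked by simulation. The sketch below (illustrative $\mu$, $\Sigma$, and $t$, not from the text) samples $X = \mu + Lz$ with $z \sim N(0, I)$ and $LL' = \Sigma$, so that $\mathrm{Cov}(X) = \Sigma$, and compares the Monte Carlo estimate of $E(e^{t'X})$ with $\exp(t'\mu + t'\Sigma t/2)$:

```python
import numpy as np

# Monte Carlo check of M(t) = exp(t'mu + t'Sigma t / 2) from (1.5).
rng = np.random.default_rng(0)
mu = np.array([1.0, 2.0])
Sigma = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
t = np.array([0.1, 0.1])

# Sample X = mu + L z with z ~ N(0, I); then Cov(X) = L L' = Sigma.
L = np.linalg.cholesky(Sigma)
z = rng.standard_normal((200_000, 2))
X = mu + z @ L.T

mc_mgf = np.exp(X @ t).mean()
theory = np.exp(t @ mu + t @ Sigma @ t / 2)
print(mc_mgf, theory)  # should agree to within Monte Carlo error
```

The Cholesky-based sampler is itself an application of Section 8: a linear transformation of uncorrelated standard normals produces the desired covariance matrix.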
From another point of view, we see from the transformation results given above that the random vector defined by $Y = (Y_1, \ldots, Y_k)' = P'(X - \mu)$ has a distribution such that the $Y_i$ are independent r.v.'s and each $Y_i$ is normal $N(0, \lambda_i)$. Hence we clearly see that $E(Y) = 0$ and $\mathrm{Cov}(Y) = \Lambda$, and in fact $Y$ is normal $N(0, \Lambda)$. Then since $X = PY + \mu$, it follows that $E(X) = PE(Y) + \mu = \mu$ and $\mathrm{Cov}(X) = P\,\mathrm{Cov}(Y)P' = P\Lambda P' = \Sigma$.

10. Properties of the Multivariate Normal Distribution

10.1. Distribution of Linear Combinations. Suppose that $X = (X_1, \ldots, X_k)'$ has a $k$-variate normal $N(\mu, \Sigma)$ distribution, and let $c = (c_1, \ldots, c_k)'$ be a $k \times 1$ vector of constants (not all zero). Then the linear combination $Y = c'X = \sum_{i=1}^k c_i X_i$ has a normal distribution with mean $E(Y) = c'\mu = \sum_{i=1}^k c_i\mu_i$ and variance $\mathrm{Var}(Y) = c'\Sigma c = \sum_{i=1}^k\sum_{j=1}^k c_i c_j \sigma_{ij}$. This follows easily since the m.g.f. of $Y$ is $m_Y(t) = E(e^{tY}) = E(e^{tc'X}) = \exp(tc'\mu + t^2 c'\Sigma c/2)$, which is the m.g.f. of a $N(c'\mu, c'\Sigma c)$ distribution, obtained by setting $t^* = tc$ in the expression (1.5) for $E(e^{t^{*\prime}X})$. Furthermore, it can be seen from m.g.f.'s that $X$ is multivariate normal $N(\mu, \Sigma)$ if and only if $c'X$ is distributed as $N(c'\mu, c'\Sigma c)$ for every $c \neq 0$.

More generally, consider $Q = Q_1 + Q_2$ (so that $A = A_1 + A_2$), where $Q = X'AX$ and $Q_1 = X'A_1X$ are quadratic forms such that $Q/\sigma^2$ is distributed as $\chi^2(r; \delta^2)$ and $Q_1/\sigma^2$ as $\chi^2(r_1; \delta_1^2)$, and suppose that $A_2 = A - A_1$ is positive semidefinite. Then $Q_1$ and $Q_2$ are independent, and (hence) $Q_2/\sigma^2$ is distributed as $\chi^2(r_2; \delta_2^2)$, where $r_2 = r - r_1$ and $\delta_2^2 = \delta^2 - \delta_1^2$.

PROOF. We have $A = A_1 + A_2$, where $A$ and $A_1$ are idempotent by Corollary 2.2, and it is assumed that $A_2 = A - A_1$ is positive semidefinite. Because of this, if $y$ is any vector such that $y'A_2y = 0$, then this implies that $A_2y = 0$. Now for any $y$, with $z = (I - A)y$ we have $z'Az = 0$ since $A$ is idempotent; both $z'A_1z$ and $z'A_2z$ are nonnegative and sum to $z'Az = 0$, so $z'A_2z = 0$, and hence $A_2(I - A)y = 0$ for all $y$. Therefore $A_2(I - A) = 0$, so that $A_2 = A_2A$, and hence also $A_1A = (A - A_2)A = A - A_2 = A_1$. But this implies that $A_1 = A_1A = A_1(A_1 + A_2) = A_1^2 + A_1A_2 = A_1 + A_1A_2$, and therefore we must have $A_1A_2 = 0$ (as well as $A_2A_1 = 0$ by taking transposes). Thus $Q_1$ and $Q_2$ are independent by the earlier independence criterion for quadratic forms. In addition, $A_2^2 = (A - A_1)^2 = A^2 - AA_1 - A_1A + A_1^2 = A - A_1 = A_2$. Hence $A_2$ is an idempotent matrix, and so $Q_2/\sigma^2$ is distributed as $\chi^2(r_2; \delta_2^2)$ by Corollary 2.2, with (in the noncentral case) noncentrality parameter $\delta_2^2 = \mu'A_2\mu/\sigma^2 = \delta^2 - \delta_1^2$. ∎

COCHRAN'S THEOREM. Let $X = (X_1, \ldots, X_n)'$ be normal $N(\mu, \sigma^2 I)$, and let $Q_i = X'A_iX$, $i = 1, \ldots, h$, be quadratic forms such that $X'X = \sum_{i=1}^n X_i^2 = \sum_{i=1}^h Q_i$, that is, $I = \sum_{i=1}^h A_i$. Denote $r_i = \mathrm{rank}(A_i)$, $i = 1, \ldots, h$. Then the $Q_1, \ldots, Q_h$ are mutually independent and $Q_i/\sigma^2$ is distributed as $\chi^2(r_i; \delta_i^2)$, where $\delta_i^2 = \mu'A_i\mu/\sigma^2$, for each $i = 1, \ldots, h$, if and only if $\sum_{i=1}^h r_i = n$.

PROOF. The necessity of the condition $\sum_{i=1}^h r_i = n$ is obvious, since if the $Q_i$ are independent and the $Q_i/\sigma^2$ are chi-square distributed, then $\sum_i Q_i/\sigma^2$ must also be chi-square with $\sum_i r_i$ degrees of freedom, and this must equal $n$ since $\sum_i Q_i/\sigma^2 = X'X/\sigma^2$ is chi-square with $n$ degrees of freedom. For a proof of the sufficiency of the condition, see Hogg and Craig (1978, p. 419).
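The hypotheses of Cochran's theorem can be verified numerically for the classical decomposition $I = A_1 + A_2$ with $A_1 = I - \frac{1}{n}J$ and $A_2 = \frac{1}{n}J$ ($J$ the matrix of ones), which corresponds to $\sum_i X_i^2 = \sum_i (X_i - \bar{X})^2 + n\bar{X}^2$. A short numpy sketch:

```python
import numpy as np

# The two matrices of the decomposition I = A1 + A2.
n = 5
J = np.ones((n, n))
A2 = J / n              # quadratic form n * Xbar^2
A1 = np.eye(n) - A2     # quadratic form sum (X_i - Xbar)^2

# Both matrices are idempotent; for an idempotent matrix rank = trace,
# so the rank condition r1 + r2 = (n - 1) + 1 = n of Cochran's theorem holds.
print(np.allclose(A1 @ A1, A1), np.allclose(A2 @ A2, A2))
r1 = np.linalg.matrix_rank(A1)
r2 = np.linalg.matrix_rank(A2)
print(r1, r2, r1 + r2 == n)

# A1 A2 = 0, consistent with the independence of the two quadratic forms.
print(np.allclose(A1 @ A2, np.zeros((n, n))))
```

Here `n = 5` is arbitrary; the same checks pass for any $n$, since $J^2 = nJ$.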
∎

The following theorem summarizes and extends some of the results which have been presented above.

THEOREM 2.7. Let $X = (X_1, \ldots, X_n)'$ be $N(\mu, \sigma^2 I)$, and let $Q_i = X'A_iX$, $\mathrm{rank}(A_i) = r_i$, $i = 1, \ldots, h$, be such that $X'X = \sum_{i=1}^h Q_i$. Then the following statements are equivalent (any one statement implies each of the others):

(i) $Q_1, \ldots, Q_h$ are mutually independent;
(ii) $Q_i/\sigma^2$ is distributed as $\chi^2(r_i; \delta_i^2)$, $\delta_i^2 = \mu'A_i\mu/\sigma^2$, for $i = 1, \ldots, h$;
(iii) $A_1, \ldots, A_h$ are idempotent matrices, that is, $A_i^2 = A_i$ for $i = 1, \ldots, h$;
(iv) $\sum_{i=1}^h r_i = n$.

PROOF. Condition (i) implies that $Q_i = X'A_iX$ is independent of $\sum_{j \neq i} Q_j = X'(I - A_i)X$, and since $Q_i + X'(I - A_i)X = X'X$, Theorem 2.4 then implies that $A_i^2 = A_i$, and hence that $Q_i/\sigma^2$ is distributed as chi-square by Corollary 2.2. The remaining implications among (i)–(iv) follow in a similar fashion from Corollary 2.2, Theorem 2.4, and Cochran's theorem. ∎

EXAMPLE 2.2. Let $X = (X_1, \ldots, X_n)'$ be $N(\mu, \sigma^2 I)$ with $\mu = \mu\mathbf{1} = (\mu, \ldots, \mu)'$, where $\mathbf{1}$ is the $n \times 1$ vector of ones. Consider the decomposition of the sum of squares

$$\sum_{i=1}^n X_i^2 = \sum_{i=1}^n (X_i - \bar{X})^2 + n\bar{X}^2 = Q_1 + Q_2,$$

where $Q_1 = \sum_i (X_i - \bar{X})^2 = X'X - \frac{1}{n}X'\mathbf{1}\mathbf{1}'X = X'A_1X$ with $A_1 = I - \frac{1}{n}\mathbf{1}\mathbf{1}'$, and $Q_2 = n\bar{X}^2 = X'A_2X$ with $A_2 = \frac{1}{n}\mathbf{1}\mathbf{1}'$. Then $X'X = \sum_{i=1}^n X_i^2 = X'(A_1 + A_2)X$, with $\mathrm{rank}(A_1) = n - 1$ and $\mathrm{rank}(A_2) = 1$. We can easily verify that $A_1^2 = A_1$, $A_2^2 = A_2$, and $A_1A_2 = 0$. Thus, by Cochran's theorem (or Theorem 2.7), $Q_1/\sigma^2 = \sum_i (X_i - \bar{X})^2/\sigma^2$ is distributed as $\chi^2(n-1; \delta_1^2)$ and $Q_2/\sigma^2 = n\bar{X}^2/\sigma^2$ as $\chi^2(1; \delta_2^2)$, and $Q_1$ and $Q_2$ are independent, with $\delta_1^2 = \mu'A_1\mu/\sigma^2 = \mu^2\mathbf{1}'(I - \frac{1}{n}\mathbf{1}\mathbf{1}')\mathbf{1}/\sigma^2 = 0$ and $\delta_2^2 = \mu'A_2\mu/\sigma^2 = n\mu^2/\sigma^2$. Hence, in particular, $\sum_i (X_i - \bar{X})^2/\sigma^2$ is distributed as the (central) chi-square $\chi^2(n-1)$ and is independent of $\bar{X}$.

EXAMPLE 2.3. Let $Y_1, \ldots, Y_n$ be independent normal with common variance $\sigma^2$ and means $\mu_i = z_i'\beta$, where the $z_i$ are $p \times 1$ vectors of fixed known constants and $\beta = (\beta_1, \ldots, \beta_p)'$ is a $p \times 1$ vector of unknown parameters. Thus $Y = (Y_1, \ldots, Y_n)'$ is $N(Z\beta, \sigma^2 I)$, where $Z = (z_1, \ldots, z_n)'$ is the $n \times p$ matrix whose $i$th row equals $z_i'$. Assume $\mathrm{rank}(Z) = p$, so that $(Z'Z)^{-1}$ exists. Then the LS estimator of $\beta$ is $\hat{\beta} = (Z'Z)^{-1}Z'Y$, which is distributed as $N(\beta, \sigma^2(Z'Z)^{-1})$. Consider the quadratic forms $Q_1 = \hat{\beta}'Z'Z\hat{\beta} = Y'Z(Z'Z)^{-1}Z'Y = Y'A_1Y$ and

$$Q_2 = \sum_{i=1}^n (Y_i - z_i'\hat{\beta})^2 = (Y - Z\hat{\beta})'(Y - Z\hat{\beta}) = Y'Y - Y'Z(Z'Z)^{-1}Z'Y = Y'A_2Y,$$

where the last equality follows since $Y'Z\hat{\beta} = Y'Z(Z'Z)^{-1}Z'Y$ and $\hat{\beta}'Z'Z\hat{\beta} = Y'Z(Z'Z)^{-1}(Z'Z)(Z'Z)^{-1}Z'Y = Y'Z(Z'Z)^{-1}Z'Y$.
Now $A_1 = Z(Z'Z)^{-1}Z'$ is symmetric and idempotent ($A_1^2 = A_1$), which is easily seen, and hence also $A_2 = I - A_1$ is symmetric and idempotent, with $A_1A_2 = A_1(I - A_1) = A_1 - A_1^2 = 0$. Thus, with $Y'Y = Q_1 + Q_2$, we know that $Q_1$ and $Q_2$ are independent, and $Q_2/\sigma^2$ is distributed as $\chi^2(n - p)$, since $\delta_2^2 = \beta'Z'A_2Z\beta/\sigma^2 = 0$ because $A_2Z = (I - Z(Z'Z)^{-1}Z')Z = Z - Z = 0$, while $Q_1/\sigma^2$ is distributed as $\chi^2(p; \delta_1^2)$ with $\delta_1^2 = \beta'Z'Z\beta/\sigma^2$. The quantity $Q_2$ is referred to as the residual (or error) sum of squares, often denoted SSE, and $Q_1$ as the regression sum of squares, often denoted SSR.

CHAPTER 3

Least Squares Estimation and Properties of LS Estimators

Consider the linear model $Y_i = x_i'\beta + \epsilon_i$, $i = 1, \ldots, n$, where the $x_i$ are $k \times 1$ vectors of the explanatory variables and the $\epsilon_i$ are random errors which are uncorrelated with mean 0 and variance $\sigma^2$. Letting $Y = (Y_1, \ldots, Y_n)'$ and $X = (x_1, \ldots, x_n)'$, we have the linear model $Y = X\beta + \epsilon$, with the vector of random errors $\epsilon = (\epsilon_1, \ldots, \epsilon_n)'$ distributed such that $E(\epsilon) = 0$ and $\mathrm{Cov}(\epsilon) = E(\epsilon\epsilon') = \sigma^2 I$, and the design matrix $X$ is $n \times k$.

DEFINITION 3.1. A least squares estimator (LSE) $\hat{\beta} = \hat{\beta}(Y)$ of $\beta$ is any $k$-dimensional function of $Y$ which minimizes the sum of squares function

$$S(\beta) = (Y - X\beta)'(Y - X\beta) = \sum_{i=1}^n (Y_i - x_i'\beta)^2$$

with respect to $\beta$, for the given observation vector $Y$.

To minimize, we consider the first-order equations to obtain a minimum, given by $\partial S/\partial\beta = 0$.

DEFINITION 3.2. The equations $\partial S/\partial\beta = 0$ are called the normal equations. Since we have $S(\beta) = Y'Y - 2\beta'X'Y + \beta'X'X\beta$, we obtain $\partial S/\partial\beta = 2X'X\beta - 2X'Y$, so that the normal equations are equivalent to $X'X\beta = X'Y$.

THEOREM 3.1. (i) A solution $\hat{\beta}$ to the normal equations $X'X\beta = X'Y$ always exists. (ii) $\hat{\beta}$ is a LSE of $\beta$ if and only if $\hat{\beta}$ is a solution of the normal equations.

PROOF. (i) Since $\mathrm{range}(X') = \mathrm{range}(X'X)$ and $X'Y \in \mathrm{range}(X')$, there must exist $\hat{\beta}$ such that $X'X\hat{\beta} = X'Y$. (ii) Note that the normal equations $X'X\hat{\beta} = X'Y$ are equivalent to $X'(Y - X\hat{\beta}) = 0$. Hence, if $\hat{\beta}$ is a solution to the normal equations,

$$S(\beta) = (Y - X\beta)'(Y - X\beta) = [(Y - X\hat{\beta}) + X(\hat{\beta} - \beta)]'[(Y - X\hat{\beta}) + X(\hat{\beta} - \beta)] = (Y - X\hat{\beta})'(Y - X\hat{\beta}) + (\hat{\beta} - \beta)'X'X(\hat{\beta} - \beta),$$

since the cross-product term vanishes. Since the second term is nonnegative for any $\beta$, we have $S(\beta) = (Y - X\beta)'(Y - X\beta) \geq (Y - X\hat{\beta})'(Y - X\hat{\beta}) = S(\hat{\beta})$ for any $\beta$, if $\hat{\beta}$ is a solution to the normal equations. The converse also holds by the same argument. ∎

NOTE 1. Now let $\hat{y}$ equal the unique projection of $Y$ onto $R(X) = \mathrm{range}(X)$, so that $Y = \hat{y} + \hat{e}$, where $\hat{e}$ and $X$ are orthogonal, i.e., $X'\hat{e} = 0$, and so also $\hat{y}'\hat{e} = 0$. Then $S(\beta) = |Y - X\beta|^2 = |Y - \hat{y}|^2 + |\hat{y} - X\beta|^2$, since $(Y - \hat{y})$ and $(\hat{y} - X\beta)$ are orthogonal, so that $\hat{\beta}$ is a LSE if and only if $X\hat{\beta} = \hat{y}$, if and only if $\hat{e} = Y - X\hat{\beta}$ is orthogonal to $X$, i.e., $X'(Y - X\hat{\beta}) = 0$, which is equivalent to $X'X\hat{\beta} = X'Y$, so that $\hat{\beta}$ is a LSE if and only if $\hat{\beta}$ is a solution of the normal equations.

Notice that $\hat{y} = X\hat{\beta}$ does not depend on the choice of the LSE $\hat{\beta}$, representing the unique projection of $Y$ in $R(X)$. The minimum value of $S(\beta)$ is $S(\hat{\beta}) = |Y - X\hat{\beta}|^2 = Y'Y - \hat{\beta}'X'Y$, which does not depend on the choice of LSE $\hat{\beta}$. If $X$ is full rank, so that $\mathrm{rank}(X) = k$, then $X'X$ is nonsingular and there is a unique LSE given by $\hat{\beta} = (X'X)^{-1}X'Y$. If $\mathrm{rank}(X) < k$, the general solution of the normal equations can be expressed as $\hat{\beta} = (X'X)^-X'Y + (I - (X'X)^-X'X)z$, where $z$ is an arbitrary vector and $(X'X)^-$ denotes a generalized inverse of $X'X$ (defined by the property that $X'X(X'X)^-X'X = X'X$). Thus in the less than full rank situation there is a "family" of LS estimates. The fact that $\hat{y} = X\hat{\beta}$ does not depend on the particular LSE $\hat{\beta}$ also follows algebraically from the properties that the matrix $P = X(X'X)^-X'$ is unique and does not depend on the choice of generalized inverse $(X'X)^-$, with $P$ representing the unique projection matrix onto $R(X)$. Hence we see that $X(X'X)^-X'X = X$ and $\hat{y} = X\hat{\beta} = X(X'X)^-X'Y = PY$ for any generalized inverse $(X'X)^-$ and any solution $\hat{\beta}$ of the normal equations, so that $\hat{y} = PY$ is the projection of $Y$ onto $R(X) = \mathrm{range}(X)$.

2. Results on LSE and the Gauss–Markov Theorem in Full Rank Case

For the linear model $Y = X\beta + \epsilon$, we assume the random errors $\epsilon = (\epsilon_1, \ldots, \epsilon_n)'$ are "unbiased," so that $E(\epsilon) = 0$, and the $\epsilon_i$ are uncorrelated and have equal variances, so that $\mathrm{Cov}(\epsilon_i, \epsilon_j) = 0$ for $i \neq j$ and $\mathrm{Var}(\epsilon_i) = \sigma^2$, $i = 1, \ldots, n$. We can express the covariance matrix of the vector of errors as $\mathrm{Cov}(\epsilon) = E(\epsilon\epsilon') = \sigma^2 I$. We now consider the case where the $n \times k$ design matrix $X$ is of full rank, $\mathrm{rank}(X) = k$. In the full rank case the LS estimator is unique and is given by $\hat{\beta} = (X'X)^{-1}X'Y$. Hence we have $E(\hat{\beta}) = (X'X)^{-1}X'E(Y) = (X'X)^{-1}X'X\beta = \beta$, and $\mathrm{Cov}(\hat{\beta}) = (X'X)^{-1}X'\mathrm{Cov}(Y)X(X'X)^{-1} = \sigma^2(X'X)^{-1}$. Also, for any linear combination $\lambda = c'\beta = \sum_{i=1}^k c_i\beta_i$, where $c = (c_1, \ldots, c_k)'$, the LS estimator of $\lambda = c'\beta$ is $\hat{\lambda} = c'\hat{\beta}$, with $E(\hat{\lambda}) = c'E(\hat{\beta}) = \lambda$ and $\mathrm{Var}(\hat{\lambda}) = c'\mathrm{Cov}(\hat{\beta})c = \sigma^2 c'(X'X)^{-1}c$.
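In the full rank case the normal equations can be solved directly, and the result coincides with numpy's built-in least squares solver. A minimal sketch with illustrative data (not from the text):

```python
import numpy as np

# Small full-rank design matrix (intercept plus one predictor) and response.
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
Y = np.array([1.0, 2.2, 2.9, 4.1])

# Solve the normal equations X'X beta = X'Y directly.
beta_normal_eq = np.linalg.solve(X.T @ X, X.T @ Y)

# Compare with numpy's least squares routine.
beta_lstsq = np.linalg.lstsq(X, Y, rcond=None)[0]
print(beta_normal_eq, beta_lstsq)

# The residual vector e = Y - X beta_hat is orthogonal to the columns of X,
# i.e., X'e = 0, which is exactly the normal equations restated.
e = Y - X @ beta_normal_eq
print(X.T @ e)  # numerically zero
```

In practice `lstsq` (or a QR factorization) is preferred over forming $X'X$ explicitly, since the normal equations square the condition number of $X$; the direct solve is shown here only to mirror the algebra in the text.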
GAUSS–MARKOV THEOREM (FULL RANK CASE). Let $\hat{\beta} = (X'X)^{-1}X'Y$ and let $\hat{\lambda} = c'\hat{\beta}$ be the LSE of $\lambda = c'\beta$. Then among the class of linear unbiased estimators of $\lambda = c'\beta$, $\hat{\lambda} = c'\hat{\beta}$ is the (unique) minimum variance estimator; that is, if $\tilde{\lambda} = a'Y$ is any linear unbiased estimator of $\lambda$, then $\mathrm{Var}(\tilde{\lambda}) \geq \mathrm{Var}(\hat{\lambda}) = \mathrm{Var}(c'\hat{\beta})$.

PROOF. Suppose $\tilde{\lambda} = a'Y$ is a linear unbiased estimator of $\lambda = c'\beta$. Then $E(a'Y) = a'X\beta = c'\beta$ for all $\beta$ implies that we must have $a'X = c'$. Hence

$$\mathrm{Var}(a'Y) = \mathrm{Var}\big(c'\hat{\beta} + (a'Y - c'\hat{\beta})\big) = \mathrm{Var}(c'\hat{\beta}) + \mathrm{Var}(a'Y - c'\hat{\beta}) + 2\,\mathrm{Cov}(a'Y - c'\hat{\beta},\, c'\hat{\beta}) = \mathrm{Var}(c'\hat{\beta}) + \mathrm{Var}(a'Y - c'\hat{\beta}) \geq \mathrm{Var}(c'\hat{\beta}),$$

with equality if and only if $a'Y = c'\hat{\beta}$. Note that the covariance term in the above is zero since

$$\mathrm{Cov}(a'Y, c'\hat{\beta}) = \mathrm{Cov}\big(a'Y,\, c'(X'X)^{-1}X'Y\big) = \sigma^2 a'X(X'X)^{-1}c = \sigma^2 c'(X'X)^{-1}c = \mathrm{Var}(c'\hat{\beta}),$$

and hence $\mathrm{Cov}(a'Y - c'\hat{\beta}, c'\hat{\beta}) = \mathrm{Cov}(a'Y, c'\hat{\beta}) - \mathrm{Var}(c'\hat{\beta}) = 0$. ∎

As an equivalent but alternative proof, we have

$$\mathrm{Var}(a'Y) - \mathrm{Var}(c'\hat{\beta}) = \sigma^2\big(a'a - c'(X'X)^{-1}c\big) = \sigma^2\big(a'a - a'X(X'X)^{-1}X'a\big) = \sigma^2 a'\big(I - X(X'X)^{-1}X'\big)a = \sigma^2 a'R'Ra \geq 0,$$

since $R = I - X(X'X)^{-1}X'$ is idempotent and symmetric, $R^2 = R = R'R$, and hence $R$ is positive semidefinite.

For the full-rank linear model $Y = X\beta + \epsilon$ with LSE $\hat{\beta} = (X'X)^{-1}X'Y$, the vector of residuals is defined as $\hat{e} = Y - X\hat{\beta} = (I - X(X'X)^{-1}X')Y = (I - P)Y$, where $P = X(X'X)^{-1}X'$ is the projection matrix onto $R(X)$. The residual or error sum of squares is $SSE = \hat{e}'\hat{e} = Y'(I - P)Y = Y'Y - Y'X(X'X)^{-1}X'Y$.

Note that we have the decomposition of the response vector $Y$ as

$$Y = \hat{Y} + \hat{e} = X\hat{\beta} + \hat{e} = PY + (I - P)Y,$$

where $\hat{Y} = X\hat{\beta} = PY$ is the projection of $Y$ onto $R(X)$ and $\hat{e}$ is orthogonal to $X$, i.e., $X'\hat{e} = X'(I - P)Y = 0$. Hence also $\hat{Y}'\hat{e} = \hat{\beta}'X'\hat{e} = 0$. Thus we have $Y'Y = \hat{Y}'\hat{Y} + \hat{e}'\hat{e}$.

RESULT 1. The statistic $S^2 = (Y - X\hat{\beta})'(Y - X\hat{\beta})/(n - k) = SSE/(n - k)$ is an unbiased estimator of $\sigma^2$.

PROOF. We have that $(n - k)S^2 = Y'(I - P)Y$, and so by a previous result,

$$E[(n - k)S^2] = E[Y'(I - P)Y] = \sigma^2\,\mathrm{tr}(I - P) + (X\beta)'(I - P)(X\beta) = \sigma^2\big(\mathrm{tr}(I) - \mathrm{tr}[X(X'X)^{-1}X']\big) + \beta'X'\big(I - X(X'X)^{-1}X'\big)X\beta = \sigma^2\big(n - \mathrm{tr}[(X'X)^{-1}X'X]\big) = \sigma^2(n - k),$$

so that $E(S^2) = \sigma^2$. ∎

3. Distribution Theory of LS Estimation Under Normality

Suppose for the linear model $Y = X\beta + \epsilon$ we assume the random errors $\epsilon_i$ are normally distributed as i.i.d. $N(0, \sigma^2)$, so that $\epsilon = (\epsilon_1, \ldots, \epsilon_n)'$ is distributed as $N(0, \sigma^2 I)$ and hence $Y$ is $N(X\beta, \sigma^2 I)$. Then the following results hold.

THEOREM 3.2. Under the above normal linear model, we have:

1. $\hat{\beta} = (X'X)^{-1}X'Y$ and $SSE = (Y - X\hat{\beta})'(Y - X\hat{\beta})$ form a set of minimal sufficient and complete statistics for the parameters $(\beta', \sigma^2)$.
2. $\hat{\beta}$ is distributed as $N(\beta, \sigma^2(X'X)^{-1})$.
3. $\hat{\beta}$ and $\hat{e} = Y - X\hat{\beta}$ are independent, and hence $\hat{\beta}$ and $S^2$ are independent.
4. $SSE/\sigma^2 = (n - k)S^2/\sigma^2$ is distributed as $\chi^2(n - k)$.
5. $\hat{\beta}$ and $S^2$ are uniformly minimum variance unbiased (UMVU) estimators of $\beta$ and $\sigma^2$, respectively.
6. $\hat{\beta}$ and $SSE/n$ are the maximum likelihood (ML) estimators of $\beta$ and $\sigma^2$, respectively.
7. $\hat{\beta}'X'X\hat{\beta}/\sigma^2$ is distributed as $\chi^2(k; \delta^2)$, with $\delta^2 = \beta'X'X\beta/\sigma^2$, and $(\hat{\beta} - \beta)'X'X(\hat{\beta} - \beta)/\sigma^2$ is distributed as (central) $\chi^2(k)$.

PROOF. 1. The result follows from the theory of exponential families of p.d.f.'s. 2. $Y$ is distributed as $N(X\beta, \sigma^2 I)$ and $\hat{\beta} = (X'X)^{-1}X'Y$ is a linear transformation of $Y$, so $\hat{\beta}$ is normally distributed by a basic result on the multivariate normal, with $E(\hat{\beta}) = \beta$ and $\mathrm{Cov}(\hat{\beta}) = \sigma^2(X'X)^{-1}$. It follows that the LSE $\hat{\lambda} = c'\hat{\beta}$ of $\lambda = c'\beta$ is distributed as $N(\lambda, \sigma^2 c'(X'X)^{-1}c)$, and, with $s(\hat{\lambda}) = S\,(c'(X'X)^{-1}c)^{1/2}$ denoting the estimated standard error of the LSE, $(\hat{\lambda} - \lambda)/s(\hat{\lambda})$ is distributed as $t_{n-k}$. Hence we can form a $100(1 - \alpha)\%$ confidence interval for $\lambda$ as $\hat{\lambda} \pm t_{\alpha/2, n-k}\, s(\hat{\lambda})$. More generally, by the theory for quadratic forms in normal variables, we know that $(\hat{\beta} - \beta)'X'X(\hat{\beta} - \beta)/\sigma^2$ is distributed as (central) $\chi^2(k)$ and is independent of $S^2 = SSE/(n - k)$, which is such that $(n - k)S^2/\sigma^2 = SSE/\sigma^2$ is $\chi^2(n - k)$. Thus the ratio

$$F = \frac{(\hat{\beta} - \beta)'X'X(\hat{\beta} - \beta)/k}{S^2} = \frac{(\hat{\beta} - \beta)'X'X(\hat{\beta} - \beta)}{kS^2}$$

is distributed as $F_{k, n-k}$.

EXAMPLE 3.1. (Simple Linear Regression) Consider the simple linear regression model $Y_i = \beta_0 + \beta_1 x_i + \epsilon_i$, $i = 1, \ldots, n$, where the response variable $Y$ is related linearly to a single predictor variable $x$. In matrix form we have $Y = X\beta + \epsilon$, with $Y = (Y_1, \ldots, Y_n)'$, $\beta = (\beta_0, \beta_1)'$, and

$$X = \begin{pmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1 & x_n \end{pmatrix}, \qquad X'X = \begin{pmatrix} n & \sum_i x_i \\ \sum_i x_i & \sum_i x_i^2 \end{pmatrix}.$$

Then from the LSE $\hat{\beta} = (X'X)^{-1}X'Y$ we obtain

$$\hat{\beta}_1 = \frac{\sum_i x_iY_i - n\bar{x}\bar{Y}}{\sum_i x_i^2 - n\bar{x}^2} = \frac{\sum_i (x_i - \bar{x})(Y_i - \bar{Y})}{\sum_i (x_i - \bar{x})^2}, \qquad \hat{\beta}_0 = \bar{Y} - \hat{\beta}_1\bar{x}.$$

In addition, from $\mathrm{Cov}(\hat{\beta}) = \sigma^2(X'X)^{-1}$ we obtain

$$(X'X)^{-1} = \frac{1}{n\sum_i (x_i - \bar{x})^2}\begin{pmatrix} \sum_i x_i^2 & -n\bar{x} \\ -n\bar{x} & n \end{pmatrix},$$

so that $\mathrm{Var}(\hat{\beta}_1) = \sigma^2\big/\sum_i (x_i - \bar{x})^2$ and $\mathrm{Var}(\hat{\beta}_0) = \sigma^2\big(\frac{1}{n} + \frac{\bar{x}^2}{\sum_i (x_i - \bar{x})^2}\big)$.

The LSE of the mean response $E(Y_0) = \beta_0 + \beta_1 x_0$ at a value $x_0$ of the predictor is given by $\hat{\beta}_0 + \hat{\beta}_1 x_0 = a_0'\hat{\beta}$, with $a_0 = (1, x_0)'$. A useful point to note is that since $\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1\bar{x}$, we can express the above LS estimate of $E(Y_0)$ as $\hat{\beta}_0 + \hat{\beta}_1 x_0 = \bar{Y} + \hat{\beta}_1(x_0 - \bar{x})$, and the r.v.'s $\bar{Y}$ and $\hat{\beta}_1$ are uncorrelated (hence independent under normal errors), so that we easily obtain

$$\mathrm{Var}(\hat{\beta}_0 + \hat{\beta}_1 x_0) = \mathrm{Var}(\bar{Y}) + (x_0 - \bar{x})^2\,\mathrm{Var}(\hat{\beta}_1) = \sigma^2\left[\frac{1}{n} + \frac{(x_0 - \bar{x})^2}{\sum_i (x_i - \bar{x})^2}\right].$$

Under normality, $T = (\hat{\beta}_1 - \beta_1)\big/\big(S/\sqrt{\textstyle\sum_i (x_i - \bar{x})^2}\big)$ is distributed as $t_{n-2}$, so that a $100(1 - \alpha)\%$ confidence interval for $\beta_1$ is $\hat{\beta}_1 \pm t_{\alpha/2, n-2}\, S\big/\sqrt{\sum_i (x_i - \bar{x})^2}$. A $100(1 - \alpha)\%$ confidence interval for the mean response $\lambda_0 = \beta_0 + \beta_1 x_0 = a_0'\beta$, $a_0 = (1, x_0)'$, is given by $(\hat{\beta}_0 + \hat{\beta}_1 x_0) \pm t_{\alpha/2, n-2}\, S\sqrt{v_0}$, with $v_0 = a_0'(X'X)^{-1}a_0 = \frac{1}{n} + \frac{(x_0 - \bar{x})^2}{\sum_i (x_i - \bar{x})^2}$.
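The closed-form estimates of Example 3.1 can be checked against a generic least squares solve. A minimal sketch with illustrative data (not from the text):

```python
import numpy as np

# Illustrative simple linear regression data.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
Y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# Closed-form LS estimates from Example 3.1.
xbar, Ybar = x.mean(), Y.mean()
b1 = np.sum((x - xbar) * (Y - Ybar)) / np.sum((x - xbar) ** 2)
b0 = Ybar - b1 * xbar

# Same fit via the design matrix X = [1, x] and np.linalg.lstsq.
X = np.column_stack([np.ones_like(x), x])
beta_lstsq = np.linalg.lstsq(X, Y, rcond=None)[0]
print(b0, b1, beta_lstsq)

# Unbiased variance estimate S^2 = SSE/(n - 2); the fitted line
# passes through the point (xbar, Ybar).
resid = Y - (b0 + b1 * x)
S2 = resid @ resid / (len(x) - 2)
print(np.isclose(b0 + b1 * xbar, Ybar))
```

The identity $\hat{\beta}_0 + \hat{\beta}_1\bar{x} = \bar{Y}$ holds by construction of $\hat{\beta}_0$, which is exactly why the mean-response estimate can be recentered as $\bar{Y} + \hat{\beta}_1(x_0 - \bar{x})$.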
