LINEAR ALGEBRA
W W L CHEN
© W W L Chen.
This chapter is available free to all individuals, on the understanding that it is not to be used for financial gain,
and may be downloaded and/or photocopied, with or without permission from the author.
However, this document may not be kept on any information storage and retrieval system without permission
from the author, unless such system is not accessible to any individuals other than its owners.
Chapter 5
INTRODUCTION TO
VECTOR SPACES
page 1 of 17
Linear Algebra
Example 5.1.3. Consider the set M2,2 (R) of all 2 × 2 matrices with entries in R. Consider matrix
addition and also multiplication of matrices by real numbers. Denote by O the 2 × 2 null matrix. It is
easy to check that we have the following properties:
(1.1) For every P, Q ∈ M2,2 (R), we have P + Q ∈ M2,2 (R).
(1.2) For every P, Q, R ∈ M2,2 (R), we have P + (Q + R) = (P + Q) + R.
(1.3) For every P ∈ M2,2 (R), we have P + O = O + P = P .
(1.4) For every P ∈ M2,2 (R), we have P + (−P ) = O.
(1.5) For every P, Q ∈ M2,2 (R), we have P + Q = Q + P .
(2.1) For every c ∈ R and P ∈ M2,2 (R), we have cP ∈ M2,2 (R).
(2.2) For every c ∈ R and P, Q ∈ M2,2 (R), we have c(P + Q) = cP + cQ.
(2.3) For every a, b ∈ R and P ∈ M2,2 (R), we have (a + b)P = aP + bP .
(2.4) For every a, b ∈ R and P ∈ M2,2 (R), we have (ab)P = a(bP ).
(2.5) For every P ∈ M2,2 (R), we have 1P = P .
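The properties above can be spot-checked numerically. A minimal sketch using NumPy (the particular matrices and scalars are arbitrary choices for the check, not taken from the text; floating-point checks illustrate but do not prove the axioms):

```python
import numpy as np

rng = np.random.default_rng(0)
P, Q, R = (rng.standard_normal((2, 2)) for _ in range(3))
O = np.zeros((2, 2))          # the 2 x 2 null matrix
a, b, c = 1.5, -2.0, 3.0      # arbitrary scalars

# (1.2) associativity and (1.5) commutativity of matrix addition
assert np.allclose(P + (Q + R), (P + Q) + R)
assert np.allclose(P + Q, Q + P)
# (1.3) null matrix and (1.4) additive inverse
assert np.allclose(P + O, P) and np.allclose(P + (-P), O)
# (2.2)-(2.5) scalar multiplication properties
assert np.allclose(c * (P + Q), c * P + c * Q)
assert np.allclose((a + b) * P, a * P + b * P)
assert np.allclose((a * b) * P, a * (b * P))
assert np.allclose(1 * P, P)
```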
We now turn to an example from the theory of functions.
Example 5.1.4. Consider the set A of all functions of the form f : R → R. For any two functions
f, g ∈ A, define the function f + g : R → R by writing (f + g)(x) = f (x) + g(x) for every x ∈ R. For
every function f ∈ A and every number c ∈ R, define the function cf : R → R by writing (cf )(x) = cf (x)
for every x ∈ R. Denote by φ : R → R the zero function, where φ(x) = 0 for every x ∈ R. Then it is easy to
check that we have the following properties:
(1.1) For every f, g ∈ A, we have f + g ∈ A.
(1.2) For every f, g, h ∈ A, we have f + (g + h) = (f + g) + h.
(1.3) For every f ∈ A, we have f + φ = φ + f = f .
(1.4) For every f ∈ A, we have f + (−f ) = φ.
(1.5) For every f, g ∈ A, we have f + g = g + f .
(2.1) For every c ∈ R and f ∈ A, we have cf ∈ A.
(2.2) For every c ∈ R and f, g ∈ A, we have c(f + g) = cf + cg.
(2.3) For every a, b ∈ R and f ∈ A, we have (a + b)f = af + bf .
(2.4) For every a, b ∈ R and f ∈ A, we have (ab)f = a(bf ).
(2.5) For every f ∈ A, we have 1f = f .
There are many more examples of sets where properties analogous to (1.1)–(1.5) and (2.1)–(2.5) in
the four examples above hold. This apparent similarity leads us to consider an abstract object which
will incorporate all these individual cases as examples. We say that these examples are all vector spaces
over R.
Definition. A vector space V over R, or a real vector space V , is a set of objects, known as vectors,
together with vector addition + and multiplication of vectors by elements of R, satisfying the following
properties:
(VA1) For every u, v ∈ V , we have u + v ∈ V .
(VA2) For every u, v, w ∈ V , we have u + (v + w) = (u + v) + w.
(VA3) There exists an element 0 ∈ V such that for every u ∈ V , we have u + 0 = 0 + u = u.
(VA4) For every u ∈ V , there exists −u ∈ V such that u + (−u) = 0.
(VA5) For every u, v ∈ V , we have u + v = v + u.
(SM1) For every c ∈ R and u ∈ V , we have cu ∈ V .
(SM2) For every c ∈ R and u, v ∈ V , we have c(u + v) = cu + cv.
(SM3) For every a, b ∈ R and u ∈ V , we have (a + b)u = au + bu.
(SM4) For every a, b ∈ R and u ∈ V , we have (ab)u = a(bu).
(SM5) For every u ∈ V , we have 1u = u.
Remark. The elements a, b, c ∈ R discussed in (SM1)–(SM5) are known as scalars. Multiplication of
vectors by elements of R is sometimes known as scalar multiplication.
Chapter 5 : Introduction to Vector Spaces
Example 5.1.5. Let n ∈ N. Consider the set Rn of all vectors of the form u = (u1 , . . . , un ), where
u1 , . . . , un ∈ R. For any two vectors u = (u1 , . . . , un ) and v = (v1 , . . . , vn ) in Rn and any number c ∈ R,
write
u + v = (u1 + v1 , . . . , un + vn )
and
cu = (cu1 , . . . , cun ).
To check (VA1), simply note that u1 + v1 , . . . , un + vn ∈ R. To check (VA2), note that if w = (w1 , . . . , wn ),
then
u + (v + w) = (u1 , . . . , un ) + (v1 + w1 , . . . , vn + wn ) = (u1 + (v1 + w1 ), . . . , un + (vn + wn ))
= ((u1 + v1 ) + w1 , . . . , (un + vn ) + wn ) = (u1 + v1 , . . . , un + vn ) + (w1 , . . . , wn )
= (u + v) + w.
If we take 0 to be the zero vector (0, . . . , 0), then u + 0 = 0 + u = u, giving (VA3). Next, writing
−u = (−u1 , . . . , −un ), we have u + (−u) = 0, giving (VA4). To check (VA5), note that
u + v = (u1 + v1 , . . . , un + vn ) = (v1 + u1 , . . . , vn + un ) = v + u.
To check (SM1), simply note that cu1 , . . . , cun ∈ R. To check (SM2), note that
c(u + v) = c(u1 + v1 , . . . , un + vn ) = (c(u1 + v1 ), . . . , c(un + vn ))
= (cu1 + cv1 , . . . , cun + cvn ) = (cu1 , . . . , cun ) + (cv1 , . . . , cvn ) = cu + cv.
To check (SM3), note that
(a + b)u = ((a + b)u1 , . . . , (a + b)un ) = (au1 + bu1 , . . . , aun + bun )
= (au1 , . . . , aun ) + (bu1 , . . . , bun ) = au + bu.
To check (SM4), note that
(ab)u = ((ab)u1 , . . . , (ab)un ) = (a(bu1 ), . . . , a(bun )) = a(bu1 , . . . , bun ) = a(bu).
Finally, to check (SM5), note that
1u = (1u1 , . . . , 1un ) = (u1 , . . . , un ) = u.
It follows that Rn is a vector space over R. This is known as the n-dimensional Euclidean space.
Example 5.1.6. Let k ∈ N. Consider the set Pk of all polynomials of the form
p(x) = p0 + p1 x + . . . + pk xk ,
where p0 , p1 , . . . , pk ∈ R.
In other words, Pk is the set of all polynomials of degree at most k and with coefficients in R. For any
two polynomials p(x) = p0 + p1 x + . . . + pk xk and q(x) = q0 + q1 x + . . . + qk xk in Pk and for any number
c ∈ R, write
p(x) + q(x) = (p0 + q0 ) + (p1 + q1 )x + . . . + (pk + qk )xk
and
cp(x) = cp0 + cp1 x + . . . + cpk xk .
To check (VA1), simply note that p0 + q0 , . . . , pk + qk ∈ R. To check (VA2), note that if we write
r(x) = r0 + r1 x + . . . + rk xk , then we have
p(x) + (q(x) + r(x)) = (p0 + p1 x + . . . + pk xk ) + ((q0 + r0 ) + (q1 + r1 )x + . . . + (qk + rk )xk )
= (p0 + (q0 + r0 )) + (p1 + (q1 + r1 ))x + . . . + (pk + (qk + rk ))xk
= ((p0 + q0 ) + r0 ) + ((p1 + q1 ) + r1 )x + . . . + ((pk + qk ) + rk )xk
= ((p0 + q0 ) + (p1 + q1 )x + . . . + (pk + qk )xk ) + (r0 + r1 x + . . . + rk xk )
= (p(x) + q(x)) + r(x).
If we take 0 to be the zero polynomial 0 + 0x + . . . + 0xk , then p(x) + 0 = 0 + p(x) = p(x), giving (VA3).
Next, writing −p(x) = −p0 − p1 x − . . . − pk xk , we have p(x) + (−p(x)) = 0, giving (VA4). To check
(VA5), note that
p(x) + q(x) = (p0 + q0 ) + (p1 + q1 )x + . . . + (pk + qk )xk
= (q0 + p0 ) + (q1 + p1 )x + . . . + (qk + pk )xk = q(x) + p(x).
To check (SM1), simply note that cp0 , . . . , cpk ∈ R. To check (SM2), note that
c(p(x) + q(x)) = c((p0 + q0 ) + (p1 + q1 )x + . . . + (pk + qk )xk )
= c(p0 + q0 ) + c(p1 + q1 )x + . . . + c(pk + qk )xk
= (cp0 + cq0 ) + (cp1 + cq1 )x + . . . + (cpk + cqk )xk
= (cp0 + cp1 x + . . . + cpk xk ) + (cq0 + cq1 x + . . . + cqk xk )
= cp(x) + cq(x).
To check (SM3), note that
(a + b)p(x) = (a + b)p0 + (a + b)p1 x + . . . + (a + b)pk xk
= (ap0 + bp0 ) + (ap1 + bp1 )x + . . . + (apk + bpk )xk
= (ap0 + ap1 x + . . . + apk xk ) + (bp0 + bp1 x + . . . + bpk xk )
= ap(x) + bp(x).
To check (SM4), note that
(ab)p(x) = (ab)p0 + (ab)p1 x + . . . + (ab)pk xk = a(bp0 ) + a(bp1 )x + . . . + a(bpk )xk
= a(bp0 + bp1 x + . . . + bpk xk ) = a(bp(x)).
Finally, to check (SM5), note that
1p(x) = 1p0 + 1p1 x + . . . + 1pk xk = p0 + p1 x + . . . + pk xk = p(x).
It follows that Pk is a vector space over R. Note also that the vectors are the polynomials.
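Identifying each p(x) = p0 + p1 x + . . . + pk xk with its coefficient vector (p0 , p1 , . . . , pk ) shows why Pk behaves like Rk+1. A small NumPy sketch (the particular polynomials are arbitrary illustrations):

```python
import numpy as np

# Represent p(x) = p0 + p1 x + ... + pk x^k by its coefficient
# vector (p0, p1, ..., pk); this identifies P3 with R^4.
p = np.array([1.0, -2.0, 0.0, 3.0])   # 1 - 2x + 3x^3
q = np.array([4.0, 1.0, -1.0, 0.0])   # 4 + x - x^2

s = p + q       # coefficientwise sum represents (p + q)(x)
t = 2.5 * p     # scalar multiple represents (2.5 p)(x)

# Evaluation at any point commutes with the vector operations.
x = 0.7
powers = x ** np.arange(4)            # (1, x, x^2, x^3)
assert np.isclose(s @ powers, p @ powers + q @ powers)
assert np.isclose(t @ powers, 2.5 * (p @ powers))
```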
There are a few simple properties of vector spaces that we can deduce easily from the definition.
PROPOSITION 5A. Suppose that V is a vector space over R, and that u ∈ V and c ∈ R.
(a) We have 0u = 0.
(b) We have c0 = 0.
(c) We have (−1)u = −u.
(d) If cu = 0, then c = 0 or u = 0.
Proof. (a) By (SM1), we have 0u ∈ V . Hence
0u + 0u = (0 + 0)u    (by (SM3))
        = 0u          (since 0 ∈ R).
It follows that
0u = 0u + 0                    (by (VA3))
   = 0u + (0u + (−(0u)))       (by (VA4))
   = (0u + 0u) + (−(0u))       (by (VA2))
   = 0u + (−(0u))              (from above)
   = 0                         (by (VA4)).
(b) By (SM1), we have c0 ∈ V . Hence
c0 + c0 = c(0 + 0)    (by (SM2))
        = c0          (by (VA3)).
It follows that
c0 = c0 + 0                    (by (VA3))
   = c0 + (c0 + (−(c0)))       (by (VA4))
   = (c0 + c0) + (−(c0))       (by (VA2))
   = c0 + (−(c0))              (from above)
   = 0                         (by (VA4)).
(c) We have
(−1)u = (−1)u + 0               (by (VA3))
      = (−1)u + (u + (−u))      (by (VA4))
      = ((−1)u + u) + (−u)      (by (VA2))
      = ((−1)u + 1u) + (−u)     (by (SM5))
      = ((−1) + 1)u + (−u)      (by (SM3))
      = 0u + (−u)               (since −1, 1 ∈ R)
      = 0 + (−u)                (from (a))
      = −u                      (by (VA3)).
(d) Suppose that cu = 0 and c ≠ 0. Then
u = 1u            (by (SM5))
  = (c⁻¹c)u       (since c ∈ R \ {0})
  = c⁻¹(cu)       (by (SM4))
  = c⁻¹0          (assumption)
  = 0             (from (b)),
as required.
5.2. Subspaces
Example 5.2.1. Consider the vector space R2 of all points (x, y), where x, y ∈ R. Let L be a line
through the origin 0 = (0, 0). Suppose that L is represented by the equation αx + βy = 0; in other
words,
L = {(x, y) ∈ R2 : αx + βy = 0}.
Note first of all that 0 = (0, 0) ∈ L, so that (VA3) and (VA4) clearly hold in L. Also (VA2) and (VA5)
clearly hold in L. To check (VA1), note that if (x, y), (u, v) ∈ L, then αx + βy = 0 and αu + βv = 0, so
that α(x + u) + β(y + v) = 0, whence (x, y) + (u, v) = (x + u, y + v) ∈ L. Next, note that (SM2)–(SM5)
clearly hold in L. To check (SM1), note that if (x, y) ∈ L, then αx + βy = 0, so that α(cx) + β(cy) = 0,
whence c(x, y) = (cx, cy) ∈ L. It follows that L forms a vector space over R. In fact, we have shown
that every line in R2 through the origin is a vector space over R.
Definition. Suppose that V is a vector space over R, and that W is a subset of V . Then we say that W
is a subspace of V if W forms a vector space over R under the vector addition and scalar multiplication
defined in V .
Example 5.2.2. We have just shown in Example 5.2.1 that every line in R2 through the origin is a
subspace of R2 . On the other hand, if we work through the example again, then it is clear that we have
really only checked conditions (VA1) and (SM1) for L, and that 0 = (0, 0) L.
PROPOSITION 5B. Suppose that V is a vector space over R, and that W is a non-empty subset
of V . Then W is a subspace of V if the following conditions are satisfied:
(SP1) For every u, v ∈ W , we have u + v ∈ W .
(SP2) For every c ∈ R and u ∈ W , we have cu ∈ W .
Proof. To show that W is a vector space over R, it remains to check that W satisfies (VA2)–(VA5)
and (SM2)–(SM5). To check (VA3) and (VA4) for W , it clearly suffices to check that 0 ∈ W and that
−u = (−1)u ∈ W for every u ∈ W ; the latter follows from (SP2) and Proposition 5A(c). Since W
is non-empty, there exists u ∈ W . Then it follows from (SP2) and Proposition 5A(a) that 0 = 0u ∈ W .
The remaining conditions (VA2), (VA5) and (SM2)–(SM5) hold for all vectors in V , and hence also for
all vectors in W .
Example 5.2.3. Consider the vector space R3 of all points (x, y, z), where x, y, z ∈ R. Let P be a plane
through the origin 0 = (0, 0, 0). Suppose that P is represented by the equation αx + βy + γz = 0; in
other words,
P = {(x, y, z) ∈ R3 : αx + βy + γz = 0}.
To check (SP1), note that if (x, y, z), (u, v, w) ∈ P , then αx + βy + γz = 0 and αu + βv + γw = 0, so
that α(x + u) + β(y + v) + γ(z + w) = 0, whence (x, y, z) + (u, v, w) = (x + u, y + v, z + w) ∈ P . To
check (SP2), note that if (x, y, z) ∈ P , then αx + βy + γz = 0, so that α(cx) + β(cy) + γ(cz) = 0, whence
c(x, y, z) = (cx, cy, cz) ∈ P . It follows that P is a subspace of R3 . Next, let L be a line through the
origin 0 = (0, 0, 0). Suppose that (α, β, γ) ∈ R3 is a non-zero point on L. Then we can write
L = {t(α, β, γ) : t ∈ R}.
Suppose that u = t(α, β, γ) ∈ L and v = s(α, β, γ) ∈ L, and that c ∈ R. Then
u + v = t(α, β, γ) + s(α, β, γ) = (t + s)(α, β, γ) ∈ L,
giving (SP1). Also, cu = c(t(α, β, γ)) = (ct)(α, β, γ) ∈ L, giving (SP2). It follows that L is a subspace
of R3 . Finally, it is not difficult to see that both {0} and R3 are subspaces of R3 .
Example 5.2.4. Note that R2 is not a subspace of R3 . First of all, R2 is not a subset of R3 . Note also
that vector addition and scalar multiplication are different in R2 and R3 .
Example 5.2.5. Suppose that A is an m × n matrix and 0 is the m × 1 zero column matrix. Consider
the system Ax = 0 of m homogeneous linear equations in the n unknowns x1 , . . . , xn , where
    ⎛ x1 ⎞
x = ⎜ ⋮  ⎟
    ⎝ xn ⎠
is interpreted as an element of the vector space Rn , with usual vector addition and scalar multiplication.
Let S denote the set of all solutions of the system. Suppose that x, y ∈ S and c ∈ R. Then
A(x + y) = Ax + Ay = 0 + 0 = 0,
giving (SP1). Also, A(cx) = c(Ax) = c0 = 0, giving (SP2). It follows that S is a subspace of Rn . To
summarize, the space of solutions of a system of m homogeneous linear equations in n unknowns is a
subspace of Rn .
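The closure computations above can be checked numerically for any particular system. A sketch with a hypothetical matrix A and two solutions found by inspection (these values are illustrations, not taken from the text):

```python
import numpy as np

# Hypothetical homogeneous system A x = 0.
A = np.array([[1.0, 2.0, -1.0],
              [2.0, 4.0, -2.0]])
# Two solutions of A x = 0, found by inspection.
x = np.array([1.0, 0.0, 1.0])
y = np.array([2.0, -1.0, 0.0])
assert np.allclose(A @ x, 0) and np.allclose(A @ y, 0)

# (SP1) and (SP2): the solution set is closed under + and scalar multiples.
c = -3.5
assert np.allclose(A @ (x + y), 0)
assert np.allclose(A @ (c * x), 0)
```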
Example 5.2.6. As a special case of Example 5.2.5, note that if we take two non-parallel planes in R3
through the origin 0 = (0, 0, 0), then the intersection of these two planes is clearly a line through the
origin. However, each plane is represented by a homogeneous equation in the three unknowns x, y, z ∈ R. It follows that
the intersection of the two planes is the collection of all solutions (x, y, z) R3 of the system formed by
the two homogeneous equations in the three unknowns x, y, z representing these two planes. We have
already shown in Example 5.2.3 that the line representing all these solutions is a subspace of R3 .
Example 5.2.7. We showed in Example 5.1.3 that the set M2,2 (R) of all 2 × 2 matrices with entries
in R forms a vector space over R. Consider the subset
W = { ⎛ a11 a12 ⎞ : a11 , a12 , a21 ∈ R } .
      ⎝ a21  0  ⎠
Since
⎛ a11 a12 ⎞ + ⎛ b11 b12 ⎞ = ⎛ a11 + b11  a12 + b12 ⎞
⎝ a21  0  ⎠   ⎝ b21  0  ⎠   ⎝ a21 + b21      0     ⎠
and
c ⎛ a11 a12 ⎞ = ⎛ ca11 ca12 ⎞ ,
  ⎝ a21  0  ⎠   ⎝ ca21   0  ⎠
it follows that (SP1) and (SP2) are satisfied. Hence W is a subspace of M2,2 (R).
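The closure argument for W can be spot-checked: the subset is cut out by the single condition that the bottom-right entry is zero, and that condition survives addition and scalar multiplication. A minimal sketch with arbitrary sample matrices:

```python
import numpy as np

def in_W(M):
    # W consists of the 2 x 2 matrices whose (2, 2) entry is zero.
    return bool(np.isclose(M[1, 1], 0))

P = np.array([[1.0, 2.0], [3.0, 0.0]])
Q = np.array([[-4.0, 0.5], [7.0, 0.0]])
assert in_W(P) and in_W(Q)
assert in_W(P + Q)        # (SP1): closed under addition
assert in_W(-2.5 * P)     # (SP2): closed under scalar multiplication
```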
Example 5.2.8. We showed in Example 5.1.4 that the set A of all functions of the form f : R → R forms
a vector space over R. Let C0 denote the set of all functions of the form f : R → R which are continuous
at x = 2, and let C1 denote the set of all functions of the form f : R → R which are differentiable at
x = 2. Then it follows from the arithmetic of limits and the arithmetic of derivatives that C0 and C1 are
both subspaces of A. Furthermore, C1 is a subspace of C0 (why?). On the other hand, let k ∈ N. Recall
from Example 5.1.6 the vector space Pk of all polynomials of the form
p(x) = p0 + p1 x + . . . + pk xk ,
where p0 , p1 , . . . , pk ∈ R.
In other words, Pk is the set of all polynomials of degree at most k and with coefficients in R. Clearly
Pk is a subspace of C1 .
Example 5.3.3. In R4 , the vector (1, 4, −2, 6) is a linear combination of the two vectors (1, 2, 0, 4) and
(1, 1, 1, 3), for we have (1, 4, −2, 6) = 3(1, 2, 0, 4) − 2(1, 1, 1, 3). On the other hand, the vector (2, 6, 0, 9)
is not a linear combination of the two vectors (1, 2, 0, 4) and (1, 1, 1, 3), for
(2, 6, 0, 9) = c1 (1, 2, 0, 4) + c2 (1, 1, 1, 3)
would lead to the system of four equations
c1 + c2 = 2,
2c1 + c2 = 6,
c2 = 0,
4c1 + 3c2 = 9.
It is easily checked that this system has no solutions.
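The inconsistency of the system can be checked by a rank argument: (2, 6, 0, 9) lies in the span of the two vectors precisely when adjoining it as a column does not increase the rank. A NumPy sketch:

```python
import numpy as np

# Columns are the two given vectors in R^4.
M = np.array([[1.0, 1.0],
              [2.0, 1.0],
              [0.0, 1.0],
              [4.0, 3.0]])
b = np.array([2.0, 6.0, 0.0, 9.0])

aug = np.column_stack([M, b])
assert np.linalg.matrix_rank(M) == 2
assert np.linalg.matrix_rank(aug) == 3   # rank grows: no solution exists
```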
Example 5.3.4. In the vector space A of all functions of the form f : R → R described in Example
5.1.4, the function cos 2x is a linear combination of the three functions cos² x, cosh² x and sinh² x. It is
not too difficult to check that
cos 2x = 2 cos² x + sinh² x − cosh² x,
noting that cos 2x = 2 cos² x − 1 and cosh² x − sinh² x = 1.
We observe that in Example 5.3.1, every vector in R2 is a linear combination of the two vectors i and j.
Similarly, in Example 5.3.2, every vector in R3 is a linear combination of the three vectors i, j and k.
On the other hand, we observe that in Example 5.3.3, not every vector in R4 is a linear combination of
the two vectors (1, 2, 0, 4) and (1, 1, 1, 3).
Let us therefore investigate the collection of all vectors in a vector space V that can be represented as
linear combinations of a given set of vectors in V .
Definition. Suppose that v1 , . . . , vr are vectors in a vector space V over R. The set
span{v1 , . . . , vr } = {c1 v1 + . . . + cr vr : c1 , . . . , cr ∈ R}
is called the span of the vectors v1 , . . . , vr . We also say that the vectors v1 , . . . , vr span V if
span{v1 , . . . , vr } = V ;
in other words, if every vector in V can be expressed as a linear combination of the vectors v1 , . . . , vr .
Example 5.3.5. The two vectors i = (1, 0) and j = (0, 1) span R2 .
Example 5.3.6. The three vectors i = (1, 0, 0), j = (0, 1, 0) and k = (0, 0, 1) span R3 .
Example 5.3.7. The two vectors (1, 2, 0, 4) and (1, 1, 1, 3) do not span R4 .
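Membership in a span is a linear solvability question, so it can be tested numerically with least squares; if the residual vanishes, the coefficients of a representing linear combination have been found. A sketch using the vectors from Example 5.3.3 (the helper `in_span` is a hypothetical name introduced here):

```python
import numpy as np

def in_span(vectors, u, tol=1e-9):
    """Return True if u is a linear combination of the given vectors,
    by checking the least-squares residual of V c = u."""
    V = np.column_stack(vectors)
    c, *_ = np.linalg.lstsq(V, u, rcond=None)
    return bool(np.allclose(V @ c, u, atol=tol))

v1 = np.array([1.0, 2.0, 0.0, 4.0])
v2 = np.array([1.0, 1.0, 1.0, 3.0])
assert in_span([v1, v2], 3 * v1 - 2 * v2)               # in the span
assert not in_span([v1, v2], np.array([2.0, 6.0, 0.0, 9.0]))
```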
PROPOSITION 5C. Suppose that v1 , . . . , vr are vectors in a vector space V over R.
(a) Then span{v1 , . . . , vr } is a subspace of V .
(b) Suppose further that W is a subspace of V and v1 , . . . , vr ∈ W . Then span{v1 , . . . , vr } ⊆ W .
Proof. (a) Suppose that u, w ∈ span{v1 , . . . , vr } and c ∈ R. There exist a1 , . . . , ar , b1 , . . . , br ∈ R such
that
u = a1 v1 + . . . + ar vr   and   w = b1 v1 + . . . + br vr .
Then
u + w = (a1 v1 + . . . + ar vr ) + (b1 v1 + . . . + br vr )
      = (a1 + b1 )v1 + . . . + (ar + br )vr ∈ span{v1 , . . . , vr }
and
cu = c(a1 v1 + . . . + ar vr ) = (ca1 )v1 + . . . + (car )vr ∈ span{v1 , . . . , vr }.
It follows from Proposition 5B that span{v1 , . . . , vr } is a subspace of V .
(b) Suppose that c1 , . . . , cr ∈ R and u = c1 v1 + . . . + cr vr ∈ span{v1 , . . . , vr }. If v1 , . . . , vr ∈ W ,
then it follows from (SM1) for W that c1 v1 , . . . , cr vr ∈ W . It then follows from (VA1) for W that
u = c1 v1 + . . . + cr vr ∈ W .
Example 5.3.8. In R2 , any non-zero vector v spans the subspace {cv : c ∈ R}. This is clearly a line
through the origin. Also, try to draw a picture to convince yourself that any two non-zero vectors that
are not on the same line span R2 .
Example 5.3.9. In R3 , try to draw pictures to convince yourself that any non-zero vector spans a
subspace which is a line through the origin; any two non-zero vectors that are not on the same line span
a subspace which is a plane through the origin; and any three non-zero vectors that do not lie on the
same plane span R3 .
5.4. Linear Independence

Example 5.4.1. Consider the three vectors v1 = (1, 2, 3), v2 = (3, 2, 1) and v3 = (3, 3, 3) in R3 . Then
span{v1 , v2 , v3 } = {(c1 + 3c2 + 3c3 , 2c1 + 2c2 + 3c3 , 3c1 + c2 + 3c3 ) : c1 , c2 , c3 ∈ R}.
Write (x, y, z) = (c1 + 3c2 + 3c3 , 2c1 + 2c2 + 3c3 , 3c1 + c2 + 3c3 ). Then
⎛ x ⎞   ⎛ 1 3 3 ⎞ ⎛ c1 ⎞
⎜ y ⎟ = ⎜ 2 2 3 ⎟ ⎜ c2 ⎟ ,
⎝ z ⎠   ⎝ 3 1 3 ⎠ ⎝ c3 ⎠
and so (do not worry if you cannot understand why we take this next step)
           ⎛ x ⎞              ⎛ 1 3 3 ⎞ ⎛ c1 ⎞             ⎛ c1 ⎞
( 1 −2 1 ) ⎜ y ⎟ = ( 1 −2 1 ) ⎜ 2 2 3 ⎟ ⎜ c2 ⎟ = ( 0 0 0 ) ⎜ c2 ⎟ = ( 0 ) ,
           ⎝ z ⎠              ⎝ 3 1 3 ⎠ ⎝ c3 ⎠             ⎝ c3 ⎠
so that x − 2y + z = 0. It follows that span{v1 , v2 , v3 } is a plane through the origin and not R3 . Note,
in fact, that 3v1 + 3v2 − 4v3 = 0. Note also that
    ⎛ 1 3 3 ⎞
det ⎜ 2 2 3 ⎟ = 0.
    ⎝ 3 1 3 ⎠
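The computations of Example 5.4.1 can be verified numerically; a sketch assuming the vectors v1 = (1, 2, 3), v2 = (3, 2, 1) and v3 = (3, 3, 3) as columns of a matrix:

```python
import numpy as np

v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([3.0, 2.0, 1.0])
v3 = np.array([3.0, 3.0, 3.0])
M = np.column_stack([v1, v2, v3])

# The row vector (1, -2, 1) annihilates every column: x - 2y + z = 0.
assert np.allclose(np.array([1.0, -2.0, 1.0]) @ M, 0)
# The vectors are linearly dependent: zero determinant and an explicit relation.
assert np.isclose(np.linalg.det(M), 0)
assert np.allclose(3 * v1 + 3 * v2 - 4 * v3, 0)
# They span only a plane: the matrix has rank 2, not 3.
assert np.linalg.matrix_rank(M) == 2
```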
Example 5.4.2. Consider the three vectors v1 = (1, 1, 0), v2 = (5, 1, −3) and v3 = (2, 7, 4) in R3 . Then
span{v1 , v2 , v3 } = {c1 (1, 1, 0) + c2 (5, 1, −3) + c3 (2, 7, 4) : c1 , c2 , c3 ∈ R}
= {(c1 + 5c2 + 2c3 , c1 + c2 + 7c3 , −3c2 + 4c3 ) : c1 , c2 , c3 ∈ R}.
Write (x, y, z) = (c1 + 5c2 + 2c3 , c1 + c2 + 7c3 , −3c2 + 4c3 ). Then it is not difficult to see that
⎛ x ⎞   ⎛ 1  5 2 ⎞ ⎛ c1 ⎞
⎜ y ⎟ = ⎜ 1  1 7 ⎟ ⎜ c2 ⎟ ,
⎝ z ⎠   ⎝ 0 −3 4 ⎠ ⎝ c3 ⎠
so that
⎛ −25  26 −33 ⎞ ⎛ x ⎞   ⎛ −25  26 −33 ⎞ ⎛ 1  5 2 ⎞ ⎛ c1 ⎞   ⎛ 1 0 0 ⎞ ⎛ c1 ⎞   ⎛ c1 ⎞
⎜   4  −4   5 ⎟ ⎜ y ⎟ = ⎜   4  −4   5 ⎟ ⎜ 1  1 7 ⎟ ⎜ c2 ⎟ = ⎜ 0 1 0 ⎟ ⎜ c2 ⎟ = ⎜ c2 ⎟ .
⎝   3  −3   4 ⎠ ⎝ z ⎠   ⎝   3  −3   4 ⎠ ⎝ 0 −3 4 ⎠ ⎝ c3 ⎠   ⎝ 0 0 1 ⎠ ⎝ c3 ⎠   ⎝ c3 ⎠
It follows that for every (x, y, z) R3 , we can find c1 , c2 , c3 R such that (x, y, z) = c1 v1 + c2 v2 + c3 v3 .
Hence span{v1 , v2 , v3 } = R3 . Note that
    ⎛ 1  5 2 ⎞
det ⎜ 1  1 7 ⎟ ≠ 0.
    ⎝ 0 −3 4 ⎠
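The inverse used above can be verified numerically; a sketch assuming v2 = (5, 1, −3), with the columns of A being v1, v2, v3:

```python
import numpy as np

A = np.array([[1.0, 5.0, 2.0],
              [1.0, 1.0, 7.0],
              [0.0, -3.0, 4.0]])   # columns v1, v2, v3, assuming v2 = (5, 1, -3)

assert not np.isclose(np.linalg.det(A), 0)   # non-zero determinant
B = np.linalg.inv(A)
expected = np.array([[-25.0, 26.0, -33.0],
                     [4.0, -4.0, 5.0],
                     [3.0, -3.0, 4.0]])
assert np.allclose(B, expected)
# Hence every (x, y, z) yields coordinates (c1, c2, c3) = B (x, y, z),
# so the three columns span all of R^3.
assert np.linalg.matrix_rank(A) == 3
```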
Definition. Suppose that v1 , . . . , vr are vectors in a vector space V over R.
(LD) We say that v1 , . . . , vr are linearly dependent if there exist c1 , . . . , cr ∈ R, not all zero, such
that c1 v1 + . . . + cr vr = 0.
(LI) We say that v1 , . . . , vr are linearly independent if they are not linearly dependent; in other words,
if the only solution of c1 v1 + . . . + cr vr = 0 is given by c1 = . . . = cr = 0.

Example 5.4.3. Consider the three vectors v1 = (1, 2, 3), v2 = (3, 2, 1) and v3 = (3, 3, 3) in R3 , as in
Example 5.4.1. Consider the equation c1 v1 + c2 v2 + c3 v3 = 0. This can be rewritten in matrix form as
⎛ 1 3 3 ⎞ ⎛ c1 ⎞   ⎛ 0 ⎞
⎜ 2 2 3 ⎟ ⎜ c2 ⎟ = ⎜ 0 ⎟ .
⎝ 3 1 3 ⎠ ⎝ c3 ⎠   ⎝ 0 ⎠
Since
    ⎛ 1 3 3 ⎞
det ⎜ 2 2 3 ⎟ = 0,
    ⎝ 3 1 3 ⎠
the system has non-trivial solutions; for example, (c1 , c2 , c3 ) = (3, 3, −4), so that 3v1 + 3v2 − 4v3 = 0.
Hence v1 , v2 , v3 are linearly dependent.
Example 5.4.4. Let us return to Example 5.4.2 and consider again the three vectors v1 = (1, 1, 0),
v2 = (5, 1, −3) and v3 = (2, 7, 4) in R3 . Consider the equation c1 v1 + c2 v2 + c3 v3 = 0. This can be
rewritten in matrix form as
⎛ 1  5 2 ⎞ ⎛ c1 ⎞   ⎛ 0 ⎞
⎜ 1  1 7 ⎟ ⎜ c2 ⎟ = ⎜ 0 ⎟ .
⎝ 0 −3 4 ⎠ ⎝ c3 ⎠   ⎝ 0 ⎠
Since
    ⎛ 1  5 2 ⎞
det ⎜ 1  1 7 ⎟ ≠ 0,
    ⎝ 0 −3 4 ⎠
the system has only the trivial solution c1 = c2 = c3 = 0. Hence v1 , v2 , v3 are linearly independent.

Example 5.4.5. In Rn , the vectors e1 , . . . , en , where ej = (0, . . . , 0, 1, 0, . . . , 0), with 1 in position j,
for every j = 1, . . . , n, are linearly independent (why?).

PROPOSITION 5D. Suppose that v1 , . . . , vr are vectors in Rn . If r > n, then v1 , . . . , vr are linearly
dependent.

Proof. For every j = 1, . . . , r, write vj = (a1j , . . . , anj ). Then the equation c1 v1 + . . . + cr vr = 0 can
be rewritten in matrix form as
⎛ a11 . . . a1r ⎞ ⎛ c1 ⎞   ⎛ 0 ⎞
⎜  ⋮         ⋮  ⎟ ⎜ ⋮  ⎟ = ⎜ ⋮ ⎟ .
⎝ an1 . . . anr ⎠ ⎝ cr ⎠   ⎝ 0 ⎠
If r > n, then there are more variables than equations. It follows that there must be non-trivial solutions
c1 , . . . , cr ∈ R. Hence v1 , . . . , vr are linearly dependent.
Remarks. (1) Consider two vectors v1 = (a11 , a21 ) and v2 = (a12 , a22 ) in R2 . To study linear independence, we consider the equation c1 v1 + c2 v2 = 0, which can be written in matrix form as
⎛ a11 a12 ⎞ ⎛ c1 ⎞   ⎛ 0 ⎞
⎝ a21 a22 ⎠ ⎝ c2 ⎠ = ⎝ 0 ⎠ .
The vectors v1 and v2 are linearly independent precisely when
    ⎛ a11 a12 ⎞
det ⎝ a21 a22 ⎠ ≠ 0.
This can be interpreted geometrically in the following way: The area of the parallelogram formed by
the two vectors v1 and v2 is in fact equal to the absolute value of the determinant of the matrix formed
with v1 and v2 as the columns; in other words,
⎪     ⎛ a11 a12 ⎞ ⎪
⎪ det ⎝ a21 a22 ⎠ ⎪ .
It follows that the two vectors are linearly dependent precisely when the parallelogram has zero area;
in other words, when the two vectors lie on the same line. On the other hand, if the parallelogram has
positive area, then the two vectors are linearly independent.
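This determinant criterion is easy to check numerically; a sketch with arbitrary sample vectors (the helper `independent_2d` is a hypothetical name introduced here):

```python
import numpy as np

def independent_2d(v1, v2):
    """Two vectors in R^2 are linearly independent iff the parallelogram
    they form has non-zero area, i.e. det [v1 v2] != 0."""
    return not np.isclose(np.linalg.det(np.column_stack([v1, v2])), 0)

assert independent_2d(np.array([1.0, 0.0]), np.array([1.0, 1.0]))
# Vectors on the same line are dependent: zero area.
assert not independent_2d(np.array([1.0, 2.0]), np.array([2.0, 4.0]))
# The area is the absolute value of the determinant.
area = abs(np.linalg.det(np.array([[3.0, 1.0], [0.0, 2.0]])))
assert np.isclose(area, 6.0)   # parallelogram formed by (3, 0) and (1, 2)
```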
(2) Consider three vectors v1 = (a11 , a21 , a31 ), v2 = (a12 , a22 , a32 ), and v3 = (a13 , a23 , a33 ) in R3 . To
study linear independence, we consider the equation c1 v1 + c2 v2 + c3 v3 = 0, which can be written in
matrix form as
⎛ a11 a12 a13 ⎞ ⎛ c1 ⎞   ⎛ 0 ⎞
⎜ a21 a22 a23 ⎟ ⎜ c2 ⎟ = ⎜ 0 ⎟ .
⎝ a31 a32 a33 ⎠ ⎝ c3 ⎠   ⎝ 0 ⎠
The vectors v1 , v2 and v3 are linearly independent precisely when
    ⎛ a11 a12 a13 ⎞
det ⎜ a21 a22 a23 ⎟ ≠ 0.
    ⎝ a31 a32 a33 ⎠
This can again be interpreted geometrically: the absolute value of this determinant is equal to the volume
of the parallelepiped formed by the three vectors v1 , v2 and v3 , so the three vectors are linearly dependent
precisely when the parallelepiped has zero volume.

5.5. Basis and Dimension
Definition. Suppose that v1 , . . . , vr are vectors in a vector space V over R. We say that {v1 , . . . , vr }
is a basis for V if the following two conditions are satisfied:
(B1) We have span{v1 , . . . , vr } = V .
(B2) The vectors v1 , . . . , vr are linearly independent.
Example 5.5.2. Consider two vectors v1 = (a11 , a21 ) and v2 = (a12 , a22 ) in R2 . Suppose that
    ⎛ a11 a12 ⎞
det ⎝ a21 a22 ⎠ ≠ 0;
in other words, suppose that the parallelogram formed by the two vectors has non-zero area. Then it
follows from Remark (1) in Section 5.4 that v1 and v2 are linearly independent. Furthermore, for every
u = (x, y) ∈ R2 , there exist c1 , c2 ∈ R such that u = c1 v1 + c2 v2 . Indeed, c1 and c2 are determined as
the unique solution of the system
⎛ a11 a12 ⎞ ⎛ c1 ⎞   ⎛ x ⎞
⎝ a21 a22 ⎠ ⎝ c2 ⎠ = ⎝ y ⎠ .
Hence span{v1 , v2 } = R2 . It follows that {v1 , v2 } is a basis for R2 .
Example 5.5.3. Consider three vectors of the type v1 = (a11 , a21 , a31 ), v2 = (a12 , a22 , a32 ) and v3 =
(a13 , a23 , a33 ) in R3 . Suppose that
    ⎛ a11 a12 a13 ⎞
det ⎜ a21 a22 a23 ⎟ ≠ 0;
    ⎝ a31 a32 a33 ⎠
in other words, suppose that the parallelepiped formed by the three vectors has non-zero volume. Then
it follows from Remark (2) in Section 5.4 that v1 , v2 and v3 are linearly independent. Furthermore, for
every u = (x, y, z) ∈ R3 , there exist c1 , c2 , c3 ∈ R such that u = c1 v1 + c2 v2 + c3 v3 . Indeed, c1 , c2 and
c3 are determined as the unique solution of the system
⎛ a11 a12 a13 ⎞ ⎛ c1 ⎞   ⎛ x ⎞
⎜ a21 a22 a23 ⎟ ⎜ c2 ⎟ = ⎜ y ⎟ .
⎝ a31 a32 a33 ⎠ ⎝ c3 ⎠   ⎝ z ⎠
Hence span{v1 , v2 , v3 } = R3 . It follows that {v1 , v2 , v3 } is a basis for R3 .
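Finding the coordinates c1, c2, c3 of a vector in such a basis is a single linear solve. A sketch with an arbitrary illustrative basis (not taken from the text):

```python
import numpy as np

# Columns form a basis of R^3: the determinant is non-zero.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
u = np.array([2.0, 3.0, 4.0])

assert not np.isclose(np.linalg.det(A), 0)
c = np.linalg.solve(A, u)      # unique coordinates of u in this basis
assert np.allclose(A @ c, u)   # u = c1 v1 + c2 v2 + c3 v3
```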
Example 5.5.4. In Rn , the vectors e1 , . . . , en , where
ej = (0, . . . , 0, 1, 0, . . . , 0), with 1 in position j (preceded by j − 1 zeros and followed by n − j zeros),
for every j = 1, . . . , n,
are linearly independent and span Rn . Hence {e1 , . . . , en } is a basis for Rn . This is known as the
standard basis for Rn .
Example 5.5.5. In the vector space M2,2 (R) of all 2 × 2 matrices with entries in R as discussed in
Example 5.1.3, the set
{ ⎛ 1 0 ⎞ , ⎛ 0 1 ⎞ , ⎛ 0 0 ⎞ , ⎛ 0 0 ⎞ }
  ⎝ 0 0 ⎠   ⎝ 0 0 ⎠   ⎝ 1 0 ⎠   ⎝ 0 1 ⎠
is a basis.
Example 5.5.6. In the vector space Pk of polynomials of degree at most k and with coefficients in R
as discussed in Example 5.1.6, the set {1, x, x2 , . . . , xk } is a basis.
PROPOSITION 5E. Suppose that {v1 , . . . , vr } is a basis for a vector space V over R. Then every
element u V can be expressed uniquely in the form
u = c1 v1 + . . . + cr vr ,
where c1 , . . . , cr ∈ R.
Proof. Since span{v1 , . . . , vr } = V , every u ∈ V can be expressed in the given form. Suppose that
u = c1 v1 + . . . + cr vr = d1 v1 + . . . + dr vr .
Then
(c1 − d1 )v1 + . . . + (cr − dr )vr = 0.
Since v1 , . . . , vr are linearly independent, it follows that c1 − d1 = . . . = cr − dr = 0, so that the
representation is unique.

PROPOSITION 5F. Suppose that {v1 , . . . , vn } is a basis for a vector space V over R. Suppose further
that r > n, and that u1 , . . . , ur ∈ V . Then the vectors u1 , . . . , ur are linearly dependent.

Proof. For every j = 1, . . . , r, write uj = a1j v1 + . . . + anj vn , where a1j , . . . , anj ∈ R. Then the
equation c1 u1 + . . . + cr ur = 0 holds whenever
⎛ a11 . . . a1r ⎞ ⎛ c1 ⎞   ⎛ 0 ⎞
⎜  ⋮         ⋮  ⎟ ⎜ ⋮  ⎟ = ⎜ ⋮ ⎟ .
⎝ an1 . . . anr ⎠ ⎝ cr ⎠   ⎝ 0 ⎠
If r > n, then there are more variables than equations. It follows that there must be non-trivial solutions
c1 , . . . , cr ∈ R. Hence u1 , . . . , ur are linearly dependent.
PROPOSITION 5G. Suppose that V is a finite-dimensional vector space over R. Then any two
bases for V have the same number of elements.
Proof. Note simply that by Proposition 5F, the vectors in the basis with more elements must be
linearly dependent, and so cannot be a basis.
We are now in a position to make the following definition.
Definition. Suppose that V is a finite-dimensional vector space over R. Then we say that V is of
dimension n if a basis for V contains exactly n elements.
Example 5.5.8. The vector space Rn has dimension n.
Example 5.5.9. The vector space M2,2 (R) of all 2 × 2 matrices with entries in R, as discussed in
Example 5.1.3, has dimension 4.
Example 5.5.10. The vector space Pk of all polynomials of degree at most k and with coefficients in R,
as discussed in Example 5.1.6, has dimension (k + 1).
Example 5.5.11. Recall Example 5.2.5, where we showed that the set of solutions to a system of m
homogeneous linear equations in n unknowns is a subspace of Rn . Consider now the homogeneous system
                 ⎛ x1 ⎞
⎛ 1 3 5 1 −5 ⎞   ⎜ x2 ⎟   ⎛ 0 ⎞
⎜ 1 4 7 3  2 ⎟   ⎜ x3 ⎟ = ⎜ 0 ⎟ .
⎜ 1 5 9 5  9 ⎟   ⎜ x4 ⎟   ⎜ 0 ⎟
⎝ 0 3 6 2  1 ⎠   ⎝ x5 ⎠   ⎝ 0 ⎠
The solutions are given by
       ⎛  1 ⎞      ⎛  1 ⎞
       ⎜ −2 ⎟      ⎜  3 ⎟
x = c1 ⎜  1 ⎟ + c2 ⎜  0 ⎟ ,
       ⎜  0 ⎟      ⎜ −5 ⎟
       ⎝  0 ⎠      ⎝  1 ⎠
where c1 , c2 ∈ R (the reader must check this). It can be checked that (1, −2, 1, 0, 0) and (1, 3, 0, −5, 1)
are linearly independent and so form a basis for the space of solutions of the system. It follows that the
space of solutions of the system has dimension 2.
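The claims of Example 5.5.11 can be verified with a rank computation; a sketch assuming the reconstructed signs of the coefficient matrix and solution vectors above:

```python
import numpy as np

# Coefficient matrix of the homogeneous system (signs as reconstructed).
A = np.array([[1.0, 3.0, 5.0, 1.0, -5.0],
              [1.0, 4.0, 7.0, 3.0, 2.0],
              [1.0, 5.0, 9.0, 5.0, 9.0],
              [0.0, 3.0, 6.0, 2.0, 1.0]])
u1 = np.array([1.0, -2.0, 1.0, 0.0, 0.0])
u2 = np.array([1.0, 3.0, 0.0, -5.0, 1.0])

# Both vectors solve A x = 0, and they are linearly independent.
assert np.allclose(A @ u1, 0) and np.allclose(A @ u2, 0)
assert np.linalg.matrix_rank(np.column_stack([u1, u2])) == 2
# Dimension of the solution space = n - rank(A) = 5 - 3 = 2.
assert np.linalg.matrix_rank(A) == 3
```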
Suppose that V is an n-dimensional vector space over R. Then any basis for V consists of exactly n
linearly independent vectors in V . Suppose now that we have a set of n linearly independent vectors
in V . Will this form a basis for V ?
We have already answered this question in the affirmative in the cases when the vector space is R2
or R3 . To seek an answer to the general case, we first establish the following result.
PROPOSITION 5H. Suppose that V is a finite-dimensional vector space over R. Then any finite set
of linearly independent vectors in V can be expanded, if necessary, to a basis for V .
Proof. Let S = {v1 , . . . , vk } be a finite set of linearly independent vectors in V . If S spans V , then
the proof is complete. If S does not span V , then there exists vk+1 ∈ V that is not a linear combination
of the elements of S. The set T = {v1 , . . . , vk , vk+1 } is a finite set of linearly independent vectors in V ;
for otherwise, there exist c1 , . . . , ck , ck+1 , not all zero, such that
c1 v1 + . . . + ck vk + ck+1 vk+1 = 0.
If ck+1 = 0, then c1 v1 + . . . + ck vk = 0, contradicting the assumption that S is a finite set of linearly
independent vectors in V . If ck+1 ≠ 0, then
vk+1 = −(c1 /ck+1 ) v1 − . . . − (ck /ck+1 ) vk ,
contradicting the assumption that vk+1 is not a linear combination of the elements of S. We now study
the finite set T of linearly independent vectors in V . If T spans V , then the proof is complete. If T does
not span V , then we repeat the argument. Note that the number of vectors in a linearly independent
expansion of S cannot exceed the dimension of V , in view of Proposition 5F. So eventually some linearly
independent expansion of S will span V .
PROPOSITION 5J. Suppose that V is an n-dimensional vector space over R. Then any set of n
linearly independent vectors in V is a basis for V .
Proof. Let S be a set of n linearly independent vectors in V . By Proposition 5H, S can be expanded,
if necessary, to a basis for V . By Proposition 5F, any expansion of S will result in a linearly dependent
set of vectors in V . It follows that S is already a basis for V .
Example 5.5.12. Consider the three vectors v1 = (1, 2, 3), v2 = (3, 2, 1) and v3 = (3, 3, 3) in R3 , as
in Examples 5.4.1 and 5.4.3. We showed that these three vectors are linearly dependent, and span the
plane x − 2y + z = 0. Note that
v3 = (3/4) v1 + (3/4) v2 ,
and that v1 and v2 are linearly independent. Consider now the vector v4 = (0, 0, 1). Note that v4 does
not lie on the plane x − 2y + z = 0, so that {v1 , v2 , v4 } forms a linearly independent set. It follows that
{v1 , v2 , v4 } is a basis for R3 .
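The basis claim in Example 5.5.12 can be verified numerically; a sketch with v1 = (1, 2, 3), v2 = (3, 2, 1) and v4 = (0, 0, 1):

```python
import numpy as np

v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([3.0, 2.0, 1.0])
v4 = np.array([0.0, 0.0, 1.0])
M = np.column_stack([v1, v2, v4])

# v4 is off the plane x - 2y + z = 0 spanned by v1 and v2.
assert not np.isclose(np.array([1.0, -2.0, 1.0]) @ v4, 0)
# Three linearly independent vectors in R^3: non-zero determinant, full rank.
assert not np.isclose(np.linalg.det(M), 0)
assert np.linalg.matrix_rank(M) == 3
```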
d2 y/dx2 − 3 dy/dx + y = 0.