Vector Spaces and Bases
In this course, we have proceeded step-by-step through low-dimensional Linear Algebra. We have
looked at lines, planes, hyperplanes, and have seen that there is no limit to this hierarchy of structures.
We have indexed these objects by their dimension in an ad hoc way. It is time to unify these structures
and ideas; this chapter gives a brief introduction to this more abstract viewpoint. Paradoxically, this
abstraction also makes Linear Algebra more applicable to other areas of Mathematics and Science.
In this higher viewpoint, Linear Algebra is the study of vector spaces and linear mappings between
vector spaces.
Definition of a vector space: A vector space is a set V of objects v, called vectors, which can be added and multiplied by scalars t ∈ R, subject to the rules:

u + v = v + u,
t(u + v) = tu + tv,
(u + v) + w = u + (v + w),
(t + s)v = tv + sv,
t(sv) = (ts)v,

these holding for all u, v, w ∈ V and s, t ∈ R. There must also be a vector 0 ∈ V with the properties that

0 + v = v,   0v = 0

for all v ∈ V.
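These axioms can be sanity-checked for R^3 with componentwise operations. A minimal sketch in Python (the sample vectors and scalars below are arbitrary choices, not from the text):

```python
# Componentwise addition and scalar multiplication on R^3, using plain tuples.
def add(u, v):
    return tuple(a + b for a, b in zip(u, v))

def smul(t, v):
    return tuple(t * a for a in v)

u, v, w = (1.0, 2.0, 3.0), (-4.0, 0.5, 2.0), (0.0, 7.0, -1.0)
s, t = 2.5, -3.0
zero = (0.0, 0.0, 0.0)

assert add(u, v) == add(v, u)                              # u + v = v + u
assert smul(t, add(u, v)) == add(smul(t, u), smul(t, v))   # t(u + v) = tu + tv
assert add(add(u, v), w) == add(u, add(v, w))              # (u + v) + w = u + (v + w)
assert smul(t + s, v) == add(smul(t, v), smul(s, v))       # (t + s)v = tv + sv
assert smul(t, smul(s, v)) == smul(t * s, v)               # t(sv) = (ts)v
assert add(zero, v) == v and smul(0, v) == zero            # 0 + v = v, 0v = 0
```

Of course, passing for particular sample values is evidence, not a proof; the axioms hold for R^n because each one reduces to the corresponding rule for real numbers in each coordinate.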
Examples of vector spaces:
R^n is a vector space. Any line, plane, hyperplane, ... through the origin in R^n is a vector space. Even the set consisting of the single vector {0} is a vector space, called the trivial vector space. In fact, any subset of R^n that is closed under addition and scalar multiplication is a vector space; such a vector space is called a subspace of R^n.
For n ≥ 0, the set Pn of all polynomials c0 + c1 x + ⋯ + cn x^n with real coefficients is a vector space. Here the vectors are polynomials.
The set V of real-valued functions f(x) satisfying the differential equation f'' + f = 0 is a vector space: if f and g are solutions, then so are f + g and cf, where c is a scalar. Note that V is a subspace of the giant vector space C^∞(R), consisting of all infinitely differentiable functions f : R → R. In fact V = E(−1) is an eigenspace for the linear map

d^2/dx^2 : C^∞(R) → C^∞(R).
Any nonzero vector space V contains infinitely many vectors. To write them all down, we want to have
something like the standard basis e1, . . . , en of R^n. Here, we have

(x1, x2, . . . , xn) = x1 e1 + x2 e2 + ⋯ + xn en.    (1)
Proposition 0.1 A subset S ⊂ V is linearly dependent if and only if there exist vectors v1, . . . , vk in S and nonzero scalars c1, . . . , ck in R such that

c1 v1 + c2 v2 + ⋯ + ck vk = 0.
Proof: Suppose there exist vectors vi in S and nonzero scalars ci such that c1 v1 + ⋯ + ck vk = 0. Then

v1 = −(c2/c1) v2 − ⋯ − (ck/c1) vk,

so v1 is a linear combination of the other vectors v2, . . . , vk and therefore S is linearly dependent.
Conversely, if S is linearly dependent, then we can write some v in S as a linear combination of other vectors v1, . . . , vk in S:

v = c1 v1 + ⋯ + ck vk,

where the ci are nonzero scalars (any terms with zero coefficient may simply be dropped). Then

v − c1 v1 − ⋯ − ck vk = 0

with all coefficients nonzero.
Example: The three vectors

u = (1, 2, 3),
v = (4, 5, 6),
w = (7, 8, 9)

are linearly dependent, since u − 2v + w = 0.
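The dependence of these three vectors can be confirmed numerically; a small check, assuming numpy is available:

```python
import numpy as np

u = np.array([1, 2, 3])
v = np.array([4, 5, 6])
w = np.array([7, 8, 9])

# The matrix with rows u, v, w has rank 2 < 3, so the set is dependent.
A = np.vstack([u, v, w])
assert np.linalg.matrix_rank(A) == 2

# An explicit dependence with nonzero coefficients, as in Proposition 0.1:
assert np.array_equal(u - 2 * v + w, np.zeros(3))
```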
To check that a given subset {v1, . . . , vn} ⊂ V is a basis of V, you have to check items (i) and (ii). That is,

(i) To show spanning: Take an arbitrary vector v ∈ V, and show that there are scalars c1, . . . , cn such that

v = c1 v1 + ⋯ + cn vn.

(ii) To show linear independence: Suppose you have an equation of the form

c1 v1 + ⋯ + cn vn = 0,

and show this implies c1 = c2 = ⋯ = cn = 0.
Example 1: Let V be the vector space of vectors (x, y, z, w) in R^4 satisfying x + y + z + w = 0, and let

v1 = e1 − e2,
v2 = e2 − e3,
v3 = e3 − e4.

These span V: an arbitrary (x, y, z, w) ∈ V equals x v1 + (x + y) v2 + (x + y + z) v3, since w = −x − y − z. For independence, setting c1 v1 + c2 v2 + c3 v3 = 0 and comparing components gives the system

c1 = 0,
c2 − c1 = 0,
c3 − c2 = 0,
−c3 = 0.
The only solution to these equations is c1 = c2 = c3 = 0. This shows that {v1, v2, v3} is linearly independent. We have now shown that {v1, v2, v3} is a basis of V. This means that every vector v ∈ V can be uniquely written as

v = c1 v1 + c2 v2 + c3 v3 = (c1, c2 − c1, c3 − c2, −c3).
Example 2: Let us now take the vector space W of vectors (x, y, z, w, t) in R^5 satisfying the same equation x + y + z + w = 0. Then the set {v1, v2, v3} of Example 1 is no longer a basis of W: the vector e5 is in W, but e5 is not in the span of {v1, v2, v3}. By the method of Example 1 one can check that the enlarged set {e1 − e2, e2 − e3, e3 − e4, e5} is a basis of W.
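This claim can be checked numerically as well. A sketch assuming numpy, encoding the four candidate basis vectors as rows:

```python
import numpy as np

# Candidate basis of W = {(x, y, z, w, t) in R^5 : x + y + z + w = 0}.
vecs = np.array([
    [1, -1, 0, 0, 0],   # e1 - e2
    [0, 1, -1, 0, 0],   # e2 - e3
    [0, 0, 1, -1, 0],   # e3 - e4
    [0, 0, 0, 0, 1],    # e5
])

# Each candidate lies in W: its first four coordinates sum to zero.
assert all(row[:4].sum() == 0 for row in vecs)

# The four vectors are independent: the 4x5 matrix has rank 4.  Since W is
# cut out by a single linear equation in R^5, dim W = 4, so an independent
# set of four vectors in W is automatically a basis.
assert np.linalg.matrix_rank(vecs) == 4
```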
Example 3: We have seen that the vector space Pn of polynomials of degree at most n has the basis {1, x, x^2, x^3, . . . , x^n}. This may be the most obvious basis of Pn, but for many purposes it is not the best one. For numerical integration, one prefers to use instead the basis

{P0, P1, P2, . . . , Pn}
consisting of Legendre Polynomials. These are the unique polynomials satisfying the conditions

deg Pk = k,   Pk(1) = 1,   ∫_{−1}^{1} Pk Pℓ dx = 0 if k ≠ ℓ.

The first few are

P0 = 1,
P1 = x,
P2 = (1/2)(3x^2 − 1),
P3 = (1/2)(5x^3 − 3x),    (2)

and in general

Pk = (1/(2 · 4 ⋯ (2k))) d^k/dx^k [(x^2 − 1)^k].
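The normalization and orthogonality conditions can be verified numerically; a sketch assuming numpy, whose `numpy.polynomial.Legendre` class represents series in the Legendre basis:

```python
import numpy as np
from numpy.polynomial import Legendre

# Legendre.basis(k) is the degree-k Legendre polynomial P_k.
P = [Legendre.basis(k) for k in range(4)]

# Normalization: P_k(1) = 1 for every k.
assert all(abs(p(1.0) - 1.0) < 1e-12 for p in P)

# Orthogonality: the integral of P_k * P_l over [-1, 1] vanishes for k != l.
for k in range(4):
    for l in range(k + 1, 4):
        antideriv = (P[k] * P[l]).integ()      # antiderivative of the product
        integral = antideriv(1.0) - antideriv(-1.0)
        assert abs(integral) < 1e-12
```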
Example 4: Let V be the vector space of solutions of the differential equation f'' + af' + bf = 0, where a and b are constants. Let α and β be the roots of the polynomial x^2 + ax + b, and assume for simplicity that α ≠ β. I claim that {e^{αx}, e^{βx}} is a basis of V.
Let D : C^∞(R) → C^∞(R) be the linear map of the vector space of infinitely differentiable functions given by Df = f'. The eigenvectors of D with eigenvalue α are the solutions of the equation Df = αf, namely f = k e^{αx}, where k is a constant. In other words, e^{αx} spans ker(D − α). Since x^2 + ax + b = (x − α)(x − β) we have

f'' + af' + bf = (D^2 + aD + b)f = (D − α)(D − β)f = (D − β)(D − α)f.

If f'' + af' + bf = 0 then (D − β)f ∈ ker(D − α) and (D − α)f ∈ ker(D − β), so
(D − β)f = k1 e^{αx}   and   (D − α)f = k2 e^{βx},
for some constants k1 and k2. Solving these two equations for Df, we get

(α − β)Df = αk1 e^{αx} − βk2 e^{βx}.

Integrating both sides, we get

f = (1/(α − β)) (k1 e^{αx} − k2 e^{βx}).
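The claim that e^{αx} and e^{βx} solve the equation can be illustrated numerically. A sketch assuming numpy; the coefficients a = −5, b = 6 (roots α = 2, β = 3) are sample values chosen for the illustration, not from the text:

```python
import numpy as np

# Illustrative instance of f'' + a f' + b f = 0: take a = -5, b = 6,
# so x^2 + a x + b = (x - 2)(x - 3) has distinct roots alpha = 2, beta = 3.
a, b = -5.0, 6.0
alpha, beta = 2.0, 3.0

xs = np.linspace(-1.0, 1.0, 101)
for r in (alpha, beta):
    f = np.exp(r * xs)        # candidate basis vector e^{r x}
    fp = r * f                # f'  from the exponential rule
    fpp = r * r * f           # f''
    # The residual is (r^2 + a r + b) e^{r x}, which vanishes because
    # r is a root of x^2 + a x + b.
    assert np.allclose(fpp + a * fp + b * f, 0.0)
```

Since neither exponential is a scalar multiple of the other, the pair is independent, matching the basis claim above.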
Exercise 20.2 Let V be the vector space of solutions of the differential equation f'' − 2f' + f = 0.
(a) Show that the functions e^x and xe^x belong to V.
(b) Show that the functions e^x and xe^x are linearly independent.
Exercise 20.3 Let Pn be the vector space of polynomials of degree at most n. Let p0 , p1 , . . . , pn be
polynomials with deg(pi ) = i. Show that {p0 , p1 , . . . , pn } is a basis of Pn .
Exercise 20.4 On Pn, we have an analogue of the dot product, given by

⟨p, q⟩ = ∫_{−1}^{1} p(x) q(x) dx.
We say p and q are orthogonal if ⟨p, q⟩ = 0. In this problem you may use without proof the fact that distinct Legendre polynomials Pk and Pℓ are orthogonal: ⟨Pk, Pℓ⟩ = 0 if k ≠ ℓ.
(a) Suppose f ∈ Pn is orthogonal to each Legendre polynomial Pk, for all 0 ≤ k ≤ n. Show that f = 0.
(b) We know that any f ∈ Pn may be uniquely expressed as a linear combination of the Pk. Show that this unique linear combination is given by

f(x) = Σ_{k=0}^{n} (⟨f, Pk⟩ / ⟨Pk, Pk⟩) Pk(x).
[Hint: Let g be the difference of the two sides and show that ⟨g, Pk⟩ = 0 for all 0 ≤ k ≤ n. Then invoke part (a).]
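The formula in part (b) can be tried out numerically before proving it. A sketch assuming numpy; the sample polynomial f(x) = x + 2x^3 is an arbitrary illustrative choice:

```python
import numpy as np
from numpy.polynomial import Legendre, Polynomial

# Sample polynomial to expand in the Legendre basis: f(x) = x + 2x^3.
f = Polynomial([0.0, 1.0, 0.0, 2.0])
n = 3

# <p, q> = integral of p(x) q(x) over [-1, 1], computed exactly via the
# antiderivative of the product polynomial.
def inner(p, q):
    antideriv = (p * q).integ()
    return antideriv(1.0) - antideriv(-1.0)

# c_k = <f, P_k> / <P_k, P_k>, as in the formula of part (b).
coeffs = []
for k in range(n + 1):
    Pk = Legendre.basis(k).convert(kind=Polynomial)
    coeffs.append(inner(f, Pk) / inner(Pk, Pk))

# Reassemble the sum of c_k * P_k and compare with f on sample points.
xs = np.linspace(-1.0, 1.0, 9)
approx = sum(c * Legendre.basis(k)(xs) for k, c in enumerate(coeffs))
assert np.allclose(approx, f(xs))
```

Here one finds f = 2.2 P1 + 0.8 P3, consistent with the identity x^3 = (2 P3 + 3 P1)/5 read off from equation (2).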
Exercise 20.5 Let v1, . . . , vk be nonzero vectors in R^n which are orthogonal with respect to the dot product. Prove that {v1, . . . , vk} is linearly independent.