Dimension in Linear Algebra
$$\begin{bmatrix} 1 & 0 & 1 \\ 1 & 1 & 0 \\ 0 & 1 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}.$$
The system we must solve may be written as $Ax = y$. It may be dealt with in any of the usual ways. For example, we can augment $A$ with the column $y$ and compute the reduced row echelon form:
$$\mathrm{rref}\left[\begin{array}{ccc|c} 1 & 0 & 1 & y_1 \\ 1 & 1 & 0 & y_2 \\ 0 & 1 & 1 & y_3 \end{array}\right] = \left[\begin{array}{ccc|c} 1 & 0 & 0 & (y_1 + y_2 - y_3)/2 \\ 0 & 1 & 0 & (-y_1 + y_2 + y_3)/2 \\ 0 & 0 & 1 & (y_1 - y_2 + y_3)/2 \end{array}\right].$$
This calculation tells us that $x = [(y_1 + y_2 - y_3)/2,\ (-y_1 + y_2 + y_3)/2,\ (y_1 - y_2 + y_3)/2]$ is the unique solution of the equation $Ax = y$. We conclude that it is possible to express $y$ as a linear combination of $u$, $v$, and $w$, and, moreover,
$$y = \frac{y_1 + y_2 - y_3}{2}\,u + \frac{-y_1 + y_2 + y_3}{2}\,v + \frac{y_1 - y_2 + y_3}{2}\,w$$

is the only way to do so.
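Readers who wish to check this computation mechanically can reproduce it with a computer algebra system. The following sketch uses Python with SymPy (our choice of tool, not part of the text) to row reduce the augmented matrix symbolically.

```python
# Verify Example 1 with SymPy: row reduce [A | y] symbolically (sketch).
from sympy import Matrix, symbols

y1, y2, y3 = symbols('y1 y2 y3')

# The columns of A are u = [1,1,0], v = [0,1,1], w = [1,0,1].
A = Matrix([[1, 0, 1],
            [1, 1, 0],
            [0, 1, 1]])
y = Matrix([y1, y2, y3])

# Reduced row echelon form of the augmented matrix [A | y].
R, pivots = A.row_join(y).rref()
print(R[:, 3])  # (y1 + y2 - y3)/2, (-y1 + y2 + y3)/2, (y1 - y2 + y3)/2
```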
Example 1 shows that the three vectors $u$, $v$, $w$ serve to describe all vectors in $\mathbb{R}^3$ just as well as $i$, $j$, $k$ do. The important thing is that each set of vectors has three elements. We shall see that two vectors cannot be used to describe all elements of $\mathbb{R}^3$. We shall also see that when we use four or more vectors we cannot obtain unique decompositions.
It should be noted that if we want to express each vector in $\mathbb{R}^3$ as a unique linear combination of three fixed vectors $v_1$, $v_2$, and $v_3$ then we cannot choose the three vectors in an arbitrary way. For example, if we set $v_1 = i$, $v_2 = i + j$, and $v_3 = 3i + j$ then we cannot represent $k$ as a linear combination of $v_1$, $v_2$, and $v_3$. In other words, the equation $x_1 v_1 + x_2 v_2 + x_3 v_3 = y$ does not have a solution $x_1, x_2, x_3$ for all $y \in \mathbb{R}^3$. Furthermore, the vector $2i + j$ can be represented as $v_1 + v_2$ and also as $-v_1 + v_3$. In other words, the equation $x_1 v_1 + x_2 v_2 + x_3 v_3 = y$ has more than one solution $x_1, x_2, x_3$ for some $y \in \mathbb{R}^3$.
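Both failures can be confirmed by a rank computation: the matrix whose columns are $v_1$, $v_2$, $v_3$ has rank 2, so its columns do not span $\mathbb{R}^3$, and its null space is nontrivial, so representations are not unique when they exist. A short illustrative sketch in SymPy:

```python
# v1 = i, v2 = i+j, v3 = 3i+j as the columns of a matrix (illustrative check).
from sympy import Matrix

B = Matrix([[1, 1, 3],
            [0, 1, 1],
            [0, 0, 0]])

print(B.rank())       # 2 < 3, so the columns do not span R^3
print(B.nullspace())  # spanned by [-2, -1, 1]^T, so -2*v1 - v2 + v3 = 0
```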
If we widen the scope of our discussion, then the fundamental question that we have before us is this:
Given a vector space $V$, how can we tell if a subset $\{v_1, v_2, \ldots, v_N\}$ of vectors has the property that the equation

$$x_1 v_1 + x_2 v_2 + \cdots + x_N v_N = y \qquad (1)$$

has a unique solution $x_1, x_2, \ldots, x_N$ for every $y \in V$?
For ease of reference we will give a name to a set of vectors with this important property:
Definition Suppose that $V$ is a vector space over a field $F$. Suppose that $\{v_1, v_2, \ldots, v_N\}$ is a subset of $V$. If for every $y \in V$ there are unique scalars $x_1 \in F,\, x_2 \in F,\, \ldots,\, x_N \in F$ such that $x_1 v_1 + x_2 v_2 + \cdots + x_N v_N = y$ then we say that the set $\{v_1, v_2, \ldots, v_N\}$ is a basis of $V$. We say that the $N$-tuple $[v_1, v_2, \ldots, v_N]$ is an ordered basis of $V$.
The plural of basis is bases. Each basis $\{v_1, v_2, \ldots, v_N\}$ of $V$ gives rise to $N!$ ordered bases obtained by rearrangement. For example, the ordered triples $[i, j, k]$, $[i, k, j]$, $[j, i, k]$, $[j, k, i]$, $[k, i, j]$, and $[k, j, i]$ are the $3!$ ordered bases of $\mathbb{R}^3$ that are associated with the basis $\{i, j, k\}$ of $\mathbb{R}^3$.
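The $3! = 6$ orderings can be enumerated mechanically; a minimal sketch in Python:

```python
# Enumerate the 3! ordered bases arising from the basis {i, j, k} (sketch).
from itertools import permutations

for ordered_basis in permutations(['i', 'j', 'k']):
    print(list(ordered_basis))  # all 6 rearrangements
```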
With our new language we can state our fundamental question as follows: How can we tell if a given subset of
a vector space is a basis for it? We will answer this question in the next subsection.
Linear Independence
If $\{v_1, v_2, \ldots, v_N\}$ is a basis of a vector space $V$ then equation (1) has a solution for every $y \in V$. This tells us that the vectors $v_1, v_2, \ldots, v_N$ span $V$. A basis has the further property that for each $y$ there is only one solution to equation (1). The following theorem tells us that the question of uniqueness is determined for all $y$ by the special case $y = \vec{0}$.
Theorem 1 Suppose that $v_1, v_2, \ldots, v_N$ are elements of a vector space $V$ over $F$. The equation $x_1 v_1 + x_2 v_2 + \cdots + x_N v_N = y$ has exactly one solution $x_1, x_2, \ldots, x_N$ for every $y$ in the span of $v_1, v_2, \ldots, v_N$ if and only if the equation $x_1 v_1 + x_2 v_2 + \cdots + x_N v_N = \vec{0}$ has only the trivial solution $x_1 = 0, x_2 = 0, \ldots, x_N = 0$. Otherwise, the equation $x_1 v_1 + x_2 v_2 + \cdots + x_N v_N = y$ has infinitely many solutions $x_1, x_2, \ldots, x_N$ for every $y$ in the span of $v_1, v_2, \ldots, v_N$.
Proof: Since $\vec{0} = 0v_1 + 0v_2 + \cdots + 0v_N$ we see that $\vec{0}$ is in the span of $v_1, v_2, \ldots, v_N$. If the equation $x_1 v_1 + x_2 v_2 + \cdots + x_N v_N = y$ has exactly one solution for every $y$ in the span of $v_1, v_2, \ldots, v_N$ then it obviously has exactly one solution when $y = \vec{0}$.
Conversely, suppose that the equation $x_1 v_1 + x_2 v_2 + \cdots + x_N v_N = \vec{0}$ has only the trivial solution, and suppose that $y = a_1 v_1 + a_2 v_2 + \cdots + a_N v_N$ and $y = b_1 v_1 + b_2 v_2 + \cdots + b_N v_N$ are two representations of a vector $y$ in the span. Subtracting the second from the first gives $(a_1 - b_1) v_1 + (a_2 - b_2) v_2 + \cdots + (a_N - b_N) v_N = \vec{0}$. It follows that $a_1 - b_1 = 0,\ a_2 - b_2 = 0,\ \ldots,\ a_N - b_N = 0$, or $a_1 = b_1,\ a_2 = b_2,\ \ldots,\ a_N = b_N$. In other words, for each $y$ the equation $x_1 v_1 + x_2 v_2 + \cdots + x_N v_N = y$ has exactly one solution.
Finally, suppose that the equation $x_1 v_1 + x_2 v_2 + \cdots + x_N v_N = \vec{0}$ has a nontrivial solution $x_1 = \alpha_1,\ x_2 = \alpha_2,\ \ldots,\ x_N = \alpha_N$. Then we also have $(\lambda\alpha_1) v_1 + (\lambda\alpha_2) v_2 + \cdots + (\lambda\alpha_N) v_N = \vec{0}$ for every $\lambda \in F$. Since different values of $\lambda$ yield different solutions, the homogeneous equation has infinitely many solutions. Now let $y$ be any vector in the span of $v_1, v_2, \ldots, v_N$, say $y = \beta_1 v_1 + \beta_2 v_2 + \cdots + \beta_N v_N$. Then, for each $\lambda$, we obtain a solution of $x_1 v_1 + x_2 v_2 + \cdots + x_N v_N = y$ by setting $x_1 = \beta_1 + \lambda\alpha_1,\ \ldots,\ x_N = \beta_N + \lambda\alpha_N$. Since different values of $\lambda$ give different solutions, this equation, too, has infinitely many solutions.
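A concrete numerical instance may help. Assume, purely for illustration, that $v_1 = [1,0]$, $v_2 = [0,1]$, $v_3 = [1,1]$ in $\mathbb{R}^2$; then $(\alpha_1, \alpha_2, \alpha_3) = (1, 1, -1)$ is a nontrivial homogeneous solution, and adding any multiple of it to one particular solution yields another solution of the same inhomogeneous equation:

```python
# Theorem 1 in action: one nontrivial homogeneous solution generates
# infinitely many solutions of x1*v1 + x2*v2 + x3*v3 = y (sketch).
from sympy import Matrix

v1, v2, v3 = Matrix([1, 0]), Matrix([0, 1]), Matrix([1, 1])
alpha = (1, 1, -1)      # v1 + v2 - v3 = 0, a nontrivial solution
beta = (2, 3, 0)        # one particular solution for y = [2, 3]

for lam in range(3):    # lambda = 0, 1, 2; every lambda works
    x = [b + lam*a for a, b in zip(alpha, beta)]
    print(x, (x[0]*v1 + x[1]*v2 + x[2]*v3).T)  # always y = [2, 3]
```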
Theorem 1 suggests that we have hit upon an important property worthy of a definition:
Definition Suppose that $v_1, v_2, \ldots, v_N$ are vectors in a vector space $V$ over $F$. We say that the vectors $v_1, v_2, \ldots, v_N$ are linearly independent over $F$ if the equation

$$x_1 v_1 + x_2 v_2 + \cdots + x_N v_N = \vec{0}$$

(to be solved for $x_1, x_2, \ldots, x_N$ in $F$) has only the trivial solution $x_1 = x_2 = \cdots = x_N = 0$. Otherwise we say that $v_1, v_2, \ldots, v_N$ are linearly dependent over $F$. In this case there are scalars $\lambda_1, \lambda_2, \ldots, \lambda_N$, not all zero, such that $\lambda_1 v_1 + \lambda_2 v_2 + \cdots + \lambda_N v_N = \vec{0}$.
EXAMPLE 2 Show that the vectors $u = [0, 1, 1]$, $v = [1, 1, 1]$, and $w = [1, 2, 3]$ in $\mathbb{R}^3$ are linearly independent.
Solution: We must determine whether the vector equation

$$x_1 \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix} + x_2 \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} + x_3 \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \qquad (2)$$

has a nontrivial solution. Since the reduced row echelon form of the matrix

$$\begin{bmatrix} 0 & 1 & 1 \\ 1 & 1 & 2 \\ 1 & 1 & 3 \end{bmatrix} \quad\text{is}\quad \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

we see that equation (2) has only the trivial solution. Consequently, $u = [0, 1, 1]$, $v = [1, 1, 1]$, and $w = [1, 2, 3]$ are linearly independent.
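The same conclusion can be reached mechanically: place $u$, $v$, $w$ as the columns of a matrix and check that its reduced row echelon form is the identity (equivalently, that its rank is 3). A sketch in SymPy:

```python
# Example 2 check: u, v, w are independent iff the column matrix has rank 3.
from sympy import Matrix

M = Matrix([[0, 1, 1],
            [1, 1, 2],
            [1, 1, 3]])   # columns are u, v, w

R, pivots = M.rref()
print(R)                  # the 3x3 identity matrix
print(M.rank() == 3)      # True: only the trivial solution exists
```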
We have observed that the vectors in a basis for a vector space span the space and are linearly independent.
Our next theorem asserts that the converse is also true:
Theorem 2 A subset $\{v_1, v_2, \ldots, v_N\}$ of a vector space $V$ over $F$ is a basis of $V$ if and only if it is a spanning set of linearly independent vectors.
Proof: Suppose that $\{v_1, v_2, \ldots, v_N\}$ is a set of linearly independent vectors that span $V$. The assumption that the vectors $v_1, v_2, \ldots, v_N$ span $V$ tells us that the equation $x_1 v_1 + x_2 v_2 + \cdots + x_N v_N = y$ has at least one solution for every $y \in V$. If it had more than one solution for some $y$ then it would follow from the preceding theorem that the equation $x_1 v_1 + x_2 v_2 + \cdots + x_N v_N = \vec{0}$ would have a nontrivial solution, contradicting the linear independence of $v_1, v_2, \ldots, v_N$. Therefore the equation has exactly one solution for each $y \in V$, and $\{v_1, v_2, \ldots, v_N\}$ is a basis of $V$. Conversely, if $\{v_1, v_2, \ldots, v_N\}$ is a basis of $V$ then it certainly spans $V$, and by Theorem 1 the equation $x_1 v_1 + x_2 v_2 + \cdots + x_N v_N = \vec{0}$ has only the trivial solution, so the vectors are linearly independent.

Theorem 3 If $v_1, v_2, \ldots, v_N$ are linearly independent then $v_1 \neq \vec{0},\ v_2 \neq \vec{0},\ \ldots,\ v_N \neq \vec{0}$.
Proof: If $v_i$ were $\vec{0}$ then

$$0 v_1 + 0 v_2 + \cdots + 0 v_{i-1} + 1 v_i + 0 v_{i+1} + \cdots + 0 v_N = \vec{0}$$

would be a nontrivial dependence relation.
Theorem 4 If a set consists of linearly independent vectors then so does every one of its subsets.
Proof: Suppose that $v_1, v_2, \ldots, v_N$ are linearly independent vectors. Suppose that $\{v_{i_1}, v_{i_2}, \ldots, v_{i_m}\} \subseteq \{v_1, v_2, \ldots, v_N\}$. Suppose that $\alpha_{i_1} v_{i_1} + \alpha_{i_2} v_{i_2} + \cdots + \alpha_{i_m} v_{i_m} = \vec{0}$ is a nontrivial dependence relation among the $v_{i_1}, v_{i_2}, \ldots, v_{i_m}$. We will show that this assumption leads to a contradiction. For $1 \le j \le N$ set $\alpha_j = \alpha_{i_k}$ if $j = i_k$ and $\alpha_j = 0$ otherwise. Then

$$\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_N v_N = \alpha_{i_1} v_{i_1} + \alpha_{i_2} v_{i_2} + \cdots + \alpha_{i_m} v_{i_m} = \vec{0}$$

is a nontrivial dependence relation among the vectors $v_1, v_2, \ldots, v_N$, which contradicts the hypothesis that the vectors $v_1, v_2, \ldots, v_N$ are linearly independent.
Theorem 5 A set of vectors is linearly dependent if and only if one of the vectors in the set can be written as a
linear combination of the others. In particular, two vectors are linearly dependent if and only if one is a scalar
multiple of the other.
Proof: If $v_1, v_2, \ldots, v_N$ are linearly dependent vectors then there is a nontrivial dependence relation $\lambda_1 v_1 + \lambda_2 v_2 + \cdots + \lambda_N v_N = \vec{0}$. Choose an index $i$ with $\lambda_i \neq 0$. Solving for $v_i$ gives

$$v_i = -\frac{\lambda_1}{\lambda_i} v_1 - \cdots - \frac{\lambda_{i-1}}{\lambda_i} v_{i-1} - \frac{\lambda_{i+1}}{\lambda_i} v_{i+1} - \cdots - \frac{\lambda_N}{\lambda_i} v_N.$$

Conversely, if one vector is a linear combination of the others, say

$$v_i = \mu_1 v_1 + \cdots + \mu_{i-1} v_{i-1} + \mu_{i+1} v_{i+1} + \cdots + \mu_N v_N,$$

then

$$\mu_1 v_1 + \cdots + \mu_{i-1} v_{i-1} + (-1) v_i + \mu_{i+1} v_{i+1} + \cdots + \mu_N v_N = \vec{0}$$

is a nontrivial dependence relation.
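Theorem 5 also yields a practical procedure: a null-space vector of the matrix whose columns are $v_1, \ldots, v_N$ exhibits one vector as a combination of the others. A sketch with deliberately dependent vectors of our own choosing:

```python
# A null-space vector rewrites one dependent column in terms of the others.
from sympy import Matrix

# Columns v1, v2, v3 with v3 = 2*v1 + v2 built in (illustrative data).
M = Matrix([[1, 0, 2],
            [0, 1, 1],
            [1, 1, 3]])

(n,) = M.nullspace()   # [-2, -1, 1]^T, i.e. -2*v1 - v2 + v3 = 0
print(n.T)
# The coefficient of v3 is nonzero, so v3 = 2*v1 + v2, as constructed.
```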
The Dimension of a Vector Space
On first reading the importance of our next theorem may seem obscure. Yet it is the key to the notion of dimension.
Theorem 6 (Replacement Theorem) Suppose that $V$ is a vector space over $F$ that is generated by $N$ vectors $v_1, v_2, \ldots, v_N$. Suppose that $w_1, w_2, \ldots, w_m$ are $m$ linearly independent vectors, where $1 \le m \le N$. Then $m$ of the vectors $v_1, v_2, \ldots, v_N$ can be replaced with the vectors $w_1, w_2, \ldots, w_m$ so that the resulting set still spans $V$ over $F$.
Proof: We do the replacement one vector at a time. Since $v_1, v_2, \ldots, v_N$ span $V$ we can find scalars $c_1, c_2, \ldots, c_N$ such that

$$w_1 = c_1 v_1 + c_2 v_2 + \cdots + c_N v_N. \qquad (3)$$
Since $w_1$ is a member of a set of linearly independent vectors it cannot be $\vec{0}$. Therefore one of the coefficients $c_1, c_2, \ldots, c_N$ must be nonzero. Suppose that $c_k \neq 0$. Then we may isolate $v_k$ in equation (3), obtaining

$$v_k = -\frac{c_1}{c_k} v_1 - \frac{c_2}{c_k} v_2 - \cdots - \frac{c_{k-1}}{c_k} v_{k-1} + \frac{1}{c_k} w_1 - \frac{c_{k+1}}{c_k} v_{k+1} - \cdots - \frac{c_N}{c_k} v_N. \qquad (4)$$
We claim that $v_1, v_2, \ldots, v_{k-1}, w_1, v_{k+1}, \ldots, v_N$ span $V$. To see why, let $y$ be an arbitrary element of $V$. Then there are scalars $a_1, a_2, \ldots, a_N$ such that

$$y = a_1 v_1 + a_2 v_2 + \cdots + a_{k-1} v_{k-1} + a_k v_k + a_{k+1} v_{k+1} + \cdots + a_N v_N. \qquad (5)$$
If we use equation (4) to substitute for $v_k$ in equation (5) then we obtain, after regrouping,

$$y = \left(a_1 - a_k\frac{c_1}{c_k}\right) v_1 + \left(a_2 - a_k\frac{c_2}{c_k}\right) v_2 + \cdots + \left(a_{k-1} - a_k\frac{c_{k-1}}{c_k}\right) v_{k-1} + \frac{a_k}{c_k} w_1 + \left(a_{k+1} - a_k\frac{c_{k+1}}{c_k}\right) v_{k+1} + \cdots + \left(a_N - a_k\frac{c_N}{c_k}\right) v_N.$$
Since we can write an arbitrary element $y$ of $V$ as a linear combination of the vectors $v_1, v_2, \ldots, v_{k-1}, w_1, v_{k+1}, \ldots, v_N$, we conclude that they span $V$.
Next we will show that we can replace one of $v_1, v_2, \ldots, v_{k-1}, v_{k+1}, \ldots, v_N$ with $w_2$. To simplify the notation, let us rename the indices of the vectors $v_{k+1}, \ldots, v_N$ by shifting each down by one. After this renaming, our set of spanning vectors consists of $v_1, v_2, \ldots, v_{k-1}, v_k, v_{k+1}, \ldots, v_{N-1}, w_1$. We can find scalars $d_1, d_2, \ldots, d_{N-1}, \lambda_1$ such that

$$w_2 = d_1 v_1 + d_2 v_2 + \cdots + d_{N-1} v_{N-1} + \lambda_1 w_1. \qquad (6)$$
There must be at least one index $j$ such that $d_j \neq 0$. Otherwise we would have $w_2 = \lambda_1 w_1$. This equation would result in the nontrivial dependence relation $-\lambda_1 w_1 + w_2 + 0 w_3 + \cdots + 0 w_m = \vec{0}$, contradicting the linear independence of $w_1, w_2, \ldots, w_m$. Since $d_j \neq 0$ we may use equation (6) to solve for $v_j$:
$$v_j = -\frac{d_1}{d_j} v_1 - \frac{d_2}{d_j} v_2 - \cdots - \frac{d_{j-1}}{d_j} v_{j-1} - \frac{d_{j+1}}{d_j} v_{j+1} - \cdots - \frac{d_{N-1}}{d_j} v_{N-1} - \frac{\lambda_1}{d_j} w_1 + \frac{1}{d_j} w_2. \qquad (7)$$
We claim that $v_1, v_2, \ldots, v_{j-1}, v_{j+1}, \ldots, v_{N-1}, w_1, w_2$ span $V$. To see why, let $y$ belong to $V$. Because $v_1, v_2, \ldots, v_{j-1}, v_j, v_{j+1}, \ldots, v_{N-1}, w_1$ span $V$ we can find scalars $b_1, b_2, \ldots, b_{N-1}, b_N$ such that
$$y = b_1 v_1 + b_2 v_2 + \cdots + b_{j-1} v_{j-1} + b_j v_j + b_{j+1} v_{j+1} + \cdots + b_{N-1} v_{N-1} + b_N w_1. \qquad (8)$$
If we use equation (7) to substitute for $v_j$ in equation (8) the result is that the right side of equation (8) becomes a linear combination of the vectors $v_1, v_2, \ldots, v_{j-1}, v_{j+1}, \ldots, v_{N-1}, w_1, w_2$. Since we can write an arbitrary element $y$ of $V$ as a linear combination of these vectors, we conclude that they span $V$.
The process that we have been carrying out can be continued until each of $w_1, w_2, \ldots, w_m$ has been used to replace one of the original spanning vectors.
EXAMPLE 3 The vectors $v_1 = i$, $v_2 = j$, $v_3 = k$ span $\mathbb{R}^3$. The vectors $w_1 = 2i - 3j$, $w_2 = 4i - 6j + 5k$ are linearly independent. Apply the Replacement Theorem explicitly, replacing two of $v_1, v_2, v_3$ with $w_1, w_2$ so that the span remains all of $\mathbb{R}^3$.
Solution: We have $w_1 = 2v_1 + (-3)v_2 + 0v_3$. Because the coefficients of $v_1$ and $v_2$ are nonzero, we may replace either one of them with $w_1$. Suppose that we replace $v_2$, so that the new spanning set is $v_1, w_1, v_3$. Next, we express $w_2$ in terms of these spanning vectors: $w_2 = 0v_1 + 2w_1 + 5v_3$. We can use $w_2$ to replace any $v_i$ with a nonzero coefficient. We have no choice but to replace $v_3$ with $w_2$. Our final set of spanning vectors is $v_1, w_1, w_2$. As a check, let $y = [y_1, y_2, y_3]$ be an arbitrary vector in $\mathbb{R}^3$. By row reducing the augmented matrix $[v_1, w_1, w_2 \mid y]$ (where each vector is written as a column vector) we find
$$\mathrm{rref}\left[\begin{array}{ccc|c} 1 & 2 & 4 & y_1 \\ 0 & -3 & -6 & y_2 \\ 0 & 0 & 5 & y_3 \end{array}\right] = \left[\begin{array}{ccc|c} 1 & 0 & 0 & y_1 + \frac{2}{3} y_2 \\ 0 & 1 & 0 & -\frac{1}{3} y_2 - \frac{2}{5} y_3 \\ 0 & 0 & 1 & \frac{1}{5} y_3 \end{array}\right],$$
which tells us that we have the unique representation

$$y = \left(y_1 + \frac{2}{3} y_2\right) v_1 + \left(-\frac{1}{3} y_2 - \frac{2}{5} y_3\right) w_1 + \frac{1}{5} y_3\, w_2.$$
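The replacement procedure can also be mechanized. The sketch below is ours (the helper `replace_one` is hypothetical, not from the text): it expresses $w$ in the current spanning set, picks a replaceable vector with a nonzero coefficient, and swaps it out. Note that it may choose a different, equally valid, pivot than the worked example above.

```python
# One step of the Replacement Theorem, applied to Example 3 (sketch).
from sympy import Matrix

def replace_one(spanning, w, replaceable):
    """Swap one vector of `spanning` (at an index in `replaceable`) for w,
    keeping the span unchanged."""
    coeffs = Matrix.hstack(*spanning).solve(Matrix(w))  # w in current span
    k = next(i for i in sorted(replaceable) if coeffs[i] != 0)
    out = list(spanning)
    out[k] = Matrix(w)
    return out, k

v = [Matrix([1, 0, 0]), Matrix([0, 1, 0]), Matrix([0, 0, 1])]  # i, j, k
S, k1 = replace_one(v, [2, -3, 0], {0, 1, 2})
S, k2 = replace_one(S, [4, -6, 5], {0, 1, 2} - {k1})
print(k1, k2, [s.T for s in S])  # the final three columns still span R^3
```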
We will now apply the Replacement Theorem to prove that a set of linearly independent vectors cannot have
more elements than a spanning set.
Theorem 7 Suppose that $V$ is a vector space over $F$ that

1. is spanned by a set $\{v_1, v_2, \ldots, v_N\}$ of $N$ vectors, and

2. contains a set $\{w_1, w_2, \ldots, w_m\}$ of $m$ linearly independent vectors.

Then $1 \le m \le N$.
Proof: Suppose that $m \ge N + 1$. Replace all $N$ vectors $v_1, v_2, \ldots, v_N$ with $w_1, w_2, \ldots, w_N$, obtaining the set $\{w_1, w_2, \ldots, w_N\}$ of spanning vectors for $V$. This tells us that we can express $w_{N+1}$ as a linear combination of $w_1, w_2, \ldots, w_N$. As a consequence, Theorem 5 tells us that the vectors $w_1, w_2, \ldots, w_N, w_{N+1}$ are linearly dependent. That is, for some coefficients $c_1, \ldots, c_{N+1}$, not all zero, we have

$$c_1 w_1 + c_2 w_2 + \cdots + c_N w_N + c_{N+1} w_{N+1} = \vec{0}.$$
But then

$$c_1 w_1 + c_2 w_2 + \cdots + c_N w_N + c_{N+1} w_{N+1} + 0 w_{N+2} + \cdots + 0 w_m = \vec{0}$$

is a nontrivial dependence relation among the linearly independent vectors $w_1, w_2, \ldots, w_m$, which is a contradiction.
Theorem 8 If $\{v_1, v_2, \ldots, v_N\}$ and $\{w_1, w_2, \ldots, w_m\}$ are bases of $V$ over $F$, then $N = m$.
Proof: Since $v_1, v_2, \ldots, v_N$ span $V$ over $F$ and the vectors $w_1, w_2, \ldots, w_m$ are linearly independent, we use the preceding theorem to conclude that $m \le N$. Likewise, $w_1, w_2, \ldots, w_m$ span $V$ over $F$ and the vectors $v_1, v_2, \ldots, v_N$ are linearly independent. Therefore $N \le m$. Since $m \le N \le m$, neither inequality can be strict, and $N = m$.
Our last theorem tells us that any two bases of a finitely-generated vector space have the same number of elements. This is the number intrinsic to a vector space that we will use to define dimension. Of course, in order to make such a definition we must know that bases actually exist.
Theorem 9 Every finitely-generated vector space has a basis.
Proof: Suppose that $\{v_1, v_2, \ldots, v_N\}$ is a spanning set for $V$. Let $w_1$ be a nonzero vector. If $V = Fw_1$ then $\{w_1\}$ is a spanning set and $w_1$ is independent. Therefore $\{w_1\}$ is a basis of $V$. Otherwise, $V_1 = Fw_1$ is a proper subset of $V$ and there is a nonzero element $w_2$ in $V$ but not in $V_1$. Notice that the vectors $w_1, w_2$ are linearly independent since neither can be a multiple of the other. Let $V_2$ be the space spanned by $w_1, w_2$. If $V_2 = V$ then $\{w_1, w_2\}$ is a set of linearly independent spanning vectors. Hence, $\{w_1, w_2\}$ is a basis of $V$. Otherwise, $V_2$ is a proper subspace of $V$ and there is a nonzero vector $w_3$ in $V$ that is not in $V_2$. We claim that there can be no nontrivial dependence relation

$$\lambda_1 w_1 + \lambda_2 w_2 + \lambda_3 w_3 = \vec{0} \qquad (9)$$

among $w_1, w_2, w_3$. That is because in such a relation we would have either $\lambda_3 = 0$ or $\lambda_3 \neq 0$, each of which leads to a contradiction. In the first case, the case with $\lambda_3 = 0$, equation (9) simplifies to $\lambda_1 w_1 + \lambda_2 w_2 = \vec{0}$. Since at least one of the coefficients $\lambda_1$ or $\lambda_2$ is nonzero, this equation contradicts the linear independence of $w_1, w_2$. In the second case, the case with $\lambda_3 \neq 0$, equation (9) leads to $w_3 = -(\lambda_1/\lambda_3) w_1 - (\lambda_2/\lambda_3) w_2$. This equation contradicts the fact that $w_3$ is not in the span of $w_1, w_2$. We conclude that $w_1, w_2, w_3$ are linearly independent. We continue the process, obtaining increasing sets $\{w_1\}, \{w_1, w_2\}, \{w_1, w_2, w_3\}, \ldots, \{w_1, w_2, \ldots, w_p\}$ of linearly independent vectors. As long as the subspace spanned by $\{w_1, w_2, \ldots, w_p\}$ is a proper subset of $V$ the process can be continued. But we know that we cannot have more than $N$ linearly independent vectors, so the process must end with some set $\{w_1, w_2, \ldots, w_p\}$ of linearly independent vectors that span $V$. As we have seen, such a set is a basis of $V$.
Definition The dimension of a finitely-generated vector space $V$ over $F$ is the number of elements in a basis of $V$. We will denote this number by $\dim_F(V)$.
EXAMPLE 4 What are the dimensions of the following vector spaces:

1. $\mathbb{C}$ as a vector space over $\mathbb{C}$.

2. $\mathbb{C}$ as a vector space over $\mathbb{R}$.

3. $V = \{A \in M_{3,3}(\mathbb{R}) : A^t = A\}$ over $\mathbb{R}$.
Solution: Every element $z \in \mathbb{C}$ can be written as $z = z \cdot 1$. Here the $z$ on the left is an element of the vector space $\mathbb{C}$, the $z$ on the right is an element of the field $\mathbb{C}$ of scalars, and the $1$ on the right is a vector in $\mathbb{C}$. We see that $\{1\}$ is a basis for $\mathbb{C}$ as a vector space over $\mathbb{C}$. The dimension of $\mathbb{C}$ over $\mathbb{C}$ is 1.

The vectors $1 \in \mathbb{C}$ and $i \in \mathbb{C}$ are linearly independent over $\mathbb{R}$. Every vector $z \in \mathbb{C}$ can be written as the linear combination $z = x \cdot 1 + y \cdot i$ with $x, y \in \mathbb{R}$. Therefore $\{1, i\}$ is a basis of $\mathbb{C}$ over $\mathbb{R}$. The dimension of $\mathbb{C}$ over $\mathbb{R}$ is 2.
The matrices $v_1, v_2, v_3, v_4, v_5, v_6$, in order,

$$\begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix},\ \begin{bmatrix} 0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{bmatrix},\ \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix},\ \begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix},\ \begin{bmatrix} 0 & 0 & 1 \\ 0 & 0 & 0 \\ 1 & 0 & 0 \end{bmatrix},\ \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{bmatrix},$$

form a basis of $V$: the linear combination

$$a v_1 + b v_2 + c v_3 + d v_4 + e v_5 + f v_6 = \begin{bmatrix} a & d & e \\ d & b & f \\ e & f & c \end{bmatrix}$$

can only be $0$ if $a = b = c = d = e = f = 0$, and every $3 \times 3$ symmetric real matrix has this form. The dimension of $V$ over $\mathbb{R}$ is therefore 6.
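The count can be confirmed by flattening each matrix into a vector of length 9 and computing a rank. A sketch (the helper `sym_unit` is ours):

```python
# Verify dim = 6 for 3x3 real symmetric matrices by flattening a basis.
from sympy import Matrix, zeros

def sym_unit(i, j):
    """Symmetric matrix with 1 in positions (i, j) and (j, i), 0 elsewhere."""
    E = zeros(3, 3)
    E[i, j] = 1
    E[j, i] = 1
    return E

basis = [sym_unit(i, j) for i in range(3) for j in range(i, 3)]
columns = [Matrix(9, 1, list(E)) for E in basis]   # flatten each matrix
print(len(basis), Matrix.hstack(*columns).rank())  # 6 6: independent
```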
There are two requirements for a set of vectors $\{v_1, v_2, \ldots, v_d\}$ to be a basis for a vector space $V$: the vectors $v_1, v_2, \ldots, v_d$ must span $V$ and they must be linearly independent. The next theorem tells us that if we know that the dimension of $V$ is $d$, the number of elements in the set $\{v_1, v_2, \ldots, v_d\}$, then we need only verify one of the two conditions.
Theorem 10 Suppose that $\{w_1, w_2, \ldots, w_d\}$ is a subset of a $d$-dimensional vector space $V$.

(i) If $w_1, w_2, \ldots, w_d$ span $V$ then $\{w_1, w_2, \ldots, w_d\}$ is a basis.

(ii) If $w_1, w_2, \ldots, w_d$ are linearly independent vectors then $\{w_1, w_2, \ldots, w_d\}$ is a basis.
Proof: Suppose that the vectors $w_1, w_2, \ldots, w_d$ span $V$. We will show that we obtain a contradiction if these vectors are linearly dependent. For then, by Theorem 5, we would have a dependence relation

$$w_k = \lambda_1 w_1 + \lambda_2 w_2 + \cdots + \lambda_{k-1} w_{k-1} + \lambda_{k+1} w_{k+1} + \cdots + \lambda_d w_d.$$
For any vector $y$ we can find scalars $\mu_1, \ldots, \mu_d$ such that

$$y = \mu_1 w_1 + \mu_2 w_2 + \cdots + \mu_{k-1} w_{k-1} + \mu_k w_k + \mu_{k+1} w_{k+1} + \cdots + \mu_d w_d$$

or

$$y = (\mu_1 + \mu_k \lambda_1) w_1 + (\mu_2 + \mu_k \lambda_2) w_2 + \cdots + (\mu_{k-1} + \mu_k \lambda_{k-1}) w_{k-1} + (\mu_{k+1} + \mu_k \lambda_{k+1}) w_{k+1} + \cdots + (\mu_d + \mu_k \lambda_d) w_d.$$
This shows that $\{w_1, w_2, \ldots, w_{k-1}, w_{k+1}, \ldots, w_d\}$ spans $V$. But then $V$ would be spanned by $d - 1$ vectors, and by Theorem 7 it could not contain $d$ linearly independent vectors. This is a contradiction, since a basis of $V$ consists of $d$ linearly independent vectors. Therefore the vectors $w_1, w_2, \ldots, w_d$ must be linearly independent. Because they also span $V$ they constitute a basis. This proves part (i).
Next, suppose that $w_1, w_2, \ldots, w_d$ are linearly independent vectors. Because it is $d$-dimensional, the space $V$ has a basis $\{v_1, v_2, \ldots, v_d\}$. In particular, the vectors $v_1, v_2, \ldots, v_d$ span $V$. According to the Replacement Theorem we can replace $v_1, v_2, \ldots, v_d$ with $w_1, w_2, \ldots, w_d$ so that the resulting set $\{w_1, w_2, \ldots, w_d\}$ still spans $V$. But then $\{w_1, w_2, \ldots, w_d\}$ is a spanning set of linearly independent vectors. It is therefore a basis of $V$.
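For the concrete space $F^d$, Theorem 10 justifies the standard computational test: put the $d$ candidate vectors into a square matrix and check a single condition, rank $d$ (equivalently, nonzero determinant); spanning and independence then come together for free. A sketch, reusing the vectors of Example 2:

```python
# Basis test in R^3 via Theorem 10: one condition suffices (sketch).
from sympy import Matrix

C = Matrix([[0, 1, 1],
            [1, 1, 2],
            [1, 1, 3]])   # columns are the candidate basis vectors

print(C.rank() == 3)      # True: independent, hence a basis of R^3
print(C.det() != 0)       # the equivalent single determinant check
```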
Theorem 11 Suppose that $V$ is a $d$-dimensional vector space. Then:

(i) Any subset $\{w_1, w_2, \ldots, w_m\}$ of $V$ with more than $d$ elements is a set of linearly dependent vectors.

(ii) Any subset $\{w_1, w_2, \ldots, w_m\}$ of $V$ with fewer than $d$ elements cannot be a spanning set.
Proof: We know that $V$ has a basis, which by definition consists of $d$ vectors. Therefore $V$ has a spanning set with $d$ vectors; consequently it cannot have a set of linearly independent vectors with more than $d$ elements. This proves part (i).
Suppose that $\{w_1, w_2, \ldots, w_m\}$ is a subset of $V$ with fewer than $d$ elements. If $\{w_1, w_2, \ldots, w_m\}$ did span $V$ then $V$ could not have a set of linearly independent vectors with more than $m$ elements. But $V$ has just such a set: a basis with $d$ elements. We conclude that $\{w_1, w_2, \ldots, w_m\}$ cannot span $V$.
Our next theorem tells us that any set of linearly independent vectors can be extended to a basis.
Theorem 12 Let $\{w_1, \ldots, w_m\}$ be a set of linearly independent vectors in a $d$-dimensional vector space $V$. Then there are vectors $v_{m+1}, \ldots, v_d$ such that $\{w_1, \ldots, w_m, v_{m+1}, \ldots, v_d\}$ is a basis of $V$.
Proof: We have shown that $V$ has a basis $\{v_1, \ldots, v_d\}$. According to the Replacement Theorem we can replace $m$ of the vectors in the basis with the vectors $w_1, \ldots, w_m$ and rename the remaining $d - m$ vectors $v_{m+1}, \ldots, v_d$ so that $\{w_1, \ldots, w_m, v_{m+1}, \ldots, v_d\}$ spans $V$. But a spanning set of $d$ elements in a $d$-dimensional vector space is a basis.
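Theorem 12 also suggests an algorithm for the concrete space $F^d$: starting from the independent set, greedily append standard basis vectors, keeping only those that increase the rank. A sketch (the helper name `extend_to_basis` is ours):

```python
# Extend an independent set in R^3 to a basis (illustrative sketch).
from sympy import Matrix, eye

def extend_to_basis(vectors, d):
    """Append standard basis vectors of F^d that raise the rank."""
    basis = list(vectors)
    for k in range(d):
        e_k = eye(d)[:, k]
        if Matrix.hstack(*basis, e_k).rank() > len(basis):
            basis.append(e_k)
    return basis

w1 = Matrix([2, -3, 0])   # an independent set with one element
print([b.T for b in extend_to_basis([w1], 3)])  # w1 plus two standard vectors
```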
Theorem 13 If $W$ is a subspace of a vector space $V$ over $F$, then $\dim_F(W) \le \dim_F(V)$.
Proof: Let $\{w_1, \ldots, w_\ell\}$ be a basis of $W$, where $\ell = \dim_F(W)$. These vectors are linearly independent in $V$. Therefore, by Theorem 12, $\{w_1, \ldots, w_\ell\}$ can be extended to a basis $\{w_1, \ldots, w_\ell, v_{\ell+1}, \ldots, v_d\}$ of $V$. Therefore $d \ge \ell$.