
Lecture 3

Parametrisations, Solution sets, Linear dependence

homogeneous systems, linear dependence Lecture 3 1 / 30


Before we begin...
An extra exercise for people who would like to work ahead a bit:
Definition
Let k, m, n ∈ N, let A be an m × n matrix and let B be an n × k matrix
with column vectors b1 ,...,bk . We then define the matrix product AB as
the matrix with Abj as its jth column vector, where 1 ≤ j ≤ k.



Before we begin...
Let k, m, n ∈ N, let A be an m × n matrix, B be an n × k matrix and
consider the systems Ax = b and By = x, where x ∈ Rn , y ∈ Rk and
b ∈ Rm .
If b is given, we can solve Ax = b for x and then By = x for y. But if y is the vector we want all along, we can also rewrite Ax = b as A(By ) = b, or (AB)y = b. We would have to prove that (AB)y = A(By ), of course. We will do this when we go deeper into working with matrices, but it is something you can already try to prove yourself with what you have seen so far.
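The identity (AB)y = A(By) can at least be checked numerically. A small sketch (the example data is mine, and NumPy is an assumption, not part of the lecture) that builds AB column by column, exactly as in the definition above:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 4, 3, 2
A = rng.integers(-5, 5, size=(m, n)).astype(float)
B = rng.integers(-5, 5, size=(n, k)).astype(float)

# Matrix product via the definition: the j-th column of AB is A @ b_j.
AB = np.column_stack([A @ B[:, j] for j in range(k)])
assert np.allclose(AB, A @ B)  # agrees with NumPy's built-in product

# The associativity claim from the slide: (AB)y = A(By).
y = rng.integers(-5, 5, size=k).astype(float)
assert np.allclose(A @ (B @ y), AB @ y)
```

Of course a numerical check is not a proof; the exercise is to prove the identity from the definition.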



Recap
Recall that a linear system can only have:
no solutions,
exactly one solution,
infinitely many solutions.
Suppose we have a linear system with more variables than equations.
Suppose we do row reduction on the augmented matrix of this system.
If, at the end of the procedure, we obtain a row of the form

( 0 0 0 · · · 0 | □ )

where □ is a number different from zero, then, as we know, the system is inconsistent (it has no solutions).
Assume we do not get such a row. Then we necessarily have at least one free variable, which implies that the system has infinitely many solutions.
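As a concrete illustration (the system below is my own example, not from the slides): a consistent system of 2 equations in 3 unknowns has a free variable, and every choice of the free variable gives a solution.

```python
import numpy as np

# 2 equations, 3 unknowns; the system is consistent.
A = np.array([[1.0, 0.0, -1.0],
              [0.0, 1.0,  2.0]])
b = np.array([2.0, 3.0])

# x3 = t is free: every x(t) = (2 + t, 3 - 2t, t) solves the system.
def x(t):
    return np.array([2.0 + t, 3.0 - 2.0 * t, t])

for t in (0.0, 1.0, -7.5):
    assert np.allclose(A @ x(t), b)  # infinitely many solutions
```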



Observation
Therefore, if a system of linear equations has more variables than equations, then there are only two possibilities: it has either
no solutions, or
infinitely many solutions.

In the other cases, namely more equations than variables or the same
number of equations and variables, all three cases are possible.



Parametric equations of lines
Assume u, v are vectors in Rn and u ≠ 0. Then a parametrisation of the line through 0 in the direction of u is:
t ↦ tu,  t ∈ R
Example
In R2 , take u = (3/2, 1/2).

[Figure: the line ℓ1 = {tu : t ∈ R} through the origin in the direction of u, drawn in the (x1, x2)-plane.]


Parametric equations of lines
A parametrisation of the line through v parallel to u:
t ↦ v + tu,  t ∈ R

Example
In R2 , take u = (3/2, 1/2) and v = (−2, 1).

[Figure: the line ℓ2 = {v + tu : t ∈ R} through v, parallel to ℓ1 = {tu : t ∈ R}, in the (x1, x2)-plane.]



Parametric equations of planes
Assume u1, u2 and v are vectors in Rm, u1, u2 ≠ 0, and u1, u2 are not aligned (that is, neither is a multiple of the other).
Then a parametrisation of the plane through 0 determined by u1 and u2 is:

(s, t) ↦ su1 + tu2 ,  s, t ∈ R

A parametrisation of the plane through v, parallel to the above plane is:

(s, t) ↦ v + su1 + tu2 ,  s, t ∈ R



Parametric equations of planes
Example
Take u1 = (2, 1, 0), u2 = (0, 1, −1), v = (0, 0, 3).
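A small numerical sketch of this example (the cross-product check is my addition, not from the slides): every point produced by the parametrisation satisfies the plane's normal equation n · (x − v) = 0, where n = u1 × u2.

```python
import numpy as np

u1 = np.array([2.0, 1.0, 0.0])
u2 = np.array([0.0, 1.0, -1.0])
v = np.array([0.0, 0.0, 3.0])

# u1 and u2 are not aligned, so they determine a plane; a normal
# vector to that plane is the cross product n = u1 x u2.
n = np.cross(u1, u2)

for s, t in [(0.0, 0.0), (1.0, -2.0), (3.5, 0.25)]:
    x = v + s * u1 + t * u2
    assert np.isclose(n @ (x - v), 0.0)  # x lies in the plane through v
```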



Homogeneous and nonhomogeneous systems
Definition
A system of linear equations is said to be homogeneous if it can be written
in the form Ax = 0, where A is an m × n matrix and 0 is the zero vector
in Rm .

A homogeneous system Ax = 0 is never inconsistent; it always has at least one solution, namely x = 0 ∈ Rn, since A0 = 0. This solution is called the trivial solution.
The homogeneous equation Ax = 0 has a nontrivial solution if and only if the equation has at least one free variable.

Definition
A system of linear equations is said to be nonhomogeneous if it can be
written in the form Ax = b, where A is an m × n matrix and b ∈ Rm .



Solution set of a homogeneous system
Recall that last time we stated that the operation x ↦ Ax is linear :
a) A(v + w) = Av + Aw;
b) A(αv) = α(Av).
Here A is an m × n matrix, v, w ∈ Rn and α ∈ R.
Exercise
Show that solutions of a homogeneous linear system Ax = 0 can be scaled and added to give new solutions of the same system.
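A numerical sketch of the exercise (the example matrix is mine): by the linearity properties a) and b), sums and scalar multiples of solutions of Ax = 0 are again solutions.

```python
import numpy as np

# A rank-1 matrix, so Ax = 0 has nontrivial solutions.
A = np.array([[1.0, -2.0, 1.0],
              [2.0, -4.0, 2.0]])

u = np.array([2.0, 1.0, 0.0])   # Au = 0
v = np.array([-1.0, 0.0, 1.0])  # Av = 0
assert np.allclose(A @ u, 0) and np.allclose(A @ v, 0)

assert np.allclose(A @ (u + v), 0)    # sums of solutions are solutions
assert np.allclose(A @ (3.0 * u), 0)  # scalar multiples are solutions
```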



Two important facts
(1) If u is a solution of Ax = 0 and v is a solution of Ax = b, then u + v is a solution of Ax = b.
▶ Proof. A(u + v) = Au + Av = 0 + b = b. □
(2) If v1 is a solution of Ax = b and v2 is another solution of Ax = b, then v2 can be written as v1 + u, where u is a solution of Ax = 0.
▶ Proof. Let u = v2 − v1. Then v2 = u + v1 and, moreover, Au = A(v2 − v1 ) = Av2 − Av1 = b − b = 0. □
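Both facts can be checked numerically; here on the small system that appears on the next slide:

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [-3.0, 6.0]])
b = np.array([-3.0, 9.0])

v = np.array([-3.0, 0.0])  # a particular solution of Ax = b
u = np.array([2.0, 1.0])   # a solution of the homogeneous system Ax = 0
assert np.allclose(A @ v, b) and np.allclose(A @ u, 0)

# Fact (1): u + v again solves Ax = b.
assert np.allclose(A @ (u + v), b)

# Fact (2): any second solution v2 differs from v by a homogeneous solution.
v2 = v + 4.0 * u
assert np.allclose(A @ v2, b)
assert np.allclose(A @ (v2 - v), 0)
```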



Example
We solve the matrix equation Ax = b with

A = ( 1 −2 ; −3 6 ),  b = (−3, 9).

The augmented matrix is

( 1 −2 | −3 )
( −3 6 |  9 ).

Row reduction gives

( 1 −2 | −3 )
( 0  0 |  0 )
⇒ x1 = −3 + 2x2 , x2 is free

⇒ Solutions are x = (−3 + 2x2 , x2 ) = (−3, 0) + x2 (2, 1).
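A quick check of this worked example: every vector of the form (−3, 0) + t(2, 1) solves the system, and the direction vector (2, 1) solves the homogeneous system.

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [-3.0, 6.0]])
b = np.array([-3.0, 9.0])

p = np.array([-3.0, 0.0])  # particular solution
d = np.array([2.0, 1.0])   # spans the solution set of Ax = 0

for t in (-2.0, 0.0, 1.5):
    assert np.allclose(A @ (p + t * d), b)
assert np.allclose(A @ d, 0)  # the "parametric part" solves Ax = 0
```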



Example (continued)
Rewrite the solution as a parametric equation:

x = (−3, 0) + t (2, 1),  t ∈ R

The term t (2, 1) is the "parametric part": on its own it describes the solution set of Ax = 0, while the full expression describes the solution set of Ax = (−3, 9).

[Figure: the solution set, a line through (−3, 0) with direction (2, 1) in the (x1, x2)-plane.]


Solution sets of linear systems in parametric form
Let us see how this works through two examples.
Example
We solve the matrix equation Ax = b with

A = ( 4 0 −4 ; −3 0 3 ),  b = (8, −6).

The augmented matrix is

(  4 0 −4 |  8 )
( −3 0  3 | −6 ).

Row reduction gives

( 1 0 −1 | 2 )
( 0 0  0 | 0 )
⇒ x1 = 2 + x3 , x2 and x3 free

⇒ Solutions are x = (2 + x3 , x2 , x3 ) = (2, 0, 0) + x2 (0, 1, 0) + x3 (1, 0, 1).
Introduce new labels, to avoid confusion: x2 = s, x3 = t.
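A quick check of this example as well: x = (2, 0, 0) + s(0, 1, 0) + t(1, 0, 1) solves Ax = b for every s and t, and the parametric part spans the solution set of Ax = 0.

```python
import numpy as np

A = np.array([[4.0, 0.0, -4.0],
              [-3.0, 0.0, 3.0]])
b = np.array([8.0, -6.0])

p = np.array([2.0, 0.0, 0.0])   # particular solution
u1 = np.array([0.0, 1.0, 0.0])  # homogeneous solutions spanning
u2 = np.array([1.0, 0.0, 1.0])  # the "parametric part"

for s, t in [(0.0, 0.0), (2.0, -1.0), (-0.5, 3.0)]:
    assert np.allclose(A @ (p + s * u1 + t * u2), b)
assert np.allclose(A @ u1, 0) and np.allclose(A @ u2, 0)
```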



Example (continued)
Rewrite the solution as a parametric equation:

x = (2, 0, 0) + s (0, 1, 0) + t (1, 0, 1),  s, t ∈ R

The part s (0, 1, 0) + t (1, 0, 1) is the "parametric part", the solution set of Ax = 0, while the full expression describes the solution set of Ax = (8, −6).



Linear dependence
Let v1 , v2 , . . . , vk be vectors in Rn . Consider the vector equation

α1 v1 + α2 v2 + · · · + αk vk = 0. (†)

This always admits the solution α1 = α2 = · · · = αk = 0, called the trivial solution.
Question: Are there other solutions?
If no, the collection of vectors {v1 , . . . , vk } is called linearly independent.
If yes, the collection of vectors {v1 , . . . , vk } is called linearly dependent.
To check which it is, solve (†).
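Concretely, (†) is a homogeneous system whose coefficient matrix has v1 , . . . , vk as its columns, and a nontrivial solution exists exactly when that matrix has rank less than k. A sketch of this check (using NumPy's rank computation rather than explicit row reduction):

```python
import numpy as np

def linearly_independent(vectors):
    """The vectors are independent iff the matrix with these vectors
    as columns has full column rank, i.e. (dagger) has only the
    trivial solution."""
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == M.shape[1]

# Two examples that appear on later slides:
assert linearly_independent([np.array([1.0, 1.0, 0.0]),
                             np.array([0.0, -1.0, 1.0]),
                             np.array([5.0, 1.0, 3.0])])
assert not linearly_independent([np.array([3.0, 5.0]),
                                 np.array([0.5, 5.0 / 6.0])])
```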



Simple examples of linear (in)dependence
Example
Two vectors v1 , v2 are linearly dependent if and only if one of them is a
multiple of the other.

Proof. Linear dependence of v1 , v2 means that the system

x1 v1 + x2 v2 = 0 (∗)

has more solutions than the trivial one, i.e. at least one of x1 and x2 is nonzero. Suppose x1 ≠ 0. Then

v1 = −(x2 /x1 ) v2 .

If v1 , v2 are linearly independent, then x1 = x2 = 0 is the only solution to (∗) and we cannot write either vector as a multiple of the other. □



Simple examples of linear (in)dependence
Example
If one of the vectors v1 , v2 , . . . , vk is 0, then the vectors are linearly dependent: taking the coefficient of the zero vector to be 1 and all other coefficients 0 gives a nontrivial solution of (†).



Simple examples of linear (in)dependence
Example
Can we write one of v1 = (3, 5) and v2 = (1/2, 5/6) as a multiple of the other? We need to check whether the system

x1 (3, 5) + x2 (1/2, 5/6) = 0

has a non-trivial solution. The augmented matrix of the equation row reduces as

( 3 1/2 | 0 )      ( 3 1/2 | 0 )
( 5 5/6 | 0 )  →   ( 0  0  | 0 ).

We get 3x1 + (1/2)x2 = 0, i.e. x1 = −(1/6)x2 . Thus the answer is affirmative: taking x2 = −6 gives x1 = 1, so v1 − 6v2 = 0, that is, v1 = 6v2 .
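A quick numerical confirmation; note that substituting x1 = 1, x2 = −6 into x1 v1 + x2 v2 = 0 gives v1 − 6v2 = 0, i.e. v1 = 6v2 (with a plus sign):

```python
import numpy as np

v1 = np.array([3.0, 5.0])
v2 = np.array([0.5, 5.0 / 6.0])

# The nontrivial solution x1 = 1, x2 = -6 of x1*v1 + x2*v2 = 0 ...
assert np.allclose(1.0 * v1 + (-6.0) * v2, 0)
# ... rearranges to v1 = 6 * v2:
assert np.allclose(v1, 6.0 * v2)
```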



Simple examples of linear (in)dependence
Example
Let v1 = (1, 1, 0), v2 = (0, −1, 1), v3 = (5, 1, 3) be vectors in R3 . To see
whether these vectors are linearly dependent, we must check whether the
equation
x1 v1 + x2 v2 + x3 v3 = 0
only has the trivial solution (x1 , x2 , x3 ) = (0, 0, 0). The augmented matrix row reduces as

( 1  0 5 | 0 )      ( 1 0 0 | 0 )
( 1 −1 1 | 0 )  →   ( 0 1 0 | 0 )
( 0  1 3 | 0 )      ( 0 0 1 | 0 ).

So x1 = x2 = x3 = 0 is the unique solution. Therefore the vectors v1 , v2 and v3 are linearly independent.



Important theorems
Theorem
An indexed set S = {v1 , . . . , vk } of two or more vectors is linearly
dependent if and only if at least one of the vectors in S is a linear
combination of the others.
Proof. Suppose we have a non-trivial solution (α1 , . . . , αk ) to
α1 v1 + α2 v2 + · · · + αk vk = 0.
Not all of the αi are zero, so suppose αj ≠ 0 for some index j. Dividing by αj and rearranging the terms of the equation, we get

vj = −(α1 /αj ) v1 − · · · − (αj−1 /αj ) vj−1 − (αj+1 /αj ) vj+1 − · · · − (αk /αj ) vk .

Proving the converse, that if some vj can be written as a linear combination of the other vectors then v1 , . . . , vk are linearly dependent, is left as an exercise. □
Important theorems
Theorem
If k > n and v1 , . . . , vk are vectors in Rn , then they are linearly dependent.

Proof. If k > n, then the number of variables is more than the number of
equations in the system

x1 v1 + x2 v2 + · · · + xk vk = 0. (∗)

From our discussion at the beginning, we know that this system cannot have exactly one solution. Moreover, the system is homogeneous, so it has at least one solution, the trivial one. We conclude that it has infinitely many solutions; in particular, there is a solution in which some xj , 1 ≤ j ≤ k, is nonzero, which means that the vectors v1 , . . . , vk are linearly dependent. □
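An illustration of this theorem with three vectors in R2 (the vectors are taken from the example on the next slide):

```python
import numpy as np

v1 = np.array([2.0, 1.0])
v2 = np.array([3.0, 2.0])
v3 = np.array([-5.0, 8.0])

M = np.column_stack([v1, v2, v3])    # 2 x 3: more columns than rows
assert np.linalg.matrix_rank(M) < 3  # so a nontrivial relation exists

# Find one: solve x1*v1 + x2*v2 = -v3, then (x1, x2, 1) solves (*).
x12 = np.linalg.solve(M[:, :2], -v3)
x = np.append(x12, 1.0)
assert np.allclose(M @ x, 0)  # a nontrivial solution of (*)
```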



Simple examples
Example
Let (2, 1), (3, 2), (−5, 8) be vectors in R2 . Are they linearly dependent or independent?
Answer: We have 3 vectors in a 2-dimensional space, so we immediately
know that they are linearly dependent (by the theorem on the previous
slide).

Example
Let (2, 1, 5), (0, 0, 0), (−5, 8, 1) be vectors in R3 . Are they linearly dependent or independent?
Answer: Since one of the vectors is the zero vector, the vectors are
linearly dependent.
Simple examples
Example
The vectors e1 = (1, 0, 0), e2 = (0, 1, 0), e3 = (0, 0, 1) are linearly
independent, since the equation x1 e1 + x2 e2 + x3 e3 = 0 has augmented
matrix

( 1 0 0 | 0 )
( 0 1 0 | 0 )
( 0 0 1 | 0 ),
which means that x1 = x2 = x3 = 0 is the unique solution.

The vectors e1 , e2 , e3 are very special, as we will see shortly.



Bases of Rn
Definition
A basis of Rn is any collection of n linearly independent vectors in Rn .

The most important example of a basis of Rn is the collection of vectors {e1 , . . . , en }, where

e1 = (1, 0, 0, . . . , 0, 0)
e2 = (0, 1, 0, . . . , 0, 0)
...
en = (0, 0, 0, . . . , 0, 1).

As in the previous example, this collection of vectors is linearly independent. It is called the canonical basis of Rn .
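Numerically, the canonical basis is just the set of columns of the identity matrix, and every u ∈ Rn is the combination u1 e1 + · · · + un en (a small sketch; NumPy is an assumption):

```python
import numpy as np

n = 4
E = np.eye(n)                          # column j is e_{j+1}
assert np.linalg.matrix_rank(E) == n   # the e_i are linearly independent

# Any u in R^n is a combination of the e_i with coefficients u_i:
u = np.array([7.0, -1.0, 0.0, 2.5])
assert np.allclose(sum(u[i] * E[:, i] for i in range(n)), u)
```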



Important theorem
Theorem (bases span Rn )
If the collection of vectors {v1 , . . . , vn } is a basis of Rn then any vector u
in Rn can be written as a linear combination of v1 , . . . , vn in a unique way.
In other words, the equation

α1 v1 + · · · + αn vn = u (⋄)

has a unique solution (α1 , . . . , αn ) ∈ Rn for any u in Rn .

Proof. Let us first show that (⋄) has at least one solution for any vector u. Suppose that, for some u, (⋄) has no solution; then u cannot be written as a linear combination of v1 , . . . , vn . In that case the collection {v1 , . . . , vn , u} is linearly independent: in any relation α1 v1 + · · · + αn vn + βu = 0 we must have β = 0 (otherwise we could solve for u), and then the independence of v1 , . . . , vn forces all αi = 0. However, this collection has n + 1 vectors in Rn , so it cannot be linearly independent by the theorem from four slides ago. This contradiction shows that (⋄) always has a solution.



Important theorem
Theorem (bases span Rn )
If the collection of vectors {v1 , . . . , vn } is a basis of Rn then any vector u
in Rn can be written as a linear combination of v1 , . . . , vn in a unique way.
In other words, the equation

α1 v1 + · · · + αn vn = u (⋄)

has a unique solution (α1 , . . . , αn ) ∈ Rn for any u in Rn .

Proof. (continued) To prove that (⋄) has a unique solution for any u,
suppose (α1 , . . . , αn ) and (β1 , . . . , βn ) are two solutions of (⋄). Then

0 = u − u = (α1 v1 + · · · + αn vn ) − (β1 v1 + · · · + βn vn )
= (α1 − β1 )v1 + · · · + (αn − βn )vn .

Since v1 , . . . , vn are linearly independent, this means that αi = βi for all i. □



Simple example
Example
Let v1 = (1, 1, 0), v2 = (0, −1, 1), v3 = (5, 1, 3), B = {v1 , v2 , v3 } and
u = (11, 9, 1). Let us verify that B is a basis of R3 and write u as a linear
combination of its elements.
The set B has 3 vectors in R3 , so the number of vectors matches the
dimension. We must now check that B is linearly independent. That is, we
must check that the equation x1 v1 + x2 v2 + x3 v3 = 0 only has the trivial
solution. This vector equation has augmented matrix
 
1 0 5 0
 1 −1 1 0 
0 1 3 0

and we can check by row reduction that indeed x1 = x2 = x3 = 0 is the unique solution. Hence B is a basis.



Simple example
Example (continued)
Let v1 = (1, 1, 0), v2 = (0, −1, 1), v3 = (5, 1, 3), B = {v1 , v2 , v3 } and
u = (11, 9, 1). Given that B is a basis of R3 , let us write u as a linear
combination of its elements.
We need to find the unique tuple (x1 , x2 , x3 ) which satisfies the equation
x1 v1 + x2 v2 + x3 v3 = u. This vector equation has augmented matrix
 
1 0 5 11
 1 −1 1 9 
0 1 3 1

and again by row reduction we find (x1 , x2 , x3 ) = (6, −2, 1). This gives

u = 6v1 − 2v2 + v3 .
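A quick check of this computation (solving the system with NumPy instead of by hand):

```python
import numpy as np

v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([0.0, -1.0, 1.0])
v3 = np.array([5.0, 1.0, 3.0])
u = np.array([11.0, 9.0, 1.0])

M = np.column_stack([v1, v2, v3])  # invertible, since B is a basis
x = np.linalg.solve(M, u)
assert np.allclose(x, [6.0, -2.0, 1.0])
assert np.allclose(6.0 * v1 - 2.0 * v2 + v3, u)
```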

