Vector Spaces


Summary of the lectures on Vector Spaces

1. We have seen the importance of vectors and vector spaces in statistical
pattern recognition, machine learning, and information retrieval.
Specifically, the patterns, documents, and datasets arising in many of
these applications are based on the notion of vectors.

2. Data is typically represented as a matrix, and the analysis of such
matrices is useful in clustering, classification, and regression.

3. The notion of a vector space lets us look at a set together with
operations on it, abstracting linear combinations of elements of the
set (of vectors).

4. A vector space is a set V of vectors over a field F along with two
operations: (i) scalar multiplication and (ii) vector addition. Each
operation satisfies several basic properties.

5. Vector addition is commutative and associative, and has an identity
element and additive inverses.

6. Similarly, scalar multiplication is associative, has an identity, and
satisfies the distributive properties.
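As a concrete illustration (not part of the notes), the axioms in items 4-6 can be spot-checked numerically in R^3, with vectors represented as Python tuples and the field taken to be the rationals:

```python
# Spot-checking the vector-space axioms on sample vectors in R^3.
# Illustrative sketch only; a few numeric checks do not prove the axioms.

def add(u, v):
    return tuple(a + b for a, b in zip(u, v))

def scale(c, u):
    return tuple(c * a for a in u)

u, v, w = (1, 2, 3), (4, -5, 6), (0, 7, -8)
c, d = 3, -2
zero = (0, 0, 0)

assert add(u, v) == add(v, u)                                # commutativity
assert add(add(u, v), w) == add(u, add(v, w))                # associativity
assert add(u, zero) == u                                     # additive identity
assert add(u, scale(-1, u)) == zero                          # additive inverse
assert scale(c, scale(d, u)) == scale(c * d, u)              # scalar associativity
assert scale(1, u) == u                                      # scalar identity
assert scale(c, add(u, v)) == add(scale(c, u), scale(c, v))  # distributivity over vectors
assert scale(c + d, u) == add(scale(c, u), scale(d, u))      # distributivity over scalars
print("all axioms hold on these samples")
```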

7. We have considered several examples of vectors and vector spaces to
show that the notion of a vector space is algebraic, rather than
geometric.

8. We have also covered the relation between vector operations and
geometry, where the vectors admit a geometric interpretation.
Specifically, we have considered addition of vectors and scalar
multiplication of vectors and how they relate to the corresponding
geometric objects.

9. We have examined the notion of a subspace of a vector space and, by
examining the axioms (properties) of a vector space, we arrived at the
following characterization of a subspace W of V:

(a) The zero vector of V must be an element of every subspace W of V.
(0 ∉ W implies W is not a subspace (necessity); 0 ∈ W does not imply
that W is a subspace (not sufficient). Prof. Vittal Rao's lecture
notes were used here.)
(b) For all α, β ∈ W, α + β ∈ W (W is closed w.r.t. addition).
(c) For all c ∈ F and α ∈ W, cα ∈ W (W is closed w.r.t. scalar
multiplication).
10. We have shown, equivalently, that a nonempty subset W of V is a
subspace if and only if for all α, β ∈ W and c ∈ F, cα + β ∈ W. This
could also be used to define a subspace.
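The single closure condition cα + β ∈ W can be checked numerically for a sample subspace. Below, W = {(x, y, z) : x + y + z = 0} in R^3 is an illustrative choice of subspace, not one taken from the notes:

```python
# Checking the subspace criterion c*alpha + beta in W for the plane
# W = {(x, y, z) : x + y + z = 0}. Illustrative sketch with exact scalars.
from fractions import Fraction

def in_W(v):
    # Membership test for this particular subspace.
    return sum(v) == 0

def comb(c, a, b):
    # The combination c*a + b, computed componentwise.
    return tuple(c * x + y for x, y in zip(a, b))

alpha = (1, 2, -3)   # 1 + 2 - 3 = 0, so alpha lies in W
beta = (4, -4, 0)    # also in W
assert in_W(alpha) and in_W(beta)
for c in (Fraction(0), Fraction(1), Fraction(-5, 2)):
    assert in_W(comb(c, alpha, beta))
print("W is closed under c*alpha + beta for the sampled scalars")
```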

11. We have seen that the intersection of subspaces is again a subspace.
The union of two subspaces may not be a subspace in general.
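A standard counterexample (chosen here for illustration) is the union of the two coordinate axes in R^2: each axis is a subspace, but their union is not closed under addition:

```python
# The union of the x-axis and y-axis in R^2 is not a subspace:
# it contains (1, 0) and (0, 1) but not their sum (1, 1).

def on_x_axis(v):
    return v[1] == 0

def on_y_axis(v):
    return v[0] == 0

def in_union(v):
    return on_x_axis(v) or on_y_axis(v)

u, w = (1, 0), (0, 1)                # both lie in the union
assert in_union(u) and in_union(w)
s = (u[0] + w[0], u[1] + w[1])       # u + w = (1, 1)
assert not in_union(s)               # the sum escapes the union
print("union of the two axes is not closed under addition")
```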

12. We have characterized the sum of two or more subspaces and shown
that the direct sum is a subspace of the vector space.

The above material, including the material on Linear Systems, was
considered for the first test.

Material for the second test from Vector spaces is given below

1. We have examined the notion of the subspace spanned by S, where S is
any nonempty subset of a vector space V. W_S, the smallest subspace
containing S, which is the intersection of all subspaces containing S,
is called the subspace spanned by S.

2. We have considered the notion of linear combinations of vectors in a
finite nonempty subset of vectors.

3. We have linked these two notions by showing that the subspace spanned
by a non-empty subset S of a vector space V is the set of all linear
combinations of vectors in S.
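This characterization gives a computational test for span membership: b lies in the span of S exactly when appending b as an extra row does not increase the rank of the matrix whose rows are the vectors of S. A sketch with exact rational Gaussian elimination (illustrative code, not from the notes):

```python
# Deciding span membership by comparing ranks, using exact arithmetic.
from fractions import Fraction

def rank(rows):
    # Gaussian elimination over the rationals; returns the row rank.
    M = [[Fraction(x) for x in r] for r in rows]
    rk = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(rk, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[rk], M[piv] = M[piv], M[rk]
        for i in range(len(M)):
            if i != rk and M[i][c] != 0:
                f = M[i][c] / M[rk][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[rk])]
        rk += 1
    return rk

def in_span(S, b):
    # b is in span(S) iff adding b does not raise the rank.
    return rank(S) == rank(S + [b])

S = [[1, 0, 1], [0, 1, 1]]
assert in_span(S, [2, 3, 5])        # 2*(1,0,1) + 3*(0,1,1)
assert not in_span(S, [0, 0, 1])
print("span membership decided by rank comparison")
```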

4. We have studied linearly dependent and linearly independent sets of
vectors and their properties.
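For three vectors in R^3, independence can be tested concretely: the set is linearly independent exactly when the 3x3 determinant of the matrix with those vectors as rows is nonzero (an illustrative check; the notes state the criterion abstractly):

```python
# Testing linear independence of three vectors in R^3 via the determinant.

def det3(a, b, c):
    # Cofactor expansion of the 3x3 determinant along the first row.
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - a[1] * (b[0] * c[2] - b[2] * c[0])
            + a[2] * (b[0] * c[1] - b[1] * c[0]))

def independent3(a, b, c):
    return det3(a, b, c) != 0

assert independent3((1, 0, 0), (0, 1, 0), (0, 0, 1))
assert not independent3((1, 2, 3), (2, 4, 6), (0, 0, 1))  # first two proportional
print("independence decided by the determinant")
```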

5. We have characterized the notion of a basis of a vector space V as a
linearly independent set of vectors in V which spans the space V. The
space V is finite-dimensional if it has a finite basis.

6. We have considered the possibility of the existence of multiple bases
for the same vector space.

7. We have considered an example of an infinite-dimensional vector space
in connection with polynomials.

8. We have examined the notion of the dimension of a vector space and
the size of different bases for the same vector space. We have also
introduced the notion of the standard basis.
9. We have seen how to extend a finite set of linearly independent vectors
to form a basis for a vector space.
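One way to carry out this extension in R^n is to greedily append standard basis vectors that enlarge the span, stopping once the set spans the whole space. A sketch of that procedure (illustrative, using exact rational arithmetic):

```python
# Extending a linearly independent set to a basis of R^n by greedily
# appending standard basis vectors e_1, ..., e_n that raise the rank.
from fractions import Fraction

def rank(rows):
    # Gaussian elimination over the rationals; returns the row rank.
    M = [[Fraction(x) for x in r] for r in rows]
    rk = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(rk, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[rk], M[piv] = M[piv], M[rk]
        for i in range(len(M)):
            if i != rk and M[i][c] != 0:
                f = M[i][c] / M[rk][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[rk])]
        rk += 1
    return rk

def extend_to_basis(vectors, n):
    basis = [list(v) for v in vectors]
    for j in range(n):                            # try e_1, ..., e_n in turn
        e = [1 if k == j else 0 for k in range(n)]
        if rank(basis + [e]) > rank(basis):       # keep e only if it adds a dimension
            basis.append(e)
    return basis

B = extend_to_basis([[1, 1, 0]], 3)
assert len(B) == 3 and rank(B) == 3               # now a basis of R^3
print(B)
```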

10. We have linked the invertibility of a matrix with the linear
independence of its rows.
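In the 2x2 case this link is the familiar ad - bc test: the matrix is invertible (and its rows are independent) exactly when the determinant is nonzero. An illustrative example:

```python
# Invertibility of a 2x2 matrix via the determinant ad - bc;
# rows are linearly independent iff the determinant is nonzero.

def det2(M):
    (a, b), (c, d) = M
    return a * d - b * c

def invertible2(M):
    return det2(M) != 0

def inverse2(M):
    # Closed-form inverse, valid only when det2(M) != 0.
    (a, b), (c, d) = M
    D = det2(M)
    return [[d / D, -b / D], [-c / D, a / D]]

A = [[2, 1], [1, 1]]                     # independent rows, det = 1
assert invertible2(A)
assert inverse2(A) == [[1.0, -1.0], [-1.0, 2.0]]
B = [[1, 2], [2, 4]]                     # second row = 2 * first row
assert not invertible2(B)
print("invertibility matches row independence on these examples")
```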

11. We have considered ordered bases of a vector space and the
coordinates of a vector in different bases.

12. Then we have examined how the coordinates of a vector change from
one ordered basis to another. Specifically, we have shown the existence
of a unique invertible matrix that could be used to characterize the
transformation from one ordered basis to another.
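In R^2 the coordinates of a vector in an ordered basis can be computed by solving a 2x2 system; the invertible matrix with the basis vectors as columns is exactly what mediates the change of coordinates. A small sketch with illustrative numbers (not taken from the notes):

```python
# Coordinates of v in the ordered basis B = ((1,1), (1,-1)) of R^2,
# found by Cramer's rule on the matrix whose columns are the basis vectors.
from fractions import Fraction

def coords_2d(basis, v):
    (a, c), (b, d) = basis           # basis vectors become columns of [[a,b],[c,d]]
    D = Fraction(a * d - b * c)      # nonzero, since a basis is independent
    x = (v[0] * d - v[1] * b) / D    # Cramer's rule for the first coordinate
    y = (v[1] * a - v[0] * c) / D    # and for the second
    return (x, y)

B = ((1, 1), (1, -1))
v = (3, 1)
x, y = coords_2d(B, v)
# Recombine to check: x*(1,1) + y*(1,-1) should give back v.
assert (x * 1 + y * 1, x * 1 + y * (-1)) == v
print("coordinates of", v, "in basis B:", (x, y))
```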

13. We have gone through a summary of row-equivalence, including the
uniqueness of the row-reduced echelon form of a matrix.
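Uniqueness means that two row-equivalent matrices must reduce to the same RREF. A sketch of row reduction over the rationals, verified on an illustrative pair of row-equivalent matrices:

```python
# Row-reduced echelon form via Gauss-Jordan elimination over the rationals;
# row-equivalent matrices share one RREF.
from fractions import Fraction

def rref(rows):
    M = [[Fraction(x) for x in r] for r in rows]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        p = M[r][c]
        M[r] = [x / p for x in M[r]]              # scale the pivot to 1
        for i in range(len(M)):
            if i != r and M[i][c] != 0:           # clear the rest of the column
                f = M[i][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return M

A = [[1, 2, 3], [2, 4, 8]]
B = [[3, 6, 11], [2, 4, 8]]          # row-equivalent: rows are A1+A2 and A2
assert rref(A) == rref(B)
assert rref(A) == [[1, 2, 0], [0, 0, 1]]
print("row-equivalent matrices share one RREF")
```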

14. We have considered the notions of row space and solution space
associated with a matrix.

15. We have examined row-equivalence of matrices using row space.

16. We have considered some questions on subspaces: (1) How does one
determine whether a finite collection of vectors is linearly
independent? (2) Does a given vector lie in a given subspace? and
(3) How does one describe a subspace?

17. We have considered answers to these questions based on row and col-
umn spaces of matrices.

Material from the book by K. Hoffman and R. Kunze was used.
