Vector Spaces: Definition, Examples, Homomorphisms


VECTOR SPACES

Definition, Examples, Homomorphisms


Linear Combination
• Let O1 and O2 be two objects from a set S of some objects
• Let a and b be numbers from a defined number system
• Then a linear combination of the objects is
• aO1 + bO2
• The linear combination should again belong to the set S. That means more
objects can be obtained from linearly combining already linearly combined
objects.
• Linear combination involves two basic operations
• Scalar multiplication
• Vector addition
• Linear combinations arise in many contexts
Linear Combination Geometric
• Representation of points in the Cartesian plane
• Any point (a,b) can be constructed as a linear combination
of the unit vectors (1,0) and (0,1)
• (a,b) = a(1,0)+b(0,1)
• (a,b) = (a,0)+(0,b)
• (a,b) = a e1 + b e2
• The unit vectors e1 = (1,0) and e2 = (0,1) are called basis vectors
Linear Combination Differential Eq
• Given the ODE y'' + y = 0
• If we happen to know that two solutions are y = sin(x) and y = cos(x)
• Then it can be easily established that any linear
combination of sin x and cos x is also a solution
• The solution space is the set of all functions of the form
• y = a sin(x) + b cos(x), where a, b can be any numbers
• The basis vectors for this space are sin x and cos x
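A quick check of this claim (a sketch assuming sympy is available; not part of the original slides): substitute y = a sin(x) + b cos(x) into y'' + y and confirm the residual is zero.

```python
# Check that any linear combination y = a*sin(x) + b*cos(x) solves y'' + y = 0.
import sympy as sp

x, a, b = sp.symbols('x a b')
y = a * sp.sin(x) + b * sp.cos(x)

# Substitute into the ODE y'' + y and simplify; the residual should be 0.
residual = sp.simplify(sp.diff(y, x, 2) + y)
print(residual)  # -> 0, so every linear combination is again a solution
```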
Linear Combination Polynomials
• Any linear combination of basic polynomials is also a
polynomial
• a0·1 + a1x + a2x² + a3x³ is a linear combination of the 4 basic
polynomials 1, x, x², x³
• The coefficients can belong to any field F
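One way to see this concretely (an illustrative sketch assuming numpy; the particular polynomials are made up): represent each polynomial by its coefficient vector over the basis {1, x, x², x³}, so a linear combination of polynomials is just a linear combination of coefficient vectors.

```python
# Polynomials as coefficient vectors over the basis {1, x, x^2, x^3}.
import numpy as np

p = np.array([1.0, 2.0, 0.0, 3.0])   # 1 + 2x + 3x^3
q = np.array([0.0, 1.0, 4.0, 0.0])   # x + 4x^2

a, b = 2.0, -1.0
r = a * p + b * q                     # coefficients of 2*p(x) - q(x)
print(r)                              # [ 2.  3. -4.  6.]  i.e. 2 + 3x - 4x^2 + 6x^3
```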
Linear Combination Solution to Simultaneous Equations
• x – y = 7, 3x + 2y = 6
• To find the solution, 2 times equation 1 is added to
equation 2.
• Why is the above operation allowed?
• If equation 1 is represented as L1 = R1 and, similarly,
equation 2 as L2 = R2
• Combining (L1,L2) and (R1,R2) linearly is the same as adding
the equations to get the solution
• 2L1 + L2 = 2R1 + R2
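A small numerical sketch of this step (numpy assumed; not from the slides): form the linear combination 2L1 + L2 of the rows, which eliminates y, and compare the result with a direct solve.

```python
# The elimination step 2*(x - y = 7) + (3x + 2y = 6) gives 5x = 20.
import numpy as np

A = np.array([[1.0, -1.0],
              [3.0,  2.0]])
b = np.array([7.0, 6.0])

# Linear combination of the equations: 2*row1 + row2 eliminates y.
lhs = 2 * A[0] + A[1]        # [5, 0]
rhs = 2 * b[0] + b[1]        # 20
x = rhs / lhs[0]             # x = 4
y = x - 7                    # from equation 1: y = x - 7 = -3

print((x, y), np.linalg.solve(A, b))  # (4.0, -3.0) agrees with the solver
```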
Vector space
• A vector space is a mathematical structure which
encapsulates the operations of scalar multiplication and
vector addition. Linear combinations make sense in a vector
space.
• A set V such that, given two elements u, v and numbers a,
b, we can form the linear combination au + bv
• Examples of vector spaces
• Polynomials of degree n or less
• Continuous functions on an interval
• Topological vector spaces
Vector Space Formal Definition
Axiom: Meaning
Associativity of addition: u + (v + w) = (u + v) + w
Commutativity of addition: u + v = v + u
Identity element of addition: There exists an element 0 ∈ V, called the zero vector, such that v + 0 = v for all v ∈ V.
Inverse elements of addition: For every v ∈ V, there exists an element −v ∈ V, called the additive inverse of v, such that v + (−v) = 0.
Compatibility of scalar multiplication with field multiplication: a(bv) = (ab)v
Identity element of scalar multiplication: 1v = v, where 1 denotes the multiplicative identity in F.
Distributivity of scalar multiplication with respect to vector addition: a(u + v) = au + av
Distributivity of scalar multiplication with respect to field addition: (a + b)v = av + bv
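As a minimal illustration of the two operations the axioms constrain (an illustrative Python sketch with made-up class and value names), here is a toy 2-tuple vector with addition and scalar multiplication, plus spot checks of a few axioms.

```python
# A toy vector of two real components, with the two vector-space operations.
from dataclasses import dataclass

@dataclass(frozen=True)
class Vec2:
    x: float
    y: float

    def __add__(self, other):             # vector addition
        return Vec2(self.x + other.x, self.y + other.y)

    def scale(self, a):                    # scalar multiplication
        return Vec2(a * self.x, a * self.y)

u, v, w = Vec2(1, 2), Vec2(3, -1), Vec2(0.5, 4)
zero = Vec2(0, 0)

assert (u + v) + w == u + (v + w)          # associativity of addition
assert u + v == v + u                      # commutativity of addition
assert v + zero == v                       # identity element of addition
assert v.scale(2).scale(3) == v.scale(6)   # compatibility: a(bv) = (ab)v
```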
Basis, Span, Independence
• A basis B of a vector space V over a field F is a linearly independent
subset of V that spans V.
• In more detail, suppose that B = { v1, …, vn } is a finite subset of a
vector space V over a field F (such as the real or complex numbers
R or C). Then B is a basis if it satisfies the following conditions:
• the linear independence property:
• for all a1, …, an ∈ F, if a1v1 + … + anvn = 0, then necessarily a1 = …
= an = 0; no vector can be written as a linear combination of the basis vectors in more than one way
• the spanning property:
• for every vector x in V it is possible to choose a1, …, an ∈ F such
that x = a1v1 + … + anvn; each vector in V is a linear combination of the basis vectors
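A practical way to test both properties for vectors in R^n (a sketch assuming numpy; the example vectors are arbitrary) is a matrix rank check: full rank means the vectors are linearly independent and span the space, and each vector's coordinates are then unique.

```python
# Checking linear independence and spanning in R^3 via matrix rank.
import numpy as np

B = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0]])          # candidate basis vectors as rows

print(np.linalg.matrix_rank(B) == 3)     # True: independent AND spanning, so a basis

# Each x in R^3 then has unique coordinates a with a @ B == x.
x = np.array([2.0, -1.0, 5.0])
a = np.linalg.solve(B.T, x)              # solve B^T a = x
print(np.allclose(a @ B, x))             # True
```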
Basis
• The cardinality of a basis set is the dimension of the vector space
• A vector space can have multiple basis sets, but each has
the same cardinality, i.e. the same dimension
• Vector spaces can be infinite dimensional, e.g. function
spaces
• Scalars can come from more general, exotic fields. This
generalization is useful in the study of ALGEBRAIC NUMBERS
Zero Vector Space - Trivial
Set: {0}
Associativity of addition: 0 + (0 + 0) = (0 + 0) + 0
Commutativity of addition: 0 + 0 = 0 + 0
Identity element of addition: 0 itself is the identity element
Inverse elements of addition: v + (−v) = 0; −0 is the same as 0, so 0 + (−0) = 0
Compatibility of scalar multiplication with field multiplication: a(bv) = (ab)v becomes a(b0) = (ab)0
Identity element of scalar multiplication: 1v = v, so 1·0 = 0
Distributivity of scalar multiplication with respect to vector addition: a(0 + 0) = a0 + a0
Distributivity of scalar multiplication with respect to field addition: (a + b)0 = a0 + b0
Field F Vector Space - Trivial
Set: F itself, a 1-dimensional vector space over itself
Associativity of addition: a + (b + c) = (a + b) + c, where a, b, c are scalars in F; same as field addition
Commutativity of addition: a + b = b + a; field additive commutativity
Identity element of addition: 0; the additive identity of the field
Inverse elements of addition: a + (−a) = 0; the additive inverse of the field
Compatibility of scalar multiplication with field multiplication: a(bm) = (ab)m; multiplicative associativity
Identity element of scalar multiplication: 1a = a; the multiplicative identity of the field
Distributivity of scalar multiplication with respect to vector addition: a(b + c) = ab + ac; field distributivity
Distributivity of scalar multiplication with respect to field addition: (a + b)c = ac + bc
Coordinate Space – Non Trivial
Set: The space of all n-tuples from the field F, also known as Fn, is a vector space over F; x = (x1,x2,x3,…,xn) where xi ∈ F
Associativity of addition: a + (b + c) = (a + b) + c; for n = 2, let a = (x1,x2), b = (x3,x4), c = (x5,x6)
LHS: (x1,x2) + ((x3+x5),(x4+x6)) = (x1+x3+x5, x2+x4+x6)
RHS: ((x1,x2)+(x3,x4)) + (x5,x6) = (x1+x3, x2+x4) + (x5,x6) = (x1+x3+x5, x2+x4+x6)
Commutativity of addition: a + b = b + a; for n = 2, let a = (x1,x2), b = (x3,x4) with x1,x2,x3,x4 ∈ F
LHS: (x1,x2) + (x3,x4) = (x1+x3, x2+x4)
RHS: (x3,x4) + (x1,x2) = (x3+x1, x4+x2)
Field addition is commutative, so x1+x3 = x3+x1 and the two sides agree
Coordinate Space – Non Trivial
Identity element of addition: (0,0) for n = 2; the additive identity of the field in each coordinate
Inverse elements of addition: a + (−a) = 0. Let a = (x1,x2), so −a = (−x1,−x2); in terms of vectors, −a is the reflection of the vector through the origin. LHS: (x1,x2) + (−x1,−x2) = (0,0)
Compatibility of scalar multiplication with field multiplication: a(bm) = (ab)m with a, b, x1, x2 ∈ F and n = 2. Let m = (x1,x2). LHS: a(b(x1,x2)) = a(bx1, bx2) = (abx1, abx2). RHS: (ab)(x1,x2) = (abx1, abx2)
Identity element of scalar multiplication: 1a = a; the multiplicative identity of the field. Let a = (x1,x2); then 1·(x1,x2) = (x1,x2)
Distributivity of scalar multiplication with respect to vector addition: a(b + c) = ab + ac; field distributivity. Let b = (x1,x2), c = (x3,x4). LHS: a((x1,x2)+(x3,x4)) = a(x1+x3, x2+x4) = (a(x1+x3), a(x2+x4)). RHS: (ax1,ax2) + (ax3,ax4) = (a(x1+x3), a(x2+x4))
Distributivity of scalar multiplication with respect to field addition: (a + b)c = ac + bc. Let c = (x1,x2). LHS: ((a+b)x1, (a+b)x2). RHS: (ax1,ax2) + (bx1,bx2) = ((a+b)x1, (a+b)x2)
Coordinate Space
• The most common cases are where F is the field of
real numbers, giving the real coordinate space Rn, or the field
of complex numbers, giving the complex coordinate space Cn.
• The quaternions and the octonions are respectively four- and
eight-dimensional vector spaces over the reals.
• The vector space Fn comes with a standard basis:
• e1 = (1,0,0,……,0)
• e2 = (0,1,0,……,0)
• …
• en = (0,0,0,……,1)

where 1 denotes the multiplicative identity in F.
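A small sketch (numpy assumed) of the standard basis for F = R: the rows of the identity matrix are e1, …, en, and every vector is the linear combination of them with its own components as coefficients.

```python
# Standard basis of R^n: rows of the identity matrix.
import numpy as np

n = 4
E = np.eye(n)                 # rows are e1, ..., en
x = np.array([3.0, -2.0, 0.0, 7.0])

# Every x in R^n is the linear combination x = x1*e1 + ... + xn*en.
print(np.allclose(sum(x[i] * E[i] for i in range(n)), x))  # True
```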


Homomorphism – Linear Maps
• A linear map is a homomorphism from one vector space to another
vector space.
• It sends straight lines to straight lines, which is why it is called a
linear map.
• Since a linear map is a homomorphism, it preserves the
structure's operations
• A linear map preserves linear combinations
• F(a1*v1 + a2*v2 + a3*v3 + … + an*vn) = a1*F(v1) + a2*F(v2) +
a3*F(v3) + … + an*F(vn)
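To illustrate this preservation property (a sketch with arbitrary numpy values; the matrix F here is just an example map on R^2):

```python
# A matrix acting on R^2 is a linear map, so it preserves linear combinations.
import numpy as np

F = np.array([[2.0, 1.0],
              [0.0, 3.0]])             # the linear map, as a matrix
v1, v2 = np.array([1.0, 2.0]), np.array([-1.0, 4.0])
a1, a2 = 3.0, -0.5

lhs = F @ (a1 * v1 + a2 * v2)
rhs = a1 * (F @ v1) + a2 * (F @ v2)
print(np.allclose(lhs, rhs))           # True
```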
Linear Map as Matrix
• Let f be a linear map f: V -> W, where V and W
are two vector spaces.
• f(a1*v1 + a2*v2 + … + an*vn) = a1*f(v1) + a2*f(v2) + … + an*f(vn),
where v1, v2, …, vn is a basis of V … (1)
• f(vj) = a1j*w1 + a2j*w2 + … + amj*wm, where w1, …, wm is a basis of W … (2)
• Combining equations (1) and (2), f is represented by an m×n matrix [aij]
Linear Map as Matrix
• The matrix is built from a choice of basis of V and basis of W. If a
different basis is to be used, the matrix must be recomputed
• Let f: U -> V and g: V -> W; then the composition g∘f: U -> W
• If the matrix of f is A and the matrix of g is B, then the matrix of g∘f is BA
• Matrix multiplication is associative
• A(BC) = (AB)C
• The order of evaluation is from right to left
• Matrix multiplication is not commutative
• AB != BA in general
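A short sketch of composition as matrix multiplication (numpy assumed; the 2x2 matrices are arbitrary examples): the composite g∘f acts via the product BA, the product is associative, and swapping the factors generally changes the result.

```python
# Composing linear maps corresponds to multiplying their matrices,
# with the first map applied on the right.
import numpy as np

A = np.array([[1.0, 2.0], [0.0, 1.0]])   # matrix of f
B = np.array([[0.0, 1.0], [1.0, 0.0]])   # matrix of g
u = np.array([3.0, -1.0])

print(np.allclose(B @ (A @ u), (B @ A) @ u))   # True: (g o f)(u) via B @ A
print(np.allclose(A @ B, B @ A))               # False: not commutative
```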
Isomorphism – Invertible linear maps
• If f: V -> V and g: V -> V, then
• f is an isomorphism if there is a g with f(g(v)) = g(f(v)) = v for all v
• An isomorphism from a vector space to itself is an automorphism
• If f has matrix A, then g has matrix A⁻¹; det(A) != 0 for an invertible matrix
• The composition f∘g = g∘f is the identity map, so it is a symmetry
• The collection of invertible matrices is a group, since the inverse of an
invertible matrix is again invertible
• If V is n-dimensional and the scalars come from the field F,
then this group is called GLn(F). The letters "G" and "L"
stand for "general" and "linear"
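A sketch of an invertible map and its inverse (numpy assumed; the matrix is an arbitrary member of GL_2(R)): composing with the inverse in either order returns each vector unchanged.

```python
# An invertible matrix A gives an isomorphism of R^2 with itself.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
assert np.linalg.det(A) != 0            # invertible, so A is in GL_2(R)

A_inv = np.linalg.inv(A)
v = np.array([5.0, -3.0])

print(np.allclose(A_inv @ (A @ v), v))  # True: g(f(v)) = v
print(np.allclose(A @ (A_inv @ v), v))  # True: f(g(v)) = v
```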
Linear Maps for infinite dimensional vector spaces
• Examples of infinite dimensional vector spaces
• The set of all differentiable two-dimensional functions
• The set of all functions
• Let
• V = the set of functions f such that f: R -> R and f is differentiable
• W = the set of ALL functions g such that g: R -> R

[Diagram: D is a linear map from V to W; f1, f2 in V are differentiable functions, and g1, g2 in W are general functions]
Linear map on functional spaces
• Let f1 and f2 be two functions in V
• L is a linear map L: V -> W such that
• L(af1 + bf2) = aL(f1) + bL(f2)
• L(f1) = g1 and L(f2) = g2
• Differentiation D is such a linear map, since
• D(af1 + bf2) = aD(f1) + bD(f2)
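A quick symbolic check of this linearity (sympy assumed; the two sample functions are arbitrary):

```python
# Differentiation satisfies D(a*f1 + b*f2) = a*D(f1) + b*D(f2).
import sympy as sp

x, a, b = sp.symbols('x a b')
f1, f2 = sp.exp(x), x**3

lhs = sp.diff(a * f1 + b * f2, x)
rhs = a * sp.diff(f1, x) + b * sp.diff(f2, x)
print(sp.simplify(lhs - rhs))   # -> 0, so D is linear on these functions
```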
Linear map on functional spaces
• Let V be a vector space of functions
• Let u(x,y) be a two-dimensional function
• Let T be a linear map on V defined by the integral
• (Tf)(x) = ∫ u(x,y) f(y) dy
• Functions like u(x,y) are called kernels and can be compared
to a matrix Aij
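To make the kernel-as-matrix analogy concrete, here is a sketch (numpy assumed; the Gaussian kernel and grid are made up for illustration) that discretizes an integral operator on a grid so the kernel becomes a matrix and applying T becomes a matrix-vector product.

```python
# Discretizing (Tf)(x) = ∫ u(x, y) f(y) dy: the kernel u(x, y) becomes a
# matrix A[i, j] ≈ u(x_i, y_j) * Δy, and T acts as a matrix-vector product.
import numpy as np

n = 200
y = np.linspace(0.0, 1.0, n)
dy = y[1] - y[0]

u = lambda x_, y_: np.exp(-(x_ - y_) ** 2)   # example kernel u(x, y)
f = np.sin(2 * np.pi * y)                    # sample function on the grid

A = u(y[:, None], y[None, :]) * dy           # A[i, j] ≈ u(x_i, y_j) Δy
Tf = A @ f                                   # ≈ (Tf)(x_i)
print(Tf.shape)                              # (200,)
```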
Eigenvalues and Eigenvectors
