Logic For Computer Science: Vector Spaces
Vector Spaces
Vector Space
• Spanning the Space
• Change of Basis
• Generalize
Colour
Error Correcting Codes
Fourier Transformation
Polynomials & Integer Multiplication
JPEG Image Compression
Dimension Reduction Face Recognition
Linear Transformations
Calculus
Probabilistic Markov Process
Quantum Probability
Jeff Edmonds
Lecture 6.4, York University, Math1090, EEC6111
Vector Spaces
A vector v is:
• an arrow with a direction and a length
• a tuple of real coefficients a1,a2,…,ad

[Figure: the arrow v = a1,a2,a3 drawn with its components a1 and a2.]
Vector Spaces
We learned about vectors in math or in physics. Ok. So we all know about Euclidean Space.
A vector v is:
• an arrow with a direction and a length
• a tuple of real coefficients a1,a2,…,ad
d is the # of dimensions.

[Figure: v = (3,4) drawn as an arrow 3 across and 4 up.]
Vector Spaces
∀ basis w1,w2,…,wd of vectors (linearly independent, i.e. none is a sum of the others)
∀ vector v in the entire space
∃ unique real coefficients a1,a2,…,ad
  v = a1w1 + a2w2 + … + adwd
We say the basis spans the space.
So can we generalize this to get something new?

Standard Basis = w1,w2 = (1,0), (0,1)
What is special about this basis?
v = (3,4)
a1,a2 = 3,4
Vector Spaces
∀ basis w1,w2,…,wd of vectors (linearly independent)
∀ vector v in the entire space
∃ real coefficients a1,a2,…,ad
  v = a1w1 + a2w2 + … + adwd = a1′w1′ + a2′w2′ + … + ad′wd′
Ok that is more cool. Every basis can be used to reach every place.

Standard Basis = w1,w2        Strange Basis = w1′,w2′
[Figure: the same vector v reached once from the standard basis and once from the strange basis.]

Excellent. Let's talk about new things.
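As a sketch of the claim above, the coefficients in any basis can be found by solving a linear system; the "strange" basis (2,1), (1,3) below is a made-up example for illustration, not the one drawn on the slide:

```python
import numpy as np

# Find the coefficients of v in two bases by solving W @ a = v,
# where the basis vectors are the columns of W.
v = np.array([3.0, 4.0])

standard = np.eye(2)                  # columns w1 = (1,0), w2 = (0,1)
strange = np.array([[2.0, 1.0],
                    [1.0, 3.0]])      # columns w1 = (2,1), w2 = (1,3)

a_std = np.linalg.solve(standard, v)  # coefficients 3, 4
a_odd = np.linalg.solve(strange, v)   # coefficients 1, 1: 1*(2,1) + 1*(1,3) = (3,4)

# Either coefficient tuple rebuilds the same vector v:
assert np.allclose(standard @ a_std, v)
assert np.allclose(strange @ a_odd, v)
```

The solution is unique exactly because the basis columns are linearly independent.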
Vector Spaces
Defn: vector space
∀ vectors u,v,w  ∀ reals a,b
• Closed: a×u + b×v is a vector, 0×u = zero vector, 1×u = u.
• Associative: u+(v+w) = (u+v)+w
• Commutative: u+v = v+u
• Distributive: a×(u+v) = a×u + a×v
• + Inverse: ∀v ∃u  u+v = 0, i.e. u = −v

[Figure: the standard and strange bases again, each expressing the same v.]
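The axioms above can be spot-checked numerically; this minimal sketch tests them on random vectors in R^4 (the dimension and scalars are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 4))   # three random 4-dimensional vectors
a = 2.5

assert np.allclose(u + (v + w), (u + v) + w)      # associative
assert np.allclose(u + v, v + u)                  # commutative
assert np.allclose(a * (u + v), a * u + a * v)    # distributive
assert np.allclose(0 * u, np.zeros(4))            # 0*u = zero vector
assert np.allclose(1 * u, u)                      # 1*u = u
assert np.allclose(u + (-u), np.zeros(4))         # additive inverse
```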
FFT Compression
v = a1′w1′ + a2′w2′ + … + ad′wd′
af′ = amount of frequency f.

[Figure: a strange basis w1′,w2′ of frequency waveforms expressing the signal v.]
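One way to see "amount of frequency f" concretely is the discrete Fourier transform, which numpy provides; the signal below is a made-up mix of two frequencies:

```python
import numpy as np

# The DFT rewrites a sampled signal (a vector in the standard time basis)
# in a basis of frequencies; coefficient a_f is the amount of frequency f.
# This signal is built from frequencies 5 and 12.
n = 64
t = np.arange(n)
signal = 3 * np.sin(2 * np.pi * 5 * t / n) + 1 * np.sin(2 * np.pi * 12 * t / n)

coeffs = np.fft.fft(signal)          # coordinates in the frequency basis
rebuilt = np.fft.ifft(coeffs).real   # change back to the time basis

assert np.allclose(rebuilt, signal)  # no information lost in the change of basis

# The two planted frequencies carry almost all of the energy:
strongest = np.argsort(np.abs(coeffs[:n // 2]))[-2:]
assert set(strongest.tolist()) == {5, 12}
```

Compression drops the small coefficients and keeps only the strong frequencies.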
Dimension Reduction
Given a collection of faces, represent each image by a vector with a dimension for each pixel, so each face is described by 10,000 numbers.
"Fit" the data with a multi-dimensional ellipse.
Transform from the standard basis x1,…,x10,000 to the basis a1,…,a10,000 in the directions giving important features.
Drop all the coordinates except for a1,…,a100, so each face is described by 100 numbers.
Then recognize the face using these 100 numbers.
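The pipeline above can be sketched with a plain SVD on synthetic data (no real faces; the sizes 200×400 and the 5 kept coordinates are illustrative stand-ins for 10,000 pixels and 100 features):

```python
import numpy as np

# Each "image" is a vector; SVD finds the ellipse's principal directions,
# and we keep only the top few coordinates.
rng = np.random.default_rng(1)
n_images, n_pixels, n_keep = 200, 400, 5

# Images that truly vary along only a few hidden directions, plus noise.
hidden = rng.standard_normal((n_images, n_keep))
directions = rng.standard_normal((n_keep, n_pixels))
images = hidden @ directions + 0.01 * rng.standard_normal((n_images, n_pixels))

centered = images - images.mean(axis=0)
# Rows of Vt are the principal directions of the fitted "ellipse".
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

reduced = centered @ Vt[:n_keep].T   # each image: 400 numbers -> 5 numbers
restored = reduced @ Vt[:n_keep]     # back to pixel space

# Almost nothing is lost, because the data really was low-dimensional.
error = np.linalg.norm(restored - centered) / np.linalg.norm(centered)
assert error < 0.05
```

Recognition then compares faces using only the reduced coordinates.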
Linear Transformations
Basis = [w1, w2]
[Figure: the images T(w1) and T(w2) drawn as arrows.]
We only need to know where the basis vectors get mapped:
T(v) = a1T(w1) + a2T(w2) + … + adT(wd)
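A minimal numeric check of this fact, using a hypothetical 90-degree rotation as T:

```python
import numpy as np

# A linear T is determined by where it sends the basis vectors:
# T(v) = a1*T(w1) + a2*T(w2) for v = a1*w1 + a2*w2.
T = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # 90-degree rotation (made-up example)

w1, w2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
a1, a2 = 3.0, 4.0
v = a1 * w1 + a2 * w2

direct = T @ v                              # apply T to v itself
via_basis = a1 * (T @ w1) + a2 * (T @ w2)   # combine the mapped basis vectors
assert np.allclose(direct, via_basis)       # both give (-4, 3)
```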
Integrating
• f(x) = x^2 e^x sin x
• Can you differentiate it? Sure!
  f′(x) = 2x e^x sin x + x^2 e^x sin x + x^2 e^x cos x
• Can you integrate it? Ahh? No.
I can! Think of differentiation as a Linear Transformation and then invert it.
Integrating
d/dx (x^2 e^x sin x + x^2 e^x cos x + x e^x sin x + x e^x cos x + e^x sin x + e^x cos x)
  = 2x^2 e^x cos x + 2x e^x sin x + 4x e^x cos x + 1 e^x sin x + 3 e^x cos x

Basis = x^2 e^x sin x, x^2 e^x cos x, x e^x sin x, x e^x cos x, e^x sin x, e^x cos x

[ 1 -1  0  0  0  0 ] [ 1 ]   [ 0 ]
[ 1  1  0  0  0  0 ] [ 1 ]   [ 2 ]
[ 2  0  1 -1  0  0 ] [ 1 ] = [ 2 ]
[ 0  2  1  1  0  0 ] [ 1 ]   [ 4 ]
[ 0  0  1  0  1 -1 ] [ 1 ]   [ 1 ]
[ 0  0  0  1  1  1 ] [ 1 ]   [ 3 ]
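The matrix can be used directly: multiplying differentiates, and solving the linear system (inverting the transformation) integrates. A sketch in numpy:

```python
import numpy as np

# Differentiation on span{x^2 e^x sin x, x^2 e^x cos x, x e^x sin x,
# x e^x cos x, e^x sin x, e^x cos x}, with column j holding the
# derivative of basis function j written in the same basis:
D = np.array([
    [1, -1, 0,  0, 0,  0],
    [1,  1, 0,  0, 0,  0],
    [2,  0, 1, -1, 0,  0],
    [0,  2, 1,  1, 0,  0],
    [0,  0, 1,  0, 1, -1],
    [0,  0, 0,  1, 1,  1],
], dtype=float)

f = np.ones(6)   # f = the sum of all six basis functions
df = D @ f       # its derivative, as coordinates in the basis
assert np.allclose(df, [0, 2, 2, 4, 1, 3])

# Integration is the inverse linear transformation:
assert np.allclose(np.linalg.solve(D, df), f)
```

Solving with `D` recovers the antiderivative's coordinates without any integration by parts.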
Markov Process Classical Probability
The present is specified by the current state.
The past does not matter.
The future is determined probabilistically.
• A node for each state
• A dimension for each state
• A vector of probabilities specifies which state is current
• Probabilities add to one

[Figure: a Bull/Bear/Stag state-transition diagram with edge probabilities such as .8, .05, .25, .5; applying the transition matrix to the state vector (.075, .025) gives (0.13375, 0.03875).]
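A sketch of such a process, with made-up two-state transition probabilities (the slide's Bull/Bear/Stag numbers are not reproduced here):

```python
import numpy as np

# Column j holds the probabilities of moving out of state j,
# so each column sums to 1 (a stochastic matrix).
P = np.array([[0.9, 0.3],    # -> Bull
              [0.1, 0.7]])   # -> Bear

state = np.array([1.0, 0.0])           # start surely in Bull
for _ in range(100):
    state = P @ state                  # one probabilistic step
    assert np.isclose(state.sum(), 1)  # probabilities keep adding to one

# Repeated steps converge to the stationary distribution of this chain:
assert np.allclose(state, [0.75, 0.25])
```

The stationary vector is the eigenvector of `P` with eigenvalue 1, which iteration finds automatically.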
Unitary Matrix Quantum Probability
The present:
• A vector of probabilities specifies which state is current
• The probabilities could be negative (or complex)
  (I read that all quantum computing can be done with the reals.)
• Length of vector is 1 (i.e. the sum of squares is 1)
The future is determined by a unitary transformation:
a rotation preserving lengths and angles.
Rows/columns are orthonormal, i.e. length one and perpendicular, so MᵀM = I.
All unitary transformations can be decomposed into these simple 2-d rotations.

[Figure: the Bull/Bear/Stag diagram again, now with a rotation matrix [cos −sin; sin cos] and entries such as √(1/2) acting on the state vector.]
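A minimal sketch of one unitary step, using the 2-d rotation named above (angle chosen arbitrarily):

```python
import numpy as np

# A quantum state is a vector of amplitudes with length 1 (sum of squares),
# and it evolves by a unitary matrix. A real 2x2 rotation is the simplest
# unitary: it preserves lengths, and amplitudes may go negative.
theta = np.pi / 3
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(U.T @ U, np.eye(2))   # rows/columns orthonormal: U^T U = I

state = np.array([1.0, 0.0])
state = U @ state                        # amplitudes (0.5, 0.866...)
state = U @ state                        # (-0.5, 0.866...): a negative amplitude!
assert np.isclose(np.linalg.norm(state), 1)

probabilities = state ** 2               # squaring gives measurement probabilities
assert np.isclose(probabilities.sum(), 1)
```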
Unitary Matrix Quantum Probability
A photon must be a particle because it hits things, e.g. one hits the back screen at a time.
When it acts like a wave, what is waving?
The probability distribution!
Unitary Matrix Quantum Probability
Now the photon is located probabilistically in one of the red places.
Then it "moves".

[Figure: cross section of the red curve — a gaussian times a cosine, with real & imaginary parts.]
Unitary Matrix Quantum Probability
A positive probability of arriving here via slit one.
A positive probability of arriving here via slit two.
These probabilities add: positive interference.

A positive probability of arriving here via slit one.
A negative probability of arriving here via slit two.
These probabilities add: negative interference.

[Figure: cross section of the red curve — a gaussian times a cosine, real & imaginary parts — showing bright and dark interference bands.]
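The interference arithmetic above can be shown with made-up amplitudes:

```python
# Two-slit sketch: in the quantum case amplitudes (not probabilities) add,
# and only then are squared. The numbers are illustrative.
amp_slit1 = 0.5
amp_slit2_pos = 0.5     # in phase with slit one
amp_slit2_neg = -0.5    # out of phase with slit one

bright = (amp_slit1 + amp_slit2_pos) ** 2   # positive interference: 1.0
dark = (amp_slit1 + amp_slit2_neg) ** 2     # negative interference: 0.0

# Classically the probabilities themselves would add, the same at both spots:
classical = amp_slit1 ** 2 + amp_slit2_pos ** 2   # 0.5 either way
```

Negative amplitudes are exactly what lets `dark` cancel to zero; classical probabilities cannot do that.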
If you want one sample of where the system will end up after a long time:
Markov Process Classical Probability: mathematically, it is sufficient to follow one path.
Unitary Matrix Quantum Probability: mathematically, you must follow all paths, because of the negative interference.
This is why they say the quantum system is in a superposition.
I think negative probabilities are a bigger deal.
End