
[Marco Gualtieri] 12:09:39

I'm not sure what you mean by your question, because when you have an identity map on
a vector space, there is no reference to a basis there, right?

[Hannah Liu] 12:09:51


An identity map exists on a vector space.

[Marco Gualtieri] 12:09:55


So there is no basis needed to define the identity map.

[Hannah Liu] 12:09:58


Right, you agree. So if you have an identity map going from a vector space—

[Marco Gualtieri] 12:10:07


So if— let me just... hmm.

[Marco Gualtieri] 12:10:28


Let's say that you have the identity map on a vector space V.

[Marco Gualtieri] 12:10:41


Right? So you might ask yourself, Well, what is the matrix of this identity map?

[Marco Gualtieri] 12:10:52


Yeah. And whenever you ask that question of what the matrix is, what you're doing is
choosing a basis on the domain and on the codomain. So you can choose any basis for
the domain and any basis for the codomain.

[Marco Gualtieri] 12:11:12


And this identity map. Let me just simplify the notation a little bit.

[Marco Gualtieri] 12:11:18


This identity map will have a matrix representation for that choice of bases.

[Hannah Liu] 12:11:27


How do we work that out? Well, what are the columns of this matrix going to be?

[Marco Gualtieri] 12:11:33


Well, let's say the basis beta is v_1 up to v_n,

[Marco Gualtieri] 12:11:39


and gamma is w_1 up to w_n. And what you're supposed to do is take v_1—

[Marco Gualtieri] 12:11:47


you're supposed to apply the identity map to it, which just gives v_1—

[Marco Gualtieri] 12:11:53


and you're supposed to rewrite this in terms of the basis for the codomain.

[Marco Gualtieri] 12:12:06


Right, and then you're supposed to put these numbers down the first column, and
then you do the same thing for v_2.

[Marco Gualtieri] 12:12:14


The identity map takes v_2 to v_2, and you can write it in terms of gamma.

[Marco Gualtieri] 12:12:26


Right, and then you put that down the second column, and you continue this way,
and that will be the matrix of the identity map.
[Marco Gualtieri] 12:12:38
So what that means is that the matrix for the identity map could be anything.

[Marco Gualtieri] 12:12:43


It could be— I mean, there's a great variety of possible matrices for the identity
map, depending on which bases you pick for the domain and the codomain. Okay, do you
agree so far?

[Hannah Liu] 12:12:54


Yeah. So the point is: is the identity matrix equal to the identity map?

[Marco Gualtieri] 12:13:01


Right. Okay, so I'm going to get to that in a second. Now— and this is important
for what we're going to talk about today.

[Marco Gualtieri] 12:13:10


So because the identity map is mapping from V to itself.

[Marco Gualtieri] 12:13:14


That means that we have the option of selecting the basis only once.

[Marco Gualtieri] 12:13:22


So if, on the other hand, we choose beta to be equal to gamma — let's say v_1

[Marco Gualtieri] 12:13:33


up to v_n — this is an option that we have only when the linear map is going from a
vector space to itself.

[Marco Gualtieri] 12:13:42


It's only in that case that we have this option. So this is only possible when the
domain is equal to the codomain.

[Marco Gualtieri] 12:13:57


Okay, then, what is the matrix of the identity map? Well, if I now take the matrix
of the identity map in the same basis for the domain and the codomain, this is going
to be — well.

[Marco Gualtieri] 12:14:10


What do you have to do? You have to take v_1.

[Marco Gualtieri] 12:14:12


You apply the identity map to it, you get v_1, and then you write it in terms of the
basis.

[Marco Gualtieri] 12:14:19


Well, guess what: it's going to be 1 times v_1 plus 0 times v_2

[Marco Gualtieri] 12:14:22


plus etc. So you're just going to get 1, 0, 0, ..., 0 down the first column.

[Marco Gualtieri] 12:14:31


Right. So then you will get the identity matrix

[Marco Gualtieri] 12:14:40


Okay, so that's what happens when you choose the same basis for the domain and the
codomain, and then it doesn't depend on which basis you pick: you could have picked
a different basis, beta prime, as long
[Hannah Liu] 12:14:54
as it's the same on both sides — you're going to see the same matrix. Okay, yeah, thank you.
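The column-by-column recipe above can be checked numerically. Here is a minimal sketch (my own illustration, not part of the lecture; it assumes real coordinates and uses numpy, and the function name `matrix_of_identity` is hypothetical): each column is obtained by expressing a domain basis vector in the codomain basis.

```python
import numpy as np

def matrix_of_identity(beta, gamma):
    """Matrix of the identity map with basis `beta` on the domain and
    `gamma` on the codomain.  Column i holds the coordinates of the
    i-th domain basis vector expressed in the codomain basis: we solve
    W c = v_i, where the columns of W are the gamma vectors."""
    V = np.column_stack(beta)
    W = np.column_stack(gamma)
    return np.linalg.solve(W, V)

# Different bases on each side: the matrix of the identity map can be
# far from the identity matrix.
beta  = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]   # standard basis
gamma = [np.array([1.0, 1.0]), np.array([-1.0, 1.0])]  # a different basis
M = matrix_of_identity(beta, gamma)
print(M)  # not the identity matrix

# Same basis on both sides: we always recover the identity matrix.
print(matrix_of_identity(gamma, gamma))
```

This matches the discussion: the identity *map* needs no basis, but its *matrix* does, and only with equal bases on both sides is that matrix the identity.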

[Marco Gualtieri] 12:15:05


You're welcome. Okay, so let's move on now to discuss—

[Marco Gualtieri] 12:15:17


Okay, So let me just briefly review what we did last time.

[Marco Gualtieri] 12:15:21


So, last time—

[Samprit Ghosh] 12:15:28


Oh, I should— oh, thank you, someone's just helped. It's complete.

[Marco Gualtieri] 12:15:45


Oh, hi, Stanford, Thank you for starting the recording.

[Samprit Ghosh] 12:15:49


Okay, okay, good. So last time we basically proved the following theorem.

[Marco Gualtieri] 12:16:02


So, number one: every k-by-n matrix

[Marco Gualtieri] 12:16:11


Is equivalent.

[Marco Gualtieri] 12:16:17


to a block matrix of the form—

[Marco Gualtieri] 12:16:33


Oops.

[Marco Gualtieri] 12:16:50


where we have a single r-by-r square block in the top left corner, which is the
identity matrix, and everything else is 0.

[Marco Gualtieri] 12:16:58


This is k, and this is n,

[Marco Gualtieri] 12:17:06


for r equal to the rank, which is equal to the dimension of the image.

[Marco Gualtieri] 12:17:23


And number 2 —

[Marco Gualtieri] 12:17:29


the concrete version of the statement was part one, and the abstract version of the
theorem is that if A and B are linear maps from V to W

[Marco Gualtieri] 12:17:52


such that — let me put it this way — such that the rank of A is equal to the rank

[Marco Gualtieri] 12:18:03


of B — that means the dimension of the image of A and the dimension of the image of
B are the same —

[Marco Gualtieri] 12:18:10


they are the same number — then they are equivalent.

[Marco Gualtieri] 12:18:23


That is to say, okay, what does it mean to be equivalent?

[Marco Gualtieri] 12:18:25


There exist isomorphisms.

[Marco Gualtieri] 12:18:31


I'm going to give the isomorphisms slightly different names, just to distinguish them
from the maps.

[Samprit Ghosh] 12:18:41


Professor, I think the full screen is not visible to all of us.

[Marco Gualtieri] 12:18:46


What about now? It's the Zoom screen share—

[Samprit Ghosh] 12:18:53


Do you see it now? The whole screen? — Yes. — Okay, okay. So—

[Marco Gualtieri] 12:19:05


So there are isomorphisms such that—

[Marco Gualtieri] 12:19:15


So, let me draw the picture. So we have V,

[Marco Gualtieri] 12:19:21


we have these two maps, and then we have this isomorphism underneath.

[Marco Gualtieri] 12:19:31


I'll write a little isomorphism symbol here

[Marco Gualtieri] 12:19:39


And the point is that B can be written in terms of A: you can get B from A by
simply applying F inverse, composing with A, and then applying G.

[Marco Gualtieri] 12:20:02


So this is the abstract version of the statement above. It's basically saying that
as long as two linear maps going from V to W have the same rank — you could think
of it this way:
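Part one of the theorem can be illustrated numerically. The sketch below is my own illustration, not part of the lecture: it uses the SVD rather than the row and column operations, and the helper name `normal_form_factors` is hypothetical, but the factorization it produces witnesses the same equivalence, with Q·A·P equal to the block matrix with an r-by-r identity and zeros elsewhere.

```python
import numpy as np

def normal_form_factors(A):
    """Return invertible Q (k x k) and P (n x n) with Q @ A @ P equal to
    the block matrix [[I_r, 0], [0, 0]], where r = rank(A).  We write
    A = U S V^T (SVD), then rescale the first r rows of U^T by 1/s_i."""
    U, s, Vt = np.linalg.svd(A)
    r = int(np.sum(s > 1e-10))
    scale = np.ones(A.shape[0])
    scale[:r] = 1.0 / s[:r]        # divide the first r rows by the singular values
    Q = np.diag(scale) @ U.T
    P = Vt.T
    return Q, P, r

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])    # a 2-by-3 matrix of rank 1
Q, P, r = normal_form_factors(A)
N = Q @ A @ P
print(r)                 # 1
print(np.round(N, 10))   # [[1, 0, 0], [0, 0, 0]]
```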

[Marco Gualtieri] 12:20:20


what F is doing is changing the basis, or rotating, V,

[Marco Gualtieri] 12:20:26


and G is rotating the codomain — I mean, I'm using rough language.

[Marco Gualtieri] 12:20:32


I don't know — it's not really a rotation, we haven't spoken about what rotations
are — but it's an isomorphism which changes the basis on W

[Marco Gualtieri] 12:20:42


such that B and A then look the same — they appear to be the same, or they're
equivalent according to this equivalence relation that we defined earlier. Okay, so
what do we want to do now?
[Marco Gualtieri] 12:20:57
Okay — we want to do something similar, but for maps on a single vector space.

[Marco Gualtieri] 12:21:03


So now we want to classify the linear maps in L(V, V).

[Marco Gualtieri] 12:21:18


So V is a finite dimensional vector space over the field.

[Marco Gualtieri] 12:21:23


F, and we're looking now at linear maps from V

[Marco Gualtieri] 12:21:26


to V. And for these—

[Marco Gualtieri] 12:21:33


we have a different equivalence relation,

[Marco Gualtieri] 12:21:46


which is the following. So we'll say that A and B, two linear operators—

[Marco Gualtieri] 12:21:57


actually, it's quite common to call these linear operators

[Marco Gualtieri] 12:22:09


on V, just to emphasize the fact that they are operating on one space—

[Marco Gualtieri] 12:22:18


okay, so these will be equivalent — now, there's a special name for this which is
usually used, which is "similar".

[Marco Gualtieri] 12:22:34


So instead of "equivalent" I'll use the term "similar" —

[Marco Gualtieri] 12:22:41


when there exists an isomorphism

[Marco Gualtieri] 12:22:54


F from V to V. And here I'm going to draw a similar picture.

[Marco Gualtieri] 12:22:59


So we have A going from V to V, and B going from V to V,

[Marco Gualtieri] 12:23:04


And now we have an isomorphism that identifies the top and the bottom.

[Marco Gualtieri] 12:23:12


And now it's the same isomorphism on both sides, and we can write B as F

[Marco Gualtieri] 12:23:24


after A, after F inverse: B = F ∘ A ∘ F⁻¹.
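A quick numerical sketch of similarity (my own illustration, not from the lecture; it works over the reals with numpy): conjugating an operator A by any invertible F gives a similar operator B whose matrix entries look different, but whose intrinsic quantities — trace, determinant, eigenvalues — are unchanged.

```python
import numpy as np

# B = F A F^{-1} is similar to A.
A = np.array([[1.0, 0.0],
              [0.0, 2.0]])
F = np.array([[2.0, 1.0],
              [1.0, 1.0]])          # invertible: det = 1
B = F @ A @ np.linalg.inv(F)

print(B)                            # different entries from A...
print(np.trace(A), np.trace(B))     # ...but the same trace
print(np.linalg.det(A), np.linalg.det(B))  # and the same determinant
```

Invariance of these quantities under conjugation is what makes them properties of the operator itself, independent of the chosen basis.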

[Marco Gualtieri] 12:23:35


Okay — so I'm hoping that you will see the similarity between what I'm talking about
[Marco Gualtieri] 12:23:46
now, and what we were talking about for finite sets, right,

[Marco Gualtieri] 12:23:50


Where you are allowed to relabel the finite set if you're mapping from the set to
itself.

[Marco Gualtieri] 12:23:56


You only want to relabel it once, and that's what this F

[Marco Gualtieri] 12:24:02


is — F is like a relabeling of the set. It's just that

[Marco Gualtieri] 12:24:08


you don't have two sets in the picture — you only have one, and therefore you only
have the opportunity to relabel that one set.

[Marco Gualtieri] 12:24:16


Okay, and that's the job of F. So the only difference between A and B—

[Marco Gualtieri] 12:24:21


well, when A and B are similar, that means that after a relabeling of the unique set
that we have, they are the same.

[Marco Gualtieri] 12:24:29


So let me write that in a kind of informal way: it's saying that A

[Marco Gualtieri] 12:24:35


and B are the same, after a relabeling of

[Marco Gualtieri] 12:24:48


V. Okay — now, I don't really mean that it's a relabeling.

[Marco Gualtieri] 12:24:54


It's just analogous to a relabeling. What it is, is that isomorphism.

[Marco Gualtieri] 12:24:58


Okay. Any question about this goal that we have?

[Marco Gualtieri] 12:25:07


This is the main goal for the remainder of the course.

[Marco Gualtieri] 12:25:10


And maybe the importance of this will become clear as we go

[Marco Gualtieri] 12:25:16


on. Yeah, go ahead, Kevin. Go ahead.

[Kevin Chen] 12:25:21


This might be a silly question, but I remember in Axler—

[Kevin Chen] 12:25:27


I think he mentions a little critique about the concepts of—

[Kevin Chen] 12:25:31


I think — equivalence, similarity, and isomorphism.

[Marco Gualtieri] 12:25:37


And he notes that a question people ask is: well, if isomorphism is so important —
or the act of saying that, yes, with a few mappings you can—

[Kevin Chen] 12:25:46


it's like saying certain linear operators are the same as each other—

[Marco Gualtieri] 12:25:50


what's the point of doing linear algebra at all?

[Kevin Chen] 12:25:54


And he has an answer by saying that it's useful to consider the differences, even
when they're isomorphic to each other.

[Marco Gualtieri] 12:26:02


Would you mind elaborating on this, or is it at all related?

[Marco Gualtieri] 12:26:10


Yeah

[Marco Gualtieri] 12:26:11


Yeah. So I did say a little bit about this in the previous section, where we were
talking about these—

[Marco Gualtieri] 12:26:19


because, you see — well, okay, so I did mention this last time.

[Marco Gualtieri] 12:26:26


So the point is that we encountered this problem — or this

[Marco Gualtieri] 12:26:30


concept — before, when we proved that every finite dimensional vector space over
the field is isomorphic to F^n.

[Marco Gualtieri] 12:26:40


So what's the point of considering vector spaces at all?

[Marco Gualtieri] 12:26:43


Why not just learn about F^n, right? The standard n-dimensional vector

[Marco Gualtieri] 12:26:50


space of dimension n over a field F is F^n,

[Marco Gualtieri] 12:26:54


and every other vector space of dimension n over that field is isomorphic to that
one, right?

[Marco Gualtieri] 12:27:00


So it may seem strange: why did we learn the axioms of vector

[Marco Gualtieri] 12:27:06


spaces when they're all isomorphic to a single one of them that we could have
studied

[Marco Gualtieri] 12:27:09


in the first place? And this is a misguided kind of complaint or realization,
because the standard vector space F^n is not just a vector space.

[Marco Gualtieri] 12:27:29


It is a vector space with a preferred basis. It has a distinguished basis, which is
the unit vectors e_1

[Marco Gualtieri] 12:27:35


up to e_n. And the point is that in many problems you will encounter vector spaces
that do not have any preferred basis at all.

[Marco Gualtieri] 12:27:50


For example, if I give a linear map from F^n to F^m,

[Marco Gualtieri] 12:27:56


That linear map will have a null space, and that null space may be the solutions to
your linear system, and that null space will not have a natural choice of basis.

[Marco Gualtieri] 12:28:06


It needs a choice: you need to choose a basis for that null space. So that null
space—

[Marco Gualtieri] 12:28:13


it is a non-trivial observation that it is isomorphic to F^n for some n.

[Marco Gualtieri] 12:28:19


But it is not at all clear how to make it isomorphic with F^n.

[Marco Gualtieri] 12:28:25


To do that, you need to choose a basis. So even if you just learned about F^n, you
would inevitably obtain vector subspaces, quotient spaces, and so on for which it
would not be clear what

[Marco Gualtieri] 12:28:39


they are — it would not be clear that they are actually copies of F^n.
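The null space example can be made concrete. In the sketch below (an illustration of mine, not part of the lecture; it assumes real entries and uses numpy), the null space of a map from R^3 to R^2 arises naturally, but the *basis* for it has to be chosen — here the choice is delegated to the SVD, which is one choice among many.

```python
import numpy as np

# The rows of Vt belonging to zero singular values span the null space.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])     # a map from R^3 to R^2
U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:].T            # columns form ONE possible basis

print(null_basis.shape)             # (3, 1): the null space is 1-dimensional
print(np.allclose(A @ null_basis, 0))  # the basis vectors really are killed by A
```

Nothing about the map singles out this particular basis; a different algorithm would hand back a different (equally valid) one, which is exactly the point being made above.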

[Marco Gualtieri] 12:28:43


So in that sense, when you work with vector spaces, there are natural operations
you can do on them that will generate other vector spaces, and those vector spaces
will not have bases naturally chosen on them. Okay,

[Marco Gualtieri] 12:28:55


so similarly here: you will encounter natural linear maps that occur in problems,
and these linear maps will not come with a natural basis for the domain or for the
codomain.

[Marco Gualtieri] 12:29:10


They'll just come as linear maps, and so it will be up to us to decide which basis
to choose in order to represent them.

[Marco Gualtieri] 12:29:20


You're going to see a very powerful example of this in the next assignment, where
I'm going to ask you to do something like a mini Google — a mini search engine —
where you have all of

[Marco Gualtieri] 12:29:32


your data provided to you about the connectivity of a network.

[Marco Gualtieri] 12:29:38


But in order to find things like search results, the basis that you are being given
is not a good basis, and it does not help you to solve the problem.

[Marco Gualtieri] 12:29:47


And so you need to find a completely different basis to refactor the information
into a more useful form.

[Marco Gualtieri] 12:29:54


And so, in fact, that's all that Google is doing: it's changing a basis.

[Marco Gualtieri] 12:29:58


It's finding a kind of interesting basis, or a useful basis.

[Marco Gualtieri] 12:30:03


Anyway, we'll see that developing as we go.

[Marco Gualtieri] 12:30:06


But that's the basic idea. All right.

[Marco Gualtieri] 12:30:12


If I continue— maybe, do you have a question? Is it important to ask now?

[Mehdi Benallegue] 12:30:16


Go ahead — ask your question now. — I wanted to ask: since we're talking about this
similarity of a matrix, and we're saying there's a single map where we can relabel,
and then we relabel

[Mehdi Benallegue] 12:30:29


back. So let's say you have your change of basis — like, from beta to beta prime.

[Marco Gualtieri] 12:30:40


Yes. — Is the matrix that represents the linear map from the beta prime basis to
the beta basis also similar? Because when you— it's not its inverse.

[Marco Gualtieri] 12:30:55


Right. Yes — when you're thinking in that direction, you're thinking along the right
lines.

[Marco Gualtieri] 12:31:00


But let me ask you to just think about that on your own, and look at the formulas
for changing basis, and see that— yeah, that's a good point.

[Marco Gualtieri] 12:31:12


So maybe I'll say it this way. For example — right, if—

[Marco Gualtieri] 12:31:25


Yeah — it'll take me too long to give a complete answer to your question.

[Marco Gualtieri] 12:31:30


But you are thinking along the right lines: taking an operator like this, A, and
multiplying it on the left and on the right by F and F inverse —

[Marco Gualtieri] 12:31:46


This is exactly the same as what you do when you change basis on a matrix, right?

[Marco Gualtieri] 12:31:51


so compare this to a change of basis, where, if I have a linear map written in one
basis, we can express it in this way.

[Marco Gualtieri] 12:32:25


Okay. And what you see is that this operator we could call B.
[Marco Gualtieri] 12:32:30
This operator we could call A, and this we could call F.

[Marco Gualtieri] 12:32:36


And this is indeed F inverse, because the order here is different.

[Marco Gualtieri] 12:32:42


Right — this is beta on the domain side, and this is beta on the codomain side.

[Marco Gualtieri] 12:32:47


So when we change the basis for an operator, we are indeed

[Marco Gualtieri] 12:32:55


giving an example of this type of similarity. So two matrices that differ by a
change of basis — and these are matrices acting on the same space —

[Marco Gualtieri] 12:33:11


they would be similar. Okay. So let's start talking about these operators.

[Marco Gualtieri] 12:33:18


So the first thing I want to expose you to is: what kinds of operators are there
among the linear operators on
[Marco Gualtieri] 12:33:34


V? Okay. So I just want you to recall self maps.

[Marco Gualtieri] 12:33:44


Remember when we were looking at maps from a set to itself, right?

[Marco Gualtieri] 12:33:53


We had these maps that would do something like this.

[Marco Gualtieri] 12:33:59


That was an example of a self map, which took the first two elements and flipped
them over, right?

[Marco Gualtieri] 12:34:06


And this is an important idea — that you're doing a permutation on the elements of
the set.

[Marco Gualtieri] 12:34:14


And this is why, when you are studying such maps, you do not want to change the
labeling independently on the domain and the codomain: you want to make sure that
you

[Marco Gualtieri] 12:34:26


retain the information that the first 2 elements of the set are actually exchanged.

[Marco Gualtieri] 12:34:30


So if I relabel this to 2, 1, and 3,

[Marco Gualtieri] 12:34:36


but I did not relabel this, then you might say that this self map was the same
thing as the identity map — which it clearly is not, because you're actually
flipping the first two elements.

[Marco Gualtieri] 12:34:51


So, if you remember, we actually drew out all of the possible self maps, right?
And some of them looked like this, let's say, and so on.

[Marco Gualtieri] 12:35:04


So there were all kinds of self maps that we could be interested in.

[Marco Gualtieri] 12:35:08


So the question that I'm asking here is what kind of operators?

[Marco Gualtieri] 12:35:12


What kind of self maps can I have on a vector space?

[Marco Gualtieri] 12:35:17


Okay, so let's just go through the main important examples. Example 0 is the zero
operator.

[Marco Gualtieri] 12:35:28


What does it do? It just takes a vector in V and takes it to 0, and the matrix of
the 0 operator.

[Marco Gualtieri] 12:35:37


with respect to a basis beta is just 0, independent of beta.

[Marco Gualtieri] 12:35:51


Okay. The next example: the identity operator.

[Marco Gualtieri] 12:36:01


What does it do? It takes v to v — that's a linear operator — and the matrix for
the identity, when I use the same basis for the domain and the codomain, is what we
discussed before: it's just ones down the

[Marco Gualtieri] 12:36:13


diagonal

[Marco Gualtieri] 12:36:22


Okay, and this is also independent of which basis I choose, just because these
operators are so simple.

[Marco Gualtieri] 12:36:32


Okay. Now, another example, which is very similar to the previous one.

[Marco Gualtieri] 12:36:36


Suppose that I take an element lambda in the field. Then I can take that element
multiplied by the identity; that will take v

[Marco Gualtieri] 12:36:46


to lambda v, for all v. Okay, and the matrix of lambda times

[Marco Gualtieri] 12:36:53


the identity in the basis beta is just lambda down the diagonal.

[Marco Gualtieri] 12:37:01


Okay — and depending on the dimension there'll be several lambdas. Again, this is
independent of beta.

[Marco Gualtieri] 12:37:14


Okay. So far these are very simple operators — they just multiply by a scalar,
right?
[Marco Gualtieri] 12:37:22
Actually, the first two are examples of the third. Now let's do a slightly more
complicated example —

[Marco Gualtieri] 12:37:36


the operator with lambda_1 and lambda_2 on the diagonal and zeros elsewhere, which
is a linear map from F^2 to F^2,

[Marco Gualtieri] 12:37:55


where lambda_1 and lambda_2 are elements of the field

[Marco Gualtieri] 12:37:59


F

[Marco Gualtieri] 12:38:06


Okay. So if I draw a picture of this operator — what is it doing?

[Marco Gualtieri] 12:38:17


It's taking the standard basis, (1, 0) and (0, 1), and simply multiplying the first
basis element by lambda_1,

[Marco Gualtieri] 12:38:36


and multiplying the second by lambda_2.

[Marco Gualtieri] 12:38:46


Okay. So what that means is that the unit square is sent to this rectangle.

[Marco Gualtieri] 12:39:04


Okay — that's what this linear map is doing: it's expanding or contracting,
depending on how big lambda_1 is.

[Marco Gualtieri] 12:39:12


It's expanding or contracting in the x direction by one factor, and in the y
direction by a different factor.

[Marco Gualtieri] 12:39:22


Okay. So it could be, for example, that lambda_2 is very, very small,

[Marco Gualtieri] 12:39:26


so it's squishing it down to a pancake, and lambda_1

[Marco Gualtieri] 12:39:30


could be very large, right? So that could be a possibility.

[Marco Gualtieri] 12:39:36


Okay. But I just want to make an important comment, which is that

[Marco Gualtieri] 12:39:43


This matrix will change if we use a different basis

[Marco Gualtieri] 12:39:59


Okay. So it looks simple

[Marco Gualtieri] 12:40:06


diagonal in the standard basis, but it would not be simple in an arbitrary basis.

[Marco Gualtieri] 12:40:23


Okay, so just to do an example of this, let's take A to be the matrix with 1 and 2
down the diagonal: 1, 0, 0, 2.
[Marco Gualtieri] 12:40:33
Okay.

[Marco Gualtieri] 12:40:42


Okay, so this is in the basis — let's say e_1, e_2;

[Marco Gualtieri] 12:40:48


I'll call this basis beta. Okay, now let's consider a different basis,

[Marco Gualtieri] 12:40:54


beta prime: e_1 prime, e_2 prime. Okay — I want to figure out what the matrix looks
like in that basis.

[Marco Gualtieri] 12:41:06


So in order to do this, I need to change my basis. So I'll look at the—

[Marco Gualtieri] 12:41:15


So this 1, 0, 0, 2 — this is A written in the basis beta.

[Marco Gualtieri] 12:41:23


But if I want to know what A is in the basis beta prime, this is going to be

[Marco Gualtieri] 12:41:30


I_{beta' beta} A_{beta beta} I_{beta beta'}.

[Marco Gualtieri] 12:41:40


Okay, so I need to specify what I_{beta beta'} is, or I_{beta' beta}, right?

[Marco Gualtieri] 12:41:52


And I need to say what beta prime is in terms of beta.

[Marco Gualtieri] 12:41:57


So let me write that. So, e_1 prime — I can choose my new basis,

[Marco Gualtieri] 12:42:02


so let me just choose it as follows. I'm going to say that the first element of
beta prime is going to be e_1' = (1/√2) e_1 + (1/√2) e_2, and the second basis

[Marco Gualtieri] 12:42:18


element is going to be e_2' = −(1/√2) e_1 + (1/√2) e_2.

[Marco Gualtieri] 12:42:26


So what is this? Oops — what is this? Let me just make a little bit of space here.

[Marco Gualtieri] 12:42:36


So here's my initial basis, e_1, e_2, and here is my new basis,

[Marco Gualtieri] 12:42:46


e_1 prime, e_2 prime.

[Marco Gualtieri] 12:43:02


Okay. So I've basically rotated my basis by 45 degrees.

[Marco Gualtieri] 12:43:08


Okay, so I want to know: what does the matrix look like in this new basis?
[Marco Gualtieri] 12:43:16
So it's going to be P inverse A P, where P

[Marco Gualtieri] 12:43:26


is this expression of the beta prime basis in terms of the beta basis, and that's
exactly given by those coefficients: 1/√2 — oops — 1/√2, 1/√2, minus 1/√2,

[Marco Gualtieri] 12:43:44


and 1/√2. So I've written this basis element in this column, because I'm supposed
to express beta prime in terms

[Marco Gualtieri] 12:43:56


of beta, and then I put these coefficients down here.

[Marco Gualtieri] 12:43:58


Okay, that gives me P, and then this will be P inverse.

[Marco Gualtieri] 12:44:02


And so if I work that out, I'm going to get: 1/√2, 1/√2, minus 1/√2, 1/√2 — that's
P inverse — then I'll have my matrix with 1 and 2 on the diagonal, and then I'll

[Marco Gualtieri] 12:44:25


have P, with its 1/√2 entries. Okay, and if I work that out, what I'll get is:
that's 3/2,

[Marco Gualtieri] 12:44:38


1/2, 1/2, and 3/2. So we see that this thing is no longer diagonal.
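The board computation can be checked numerically. This sketch (my own verification, not part of the lecture; real arithmetic with numpy) conjugates A = diag(1, 2) by the 45-degree rotation defined above and recovers the non-diagonal matrix with 3/2 on the diagonal and 1/2 off it.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 2.0]])          # A in the standard (beta) basis
s = 1.0 / np.sqrt(2.0)
P = np.array([[s, -s],              # columns: e1', e2' written in the beta basis
              [s,  s]])

A_prime = np.linalg.inv(P) @ A @ P  # the matrix of A in the beta prime basis
print(A_prime)                      # [[1.5, 0.5], [0.5, 1.5]] -- no longer diagonal
```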

[Marco Gualtieri] 12:44:49


Okay — so it looks simple in the beta basis, but it does not look simple in the
beta prime basis.

[Marco Gualtieri] 12:44:58


And that's because — you know, the reason that e_1 and e_2 are special is that e_1
is multiplied by a constant and e_2 is multiplied by a constant. But if you take
the sum

[Marco Gualtieri] 12:45:12


of e_1 and e_2, then that is not multiplied by a constant, right?

[Marco Gualtieri] 12:45:18


Because if you look at this quantity here — what is the operator going to do? It's
going to multiply e_1 by the constant 1, and it's going to multiply e_2 by the
constant 2.

[Marco Gualtieri] 12:45:30


So this vector is not going to be sent to an overall multiple of itself anymore.

[Marco Gualtieri] 12:45:33


It's going to be sent to a different vector. So what we would say is that e_1 and
e_2 are eigenvectors, and e_1 prime and e_2 prime are not. Okay? So this word
"eigenvector"—

[Marco Gualtieri] 12:45:57


"eigen" means something like "self": the vector is sent to itself, in a sense —
it's similar to a fixed point.
[Marco Gualtieri] 12:46:07
Okay. So it's a lot better to write your operator in a basis of eigenvectors,
because then the matrix will look simple.

[Marco Gualtieri] 12:46:19


It will look diagonal — that's the idea. So, these eigenvectors—

[Marco Gualtieri] 12:46:29


So, the definition. Let A be a linear operator on V.

[Marco Gualtieri] 12:46:39


An eigenvector is a nonzero vector v

[Marco Gualtieri] 12:46:52


such that Av = lambda v, for some lambda in the field.

[Marco Gualtieri] 12:47:01


It is sent to a multiple of itself — and I just want to point out that it could be
sent to the zero multiple, 0 times itself.

[Marco Gualtieri] 12:47:14


This lambda is called the eigenvalue of the eigenvector v, and the set of all
eigenvalues of the operator

[Marco Gualtieri] 12:47:42


A is called its spectrum.

[Marco Gualtieri] 12:47:53


Okay, So there's a few things being defined here so let me just highlight that.

[Marco Gualtieri] 12:48:01


So we have the definition of an eigenvector: it's a nonzero vector that is sent to
itself

[Marco Gualtieri] 12:48:07


multiplied by a constant; that constant is the eigenvalue, and the set of all
eigenvalues is called the spectrum of the operator

[Marco Gualtieri] 12:48:20


A. Okay. Maybe another piece of language — so, okay:

[Marco Gualtieri] 12:48:33


if we can find a basis of V which consists entirely of eigenvectors—

[Marco Gualtieri] 12:49:10


which consists entirely of eigenvectors of A — let me just say v_1

[Marco Gualtieri] 12:49:19


up to v_n, a basis of V which consists entirely of eigenvectors of A — that is to
say, A applied to v_i

[Marco Gualtieri] 12:49:31


is equal to lambda_i times v_i, for i = 1 up to n —

[Marco Gualtieri] 12:49:41


then the matrix of A in this basis is extremely simple. Because what does it do?
[Marco Gualtieri] 12:49:51
It just takes each basis vector to itself, multiplied by lambda_i.

[Marco Gualtieri] 12:49:53


So we're just going to get

[Marco Gualtieri] 12:50:09


A diagonal matrix

[Marco Gualtieri] 12:50:19


where the eigenvalues are on the diagonal.
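Going the other way numerically ties the two examples together. In this sketch (my illustration, not from the lecture, using numpy), we start from the non-diagonal matrix obtained in the beta prime basis, recover the spectrum {1, 2}, and check that in the basis of eigenvectors the matrix becomes diagonal with the eigenvalues on the diagonal.

```python
import numpy as np

B = np.array([[1.5, 0.5],
              [0.5, 1.5]])          # A written in the beta prime basis
eigenvalues, eigenvectors = np.linalg.eig(B)

print(np.sort(eigenvalues.real))    # the spectrum: [1. 2.]

# In the eigenvector basis the matrix is diagonal, eigenvalues on the diagonal.
V = eigenvectors
D = np.linalg.inv(V) @ B @ V
print(np.round(D, 10))              # diagonal, entries 1 and 2 (in some order)
```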

[Marco Gualtieri] 12:50:27


Okay. So that would mean that in this basis — oops —

[Marco Gualtieri] 12:50:38


v_1, v_2, v_3, right? Then, see, the whole point is that when you find a basis of
eigenvectors, that means that in this basis, if you take the unit cube — oops, I
want to make this in black — if you

[Marco Gualtieri] 12:50:57


take the unit cube in this basis

[Marco Gualtieri] 12:51:07


right — then what is it going to do? It's going to multiply the first edge by
lambda_1,

[Marco Gualtieri] 12:51:16


the second edge by lambda_2, the third edge

[Marco Gualtieri] 12:51:29


by lambda_3, and so on — and what you'll get

[Marco Gualtieri] 12:51:46


is a scaled cube, where the scaling is different in every dimension. So we will
take a unit cube in the basis v_1 up to v_n

[Marco Gualtieri] 12:52:00


to a rescaled cube which has different scaling factors in all the different
directions. Now, that means that you've organized yourself, and you know exactly
what this linear operator is doing:

[Marco Gualtieri] 12:52:11


it's taking certain special directions and scaling them, and other special
directions and scaling those.

[Marco Gualtieri] 12:52:18


Okay. So this is the ideal scenario. But there's a warning.

[Marco Gualtieri] 12:52:27


So there's a warning that I'll give, and then I'll ask for questions.

[Marco Gualtieri] 12:52:31


The warning is that it may not be possible

[Marco Gualtieri] 12:52:40


to find that many eigenvectors. Okay — so it might be that your linear operator
does not have enough eigenvectors to form a basis.
[Marco Gualtieri] 12:52:57
Okay. So this is very similar to, you know, when you have a permutation, right?

[Marco Gualtieri] 12:53:10


Of course it could be that every single point is fixed, but we know that there are
possibilities where they're not fixed where they are actually permuted in this way.

[Marco Gualtieri] 12:53:22


Or you could have something like this, where 1 is sent to 2, 2 is sent to 3, 3 is sent to 4, and 4 is sent to 1, right.

[Marco Gualtieri] 12:53:32


You will not be able to label these so that every point is fixed.

[Marco Gualtieri] 12:53:37


Sometimes you have these cycles

[Marco Gualtieri] 12:53:45


The presence of these cycles is very similar to

[Marco Gualtieri] 12:53:49


Well, what we're gonna see is that just like the fact that permutations can have
cycles that cannot be removed by a relabeling.

[Marco Gualtieri] 12:53:57


Similarly, it is possible for a linear operator to have these cycles which cannot
be removed and diagonalized, so that you will, you won't necessarily be able to
find a basis of eigenvectors.

[Marco Gualtieri] 12:54:10


It's a very similar, totally analogous phenomenon. So just to give you an example of this,

[Marco Gualtieri] 12:54:21


Consider this transformation from F^2 to F^2, right?

[Marco Gualtieri] 12:54:30


What does it do? Well, here's my basis, let's say e1 and e2. e1 is sent to itself,

[Marco Gualtieri] 12:54:44


But e 2 is sent to e one plus e 2, so that means that

[Marco Gualtieri] 12:54:56


So it means that this unit square is taken to this slanted square.

[Marco Gualtieri] 12:55:10


Okay. And so you can see that, for any other vector... So e2 fails to be an eigenvector.

[Marco Gualtieri] 12:55:17


e2 is not sent to itself or a multiple of itself. So you could try to find some other vector somewhere, right?

[Marco Gualtieri] 12:55:27


And you could ask, Is there another vector? Is there another vector, somewhere here
that is sent to itself, or to a multiple of itself?
[Marco Gualtieri] 12:55:34
But you can see that that is impossible, because any vector I pick which is not on this line is going to be sheared in this direction.

[Marco Gualtieri] 12:55:43


It's going to be moved in that direction. So in this case there is only one eigenvector.

[Marco Gualtieri] 12:56:02


Well, there is only

[Marco Gualtieri] 12:56:08


So let me put it this way: it's not possible

[Marco Gualtieri] 12:56:15


To extend the eigenvector e1 to a basis of eigenvectors.

[Marco Gualtieri] 12:56:31


In this case. Okay, because when I look at what it does to e2, what you get is e2 plus a multiple of e1.

[Marco Gualtieri] 12:56:42


Okay? Or you could also look at this matrix: 0 1, 0 0.

[Marco Gualtieri] 12:56:47


What does it do? Well, it takes e one to 0 oops

[Marco Gualtieri] 12:56:55


It takes e1 to 0, and it takes e2 to e1, right?

[Marco Gualtieri] 12:57:06


So you see that e1 is an eigenvector, because it is sent to a multiple of itself, namely 0.

[Marco Gualtieri] 12:57:15


So here e1 is an eigenvector with lambda equals 0, but e2 is sent to e1, so it's like a cycle.

[Marco Gualtieri] 12:57:28


Kind of: e2 is sent to e1, and e1 is sent to 0. But there's no other vector; if I take any other vector at all, any vector, oops,

[Marco Gualtieri] 12:57:43


Let's say a e1 plus b e2, where is it going to be sent?

[Marco Gualtieri] 12:57:48


It's going to be sent to b e1, so this cannot be a multiple of a e1 plus b e2.

[Marco Gualtieri] 12:57:59


This cannot be a multiple unless a is equal to 0. And well, if it's equal to 0,

[Marco Gualtieri] 12:58:05


Then we're just back... sorry, sorry, it would require b

[Marco Gualtieri] 12:58:10


to be equal to 0, and then we're back at e1 again.
[Marco Gualtieri] 12:58:15
So any other vector is not an eigenvector, right?
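This "cycle" matrix can be sketched the same way: e1 is sent to 0 (so it is an eigenvector with eigenvalue 0), e2 is sent to e1, and a general vector a e1 + b e2 lands on b e1, which is parallel to the input only when b = 0.

```python
import numpy as np

# e1 -> 0 and e2 -> e1: a "cycle", like a 2-cycle in a permutation.
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])

e1, e2 = np.eye(2)
print(N @ e1)  # [0. 0.]  e1 is an eigenvector with lambda = 0
print(N @ e2)  # [1. 0.]  e2 goes to e1, not to a multiple of e2

# A general vector a*e1 + b*e2 is sent to b*e1, which is a multiple
# of the input only when b = 0.
a, b = 3.0, 2.0
print(N @ (a * e1 + b * e2))  # [2. 0.]
```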

[Marco Gualtieri] 12:58:23


So the basic task that we need to do, which we're gonna do next week, is to deal with this problem: that we cannot necessarily find a basis of eigenvectors.

[Marco Gualtieri] 12:58:34


But we have to try. So we need to figure out how to find a basis of eigenvectors when it does exist, and if it doesn't exist, then we need to do the best we can and find what we call generalized

[Marco Gualtieri] 12:58:46


Eigenvectors, and these generalized Eigenvectors will be like the cycles that we
saw in the category of sets.

[Marco Gualtieri] 12:58:55


Okay, let me leave it there for today, and if there are any questions, go ahead. Any questions?

[Mehdi Benallegue] 12:59:08


Yes, So I okay, maybe i'm jumping a little bit ahead.

[Marco Gualtieri] 12:59:15


But in the problem set we congratulations I I can.

[Mehdi Benallegue] 12:59:24


In the problem set, we're supposed to show that, no matter the choice of basis,

[Mehdi Benallegue] 12:59:31


The trace function is always going to be the same. So if we express the matrix in the eigenvector basis and then we take the trace, it's gonna be the sum of all the eigenvalues, right?

[Marco Gualtieri] 12:59:45


That's right, yeah. So the trace, you can think of it as taking the sum of the eigenvalues.

[Marco Gualtieri] 12:59:55


That's right, but in order for that to be true, you'll need there to be a basis of eigenvectors, and there may not be a basis of eigenvectors. So that gives an interpretation to the

[Marco Gualtieri] 13:00:09


trace. If there's a basis of eigenvectors, then you can say that the meaning of the trace is that it's the sum of all those

[Marco Gualtieri] 13:00:19


numbers, those eigenvalues. And what we're going to find out later is: what does the trace mean if there is not a basis of eigenvectors? Then what is it?

[Mehdi Benallegue] 13:00:33


And if there is a basis of eigenvectors, can we determine the individual eigenvalues based on the value of the trace?

[Marco Gualtieri] 13:00:43


The trace is only the sum, so it's only collective information.
[Mehdi Benallegue] 13:00:50
It's not individual information. Okay, thank you. Yeah, Mateen, go ahead.

[Mateen Ismail] 13:00:56


Go ahead. Yes, so I was just playing around with a few more operators.

[Mateen Ismail] 13:01:01


I looked at a scaling operator, and it looks like in any basis the thing will...

[Marco Gualtieri] 13:01:15


Which operator are you talking about? I think it's a scaling, where basically you just multiply every vector.

[Marco Gualtieri] 13:01:22


Then, yeah, indeed. See, that was example number 2: if you multiply the vectors just by a constant, then any vector

[Mateen Ismail] 13:01:35


Is an eigenvector. That's correct, and it has the same eigenvalue, which is that scale factor. Exactly.
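This last observation is easy to verify: for the scaling operator lam * I, every vector is an eigenvector with the same eigenvalue lam. A minimal sketch, with lam = 2.5 and v chosen arbitrarily:

```python
import numpy as np

lam = 2.5
A = lam * np.eye(2)  # the scaling operator lam * I

v = np.array([4.0, -7.0])  # any vector at all
print(np.allclose(A @ v, lam * v))  # True: v is an eigenvector with eigenvalue lam
```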