Fiedler Linearizations of Polynomial Matrices and Applications
1 DataScouting S.A., Thessaloniki, Greece (www.datascouting.com)
2 Department of Informatics and Communications, Technological Educational Institute of Serres, Greece
Dr Vologiannidis Stavros
Introduction
Definition
A linearization of a regular polynomial matrix $T(s) = \sum_{i=0}^{n} T_i s^i$, $T_i \in \mathbb{R}^{p \times p}$, is a matrix pencil $L(s) = sL_1 + L_0$ such that

$$L(s) = U(s) \begin{bmatrix} T(s) & 0 \\ 0 & I_{(n-1)p} \end{bmatrix} V(s)$$

where $U(s)$, $V(s)$ are unimodular matrices, i.e. matrices with constant non-zero determinants. Moreover, $L(s)$ is a strong linearization of $T(s)$ if the dual pencil $L^{\#}(s) = sL_0 + L_1$ is a linearization of the dual polynomial matrix $T^{\#}(s) = s^n T(1/s)$.
Companion Linearizations
The most common linearizations of $T(s)$ are the well-known first and second companion linearizations $P_1(s)$ and $P_2(s)$:

$$P_1(s) = s \begin{bmatrix} I_p & & & \\ & \ddots & & \\ & & I_p & \\ & & & T_n \end{bmatrix} + \begin{bmatrix} 0 & -I_p & & \\ & \ddots & \ddots & \\ & & 0 & -I_p \\ T_0 & T_1 & \cdots & T_{n-1} \end{bmatrix},$$

$$P_2(s) = s \begin{bmatrix} I_p & & & \\ & \ddots & & \\ & & I_p & \\ & & & T_n \end{bmatrix} + \begin{bmatrix} 0 & & & T_0 \\ -I_p & \ddots & & T_1 \\ & \ddots & 0 & \vdots \\ & & -I_p & T_{n-1} \end{bmatrix}.$$
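As a sanity check, the first companion form can be assembled numerically. The following is a minimal sketch for a quadratic $T(s) = T_0 + T_1 s + T_2 s^2$ with randomly chosen (hypothetical) coefficients; every eigenvalue of the pencil must make $T(s)$ singular.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 2
# Hypothetical quadratic matrix polynomial T(s) = T0 + T1*s + T2*s^2
T0, T1 = rng.standard_normal((p, p)), rng.standard_normal((p, p))
T2 = np.eye(p) + 0.1 * rng.standard_normal((p, p))  # keep leading coefficient nonsingular

I, Z = np.eye(p), np.zeros((p, p))
# First companion linearization P1(s) = s*A1 + A0 (sign convention as on the slide)
A1 = np.block([[I, Z], [Z, T2]])
A0 = np.block([[Z, -I], [T0, T1]])

# det(s*A1 + A0) = 0  <=>  s is an eigenvalue of -inv(A1) @ A0
eigvals = np.linalg.eigvals(-np.linalg.solve(A1, A0))

# Each eigenvalue of the pencil makes T(s) singular
for lam in eigvals:
    T_lam = T0 + lam * T1 + lam**2 * T2
    assert abs(np.linalg.det(T_lam)) < 1e-6
```

The Schur complement of the $(1,1)$ block shows $\det(sA_1 + A_0) = \det T(s)$, which is what the assertions exercise.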
Consider the ARMA model

$$T(\sigma) y_k = S(\sigma) u_k$$

where $u_k \in \mathbb{R}^q$, $k = 0, 1, 2, 3, \ldots$ is the input vector, $y_k \in \mathbb{R}^p$, $k = 0, 1, 2, 3, \ldots$ is the output vector, $T_i \in \mathbb{R}^{p \times p}$, $i = 0, 1, \ldots, n$ and $S_i \in \mathbb{R}^{p \times q}$, $i = 0, 1, \ldots, m$. Here $T(\sigma) = \sum_{i=0}^{n} T_i \sigma^i$ and $S(\sigma) = \sum_{i=0}^{m} S_i \sigma^i$ are polynomial matrices and $\sigma$ is the forward shift operator, i.e. $\sigma x_k = x_{k+1}$. The behaviour of the ARMA model can be studied through the algebraic properties of $T(\sigma)$.
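For intuition, the recursion defined by $T(\sigma) y_k = S(\sigma) u_k$ can be simulated directly once $T_n$ is invertible. A scalar sketch ($p = q = 1$) with hypothetical coefficient values:

```python
import numpy as np

# Hypothetical scalar ARMA model: T(s) = T0 + T1*s + T2*s^2, S(s) = S0 + S1*s
T = [0.2, -0.5, 1.0]   # T0, T1, T2 (leading coefficient invertible)
S = [1.0, 0.3]         # S0, S1
n, m = len(T) - 1, len(S) - 1

u = np.ones(20)        # step input u_k = 1
y = np.zeros(20)
for k in range(20 - n):
    # Solve T2*y_{k+2} = S0*u_k + S1*u_{k+1} - T0*y_k - T1*y_{k+1}
    rhs = sum(S[i] * u[k + i] for i in range(m + 1))
    rhs -= sum(T[i] * y[k + i] for i in range(n))
    y[k + n] = rhs / T[n]

# The roots of T(s) lie inside the unit disk here, so y_k converges to
# the steady state y* = S(1)/T(1) = 1.3/0.7 for a step input.
```

The steady-state check below reflects that the zeros of $\det T(\sigma)$ govern the dynamics, which is exactly why the algebraic structure of $T(\sigma)$ matters.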
Introducing the state vector $x_k = \begin{bmatrix} y_k^T & y_{k+1}^T & \cdots & y_{k+n-1}^T \end{bmatrix}^T$, the ARMA model can be rewritten in first-order form:

$$\begin{bmatrix} I_p & & & \\ & \ddots & & \\ & & I_p & \\ & & & T_n \end{bmatrix} x_{k+1} + \begin{bmatrix} 0 & -I_p & & \\ & \ddots & \ddots & \\ & & 0 & -I_p \\ T_0 & T_1 & \cdots & T_{n-1} \end{bmatrix} x_k = \begin{bmatrix} 0 & \cdots & 0 \\ \vdots & & \vdots \\ S_0 & \cdots & S_m \end{bmatrix} v_k$$

The coefficient matrices on the left-hand side of the above equation are those of the first companion linearization $P_1(\sigma)$. The behaviour of the ARMA model can therefore be studied through the properties of the first companion linearization $P_1(\sigma)$.
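The first-order rewriting can be checked numerically. A scalar sketch (hypothetical coefficients, $S(\sigma) = S_0$) in which the left-hand-side matrices are exactly those of $P_1(\sigma)$:

```python
import numpy as np

# Hypothetical scalar model: T(s) = T0 + T1*s + T2*s^2, S(s) = S0
T0, T1, T2 = 0.2, -0.5, 1.0
S0 = 1.0

E = np.array([[1.0, 0.0], [0.0, T2]])   # s-part of P1(sigma)
F = np.array([[0.0, -1.0], [T0, T1]])   # constant part of P1(sigma)

# First-order form E x_{k+1} + F x_k = [0, S0*u_k]^T, state x_k = [y_k, y_{k+1}]
x = np.zeros(2)
ys = []
for k in range(200):
    ys.append(x[0])
    x = np.linalg.solve(E, -F @ x + np.array([0.0, S0 * 1.0]))  # step input

# det(sE + F) = T(s), so the dynamics of the first-order system
# are governed by the zeros of T(s); steady state is S0 / T(1).
```

Since $\det(sE + F) = \det T(s)$, the first-order system converges to the same steady state $y^* = S_0 / (T_0 + T_1 + T_2)$ as the original recursion.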
Given an eigenvalue $\lambda_0$ of the first companion linearization $P_1(s)$ and a corresponding eigenvector $z_0$, it is easy to see that $\lambda_0$ is also an eigenvalue of $T(s)$, while the associated eigenvector $x_0$ can be recovered from the first $p$ components of $z_0$.
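This recovery is easy to verify numerically; a sketch for a quadratic with hypothetical random coefficients, where the eigenvector of the pencil has the structure $z = [x^T, \lambda x^T]^T$:

```python
import numpy as np

rng = np.random.default_rng(1)
p = 2
# Hypothetical quadratic T(s) = T0 + T1*s + T2*s^2
T0, T1 = rng.standard_normal((p, p)), rng.standard_normal((p, p))
T2 = np.eye(p)

I, Z = np.eye(p), np.zeros((p, p))
A1 = np.block([[I, Z], [Z, T2]])      # s-part of the first companion form
A0 = np.block([[Z, -I], [T0, T1]])    # constant part

lams, V = np.linalg.eig(-np.linalg.solve(A1, A0))
lam, z = lams[0], V[:, 0]

x = z[:p]   # first p components of the pencil eigenvector
residual = (T0 + lam * T1 + lam**2 * T2) @ x
assert np.linalg.norm(residual) < 1e-6
```

The top block row of the pencil forces $z_2 = \lambda z_1$, so the bottom block row reduces to $T(\lambda) z_1 = 0$, which is what the residual check confirms.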
Applications

- Dynamic analysis of structures ($T_0$ and $T_2$ real symmetric and positive semidefinite)
- Vibrations of spinning structures ($T_0 = T_0^T$, $T_2 = T_2^T$, $T_1 = -T_1^T$)
- Modelling of excitation of train tracks (palindromic, sparse, matrix dimension 1005)
- Camera surveillance
- Optimal control
- Stability analysis of vibrating systems under state delay feedback control
Motivation
- Many applications lead to (large scale) matrix polynomial problems where the coefficients are sparse and structured
- Any kind of extra structure (arising typically from the properties of the underlying physical problem) should be reflected as much as possible in the method used for solving the problem
- Reliable numerical algorithms are available for matrix pencils
- Special techniques exist for structured matrix pencils
- Out of the infinite number of linearizations, which one should we choose?
- Linearizations have varying eigenvalue condition numbers
- Solving a PEP with a backward stable algorithm (e.g., QZ) applied to a linearization can be backward unstable for the PEP
Extended Fiedler Linearizations (Antoniou, Vologiannidis)

(Extended) Fiedler linearizations focus on:
- The construction of (companion-like) linearizations using the unperturbed coefficients of T(s)
- Preservation of selected structural properties of the polynomial matrix
- Direct eigenvector recovery by inspection
Overview
- Definition of some elementary matrices containing coefficients of the original polynomial matrix, and presentation of some of their properties
- Method to construct Fiedler linearizations using multiplication of elementary matrices
- Method to construct block symmetric Fiedler linearizations using multiplication of elementary matrices
- Recovery of the eigenvectors of the original polynomial matrix from those of the linearization
Elementary matrices
Definition (Elementary matrices)
Let $T(s)$ be a regular polynomial matrix. We define the elementary matrices corresponding to $T(s)$ as follows:

$$A_k = \begin{bmatrix} I_{p(k-1)} & & \\ & C_k & \\ & & I_{p(n-k-1)} \end{bmatrix}, \quad k = 1, 2, \ldots, n-1,$$

where

$$C_k = \begin{bmatrix} 0 & I_p \\ I_p & -T_k \end{bmatrix},$$

and

$$A_0 = \begin{bmatrix} -T_0 & 0 \\ 0 & I_{p(n-1)} \end{bmatrix}.$$
Theorem [1]
Let $i, j \in \{0, 1, 2, \ldots, n-1\}$. Then $A_i A_j = A_j A_i$ if and only if $|i - j| \neq 1$.

Definition
Let $I_1$ and $I_2$ be two index tuples. $I_1$ will be termed equivalent to $I_2$ ($I_1 \sim I_2$) if and only if $A_{I_1} = A_{I_2}$.
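The commutation rule is easy to check directly. A sketch for $n = 3$, $p = 1$ with hypothetical scalar coefficients, assuming the convention $A_0 = \mathrm{diag}(-T_0, I)$ and $C_k = \begin{bmatrix} 0 & I \\ I & -T_k \end{bmatrix}$ placed at block position $k$:

```python
import numpy as np

# Hypothetical scalar coefficients, n = 3, p = 1
T0, T1, T2 = 2.0, 3.0, 5.0

A0 = np.diag([-T0, 1.0, 1.0])
A1 = np.array([[0.0, 1, 0], [1, -T1, 0], [0, 0, 1]])  # C_1 at block position 1
A2 = np.array([[1.0, 0, 0], [0, 0, 1], [0, 1, -T2]])  # C_2 at block position 2

# A_i A_j = A_j A_i iff |i - j| != 1
assert np.allclose(A0 @ A2, A2 @ A0)      # |0-2| = 2: commute
assert not np.allclose(A0 @ A1, A1 @ A0)  # |0-1| = 1: do not commute
assert not np.allclose(A1 @ A2, A2 @ A1)  # |1-2| = 1: do not commute
```

Since $A_0$ and $A_2$ act on disjoint block positions they commute, while neighbouring factors overlap in one block and do not.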
For the empty tuple we set $A_{\emptyset} = I_{np}$. A tuple of consecutive indices $(k, k+1, \ldots, l)$, $k \leq l$, or $(k, k-1, \ldots, l)$, $k > l$, is called a string. A product $A_I$ is called operation free if its blocks are formed directly from the coefficients $T_i$, without any additions or multiplications between them.

Example

$$A_0 A_1 A_0 = \begin{bmatrix} 0 & -T_0 \\ -T_0 & -T_1 \end{bmatrix} \oplus I_{p(n-2)}, \qquad A_1 A_0 A_1 = \begin{bmatrix} I_p & -T_1 \\ -T_1 & T_1^2 - T_0 \end{bmatrix} \oplus I_{p(n-2)},$$

so $A_0 A_1 A_0$ is operation free while $A_1 A_0 A_1$ is not.
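The example products $A_0 A_1 A_0$ and $A_1 A_0 A_1$ can be reproduced numerically ($n = 2$, $p = 1$, hypothetical scalar coefficients):

```python
import numpy as np

# Hypothetical scalar coefficients, n = 2, p = 1
T0, T1 = 2.0, 3.0
A0 = np.diag([-T0, 1.0])
A1 = np.array([[0.0, 1.0], [1.0, -T1]])

# (0, 1, 0) satisfies the SIP: the product is operation free
assert np.allclose(A0 @ A1 @ A0, [[0, -T0], [-T0, -T1]])
# (1, 0, 1) does not: the product contains the combined term T1^2 - T0
assert np.allclose(A1 @ A0 @ A1, [[1, -T1], [-T1, T1**2 - T0]])
```

The first product contains only unperturbed coefficients; the second mixes them, which is precisely the distinction the SIP captures.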
Example
Let $I_1 = (1, 2, 0, 1, 3, 2, 0, 1)$ and $I_2 = (1, 2, 0, , 3, , 2, 0)$. Then $I_1$ satisfies the SIP but $I_2$ does not. Also note that

$$I_1 \sim (1, , , 1, 3, 2, 0, 1) \sim (1, 0, 2, , , 2, 0, 1) \sim \ldots$$

We need an easy way to write uniquely the index tuples corresponding to equal products of elementary matrices.
Products along strings are operation free and admit the closed forms

$$A_{(k:l)} = I_{p(k-1)} \oplus \begin{bmatrix} 0_{p \times p(l-k+1)} & I_p \\ I_{p(l-k+1)} & \begin{matrix} -T_k \\ \vdots \\ -T_l \end{matrix} \end{bmatrix} \oplus I_{p(n-l-1)}, \quad 0 < k \leq l \leq n-1,$$

$$A_{(0:l)} = \begin{bmatrix} 0_{p \times pl} & -T_0 \\ I_{pl} & \begin{matrix} -T_1 \\ \vdots \\ -T_l \end{matrix} \end{bmatrix} \oplus I_{p(n-l-1)}, \quad k = 0, \; l \leq n-1.$$
A product $A_I$ is in column standard form if it is written as a product of string factors $\prod_{i=k-1}^{0} A_{(c_i : i)}$ with decreasing ending indices (some factors possibly absent); row standard forms are defined analogously.

Example
Consider $n = 4$ and $I = (0, 1, 0, 2, 1, 3, 2, 0, 1, 0)$. A column standard form is

$$A_I = (A_0 A_1 A_2 A_3)(A_0 A_1 A_2)(A_0 A_1)(A_0) = \begin{bmatrix} 0 & 0 & 0 & -T_0 \\ 0 & 0 & -T_0 & -T_1 \\ 0 & -T_0 & -T_1 & -T_2 \\ -T_0 & -T_1 & -T_2 & -T_3 \end{bmatrix}.$$

A row standard form is $A_I = (A_0)(A_1 A_0)(A_2 A_1 A_0)(A_3 A_2 A_1 A_0)$.
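The column standard form of the example can be verified numerically ($n = 4$, $p = 1$, hypothetical scalar coefficients, using the elementary-matrix convention assumed earlier):

```python
import numpy as np

T = [2.0, 3.0, 5.0, 7.0]   # hypothetical T0..T3
n = 4

def A(k):
    # Elementary matrix A_k for p = 1: A_0 = diag(-T0, 1, 1, 1),
    # otherwise the 2x2 block [[0, 1], [1, -T_k]] at position k
    M = np.eye(n)
    if k == 0:
        M[0, 0] = -T[0]
    else:
        M[k-1:k+1, k-1:k+1] = [[0, 1], [1, -T[k]]]
    return M

# Column standard form (A0 A1 A2 A3)(A0 A1 A2)(A0 A1)(A0)
AI = A(0) @ A(1) @ A(2) @ A(3) @ A(0) @ A(1) @ A(2) @ A(0) @ A(1) @ A(0)

expected = [[0, 0, 0, -T[0]],
            [0, 0, -T[0], -T[1]],
            [0, -T[0], -T[1], -T[2]],
            [-T[0], -T[1], -T[2], -T[3]]]
assert np.allclose(AI, expected)
```

Every entry of the product is a single (negated) coefficient or zero, illustrating that the SIP-satisfying tuple yields an operation-free product.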
Theorem
$A_I$ is operation free iff any of the following equivalent statements holds:
1. $I$ satisfies the SIP.
2. $A_I$ can be written in a column standard form.
3. $A_I$ can be written in a row standard form.
Row and column standard forms can be defined accordingly for these products, with a similar characterization of operation-free products.
Construction of Linearizations
Let $T(s)$ be a polynomial matrix of degree $n$ with $T_0$, $T_n$ nonsingular. Choose $k \in \{1, 2, \ldots, n\}$.

- Let $P$ be a permutation of the tuple $(0 : k-1)$, and $L_P$, $R_P$ tuples with elements from $(0 : k-2)$, such that $(L_P, P, R_P)$ satisfies the SIP.
- Let $N$ be a permutation of the tuple $(n : k)$, and $L_N$, $R_N$ tuples with elements from $(n : k+1)$, such that $(L_N, N, R_N)$ satisfies the SIP.

Then the matrix pencil built from the corresponding products of elementary matrices is a linearization of $T(s)$. Notice that $T_0$ is allowed to be singular if $0 \notin (L_P, R_P)$, while the same holds for $T_n$ if $n \notin (L_N, R_N)$.
Construction of Linearizations
Example
Using $k = 3$ and $P = (1, 0, 2)$, $N = (3, 5, 4)$, $R_P = (1)$, $L_P = \emptyset$, $R_N = \emptyset$, $L_N = (4)$, we obtain a pencil $P(s)$ whose blocks consist of the unperturbed coefficients $T_0, \ldots, T_5$ and identity blocks.

$(L_P, P, R_P) = (1, 0, 2, 1)$ satisfies the SIP.
$(L_N, N, R_N) = (4, 3, 5, 4)$ satisfies the SIP.
Construction of Linearizations
For another choice of the tuples, the resulting pencil again has blocks built from the unperturbed coefficients, and:

$(L_P, P, R_P) = (0, 1, 0, 2, 1, 0)$ satisfies the SIP.
$(L_N, N, R_N) = (4, 3, 5, 4)$ satisfies the SIP.
Construction of Linearizations
Why were the previous linearizations block symmetric? Can we produce all the block symmetric linearizations in this family?
The resulting pencil is a linearization of $T(s)$. It can be further extended to produce more block symmetric linearizations by pre- and post-multiplying by the same elementary matrix $A_i$ (always obeying the SIP).
Theorem
Let $x = \begin{bmatrix} x_1^T & \cdots & x_n^T \end{bmatrix}^T \in \mathbb{C}^{np}$ be a right eigenvector of $P(s)$ corresponding to an eigenvalue $\lambda$. If $\prod_{i=k-1}^{0} A_{(c_i : i)}$ is the column standard form of $A_{(P, R_P)}$ and $c_i = 0$ with $c_j \neq 0$ for $j > i$, then $x_{i+1}$ is a right eigenvector of $T(s)$ corresponding to $\lambda$.

Theorem
Let $x = \begin{bmatrix} x_1 & \cdots & x_n \end{bmatrix} \in \mathbb{C}^{1 \times np}$ be a left eigenvector of $P(s)$ corresponding to an eigenvalue $\lambda$. If $\prod_{i=0}^{k-1} A_{(i : r_i)}$ is the row standard form of $A_{(L_P, P)}$ and $r_i = 0$ with $r_j \neq 0$ for $j > i$, then $x_{i+1}$ is a left eigenvector of $T(s)$ corresponding to $\lambda$.
Example
Using $k = 3$ and $P = (1, 0, 2)$, $N = (3, 5, 4)$, $R_P = (1)$, $L_P = \emptyset$, $R_N = \emptyset$, $L_N = (4)$, consider the resulting pencil $P(s)$. Since $(P, R_P) = ((1, 0, 2), (1)) \sim ((1, 2), (0, 1))$, $x_2$ is a right eigenvector of $T(s)$.
Theorem
Let $x = \begin{bmatrix} x_1^T & \cdots & x_n^T \end{bmatrix}^T \in \mathbb{C}^{np}$ be a right eigenvector of $P(s)$ corresponding to an eigenvalue at $\infty$. If $\prod_{i} A_{(c_i : i)}$ is the column standard form of $A_{(N, R_N)}$ and $c_i = n$ with $c_j \neq n$ for $j > i$, then $x_i$ is a right eigenvector of $T(s)$ corresponding to $\infty$.

Theorem
Let $x = \begin{bmatrix} x_1 & \cdots & x_n \end{bmatrix} \in \mathbb{C}^{1 \times np}$ be a left eigenvector of $P(s)$ corresponding to an eigenvalue at $\infty$. If $\prod_{i} A_{(i : r_i)}$ is the row standard form of $A_{(L_N, N)}$ and $r_i = n$ with $r_j \neq n$ for $j > i$, then $x_i$ is a left eigenvector of $T(s)$ corresponding to $\infty$.
Example
Using $k = 3$ and $P = (1, 0, 2)$, $N = (3, 5, 4)$, $R_P = (1)$, $L_P = \emptyset$, $R_N = \emptyset$, $L_N = (4)$, the eigenvectors corresponding to eigenvalues at $\infty$ are recovered analogously from the tuples $(N, R_N)$ and $(L_N, N)$.
Conclusions
- Properties of products of elementary matrices
- A broad family of linearizations
- Method for creating block symmetric linearizations
- Eigenvector recovery is easily established due to the operation-free property of Fiedler linearizations
Bibliography I
E. N. Antoniou and S. Vologiannidis. A new family of companion forms of polynomial matrices. Electron. J. Linear Algebra, 11:78–87 (electronic), 2004.

E. N. Antoniou and S. Vologiannidis. Linearizations of polynomial matrices with symmetries and their applications. Electron. J. Linear Algebra, 15:107–114 (electronic), 2006.

Miroslav Fiedler. A note on companion matrices. Linear Algebra Appl., 372:325–331, 2003.

Nicholas J. Higham, D. Steven Mackey, Niloufer Mackey, and Françoise Tisseur. Symmetric linearizations for matrix polynomials. SIAM J. Matrix Anal. Appl., 29(1):143–159 (electronic), 2006.
Bibliography II
Peter Lancaster. Symmetric transformations of the companion matrix. 8:146–148 (electronic), 1961.

Peter Lancaster and Uwe Prells. Isospectral families of high-order systems. ZAMM Z. Angew. Math. Mech., 87(3):219–234, 2007.

D. Steven Mackey, Niloufer Mackey, Christian Mehl, and Volker Mehrmann. Vector spaces of linearizations for matrix polynomials. SIAM J. Matrix Anal. Appl., 28(4):971–1004 (electronic), 2006.
Bibliography III
S. Vologiannidis and E. Antoniou. A permuted factors approach for the linearization of polynomial matrices. Mathematics of Control, Signals, and Systems (MCSS), 22:317–342, 2011. doi:10.1007/s00498-011-0059-6.