11 - Chapter 03
What you will find in this Chapter
Pre-requisites:
3.1 Introduction:
An R.C matrix is a square matrix in which the i-th row, for every integer value of i, is orthogonal to the i-th column. Though the set of all R.C matrices is a sub-class of the square matrices, it refrains from obeying some of the basic tenets of matrix algebra. Member matrices of the set of R.C matrices, except the null matrix, are always non-singular, and they conceal many remarkable characteristics seemingly common in nature. The topic has opened many avenues for further research work.
We introduce the R.C matrix and try to unveil some of its salient features. These are the features which became known to us on looking at its basic structure from different angles.
What we call an R.C matrix is the outcome of an open discussion on the various properties of matrices. Just a thought: what can happen if each one of the n rows of a matrix is orthogonal to the corresponding column of the same matrix? In general, the i-th row, for every integer value of i, is orthogonal to the i-th column, and hence the name 'R.C' matrix is truly justified.

A =
[ 1  2  -3 ]
[ 1  2  -6 ]
[ 1  1   3 ]

is an R.C matrix; for instance, its first row (1, 2, -3) is orthogonal to its first column (1, 1, 1), since 1·1 + 2·1 + (-3)·1 = 0.
We write its general form. An R.C matrix A of order n × n from the set JJn is as follows.

A =
[ a11  a12  ...  a1n ]
[ a21  a22  ...  a2n ]
[  :    :         :  ]
[ an1  an2  ...  ann ]  ∈ JJn,

with the defining property that, for each i, the i-th row is orthogonal to the i-th column:

ai1·a1i + ai2·a2i + ... + ain·ani = 0, for i = 1, 2, ..., n.

[e.g. for i = 1: a11² + a12·a21 + a13·a31 + a14·a41 + ... = 0]
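The defining sum above is easy to check mechanically. Below is a minimal sketch (the helper name `is_rc_matrix` is ours, not from the text) that tests row-column orthogonality on the chapter's 3 × 3 example:

```python
# A minimal sketch of the defining check; the helper name `is_rc_matrix`
# is ours, not from the text.
def is_rc_matrix(A, tol=1e-9):
    """True when the i-th row of A is orthogonal to the i-th column."""
    n = len(A)
    for i in range(n):
        dot = sum(A[i][j] * A[j][i] for j in range(n))  # row_i . col_i
        if abs(dot) > tol:
            return False
    return True

# The chapter's 3 x 3 example.
A = [[1, 2, -3],
     [1, 2, -6],
     [1, 1, 3]]
print(is_rc_matrix(A))  # True
```

Note that the check also accepts the null matrix, in line with the definition adopted below.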
R.C matrices of higher order exist, and they can even be constructed by extending the R.C property from a given R.C matrix of order 3 × 3.

R.C matrices of order 2 × 2 include:

(1) [ -1  -1 ]   (2) [  1   1 ]   (3) [ 1  -1 ]   and (4) [ -1  -1 ]
    [  1  -1 ]       [ -1   1 ]       [ 1   1 ]           [  1   1 ]
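A quick check of the four matrices listed above confirms that each satisfies row-column orthogonality; the helper `rc_defect` (our own name) returns the largest |row_i · col_i|, which must be 0 for an R.C matrix:

```python
# The four 2 x 2 matrices listed above; `rc_defect` (our name) returns the
# largest |row_i . col_i|, which must be 0 for an R.C matrix.
def rc_defect(A):
    n = len(A)
    return max(abs(sum(A[i][j] * A[j][i] for j in range(n)))
               for i in range(n))

examples = [
    [[-1, -1], [1, -1]],
    [[1, 1], [-1, 1]],
    [[1, -1], [1, 1]],
    [[-1, -1], [1, 1]],
]
print(all(rc_defect(M) == 0 for M in examples))  # True
```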
At this stage we prefer to draw facts from a graphical presentation of an R.C matrix and extend our imagination to search for such matrices of higher order than the ones on hand. We take the first matrix shown above.

Let A =
[ -1  -1 ]
[  1  -1 ]

It is an R.C matrix. In a two-dimensional rectangular frame, the first-row vector (-1, -1) and the first-column vector (-1, 1) are orthogonal to each other. In the same way, the second-row vector (1, -1) and the second-column vector (-1, -1) are orthogonal to each other.
We have developed an elegant method of extending an R.C matrix of order n × n to the next higher order (n + 1) × (n + 1). We shall discuss the same in the section to follow. Just for citation purposes, we write an R.C matrix of order 3 × 3,

A =
[ 1/2  -1/2   1  ]
[  1     1    1  ]
[ 1/4  -1/2  1/2 ]

together with a companion matrix B of order 4 × 4 constructed by the same technique.
51
[The extension of an R.C matrix from 2 × 2 to an R.C matrix of the subsequent higher order is shown in the annexure. Readers are requested to go through the technical details.]
We will treat a general format of the 3 × 3 R.C matrix and derive many theorems from it. We take

A =
[ a  b  c ]
[ x  y  z ]            ..(2)
[ p  q  r ]

as the standard matrix and treat it as an R.C matrix with all real entries.

[If all real entries are zero, then it is the null matrix, and hence the defining property of an R.C matrix permits the null matrix in the category of R.C matrices.

O =
[ 0  0  0 ]
[ 0  0  0 ]            ..(3)
[ 0  0  0 ]

i.e. the null matrix is, by definition, an R.C matrix.]
We have, by the definition of the R.C matrix applied to the matrix A as in (2), the following conditions.

a² + bx + cp = 0            ..(4)
bx + y² + qz = 0            ..(5)
cp + qz + r² = 0            ..(6)

We shall write

A = bx, B = cp, and C = qz            ..(7)

i.e. a² + A + B = 0, y² + A + C = 0, and r² + B + C = 0.
52
Solving these, we get

A = bx = (-a² - y² + r²)/2
B = cp = (-a² + y² - r²)/2            ..(8)
C = qz = (a² - y² - r²)/2
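Relations (8) can be verified numerically on the chapter's 3 × 3 example, reading off the entries as in the standard form (2); the variable names below are ours:

```python
# Entries of the example matrix, labelled as in the standard form (2);
# the variable names are ours.
a, b, c = 1, 2, -3
x, y, z = 1, 2, -6
p, q, r = 1, 1, 3

print(b * x == (-a**2 - y**2 + r**2) / 2)  # True: bx = 2
print(c * p == (-a**2 + y**2 - r**2) / 2)  # True: cp = -3
print(q * z == (a**2 - y**2 - r**2) / 2)   # True: qz = -6
```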
Using the relations derived above, we will prove some important notions in the form of theorems.
Theorem 01: An R.C matrix with real entries, except the null matrix, cannot be a symmetric matrix.

Proof: If A in (2) were symmetric, we would have x = b, p = c, and z = q, so that bx = b² ≥ 0, cp = c² ≥ 0, and qz = q² ≥ 0.

Using the property given in (8), bx ≥ 0 gives a² + y² - r² ≤ 0. In the same way, cp ≥ 0 gives a² - y² + r² ≤ 0, and qz ≥ 0 gives -a² + y² + r² ≤ 0.

Adding all the results obtained above, we get a² + y² + r² ≤ 0, which is possible for real values of a, y, and r only when a = y = r = 0. This forces each one of bx = b², cp = c², and qz = q² to be 0, so b = c = q = 0 as well, and the matrix is the null matrix.

This proves that, except the null matrix, an R.C matrix cannot be a symmetric one. This proves the theorem.
Lemma: On the same lines, a non-null R.C matrix cannot be a skew-symmetric matrix.

Proof: If A were skew-symmetric, each diagonal element would vanish, so a = y = r = 0, and we would have x = -b, p = -c, and z = -q, so that bx = -b² ≤ 0, cp = -c² ≤ 0, and qz = -q² ≤ 0. Adding relations (4), (5), and (6) gives a² + y² + r² + 2(bx + cp + qz) = 0, and with a = y = r = 0 this reduces to b² + c² + q² = 0. Hence b = c = q = 0 and the matrix is the null matrix, which contradicts the assumption of a non-null matrix with real entries. This proves the lemma.
In the next section we will prove some defining features of 3 × 3 R.C matrices and then establish that they can be extended to R.C matrices of higher order as well.
Theorem 02: An R.C matrix, except the null matrix, is always a non-singular matrix.

Proof: By definition we accept that the null matrix is an R.C matrix, and in that case the R.C matrix is singular.

When an R.C matrix is non-null, the defining property of the R.C matrix lets us claim its non-singularity.

[By the definition of an R.C matrix, its i-th row vector is orthogonal to the i-th column vector only, making the result of their dot product/inner product equal to zero. Had it been orthogonal to any other column besides the i-th one, the column vectors would become linearly dependent, which results in singularity of the matrix.]

We conclude that an R.C matrix, except the null matrix, is always a non-singular matrix.
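The theorem can be illustrated on the example matrix: its determinant is non-zero. The 3 × 3 determinant helper below is a standard cofactor expansion, sketched in our own naming:

```python
# Cofactor expansion for a 3 x 3 determinant (our own helper, not from
# the text), applied to the chapter's example R.C matrix.
def det3(M):
    (a, b, c), (x, y, z), (p, q, r) = M
    return a * (y * r - z * q) - b * (x * r - z * p) + c * (x * q - y * p)

A = [[1, 2, -3],
     [1, 2, -6],
     [1, 1, 3]]
print(det3(A))  # -3, non-zero, so A is non-singular
```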
Theorem 3: In a 3 × 3 R.C matrix with real entries, at least two of the products bx, cp, and qz of symmetrically placed off-diagonal entries are negative.

[This is an important clue for constructing a 3 × 3 R.C matrix. The same concept can be extended to R.C matrices of higher order.]

Proof: The theorem targets establishing that at least two of bx, cp, and qz are strictly less than zero; this is rather one of the most important salient features of the R.C matrix. Being an R.C matrix, the entries follow the R.C property. We write relations (4), (5), and (6) below.

a² + bx + cp = 0
bx + y² + qz = 0
cp + qz + r² = 0
With this we join the notations of (7) to the above relations to get

a² + A + B = 0,  y² + A + C = 0,  and  r² + B + C = 0,

so that, as in (8),

A = bx = (-a² - y² + r²)/2
B = cp = (-a² + y² - r²)/2
C = qz = (a² - y² - r²)/2

Case 1: All of A, B, and C > 0 is not possible. From a² + A + B = 0 we get 2(A + B) = -2a², so if A and B were both positive we would have -2a² > 0, which is false. [In the same way we can derive -2y² > 0 and -2r² > 0, equally impossible.] This helps us conclude that A = bx, B = cp, and C = qz all > 0 is not possible for an R.C matrix.

Case 2: Any two of A, B, and C > 0 and the remaining one < 0 is not possible. If, say, A > 0 and B > 0, then as above 2(A + B) = -2a² > 0, which is false whatever the sign of C; the same applies to the other two pairs. This supports the statement and hence the proof.
Case 3: Any two of A, B, and C are < 0 and the remaining one is > 0. This case is consistent.

Say A = bx < 0; this implies that -a² - y² + r² < 0,
and B = cp < 0; this implies that -a² + y² - r² < 0.

Adding them we get -2a² < 0, which is true [∵ A is a real-entry matrix]. In addition, C = qz > 0 requires a² - y² - r² > 0, and this is compatible with the two results above: from the first we have a² + y² > r², i.e. a² - r² > -y², so a² - y² - r² > -2y², and the requirement a² - y² - r² > 0 is not ruled out.

Similarly, say B = cp < 0, i.e. -a² + y² - r² < 0,
and C = qz < 0, i.e. a² - y² - r² < 0.

As in the case above, adding these two relations we derive -2r² < 0, which is true, and the remaining requirement A = bx > 0, i.e. -a² - y² + r² > 0, is likewise not ruled out for real values of a, y, and r. This proves the statement.
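Theorem 3 can be observed directly on the example matrix: of the three products bx, cp, and qz, at least two come out negative (a small check, with our own variable names):

```python
# Reading the entries as in the standard form (2): b, x, c, p, q, z.
A = [[1, 2, -3],
     [1, 2, -6],
     [1, 1, 3]]
bx = A[0][1] * A[1][0]   # b * x = 2
cp = A[0][2] * A[2][0]   # c * p = -3
qz = A[1][2] * A[2][1]   # q * z = -6
print(sum(v < 0 for v in (bx, cp, qz)) >= 2)  # True
```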
3.5 Eigen values and Eigen vectors of the R.C matrix:

It is one of the most important and useful notions in matrix algebra. For a given square matrix A there exists a non-zero vector X such that for some value λ the matrix equation AX = λX is satisfied; λ is called an Eigen value and X is called the corresponding Eigen vector. In this section we discuss salient features of Eigen values and Eigen vectors of the R.C matrix.
3.5.1 Introduction to important Preliminaries:

Before we proceed to enunciate our findings, we would like to mention certain preliminaries regarding the R.C matrix and its Eigen values. This will simplify our proceedings and prove useful in the arguments proving the next theorems.

Let us initially focus our attention on the R.C matrix with all real entries defined by (2),

A =
[ a  b  c ]
[ x  y  z ]
[ p  q  r ]

Let λ₁, λ₂, and λ₃ be the Eigen values of A with corresponding non-zero Eigen vectors X₁, X₂, and X₃.

[1] As the matrix A is an R.C matrix, by the defining properties we have the following results. These results are already mentioned in the earlier work, but just to abridge we cite them at this point.

A = bx = (-a² - y² + r²)/2
B = cp = (-a² + y² - r²)/2
C = qz = (a² - y² - r²)/2

∴ a² + y² + r² = -2(bx + cp + qz)
[2] Also recalling the facts pertaining to the Eigen values λ₁, λ₂, and λ₃, we write

(a) Sum of Eigen values: λ₁ + λ₂ + λ₃ = a + y + r            ..(9)
(b) Sum of products of Eigen values taken two at a time:
    λ₁λ₂ + λ₂λ₃ + λ₃λ₁ = (ay - bx) + (ar - cp) + (yr - qz)   ..(10)
(c) Product of Eigen values: λ₁λ₂λ₃ = |A|                    ..(11)
(d) Characteristic equation:
    λ³ - (λ₁ + λ₂ + λ₃)λ² + (λ₁λ₂ + λ₂λ₃ + λ₃λ₁)λ - λ₁λ₂λ₃ = 0   ..(12)
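Facts (9) and (10) give a purely arithmetical consequence used below: λ₁² + λ₂² + λ₃² = (λ₁ + λ₂ + λ₃)² - 2(λ₁λ₂ + λ₂λ₃ + λ₃λ₁) = T² - 2S, which vanishes for an R.C matrix. A sketch on the example matrix, with our own variable names:

```python
# A small check (variable names are ours) that the sum of squares of the
# Eigen values of the example R.C matrix vanishes, using T^2 - 2S only.
A = [[1, 2, -3],
     [1, 2, -6],
     [1, 1, 3]]

T = A[0][0] + A[1][1] + A[2][2]  # trace, relation (9)
# sum of principal 2x2 minors, relation (10)
S = (A[0][0] * A[1][1] - A[0][1] * A[1][0]) \
  + (A[0][0] * A[2][2] - A[0][2] * A[2][0]) \
  + (A[1][1] * A[2][2] - A[1][2] * A[2][1])

print(T, S, T**2 - 2 * S)  # 6 18 0
```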
With this on hand we state some theorems.
Theorem 4: The Eigen values of an R.C matrix with real entries are either all zero or exactly one of them is real.

Proof: A null matrix is also an R.C matrix, and there all Eigen values are zero; hence that part of the claim. If the matrix A is not the null matrix, then we proceed as follows.

On comparing the two results for (a² + y² + r²), namely

(λ₁ + λ₂ + λ₃)² - 2(λ₁λ₂ + λ₂λ₃ + λ₃λ₁) = (a + y + r)² - 2[(ay - bx) + (ar - cp) + (yr - qz)] = a² + y² + r² + 2(bx + cp + qz) = 0,

we have

λ₁² + λ₂² + λ₃² = 0            ..(13)

No Eigen value of a non-null R.C matrix can be zero: if any one of the Eigen values were zero, it would imply |A| = 0, meaning the R.C matrix is a singular matrix. This violates the defining property of a non-null R.C matrix (Theorem 02).

If all three Eigen values were real, (13) would force each of them to be zero, which we have just excluded. Since A has real entries, complex Eigen values occur in conjugate pairs.

Note: We have derived the fact that for a non-null R.C matrix the Eigen values λ₁, λ₂, and λ₃ satisfy (13) with no Eigen value zero; then one is real and different from zero while the other two are complex conjugates of each other.            ..(14)
3.5.2 Important Derivation:

In tune with the above, we have up till now two important results, (13) and (14). From (13) we write λ₁² + λ₂² + λ₃² = 0 with no λ being zero. Writing the conjugate pair as λ₂ = u + iv and λ₃ = u - iv, we have λ₂² + λ₃² = 2(u² - v²), and therefore

-λ₁² = 2(u² - v²)            ..(15)
Deductions: The results established above will help deduce the following relations. Let λ₁ denote the real Eigen value and λ₂, λ₃ = u ± iv the conjugate pair.

[1] Trace: λ₁ + λ₂ + λ₃ = a + y + r = T (say).

[2] Sum of products taken two at a time: λ₁λ₂ + λ₂λ₃ + λ₃λ₁ = S (say).

[3] Determinant: λ₁λ₂λ₃ = |A| = det(A) = D (say).

[4] λ₁² + λ₂² + λ₃² = 0.

[5] Characteristic equation: λ³ - Tλ² + Sλ - D = 0.

[6] λ₂ + λ₃ = 2u = T - λ₁ and λ₂ - λ₃ = 2iv, in connection with [1] above.

[7] λ₁ = T - 2u, i.e. u = (T - λ₁)/2, and λ₁(2u) + λ₂λ₃ = S.

[8] Using λ₂λ₃ = D/λ₁ from [3], we have λ₂ = u + i√(D/λ₁ - u²) and λ₃ = u - i√(D/λ₁ - u²). This conveys that knowing only the real Eigen value is sufficient to write the remaining two complex conjugate Eigen values.

[9] Fact: for a given non-null R.C matrix A there exists a non-zero real value c together with a non-zero column C or a row R such that AC = c·C or RA = c·R; i.e. the column (or row) is an Eigen vector belonging to the real Eigen value c.
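Deduction [8] can be exercised numerically on the chapter's example characteristic polynomial f(λ) = λ³ - 6λ² + 18λ + 3 (so T = 6, D = -3): once the real root λ₁ is bracketed and bisected, the conjugate pair follows from u = (T - λ₁)/2 and λ₂λ₃ = D/λ₁. A hedged sketch in our own notation:

```python
import cmath

# Characteristic polynomial of the chapter's example (T = 6, S = 18, D = -3).
def f(lam):
    return lam**3 - 6 * lam**2 + 18 * lam + 3

T, D = 6, -3

# Bisect the single real root: f(-1) < 0 < f(0).
lo, hi = -1.0, 0.0
for _ in range(80):
    mid = (lo + hi) / 2
    if f(mid) <= 0:
        lo = mid
    else:
        hi = mid
lam1 = (lo + hi) / 2

# Deduction [8]: the conjugate pair follows from lam1 alone.
u = (T - lam1) / 2            # since lam2 + lam3 = T - lam1 = 2u
prod = D / lam1               # since lam1 * lam2 * lam3 = D
root = cmath.sqrt(u * u - prod)
lam2, lam3 = u + root, u - root
print(abs(lam1**2 + lam2**2 + lam3**2) < 1e-6)  # True, relation (13)
```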
By now, it is well known that a non-null R.C matrix has only one real root, which is non-zero, while the remaining two are complex conjugates. [At this stage we reiterate that a real-entry R.C matrix can be neither symmetric nor skew-symmetric.]

The vision shaping this section is to locate graphically, and approximate algebraically, the real root of the characteristic equation of a given R.C matrix. As we have discussed many properties inter-linking the different Eigen values of a given R.C matrix, we state here what we shall require: the characteristic equation,

f(λ) = λ³ - Tλ² + Sλ - D = 0

For the real Eigen root λ, the graph of f(λ) on a set of perpendicular real axes will intersect the X-axis in a point, say x₁ = λ₁; its location is our objective.

For λ = 0, f(λ) = -D, where D = det(A) = λ₁λ₂λ₃ and, as said earlier, λ₂ and λ₃ are the complex Eigen values. Plotting this, we get the graph of a cubic curve. We parallel our work with an example.
Let A =
[ 1  2  -3 ]
[ 1  2  -6 ]
[ 1  1   3 ]

with T = trace = 6, S = 18, and D = -3. Then

f(λ) = λ³ - 6λ² + 18λ + 3, and for λ = 0, f(λ) = 3.

This situation is graphed below in Figure 3.5.1. Figure 3.5.2 shows its magnification on an interval about its intersection with the X-axis.
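The graphical localisation can be mimicked numerically: f(0) = -D = 3 and f(-1) < 0, so the X-crossing of the cubic lies in (-1, 0). A tiny sketch:

```python
# Sign change of f brackets the X-crossing seen in the graph.
def f(lam):
    return lam**3 - 6 * lam**2 + 18 * lam + 3

print(f(0))              # 3, the height -D at lam = 0
print(f(-1) < 0 < f(0))  # True: the real root lies in (-1, 0)
```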
While deriving and critically reviewing the characteristics of R.C matrices of dimensions 2 × 2 and onwards, we found many interesting features. We admit that we have searched only a small area, and we continue our efforts, inspired by each new result we work upon. Excavating such unknown areas may attract mathematically ignited minds. All constructive suggestions are welcome.
Annexure: As discussed, in this section we elaborate the technique of finding an extension of a 2 × 2 R.C matrix to R.C matrices of higher order. We begin with a simple R.C matrix of order 2 × 2.

Let us consider A1 =
[ 1  -1 ]
[ 1   1 ]

Its column system, through the lines y = (1)x and y = (-1)x, shows perpendicular lines in R² space: the first row (1, -1) is orthogonal to the first column (1, 1), and the second row (1, 1) is orthogonal to the second column (-1, 1).

To extend A1, we border a 2 × 2 block with letters in different positions standing for real values, so planned that they satisfy the R.C property. The three R.C conditions of the 3 × 3 matrix link the new border entries with those of the block, and this leaves a free choice in the selection of the remaining variables within the given equations. One admissible selection yields the extended R.C matrix

A2 =
[ 1/2  -1/2   1  ]
[  1     1    1  ]
[ 1/4  -1/2  1/2 ]

which can be verified to satisfy the R.C property row by row.

Next, the method of successive averaging is illustrated for an R.C matrix whose characteristic equation is f(λ) = λ³ - 8λ² + 32λ + 128 = 0.
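The bordered matrix written above can be checked directly against the R.C property; the helper below (our own sketch) returns the three row-column dot products, all of which should vanish:

```python
# Row-column dot products of the bordered matrix; all three must vanish.
def rc_residuals(A):
    n = len(A)
    return [sum(A[i][j] * A[j][i] for j in range(n)) for i in range(n)]

A2 = [[0.5, -0.5, 1.0],
      [1.0, 1.0, 1.0],
      [0.25, -0.5, 0.5]]
print(rc_residuals(A2))  # [0.0, 0.0, 0.0]
```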
Step 01:
The initial approximation of the real Eigen value is λ₀ = 0. This initial approximation will always work, as we know that any real R.C matrix has one real Eigen value and a pair of complex Eigen values. Since the sum of squares of the Eigen values of an R.C matrix is always zero, all Eigen values of an R.C matrix are zero only if the real Eigen value is zero.
f(0) = 128

This gives the height 'h' of the characteristic curve above the X-axis at λ = 0, which, as discussed above, is always non-zero.

The characteristic equation of A is a cubic polynomial, and using the nature of its graph we can claim a point with exact height '-h' (same magnitude but opposite in sign).

So the equation f(λ) = -128 is always consistent and gives a real solution,

λ₀' = -3.5247

The average of these two approximations gives the first approximation,

λ₁ = (λ₀ + λ₀')/2 = (0 + (-3.5247))/2  ⇒  λ₁ = -1.76235
Step 02:
Now we iterate the same process with input λ₁ = -1.76235 into the characteristic polynomial f(λ):

f(-1.76235) = 41.28413657

So the equation f(λ) = -41.28413657 is always consistent and gives a real solution:

λ³ - 8λ² + 32λ + 128 = -41.28413657
⇒ λ = -2.74987, λ = 5.37493 - 5.71583i, and λ = 5.37493 + 5.71583i.

Considering the real solution, λ₁' = -2.74987.
The average of these two approximations gives the second approximation,

λ₂ = (λ₁ + λ₁')/2 = ((-1.76235) + (-2.74987))/2  ⇒  λ₂ = -2.25611
Step 03:
Now we iterate the same process with input λ₂ = -2.25611:

f(-2.25611) = 3.60054850

So the equation f(λ) = -3.60054850 is always consistent and gives a real solution:

λ³ - 8λ² + 32λ + 128 = -3.60054850
⇒ λ = -2.34119, λ = 5.1705 - 5.42915i, and λ = 5.1705 + 5.42915i.

Considering the real solution, λ₂' = -2.34119.

The average of these two approximations gives the third approximation,

λ₃ = (λ₂ + λ₂')/2 = ((-2.25611) + (-2.34119))/2  ⇒  λ₃ = -2.29865
Step 04:
Now we repeat the first step with input λ₃ = -2.29865:

f(-2.29865) = 0.02727735

So the equation f(λ) = -0.02727735 is always consistent and gives a real solution:

λ³ - 8λ² + 32λ + 128 = -0.02727735
⇒ λ = -2.29929, λ = 5.14964 - 5.4002i, and λ = 5.14964 + 5.4002i.

Considering the real solution, λ₃' = -2.29929.

The average of these two approximations gives the fourth approximation,

λ₄ = (λ₃ + λ₃')/2 = ((-2.29865) + (-2.29929))/2  ⇒  λ₄ = -2.29897
Conclusion: λ₁ = -1.76235, λ₂ = -2.25611, λ₃ = -2.29865, and λ₄ = -2.29897.
Four approximations give a solution accurate to three decimal places. For further accuracy we can repeat the same steps.
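The four steps above can be automated. Since f(λ) = λ³ - 8λ² + 32λ + 128 has f'(λ) = 3λ² - 16λ + 32 > 0, every level equation f(λ) = -h has a unique real solution, which can be found by bisection; each step then averages the current iterate with that solution. This is a sketch under our own naming, and later decimals may differ slightly from the hand computation above:

```python
# A sketch (our own naming) of the averaging iteration used in Steps 01-04.
def f(lam):
    return lam**3 - 8 * lam**2 + 32 * lam + 128

def real_root(level, lo=-100.0, hi=100.0):
    # f'(x) = 3x^2 - 16x + 32 > 0, so f is strictly increasing and the
    # real solution of f(x) = level is unique; bisection is safe.
    for _ in range(200):
        mid = (lo + hi) / 2
        if f(mid) < level:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

lam = 0.0  # initial approximation, Step 01
for _ in range(4):
    h = f(lam)                        # height of the curve at lam
    lam = (lam + real_root(-h)) / 2   # average with the root of f = -h
    print(round(lam, 5))
```

After four passes the iterate settles near -2.299, matching the hand computation to about three decimal places.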