
PART THREE:

LINEAR SYSTEMS OF EQUATIONS

Iterative Method
Gauss-Seidel Method

It is the most commonly used iterative method for solving linear algebraic equations.
It is well suited to systems with a large number of equations.
The method starts from an initial guess and then iterates to obtain refined estimates of the solution.


Gauss-Seidel Method
Suppose that we limit ourselves to a 3x3 system in the form [A]{x} = {B}.
1) The first equation can be solved for x1, the second equation for x2, and the third equation for x3 to yield:
  x1^j = (b1 - a12 x2^(j-1) - a13 x3^(j-1)) / a11        (11.5a)
  x2^j = (b2 - a21 x1^j - a23 x3^(j-1)) / a22             (11.5b)
  x3^j = (b3 - a31 x1^j - a32 x2^j) / a33                 (11.5c)

where the superscripts j and j-1 denote the present and previous iterations.

General form of each equation

  x1 = (b1 - Σ_{j=1, j≠1}^{n} a1j xj) / a11
  x2 = (b2 - Σ_{j=1, j≠2}^{n} a2j xj) / a22
  ...
  x(n-1) = (b(n-1) - Σ_{j=1, j≠n-1}^{n} a(n-1)j xj) / a(n-1)(n-1)
  xn = (bn - Σ_{j=1, j≠n}^{n} anj xj) / ann

or, in general,

  xi = (bi - Σ_{j=1, j≠i}^{n} aij xj) / aii,   i = 1, 2, ..., n.
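
As a rough illustration of the general formula above, here is a minimal Python sketch of one Gauss-Seidel sweep; the function and variable names (gauss_seidel_sweep, A, b, x) are my own and not part of the original slides.

```python
def gauss_seidel_sweep(A, b, x):
    """One Gauss-Seidel sweep: each x[i] is recomputed from the i-th equation,
    immediately reusing the components of x that have already been updated."""
    n = len(b)
    for i in range(n):
        s = sum(A[i][j] * x[j] for j in range(n) if j != i)  # sum over j != i
        x[i] = (b[i] - s) / A[i][i]
    return x
```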

Gauss-Seidel Method

2) Start with initial guess values for the x's.
3) Substitute the guesses into (11.5a) to get a new x1; use this x1 together with the guesses for x2 and x3 in (11.5b) to get a new x2; then use the new x1 and the new x2 in (11.5c) to get a new x3.
4) Repeat this process until the solution converges closely enough to the true values.
5) Convergence can be checked using the criterion that, for all i,

     εa,i = |(xi^j - xi^(j-1)) / xi^j| × 100% = |(xi_new - xi_old) / xi_new| × 100% < εs
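
A hedged sketch of how steps 2-5 could be put together in Python, using the approximate relative error above as the stopping test; the names gauss_seidel, es, and max_iter are assumptions made for this illustration, not part of the slides.

```python
import numpy as np

def gauss_seidel(A, b, x0, es=1e-5, max_iter=100):
    """Iterate Gauss-Seidel sweeps until |x_i^j - x_i^(j-1)| / |x_i^j| * 100%
    falls below es (a tolerance in percent) for every unknown i."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    x = np.asarray(x0, dtype=float).copy()
    n = len(b)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]  # sum over j != i
            x[i] = (b[i] - s) / A[i, i]
        ea = np.abs((x - x_old) / x) * 100.0  # per-unknown approximate error, %
        if np.all(ea < es):
            break
    return x
```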


Example 11.3

Solve for the unknowns in the system

  3 x1 - 0.1 x2 - 0.2 x3 = 7.85
  0.1 x1 + 7 x2 - 0.3 x3 = -19.3
  0.3 x1 - 0.2 x2 + 10 x3 = 71.4

Solving each equation for its diagonal unknown gives

  x1^j = (7.85 + 0.1 x2^(j-1) + 0.2 x3^(j-1)) / 3
  x2^j = (-19.3 - 0.1 x1^j + 0.3 x3^(j-1)) / 7
  x3^j = (71.4 - 0.3 x1^j + 0.2 x2^j) / 10

Use the initial guesses x2 = x3 = 0.
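
For illustration, the three rearranged formulas can be transcribed directly into Python and swept twice, which reproduces the hand calculation shown below (a minimal sketch; the variable names are mine).

```python
# Example 11.3, rearranged as above; x2 and x3 start at the initial guess of zero
x1 = x2 = x3 = 0.0
for k in range(2):  # two sweeps, matching the hand calculation
    x1 = (7.85 + 0.1 * x2 + 0.2 * x3) / 3
    x2 = (-19.3 - 0.1 * x1 + 0.3 * x3) / 7
    x3 = (71.4 - 0.3 * x1 + 0.2 * x2) / 10
    print(f"iteration {k + 1}: x1 = {x1:.6f}, x2 = {x2:.6f}, x3 = {x3:.6f}")
```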


First iteration:

  x1^1 = (7.85 + 0.1(0) + 0.2(0)) / 3 = 2.616667
  x2^1 = (-19.3 - 0.1(2.616667) + 0.3(0)) / 7 = -2.794524
  x3^1 = (71.4 - 0.3(2.616667) + 0.2(-2.794524)) / 10 = 7.005610

Second iteration:

  x1^2 = (7.85 + 0.1(-2.794524) + 0.2(7.005610)) / 3 = 2.990557
  x2^2 = (-19.3 - 0.1(2.990557) + 0.3(7.005610)) / 7 = -2.499625
  x3^2 = (71.4 - 0.3(2.990557) + 0.2(-2.499625)) / 10 = 7.000291

  εa,1 = |(2.990557 - 2.616667) / 2.990557| × 100% = 12.5%
  εa,2 = |(-2.499625 - (-2.794524)) / (-2.499625)| × 100% = 11.8%
  εa,3 = |(7.000291 - 7.005610) / 7.000291| × 100% = 0.076%
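
These error estimates can be reproduced by applying the criterion from step 5 to the first- and second-iteration values above (a small sketch; the variable names are mine).

```python
new = [2.990557, -2.499625, 7.000291]   # second-iteration values
old = [2.616667, -2.794524, 7.005610]   # first-iteration values
for i, (xn, xo) in enumerate(zip(new, old), start=1):
    ea = abs((xn - xo) / xn) * 100.0    # approximate relative error, in percent
    print(f"ea,{i} = {ea:.3f}%")        # roughly 12.5%, 11.8%, 0.076%
```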


Gauss-Seidel Method: Pitfall
What went wrong?
Even though the iterations were carried out correctly, the answer does not converge to the correct answer.
This illustrates a pitfall of the Gauss-Seidel method: not all systems of equations will converge.
Is there a fix?
One class of systems always converges: those with a diagonally dominant coefficient matrix.
Diagonally dominant: [A] in [A][X] = [C] is diagonally dominant if

  |aii| ≥ Σ_{j=1, j≠i}^{n} |aij|  for all i,   and   |aii| > Σ_{j=1, j≠i}^{n} |aij|  for at least one i.

Diagonally dominant: the magnitude of the coefficient on the diagonal must be at least equal to the sum of the magnitudes of the other coefficients in that row, and in at least one row the diagonal coefficient must be strictly greater than the sum of the other coefficients in that row.

Which coefficient matrix is diagonally dominant?

  [A] = |   2   5.81   34 |        [B] = | 124   34    56 |
        |  45     43    1 |              |  23   53     5 |
        | 123     16    1 |              |  96   34   129 |

Most physical systems do result in simultaneous linear equations that have diagonally dominant coefficient matrices.


Relaxation of the Gauss-Seidel Method

- It is a slight modification of the Gauss-Seidel method that is designed to enhance convergence.
- After each new value of x is computed, it is modified by a weighted average of the results of the previous and present iterations:

    xi^j = λ xi^j + (1 - λ) xi^(j-1),   or equivalently   xi_new = λ xi_new + (1 - λ) xi_old

- λ is assigned a value between 0 and 2.
- If λ = 1, there is no modification; for 0 < λ < 1 (underrelaxation) the update is damped, which helps suppress oscillations; for 1 < λ < 2 (overrelaxation) extra weight is given to the new value.
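
As a sketch of how the relaxation step could be folded into a sweep (the names relaxed_sweep and lam, standing in for λ, are mine):

```python
def relaxed_sweep(A, b, x, lam=1.0):
    """One Gauss-Seidel sweep with relaxation: each freshly computed value is
    blended with the previous one as lam * x_new + (1 - lam) * x_old."""
    n = len(b)
    for i in range(n):
        s = sum(A[i][j] * x[j] for j in range(n) if j != i)
        x_new = (b[i] - s) / A[i][i]
        x[i] = lam * x_new + (1.0 - lam) * x[i]  # lam = 1 reduces to plain Gauss-Seidel
    return x
```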


You might also like