
Numerical Methods

Marisa Villano, Tom Fagan,
Dave Fairburn, Chris Savino,
David Goldberg, Daniel Rave
An Overview
 The Method of Finite Differences
 Error Approximations and Dangers
 Approximations of Diffusions
 Crank-Nicolson Scheme
 Stability Criterion
Finite Differences

Best known numerical method of approximation

Marisa Villano
Finite Differences
 Approximating the derivative with a
difference quotient from the Taylor series

 Function of One Variable


 Choose mesh size Δx
 Then uj ~ u(jΔx)
First Derivative Approximations
 Backward difference: (uj – uj-1) / Δx

 Forward difference: (uj+1 – uj) / Δx

 Centered difference: (uj+1 – uj-1) / (2Δx)


Taylor Expansion
 u(x + Δx) = u(x) + u΄(x)Δx + 1/2 u˝(x)(Δx)² + 1/6 u˝΄(x)(Δx)³ + O((Δx)⁴)

 u(x – Δx) = u(x) – u΄(x)Δx + 1/2 u˝(x)(Δx)² – 1/6 u˝΄(x)(Δx)³ + O((Δx)⁴)
Taylor Expansion
 u΄(x) = [u(x) – u(x – Δx)] / Δx + O(Δx)

 u΄(x) = [u(x + Δx) – u(x)] / Δx + O(Δx)

 u΄(x) = [u(x + Δx) – u(x – Δx)] / (2Δx) + O((Δx)²)
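A minimal numerical check of these error orders (a sketch in Python with NumPy, which the slides do not use; the test function sin x is an arbitrary choice): the backward and forward errors shrink roughly linearly in Δx, while the centered error shrinks quadratically.

import numpy as np

u = np.sin          # test function
du_exact = np.cos   # its exact derivative
x = 1.0

for dx in [0.1, 0.05, 0.025]:
    backward = (u(x) - u(x - dx)) / dx
    forward  = (u(x + dx) - u(x)) / dx
    centered = (u(x + dx) - u(x - dx)) / (2 * dx)
    print(f"dx={dx:<6} backward err={abs(backward - du_exact(x)):.2e} "
          f"forward err={abs(forward - du_exact(x)):.2e} "
          f"centered err={abs(centered - du_exact(x)):.2e}")
# Halving dx roughly halves the one-sided errors (O(Δx)) and
# quarters the centered error (O((Δx)²)).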
Second Derivative Approximation
 Centered difference: (uj+1 – 2uj + uj-1) / (Δx)²

 Taylor Expansion

 u˝(x) = [u(x + Δx) – 2u(x) + u(x – Δx)] / (Δx)² + O((Δx)²)
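The same kind of check, under the same assumptions (Python/NumPy, sin x as an arbitrary test function), works for the centered second difference:

import numpy as np

u = np.sin            # test function; u'' = -sin
x = 1.0

for dx in [0.1, 0.05, 0.025]:
    d2 = (u(x + dx) - 2 * u(x) + u(x - dx)) / dx**2
    print(f"dx={dx:<6} second-difference err={abs(d2 - (-np.sin(x))):.2e}")
# The error drops by roughly a factor of 4 each time dx is halved,
# consistent with the O((Δx)²) truncation error above.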
Function of Two Variables
 u_j^n ~ u(jΔx, nΔt)

 Backward difference for t and x

 ∂u/∂t (jΔx, nΔt) ~ (u_j^n – u_j^{n-1}) / Δt

 ∂u/∂x (jΔx, nΔt) ~ (u_j^n – u_{j-1}^n) / Δx
Function of Two Variables

 Forward difference for t and x

 ∂u/∂t (jΔx, nΔt) ~ (u_j^{n+1} – u_j^n) / Δt

 ∂u/∂x (jΔx, nΔt) ~ (u_{j+1}^n – u_j^n) / Δx
Function of Two Variables

 Centered difference for t and x

 ∂u/∂t (jΔx, nΔt) ~ (u_j^{n+1} – u_j^{n-1}) / (2Δt)

 ∂u/∂x (jΔx, nΔt) ~ (u_{j+1}^n – u_{j-1}^n) / (2Δx)
Error
 Truncation Error: introduced in the solution by the
approximation of the derivative
 Local Error: from each term of the equation
 Global Error: from the accumulation of local
error

 Roundoff Error: introduced in the computation by
the finite number of digits used by the computer
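A small sketch illustrating the interplay of the two error types (again Python/NumPy, an assumption of this writeup, with exp as an arbitrary test function): the truncation error of a forward difference shrinks with Δx, but once Δx is very small, roundoff from subtracting nearly equal numbers dominates and the total error grows again.

import numpy as np

u, du_exact, x = np.exp, np.exp, 1.0

for dx in [1e-2, 1e-4, 1e-6, 1e-8, 1e-10, 1e-12]:
    approx = (u(x + dx) - u(x)) / dx      # forward difference
    print(f"dx={dx:.0e}  error={abs(approx - du_exact(x)):.2e}")
# The error first decreases (truncation dominates), then increases
# once cancellation in u(x+dx) - u(x) amplifies roundoff.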
The Dangers of the Finite
Difference Method
Evidence from an example in 8.1

Dave Fairburn
Example from 8.1
 Consider ut = uxx with u(x,0) = h(x)
 We will use the finite difference method to
approximate the solution
 Forward difference for ut

 Centered difference for uxx


 Re-write equation in terms of the finite
difference approximations
Finite Difference Eqn.
 (u_j^{n+1} – u_j^n) / Δt = (u_{j+1}^n – 2u_j^n + u_{j-1}^n) / (Δx)²

 Error: The local truncation error is O(Δt)
from the left-hand side and O((Δx)²) from
the right-hand side.
Assumptions
 Assume that we choose a small change in
x, and that the denominators on both sides
of the equation are equal, i.e., Δt = (Δx)².
 We are now left with the scheme:

 u_j^{n+1} = u_{j+1}^n – u_j^n + u_{j-1}^n

 Solving for u with this scheme is now easy to
do once we have the initial data.
Initial Data
 Let u(x,0) = h(x) = a step function with
the following properties:
hj = 0 for all j except j = 5, so
hj = 0 0 0 0 1 0 0 0 0 0 ….
 Initially, only the point at j = 5 has the
value 1.
 “j” serves as the counter for the x
values.
How to solve?
 We know u_j^0 = 1 at j = 5 and 0 at all other j
initially (the superscript 0 marks the initial time level).
 We can plug into our scheme to solve for u_j^1 at
all j’s.
 u_j^1 = u_{j-1}^0 – u_j^0 + u_{j+1}^0
 u_5^1 = –1; u_4^1 = 1; u_6^1 = 1
 Now we can continue to increase the number of
iterations, n, and create a table; a short code sketch
reproducing it follows the table…
Solution for 4 iterations
n = 4:  1  -4  10  -16  19  -16  10  -4   1   0
n = 3:  0   1  -3    6  -7    6  -3   1   0   0
n = 2:  0   0   1   -2   3   -2   1   0   0   0
n = 1:  0   0   0    1  -1    1   0   0   0   0
n = 0:  0   0   0    0   1    0   0   0   0   0
j:      1   2   3    4   5    6   7   8   9  10

(rows are n values, columns are j values)
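A short sketch (Python/NumPy assumed, not part of the slides) that reproduces the table above by iterating the simplified scheme and padding with zeros outside the range of j shown:

import numpy as np

J = 10                      # grid points j = 1..10
u = np.zeros(J)
u[4] = 1.0                  # initial data: 1 at j = 5 (index 4), 0 elsewhere

print("n=0:", u.astype(int))
for n in range(1, 5):
    padded = np.concatenate(([0.0], u, [0.0]))   # zeros outside the table
    # u_j^{n+1} = u_{j-1}^n - u_j^n + u_{j+1}^n
    u = padded[:-2] - padded[1:-1] + padded[2:]
    print(f"n={n}:", u.astype(int))
# By n = 4 the values have grown to 19 even though the true solution
# of the diffusion equation stays between 0 and 1.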
Analysis of Solution
 Is this solution viable?
 The maximum principle states that the solution must
stay between 0 and 1 given our initial data
 At n = 4, our numerical solution has already ballooned to u
= 19!
 Clearly, there are cases when the finite
difference method can pose serious problems.
Charting the Error
 Assume the solution is constant and equal to 0.5 (halfway between
the possible 0 and 1)
Lessons Learned
 While the finite difference method is easy
and convenient to use in many cases,
there are some dangers associated with
the method.
 We will investigate why the assumption
that allowed us to simplify the scheme
could have been a major contributor to
the large error.
Approximations of Diffusions

Neumann Boundary Conditions
and the Crank-Nicolson Scheme

Chris Savino
Approximations of Diffusions
 Errors have accumulated from the
approximations of the derivatives using the
previous scheme
 The problem is the choice of the mesh Δt
relative to the mesh Δx
 Let s = Δt / (Δx)²

 Then we can solve the scheme

 u_j^{n+1} = s(u_{j+1}^n + u_{j-1}^n) + (1 – 2s) u_j^n
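A minimal sketch of this explicit scheme (Python/NumPy assumed; the function name and the fixed boundary values are illustrative choices, not from the slides):

import numpy as np

def explicit_step(u, s):
    """One step of u_j^{n+1} = s(u_{j+1}^n + u_{j-1}^n) + (1 - 2s) u_j^n,
    leaving the two boundary values unchanged (Dirichlet data)."""
    new = u.copy()
    new[1:-1] = s * (u[2:] + u[:-2]) + (1 - 2 * s) * u[1:-1]
    return new

# usage sketch: diffuse a spike with s = 0.4 (below the stability limit 1/2)
u = np.zeros(11)
u[5] = 1.0
for _ in range(20):
    u = explicit_step(u, s=0.4)
print(u.round(4))   # values stay between 0 and 1, as the maximum principle demands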
Neumann Boundary Conditions
 0 x l

ux (0, t )  g (t ) ux (l , t )  h(t )

 Simplest Approximations are

u1n  u 0 n uj n  uj  1n
 gn  hn
x x
 To get smallest error, we use centered
differences for the derivatives on the boundary
 Introduce ghost points

u 1
n uj  1n
 Boundary Conditions become

n n
u u
1 1 n uj  1n  uj  1n
g  hn
2x 2x
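A hedged sketch of how the ghost-point idea can be coded (Python/NumPy assumed; the helper name and the insulated-end example g = h = 0 are illustrative): the ghost values are eliminated using the centered boundary conditions before the usual explicit update is applied at every grid point.

import numpy as np

def step_with_neumann(u, s, dx, g, h):
    """One explicit step with centered (ghost-point) Neumann conditions:
    u_{-1} = u_1 - 2*dx*g  and  u_{J+1} = u_{J-1} + 2*dx*h."""
    ghost_left  = u[1]  - 2 * dx * g    # from (u_1 - u_{-1}) / (2 dx) = g
    ghost_right = u[-2] + 2 * dx * h    # from (u_{J+1} - u_{J-1}) / (2 dx) = h
    padded = np.concatenate(([ghost_left], u, [ghost_right]))
    return s * (padded[2:] + padded[:-2]) + (1 - 2 * s) * padded[1:-1]

# usage sketch: insulated ends (g = h = 0) keep the heat inside
u = np.linspace(0.0, 1.0, 21)
for _ in range(200):
    u = step_with_neumann(u, s=0.4, dx=0.05, g=0.0, h=0.0)
print(u.mean())   # stays near the initial mean of 0.5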
Crank-Nicolson Scheme
 Avoids any restriction on the mesh sizes for
stability
 Unconditionally stable no matter what the
value of s is
 Centered second difference, abbreviated as

 (δ²u)_j^n = (u_{j+1}^n – 2u_j^n + u_{j-1}^n) / (Δx)²

 Pick a number θ between 0 and 1
 Theta scheme:

 (u_j^{n+1} – u_j^n) / Δt = (1 – θ)(δ²u)_j^n + θ(δ²u)_j^{n+1}
 We analyze the scheme by plugging in a
separated solution

 u_j^n = (e^{ikΔx})^j (ξ(k))^n

 Therefore

 ξ(k) = [1 – 2(1 – θ)s(1 – cos kΔx)] / [1 + 2θs(1 – cos kΔx)]

 Must check the stability condition |ξ(k)| ≤ 1

 ξ(k) ≤ 1 holds automatically; ξ(k) ≥ –1 requires
 s(1 – 2θ)(1 – cos kΔx) ≤ 1

 Therefore, if 1 – 2θ ≤ 0 this is always true
 If θ ≥ ½ there is no restriction on the size
of s for stability to hold
 The scheme is unconditionally stable
 When θ = ½ it is called the Crank-Nicolson
scheme
 If θ < ½ then the scheme is stable if

 s = Δt / (Δx)² ≤ 1 / (2 – 4θ)
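A sketch of one way the theta scheme is commonly implemented (Python/NumPy assumed; zero Dirichlet boundary values and a dense linear solve are simplifying choices for illustration): each step solves a linear system for the new time level, and θ = ½ gives the Crank-Nicolson scheme.

import numpy as np

def theta_step(u, s, theta):
    """Advance one time step with the theta scheme:
    (u^{n+1} - u^n)/dt = (1-theta) (d2u)^n + theta (d2u)^{n+1},
    with u = 0 held at both boundaries (Dirichlet). s = dt/dx^2."""
    m = len(u) - 2                       # number of interior unknowns
    A = (np.diag(-2.0 * np.ones(m)) +    # second-difference matrix
         np.diag(np.ones(m - 1), 1) +
         np.diag(np.ones(m - 1), -1))
    I = np.eye(m)
    rhs = (I + (1 - theta) * s * A) @ u[1:-1]
    new = u.copy()
    new[1:-1] = np.linalg.solve(I - theta * s * A, rhs)
    return new

# usage sketch: Crank-Nicolson (theta = 1/2) stays bounded even for large s
u = np.zeros(21)
u[10] = 1.0
for _ in range(50):
    u = theta_step(u, s=5.0, theta=0.5)
print(np.abs(u).max())   # remains modest; the explicit scheme (theta = 0) at s = 5 would blow up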
Stability Criterion

Approximations of the diffusion
equation, ut = uxx

David Goldberg
Stability Criterion
 The method of finite differences gives an
answer, but it does not guarantee that this
answer is meaningful.
 Values must be chosen appropriately, to
ensure that the results make sense and
are applicable to real world scenarios.
 This condition, which the chosen values must
satisfy for the results to be meaningful, is called
the “stability criterion.”
Example
 As per the book, take, for instance, the
diffusion problem:
ut = uxx for 0 < x < π, t > 0
u = 0 at x = 0, π, that is
u(0, t) = u(π, t) = 0
u(x, 0) = φ(x) = x in (0, π/2), and π – x in (π/2, π)
Example, continued
As can be easily shown, the graph of φ(x) is a
triangular “tent” that rises from 0 at x = 0 to π/2
at x = π/2 and falls back to 0 at x = π.
Example, continued
 In attempting to use the method of finite
differences, we are using a forward difference
for ut and a centered difference for uxx.
 This means that

 (u_j^{n+1} – u_j^n) / Δt = (u_{j+1}^n – 2u_j^n + u_{j-1}^n) / (Δx)²

 It is important to note here that the superscript
n denotes a counter on the t variable, and the
subscript j denotes a counter on the x
variable.
Example, continued
 In order to make the calculations a bit cleaner, we
are introducing a variable, s, which is defined by

 s = Δt / (Δx)²

 Rearranging, we have

 u_j^{n+1} = s(u_{j+1}^n – 2u_j^n + u_{j-1}^n) + u_j^n
 u_j^{n+1} = s u_{j+1}^n + (1 – 2s) u_j^n + s u_{j-1}^n
 u_j^{n+1} = s(u_{j+1}^n + u_{j-1}^n) + (1 – 2s) u_j^n

 It would be nice if we could just plug in values and
get a valid result…
Example, continued
 However, putting in different values can lead to the
results being close to, or far from, the actual answer.
 For instance, letting ∆x=π/20, and letting s=5/11, we get
a relatively nice result. Letting s=5/9 does not get such a
nice result.

 So what, of significance, changes?
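Before answering, a concrete check (a sketch in Python/NumPy, which the slides do not use; the choice of 60 time steps is arbitrary): run the explicit scheme on the hat-shaped initial data with Δx = π/20 for both values of s.

import numpy as np

dx = np.pi / 20
x = np.arange(0, np.pi + dx / 2, dx)          # grid points 0, dx, ..., pi
phi = np.where(x < np.pi / 2, x, np.pi - x)   # hat-shaped initial data

def run_explicit(s, steps):
    u = phi.copy()
    for _ in range(steps):
        new = u.copy()
        new[1:-1] = s * (u[2:] + u[:-2]) + (1 - 2 * s) * u[1:-1]
        new[0] = new[-1] = 0.0                # u = 0 at x = 0 and x = pi
        u = new
    return u

for s in (5 / 11, 5 / 9):
    print(f"s = {s:.3f}: max |u| after 60 steps = {np.abs(run_explicit(s, 60)).max():.3g}")
# With s = 5/11 the values stay below the initial maximum of pi/2;
# with s = 5/9 the oscillating error grows until it swamps the true solution.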


Example, Continued
 As it turns out, changing the value of s can
significantly change the validity of the
solution. To see why, we return to our
equation.

 u_j^{n+1} = s(u_{j+1}^n + u_{j-1}^n) + (1 – 2s) u_j^n

 Separate variables: u = XT, so u_j^n = X_j T_n

 So, by combining like terms,

 T_{n+1} / T_n = s (X_{j+1} + X_{j-1}) / X_j + 1 – 2s
Example, continued
 Since the left-hand side is a function of n alone
and the right-hand side is a function of j alone,
they must both equal a constant, call it ξ.

 T_{n+1} / T_n = s (X_{j+1} + X_{j-1}) / X_j + 1 – 2s = ξ

 T_{n+1} / T_n = ξ, so T_{n+1} = ξ T_n and T_n = ξ^n T_0

 and also

 s (X_{j+1} + X_{j-1}) / X_j + 1 – 2s = ξ
Example, continued
 This is a discrete version of an ODE,
which when solved gives

 ξ = 1 – 2s + s(e^{ikΔx} + e^{–ikΔx})
 ξ = 1 – 2s + 2s cos(kΔx)

 Since, as discovered before, T_n = ξ^n T_0,
if |ξ| > 1, T will grow without bound.
 By the above, 1 – 4s ≤ ξ ≤ 1
 So we need 1 – 4s ≥ –1, i.e., s ≤ 1/2
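A quick numerical check of this amplification factor over the grid’s wave numbers (Python/NumPy assumed; the sample values of s come from the example above):

import numpy as np

dx = np.pi / 20
k = np.arange(1, 20)                      # wave numbers resolvable on the grid
for s in (5 / 11, 1 / 2, 5 / 9):
    xi = 1 - 2 * s + 2 * s * np.cos(k * dx)
    print(f"s = {s:.3f}: max |xi| = {np.abs(xi).max():.4f}")
# For these values, max |xi| exceeds 1 only when s > 1/2,
# matching the criterion derived above.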
Example, finished
 Thus, to achieve stability, s = Δt / (Δx)² ≤ 1/2. This
is why setting s = 5/9 didn’t give a valid result.
 It is to be noted that usually the necessary
criterion is |ξ| ≤ 1 + O(Δt) instead of |ξ| ≤ 1, but
in this case the distinction was irrelevant.
 So the stability criterion must be worked
out before one can effectively use the
method of finite differences.
Approximations of Diffusions

Example from 8.2

Daniel Rave
Summary
 Brief Review of Methods

 Wide Applicability

 Importance of Stability
