
Lyapunov stability theory

Consider the nonlinear system


x = f (x ) (1)
Let us assume that xeq = 0 is an equilibrium point of (1).

Q.:
What if the equilibrium point xeq ≠ 0?

A.:
There is no loss of generality in this assumption. We can always
choose a shifted coordinate system of the form y = x − xeq.
The derivative of y is then
ẏ = ẋ = f(x) = f(y + xeq) = g(y),   where g(0) = f(xeq) = 0

The system described in terms of the new variable has its
equilibrium at the origin.
Definition
The equilibrium point x = 0 of (1) is
• stable, if for each ε > 0, there is δ = δ(ε) > 0 such that
  ||x(0)|| < δ ⇒ ||x(t)|| < ε, ∀t ≥ 0

• unstable, if not stable


• asymptotically stable, if it is stable and δ can be chosen such
  that
  ||x(0)|| < δ ⇒ lim_{t→∞} x(t) = 0

• marginally stable if it is stable but not asymptotically stable

[Figure: trajectories starting inside the δ-ball around xeq, illustrating asymptotically stable, marginally stable, and unstable behavior]
Lyapunov First Method
(The indirect method)

According to the basic definitions, stability properties depend
only on the nature of the system near the equilibrium point.

Let us linearize the system description!


For small deviations from the equilibrium point, the performance
of the system is approximately governed by the linear terms.
These terms dominate and thus determine stability – provided
that the linear terms do not vanish.

The idea of checking stability by examination of a linearized
version of the system is referred to as Lyapunov's first method or
Lyapunov's indirect method.
Theorem
Let x = 0 be an equilibrium point of a nonlinear system
x = f (x )

where f : D → Rⁿ is continuously differentiable and D is a
neighborhood of the equilibrium point.
Let λi denote the eigenvalues of the matrix A = ∂f/∂x evaluated at x = 0.

1. If Re λi < 0 for all i, then x = 0 is asymptotically stable for the
   nonlinear system.
2. If Re λi > 0 for one or more i, then x = 0 is unstable for the
   nonlinear system.
3. If Re λi ≤ 0 for all i and Re λj = 0 for at least one j, then x = 0
   may be stable, asymptotically stable, or unstable for the
   nonlinear system.

Conclusion:
Except for the boundary situation, the eigenvalues of the
linearized system completely reveal the stability properties of an
equilibrium point of a nonlinear system.

If there are boundary eigenvalues, a separate analysis is required.
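
As a concrete illustration of the indirect method, here is a minimal numerical sketch (hypothetical code, not part of the original notes): it approximates the Jacobian A = ∂f/∂x at x = 0 by finite differences and classifies the equilibrium from the real parts of its eigenvalues. The example system and the step size h are assumptions chosen for illustration.

import numpy as np

def jacobian_at_origin(f, n, h=1e-6):
    # Finite-difference approximation of A = df/dx evaluated at x = 0
    A = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        A[:, j] = (f(e) - f(-e)) / (2 * h)   # central difference, column j
    return A

def f(x):
    # Assumed example: a damped pendulum, x = (angle, angular velocity)
    return np.array([x[1], -np.sin(x[0]) - 0.5 * x[1]])

eig = np.linalg.eigvals(jacobian_at_origin(f, n=2))
if np.all(eig.real < 0):
    print("x = 0 is asymptotically stable for the nonlinear system")
elif np.any(eig.real > 0):
    print("x = 0 is unstable for the nonlinear system")
else:
    print("Boundary case: the linearization is inconclusive")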


An example

[Figure: mass M attached to a spring K and a damper B, displaced by y under an external force f]

M d²y(t)/dt² + B dy(t)/dt + K y(t) = f(t)

Moreover, since we are interested in stability properties, we set f(t) = 0.

My + By + Ky = 0 , equilibrium point : y = 0

State variables

x1(t) = y(t)
x2(t) = ẏ(t)

ẋ1 = ẏ = x2
ẋ2 = ÿ = −(K/M) x1 − (B/M) x2
The total stored energy is given by
V(t) = (1/2) K x1² + (1/2) M x2²

which has the following properties:

• positive for all nonzero values of x1(t) and x2(t)
• equal to zero when x1(t) = x2(t) = 0

The time derivative of V(t) is given by:


dV(t)/dt = (∂V/∂x1) ẋ1 + (∂V/∂x2) ẋ2 = K x1 x2 + M x2 (−(K/M) x1 − (B/M) x2)

dV(t)/dt = −B x2²

dV/dt is negative ⇒ the state must move from its initial state in
the direction of smaller values of V(t)

[Figure: nested level curves V = C1, V = C2, V = C3 with C1 < C2 < C3 in the (x1, x2) plane; the trajectory crosses them toward smaller values of V]
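
To make this concrete, here is a small sketch (hypothetical code, not from the original notes): it evaluates dV/dt = ∇V·f at sampled states of the mass-spring-damper and runs a crude simulation showing the stored energy decaying. The parameter values M, B, K, the initial state, and the step size are assumptions.

import numpy as np

# Illustrative parameter values (assumptions)
M, B, K = 1.0, 0.4, 2.0

def f(x):
    # Mass-spring-damper in state-space form, x = (x1, x2) = (y, dy/dt)
    return np.array([x[1], -(K / M) * x[0] - (B / M) * x[1]])

def V(x):
    # Total stored energy, used as the Lyapunov function
    return 0.5 * K * x[0] ** 2 + 0.5 * M * x[1] ** 2

def V_dot(x):
    # dV/dt along trajectories = grad V(x) . f(x), analytically -B*x2**2
    grad = np.array([K * x[0], M * x[1]])
    return grad @ f(x)

# dV/dt is non-positive at randomly sampled states
rng = np.random.default_rng(0)
print("dV/dt <= 0 at all samples:",
      all(V_dot(x) <= 1e-9 for x in rng.uniform(-5, 5, size=(1000, 2))))

# A crude forward-Euler run: the stored energy decays toward zero
x, dt = np.array([1.0, 0.0]), 1e-3
for _ in range(20000):
    x = x + dt * f(x)
print("V after 20 seconds:", V(x))   # much smaller than V(0) = 1.0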
Lyapunov Second Method
(The direct method)
Theorem

Let x = 0 be an equilibrium point of a nonlinear system


x = f (x )

Let V : D → R be a continuously differentiable function on a
neighborhood D of x = 0, such that

V(0) = 0 and V(x) > 0 in D − {0},

V̇(x) ≤ 0 in D
Then, x = 0 is stable.
Moreover, if V ( x) <0 in D – {0} then x = 0 is asymptotically
stable

The task:
To find V(x), called a Lyapunov function, which must satisfy the
following requirements:
• V is continuous
• V(x) has a unique minimum at xeq with respect to all other
points in D
• Along any trajectory of the system contained in D the value of
V never increases
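
As a small illustration of checking these requirements (a hypothetical sketch, not part of the original notes), the following computes V̇ = ∇V·f symbolically for an assumed example system whose linearization at the origin has eigenvalues ±i, so the first method is inconclusive, while the direct method settles the question.

import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)

# Assumed example system: its linearization at x = 0 has eigenvalues +/- i
f = sp.Matrix([x2 - x1**3, -x1 - x2**3])

# Lyapunov function candidate: continuous, unique minimum at the origin
V = x1**2 + x2**2

# Along trajectories, V_dot = grad(V) . f
V_dot = sp.expand(sp.Matrix([V.diff(x1), V.diff(x2)]).dot(f))
print(V_dot)   # -2*x1**4 - 2*x2**4, negative for (x1, x2) != (0, 0)

Since V̇ < 0 away from the origin, the theorem gives asymptotic stability of x = 0 for this example.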
What if the stability of x = 0 has been established?

The first Lyapunov method determines stability only in the
immediate vicinity of the equilibrium point.

The second Lyapunov method makes it possible to determine how far
from the equilibrium point a trajectory can start and still
converge to it as t approaches ∞.


region of asymptotic stability (region of attraction, basin)

Let φ(t; x) be the solution of the system equation that starts at
initial state x at time t = 0.
Then the region of attraction is defined as the set of all points x
such that
lim_{t→∞} φ(t; x) = 0

If Ωc = { x ∈ Rⁿ | V(x) ≤ c } is bounded and contained in D, then
every trajectory starting in Ωc remains in Ωc and approaches the
equilibrium point as t → ∞.

Thus, Ωc is an estimate of the region of attraction.
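
A rough numerical sketch of such an estimate (hypothetical code, not from the original notes): for an assumed scalar example ẋ = −x + x³ with V(x) = x²/2, it looks for the smallest value of V at which V̇ stops being negative; any smaller c then defines a sublevel set Ωc lying inside the region where V decreases.

import numpy as np

def f(x):
    # Assumed example: the origin is asymptotically stable, |x| < 1 is its basin
    return -x + x**3

def V(x):
    return 0.5 * x**2          # Lyapunov function candidate

def V_dot(x):
    return x * f(x)            # dV/dt = V'(x)*f(x) = -x**2 + x**4

# Grid search for states (away from 0) where V_dot fails to be negative
xs = np.linspace(-3, 3, 2001)
bad = xs[(xs != 0) & (V_dot(xs) >= 0)]
c_max = V(np.abs(bad).min()) if bad.size else np.inf
print("Omega_c estimates the region of attraction for any c <", c_max)  # about 0.5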


Types of stability with reference to the region of attraction:

• local stability (stability in the small) – when a system remains


within an infinitesimal region around the equilibrium when
subjected to small perturbation
• finite stability – when a system returns to the equilibrium point
from any point within a region R of finite dimensions
surrounding it
• global stability (stability in the large) – if the region R includes
the entire state space

Theorem
Let x = 0 be an equilibrium point of a nonlinear system
x = f (x )

Let V : Rⁿ → R be a continuously differentiable function such that

V(0) = 0 and V(x) > 0 ∀x ≠ 0,

||x|| → ∞ ⇒ V(x) → ∞,

V̇(x) < 0 ∀x ≠ 0

then x = 0 is globally asymptotically stable.
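
A quick worked illustration (not from the original notes): for the scalar system ẋ = −x³, the function V(x) = x² satisfies V(0) = 0, V(x) > 0 for x ≠ 0, V(x) → ∞ as |x| → ∞, and V̇(x) = 2x ẋ = −2x⁴ < 0 for x ≠ 0, so by the theorem x = 0 is globally asymptotically stable.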
Another example (a pursuit problem)
Suppose a hound is chasing a rabbit in such a way that its
velocity vector always points directly toward the rabbit.
The speeds of the rabbit and the hound are constant and
denoted by R and H, respectively.

Let xr, yr, and xh, yh denote the x and y coordinates of the rabbit
and hound, respectively. Then
x r = R
x h2 + y h2 = H 2
y r = y r = 0
The fact that the velocity vector of the hound always points toward
the rabbit means that
ẋh = −k (xh − xr)
ẏh = −k (yh − yr),   k – a positive constant

So
ẋh = −H (xh − xr) / √((xh − xr)² + yh²)
ẏh = −H yh / √((xh − xr)² + yh²)

Let us introduce relative coordinates – the difference in position
between the hound and the rabbit:

x = xh − xr
y = yh

ẋ = −H x / √(x² + y²) − R
ẏ = −H y / √(x² + y²)                    (*)

Will the hound always catch the rabbit?


Will a trajectory with an arbitrary initial condition eventually get
to the point where the relative coordinates are zero?


We can consider the origin as an equilibrium point.

What are the conditions for global stability of the system?

We have to find a suitable Lyapunov function for the system
given by (*)

Let us choose as a Lyapunov function

V(x, y) = x² + y²

Then

V̇(x, y) = 2x ẋ + 2y ẏ = −2H (x² + y²)/√(x² + y²) − 2Rx = −2H √(x² + y²) − 2Rx

If H > R:
• if x = 0 and y ≠ 0, it is clear that V̇(x, y) < 0
• if x ≠ 0, then
  −H √(x² + y²) − Rx ≤ −H |x| − Rx ≤ −(H − R) |x| < 0

Thus, V̇(x, y) < 0 for all (x, y) except the origin.


If the hound runs faster than the rabbit, it always catches the
rabbit.
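
The same conclusion can be checked numerically. Here is a minimal simulation sketch of the relative dynamics (*) (hypothetical code, not part of the original notes); the speeds H and R, the initial relative position, the step size, and the capture threshold are assumptions.

import numpy as np

H, R = 2.0, 1.0                    # assumed speeds, hound faster (H > R)
x, y = 10.0, 5.0                   # assumed initial relative coordinates

def f(x, y):
    # Relative dynamics (*) of the pursuit problem
    d = np.hypot(x, y)             # distance between hound and rabbit
    return -H * x / d - R, -H * y / d

dt, t = 1e-3, 0.0
while np.hypot(x, y) > 1e-2:       # stop once the hound has (numerically) caught up
    dx, dy = f(x, y)
    x, y = x + dt * dx, y + dt * dy
    t += dt

print(f"caught after about {t:.1f} time units")   # finite, since H > R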
Comments on the second Lyapunov method:

• determines stability without actually having to solve the
  differential equation
• can be applied even if the system model cannot be linearized
• makes it possible to estimate the stability region
• in some cases there are natural Lyapunov function candidates,
  like energy functions in electrical or mechanical systems
• the stability conditions are sufficient, but not necessary
• there is no systematic method for finding Lyapunov functions
  – sometimes a matter of trial and error
• a Lyapunov function for any particular system is not unique
Nonlinear phenomena:

• Finite escape time: The state of an unstable linear system goes
  to infinity as time approaches infinity; a nonlinear system's
  state, however, can go to infinity in finite time (a worked
  example follows this list).

• Multiple isolated equilibria: A linear system can have only one
  isolated equilibrium point; hence it can have only one steady-
  state operating point which attracts the state of the system
  irrespective of the initial state. A nonlinear system can have
  more than one isolated equilibrium point. The state may
  converge to one of several steady-state operating points,
  depending on the initial state of the system.

• Limit cycles: For a linear time-invariant system to oscillate, it
  must have a pair of eigenvalues on the imaginary axis, which is
  a nonrobust condition that is almost impossible to maintain in
  the presence of perturbations. Even if it is maintained, the
  amplitude of the oscillation will depend on the initial state. In
  real life, stable oscillations must be produced by nonlinear
  systems. There are nonlinear systems which can go into an
  oscillation of fixed amplitude and frequency, irrespective of the
  initial state (a so-called limit cycle).
• Subharmonic, harmonic or almost-periodic oscillations: A
  stable linear system under a periodic input produces an output
  of the same frequency. A nonlinear system under periodic
  excitation can oscillate with frequencies which are
  submultiples or multiples of the input frequency. It may even
  generate an almost-periodic oscillation, an example of which
  is the sum of periodic oscillations with frequencies which are
  not multiples of each other.

• Chaos: A nonlinear system can have more complicated steady-
  state behavior that is not an equilibrium, a periodic oscillation,
  or an almost-periodic oscillation. Such behavior is usually
  referred to as chaos. Some of these chaotic motions exhibit
  randomness, despite the deterministic nature of the system.
• Multiple modes of behavior: It is not unusual for two or more
  modes of behavior to be exhibited by the same nonlinear
  system. An unforced system may have more than one limit
  cycle. A forced system with periodic excitation may exhibit
  harmonic, subharmonic or more complicated steady-state
  behavior, depending upon the amplitude and frequency of the
  input. It may even exhibit a discontinuous jump in the mode of
  behavior as the amplitude or frequency of the excitation is
  smoothly changed.
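
A quick worked illustration of finite escape time (not from the original notes): for the scalar system ẋ = x² with x(0) = x0 > 0, separation of variables gives x(t) = x0 / (1 − x0 t), which grows without bound as t approaches 1/x0; the state escapes to infinity in finite time, something the state of a linear system cannot do.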
