Lecture 2: Review of Classical Control Systems

Review of Basic Concepts of Classical Control
What is a Control System?
• A system that controls the operation of another system.
• A system that can regulate itself and another system.
• A control system is a device, or set of devices, that manages, commands, directs or regulates the behaviour of other device(s) or system(s).
Types of Control System
• Natural Control Systems
  – The universe
  – The human body
• Man-made Control Systems
  – Vehicles
  – Aeroplanes
Types of Control System
• Manual Control Systems
  – Room temperature regulation via an electric fan
  – Water level control
Types of Control System
Open-Loop Control Systems
Open-loop control systems utilize a controller or control actuator to obtain the desired response.
• The output has no effect on the control action.
• In other words, the output is neither measured nor fed back.

[Block diagram: Input → Controller → Process → Output]
Types of Control System
Closed-Loop Control Systems

[Block diagram: Input → Comparator → Controller → Process → Output, with a Measurement block feeding the output back to the comparator]
Types of Control System
Multivariable Control System

[Block diagram: Temp, Humidity and Pressure references → Comparator → Controller → Process → Outputs, with Measurements fed back to the comparator]
Types of Control System
Feedback Control System

[Block diagram: a feedback control system]
Types of Control System
Servo System
• A control system in which the output varies linearly with the input is called a linear control system.

[Plots: output y(t) versus input u(t) for two linear systems]
Types of Control System
Linear vs Nonlinear Control System
• When the input and output have a nonlinear relationship, the system is said to be nonlinear.

[Plot: adhesion coefficient versus creep, a nonlinear characteristic]
Types of Control System
Linear vs Nonlinear Control System
• Linear control systems do not exist in practice.
• Linear control systems are idealized models fabricated by the analyst purely for the simplicity of analysis and design.

[Plots: adhesion characteristics of a road (adhesion coefficient versus creep) and temperature (up to 500 °C) versus valve position (0–100 % open)]
Types of Control System
Time-invariant vs Time-variant
• When the characteristics of the system do not depend upon time itself, the system is said to be a time-invariant control system.

y(t) = 2u(t) + 1    (time-invariant)
y(t) = 2u(t) + 3t   (time-variant)
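The two models above can be checked numerically. The sketch below (pure Python; the test signal, shift value and evaluation instant are arbitrary choices) shifts the input in time and compares against the correspondingly shifted output:

```python
import math

# Numeric check of time invariance: shift the input by tau and see whether
# the output shifts by the same amount.
def y_invariant(u, t):     # y(t) = 2 u(t) + 1  -> time-invariant
    return 2 * u(t) + 1

def y_variant(u, t):       # y(t) = 2 u(t) + 3t -> time-variant
    return 2 * u(t) + 3 * t

u = math.sin               # an arbitrary test input signal
tau, t = 1.5, 2.0          # time shift and evaluation instant
u_shifted = lambda tt: u(tt - tau)

# For a time-invariant system, the shifted input gives the shifted output.
diff_invariant = abs(y_invariant(u_shifted, t) - y_invariant(u, t - tau))
# The explicit 3t term breaks the shift property.
diff_variant = abs(y_variant(u_shifted, t) - y_variant(u, t - tau))
```

For the time-invariant model the two evaluations agree exactly; the 3t term makes them disagree by 3τ.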
Types of Control System
Lumped Parameter vs Distributed Parameter
• Control systems that can be described by ordinary differential equations are lumped-parameter control systems.

M d²x/dt² + C dx/dt + kx

• Whereas distributed-parameter control systems are described by partial differential equations.

∂x/∂y = f₁ ∂x/∂z + f₂ ∂²x/∂z²
Types of Control System
Continuous Data vs Discrete Data System
• A discrete-time control system involves one or more variables that are known only at discrete instants of time.

[Plot: a discrete-time signal x[n] versus the sample index n]
Types of Control System
Deterministic vs Stochastic Control System

[Plots: deterministic versus stochastic signals as functions of time t]
Types of Control System
Adaptive Control System

Types of Control System
Learning Control System
Classification of Control Systems
• Control Systems
  – Natural
  – Man-made
    • Manual
    • Automatic
      – Open-loop (non-linear, linear)
      – Closed-loop (non-linear, linear)
Examples of Control Systems

Examples of Modern Control Systems

[Figures: examples of classical and modern control systems]
Transfer Function
• The transfer function is the ratio of the Laplace transform of the output to the Laplace transform of the input, assuming all initial conditions are zero.

If u(t) ↔ U(s) and y(t) ↔ Y(s), then the transfer function is G(s) = Y(s)/U(s).
Why Laplace Transform?
• By use of the Laplace transform we can convert many common functions into algebraic functions of the complex variable s.
• For example

L{sin ωt} = ω/(s² + ω²)

or

L{e^(−at)} = 1/(s + a)

• Where s is a complex variable (complex frequency) and is given as

s = σ + jω
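The table entry L{e^(−at)} = 1/(s + a) can be verified by direct numerical integration of the defining integral. The values of a, s, the step size and the truncation horizon below are illustrative choices:

```python
import math

# Numerically approximate L{e^(-a t)}(s) = integral_0^inf e^(-a t) e^(-s t) dt
# and compare with the table result 1/(s + a).
a, s = 2.0, 5.0            # arbitrary decay rate and real test frequency
dt, T = 1e-4, 20.0         # step size; integrand is negligible beyond t = T

n = int(T / dt)
integral = sum(math.exp(-(a + s) * (k * dt)) * dt for k in range(n))

table_value = 1.0 / (s + a)
error = abs(integral - table_value)
```

With this step size the left Riemann sum agrees with 1/(s + a) to better than 10⁻³.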
Laplace Transform of Derivatives
• Not only can common functions be converted into simple algebraic expressions, but calculus operations can also be converted into algebraic expressions.
• For example

L{dx(t)/dt} = sX(s) − x(0)

L{d²x(t)/dt²} = s²X(s) − s·x(0) − dx(0)/dt
Laplace Transform of Derivatives
• In general

L{dⁿx(t)/dtⁿ} = sⁿX(s) − sⁿ⁻¹x(0) − … − x⁽ⁿ⁻¹⁾(0)
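The derivative rule can be checked numerically too. This sketch uses the test function x(t) = e^(−2t) (an arbitrary choice), transforms dx/dt directly by numerical integration, and compares with s·X(s) − x(0):

```python
import math

# Verify L{dx/dt} = s X(s) - x(0) for x(t) = e^(-2t) at a real test frequency.
s = 4.0
x = lambda t: math.exp(-2 * t)
dx = lambda t: -2 * math.exp(-2 * t)       # analytic derivative of x

dt, T = 1e-4, 20.0                          # integration step and horizon
laplace = lambda f: sum(f(k * dt) * math.exp(-s * k * dt) * dt
                        for k in range(int(T / dt)))

lhs = laplace(dx)                           # L{dx/dt} computed directly
X_s = 1.0 / (s + 2.0)                       # table transform of e^(-2t)
rhs = s * X_s - x(0)                        # the derivative rule's prediction
error = abs(lhs - rhs)
```

Both sides come out to −2/(s + 2) = −1/3, up to the integration error.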
Example: RC Circuit

[Figure: an RC circuit]
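The circuit figure is not reproduced in these notes. Assuming the common series RC low-pass arrangement (input: source voltage, output: capacitor voltage), the transfer function is G(s) = 1/(RCs + 1); the sketch below evaluates it with illustrative component values:

```python
import math

# Assumed series RC low-pass: G(s) = 1/(RCs + 1). Component values are
# illustrative, not from the (missing) figure.
R, C = 1e3, 1e-6           # 1 kOhm, 1 uF -> time constant RC = 1 ms
tau = R * C

pole = -1.0 / tau          # single pole at s = -1/RC (in the LHP -> stable)

# Unit-step response of G(s): v(t) = 1 - e^(-t/RC)
v = lambda t: 1.0 - math.exp(-t / tau)
v_at_tau = v(tau)          # ~63.2% of the final value after one time constant
```

The single LHP pole at −1/RC ties this example to the stability discussion later in the lecture.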
Laplace Transform of Integrals

L{∫x(t)dt} = X(s)/s
Calculation of the Transfer Function
• Consider the following ODE where y(t) is the input of the system and x(t) is the output.

A d²x(t)/dt² = C dy(t)/dt − B dx(t)/dt

• or

A d²x(t)/dt² + B dx(t)/dt = C dy(t)/dt
Calculation of the Transfer Function
• Taking the Laplace transform of both sides (zero initial conditions):

As²X(s) = CsY(s) − BsX(s)

• Rearranging the above equation

As²X(s) + BsX(s) = CsY(s)

X(s)[As² + Bs] = CsY(s)

X(s)/Y(s) = Cs/(As² + Bs) = C/(As + B)
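As a quick numeric sanity check, the unsimplified ratio Cs/(As² + Bs) and the simplified form C/(As + B) should agree at any test point s ≠ 0. The values of A, B, C and s below are arbitrary choices:

```python
# Compare the unsimplified and simplified transfer functions at a test point.
A, B, C = 1.0, 3.0, 10.0   # illustrative coefficients
s = 2.0 + 0.0j             # arbitrary test frequency (s != 0)

g_full = (C * s) / (A * s**2 + B * s)   # Cs / (As^2 + Bs)
g_reduced = C / (A * s + B)             # C / (As + B) after cancelling s
diff = abs(g_full - g_reduced)
```

Cancelling the common factor of s is valid away from s = 0, which is why the two forms agree numerically.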
Example
1. Find the transfer function of the RC network shown in Figure-1. Assume that the capacitor is not initially charged.

Figure-1

2. u(t) and y(t) are the input and output, respectively, of a system defined by the following ODE. Determine the transfer function. Assume there is no energy stored in the system.
Transfer Function
• In general, a transfer function is a ratio of two polynomials in s.
• The transfer function is called 'proper' if the order of the denominator polynomial is greater than or equal to the order of the numerator polynomial; otherwise it is 'improper'.
Transfer Function
• The transfer function helps us to check key properties of the system, such as its stability.
Stability of Control System
• There are several meanings of stability; in general, there are two kinds of stability definitions in control system study.
  – Absolute stability
  – Relative stability
Stability of Control System
• A pole can also be defined as "the frequency at which the system's response becomes infinite". Hence the name pole, where the field is infinite, like a magnetic pole or a black hole.
Relation b/w Poles and Zeros and Frequency Response of the System
• The relationship between the poles and zeros and the frequency response of a system comes alive with a 3D pole-zero plot.
• 3D pole-zero plot
  – The system has 1 'zero' and 2 'poles'.
Example
• Consider the transfer function calculated in the previous slides.

G(s) = X(s)/Y(s) = C/(As + B)

• The only pole of the system is at s = −B/A.
Examples
• Consider the following transfer functions.
  – Determine
    • whether the transfer function is proper or improper
    • the poles of the system
    • the zeros of the system
    • the order of the system

i) G(s) = (s + 3)/[s(s + 2)]
ii) G(s) = s/[(s + 1)(s + 2)(s + 3)]
iii) G(s) = (s + 3)²/[s(s² + 10)]
iv) G(s) = s²(s + 1)/[s(s + 10)]
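Since the transfer functions are given in factored form, the poles and zeros can be read off directly. The sketch below tabulates them and derives properness and order (a hand-rolled illustration, not the assigned solution method):

```python
import math

# Poles and zeros read off from the factored forms above. Properness:
# denominator order >= numerator order. Order: denominator order.
w = math.sqrt(10)           # s^2 + 10 = 0  ->  s = +/- j*sqrt(10)
systems = {
    "i":   {"zeros": [-3],        "poles": [0, -2]},
    "ii":  {"zeros": [0],         "poles": [-1, -2, -3]},
    "iii": {"zeros": [-3, -3],    "poles": [0, complex(0, w), complex(0, -w)]},
    "iv":  {"zeros": [0, 0, -1],  "poles": [0, -10]},
}

for name, sys in systems.items():
    sys["order"] = len(sys["poles"])
    sys["proper"] = len(sys["poles"]) >= len(sys["zeros"])

# iv) has more zeros than poles, so it is improper; the rest are proper.
```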
Stability of Control Systems
• The poles and zeros of the system are plotted in the s-plane to check the stability of the system.

[Figure: the s-plane, with the left half plane (LHP) and right half plane (RHP) separated by the jω axis. Recall s = σ + jω.]
Stability of Control Systems
• If all the poles of the system lie in the left half plane, the system is said to be stable.
• If any of the poles lie in the right half plane, the system is said to be unstable.
• If pole(s) lie on the imaginary axis, the system is said to be marginally stable.

[Pole-zero map: all poles in the LHP, labelled 'stable']
Examples

[Pole-zero maps illustrating stable, unstable and marginally stable pole configurations]

• Relative stability

[Pole-zero maps: two stable systems whose poles lie at different distances from the jω axis]
Stability of Control Systems
• For example

G(s) = C/(As + B),  if A = 1, B = 3 and C = 10

• Then the only pole of the system lies at

pole = −3

[Figure: s-plane with the pole marked at s = −3 in the LHP]
Examples
• Consider the following transfer functions.
  – Determine whether the transfer function is proper or improper
  – Calculate the poles and zeros of the system
  – Determine the order of the system
  – Draw the pole-zero map
  – Determine the stability of the system

i) G(s) = (s + 3)/[s(s + 2)]
ii) G(s) = s/[(s + 1)(s + 2)(s + 3)]
iii) G(s) = (s + 3)²/[s(s² + 10)]
iv) G(s) = s²(s + 1)/[s(s + 10)]
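A minimal sketch of the stability test behind the pole-zero maps: classify a system from the real parts of its poles (assuming any imaginary-axis poles are simple, and taking the listed pole sets at face value, common-factor cancellation in iv) aside):

```python
import math

def classify(poles, tol=1e-9):
    """Stable if all poles have negative real part; unstable if any has a
    positive real part; otherwise marginally stable (axis poles present)."""
    if any(p.real > tol for p in poles):
        return "unstable"
    if any(abs(p.real) <= tol for p in poles):
        return "marginally stable"
    return "stable"

w = math.sqrt(10)
result_i   = classify([0, -2])                          # pole at the origin
result_ii  = classify([-1, -2, -3])                     # all poles in the LHP
result_iii = classify([0, complex(0, w), complex(0, -w)])
result_iv  = classify([0, -10])
```

Only ii) has every pole strictly in the LHP; the others carry at least one imaginary-axis pole.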
Another definition of Stability
• The system is said to be stable if for any bounded input the output of the system is also bounded (BIBO stability).
• Thus, for any bounded input, the output remains bounded.

[Figure: a unit step input u(t) applied to a plant produces a bounded output y(t) with some overshoot before settling]
Another definition of Stability
• If for any bounded input the output is not bounded, the system is said to be unstable.

[Figure: a unit step input u(t) applied to an unstable plant produces an unbounded output y(t) = e^(at)]
BIBO vs Transfer Function
• For example

G1(s) = Y(s)/U(s) = 1/(s + 3)      G2(s) = Y(s)/U(s) = 1/(s − 3)

[Pole-zero maps: G1 has its single pole at s = −3 (stable); G2 has its single pole at s = +3 (unstable)]
BIBO vs Transfer Function
• For example, for a unit-impulse input U(s) = 1:

Y(s) = G1(s)·U(s) = 1/(s + 3)  →  y(t) = e^(−3t)·u(t)
Y(s) = G2(s)·U(s) = 1/(s − 3)  →  y(t) = e^(3t)·u(t)
BIBO vs Transfer Function

y(t) = e^(−3t)·u(t)      y(t) = e^(3t)·u(t)

[Plots: e^(−3t)·u(t) decays from 1 towards 0; e^(3t)·u(t) grows without bound, reaching the order of 10¹² by t = 10]
BIBO vs Transfer Function
• Whenever one or more poles lie in the RHP, the solution of the dynamic equations contains increasing exponential terms, such as e^(3t).
• That makes the response of the system unbounded, and hence the overall response of the system is unstable.
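The boundedness argument can be illustrated by sampling the two impulse responses from the example above over a finite horizon (a numeric illustration, not a proof):

```python
import math

# Sample y1(t) = e^(-3t) (pole at s = -3) and y2(t) = e^(3t) (pole at s = +3)
# on t = 0..10 and compare their peak magnitudes.
t_grid = [0.1 * k for k in range(101)]

y_stable   = [math.exp(-3 * t) for t in t_grid]
y_unstable = [math.exp(+3 * t) for t in t_grid]

peak_stable = max(abs(y) for y in y_stable)       # peaks at its initial value 1
peak_unstable = max(abs(y) for y in y_unstable)   # ~e^30, astronomically large
```

The LHP pole gives a response that never exceeds its initial value; the RHP pole gives a response that has already grown past 10¹² by t = 10.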
To download this lecture visit http://imtiazhussainkalwar.weebly.com/

END OF LECTURE-2