
Priority-driven Scheduling of Periodic Tasks: RM and DM

OUTLINE
Fixed-priority algorithms for scheduling periodic tasks
- Optimality of RM?
- Optimality of DM?
- U_lub for 2 tasks with RM
- U_lub for n tasks with RM

Ref: [Liu] Ch. 6 (pp. 129-131; 146-152)
Optimality of RM?

We have already seen that fixed-priority scheduling algorithms are not optimal
for periodic tasks with arbitrary periods.
However, RM is optimal in the special case when:
- tasks are simply periodic (for every pair of tasks, one period is a multiple
  of the other), and
- relative deadlines D_i are not shorter than the periods p_i.
We will prove this in the following slide (theorem).

Notation:
- task indexes are ordered by priority: T_i has priority higher than T_k if i < k;
- T_i denotes the subset of tasks with priority equal to or higher than T_i
  (T_i = {T_1, T_2, ..., T_i});
- U_i = Σ_{k=1..i} u_k = Σ_{k=1..i} e_k/p_k is the total utilization of T_i.
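As a small illustration of this notation, the following sketch (not part of the original slides; the task parameters are hypothetical) computes U_i for the i highest-priority tasks:

```python
# Hypothetical task set, listed in priority order (index 0 = highest priority).
# Each task T_k is given as (period p_k, execution time e_k).
tasks = [(4, 1), (6, 2), (12, 3)]

def total_utilization(tasks, i):
    """U_i = sum of u_k = e_k / p_k over the i highest-priority tasks."""
    return sum(e / p for (p, e) in tasks[:i])

for i in range(1, len(tasks) + 1):
    print(f"U_{i} = {total_utilization(tasks, i):.3f}")
# U_1 = 0.250, U_2 = 0.583, U_3 = 0.833
```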
First optimality of RM

Theorem: A system of independent, preemptable, simply periodic tasks with
relative deadlines equal to or larger than their periods is feasible on one
processor with the RM algorithm iff its total utilization is less than or
equal to 1.

Proof: only-if: obvious; if: we show that if the task set is not feasible, then U > 1
- assume T_i misses its deadline at time t; let t = 0 be the last instant
  before t when the processor idles or executes jobs with priority lower than T_i;
- we assume D_k = p_k for every k = 1, ..., i: if a system is feasible when
  D_k = p_k for every k, it is feasible also when D_k > p_k for some k;
- t is an integer multiple of p_k (k = 1, 2, ..., i);
- total time required to complete all jobs with deadline d ≤ t:
  Σ_{k=1..i} e_k (t/p_k) = t Σ_{k=1..i} e_k/p_k = t U_i
- if T_i misses its deadline, then the above time exceeds the available time t:
  t U_i > t  →  U_i > 1
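The theorem can be turned directly into a feasibility test. A minimal sketch (Python, hypothetical task set), assuming tasks are given as (period, execution time, relative deadline):

```python
def is_simply_periodic(periods):
    """True if, for every pair of periods, one is an integer multiple of the other."""
    ps = sorted(periods)
    return all(ps[j] % ps[i] == 0 for i in range(len(ps)) for j in range(i + 1, len(ps)))

def rm_feasible_simply_periodic(tasks):
    """tasks: list of (p, e, D) with D >= p and simply periodic periods.
    By the theorem, such a set is RM-feasible iff its total utilization is <= 1."""
    assert is_simply_periodic([p for (p, e, D) in tasks])
    assert all(D >= p for (p, e, D) in tasks)
    return sum(e / p for (p, e, D) in tasks) <= 1

# hypothetical simply periodic set with periods 2, 4, 8: U = 0.25 + 0.25 + 0.25 = 0.75
print(rm_feasible_simply_periodic([(2, 0.5, 2), (4, 1.0, 4), (8, 2.0, 8)]))  # True
```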
Critical instants

In a fixed-priority task set with D_i ≤ p_i, a critical instant for task T_k is the
time at which the release of one of its jobs will produce the largest response
time w_k.
It is easy to realize that a critical instant for task T_k occurs when one of its
jobs is released simultaneously with a job of every higher-priority task:

  [figure: T_k alone: w_{k,1} = e_k;
   T_k together with a higher-priority task T_i: w_{k,1} = e_k + 3e_i and
   w_{k,2} = e_k + 4e_i for the jobs J_{k,1} and J_{k,2}]

- the response time w_{k,1} of a job J_{k,1} is delayed by the interference of a
  higher-priority task T_i (in the bottom figure, jobs of T_i delay w_{k,1} by 3e_i);
- when the release time of J_{i,y} gets closer to the release time of J_{k,x}
  (e.g. J_{i,5} and J_{k,2}), the delay may increase (4e_i for J_{k,2});
- the response time of J_{k,x} is maximum when the two release times coincide;
- repeat the argument for all tasks T_j (j = 1, 2, ..., k-1).
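The critical-instant argument underlies the time-demand (response-time) recurrence w = e_k + Σ_{i<k} ⌈w/p_i⌉·e_i commonly used for fixed-priority analysis, evaluated with all tasks released at t = 0 (not derived in these slides). A small sketch with a hypothetical task set and D_i = p_i:

```python
import math

def worst_case_response_time(tasks, k):
    """Worst-case response time of task k (index 0 = highest priority), assuming
    it is released at a critical instant together with every higher-priority task.
    tasks: list of (period p, execution time e) in priority order, with D = p."""
    p_k, e_k = tasks[k]
    w = e_k
    while True:
        # own execution plus interference of higher-priority jobs released in [0, w)
        w_next = e_k + sum(math.ceil(w / p_i) * e_i for (p_i, e_i) in tasks[:k])
        if w_next == w:
            return w          # fixed point: worst-case response time
        if w_next > p_k:
            return None       # exceeds the deadline D = p: task k is not feasible
        w = w_next

tasks = [(4, 1), (6, 2), (12, 3)]   # hypothetical set, priority order = RM order
print([worst_case_response_time(tasks, k) for k in range(len(tasks))])  # [1, 3, 10]
```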
Second optimality of RM

Theorem: If a task set with p_i = D_i can be feasibly scheduled by a fixed-priority
algorithm, then it can be feasibly scheduled by RM.

Proof:
1. if jobs of all tasks are feasible at their critical instants, then they are
   feasible in any other condition; so we consider the (worst-case) critical
   instants;
2. consider a set of 2 periodic tasks T_1 and T_2 with p_1 < p_2: if priorities
   are not assigned according to RM, then T_2 has the higher priority:

   [figure: T_2 executes e_2 first, then T_1 executes e_1 with response time w_1,
    within its period p_1; periods p_1 and p_2 shown]

   from the figure it is easy to see that, at critical instants, the schedule
   is feasible iff e_1 + e_2 ≤ p_1;
3. we now prove that if this condition holds, then the 2 tasks are
   schedulable also with RM.
Second optimality of RM (2)

We prove that if e_1 + e_2 ≤ p_1, then the 2 tasks are schedulable also with RM:
- let us invert the priorities of the 2 tasks so that they respect the RM rule
  (p_1 < p_2 and T_1 has now the higher priority);
- let us consider the jobs J_{1,1} (of T_1) and J_{2,1} (of T_2) starting at a
  critical instant:

  [figure: T_1 executes first, so w_1 = e_1; then T_2 executes e_2, with response
   time w_2; periods p_1 and p_2 shown]

- the response time w_1 of J_{1,1} is shorter than in the previous case (J_{1,1} is
  not delayed by T_2): if J_{1,1} did not miss its deadline in the previous case,
  it will not miss it now;
- since e_1 + e_2 ≤ p_1, the response time w_2 of J_{2,1} is equal to e_1 + e_2
  (J_{2,1} is delayed by one and only one job of T_1: since e_1 + e_2 ≤ p_1, the
  release of the second job of T_1 occurs after J_{2,1} has completed);
- then w_2 = e_1 + e_2 ≤ p_1 < p_2  →  J_{2,1} will not miss its deadline;
- the 2 tasks are feasible; the result can be extended to n tasks.
Optimality of RM when D_i > p_i?

Problem: If a task set with D_i > p_i can be feasibly scheduled by a fixed-priority
algorithm, then can it be feasibly scheduled by RM? Or can it be feasibly
scheduled by DM?
The answer is NO to both questions, as proven by the following example:
  T_1 = (9, 3, 10);  T_2 = (6, 4, 8)   (period, execution time, relative deadline)
  U = 3/9 + 4/6 = 1
- the system is feasible with fixed priority T_1 > T_2:

  [figure: schedule over (0, 18] showing the releases r_{1,1}, r_{1,2} of T_1 and
   r_{2,1}, r_{2,2}, r_{2,3} of T_2 with their deadlines d_{1,1}, d_{1,2} and
   d_{2,1}, d_{2,2}, d_{2,3}; all jobs J_{1,1}, J_{1,2}, J_{2,1}, J_{2,2}, J_{2,3}
   complete by their deadlines]

- with RM or DM (priority T_2 > T_1) the system is not feasible:
  here J_{1,1} misses its deadline

  [figure: same releases and deadlines with T_2 at the higher priority;
   J_{1,1} is still incomplete at its deadline d_{1,1}]
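The counterexample can be checked with a small discrete-time simulator of preemptive fixed-priority scheduling (a sketch, not taken from the slides; integer time steps are assumed). Tasks are given as (p, e, D), highest priority first:

```python
def simulate_fixed_priority(tasks, horizon):
    """Preemptive fixed-priority scheduling, one time unit per step.
    tasks: list of (period p, execution e, relative deadline D), highest priority first.
    Returns the (task index, release time) pairs of jobs that miss their deadline."""
    # one entry per job released before the horizon:
    # [task index, release time, absolute deadline, remaining execution]
    jobs = [[i, r, r + D, e]
            for i, (p, e, D) in enumerate(tasks)
            for r in range(0, horizon, p)]
    misses = []
    for t in range(horizon):
        ready = [j for j in jobs if j[1] <= t and j[3] > 0]
        if ready:
            ready.sort(key=lambda j: j[0])   # lowest task index = highest priority
            ready[0][3] -= 1                 # run the highest-priority ready job
        misses += [(j[0], j[1]) for j in jobs if j[3] > 0 and j[2] == t + 1]
    return misses

# T_1 = (9, 3, 10), T_2 = (6, 4, 8); the hyperperiod is 18
print(simulate_fixed_priority([(9, 3, 10), (6, 4, 8)], 18))  # priority T_1 > T_2: []
print(simulate_fixed_priority([(6, 4, 8), (9, 3, 10)], 18))  # RM/DM order: [(1, 0)], i.e. J_1,1 misses
```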
Optimality of RM and DM?

We have already seen that fixed-priority scheduling algorithms are not optimal
for tasks with arbitrary periods.
We have also seen that:
1. RM is optimal in the special case when tasks are simply periodic and
   relative deadlines D_i are not shorter than periods p_i;
2. RM is optimal among fixed-priority algorithms when relative deadlines D_i
   are equal to periods p_i.
In the general case, the fixed-priority RM and DM algorithms are not optimal;
nevertheless they are often used because they lead to more predictable and
stable systems.
Among all the fixed-priority algorithms, DM is the best because it is optimal
also when relative deadlines are shorter than periods.
Optimality of DM

Theorem: If a task set with D_i ≤ p_i can be feasibly scheduled by a fixed-priority
algorithm, then it can be feasibly scheduled by DM.

Proof:
1. if all tasks are feasible at their critical instants, then they are feasible
   in any other condition; so we consider the (worst-case) critical instants;
2. consider a set of 2 periodic tasks T_1 and T_2 with D_1 < D_2: if priorities
   are not assigned according to DM, then T_2 has the higher priority:

   [figure: T_2 executes e_2 first, then T_1 executes e_1 with response time w_1,
    within its deadline D_1; D_1, D_2, p_1 and p_2 shown]

   from the figure it is easy to see that, at critical instants, the schedule
   is feasible iff e_1 + e_2 ≤ D_1;
3. we now prove that if this condition holds, then the 2 tasks are
   schedulable also with DM.
Optimality of DM (2)

We prove that if e_1 + e_2 ≤ D_1, then the 2 tasks are schedulable also with DM:
- let us invert the priorities of the 2 tasks so that they respect the DM rule
  (D_1 < D_2 and T_1 has now the higher priority);
- let us consider the jobs J_{1,1} (of T_1) and J_{2,1} (of T_2) starting at a
  critical instant:

  [figure: T_1 executes first, so w_1 = e_1; then T_2 executes e_2, with response
   time w_2; D_1, D_2, p_1 and p_2 shown]

- the response time w_1 of J_{1,1} is shorter than in the previous case (J_{1,1} is
  not delayed by T_2): if J_{1,1} did not miss its deadline in the previous case,
  it will not miss it now;
- since e_1 + e_2 ≤ D_1, the response time w_2 of J_{2,1} is equal to e_1 + e_2
  (J_{2,1} is delayed by one and only one execution of T_1: since
  e_1 + e_2 ≤ D_1 ≤ p_1, the second job of T_1 is released after J_{2,1} has
  completed);
- then w_2 = e_1 + e_2 ≤ D_1 < D_2  →  J_{2,1} will not miss its deadline;
- the 2 tasks are feasible; the result can be extended to n tasks.

Why is this argument not valid for RM? Suppose D_2 < e_1 + e_2 < D_1, while
p_2 > p_1: RM would still give T_1 the higher priority, and J_{2,1} would then
miss its deadline D_2.
U_lub for 2 tasks with RM

To compute the least upper bound U_lub of the processor utilization factor for a
set of 2 periodic tasks T_1 and T_2, with D_i = p_i, scheduled by the RM algorithm:
1. we assume p_1 < p_2: according to RM, T_1 has the higher priority;
2. we compute the upper bound U_ub of their utilization factor by setting the
   execution times e_1 and e_2 so that the tasks fully utilize the processor;
3. we minimize the upper bound U_ub by adjusting all other parameters of the
   2 tasks (essentially their periods or, more precisely, as we will see, the
   ratio between their periods).
U_lub for 2 tasks with RM (2)

- let F = ⌊p_2/p_1⌋ be the number of periods of T_1 entirely contained in p_2;
- we consider the 2 cases:

case 1: all jobs of T_1 released in the first period of T_2, starting from a
critical instant, are completed within the first period of T_2; that is:
F·p_1 + e_1 ≤ p_2

  [figure: jobs of T_1 (period p_1) inside the first period p_2 of T_2;
   the last job J_{1,l} completes before p_2]

- in this situation, the largest possible value of e_2 for which the system is
  feasible is: e_2MAX = p_2 - (F + 1)·e_1 (with e_2 = e_2MAX the tasks fully
  utilize the processor)
- the corresponding U_ub is (note that U_ub varies with e_1):
  U_ub = (e_1/p_1) + (e_2MAX/p_2)
  U_ub = (e_1/p_1) + 1 - (F + 1)·e_1/p_2 = 1 + (e_1/p_2)·[(p_2/p_1) - (F + 1)]
- since [(p_2/p_1) - (F + 1)] is negative, U_ub decreases with e_1 and is minimum
  when e_1 has its maximum value
- since F·p_1 + e_1 ≤ p_2, the maximum value of e_1 is e_1MAX = p_2 - F·p_1
- U_ub is minimum when e_1 = p_2 - F·p_1 (when job J_{1,l} completes exactly at t = p_2)
U_lub for 2 tasks with RM (3)

case 2: the execution of the last job of T_1 (J_{1,l}) released in the first
period of T_2 overlaps the release of the second job of T_2; that is:
F·p_1 + e_1 > p_2

  [figure: jobs of T_1 (period p_1) inside the first period p_2 of T_2;
   the last job J_{1,l} is still executing at p_2]

- in this situation, the largest possible value of e_2 for which the system is
  feasible is: e_2MAX = F·p_1 - F·e_1 (with e_2 = e_2MAX the tasks fully utilize
  the processor)
- the corresponding U_ub is (again, U_ub varies with e_1):
  U_ub = (e_1/p_1) + (e_2MAX/p_2)
  U_ub = (e_1/p_1) + F·(p_1 - e_1)/p_2 = (e_1/p_1) + (F·p_1/p_2) - (F·e_1/p_2)
  U_ub = (F·p_1/p_2) + (e_1/p_2)·[(p_2/p_1) - F]
- since [(p_2/p_1) - F] is positive, U_ub increases with e_1 and is minimum when
  e_1 has its minimum value
- since F·p_1 + e_1 > p_2, the minimum value of e_1 is e_1MIN = p_2 - F·p_1
- U_ub is minimum when e_1 = p_2 - F·p_1
- in both cases U_ub is minimum when J_{1,l} terminates exactly at p_2.
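A short numeric sketch of the two cases (hypothetical periods p_1 = 4, p_2 = 7, so F = 1): sweeping e_1 shows U_ub decreasing in case 1, increasing in case 2, with the minimum at e_1 = p_2 - F·p_1:

```python
p1, p2 = 4, 7
F = p2 // p1                           # F = floor(p2/p1) = 1

def U_ub(e1):
    if F * p1 + e1 <= p2:              # case 1: last job of T_1 completes within p_2
        e2max = p2 - (F + 1) * e1
    else:                              # case 2: last job of T_1 overlaps the release at p_2
        e2max = F * (p1 - e1)
    return e1 / p1 + e2max / p2

for e1 in [1.0, 2.0, 3.0, 3.5, 4.0]:
    print(e1, round(U_ub(e1), 3))      # 0.964, 0.929, 0.893, 0.946, 1.0
# minimum at e1 = p2 - F*p1 = 3, where J_1,l completes exactly at p_2
```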
U_lub for 2 tasks with RM (4)

- in both cases U_ub is minimum when e_1 = p_2 - F·p_1 (J_{1,l} terminates exactly at p_2)
- using this value of e_1, the expression of U_ub becomes:
  U_ub = (F·p_1/p_2) + (e_1/p_2)·[(p_2/p_1) - F]
       = (F·p_1/p_2) + [(p_2 - F·p_1)/p_2]·[(p_2/p_1) - F]
  U_ub = (p_1/p_2)·{F + [(p_2/p_1) - F]·[(p_2/p_1) - F]}
- we note that U_ub is expressed as a function of the ratio k = p_2/p_1:
  U_ub(k) = [F + (k - F)²]/k
- to simplify the expression, let G = (p_2/p_1) - F   (G = k - ⌊k⌋);
  we note that 0 ≤ G < 1 and that G = 0 when p_2 is a multiple of p_1.
  Then we may write:
  U_ub = (p_1/p_2)·[F + G²]
  U_ub = (F + G²)/(p_2/p_1) = (F + G²)/[(p_2/p_1) - F + F]
  U_ub = (F + G²)/(G + F) = [F + G - (G - G²)]/(F + G)
  U_ub = 1 - [(G - G²)/(F + G)] = 1 - [G·(1 - G)/(F + G)]
- since G·(1 - G) is non-negative, U_ub increases with F and is minimum when F
  has its minimum value, i.e. when F = 1
- when F = 1, we have: U_ub = (1 + G²)/(1 + G)   (expressed as a function of G)
- we will now find the minimum of the function U_ub(G).
U_lub for 2 tasks with RM (5)

- let us find the minimum of the function U_ub(G) = (1 + G²)/(1 + G):
  dU_ub/dG = [2G·(1 + G) - (1 + G²)]/(1 + G)² = (G² + 2G - 1)/(1 + G)²
  dU_ub/dG = 0 when G² + 2G - 1 = 0, which has 2 solutions:
  G_1 = -1 - √2
  G_2 = -1 + √2
  since 0 ≤ G < 1, G_1 is discarded:
  G_2 = -1 + √2 is the value of G for which U_ub has its minimum value U_lub.
- then the least upper bound U_RM(2) of the utilization factor for 2 tasks
  scheduled with the RM algorithm is:
  U_RM(2) = [1 + (√2 - 1)²]/[1 + (√2 - 1)] = (4 - 2√2)/√2 = 2(√2 - 1)
  U_RM(2) = 2(2^(1/2) - 1) ≈ 0.828
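A quick numerical check of this minimization (a sketch, not in the slides): evaluating U_ub(G) = (1 + G²)/(1 + G) on a fine grid over [0, 1) recovers the minimum at G = √2 - 1 with value 2(√2 - 1):

```python
import math

def U_ub(G):
    return (1 + G**2) / (1 + G)

G_min = min((i / 10**6 for i in range(10**6)), key=U_ub)   # grid search over [0, 1)
print(G_min, math.sqrt(2) - 1)                 # both ~0.41421
print(U_ub(G_min), 2 * (math.sqrt(2) - 1))     # both ~0.82843
```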
U_lub for 2 tasks with RM (6)

U_RM(2) = 2(2^(1/2) - 1) ≈ 0.828
- this U_lub takes place when:
  - F = ⌊p_2/p_1⌋ = 1: only one full period p_1 is contained in p_2;
  - J_{1,2} terminates exactly at p_2, at the end of the critical time zone of T_2;
  - G = (p_2/p_1) - ⌊p_2/p_1⌋ = √2 - 1: the ratio k = p_2/p_1 is equal to √2.
- this situation is represented in the following figure:

  [figure: T_1 (period p_1) and T_2 with p_2 = p_1·√2; e_1, e_2 and the second
   job J_{1,2} of T_1 fill the interval (0, p_2) exactly]

- in this situation: e_2 + 2e_1 = p_2 and e_2 + e_1 = p_1, so
  (e_2 + 2e_1)/(e_2 + e_1) = p_2/p_1 = √2  →  e_2 + 2e_1 = √2·(e_2 + e_1)
  2e_1 - √2·e_1 = √2·e_2 - e_2  →  e_2·(√2 - 1) = e_1·(2 - √2) = e_1·√2·(√2 - 1)
  e_2 = e_1·√2
U_lub for 2 tasks with RM (7)

- to get a deeper insight into the problem, we may examine the expression of
  U_ub as a function of the ratio k = p_2/p_1:
  U_ub(k) = [F + (k - F)²]/k = (F + k² + F² - 2kF)/k = (k² - 2kF + F + F²)/k
  U_ub(k) = k - 2F + F·(F + 1)/k
- the following figure represents the upper bound U_ub as a function of k:

  [figure: plot of U_ub(k) = k - 2F + F·(F + 1)/k for k = p_2/p_1 from 1 to 10;
   U_ub dips to a local minimum between consecutive integer values of k and
   returns to 1 at every integer k]

- when k is an integer, k = F and
  U_ub(k) = k - 2k + k·(k + 1)/k = 1
U_lub for 2 tasks with RM (8)

- let us now find the values of k that produce the local minima of U_ub(k) for
  any given value of F:
  U_ub(k) = k - 2F + F·(F + 1)/k
  dU_ub/dk = 1 - F·(F + 1)/k²
  dU_ub/dk = 0 (and U_ub(k) is minimum) for k* = √[F·(F + 1)]
- for any given value of F, the corresponding minimum value of U_ub is:
  U* = 2·(√[F·(F + 1)] - F)
- the following table contains the first few values of k* for which the
  processor utilization has a local minimum U*:

  Values of k* = √[F·(F + 1)] and U* = 2·(√[F·(F + 1)] - F) as a function of F = ⌊p_2/p_1⌋

   F    k*     U*
   1    √2    0.828
   2    √6    0.899
   3    √12   0.928
   4    √20   0.944
   5    √30   0.954
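The table follows directly from k* = √(F(F+1)) and U* = 2(√(F(F+1)) - F); a short sketch reproducing it:

```python
import math

print(" F      k*       U*")
for F in range(1, 6):
    k_star = math.sqrt(F * (F + 1))
    U_star = 2 * (k_star - F)
    print(f"{F:2d}   {k_star:6.3f}   {U_star:6.3f}")
# F = 1: k* = 1.414, U* = 0.828  ...  F = 5: k* = 5.477, U* = 0.954
```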
U_lub for n tasks with RM

Theorem: A set of n independent, preemptable, periodic tasks with D_i = p_i can
be feasibly scheduled on a processor with the RM algorithm if its total
utilization U is not greater than:
  U_RM(n) = n·(2^(1/n) - 1)

Proof (informal):
- we will try to construct a set of n tasks that fully utilizes the processor
  and whose total utilization U is the smallest among all sets of n tasks that
  fully utilize the processor;
- we call period ratio of the set of n tasks the quantity q_{n,1} = p_n/p_1;
- in the first part of the proof we show that for any task set with period
  ratio q_{n,1} > 2 that fully utilizes the processor (with total utilization U)
  a corresponding set of n tasks may be found, with period ratio q_{n,1} ≤ 2,
  that fully utilizes the processor with total utilization U' < U;
- as a consequence, in constructing the task set with the smallest U, we may
  consider only task sets with period ratio q_{n,1} ≤ 2, i.e. with p_n ≤ 2p_1.
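In practice this theorem is used as a sufficient schedulability test: if U ≤ n(2^(1/n) - 1) the set is RM-schedulable; if U is larger, the test is simply inconclusive. A minimal sketch with hypothetical task sets:

```python
def rm_utilization_test(tasks):
    """Liu-Layland sufficient test for RM with D_i = p_i.
    tasks: list of (period p, execution time e). True if U <= n*(2^(1/n) - 1)."""
    n = len(tasks)
    U = sum(e / p for (p, e) in tasks)
    return U <= n * (2 ** (1 / n) - 1)

print(rm_utilization_test([(4, 1), (6, 1.5), (10, 1)]))   # U = 0.600 <= 0.780 -> True
print(rm_utilization_test([(4, 1.5), (6, 2), (10, 2)]))   # U = 0.908 >  0.780 -> False (inconclusive)
```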
U_lub for n tasks with RM (2)

First part of the proof:
for any task set with period ratio q_{n,1} > 2 that fully utilizes the processor
(with total utilization U) there is a corresponding set of n tasks, with period
ratio q_{n,1} ≤ 2, that fully utilizes the processor with total utilization U' < U:
- starting from any set of n tasks with period ratio q_{n,1} > 2 that fully
  utilizes the processor, we step by step construct another set of n tasks with
  period ratio q_{n,1} ≤ 2 and with a lower total utilization;
- for any task T_k whose period p_k is such that l·p_k < p_n ≤ (l+1)·p_k, with
  l ≥ 2, we transform T_k and T_n in this way (as shown in the next slide):
  - the new T_k' has period l·p_k and execution time e_k,
  - the new T_n' has period p_n and execution time e_n + (l-1)·e_k;
- the next slide will also show that the new task set:
  - fully utilizes the processor,
  - has the ratio p_n/p_k' ≤ 2,
  - has a lower processor utilization;
- the process is repeated for all tasks T_k for which p_n/p_k > 2.
U_lub for n tasks with RM (3)

  [figure: T_k with releases at 0, p_k, 2p_k, ..., (l-1)·p_k, l·p_k, (l+1)·p_k
   and execution time e_k; T_n with period p_n and execution time e_n;
   the new T_k' has period p_k' = l·p_k and execution time e_k' = e_k;
   the new T_n' has period p_n' = p_n and execution time e_n' = e_n + (l-1)·e_k]

By construction:
- T_k' and T_n' use the same amount of processor time used by T_k and T_n;
- p_n ≤ 2p_k'   (p_n/p_k' ≤ 2);
- U - U' = e_k/p_k - e_k/(l·p_k) - (l-1)·e_k/p_n = (1/(l·p_k) - 1/p_n)·(l-1)·e_k;
- U - U' > 0 since l·p_k < p_n.
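A sketch of this transformation applied to one pair (T_k, T_n) with hypothetical parameters, checking that the new pair has p_n/p_k' ≤ 2 and a lower utilization while using the same processor time over p_n:

```python
import math

def transform(p_k, e_k, p_n, e_n):
    """Replace T_k by T_k' = (l*p_k, e_k) and T_n by T_n' = (p_n, e_n + (l-1)*e_k),
    where l is the integer with l*p_k < p_n <= (l+1)*p_k (requires p_n/p_k > 2)."""
    l = math.ceil(p_n / p_k) - 1
    assert l >= 2 and l * p_k < p_n <= (l + 1) * p_k
    return (l * p_k, e_k), (p_n, e_n + (l - 1) * e_k)

# hypothetical pair with p_n / p_k = 5 > 2
(pk2, ek2), (pn2, en2) = transform(2, 0.5, 10, 1.0)
print((pk2, ek2), (pn2, en2))            # (8, 0.5) (10, 2.5)
print(pn2 / pk2 <= 2)                    # True
print(0.5 / 2 + 1.0 / 10,                # U  = 0.3500 (original pair)
      ek2 / pk2 + en2 / pn2)             # U' = 0.3125 < U
```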
U_lub for n tasks with RM (4)

Second part of the proof:
we consider only the case p_n ≤ 2p_1.
- we will try to construct a set of n tasks that fully utilizes the processor
  and whose total utilization U is the smallest among all sets of n tasks that
  fully utilize the processor;
- starting from a critical instant, if p_n ≤ 2p_1, in the feasible interval of
  the lowest-priority job J_{n,1} (i.e. (0, p_n)), each one of the other n-1
  higher-priority tasks (T_1, ..., T_{n-1}) will release 2 jobs;
- starting from a critical instant, the task set must keep the processor
  constantly busy until p_n and must complete the execution of J_{n,1} and of
  all the other 2(n-1) jobs of higher-priority tasks released before p_n:
  then the task set is feasible and fully utilizes the processor.
U_lub for n tasks with RM (5)

- a possible schedule that is feasible and fully utilizes the processor is the
  one in the figure below; the relationships between periods and execution
  times satisfy the equations (*):

  (*)  e_k = p_{k+1} - p_k   for k = 1, 2, ..., n-1
       e_n = p_n - 2·Σ_{k=1..n-1} e_k

  [figure: T_1 with releases at 0, p_1, 2p_1; T_2 with period p_2; T_3 with
   period p_3; ...; T_{n-1} with period p_{n-1}; T_n with period p_n; the first
   jobs execute back to back starting at 0 and the second job of each higher
   priority task starts at its release time]

  other equalities that follow from (*):
  p_1 = Σ_{k=1..n} e_k
  p_n - p_1 = Σ_{k=1..n-1} e_k
  e_n = 2p_1 - p_n

- we will show that such a schedule, which satisfies the equations (*), has the
  minimum U among all schedules that fully utilize the processor.
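A sketch that builds a task set satisfying equations (*) from a chosen list of periods with p_n ≤ 2p_1 (the periods below are hypothetical); with periods in geometric progression of ratio 2^(1/n) its utilization equals n(2^(1/n) - 1):

```python
def liu_layland_set(periods):
    """periods: strictly increasing list with periods[-1] <= 2 * periods[0].
    Returns tasks (p_k, e_k) satisfying (*): e_k = p_(k+1) - p_k, e_n = 2*p_1 - p_n."""
    assert all(periods[i] < periods[i + 1] for i in range(len(periods) - 1))
    assert periods[-1] <= 2 * periods[0]
    e = [periods[k + 1] - periods[k] for k in range(len(periods) - 1)]
    e.append(2 * periods[0] - periods[-1])
    return list(zip(periods, e))

n = 4
periods = [2 ** (k / n) for k in range(n)]            # 1, 2^(1/4), 2^(2/4), 2^(3/4)
tasks = liu_layland_set(periods)
U = sum(e / p for (p, e) in tasks)
print(round(U, 3), round(n * (2 ** (1 / n) - 1), 3))  # 0.757 0.757
```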
U_lub for n tasks with RM (6)

A different schedule that is feasible and fully utilizes the processor, but with
periods and execution times that do not satisfy equations (*), may be obtained
by modifying the schedule, increasing or decreasing the execution time of one
of the tasks.

Let us first increase (WLOG) the execution time of T_1 (see figure):
  e_1' = e_1 + ε = p_2 - p_1 + ε
- this increment would cause all jobs of all lower-priority tasks to start
  execution ε time units later than in the original system; as a result J_{n,1}
  would not terminate at p_1: its remaining execution time could be resumed
  only at time p_n + ε, beyond its deadline.

  [figure: the same schedule with e_1 increased by ε; every later job is pushed
   right by ε, and J_{n,1} is preempted with ε execution remaining, missing its
   deadline p_n]
U_lub for n tasks with RM (7)

- to keep the new system schedulable, the execution time of some other task T_k
  must be reduced by the same amount ε (see figure):
  e_k' = e_k - ε = p_{k+1} - p_k - ε
- the remaining tasks in this new system still satisfy equations (*);
- it is easy to see that the total utilization U' of this new system is greater
  than the total utilization U of the original system:
  U' - U = e_1'/p_1 + e_k'/p_k - e_1/p_1 - e_k/p_k
  U' - U = ε/p_1 - ε/p_k > 0   since p_1 < p_k

  [figure: the schedule with e_1 increased by ε and e_k decreased by ε]
U_lub for n tasks with RM (8)

Let us now decrease (WLOG) the execution time of T_1 (see figure):
  e_1' = e_1 - ε = p_2 - p_1 - ε
- this decrement would cause the first jobs of all lower-priority tasks to start
  execution ε time units earlier than in the original system;
- as a result J_{n,1} would terminate at p_1 - ε and the processor would then
  remain idle for ε time units before starting the execution of J_{1,2} at t = p_1;
- the processor would remain idle again for ε more time units from the instant
  at which J_{1,2} completes (at t = p_2 - ε) to when J_{2,2} is released (at p_2);
- in the time interval (0, p_2) the total idle time would be 2ε.

  [figure: the schedule with e_1 decreased by ε; two idle intervals of length ε
   appear, one ending at p_1 and one ending at p_2]
U_lub for n tasks with RM (9)

- to keep the processor fully utilized in the new system, the execution time of
  some other task T_k must be incremented by the same amount 2ε:
  e_k' = e_k + 2ε = p_{k+1} - p_k + 2ε
- the remaining tasks in this new system still satisfy equations (*);
- it is easy to see that also the total utilization U' of this new system is
  greater than the total utilization U of the original system:
  U' - U = e_1'/p_1 + e_k'/p_k - e_1/p_1 - e_k/p_k
  U' - U = 2ε/p_k - ε/p_1
  U' - U > 0   since p_k ≤ 2p_1

  [figure: the schedule with e_1 decreased by ε and e_k increased by 2ε]
U_lub for n tasks with RM (10)

We have proved that the sets of n tasks whose parameters satisfy equations (*)
(and p_n ≤ 2p_1) have a processor utilization U(n) smaller than that of any set
of n tasks that fully utilizes the processor but does not satisfy (*). Let us
express U(n) as a function of the ratios between the periods of the tasks,
q_{k,i} = p_k/p_i:

U(n) = Σ_{k=1..n} e_k/p_k = Σ_{k=1..n-1} [e_k/p_k] + e_n/p_n      {apply equations (*)}
U(n) = Σ_{k=1..n-1} [(p_{k+1} - p_k)/p_k] + (p_n - 2·Σ_{k=1..n-1} e_k)/p_n
U(n) = Σ_{k=1..n-1} [p_{k+1}/p_k - p_k/p_k] + 1 - (2/p_n)·Σ_{k=1..n-1} (p_{k+1} - p_k)
U(n) = Σ_{k=1..n-1} [p_{k+1}/p_k - 1] + 1 - (2/p_n)·(p_n - p_1)
U(n) = Σ_{k=1..n-1} (q_{k+1,k}) - (n-1) + 1 - 2·(p_n - p_1)/p_n
U(n) = Σ_{k=1..n-1} (q_{k+1,k}) - n + 2 - 2·(p_n - p_1)/p_n
U(n) = Σ_{k=1..n-1} (q_{k+1,k}) - n + 2·(1 - (p_n - p_1)/p_n)
U(n) = Σ_{k=1..n-1} (q_{k+1,k}) - n + 2·p_1/p_n
U(n) = Σ_{k=1..n-1} (q_{k+1,k}) - n + 2/(p_n/p_1)     {p_n/p_1 = (p_n/p_{n-1})·(p_{n-1}/p_{n-2})···(p_2/p_1)}
U(n) = q_{2,1} + q_{3,2} + ··· + q_{n,n-1} - n + 2/(q_{2,1}·q_{3,2}···q_{n,n-1})
U_lub for n tasks with RM (11)

When p_n ≤ 2p_1, the utilization U(n) of a set of n tasks that satisfies (*) is a
function of the ratios between adjacent periods, q_{k+1,k} = p_{k+1}/p_k:

U(n) = q_{2,1} + q_{3,2} + ··· + q_{n,n-1} - n + 2/(q_{2,1}·q_{3,2}···q_{n,n-1})

This is a symmetrical function of the n-1 adjacent period ratios (its value is
the same for any permutation of its n-1 variables).
Let us compute its first and second partial derivatives with respect to any one
of its variables:

∂U(n)/∂q_{k+1,k} = 1 - 2·(q_{2,1}···q_{k,k-1}·q_{k+2,k+1}···q_{n,n-1})/(Π_{i=1..n-1} q_{i+1,i})²
∂U(n)/∂q_{k+1,k} = 1 - 2/(q_{2,1}···q_{k,k-1}·(q_{k+1,k})²·q_{k+2,k+1}···q_{n,n-1})
∂²U(n)/∂q²_{k+1,k} = 4/(q_{2,1}···q_{k,k-1}·(q_{k+1,k})³·q_{k+2,k+1}···q_{n,n-1})
∂²U(n)/∂q²_{k+1,k} > 0

The second partial derivative is greater than zero:
→ U(n) is a symmetrical convex function of the q_{k+1,k} (k = 1, 2, ..., n-1);
→ U(n) has a unique minimum, and this minimum is the schedulable utilization of
  RM for a set of n tasks: U_lub(n) = U_RM(n);
→ U(n) is minimum when each q_{k+1,k} is such that ∂U(n)/∂q_{k+1,k} = 0.
U_lub for n tasks with RM (12)

To find the value of U_RM(n) we must find the values of the variables q_{k+1,k}
(k = 1, 2, ..., n-1) for which all the partial derivatives of U(n) are zero.
Since U(n) is a symmetrical function of its n-1 variables, its minimum is
reached when all the n-1 variables take the same value, the one for which the
partial derivatives are zero:
- the partial derivative with respect to the generic variable q_{k+1,k} is:
  ∂U(n)/∂q_{k+1,k} = 1 - 2/(q_{2,1}···q_{k,k-1}·(q_{k+1,k})²·q_{k+2,k+1}···q_{n,n-1})
- when all the q_{i+1,i} are equal to q_{k+1,k}:
  ∂U(n)/∂q_{k+1,k} = 1 - 2/(q_{k+1,k})^n
- this partial derivative is zero when q_{k+1,k} = 2^(1/n)
- the corresponding minimum value of
  U(n) = q_{2,1} + q_{3,2} + ··· + q_{n,n-1} - n + 2/(q_{2,1}·q_{3,2}···q_{n,n-1}) is:
  U_RM(n) = (n-1)·2^(1/n) - n + 2/2^((n-1)/n)
  U_RM(n) = n·2^(1/n) - 2^(1/n) - n + 2·2^((1-n)/n) = n·2^(1/n) - 2^(1/n) - n + 2^(1+(1-n)/n)
  U_RM(n) = n·2^(1/n) - 2^(1/n) - n + 2^((n+1-n)/n) = n·2^(1/n) - 2^(1/n) - n + 2^(1/n) = n·2^(1/n) - n
  U_RM(n) = n·(2^(1/n) - 1)
- we have seen that the expression that gives U_RM for 2 tasks,
  U_RM(2) = 2·(2^(1/2) - 1),
  extends to the case of n tasks: U_RM(n) = n·(2^(1/n) - 1).
U_lub for n tasks with RM (13)

- it is rather easy to see (next slide) that:
  when n → ∞, U_RM(n) → ln 2 (≈ 0.693)
- the following table and figure give an idea of the values of U_RM(n):

   n    U_RM(n)
   1    1.000
   2    0.828
   3    0.780
   4    0.757
   5    0.743
   ∞    0.693

  [figure: plot of U_RM(n) for n = 1, ..., 20, decreasing from 1 toward the
   asymptote 0.693]
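The table values follow directly from U_RM(n) = n(2^(1/n) - 1); a short sketch, also showing the approach to ln 2:

```python
import math

for n in [1, 2, 3, 4, 5, 10, 100, 1000]:
    print(n, round(n * (2 ** (1 / n) - 1), 3))
# 1 1.0, 2 0.828, 3 0.78, 4 0.757, 5 0.743, 10 0.718, 100 0.696, 1000 0.693
print("ln 2 =", round(math.log(2), 3))   # 0.693
```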
U_lub for n tasks with RM (14)

U_RM(n) = n·(2^(1/n) - 1)

Let us substitute y = 2^(1/n) - 1:
  lim_{n→∞} y = 0
  1/n = log_2(y+1) = ln(y+1)/ln 2        (recall: log_z x = ln x / ln z)
  n = ln 2 / ln(y+1)
  U_RM(n) = n·y = y·ln 2 / ln(y+1)
  lim_{n→∞} U_RM(n) = lim_{y→0} [y·ln 2 / ln(y+1)] = ln 2 · lim_{y→0} [y / ln(y+1)]
  (recall l'Hôpital's rule: lim f/g = lim f'/g')
  (recall the derivative of the logarithm: (ln x)' = 1/x)
  lim_{y→0} [y / ln(y+1)] = lim_{y→0} 1/[1/(y+1)] = lim_{y→0} (y+1) = 1
  lim_{n→∞} U_RM(n) = ln 2
