DAA-unit 3
4 Greedy Method
In the Greedy method a choice of solution is made at each step of the algorithm. At each step the choice made should be :
• Feasible - It should satisfy the problem's constraints.
• Locally optimal - It should be the best choice among all feasible choices available at that step.
• Irrevocable - Once made, the choice cannot be changed at subsequent steps.
In short, while making a choice there should be a greed for the optimum solution.
Q.1 Explain the terms feasible solution, optimal solution and objective function.
[JNTU : Part B, May-09, Marks 8, April-11, Marks 7]
Ans. :
Feasible solution - For solving a particular problem there exist n inputs and we need to obtain a subset that satisfies some constraints. Any subset that satisfies these constraints is called a feasible solution.
Optimal solution - From the set of feasible solutions, the particular solution that satisfies or nearly satisfies the objective of the function is called an optimal solution.
Objective function - The function of the inputs that is to be either minimized or maximized. A feasible solution that minimizes or maximizes the objective function is the optimal solution.
2. Optimal substructure : A problem shows optimal substructure if an optimal solution to the problem contains optimal solutions to the subproblems. In other words, a problem has optimal substructure if the best next choice always leads to the optimal solution.

In this chapter we will first understand the concept of Greedy method. Then we will discuss various examples for which Greedy method is applied.

Q.3 Write the control abstraction for the greedy method.
[JNTU : Part B, May-12, Marks 7]

Ans. :

Algorithm Greedy(D, n)
// In Greedy approach D is a domain from which
// the solution is to be obtained, of size n
{
    solution ← 0    // initially assume empty solution
    for i ← 1 to n do
    {
        s ← Select(D)    // selection of a candidate solution from D
        // check if the selected solution is feasible or not
        if Feasible(solution, s) then
            // make a feasible choice and select the optimum solution
            solution ← Union(solution, s)
    }
    return solution
}

In Greedy method the following activities are performed :
1. First we select some solution from the input domain.
2. Then we check whether the solution is feasible or not.
3. From the set of feasible solutions, we obtain the particular solution that satisfies or nearly satisfies the objective of the function. Such a solution is called the optimal solution.
4. The Greedy method works in stages. At each stage only one input is considered at a time. Based on this input it is decided whether the particular input gives the optimal solution or not.

Q.5 Differentiate between divide and conquer and Greedy method.
[JNTU : Part B, Dec.-12, Marks 7, May-09, Marks 6]

Ans. :

Sr.No. | Divide and conquer | Greedy algorithm
1. | Divide and conquer is used to obtain a solution to the given problem. | Greedy method is used to obtain the optimum solution.
2. | In this technique, the problem is divided into small subproblems. These subproblems are solved independently. Finally all the solutions of the subproblems are collected together to get the solution to the given problem. | In greedy method a set of feasible solutions is generated and the optimum solution is picked up.
3. | In this method, duplications in subsolutions are neglected. That means duplicate solutions may be obtained. | In greedy method, the optimum selection is made without revising previously generated solutions.
4. | Divide and conquer is less efficient because of rework on solutions. | Greedy method is comparatively efficient, but there is no guarantee as such of getting the optimum solution.
5. | Examples : Quick sort, binary search. | Examples : Knapsack problem, finding minimum spanning tree.

Applications of greedy method :
1) Job sequencing with deadlines
2) Knapsack problem
3) Minimum spanning trees
4) Single source shortest path algorithm

Q.6 Explain the control abstraction of greedy method and compare this with dynamic programming.
[JNTU : Part B, May-12, Marks 15]

Ans. : Control abstraction - Refer Q.3.
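The control abstraction above can be sketched in Python. This is a minimal illustration of mine, not code from the text: the `feasible` and `objective_key` callables stand in for the problem-specific Select, Feasible and Union routines of the pseudocode.

```python
def greedy(candidates, feasible, objective_key):
    """Generic greedy control abstraction: consider candidates in
    order of greedy desirability and keep each one only if the
    partial solution stays feasible."""
    solution = []
    for s in sorted(candidates, key=objective_key, reverse=True):
        if feasible(solution, s):      # check the problem's constraints
            solution.append(s)         # union step: extend the solution
    return solution

# Usage: make change for 63 with "largest coin first" as the greedy rule.
def coin_feasible(chosen, coin, amount=63):
    return sum(chosen) + coin <= amount

coins = [25, 25, 10, 10, 10, 5, 1, 1, 1, 1]
picked = greedy(coins, coin_feasible, objective_key=lambda c: c)
# picked == [25, 25, 10, 1, 1, 1], which sums to 63
```

Note that the choices are never revised: once a coin is accepted it stays in the solution, which is exactly the irrevocability property described above.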
TECHNICAL PUBLICATIONS - An up thrust for knowledge
Design and Analysis of Algorithms    4-3    Greedy Method
Comparison with Dynamic Programming

Sr.No. | Greedy method | Dynamic programming
1. | Greedy method is used for obtaining the optimum solution. | Dynamic programming is also used for obtaining the optimum solution.
2. | In Greedy method a set of feasible solutions is generated and the optimum solution is picked up. | There is no special set of feasible solutions in this method.
3. | In Greedy method the optimum selection is made without revising previously generated solutions. | Dynamic programming considers all possible sequences in order to obtain the optimum solution.
4. | In Greedy method there is no guarantee as such of getting the optimum solution. | It is guaranteed that dynamic programming will generate the optimal solution, using the principle of optimality.

4.2 Job Sequencing with Deadlines

Q.7 State and explain the job sequencing with deadlines problem.

Ans. : Consider that there are n jobs that are to be executed. At any time t = 1, 2, 3, ... exactly one job is to be executed. The profits pi are given; these profits are gained by the corresponding jobs. For obtaining a feasible solution we should take care that the jobs get completed within their given deadlines.

Let n = 4,

i | pi | di
1 | 70 | 2
2 | 12 | 1
3 | 18 | 2
4 | 35 | 1

We will follow these rules to obtain the feasible solution :
• Each job takes one unit of time.
• If a job starts before or at its deadline, its profit is obtained; otherwise no profit is obtained.
• The goal is to schedule the jobs so as to maximize the total profit.
• Consider all possible schedules and compute the minimum total time in the system.

Q.8 Give an algorithm for job sequencing with deadlines. Also specify the time complexity of the algorithm.

Ans. : The algorithm for job sequencing is as given below.

Algorithm Job_seq(D, J, n)
// Problem description : This algorithm is for job
// sequencing using Greedy method.
// D[i] denotes the ith deadline where 1 <= i <= n
// J[i] denotes the ith job such that D[J[i]] <= D[J[i+1]]
{
    // initially
    D[0] ← 0; J[0] ← 0;
    J[1] ← 1; count ← 1;
    for i ← 2 to n do
    {
        t ← count;
        while ((D[J[t]] > D[i]) AND (D[J[t]] != t)) do
            t ← t - 1;
        if ((D[J[t]] <= D[i]) AND (D[i] > t)) then
        {
            // insertion of the ith feasible job into array J
            for s ← count to (t + 1) step -1 do
                J[s + 1] ← J[s];
            J[t + 1] ← i;
            count ← count + 1;
        } // end of if
    } // end of for
    return count;
}

A job is inserted into the sequence J only if it can still be processed within its deadline. The computing time taken by the above Job_seq algorithm is O(n²), because the basic operation of building the sequence in array J lies within two nested loops.
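The same greedy rule can be sketched in Python. This version uses an explicit slot array rather than the book's shifted J array, which is an implementation choice of mine, not the text's exact code.

```python
def job_sequencing(profits, deadlines):
    """Greedy job sequencing with deadlines: consider jobs in
    decreasing order of profit and place each one in the latest
    free unit-time slot on or before its deadline."""
    n = len(profits)
    max_d = max(deadlines)
    slot = [0] * (max_d + 1)           # slot[t] = job scheduled at time t (1-based)
    order = sorted(range(n), key=lambda i: -profits[i])
    total = 0
    for i in order:
        # search backwards for the latest free slot not after the deadline
        for t in range(min(deadlines[i], max_d), 0, -1):
            if slot[t] == 0:
                slot[t] = i + 1        # store the 1-based job number
                total += profits[i]
                break
    schedule = [slot[t] for t in range(1, max_d + 1) if slot[t] != 0]
    return schedule, total

# The n = 4 instance above: profits (70, 12, 18, 35), deadlines (2, 1, 2, 1).
sched, profit = job_sequencing([70, 12, 18, 35], [2, 1, 2, 1])
# sched == [4, 1] and profit == 105
```

For the example instance, jobs 1 and 4 are scheduled (job 4 at time 1, job 1 at time 2) and jobs 2 and 3 are discarded, giving the maximum profit 70 + 35 = 105.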
Q.9 Solve the following instance of the "job sequencing with deadlines" problem : n = 7, profits (p1, p2, p3, ..., p7) = (3, 5, 20, 18, 1, 6, 30) and deadlines (d1, d2, ..., d7) = (1, 3, 4, 3, 2, 1, 2).
[JNTU : Part B, May-13, Marks 8]

Ans. :

Step 1 : We will arrange the profits pi in descending order; the corresponding jobs and deadlines appear along with them.

Profit   | 30 | 20 | 18 | 6  | 5  | 3  | 1
Job      | p7 | p3 | p4 | p6 | p2 | p1 | p5
Deadline | 2  | 4  | 3  | 1  | 3  | 1  | 2

Step 2 : Create an array J[ ] which stores the jobs; initially every slot of J[ ] is 0.

Steps 3 to 7 : Taking the jobs in the above order, place each at the free slot closest to, but not after, its deadline : p7 goes to slot 2, p3 to slot 4, p4 to slot 3 and p6 to slot 1. Jobs p2, p1 and p5 find no free slot on or before their deadlines and are discarded.

Step 8 : Thus the optimal sequence which we will obtain is 6 - 7 - 4 - 3. The maximum profit is 6 + 30 + 18 + 20 = 74.

Q.10 Explain job sequencing with deadlines. Solve the following instance :
n = 5, (P1, P2, P3, P4, P5) = (20, 15, 10, 5, 1), (d1, d2, d3, d4, d5) = (2, 2, 1, 3, 3)
[JNTU : Part B, Dec.-12, Marks 8]

Ans. :

Step 1 : We will arrange the profits Pi in descending order, along with the corresponding deadlines.

Profit   | 20 | 15 | 10 | 5  | 1
Job      | P1 | P2 | P3 | P4 | P5
Deadline | 2  | 2  | 1  | 3  | 3

Step 2 : Create an array J[ ] which stores the jobs. Initially J[ ] will be
1 | 2 | 3
0 | 0 | 0

Step 3 : Add each job to array J[ ] at the index denoted by its deadline. The first job is P1; the deadline for this job is 2, hence insert P1 in J[ ] at the 2nd index.

Step 4 : The next job is P2, also with deadline 2. J[2] is already occupied, so insert it at index 1.

Step 5 : The next job is P3 with deadline 1. But J[1] is already occupied, hence discard P3. The next job, P4 with deadline 3, is placed at index 3. Finally P5 (deadline 3) finds all slots up to its deadline occupied and is likewise discarded.

Step 6 : Thus the final optimal sequence which we obtain is P2 - P1 - P4. The maximum profit is 15 + 20 + 5 = 40.

Q. ... the time required by job i is ti. Write an algorithm that determines which jobs are to be run on which processor and the order in which they should be run so that the finish time of the last job is minimized.
[JNTU : Part B, April-11, Marks 15]

Ans. : Refer Q.8.

Q.12 How do you reduce/relate the job sequencing problem to the travelling salesperson problem ?
[JNTU : Part B, April-11, Marks 15]

4.3 The 0/1 Knapsack Problem

Q.14 Write an algorithm of greedy knapsack.
[JNTU : Part A, Nov.-16, Marks 2]

Ans. : The Knapsack problem can be stated as follows : Suppose there are n objects, i = 1, 2, ..., n. Each object i has some positive weight wi, and some profit value pi is associated with each object. The Knapsack can carry at the most weight W.

While solving the above mentioned Knapsack problem we have the capacity constraint. When we try to solve this problem using the Greedy approach our goal is :
1. Choose only those objects that give maximum profit.
2. The total weight of the selected objects should be <= W.

And then we can obtain the set of feasible solutions. In other words, the fractions xi are chosen so as to
maximize Σ pi·xi (1 <= i <= n)
subject to Σ wi·xi <= W, 0 <= xi <= 1.

The greedy algorithm, with objects pre-sorted so that p[i]/w[i] >= p[i+1]/w[i+1], is :

Algorithm GreedyKnapsack(W, n)
// p[1..n] and w[1..n] contain the profits and weights of n objects
// ordered by nonincreasing p[i]/w[i]; W is the knapsack capacity
{
    for i ← 1 to n do x[i] ← 0.0;
    U ← W;
    for i ← 1 to n do
    {
        if (w[i] > U) then break;
        x[i] ← 1.0;
        U ← U - w[i];
    }
    if (i <= n) then x[i] ← U / w[i];
}

Q. Consider that there are three items. The weight and profit value of each item is as given below, with W = 20. Obtain the solution for the knapsack problem.
[JNTU : Part B, Marks 5]

i | wi | pi
1 | 18 | 30
2 | 15 | 21
3 | 10 | 18

Ans. : The feasible solutions are as given below.

Sr. | x1  | x2   | x3  | Σ wixi | Σ pixi
1   | 1/2 | 1/3  | 1/4 | 16.5   | 26.5
2   | 1   | 2/15 | 0   | 20     | 32.8
3   | 0   | 2/3  | 1   | 20     | 32
4   | 0   | 1    | 1/2 | 20     | 30

Solution 2 gives the maximum profit and hence it turns out to be the optimum solution.

Q.17 Find the optimal solution of the Knapsack instance n = 7, M = 15, (p1, p2, ..., p7) = (10, 5, 15, 7, 6, 18, 3) and (w1, w2, ..., w7) = (2, 3, 5, 7, 1, 4, 1).
[JNTU : Part B, Marks 5]

Ans. : For solving this problem, we will find out the fraction of each weight that can be used to fill up the Knapsack while satisfying the given capacity M = 15. Several feasible solutions are tried; for example :

Solution 1 : (x1, ..., x7) = (1/2, 1/3, 1/4, 1/7, 1, 3/4, 1)
Σ wixi = 1 + 1 + 1.25 + 1 + 1 + 3 + 1 = 9.25
Σ pixi = 5 + 1.67 + 3.75 + 1 + 6 + 13.5 + 3 = 33.92

Solution 5 : (x1, ..., x7) = (1, 2/3, 1, 0, 1, 1, 1)
Σ wixi = 2 + 2 + 5 + 0 + 1 + 4 + 1 = 15
Σ pixi = (10·1) + (5·2/3) + (15·1) + (7·0) + (6·1) + (18·1) + (3·1)
       = 10 + 3.33 + 15 + 0 + 6 + 18 + 3 = 55.33

From the above computations, clearly solution 5 gives the maximum profit. Hence solution 5 is the optimum solution; it is exactly the solution obtained by selecting objects in decreasing order of pi/wi.

Q.18 If p1/w1 >= p2/w2 >= ... >= pn/wn, prove that GreedyKnapsack generates an optimal solution to the given instance of the knapsack problem.
[JNTU : Part B, May-08, Marks 8]
OR
If objects are selected in order of decreasing pi/wi, then prove that the algorithm GreedyKnapsack finds an optimal solution.

Ans. : Let x = (x1, ..., xn) be the solution generated by GreedyKnapsack and let y = (y1, ..., yn) be any optimal solution, with profit Σ pi·yi (1 <= i <= n). If x ≠ y, let k be the least index at which they differ; by the greedy rule xk > yk. Transform y by increasing yk to xk and decreasing as many of y(k+1), ..., yn as necessary so that the total weight used remains the same. Since pk/wk >= pi/wi for every i > k, this transformation never decreases Σ pi·yi. Repeating it converts y into x without any loss of profit, so Σ pi·xi >= Σ pi·yi and x is optimal.
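The ratio rule proved above can be sketched in Python; the function name and structure are mine, and the instance is the one from Q.17.

```python
def fractional_knapsack(profits, weights, capacity):
    """Greedy fractional knapsack: take objects in decreasing
    profit/weight order, splitting the last object if needed."""
    order = sorted(range(len(profits)),
                   key=lambda i: profits[i] / weights[i], reverse=True)
    total, room = 0.0, capacity
    x = [0.0] * len(profits)
    for i in order:
        if room <= 0:
            break
        take = min(weights[i], room)   # whole object, or the fitting fraction
        x[i] = take / weights[i]
        total += profits[i] * x[i]
        room -= take
    return x, total

# The Q.17 instance: n = 7, M = 15.
x, total = fractional_knapsack([10, 5, 15, 7, 6, 18, 3],
                               [2, 3, 5, 7, 1, 4, 1], 15)
# x == [1, 2/3, 1, 0, 1, 1, 1] and total == 55.33 (= 166/3)
```

The computed vector matches solution 5 above: object 4 (the worst ratio, p4/w4 = 1) is dropped entirely and object 2 is taken fractionally.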
4.4 Minimum Cost Spanning Trees

Important Points to Remember
• There are two algorithms for obtaining a minimum spanning tree, namely Prim's algorithm and Kruskal's algorithm.
• Strategy to obtain the minimum spanning tree : at every step add the minimum weight edge that does not create a cycle in the solution built so far.

Q.20 Differentiate between Prim's and Kruskal's algorithm.
[JNTU : Part B, May-09, 12, Marks 6]

Ans. :

Sr.No. | Prim's algorithm | Kruskal's algorithm
1. | Prim's algorithm initializes with a node. | Kruskal's algorithm initializes with an edge.
2. | In Prim's algorithm the next node selected is always adjacent to a previously selected node. | In Kruskal's algorithm the minimum weight edge is selected independent of the earlier selection.
3. | In Prim's algorithm the graph must be connected. | Kruskal's algorithm can work with a disconnected graph.
4. | The time complexity of Prim's algorithm is O(V²). | The time complexity of Kruskal's algorithm is O(E log V).

Q.21 Construct the minimum spanning tree of the graph of Fig. Q.21.1 using Prim's algorithm.

Fig. Q.21.1 Graph
(The graph and the step-by-step figures are not reproducible here. Step 1 starts with total weight = 0; at each later step the minimum weight edge leaving the tree is added, the running total passing through 10, ..., 53 and 64 before reaching the final spanning tree of total weight 90.)
Ans. :

Algorithm Prim(G, nodes)
// Problem description : This algorithm computes the minimum
// spanning tree of a weighted connected graph G given by its
// adjacency matrix, with the given number of nodes.
{
    total ← 0
    for i ← 0 to nodes - 1 do    // make the selected-vertices list
        tree[i] ← 0
    tree[0] ← 1    // take the initial vertex
    for k ← 1 to nodes - 1 do
    {
        min_dist ← ∞
        for i ← 0 to nodes - 1 do
        {
            for j ← 0 to nodes - 1 do
            {
                // consider only edges leading from selected to
                // unselected vertices, picking up the minimum edge
                if (tree[i] = 1 AND tree[j] = 0 AND G[i][j] ≠ 0) then
                    if (G[i][j] < min_dist) then
                    {
                        min_dist ← G[i][j]
                        v1 ← i
                        v2 ← j
                    }
            }
        }
        Write(v1, v2, min_dist)
        tree[v1] ← tree[v2] ← 1
        total ← total + min_dist
    }
    Write("Total path length is ", total)
}

Time is taken by the for (k = 1 to nodes - 1) loop together with the for (i = 0 to nodes - 1) and for (j = 0 to nodes - 1) scans. Taking the variable n for 'nodes' for the sake of simplicity of solving the equation,
2n[(n - 1) - 1 + 1] = 2n(n - 1) = 2n² - 2n
T(n) = Θ(n²)
Here n stands for the total number of nodes (vertices) in the graph, hence we can also state the complexity as Θ(V²). But if Prim's algorithm is implemented using a binary heap, with the graph created using an adjacency list, then its time complexity becomes Θ(E log₂ V), where E stands for the total number of edges and V stands for the total number of vertices.
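The pseudocode above translates to Python almost line for line. The adjacency matrix below is a small made-up example of mine, not the graph of Fig. Q.21.1.

```python
INF = float('inf')

def prim(G):
    """Prim's algorithm on an adjacency matrix; G[i][j] == 0 means
    no edge. Returns the MST edges and the total weight."""
    n = len(G)
    in_tree = [False] * n
    in_tree[0] = True                  # take the initial vertex
    edges, total = [], 0
    for _ in range(n - 1):
        min_dist, v1, v2 = INF, -1, -1
        for i in range(n):             # i already inside the tree
            for j in range(n):         # j still outside the tree
                if in_tree[i] and not in_tree[j] and 0 < G[i][j] < min_dist:
                    min_dist, v1, v2 = G[i][j], i, j   # minimum crossing edge
        edges.append((v1, v2, min_dist))
        in_tree[v2] = True
        total += min_dist
    return edges, total

# hypothetical symmetric example graph on 5 vertices
G = [[0, 2, 0, 6, 0],
     [2, 0, 3, 8, 5],
     [0, 3, 0, 0, 7],
     [6, 8, 0, 0, 9],
     [0, 5, 7, 9, 0]]
mst_edges, total = prim(G)
# total == 16, using the 4 edges (0,1), (1,2), (1,4), (0,3)
```

The triple loop mirrors the Θ(n²)-per-stage edge scan analysed above; the heap-based variant replaces the inner scans with priority-queue operations.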
Q.24 Explain Kruskal's algorithm with an appropriate example.
[JNTU : Part B, Marks 5]

Ans. : Consider the graph given below.
(The graph and the step-by-step figures are not reproducible here. Starting with total weight = 0, Kruskal's algorithm repeatedly adds the smallest remaining edge that does not form a cycle; the running totals after successive steps include 10, 21, 33, ..., 87, and the final spanning tree has total weight 90.)

The algorithm is as follows :

Algorithm spanning_tree()
// Problem description : This algorithm finds the minimum
// spanning tree using Kruskal's algorithm.
// Output : prints the spanning tree with the total cost
// of the spanning tree.
{
    count ← 0
    k ← 0
    sum ← 0
    for i ← 0 to tot_nodes do
        parent[i] ← -1
    while (count ≠ tot_nodes - 1) do
    {
        pos ← Minimum(tot_edges)    // finding the minimum cost edge
        if (pos = -1) then    // perhaps no more edges in the graph
            break
        v1 ← G[pos].v1
        v2 ← G[pos].v2
        i ← Find(v1, parent)
        j ← Find(v2, parent)
        if (i ≠ j) then
        {
            // tree[][] is an array in which the spanning tree
            // edges are stored
            tree[k][0] ← v1    // storing the minimum edge in array tree[]
            tree[k][1] ← v2
            k ← k + 1
            count ← count + 1
            sum ← sum + G[pos].cost    // accumulating the total cost of MST
            Union(i, j, parent)
        }
        G[pos].cost ← INFINITY    // delete the lowest cost edge from the list of edges
    }
    if (count = tot_nodes - 1) then
    {
        // the collected minimum distance edges, i.e. the spanning
        // tree, are printed here
        for i ← 0 to count - 1 do
            Write(tree[i][0], tree[i][1])
        Write("Cost of spanning tree is ", sum)
    }
}
These operations can be done efficiently if E is maintained as a priority queue, i.e. a min-heap keyed on edge cost, and the partially built spanning forest is maintained as a collection of disjoint (parent-pointer) trees. With a min-heap, the next cheapest edge is obtained in O(log |E|) time, and the construction of the heap itself takes only O(|E|) time. Using the union and find operations on the parent array to detect cycles, the MST can therefore be constructed in O(|E| log |E|) time. This shows that the computing time of Kruskal's algorithm is O(E log E).
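The procedure can be sketched in Python with a sorted edge list playing the role of the priority queue; the edge list is a hypothetical example of mine, not the figure's graph.

```python
def kruskal(n, edge_list):
    """Kruskal's algorithm: scan edges in nondecreasing cost order
    and add each edge whose endpoints lie in different trees of the
    growing forest (union-find on a parent array)."""
    parent = list(range(n))

    def find(v):                        # root of v's tree
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v

    tree, total = [], 0
    for cost, v1, v2 in sorted(edge_list):  # the O(E log E) sort dominates
        i, j = find(v1), find(v2)
        if i != j:                      # edge joins two different trees
            tree.append((v1, v2, cost))
            total += cost
            parent[i] = j               # union of the two trees
    return tree, total

# hypothetical graph on 5 vertices, edges as (cost, v1, v2)
edges = [(2, 0, 1), (3, 1, 2), (6, 0, 3), (8, 1, 3),
         (5, 1, 4), (7, 2, 4), (9, 3, 4)]
mst, total = kruskal(5, edges)
# total == 16 with 4 tree edges
```

Sorting once up front is equivalent to repeatedly extracting the minimum from a heap, and setting an edge's cost to INFINITY in the pseudocode corresponds to simply advancing past it here.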
Q.25 Construct the minimum spanning tree of the graph of Fig. Q.25.1 using i) Prim's algorithm ii) Kruskal's algorithm.
[JNTU : Part B, Dec.-09, ...-11, Marks 15]

Ans. : i) Prim's algorithm
(Fig. Q.25.1 and the step figures are not reproducible.)
Step 1 : Select an edge AB with minimum weight.
Step 3 : Select the next minimum weight edge CD.
Step 4 : Select the next minimum weight edge AE.
Step 5 : Select the next edge with minimum weight ...

Q.27 Show that in a complete graph with n vertices, the number of spanning trees generated can be greater than 2^(n-1) - 2.
[JNTU : Part B, April-11, Marks 8]

Ans. : For a complete graph with n vertices, the total number of spanning trees is n^(n-2). For example, consider a complete graph with 3 vertices :
n^(n-2) = 3^(3-2) = 3^1 = 3
(Fig. Q.27.1 shows the three spanning trees of this graph.)
Here 2^(n-1) - 2 = 2^2 - 2 = 2, and 3 > 2. Since n^(n-2) grows much faster than 2^(n-1) - 2, the number of spanning trees can indeed be greater than 2^(n-1) - 2.

Q.28 Does either Prim's or Kruskal's algorithm work if there are negative edge weights ?

Ans. : Yes. Both algorithms simply select, at each step, an edge of minimum weight. The order in which vertices and edges are added to the minimum spanning tree depends only on the ordering of the edge weights, not on their signs. Hence there is no difficulty with negative edge weights; both algorithms still produce a minimum spanning tree.
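The counts used in Q.27 can be cross-checked by exhaustively counting spanning trees of small complete graphs. The helper below is mine, for verification only; it tests every (n-1)-edge subset for acyclicity, which is feasible for small n.

```python
from itertools import combinations

def count_spanning_trees(n):
    """Count spanning trees of the complete graph K_n: an acyclic
    subset of n-1 edges on n vertices is exactly a spanning tree."""
    all_edges = list(combinations(range(n), 2))
    count = 0
    for subset in combinations(all_edges, n - 1):
        parent = list(range(n))
        def find(v):
            while parent[v] != v:
                v = parent[v]
            return v
        acyclic = True
        for u, v in subset:
            ru, rv = find(u), find(v)
            if ru == rv:               # edge closes a cycle
                acyclic = False
                break
            parent[ru] = rv
        if acyclic:
            count += 1
    return count

counts = {n: count_spanning_trees(n) for n in (3, 4, 5)}
# counts == {3: 3, 4: 16, 5: 125}, matching n^(n-2),
# and each value exceeds 2**(n-1) - 2
```

The results agree with Cayley's formula n^(n-2) (3, 16, 125) and each exceeds 2^(n-1) - 2 (2, 6, 14), as Q.27 claims.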
The display routine used by the Dijkstra program prints the shortest path by following the path[] array (filled in during the algorithm) back from the destination to the source :

void display(int source, int destination, int dist[])
{
    int i;
    printf("\n Step by step shortest path is ... \n");
    for (i = destination; i != source; i = path[i])
        printf("%d <- ", i);
    printf("%d", source);
    printf("\n The length = %d\n", dist[destination]);
}

Q.31 Apply Dijkstra's algorithm for finding all shortest paths from the single source 'P' in the given graph.

Fig. Q.31.1
(The graph figure is not reproducible. At every step we choose the vertex of T with the smallest tentative distance from P.)

Step 3 : Choose vertex R.
S = {P, Q, R}, T = {S, T, U}
d(P - S) = d(P - Q) + d(Q - R) + d(R - S) = 4
d(P - T) = 7
d(P - U) = d(P - R) + d(R - U) = 2 + 1 = 3

Step 4 : Choose vertex U.
S = {P, Q, R, U}, T = {S, T}
d(P - S) = 4
d(P - T) = 7

Step 5 : Choose vertex S.
S = {P, Q, R, U, S}, T = {T}
d(P - T) = 7
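Dijkstra's selection rule can be sketched in Python with a heap. The edge weights below are my invention (Fig. Q.31.1 is not available), chosen so that the resulting distances match the worked values above.

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths: repeatedly settle the
    unsettled vertex with the smallest tentative distance."""
    dist = {v: float('inf') for v in graph}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:                # stale heap entry, skip it
            continue
        for v, w in graph[u].items():
            if d + w < dist[v]:        # relax edge (u, v)
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

# hypothetical undirected graph over the vertices P, Q, R, S, T, U
graph = {
    'P': {'Q': 1, 'R': 2, 'T': 7},
    'Q': {'P': 1, 'R': 1},
    'R': {'P': 2, 'Q': 1, 'S': 2, 'U': 1},
    'S': {'R': 2},
    'T': {'P': 7},
    'U': {'R': 1},
}
d = dijkstra(graph, 'P')
# d['S'] == 4, d['U'] == 3, d['T'] == 7, as in the worked steps
```

The settling order P, Q, R, U, S, T reproduces the choices made in steps 3 to 5 above.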