
UNIT III

4  Greedy Method

4.1 General Method


Important Points to Remember
• The Greedy method is a straightforward method in which the decision about the solution is taken based on the information available at each step. This method is popular for obtaining optimized solutions.
• In the Greedy technique the solution is constructed through a sequence of steps, each expanding a partially constructed solution obtained so far, until a complete solution to the problem is reached.
• At each step the choice made should be,
  Feasible - It should satisfy the problem's constraints.
  Locally optimal - Among all feasible choices the best one is to be made.
  Irrevocable - Once a particular choice is made it should not be changed in subsequent steps.
• In short, while making a choice there should be a greed for the optimum solution.
Q.1 Explain the terms, feasible solution, optimal solution and objective function.
[JNTU : Part B, May-09, Marks 8, April-11, Marks 7]
Ans. :
Feasible solution - For solving a particular problem there exist n inputs and we need to obtain a subset of them that satisfies some constraints. Any subset that satisfies these constraints is called a feasible solution.
Optimal solution - From the set of feasible solutions, the particular solution that satisfies, or nearly satisfies, the objective of the function is called the optimal solution.
Objective function - The function that a feasible solution either minimizes or maximizes is called the objective function.

Q.2 Write the general characteristics of Greedy algorithm.
[JNTU : Part B, Marks 5]
Ans. : There are two characteristics of a Greedy algorithm -
1. Greedy choice property : By this property, a globally optimal solution can be arrived at by making a locally optimal choice. That means, for finding the solution to the problem we make whichever choice looks best for the current subproblem, and then solve the subproblem arising after that choice is made. This choice may depend upon the previously made choices, but it does not depend on any future choice. Thus in the greedy method, greedy choices are made one after another, reducing each given problem instance to a smaller one. The greedy choice property brings efficiency in solving the problem with the help of subproblems.

2. Optimal substructure : A problem shows an optimal substructure if an optimal solution to the problem contains optimal solutions to its subproblems. In other words, a problem has optimal substructure if the best next choice always leads to the optimal solution.

In this chapter we will first understand the concept of the Greedy method. Then we will discuss various examples for which the Greedy method is applied.

Q.3 Write the control abstraction for the greedy method.
[JNTU : Part B, May-12, Marks 7]
Ans. :
Algorithm Greedy(D, n)
// In the Greedy approach D is a domain from which
// a solution of size n is to be obtained.
{
    solution ← 0    // initially assume an empty solution
    for i ← 1 to n do
    {
        s ← Select(D)    // selection of a candidate from D
        if Feasible(solution, s) then    // check whether the selected candidate is feasible or not
            solution ← Union(solution, s)    // make feasible choices and build up the optimum solution
    }
    return solution
}
In the Greedy method the following activities are performed.
1. First we select some solution from the input domain.
2. Then we check whether the solution is feasible or not.
3. From the set of feasible solutions, the particular solution that satisfies or nearly satisfies the objective of the function is picked. Such a solution is called the optimal solution.
4. The Greedy method works in stages. At each stage only one input is considered at a time. Based on this input it is decided whether the particular input gives the optimal solution or not.

Q.4 Discuss the applications of Greedy Method.
[JNTU : Part B, May-13, Marks 7]
Ans. :
1) Job sequencing with deadlines
2) Knapsack problem
3) Minimum spanning trees
4) Single source shortest path algorithm

Q.5 Differentiate between divide and conquer and Greedy method.
[JNTU : Part B, Dec.-12, Marks 7, May-09, Marks 6]
Ans. :

Sr.No. | Divide and conquer | Greedy algorithm
1. | Divide and conquer is used to obtain a solution to the given problem. | Greedy method is used to obtain the optimum solution.
2. | In this technique, the problem is divided into small subproblems. These subproblems are solved independently. Finally all the solutions of the subproblems are collected together to get the solution to the given problem. | In the greedy method a set of feasible solutions is generated and the optimum solution is picked up.
3. | In this method, duplications in subsolutions are neglected. That means duplicate solutions may be obtained. | In the greedy method, the optimum selection is made without revising previously generated solutions.
4. | Divide and conquer is less efficient because of rework on solutions. | Greedy method is comparatively efficient, but there is no guarantee as such of getting the optimum solution.
5. | Examples : Quick sort, binary search. | Examples : Knapsack problem, finding minimum spanning tree.

Q.6 Explain the control abstraction of greedy method and compare this with dynamic programming.
[JNTU : Part B, May-12, Marks 15]
Ans. : Control abstraction - Refer Q.3.
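The control abstraction of Q.3 can also be written as a short executable sketch. The `feasible` and `select` parameters below stand in for the problem-specific routines the abstraction assumes, and the number-picking example is purely illustrative, not from the text.

```python
def greedy(domain, feasible, select):
    """Greedy control abstraction: repeatedly Select a candidate from the
    domain and keep it only if the partial solution stays Feasible."""
    solution = []
    candidates = list(domain)
    while candidates:
        s = select(candidates)           # pick the locally best candidate
        candidates.remove(s)
        if feasible(solution + [s]):     # check the problem's constraints
            solution = solution + [s]    # Union(solution, s)
    return solution

# Illustration: pick numbers whose sum stays <= 10, greedily preferring large ones.
picked = greedy([7, 5, 4, 2, 1],
                feasible=lambda sol: sum(sol) <= 10,
                select=max)
```

Running the illustration selects 7, then rejects 5 and 4 (they would exceed the capacity), then accepts 2 and 1, showing the feasible / locally optimal / irrevocable pattern of the abstraction.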
Comparison with Dynamic Programming

Sr.No. | Greedy method | Dynamic programming
1. | Greedy method is used for obtaining the optimum solution. | Dynamic programming is also used for obtaining the optimum solution.
2. | In the Greedy method a set of feasible solutions is generated and the optimum solution is picked up. | There is no special set of feasible solutions in this method.
3. | In the Greedy method the optimum selection is made without revising previously generated solutions. | Dynamic programming considers all possible sequences in order to obtain the optimum solution.
4. | In the Greedy method there is no guarantee as such of getting the optimum solution. | It is guaranteed that dynamic programming will generate an optimal solution, using the principle of optimality.

4.2 Job Sequencing with Deadlines

Q.7 State and explain Job sequencing with deadline problem.
Ans. : Consider that there are n jobs that are to be executed. At any time t = 1, 2, 3, ... exactly one job is to be executed. The profits pi are given; these profits are gained by the corresponding jobs. For obtaining a feasible solution we should take care that the jobs get completed within their given deadlines.
Let n = 4

i | pi | di
1 | 70 | 2
2 | 12 | 1
3 | 18 | 2
4 | 35 | 1

We will follow the following rules to obtain the feasible solution -
• Each job takes one unit of time.
• If a job starts before or at its deadline, its profit is obtained, otherwise no profit is obtained.
• The goal is to schedule the jobs so as to maximize the total profit.
• Consider all possible schedules and compute the minimum total time in the system.

Q.8 Give an algorithm for job sequencing with deadline. Also specify the time complexity of the algorithm.
Ans. : The algorithm for job sequencing is as given below.

Algorithm Job_seq(D, J, n)
// Problem description : This algorithm is for job
// sequencing using the Greedy method.
// D[i] denotes the ith deadline where 1 <= i <= n
// J[i] denotes the ith job
// D[J[i]] <= D[J[i+1]]
{
    // initially
    D[0] ← 0;
    J[0] ← 0;
    J[1] ← 1;
    count ← 1;
    for i ← 2 to n do
    {
        t ← count;
        while ((D[J[t]] > D[i]) AND (D[J[t]] != t)) do t ← t - 1;
        if ((D[J[t]] <= D[i]) AND (D[i] > t)) then
        {
            // insertion of the ith feasible job into the J array
            for s ← count to (t + 1) step -1 do
                J[s + 1] ← J[s];
            J[t + 1] ← i;
            count ← count + 1;
        } // end of if
    } // end of for
    return count;
}

A job will be inserted into the sequence J if and only if D[J[t]] != t. This also means that a job is processed only if it is within its deadline. The computing time taken by the above Job_seq algorithm is O(n^2), because the basic operation of building the sequence in array J lies within two nested loops.
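The greedy rule of Q.8 can be sketched in Python. This is an equivalent slot-based formulation (consider jobs in decreasing profit order, place each in the latest free unit-time slot at or before its deadline), not a literal transcription of Job_seq; the function name is illustrative.

```python
def job_sequencing(profits, deadlines):
    """Greedy job sequencing with deadlines.
    Jobs are 0-indexed; deadlines are 1-based unit-time slots.
    Returns (schedule in execution order, total profit)."""
    jobs = sorted(range(len(profits)), key=lambda i: -profits[i])
    slots = [None] * (max(deadlines) + 1)      # slots[t] = job run at time t
    for i in jobs:
        for t in range(deadlines[i], 0, -1):   # try the latest free slot first
            if slots[t] is None:
                slots[t] = i
                break                          # job placed; otherwise discarded
    schedule = [j for j in slots if j is not None]
    return schedule, sum(profits[j] for j in schedule)

# The n = 4 instance of Q.7: profits (70, 12, 18, 35), deadlines (2, 1, 2, 1).
sched, profit = job_sequencing([70, 12, 18, 35], [2, 1, 2, 1])
```

On the Q.7 instance the schedule runs job 4 (profit 35) at time 1 and job 1 (profit 70) at time 2, for a total profit of 105.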
Q.9 Solve the following instance of "Job sequencing with deadlines" problem : n = 7, profits (p1, p2, p3, ..., p7) = (3, 5, 20, 18, 1, 6, 30) and deadlines (d1, d2, ..., d7) = (1, 3, 4, 3, 2, 1, 2).
[JNTU : Part B, May-13, Marks 8]
Ans. :
Step 1 : We will arrange the profits pi in descending order. Then the corresponding deadlines will appear.

Profit   | 30 | 20 | 18 | 6  | 5  | 3  | 1
Job      | p7 | p3 | p4 | p6 | p2 | p1 | p5
Deadline | 2  | 4  | 3  | 1  | 3  | 1  | 2

Step 2 : Create an array J[ ] which stores the jobs. Initially J[ ] will be

  1   2   3   4   5   6   7
[ 0 | 0 | 0 | 0 | 0 | 0 | 0 ]

Step 3 : Add the ith job in array J[ ] at the index denoted by its deadline di. The first job is p7. The deadline for this job is 2. Hence insert p7 in the array J[ ] at the 2nd index.

  1   2   3   4   5   6   7
[   | p7 |   |   |   |   |   ]

Step 4 : The next job is p3. Insert it in array J[ ] at index 4.

  1   2   3   4   5   6   7
[   | p7 |   | p3 |   |   |   ]

Step 5 : The next job is p4. It has deadline 3. Therefore insert it at index 3.

  1   2   3   4   5   6   7
[   | p7 | p4 | p3 |   |   |   ]

Step 6 : The next job is p6, it has deadline 1. Hence place p6 at index 1.

  1   2   3   4   5   6   7
[ p6 | p7 | p4 | p3 |   |   |   ]

Step 7 : The next job is p2, it has deadline 3. But index 3 is already occupied and there is no empty slot at index <= 3. Just discard job p2. Similarly jobs p1 and p5 will get discarded.
Step 8 : Thus the optimal sequence which we will obtain is p6 - p7 - p4 - p3, i.e. 6-7-4-3. The maximum profit will be 6 + 30 + 18 + 20 = 74.

Q.10 Explain Job-sequencing with deadlines. Solve the following instance :
n = 5, (p1, p2, p3, p4, p5) = (20, 15, 10, 5, 1), (d1, d2, d3, d4, d5) = (2, 2, 1, 3, 3)
[JNTU : Part B, Dec.-12, Marks 8]
Ans. :
Step 1 : We will arrange the profits pi in descending order, along with the corresponding deadlines.

Profit   | 20 | 15 | 10 | 5  | 1
Job      | P1 | P2 | P3 | P4 | P5
Deadline | 2  | 2  | 1  | 3  | 3

Step 2 : Create an array J[ ] which stores the jobs. Initially J[ ] will be

  1   2   3   4   5
[ 0 | 0 | 0 | 0 | 0 ]

Step 3 : Add the ith job in array J[ ] at the index denoted by its deadline di. The first job is P1, its deadline is 2. Hence insert P1 in the array J[ ] at the 2nd index.

  1   2   3   4   5
[   | P1 |   |   |   ]

Step 4 : The next job is P2 whose deadline is 2, but J[2] is already occupied. We will search for an empty location < J[2]. The slot J[1] is empty, so insert P2 at J[1].
Step 5 : The next job is P3 whose deadline is 1. But as J[1] is already occupied, discard this job.
Step 6 : The next job is P4 with deadline 3. Insert P4 at J[3].

  1   2   3   4   5
[ P2 | P1 | P4 |   |   ]

Step 7 : The next job is P5 with deadline 3. But as there is no empty slot at index <= 3, we will discard this job.
Step 8 : Thus the optimal sequence which we will obtain is P2 - P1 - P4. The maximum profit will be 15 + 20 + 5 = 40.
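Because n is small in Q.9 and Q.10, the greedy answers can be cross-checked by exhaustive search. The helper below is not part of the text: it tries every ordering of the jobs, lets a job earn its profit only if it fits within its deadline, and keeps the best total.

```python
from itertools import permutations

def best_profit_brute(profits, deadlines):
    """Exhaustive check for job sequencing with deadlines.
    For each ordering, a job is scheduled (and consumes a unit slot)
    only if the current slot is no later than its deadline."""
    n, best = len(profits), 0
    for order in permutations(range(n)):
        t, total = 1, 0
        for i in order:
            if t <= deadlines[i]:      # job i finishes within its deadline
                total += profits[i]
                t += 1                 # only scheduled jobs consume a slot
        best = max(best, total)
    return best
```

Any optimal subset, listed first in increasing-deadline order, appears among the permutations, so the maximum found equals the true optimum; it agrees with the greedy profits 74 (Q.9) and 40 (Q.10).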
Q.11 Solve the following job sequencing problem (maximizing the profit by completing jobs based on their deadlines) using greedy algorithm.
N (Number of jobs) = 4
Profits associated with jobs (p1, p2, p3, p4) = (100, 10, 15, 27)
Deadlines associated with jobs (d1, d2, d3, d4) = (2, 1, 2, 1)
[JNTU : Part B]
Ans. : Let the profits associated with the jobs be (P1, P2, P3, P4) = (100, 10, 15, 27) and the deadlines associated with the jobs be (d1, d2, d3, d4) = (2, 1, 2, 1).
Step 1 : We will arrange the profits pi in descending order. Then the corresponding deadlines appear.

Profit   | 100 | 27 | 15 | 10
Job      | P1  | P4 | P3 | P2
Deadline | 2   | 1  | 2  | 1

Step 2 : Create an array J[ ] which stores the jobs. Initially J[ ] will be

  1   2   3   4
[ 0 | 0 | 0 | 0 ]

Step 3 : Add each job in array J[ ] at the index denoted by its deadline di. The first job is P1. The deadline for this job is 2. Hence insert P1 in J[ ] at the 2nd index.

  1   2   3   4
[   | P1 |   |   ]

Step 4 : The next job is P4. Insert it at index 1.

  1   2   3   4
[ P4 | P1 |   |   ]

Step 5 : The next job is P3 with deadline 2. But J[2] is already occupied, hence discard P3. Similarly job P2 will also be discarded.
Step 6 : Thus the final optimal sequence which we will obtain is P4 - P1. The maximum profit will be 27 + 100 = 127.

Q.12 How do you reduce/relate Job sequencing problem with traveling salesperson problem ?
[JNTU : Part B, April-11, Marks 15]
Ans. : We can write a job sequence as a vector (1, j2, j3, ..., jn) which implies that the tour starts at city 1, goes to the next city j2, and so on until the final city jn is reached. To complete the tour, the salesman must travel from jn back to city 1. The cost of the tour will then be

    cost = c(1, j2) + sum(k = 2 to n-1) c(jk, jk+1) + c(jn, 1)

where c(i, j) specifies the cost of traveling from city i to city j. When c(i, j) represents the distance between i and j, the objective of the problem is to minimize the total distance traveled.

Q.13 Suppose there are n jobs to be executed but only k processors that can work in parallel. The time required by job i is ti. Write an algorithm that determines which jobs are to be run on which processors and the order in which they should be run so that the finish time of the last job is minimized.
[JNTU : Part B, April-11, Marks 15]
Ans. : Refer Q.8.
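The tour-cost expression of Q.12 can be evaluated directly. The 4-city cost table below is hypothetical, used only to exercise the formula cost = c(1, j2) + Σ c(jk, jk+1) + c(jn, 1).

```python
def tour_cost(c, tour):
    """Cost of a salesman tour: start at tour[0], visit each city in order,
    then return to the starting city.  c maps (i, j) pairs to travel costs."""
    total = 0
    for k in range(len(tour) - 1):
        total += c[(tour[k], tour[k + 1])]
    return total + c[(tour[-1], tour[0])]   # close the tour back to the start

# Hypothetical symmetric 4-city instance (not from the text).
base = {(1, 2): 10, (1, 3): 15, (1, 4): 20, (2, 3): 35, (2, 4): 25, (3, 4): 30}
c = {}
for (i, j), w in base.items():
    c[(i, j)] = c[(j, i)] = w
```

For example, the tour (1, 2, 4, 3) costs 10 + 25 + 30 + 15 = 80, while (1, 2, 3, 4) costs 10 + 35 + 30 + 20 = 95, so the minimization picks among such sequences.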
4.3 The 0/1 Knapsack Problem

Q.14 Write an algorithm of greedy knapsack.
[JNTU : Part A, Nov.-16, Marks 2]
Ans. : The Knapsack problem can be stated as follows : Suppose there are n objects, i = 1, 2, ..., n. Each object i has some positive weight wi, and some profit value pi is associated with each object. The Knapsack can carry at most weight W. While solving the above mentioned Knapsack problem we have this capacity constraint. When we try to solve this problem using the Greedy approach our goal is -
1. Choose only those objects that give maximum profit.
2. The total weight of the selected objects should be <= W.
And then we can obtain the set of feasible solutions. In other words,

    maximize   sum(i = 1 to n) pi * xi
    subject to sum(i = 1 to n) wi * xi <= W

where the Knapsack can carry the fraction xi of an object i such that 0 <= xi <= 1 and 1 <= i <= n.

Q.15 Write procedure for GREEDY_KNAPSACK (P, W, M, X, N) where P and W contain profits and weights, M is Knapsack size and X is the solution vector.
[JNTU : Part B, May-03, 12, Marks 5]
Ans. :
Algorithm GREEDY_KNAPSACK (P, W, M, X, N)
// P[i] contains the profits of the items.
// W[i] contains the weights of the items,
// where 1 <= i <= N.
// X[i] is the solution vector.
// M is the Knapsack size.
{
    for i := 1 to N do
        X[i] := 0.0;
    for i := 1 to N do
    {
        if (W[i] > M) then break;    // item does not fit fully within the remaining capacity
        X[i] := 1.0;
        M := M - W[i];
    }
    if (i <= N) then
        X[i] := M / W[i];    // X[i] gives the fractional part of the output
}

Q.16 Consider that there are three items. Weight and profit value of each item is as given below,

i | wi | pi
1 | 18 | 30
2 | 15 | 21
3 | 10 | 18

and the Knapsack capacity is W = 20. Obtain the solution for the above knapsack problem.
[JNTU : Part B, Marks 5]
Ans. : Feasible solutions (x1, x2, x3) can be

Solution | x1  | x2   | x3
1        | 1/2 | 1/3  | 1/4
2        | 1   | 2/15 | 0
3        | 0   | 2/3  | 1
4        | 0   | 1    | 1/2

Let us compute sum(wi * xi) :
1. 1/2 * 18 + 1/3 * 15 + 1/4 * 10 = 16.5
2. 1 * 18 + 2/15 * 15 + 0 * 10 = 20
3. 0 * 18 + 2/3 * 15 + 1 * 10 = 20
4. 0 * 18 + 1 * 15 + 1/2 * 10 = 20

Let us compute sum(pi * xi) :
1. 1/2 * 30 + 1/3 * 21 + 1/4 * 18 = 26.5
2. 1 * 30 + 2/15 * 21 + 0 * 18 = 32.8
3. 0 * 30 + 2/3 * 21 + 1 * 18 = 32
4. 0 * 30 + 1 * 21 + 1/2 * 18 = 30

To summarize this,

sum(wi * xi) | sum(pi * xi)
16.5         | 26.5
20           | 32.8
20           | 32
20           | 30

Among these candidates, solution 2 gives the maximum profit and hence it turns out to be the optimum solution.
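The arithmetic of Q.16 can be checked mechanically; exact fractions avoid rounding noise. This only re-evaluates the four candidate vectors listed in the text, it does not search for new solutions.

```python
from fractions import Fraction as F

w = [18, 15, 10]        # weights of the three items
p = [30, 21, 18]        # profits of the three items
candidates = [          # the four feasible vectors examined in the text
    (F(1, 2), F(1, 3), F(1, 4)),
    (F(1),    F(2, 15), F(0)),
    (F(0),    F(2, 3),  F(1)),
    (F(0),    F(1),     F(1, 2)),
]

def evaluate(x):
    """Return (total weight, total profit) of a fractional solution vector."""
    return (sum(wi * xi for wi, xi in zip(w, x)),
            sum(pi * xi for pi, xi in zip(p, x)))

results = [evaluate(x) for x in candidates]
best = max(results, key=lambda wp: wp[1])   # candidate with maximum profit
```

The computed pairs match the summary table (16.5, 26.5), (20, 32.8), (20, 32) and (20, 30), and the maximum-profit candidate is solution 2.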
Q.17 Find the optimal solution of the Knapsack instance n = 7, M = 15, (p1, p2, ..., p7) = (10, 5, 15, 7, 6, 18, 3) and (w1, w2, ..., w7) = (2, 3, 5, 7, 1, 4, 1).
[JNTU : Part B, Marks 5]
Ans. : For solving this problem, we will find out the fractions of the weights that can be used to fill up the Knapsack, satisfying the given capacity, with maximum profit. We will consider various candidate solutions for the same. Feasible solutions can be

Solution | x1  | x2  | x3  | x4  | x5 | x6  | x7
1        | 1/2 | 1/3 | 1/4 | 1/7 | 1  | 3/4 | 1
2        | 1/2 | 1/3 | 1   | 1/7 | 0  | 1   | 1/2
3        | 1   | 1/3 | 1   | 1/7 | 1  | 1   | 1
4        | 1   | 1/3 | 1   | 1/7 | 0  | 1   | 1
5        | 1   | 2/3 | 1   | 0   | 1  | 1   | 1

Solution 1 :
sum(wi * xi) = (2 * 1/2) + (3 * 1/3) + (5 * 1/4) + (7 * 1/7) + (1 * 1) + (4 * 3/4) + (1 * 1)
             = 1 + 1 + 1.25 + 1 + 1 + 3 + 1 = 9.25
sum(pi * xi) = (10 * 1/2) + (5 * 1/3) + (15 * 1/4) + (7 * 1/7) + (6 * 1) + (18 * 3/4) + (3 * 1)
             = 5 + 1.67 + 3.75 + 1 + 6 + 13.5 + 3 = 33.92

Solution 2 :
sum(wi * xi) = (2 * 1/2) + (3 * 1/3) + (5 * 1) + (7 * 1/7) + (1 * 0) + (4 * 1) + (1 * 1/2)
             = 1 + 1 + 5 + 1 + 0 + 4 + 0.5 = 12.5
sum(pi * xi) = (10 * 1/2) + (5 * 1/3) + (15 * 1) + (7 * 1/7) + (6 * 0) + (18 * 1) + (3 * 1/2)
             = 5 + 1.67 + 15 + 1 + 0 + 18 + 1.5 = 42.17

Solution 3 :
sum(wi * xi) = (2 * 1) + (3 * 1/3) + (5 * 1) + (7 * 1/7) + (1 * 1) + (4 * 1) + (1 * 1)
             = 2 + 1 + 5 + 1 + 1 + 4 + 1 = 15
sum(pi * xi) = (10 * 1) + (5 * 1/3) + (15 * 1) + (7 * 1/7) + (6 * 1) + (18 * 1) + (3 * 1)
             = 10 + 1.67 + 15 + 1 + 6 + 18 + 3 = 54.67

Solution 4 :
sum(wi * xi) = (2 * 1) + (3 * 1/3) + (5 * 1) + (7 * 1/7) + (1 * 0) + (4 * 1) + (1 * 1)
             = 2 + 1 + 5 + 1 + 0 + 4 + 1 = 14
sum(pi * xi) = (10 * 1) + (5 * 1/3) + (15 * 1) + (7 * 1/7) + (6 * 0) + (18 * 1) + (3 * 1)
             = 10 + 1.67 + 15 + 1 + 0 + 18 + 3 = 48.67

Solution 5 :
sum(wi * xi) = (2 * 1) + (3 * 2/3) + (5 * 1) + (7 * 0) + (1 * 1) + (4 * 1) + (1 * 1)
             = 2 + 2 + 5 + 0 + 1 + 4 + 1 = 15
sum(pi * xi) = (10 * 1) + (5 * 2/3) + (15 * 1) + (7 * 0) + (6 * 1) + (18 * 1) + (3 * 1)
             = 10 + 3.33 + 15 + 0 + 6 + 18 + 3 = 55.33

From the above computations, clearly solution 5 gives the maximum profit. Hence solution 5 is the optimum solution.

Q.18 If p1/w1 >= p2/w2 >= ... >= pn/wn, prove that greedy knapsack generates an optimal solution to the given instance of the knapsack problem.
OR
If objects are selected in order of decreasing pi/wi, then prove that the algorithm knapsack finds an optimal solution.
[JNTU : Part B, May-08, Dec.-11, April-11, Marks 8]


Ans. : Proof : Let X = (x1, x2, ..., xn) be the solution generated for the knapsack problem by the greedy approach. If all xi = 1 then the solution X is itself optimal.
Now consider the least index j such that xj != 1. From the algorithm, xi = 1 when 1 <= i < j, and xi = 0 when j < i <= n, while 0 <= xj < 1.
Let Y = (y1, y2, ..., yn) be an optimal solution such that sum(wi * yi) = m, where m denotes the capacity of the Knapsack.
Let k be the least index with yk != xk. There are three possibilities -
1. If k < j then xk = 1 but yk != xk, and hence yk < xk.
2. If k = j then sum(wi * xi) = m and yi = xi for all i < k, so again yk < xk.
3. If k > j then sum(wi * yi) > m, and Y cannot be a feasible solution.
So yk < xk. Now increase yk to zk = xk and decrease as many of y(k+1), ..., yn as necessary so that the total capacity used is still m. This gives a new solution set Z = (z1, z2, ..., zn) with zi = xi for 1 <= i <= k and

    sum(k < i <= n) wi * (yi - zi) = wk * (zk - yk)

This can be formulated as -

    sum(1 <= i <= n) pi * zi = sum(1 <= i <= n) pi * yi + (zk - yk) * wk * (pk / wk)
                               - sum(k < i <= n) (yi - zi) * wi * (pi / wi)
                             >= sum(1 <= i <= n) pi * yi

since pi / wi <= pk / wk for i > k. If sum(pi * zi) > sum(pi * yi) then Y cannot be optimal, a contradiction; hence the sums are equal and Z is also optimal, agreeing with X in one more position. Repeating this transformation, the remaining cases are either Z = X or Z != X; ultimately Y is transformed into X, which proves that X too is optimal.
4.4 Minimum Cost Spanning Trees

Important Points to Remember
• There are two algorithms for obtaining the minimum spanning tree, namely - Prim's algorithm and Kruskal's algorithm.
• Strategy to obtain minimum spanning tree using Prim's Algorithm
1. Start from the source vertex. Pick up the minimum weighted edge.
2. Pick up the next edge by covering any of the already visited vertices.
3. While selecting the edge - just make sure that a circuit is not getting formed.
4. When all the vertices get visited then stop the procedure. Display the obtained minimum spanning tree.
• Strategy to obtain minimum spanning tree using Kruskal's Algorithm
1. Start from the source vertex. Pick up the minimum weighted edge.
2. Pick up any minimum weighted edge each time.
3. While selecting the edge - just make sure that a circuit is not getting formed.
4. When all the vertices get visited then stop the procedure.

Q.19 Explain the concept of minimum spanning tree.
[JNTU : Part A, Marks 3]
Ans. : A minimum spanning tree of a weighted connected graph G is a spanning tree with minimum or smallest weight.
[Fig. Q.19.1 : A graph and two of its spanning trees, out of which (b) is a minimum spanning tree]

Q.20 Differentiate between Prim's and Kruskal's algorithm.
[JNTU : Part B, May-09, 12, Marks 6]
Ans. :

Sr.No. | Prim's Algorithm | Kruskal's Algorithm
1. | Prim's algorithm initializes with a node. | Kruskal's algorithm initializes with an edge.
2. | In Prim's algorithm the next node is always selected from a previously selected node. | In Kruskal's algorithm the minimum weight edge is selected independent of earlier selections.
3. | In Prim's algorithm the graph must be connected. | Kruskal's algorithm can work with a disconnected graph.
4. | The time complexity of Prim's algorithm is O(V^2). | The time complexity of Kruskal's algorithm is O(E log V).
Example of Prim's Algorithm - Refer Q.21, Q.22.
Example of Kruskal's Algorithm - Refer Q.24, Q.25.

Q.21 Explain the Prim's algorithm with the appropriate example.
[JNTU : Part B, April-11, Marks 7, Nov.-16, Marks 10]
Ans. : Consider the graph given in Fig. Q.21.1.
[Fig. Q.21.1 : Graph]
Now, we will consider all the vertices first. Then we will select an edge with minimum weight. The algorithm proceeds by selecting adjacent edges with minimum weight. Care should be taken for not forming a circuit. (The step figures are not reproduced here; only the running totals survive.)
Step 1 : All vertices are taken, no edge is selected yet. Total weight = 0
Step 2 : The minimum weight edge is added. Total weight = 10
Step 3 : The next minimum weight adjacent edge is added. Total weight = 33
Step 4 : The next minimum weight adjacent edge is added. Total weight = 53
Step 5 : The next minimum weight adjacent edge is added. Total weight = 64
Step 6 : The next minimum weight adjacent edge is added. Total weight = 78
Step 7 : The last edge is added; all vertices are visited. Total weight = 90

Q.22 Write Prim's Algorithm.
[JNTU : Part B, Marks 5]
Ans. :
Algorithm Prim(G[0...Size-1, 0...Size-1], nodes)
// Problem Description : This algorithm is for finding the
// minimum spanning tree using Prim's algorithm.
// Input : Weighted graph G and total number of nodes.
// Output : Spanning tree gets printed with total path length.
{
    total ← 0;
    // tree[] marks the selected vertices list
    for i ← 0 to nodes - 1 do
        tree[i] ← 0;
    tree[0] ← 1;    // take the initial vertex
    for k ← 1 to nodes - 1 do
    {
        min_dist ← ∞;    // initially assign minimum distance as infinity
        for i ← 0 to nodes - 1 do
        {
            for j ← 0 to nodes - 1 do
            {
                // select an edge such that one vertex is selected and the
                // other is not, and the edge has the least weight
                if (G[i, j] AND ((tree[i] AND !tree[j]) OR (!tree[i] AND tree[j]))) then
                {
                    if (G[i, j] < min_dist) then
                    {
                        // picking up the minimum weight edge (v1, v2)
                        min_dist ← G[i, j];
                        v1 ← i;
                        v2 ← j;
                    }
                }
            }
        }
        Write(v1, v2, min_dist);
        tree[v1] ← tree[v2] ← 1;
        total ← total + min_dist;
    }
    Write("Total Path Length is ", total);
}
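Q.22's pseudocode translates almost line for line into Python. The 5-vertex adjacency matrix below is a made-up example (0 meaning "no edge"), not the graph of Q.21; because the matrix is symmetric, scanning only tree→non-tree pairs covers both directions of the selection test.

```python
INF = float("inf")

def prim(G):
    """Prim's algorithm on a symmetric adjacency matrix (0 = no edge).
    Returns the list of selected edges and the total tree weight."""
    n = len(G)
    in_tree = [False] * n
    in_tree[0] = True                  # take the initial vertex
    edges, total = [], 0
    for _ in range(n - 1):
        best, v1, v2 = INF, -1, -1
        for i in range(n):
            for j in range(n):
                # one endpoint inside the tree, the other outside, least weight
                if G[i][j] and in_tree[i] and not in_tree[j] and G[i][j] < best:
                    best, v1, v2 = G[i][j], i, j
        edges.append((v1, v2, best))
        in_tree[v2] = True
        total += best
    return edges, total

# Hypothetical 5-vertex weighted graph.
G = [[0, 2, 0, 6, 0],
     [2, 0, 3, 8, 5],
     [0, 3, 0, 0, 7],
     [6, 8, 0, 0, 9],
     [0, 5, 7, 9, 0]]
edges, total = prim(G)
```

On this example the tree collects edges of weight 2, 3, 5 and 6 for a total weight of 16, and the double scan over i and j is exactly the Θ(n²)-per-iteration work analysed in Q.23.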
Q.23 Give the time complexity of Prim's Algorithm.
[JNTU : Part B, Marks 5]
Ans. : The algorithm spends most of the time in selecting the edge with minimum length. Hence the basic operation of this algorithm is to find the edge with minimum path length. This can be given by the following formula.

    T(n) = sum(k = 1 to nodes-1) [ sum(i = 0 to nodes-1) 1 + sum(j = 0 to nodes-1) 1 ]

Here the outer sum is the time taken by the for (k = 1 to nodes-1) loop, and the two inner sums are the times taken by the for (i = 0 to nodes-1) and for (j = 0 to nodes-1) loops. We take the variable n for 'nodes' for the sake of simplicity of solving the equation. Then

    T(n) = sum(k = 1 to n-1) [((n - 1) - 0 + 1) + ((n - 1) - 0 + 1)]
         = sum(k = 1 to n-1) 2n
         = 2n[(n - 1) - 1 + 1] = 2n(n - 1)
         = 2n^2 - 2n

    T(n) = Theta(n^2)

But n stands for the total number of nodes or vertices in the tree. Hence we can also state

    Time complexity of Prim's Algorithm is Theta(|V|^2).

But if Prim's algorithm is implemented using a binary heap, with the graph created using an adjacency list, then its time complexity becomes Theta(E log2 V), where E stands for the total number of edges and V stands for the total number of vertices.
Q.24 Explain the Kruskal's algorithm with the appropriate example.
[JNTU : Part B, Marks 5]
Ans. : Consider the graph given in Fig. Q.24.1.
[Fig. Q.24.1 : Graph]
First we will select all the vertices. Then an edge with minimum weight is selected, even though it is not adjacent to the previously selected edge. Care should be taken for not forming a circuit. (The step figures are not reproduced here; only the running totals survive.)
Step 1 : All vertices are taken, no edge is selected yet. Total weight = 0
Step 2 : The minimum weight edge is added. Total weight = 10
Step 3 : The next minimum weight edge is added. Total weight = 21
Step 4 : The next minimum weight edge is added. Total weight = 33
...
Step 6 : The next minimum weight edge is added. Total weight = 87
Step 7 : The last edge is added; all vertices are visited. Total weight = 90

Q.25 Write Kruskal's algorithm that generates minimum spanning tree for every connected undirected graph.
[JNTU : Part B, May-12, Marks 15, Dec.-12, Marks 7]
Ans. :
Algorithm Spanning_tree()
// Problem Description : This algorithm finds the
// minimum spanning tree using Kruskal's algorithm.
// Input : The adjacency matrix graph G containing cost.
// Output : Prints the spanning tree with the total cost of
// the spanning tree.
{
    count ← 0;
    k ← 0;
    sum ← 0;
    for i ← 0 to tot_nodes - 1 do
        parent[i] ← -1;
    while (count != tot_nodes - 1) do
    {
        pos ← Minimum(tot_edges);    // finding the minimum cost edge
        if (pos = -1) then    // perhaps no more edges in the graph
            break;
        v1 ← G[pos].v1;
        v2 ← G[pos].v2;
        i ← Find(v1, parent);
        j ← Find(v2, parent);
        if (i != j) then
        {
            // storing the minimum edge in array tree[][], in which
            // the spanning tree edges are stored
            tree[k][0] ← v1;
            tree[k][1] ← v2;
            k ← k + 1;
            count ← count + 1;
            // accumulating the total cost of the MST by computing
            // the total of all the minimum distances
            sum ← sum + G[pos].cost;
            Union(i, j, parent);
        }
        G[pos].cost ← INFINITY;
    }
    if (count = tot_nodes - 1) then
    {
        // the minimum distance edges collected in array tree[][],
        // i.e. the spanning tree, are printed here
        for i ← 0 to tot_nodes - 2 do
            Write(tree[i][0], tree[i][1]);
        Write("Cost of Spanning Tree is ", sum);
    }
}
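A runnable sketch of Q.25's procedure: edges are taken in increasing cost order, and the circuit test uses a disjoint-set forest (the Find/Union of the algorithm above, here with path halving as the compression step). The 5-vertex edge list is a hypothetical example.

```python
def kruskal(n, edge_list):
    """Kruskal's algorithm: take edges in increasing weight order and skip
    any edge that would close a circuit.  edge_list holds (w, u, v) triples."""
    parent = list(range(n))

    def find(v):                        # find with path halving (compression)
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    mst, total = [], 0
    for w, u, v in sorted(edge_list):   # edges in increasing cost order
        ru, rv = find(u), find(v)
        if ru != rv:                    # endpoints in different trees: keep it
            parent[ru] = rv             # Union of the two trees
            mst.append((u, v, w))
            total += w
    return mst, total

# Hypothetical 5-vertex graph (same one used for the Prim sketch earlier).
edges = [(2, 0, 1), (6, 0, 3), (3, 1, 2), (8, 1, 3), (5, 1, 4), (7, 2, 4), (9, 3, 4)]
mst, total = kruskal(5, edges)
```

Both greedy strategies select a tree of total weight 16 on this graph, which is expected: any minimum spanning tree of a graph with distinct edge weights is unique.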
Q.26 Compute the time complexity of deriving minimum spanning tree from the weighted connected graph using Kruskal's algorithm.
[JNTU : Part B, April-11, Marks 7, Dec.-12, Marks 8]
Ans. : The time complexity of Kruskal's algorithm is O(E log V) or O(E log E), where E is the total number of edges and V is the total number of vertices.
Proof : Let E be the total number of edges. In Kruskal's algorithm the following two main functions are performed -
i) Determining the edge with minimum cost.
ii) Deleting the lowest cost edge from the list of edges.
These functions can be done efficiently if E is maintained as a priority queue (a min-heap on edge cost). The heap can be constructed in O(|E|) time, and each next minimum cost edge can then be extracted and deleted in O(log E) time. The circuit test is performed on the partially built forest (a collection of disjoint trees) using the Find and Union operations. As at most |E| edges are processed, the MST can be constructed in O(E log E) time. Since log E = O(log V) for a simple graph, this proves that the computing time of Kruskal's algorithm is O(E log E) = O(E log V).
Q.27 Show that in a complete graph with n vertices, the number of spanning trees generated can be greater than (2^(n-1) - 2).
[JNTU : Part B, April-11, Marks 8]
Ans. : For a complete graph with n vertices, the total number of spanning trees is n^(n-2).
For example, consider a complete graph with 3 vertices,
i.e. n^(n-2) = 3^(3-2) = 3^1 = 3
[Fig. Q.27.1 : The three spanning trees of the complete graph on 3 vertices]
Here 2^(n-1) - 2 = 2^2 - 2 = 2, and 3 > 2. Since n^(n-2) grows much faster than 2^(n-1) - 2, the number of spanning trees can be greater than (2^(n-1) - 2).

Q.28 Does either Prim's or Kruskal's algorithm work if there are negative edge weights ?
[JNTU : Part B, April-11, Marks 8]
Ans. : Yes, both Prim's and Kruskal's algorithms work when there are negative weights in the graph. Because in finding the minimum spanning tree using these algorithms we always select an edge with minimum weight, the order in which vertices are added to the minimum spanning tree depends only on the ordering of the weights of the edges. Hence negative edges have no effect on the generation of the minimum spanning tree.
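Q.27's bound can be checked numerically with Cayley's formula n^(n-2), which the answer quotes; the range of n below is arbitrary and only illustrative.

```python
def cayley(n):
    """Number of spanning trees of the complete graph K_n (Cayley's formula)."""
    return n ** (n - 2)

# Compare the spanning-tree count with the bound 2^(n-1) - 2 for n = 3..8.
checks = {n: (cayley(n), 2 ** (n - 1) - 2) for n in range(3, 9)}
```

For n = 3 this gives 3 trees against a bound of 2, and the gap widens quickly (16 vs 6 for n = 4, 125 vs 14 for n = 5, and so on).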
Q.29 Suppose we want to find the minimum spanning tree of the following graph (Fig. Q.29.1).
a) Run Prim's algorithm; whenever there is a choice of nodes, always use alphabetic ordering (i.e. start from node A). Draw a table showing the intermediate values of the cost array.
b) Run Kruskal's algorithm on the same graph. As the minimum spanning tree evolves, also show the disjoint-set data structure (use path compression).
[Fig. Q.29.1 : Graph]
Ans. : a) Prim's algorithm (the step figures are not reproduced here; the recoverable edge selections are listed)
Step 1 : Select the edge AB with minimum weight.
Step 2 : Select the next minimum weight edge.
Step 3 : Select the next minimum weight edge CD.
Step 4 : Select the next minimum weight edge AE.
Step 5 : Select the next edge with minimum weight.
Step 6 : Select the next edge with minimum weight FG.
Step 7 : Select the next edge with minimum weight, i.e. GH.
As all the vertices get visited, the algorithm stops.
b) Kruskal's algorithm (with the disjoint-set structure maintained using path compression)
Step 1 : Select the edge AB.
Step 2 : Select the edge FG.
Step 3 : Select the edge GH.
Step 4 : Select the edge GD.
Step 5 : Select the edge BC.
Step 6 : Select the edge AE.
Total path length = 12.

4.5 Single Source Shortest Path Problem

Q.30 Write the pseudocode of a simpler version of Dijkstra's algorithm that finds only the distances (i.e. the lengths of shortest paths but not the shortest paths themselves) from a given vertex to all other vertices of a graph represented by its weight matrix.
[JNTU : Part B, Dec.-11, Marks 15, May-17, Marks 5]
Ans. :
int path[10];    // assumed global; records the predecessor of each vertex

void Dijkstra(int tot_nodes, int cost[10][10], int source, int dist[])
{
    int i, j, v1, v2, min_dist;
    int s[10];
    for (i = 0; i < tot_nodes; i++)
    {
        dist[i] = cost[source][i];    // initially put the distance from source vertex to i
        s[i] = 0;                     // i is varied for each vertex
        path[i] = source;             // all the predecessors are initially the source
    }
    s[source] = 1;
    for (i = 1; i < tot_nodes; i++)
    {
        min_dist = infinity;
        v1 = -1;    // reset the previous value of v1
        for (j = 0; j < tot_nodes; j++)
        {
            if (s[j] == 0)
            {
                if (dist[j] < min_dist)
                {
                    min_dist = dist[j];    // finding the minimum distance
                    v1 = j;
                }
            }
        }
        s[v1] = 1;
        for (v2 = 0; v2 < tot_nodes; v2++)
        {
            if (s[v2] == 0)
            {
                if (dist[v1] + cost[v1][v2] < dist[v2])
                {
                    dist[v2] = dist[v1] + cost[v1][v2];
                    path[v2] = v1;
                }
            }
        }
    }
}

void display(int source, int destination, int dist[])
{
    int i;
    printf("\n Step by step shortest path is ...\n");
    for (i = destination; i != source; i = path[i])
    {
        printf("%d <- ", i);
    }
    printf("%d", source);
    printf("\n The length = %d", dist[destination]);
}

Q.31 Apply Dijkstra's algorithm for finding all shortest paths from single source 'P' in a given graph.
[Fig. Q.31.1 : Graph]
Ans. : We will always choose the vertex with the minimum cost. We will start with vertex 'P'. Hence
Step 1 : S = {P}, T = {Q, R, S, T, U}
d(P - Q) = 1        d(P - R) = infinity
d(P - S) = 6        d(P - U) = infinity
d(P - T) = 7
Step 2 : Choose vertex Q.
S = {P, Q}, T = {R, S, T, U}
d(P - S) = d(P - Q) + d(Q - S) = 5
d(P - T) = 7
d(P - R) = d(P - Q) + d(Q - R) = 2
d(P - U) = infinity
Step 3 : Choose vertex R.
S = {P, Q, R}, T = {S, T, U}
d(P - S) = d(P - Q) + d(Q - R) + d(R - S) = 4
d(P - T) = 7
d(P - U) = d(P - R) + d(R - U) = 2 + 1 = 3
Step 4 : Choose vertex U.
S = {P, Q, R, U}, T = {S, T}
d(P - S) = 4
d(P - T) = 7
Step 5 : Choose vertex S.
S = {P, Q, R, U, S}, T = {T}
d(P - T) = 7
Thus we get the shortest distances to all other vertices from vertex P: d(P - Q) = 1, d(P - R) = 2, d(P - U) = 3, d(P - S) = 4 and d(P - T) = 7.
[Fig. Q.31.2 : Shortest distances from P]
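The distance-only version of Q.30 can be combined with Q.31 into one runnable check. The edge weights below are inferred from the worked distances of Q.31 (e.g. d(P-R) = d(P-Q) + d(Q-R) = 2 forces Q-R = 1), so treat the graph as a reconstruction, not the original figure.

```python
import heapq

def dijkstra(adj, source):
    """Distance-only Dijkstra using a binary heap.
    adj maps each vertex to a list of (neighbour, weight) pairs."""
    dist = {v: float("inf") for v in adj}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                    # stale heap entry, vertex already settled
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w         # relax the edge (u, v)
                heapq.heappush(heap, (dist[v], v))
    return dist

# Undirected edge weights reconstructed from the Q.31 walkthrough.
edges = [("P", "Q", 1), ("Q", "R", 1), ("R", "U", 1), ("R", "S", 2),
         ("Q", "S", 4), ("P", "S", 6), ("P", "T", 7)]
adj = {v: [] for e in edges for v in e[:2]}
for u, v, w in edges:
    adj[u].append((v, w))
    adj[v].append((u, w))
dist = dijkstra(adj, "P")
```

The computed distances agree with the step-by-step table: Q = 1, R = 2, U = 3, S = 4 and T = 7 from source P.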