
DESIGN & ANALYSIS OF

ALGORITHMS

Lecture # 1 & 2
INTRODUCTION
Objectives
■ Design and analysis of modern algorithms
■ Different variants
■ Accuracy
■ Efficiency
■ Comparing efficiencies
■ Motivation for thinking about new algorithms
■ Advanced designing techniques
■ Real world problems will be taken as examples
■ To develop an appreciation of the usefulness of this course

2
Expected Results
■ On successful completion, students will be able to
– Argue and prove correctness of algorithms
– Derive and solve mathematical models of problems
– Reason about when an algorithm calls for a certain approach
– Analyze average and worst-case running times
– Integrate approaches used in dynamic programming and greedy algorithms
– Use graph theory in problem solving
– Advanced topics such as
– Computational geometry, number theory etc.
– Several other algorithms such as
– String matching, NP-completeness, approximation algorithms, etc.
Recommended Material
■ Lecture Slides

■ Introduction to The Design & Analysis of


Algorithms by Anany Levitin

■ Computers and Intractability: A Guide to the


Theory of NP-Completeness by Garey and
Johnson

4
Tentative Grading Scheme
■ 10 Class Tasks (10%)
■ 5 Assignments (10%)
■ 3-5 Quizzes (10%)
■ 1 Mid Term (25%)
■ 1 Presentation (5%)
■ 1 Final Term (40%)

5
Course Access
■ LMS

■ YouTube Channel

■ WhatsApp Group

■ Email
sulaman87@gmail.com
Course Contents
What is an Algorithm?

■ A computer algorithm is a detailed step-by-step method for solving


a problem by using a computer.
■ An algorithm is a sequence of unambiguous instructions for
solving a problem in a finite amount of time.
■ An algorithm is a well-defined computational procedure that takes
some value, or set of values, as input and produces some value,
or set of values as output.
Algorithm
■ An algorithm is a sequence of instructions that one must perform
in order to solve a well-formulated problem.

■ A well-formulated problem is unambiguous and precise, leaving


no room for misinterpretation. The number of steps is always
finite.

■ In other words, an algorithm is a method for translating the inputs


into the outputs.
6 Characteristics of an Algorithm
Finiteness
■ Algorithms are finite in two aspects:
– Length of Algorithm
– Number of Computational Steps

■ If an algorithm is not of finite length, then we do not have the infinite


space/memory needed to store and implement it.
■ If the number of computational steps is not finite, then the algorithm
never terminates and the problem is never solved.
Popular Algorithms
■ Most basic and popular algorithms are
– Sorting algorithms
– Searching algorithms
■ Which algorithm is best?
– Mainly, it depends upon various factors, for example in case
of sorting:
■ The number of items to be sorted.
■ The extent to which the items are already sorted.
■ Possible restrictions on the item values.
■ The kind of storage device to be used etc.
Problems vs Algorithms vs Programs
■ For each problem or class of problems, there may be many
different algorithms.
■ For each algorithm, there may be many different implementations
(programs).
Testing Correctness
■ How do we know whether an algorithm is
actually correct?
■ First, the logical analysis of the problem we
performed in order to design the algorithm
should give us confidence that we have
identified a valid procedure for finding a
solution.
■ Second, we can test the algorithm by choosing
different sets of input values, carrying out the
algorithm, and checking to see if the resulting
solution does, in fact, work.
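A minimal sketch of this kind of testing in Python (the course itself does not prescribe a language); insertion_sort here is a hypothetical algorithm under test. Passing such checks builds confidence but, as the next slide stresses, it is not a proof of correctness.

# Sketch: testing an algorithm on several chosen input sets.
def insertion_sort(a):
    a = list(a)                        # work on a copy
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:   # shift larger elements to the right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

test_inputs = [[], [1], [3, 1, 2], [5, 5, 5], [9, -2, 0, 7, 7, 1]]
for t in test_inputs:
    assert insertion_sort(t) == sorted(t), f"failed on {t}"
print("All chosen test cases passed (still not a proof of correctness).")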
Testing Correctness

■ BUT… no matter how much testing


we do, unless there are only a finite
number of possible input values for
the algorithm to consider, testing can
never prove that the algorithm
produces correct results in all cases.
Testing Correctness
■ We can attempt to construct a formal,
mathematical proof that, if the algorithm is
given valid input values then the results
obtained from the algorithm must be a solution
to the problem.
■ We should expect that such a proof be
provided for every algorithm.
■ In the absence of such a proof, we should view
the purported algorithm as nothing more than
a heuristic procedure, which may or may not
yield the expected results.
Algorithm Design Techniques
■ There are many types of algorithm designing techniques, but the most
commonly studied are:
1. Recursive Algorithms 7. Dynamic Programming

2. Iterative Algorithms 8. Greedy Algorithms

3. Divide & Conquer Algorithms 9. Brute Force Algorithms


4. Decrease & Conquer 10. Backtracking Algorithms
5. Transform & Conquer 11. Randomized Algorithms
6. Iterative Improvement
12. Time & Space Trade-off
Problem Solving Phases

■ Analysis
– How does system work?
– Breaking a system down to known components
– How components (processes) relate to each other
– Breaking a process down to known functions
■ Synthesis
– Building tools
– Building functions with supporting tools
– Composing functions to form a process
– How components should be put together?
– Final solution
Problem Solving Process

[Diagram: Problem → Strategy → Algorithm (input, output, steps, design) →
Implementation → Verification → Analysis (correctness, time & space, optimality)]
Time Complexity of Algorithms
■ When we talk about time complexity of algorithms, what do we
actually mean?

■ It is not the actual running time, because actual running time is not a standard measure.


– It varies from machine to machine & compiler to compiler.
– It even varies on one machine due to availability of resources.
Model of Computation (Assumptions)
■ Design assumption
– Level of abstraction which meets our requirements
– Neither more nor less e.g. [0, 1] infinite continuous interval
■ Analysis independent of the variations in
– Machine
– Operating system
– Programming languages
– Compiler etc.
Model of Computation (Assumptions)

■ Low-level details will not be considered


■ Our model will be an abstraction of a standard generic
single-processor machine, called a random access
machine or RAM.
■ A RAM is assumed to be an idealized machine
– Infinitely large random-access memory
– Instructions execute sequentially
Model of Computation (Assumptions)
■ Every instruction is in fact a basic operation on two values in the
machine's memory, and it takes unit time.
■ These might be characters or integers.
■ Example of basic operations include
– Assigning a value to a variable
– Arithmetic operation (+, - , × , /) on integers
– Performing any comparison e.g. a < b
– Boolean operations
– Accessing an element of an array.
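A rough Python sketch of what "unit time per basic operation" means in the RAM model; the linear search and the particular operations counted are illustrative assumptions, not part of the slides.

# Sketch: counting unit-cost basic operations for a linear search.
def linear_search_with_count(a, target):
    ops = 0
    for i in range(len(a)):
        ops += 2                # one array access + one comparison, each of unit cost
        if a[i] == target:
            return i, ops
    return -1, ops

index, ops = linear_search_with_count([7, 3, 9, 4], 9)
print(index, ops)               # position of 9 and the number of counted operations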
Model of Computation (Assumptions)
■ In theoretical analysis, computational complexity
– Estimated in asymptotic sense, i.e.
– Estimating for large inputs
■ Big O, Big Omega, Theta etc. notations are used to compute the
complexity
■ Asymptotic notations are used because different implementations
of an algorithm may differ in efficiency
■ The efficiencies of two implementations of a given algorithm are related
– By a constant multiplicative factor
– Called the hidden constant.
Drawbacks in Model of Computation
■ First poor assumption
– We assumed that each basic operation takes constant time, i.e. model allows
■ Adding
■ Multiplying
■ Comparing etc.
two numbers of any length in constant time
■ Addition of two numbers takes a unit time!
– Not realistic, because the numbers may be arbitrarily large
■ Addition and multiplication both take unit time!
– Again, a very crude assumption
Justification for the Model

■ But even with all these weaknesses, our model is not so bad, because:


– We give a comparative, not an absolute, analysis of any algorithm.


– We have to deal with large inputs, not with small ones.
– The model seems to describe the computational power of
modern nonparallel machines well.
Can we do Exact Measure of Efficiency?

■ An exact, rather than asymptotic, measure of efficiency can sometimes be


computed, but it usually requires certain assumptions concerning the
implementation.
Summary : Computational Model
■ Analysis will be performed with respect to this computational
model for comparison of algorithms
■ We will give asymptotic analysis not detailed comparison i.e. for
large inputs
■ We will use generic uniprocessor random-access machine (RAM)
in analysis
– All memory equally expensive to access
– No concurrent operations
– All reasonable instructions take unit time, except, of course, function
calls
Conclusion
■ What, Why and Where Algorithms?
■ Designing Techniques
■ Problem solving Phases and Procedure
■ Model of computations
– Major assumptions at design and analysis level
– Merits and demerits, justification of assumptions taken
■ We argued that algorithms are a technology
■ Compared algorithmic technology with other technologies
■ Discussed importance of algorithms
– In almost all areas of computer science and engineering
– Algorithms make a difference for both users and developers
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 3
MATHEMATICAL ALGORITHMIC
TOOLBOX
Review of Lecture No 1 & 2
■ Introduction to Algorithms
■ Designing Techniques
■ Algorithm is a Technology
■ Model of Computation
■ Even if we have supercomputers, we still require algorithms
■ Algorithms make a difference for both users and modelers
■ Some of the applications areas of algorithms
 Management and manipulation of data
 Electronic commerce
 Manufacturing and other commercial settings
 Shortest paths etc.
Topics Covered Today
■ Sets
■ Sequences
■ Ordered pairs
■ Cross Product
■ Relation
■ Functions
■ Operators over above structures
■ Conclusion
Set
■ A set is a well-defined collection of objects, which are
– unordered
– distinct
– of the same type
– with common properties
■ Notation:
– Elements of set are listed between a pair of curly braces
S1 = {R, R, R, B, G} = {R, B, G} = {B, G, R}
■ Empty Set
– S3 = { } = ∅ has no elements and is called the empty set
Representation of Sets
■ Three ways to represent a set
• Descriptive Form
• Tabular form
• Set Builder Form (Set Comprehension)
Example
■ Descriptive Form
S = set of all prime numbers
■ Tabular form
S = {2, 3, 5, . . .}
■ Set Builder Form
S = {x : x ∈ N ∧ ∀i ∈ {2, 3, . . ., x−1}, (x mod i ≠ 0)}
Representation of Sets
Some More Examples

1. {x : x ∈ Z ∧ x² = x} = {0, 1}
2. {x : x ∈ N ∧ x mod 2 = 0} = {0, 2, 4, . . . }
3. {x : x ∈ N ∧ x mod 2 ≠ 0} = {1, 3, 5, . . . }
4. {x : x ∈ Z ∧ x ≥ 0 ∧ x ≤ 6} = {0, 1, 2, 3, 4, 5, 6}
5. {x² : x ∈ Z ∧ x ≥ 0 ∧ x ≤ 6} = {0, 1, 4, . . ., 25, 36}
6. {x³ : x ∈ N ∧ x mod 2 ≠ 0} = {1, 27, 125, . . . }
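For reference, the six sets above can be written almost verbatim as Python set comprehensions; the infinite sets are truncated to a small finite range purely for illustration.

# Sketch: the example sets as set comprehensions over finite stand-ins for Z and N.
Z = range(-20, 21)
N = range(0, 21)

s1 = {x for x in Z if x * x == x}            # {0, 1}
s2 = {x for x in N if x % 2 == 0}            # {0, 2, 4, ...}
s3 = {x for x in N if x % 2 != 0}            # {1, 3, 5, ...}
s4 = {x for x in Z if 0 <= x <= 6}           # {0, 1, ..., 6}
s5 = {x * x for x in Z if 0 <= x <= 6}       # {0, 1, 4, ..., 36}
s6 = {x ** 3 for x in N if x % 2 != 0}       # {1, 27, 125, ...}
print(s1, s4, s5)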
Assignment #1
 Note that people disagree on the exact definitions of whole numbers and
natural numbers.

 Research Assignment#1
 0 is included in Whole numbers or Natural numbers?
 What is the definition of Natural numbers and Whole numbers?
 What is your opinion about zero w.r.t the definitions of Natural numbers and
Whole numbers?
 Validate your opinion.

7
Assignment #1
 Your article should contain sections like:
 Abstract
 Background
 Status of zero
 Definitions of Natural & Whole Numbers
 My opinion
 Proof
 References
 Each section should contain a proper heading
 Each reference should be properly cited
 Extra marks for the best group

8
Assignment #1
 Format
 Font Size: 12
 Font Name: Times New Roman
 Heading Size: 16
 Heading Font: Arial Black
 Page Size: A4
 Margins: Normal
 A group of a minimum of 3 and a maximum of 4 students can submit one article.
 Only softcopy *.docx file is acceptable.
 Deadline: by 15 November, 2020
 Groups with same or similar article will get zero marks.

9
Not all collections are sets
■ The prime numbers
Primes = {2, 3, 5, 7, . . . }
■ The four oceans of the world
Oceans = {Atlantic, Arctic, Indian, Pacific}
■ The passwords that may be generated using eight lower-case letters,
when repetition is allowed
■ Hard working students in BS-CS class session 2020-21 at DSU
■ Intelligent students in this class
■ Kind teachers at DSU
Operators Over Sets
Membership Operator
■ If an element e is a member of set S then it is denoted as
e ∈ S and is read as “e is in S”.
■ Let S be a sub-collection of X (any universal set)
X = a set
S ⊆ X

Now
∈ : X × PX → Bool
∈ (x, S) = 1, if x is in S
= 0, if x is not in S
Subset
If each element of A is also in B, then A is said to be a subset of B,
A ⊆ B, and B is a superset of A, B ⊇ A.

Let X = a universal set, A ⊆ X, B ⊆ X


Now,
⊆ : PX × PX → Bool
⊆ (A, B) = 1, if ∀x : X, x ∈ A → x ∈ B
= 0, if ∃x : X, x ∈ A ∧ x ∉ B
More Operators Over Sets
Intersection
∩ : PX × PX → PX
∩ (A, B) = {x ∈ X | x ∈ A and x ∈ B}
Union
∪ : PX × PX → PX
∪ (A, B) = {x ∈ X | x ∈ A or x ∈ B}
Set Difference
\ : PX × PX → PX
\ (A, B) = {x ∈ X | x ∈ A but x ∉ B}
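The operators defined above correspond directly to Python's built-in set operations; a small sketch (the concrete sets A, B and X are made up for illustration).

# Sketch: membership, subset, intersection, union and difference on Python sets.
A = {1, 2, 3}
B = {2, 3, 4, 5}
X = {0, 1, 2, 3, 4, 5, 6}        # a universal set containing A and B

print(2 in A)                    # membership:   2 ∈ A          -> True
print(A <= X, A <= B)            # subset:       A ⊆ X, A ⊆ B   -> True False
print(A & B)                     # intersection: {2, 3}
print(A | B)                     # union:        {1, 2, 3, 4, 5}
print(A - B)                     # difference:   {1}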
Cardinality and Power Set of a given Set
Cardinality of a Set
■ A set, S, is finite if there is an integer n such that the elements of S can
be placed in a one-to-one correspondence with {1, 2, 3, …, n}, and we
say that cardinality is n. We write as: |S| = n
Power Set
■ How many distinct subsets does a finite set on n elements have? There
are 2n subsets.
■ How many distinct subsets of cardinality k does a finite set of n
elements have?
There are n n!
C (n, k ) Ck    
n

 k  (n  k )!k!
Partitioning of a set
A partition of a set A is a set of non-empty subsets of A such that
every element x in A belongs to exactly one of these subsets.
Equivalently, set P of subsets of A, is a partition of A
1. No element of P is empty.
2. The union of the elements of P is equal to A. We say the elements
of P cover A.
3. The intersection of any two distinct elements of P is empty, i.e., the
elements of P are pairwise disjoint.
Mathematical Way of Defining Partitioning
A partition P of a set A is a collection {A1, A2, . . ., An} such that
following are satisfied
1. ∀ Ai ∈ P, Ai ≠ ∅,
2. Ai ∩ Aj = ∅, ∀ i, j ∈ {1, 2, . . ., n} and i ≠ j
3. A = A1 ∪ A2 ∪ . . . ∪ An
Example: Partitioning of Z Modulo 4
{x ∈ Z | x mod 4 = 0 } = {. . . -8, -4, 0, 4, 8, . . . }
{x ∈ Z | x mod 4 = 1 } = {. . . -7, -3, 1, 5, 9, . . . }
{x ∈ Z | x mod 4 = 2 } = {. . . -6, -2, 2, 6, 10, . . . }
{x ∈ Z | x mod 4 = 3 } = {. . . -5, -1, 3, 7, 11, . . . }
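A sketch that builds these residue classes over a finite slice of Z and checks the three partition conditions from the previous slide; the range chosen is arbitrary.

# Sketch: partition of a finite slice of Z into residue classes modulo 4.
Z = set(range(-12, 13))
classes = [{x for x in Z if x % 4 == r} for r in range(4)]

non_empty = all(c for c in classes)                          # condition 1
covers    = set().union(*classes) == Z                       # condition 2 (the classes cover Z)
disjoint  = all(classes[i].isdisjoint(classes[j])            # condition 3 (pairwise disjoint)
                for i in range(4) for j in range(i + 1, 4))
print(non_empty, covers, disjoint)                           # True True True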
Sequence
■ It is sometimes necessary to record the order in which objects are
arranged, e.g.,
– Data may be indexed by an ordered collection of keys
– Messages may be stored in order of arrival
– Tasks may be performed in order of importance.
– Names can be sorted in order of alphabets etc.
■ Definition
– A group of elements in a specified order is called a sequence.
– A sequence can have repeated elements.
Sequence
■ Notation: Sequence is defined by listing elements in order, enclosed in
parentheses. e.g.
S = (a, b, c), T = (b, c, a), U = (a, a, b, c)
■ Sequence in a set
S = {(1, a), (2, b), (3, c)} = {(3, c), (2, b), (1, a)}

■ Permutation: If all elements of a finite sequence are distinct, that


sequence is said to be a permutation of the finite set consisting of the
same elements.
■ No. of Permutations: If a set has n elements, then there are n! distinct
permutations over it.
Operators over Sequences
■ Operators are for manipulations
■ Concatenation
– (a, b, c ) + ( d, a ) = ( a, b, c, d, a )
■ Other operators
– Extraction of information: ( a, b, c, d, a )(2) = b
– Filter Operator: {a, d} ( a, b, c, d, a ) = (a, d, a)
■ Note:
– We can think how resulting theory of sequences falls within our existing
theory of sets
– And how operators in set theory can be used in case of sequences
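A small sketch of these operators using Python tuples; the extraction index is shifted by one because the slides count positions from 1 while Python counts from 0.

# Sketch: concatenation, extraction and filtering of sequences as tuples.
s = ('a', 'b', 'c')
t = ('d', 'a')

concat = s + t                                            # (a, b, c, d, a)
second = concat[2 - 1]                                    # extraction: position 2 is 'b'
filtered = tuple(x for x in concat if x in {'a', 'd'})    # filter by the set {a, d}
print(concat, second, filtered)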
Tuples and Cross Product
■ A tuple is a finite sequence.
• Ordered pair (x, y), triple (x, y, z), quintuple
• A k-tuple is a tuple of k elements.
Construction to ordered pairs
■ The cross product of two sets, say A and B, is
A × B = {(x, y) | x ∈ A, y ∈ B}
|A × B| = |A| · |B|

■ Sometimes, A and B are the same set, e.g.,


Z × Z, where Z denotes the set of integers
Binary Relations
Definition: If X and Y are two non-empty sets, then
X x Y = {(x, y) | x ∈ X and y ∈ Y}
Example: If X = {a, b}, Y = {0,1} Then
X x Y = {(a, 0), (a, 1), (b, 0), (b, 1)}

Definition: A subset of X x Y is a relation over X x Y


Example: Compute all relations over X x Y, where
X = {a, b}, Y = {0,1}
R1 = , R2 = {(a, 0)}, R3 = {(a, 1)}
R4 = {(b, 0)}, R5 = {(b, 1)}, R6 = {(a, 0), (b, 0)}, . . .
There will be 24 = 16 number of relations
Binary Relations
Definition: XY = P(X x Y)
Example: If X = {a, b}, Y = {0,1} Then
X x Y = {(a, 0), (a, 1), (b, 0), (b, 1)} and
P(X x Y) = {R1, R2, R3, R4, R5, R6, R7, R8, R9, R10,R11, R12, R13, R14, R15, R16}

Lemma 1: if |X| = m, |Y| = n then


|P(X x Y)| = 2^(m·n)

Algorithm: enumerate all binary relations and compute how many there are (see the sketch below)


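A sketch of that algorithm: every relation over X x Y is a subset of the cross product, so enumerating all subsets of the |X|·|Y| pairs enumerates all relations.

# Sketch: enumerate all binary relations over X x Y and count them.
from itertools import combinations, product

X = {'a', 'b'}
Y = {0, 1}
pairs = list(product(X, Y))                    # the cross product X x Y

relations = [set(c) for k in range(len(pairs) + 1)
                    for c in combinations(pairs, k)]
print(len(relations))                          # 2**(|X|*|Y|) = 16 relations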
Equivalence Relation
A relation R ⊆ X × X is
■ Reflexive:
(x, x) ∈ R, ∀ x ∈ X
■ Symmetric:
(x, y) ∈ R → (y, x) ∈ R, ∀ x, y ∈ X
■ Transitive:
(x, y) ∈ R ∧ (y, z) ∈ R → (x, z) ∈ R
■ Equivalence:
If reflexive, symmetric and transitive
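A sketch of how the three conditions can be checked mechanically for a relation given as a set of pairs over a finite set; the example relation is made up.

# Sketch: checking whether a relation R over a finite set X is an equivalence relation.
def is_equivalence(R, X):
    reflexive  = all((x, x) in R for x in X)
    symmetric  = all((y, x) in R for (x, y) in R)
    transitive = all((x, z) in R
                     for (x, y) in R for (w, z) in R if y == w)
    return reflexive and symmetric and transitive

X = {1, 2, 3}
R = {(1, 1), (2, 2), (3, 3), (1, 2), (2, 1)}
print(is_equivalence(R, X))                    # True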
Applications: Ordered Pairs in Terms of Strings
Example 1
If  = {0, 1}, then construct set of all strings of length 2 and 3
■ Set of length 2 =  x  = {0,1} x {0,1} = {(0,0), (0,1), (1,0), (1,1)}
= {00, 01, 10, 11}
■ Set of length 3 =  x  x  = {0, 1} x {0, 1} x {0,1}
= {(0,0,0), (0,0,1), (0,1,0), (0,1,1), (1,0,0), (1,0,1), (1,1,0), (1,1,1)}
= {000, 010, 100, 110, 001, 011, 101, 111}
Example 2
If  = {0, 1}, then construct set of all strings of length ≤ 3
■ Construction = {} U  U  x  U  x  x 
Similarly we can construct collection of all sets of length n.
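A sketch of Example 2's construction using itertools.product: the empty string together with all strings over Σ = {0, 1} of lengths 1, 2 and 3.

# Sketch: all strings over {0, 1} of length <= 3.
from itertools import product

sigma = ['0', '1']
strings = ['']                                 # the empty string
for length in (1, 2, 3):
    strings += [''.join(t) for t in product(sigma, repeat=length)]
print(len(strings))                            # 1 + 2 + 4 + 8 = 15
print(strings)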
Partial Function
A partial function is a relation that maps each element of X to at
most one element of Y.
pf: XY denotes the set of all partial functions.
Function
If each element of X is related to a unique element of Y, then the partial
function is a total function, denoted by f : X → Y.
Is an Algorithm a Function?

Not good algorithms


Conclusion
■ We started with sets and built other structures e.g.
– Sequences
– Relations
– Functions, etc.
■ We discussed operators over above structures
■ All these tools are fundamentals of mathematics
■ Sets play a key role in algorithm design and analysis
■ Finally, to prove the correctness of algorithms, we require logic.
■ Our next lecture will be on proving techniques
Class Task 1 (Graded)
■ Give mathematical definitions (set-builder mathematical
pseudocode) for:
– Partial Function
– Total Function

Note: Submit your answers on LMS by 12 Nov, 2020.


Late submissions will not be entertained.
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 4, 5 & 6
PROPOSITIONAL &
PREDICATE LOGIC
Logic
■ Logic is the study of the principles and methods that distinguish
between a valid and an invalid argument.

2
Statement / Proposition
■ A statement is a declarative sentence that is either true or false
but not both.
■ A statement is also referred to as a proposition.
■ Proposition: the simplest kind of statement, also called an atomic formula.

■ EXAMPLES:
a. 2+2 = 4,
b. It is Sunday today.

3
Truth Values
■ If a proposition is true, we say that it has a truth value of "true”.
If a proposition is false, its truth value is "false".
The truth values “true” and “false” are, respectively, denoted by the
letters T and F.

■ EXAMPLES:
Propositions
1) Grass is green.
2) 4 + 2 = 6
3) 4 + 2 = 7
4) There are four fingers in a hand.

4
Invalid Propositions
■ Not Propositions
1) Close the door.
2) x is greater than 2.
3) He is very rich.

■ Either they don’t have any truth value or they are not declarative.

5
Remarks
■ If the sentence is preceded by other sentences that make the pronoun
or variable reference clear, then the sentence is a statement.

■ Example:
x=1
x>2
“x > 2” is a statement with truth-value FALSE.

Example
Bill Gates is an American.
He is very rich.
“He is very rich.” is a statement with truth-value TRUE.

6
Examples
■ x + 2 is positive. Not a statement

■ May I come in? Not a statement

■ Logic is interesting. A statement

■ It is hot today. A statement

■ -1 > 0 A statement

■ x + y = 12 Not a statement

7
Compound Statements
■ Simple statements could be used to build a compound statement.

■ LOGICAL CONNECTIVES
EXAMPLES:
1. “3 + 2 = 5” and “Lahore is a city in Pakistan.”
2. “The grass is green” or “ It is hot today.”
3. “Discrete Mathematics is not difficult to me.”

■ Here, AND, OR, NOT are called LOGICAL CONNECTIVES.


8
Notation of Statement
■ Statements are symbolically represented by letters such as p, q,
r,...

■ EXAMPLES:
p: “Islamabad is the capital of Pakistan”
q: “17 is divisible by 3”

9
Common Connectives

Note: Some books use the “¬” symbol for negation.


10
Common Connectives
■ EXAMPLES

– p : “Islamabad is the capital of Pakistan”

– q : “17 is divisible by 3”

– p ∧ q : “Islamabad is the capital of Pakistan and 17 is divisible by 3”

– p ∨ q : “Islamabad is the capital of Pakistan or 17 is divisible by 3”

– ~p : “It is not the case that Islamabad is the capital of Pakistan”


or simply “Islamabad is not the capital of Pakistan”

11
TRANSLATION
TRANSLATING FROM ENGLISH TO SYMBOLS

Let p = “It is hot”, and q = “ It is sunny”

12
TRANSLATION

13
TRANSLATION

14
Truth Table
■ A truth table specifies the truth value of a compound proposition
for all possible truth values of its constituent propositions.

15
Negation
■ NEGATION (~):
If p is a statement variable, then negation of p, “not p”, is denoted
as “~p”

■ It has opposite truth value from p i.e., if p is true, then ~ p is false;


if p is false, then ~ p is true.

16
Truth Table for Negation

17
Conjunction
■ CONJUNCTION (∧):
If p and q are statements, then the conjunction of p and q is “p
and q”, denoted as
“p ∧ q”.

■ Remarks
– p ∧ q is true only when both p and q are true.
– If either p or q is false, or both are false, then p ∧ q is false.

18
Truth Table for Conjunction

19
Disjunction
■ DISJUNCTION (∨) or INCLUSIVE OR
If p & q are statements, then the disjunction of p and q is “p or q”,
denoted as
“p ∨ q”.

■ Remarks:
– p ∨ q is true when at least one of p or q is true.
– p ∨ q is false only when both p and q are false.

20
Truth Table for Disjunction

21
Truth Table for Disjunction
■ Note that in the table, F appears only in the row where both p and q
are F; all other values are T.

■ Thus, to find the truth values for the disjunction of two
statements, we first locate the rows where both
statements are false, write F in the corresponding
rows of the p ∨ q column, and write T in all
other rows of the p ∨ q column.

22
Remark
■ Note that for the conjunction of two statements we look for T in both
statements, but in disjunction we look for F in both
statements.

■ In other words, we will fill T in the first row of conjunction and F in


the last row of disjunction.

23
Example
■ ~p^(qv~r)
As there are three variables, there are 2³ = 8 rows.
p q r
T T T
T T F
T F T
T F F
F T T
F T F
F F T
F F F

24
Example
■ ~p^(qv~r)
As there are three variables, there are 2³ = 8 rows.
p q r ~p
T T T F
T T F F
T F T F
T F F F
F T T T
F T F T
F F T T
F F F T

25
Example
■ ~p^(qv~r)
As there are three variables, there are 2³ = 8 rows.
p q r ~p ~r
T T T F F
T T F F T
T F T F F
T F F F T
F T T T F
F T F T T
F F T T F
F F F T T

26
Example
■ ~p^(qv~r)
As there are three variables, there are 2³ = 8 rows.
p q r ~p ~r q v ~r
T T T F F T
T T F F T T
T F T F F F
T F F F T T
F T T T F T
F T F T T T
F F T T F F
F F F T T T

27
Example
■ ~p^(qv~r)
As there are three variables, there are 2³ = 8 rows.
p q r ~p ~r q v ~r ~p ^ (q v ~r)
T T T F F T F
T T F F T T F
T F T F F F F
T F F F T T F
F T T T F T T
F T F T T T T
F F T T F F F
F F F T T T T

28
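The same table can be generated mechanically; a sketch that enumerates all 2³ = 8 rows with itertools.product and evaluates ~p ^ (q v ~r) for each.

# Sketch: generating the truth table of ~p ^ (q v ~r).
from itertools import product

print('p  q  r  ~p^(qv~r)')
for p, q, r in product([True, False], repeat=3):
    value = (not p) and (q or (not r))
    print('  '.join('T' if v else 'F' for v in (p, q, r, value)))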
Exclusive OR
■ When OR is used in its exclusive sense, the statement “p or q”
means “p or q but not both”, i.e., “(p or q) and not (p and q)”, which
translates into symbols as (p ∨ q) ∧ ~(p ∧ q).
■ It is denoted as p ⊕ q or p XOR q

■ TASK: Prove that


p ⊕ q ≡ (p ∨ q) ∧ ~(p ∧ q) ≡ (p ∨ q) ∧ (~p ∨ ~q)
by using truth tables.
NOTE: The “≡” symbol is used for equivalence.

29
Truth Table of XOR

30
LOGICAL EQUIVALENCE
■ If two logical expressions have the same truth values in the truth
table, then we say that the two logical expressions are logically or
semantically equivalent. In the following example, ~(~p) is
logically equivalent to p. So it is written as ~(~p) ≡ p

31
QUESTION
■ Are
~(p ∧ q) and ~p ∧ ~q
logically equivalent ?

32
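A brute-force way to answer such questions is to compare the two truth tables directly; a sketch, applied here to both ways of pushing the negation inward (only the De Morgan form with ∨ turns out to be equivalent).

# Sketch: deciding logical equivalence by comparing truth tables.
from itertools import product

def equivalent(f, g, n_vars=2):
    return all(f(*vals) == g(*vals) for vals in product([True, False], repeat=n_vars))

print(equivalent(lambda p, q: not (p and q), lambda p, q: (not p) or (not q)))   # True
print(equivalent(lambda p, q: not (p and q), lambda p, q: (not p) and (not q)))  # False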
De-Morgan’s Laws

33
TAUTOLOGY
■ A tautology is a statement form that is always true regardless of the truth
values of the statement variables. A tautology is represented by the symbol
“t”.
EXAMPLE: The statement form p ∨ ~p is a tautology.
p ∨ ~p ≡ t

34
CONTRADICTION
■ A contradiction is a statement form that is always false regardless of the truth values of
the statement variables. A contradiction is represented by the symbol “c”.
EXAMPLE:
The statement form p ∧ ~p is a contradiction.
■ Since the last column in the truth table has F in all the entries, it is a
contradiction, i.e. p ∧ ~p ≡ c

35
Class Task 2 (Graded)
■ Give one statement that is a Tautology and one that is a
Contradiction with three variables x,y and z?
■ Also write the description of variables in simple English.

Note:Submit your answers on LMS by 16 Nov, 2020.


Late submissions will not be entertained.

36
Laws of Logic
■ 1) Commutative Laws
p ∧ q ≡ q ∧ p
p ∨ q ≡ q ∨ p

■ 2) Associative Laws
(p ∧ q) ∧ r ≡ p ∧ (q ∧ r)
(p ∨ q) ∨ r ≡ p ∨ (q ∨ r)

■ 3) Distributive Laws
p ∧ (q ∨ r) ≡ (p ∧ q) ∨ (p ∧ r)
p ∨ (q ∧ r) ≡ (p ∨ q) ∧ (p ∨ r)

■ 4) Identity Laws
p ∧ t ≡ p
p ∨ c ≡ p

37
Laws of Logic
■ 5) Negation Laws
p ∨ ~p ≡ t
p ∧ ~p ≡ c

■ 6) Double Negation Law

~(~p) ≡ p

■ 7) Idempotent (something that remains the same) Laws

p ∧ p ≡ p
p ∨ p ≡ p

■ 8) De-Morgan’s Laws
~ ( p ∧ q ) ≡ ~p ∨ ~q
~ ( p ∨ q ) ≡ ~p ∧ ~q

38
Laws of Logic
■ 9) Universal Bound Laws
p ∨ t ≡ t
p ∧ c ≡ c

■ 10) Absorption Laws

p ∨ (p ∧ q) ≡ p
p ∧ (p ∨ q) ≡ p

■ 11) Negation of t and c

~t ≡ c
~c ≡ t

39
Conditional Statement
■ If p and q are statement variables, the conditional of q by p is “If p then
q”
or “p implies q” and is denoted p → q.

■ p → q is false when p is true and q is false; otherwise it is true.

■ The arrow "→ " is the conditional operator.

■ In p → q, the statement p is called the hypothesis (or antecedent) and


q is called the conclusion (or consequent).

40
Truth Table

41
BICONDITIONAL STATEMENTS
■ If p and q are statement variables, the biconditional of p and q is
“p if and only if q”.

■ It is denoted p↔q. “if and only if” is abbreviated as iff.

■ The double headed arrow " ↔" is the biconditional operator.

42
Truth Table
■ p ↔ q is true only when p and q both are true or
both are false.
■ p ↔ q is false when p and q have different truth values.
■ Just the negation of XOR.

43
Translation Summary
of p → q

44
Translation of p → q
p: You will attend the classes.
q: You will be allowed to participate in the exams.

■ If p, then q.

■ If you will attend the classes, then you will be allowed


to participate in the exams.
45
Translation of p → q
p: You will attend the classes.
q: You will be allowed to participate in the exams.

■ If p, q.

■ If you will attend the classes, you will be allowed to


participate in the exams.
46
Translation of p → q
p: You will attend the classes.
q: You will be allowed to participate in the exams.

■ q, if p.

■ You will be allowed to participate in the exams if you


will attend the classes.
47
Translation of p → q
p: You will attend the classes.
q: You will be allowed to participate in the exams.

■ q when p.

■ You will be allowed to participate in the exams when


you will attend the classes.
48
Translation of p → q
p: You will attend the classes.
q: You will be allowed to participate in the exams.

■ q whenever p.

■ You will be allowed to participate in the exams


whenever you will attend the classes.
49
Translation of p → q
p: You will attend the classes.
q: You will be allowed to participate in the exams.

■ q follows from p.

■ You will be allowed to participate in the exams follows from


your attendance in the classes.
50
Translation of p → q
p: You will attend the classes.
q: You will be allowed to participate in the exams.

■ p implies q.

■ You will attend the classes implies that you will be


allowed to participate in the exams.
51
Translation of p → q
p: You will attend the classes.
q: You will be allowed to participate in the exams.

■ p is sufficient for q.

■ You will attend the classes is sufficient to allow you to


participate in the exams.
52
Translation of p → q
p: You will attend the classes.
q: You will be allowed to participate in the exams.

■ A sufficient condition for q is p.

■ A sufficient condition to allow you to participate in the


exams is that you will attend the classes.
53
Translation of p → q
p: You will attend the classes.
q: You will be allowed to participate in the exams.

■ q is necessary for p.

■ To allow you to participate in the exams is necessary if


you will attend the classes.
54
Translation of p → q
p: You will attend the classes.
q: You will be allowed to participate in the exams.

■ A necessary condition for p is q.

■ A necessary condition to attend the classes is to allow


you to participate in the exams.
55
Translation of p → q
p: You will attend the classes.
q: You will be allowed to participate in the exams.

■ p only if q.

■ You will attend the classes only if you will be allowed to


participate in the exams.
56
Remarks
■ In mathematical reasoning, we consider conditional statements of
a more general sort than we use in English.

■ Propositional language is an artificial language; we only parallel


English usage to make it easy to use and remember.

■ The if-then construction used in many programming languages is


different from that used in logic.

57
BICONDITIONAL STATEMENTS
■ If p and q are statement variables, the biconditional of p and q is
“p if and only if q”.

■ It is denoted p↔q. “if and only if” is abbreviated as iff.

■ The double headed arrow " ↔" is the biconditional operator.

58
Truth Table
■ p ↔ q is true only when p and q both are true or
both are false.
■ p ↔ q is false when p and q have different truth values.
■ Just the negation of XOR.
■ Translation:


59
BICONDITIONAL STATEMENT
■ It is the conjunction of
p → q and q → p
i.e. (p → q) ^ (q → p)
p     q     p→q     q→p     (p→q) ^ (q→p)
T     T      T       T            T
T     F      F       T            F
F     T      T       F            F
F     F      T       T            T

60
Precedence of Logical Operators

Precedence Operator

1 ~
2 ^
3 v
4 →
5 ↔

61
Laws of Logic
■ 1) Commutative Laws
p ∧ q ≡ q ∧ p
p ∨ q ≡ q ∨ p

■ 2) Associative Laws
(p ∧ q) ∧ r ≡ p ∧ (q ∧ r)
(p ∨ q) ∨ r ≡ p ∨ (q ∨ r)

■ 3) Distributive Laws
p ∧ (q ∨ r) ≡ (p ∧ q) ∨ (p ∧ r)
p ∨ (q ∧ r) ≡ (p ∨ q) ∧ (p ∨ r)

■ 4) Identity Laws
p ∧ t ≡ p
p ∨ c ≡ p

62
Laws of Logic
■ 5) Negation Laws
p ∨ ~p ≡ t
p ∧ ~p ≡ c

■ 6) Double Negation Law

~(~p) ≡ p

■ 7) Idempotent (something that remains the same) Laws

p ∧ p ≡ p
p ∨ p ≡ p

■ 8) De-Morgan’s Laws
~ ( p ∧ q ) ≡ ~p ∨ ~q
~ ( p ∨ q ) ≡ ~p ∧ ~q

63
Laws of Logic
■ 9) Universal Bound Laws
p ∨ t ≡ t
p ∧ c ≡ c

■ 10) Absorption Laws

p ∨ (p ∧ q) ≡ p
p ∧ (p ∨ q) ≡ p

■ 11) Negation of t and c

~t ≡ c
~c ≡ t

64
Applications
■ Using Laws of Logic, simplify the following compound statement.
p v [~(~p ^ q)]
■ Solution:
p v [~(~p ^ q)]
≡ p v [~(~p) v (~q)] De-Morgan’s Law
≡ p v [p v ~q] Double Negation Law
≡ [p v p] v ~q Associative Law
≡ p v ~q Idempotent Law
65
Applications
■ Using Laws of Logic, verify the logical equivalence
~(~p ∧ q) ∧ (p ∨ q) ≡ p
■ Solution:
L.H.S
= ~(~p ∧ q) ∧ (p ∨ q)

≡ (~(~p) ∨ ~q) ∧ (p ∨ q) De-Morgan’s Law


≡ (p ∨ ~q) ∧ (p ∨ q) Double Negation Law
≡ p ∨ (~q ∧ q) Distributive Law
≡ p ∨ c Negation Law
≡ p Identity Law
= R.H.S 66
Applications
■ Rephrase the condition more simply.

“You will get an A if you are hardworking and the sun shines, or you are hardworking and it
rains.”

■ Solution Steps:
1. Identify the Simple Propositions
2. Translate the Condition in symbols
3. Simplification by apply the Laws of Logic
4. Re-Translation

67
Applications
■ Identifying the Simple Propositions.

“You will get an A if you are hardworking and the sun shines, or you are hardworking and it rains.”

■ Translation

 a if h and s or h and r

 a if h ^ s v h ^ r

 a if (h ^ s) v (h ^ r)

 [(h ^ s) v (h ^ r)] → a

68
Applications
■ Simplification

[(h ^ s) v (h ^ r)] → a
≡ [h ^ (s v r)] → a Distributive Law

■ Re-Translation

If you are hardworking and either the sun shines or it rains, you will get an A.
OR
You will get an A if you are hardworking and either the sun shines or it rains.

69
Practice
■ Rewrite each of the following sentences more simply:

1. It is not true that I am tired and you are smart.


 {I am not tired or you are not smart.}

2. It is not true that I am tired or you are smart.


 {I am not tired and you are not smart.}

70
Practice
■ Rewrite each of the following sentences more simply:

1. I forgot my pen and my bag or I forgot my pen and my glasses.


■ {I forgot my pen and EITHER my bag or my glasses.}

2. It is raining and I have forgotten my umbrella, or it is raining and


I have forgotten my hat.
■ {It is raining and I have forgotten EITHER my umbrella or my hat.}

71
More Laws of Logic
1. Commutative Law: p ↔ q ≡ q ↔ p

2. Implication Law: p → q ≡ ~p ∨ q ≡ ~(p ∧ ~q)

3. Exportation Law: (p ∧ q) → r ≡ p → (q → r)

4. Equivalence: p ↔ q ≡ (p → q) ∧ (q → p)

5. Reductio ad absurdum: p → q ≡ (p ∧ ~q) → c


(Reduction to Contradiction)

72
Application
■ Rewrite the statement p ∧ ~q → r without using the symbol →.

■ Solution:
p ∧ ~q → r
≡ (p ∧ ~q) → r
≡ ~(p ∧ ~q) ∨ r Implication Law

73
Application
■ Rewrite the statement ( p → r ) ↔ ( q → r ) without using the
symbol → or ↔.

■ Solution:
(p → r) ↔ (q → r)
≡ (~p ∨ r) ↔ (~q ∨ r)
Implication Law

≡ [(~p ∨ r) → (~q ∨ r)] ∧ [(~q ∨ r) → (~p ∨ r)]


Equivalence Law

≡ [~(~p ∨ r) ∨ (~q ∨ r)] ∧ [~(~q ∨ r) ∨ (~p ∨ r)]


Implication Law
74
Simplifying It Further
≡ [~(~p ∨ r) ∨ (~q ∨ r)] ∧ [~(~q ∨ r) ∨ (~p ∨ r)]

≡ [(~(~p) ∧ ~r) ∨ (~q ∨ r)] ∧ [(~(~q) ∧ ~r) ∨ (~p ∨ r)]


De-Morgan’s Law

≡ [(p ∧ ~r) ∨ ~q ∨ r] ∧ [(q ∧ ~r) ∨ ~p ∨ r]
Double Negation Law

75
Class Task 3 (Graded)
■ Rewrite the statement form ~p ∨ q → r ∨ ~q to a logically
equivalent form that uses only ~ and ∧.

Note:Submit your answers on LMS by 16 Nov, 2020.


Late submissions will not be entertained.

76
Class Task 4 (Graded)
■ Show that ~(p→q) → p is a tautology without using truth tables.

Note:Submit your answers on LMS by 16 Nov, 2020.


Late submissions will not be entertained.

77
Inverse, Converse & Contrapositive
■ Consider a conditional statement:
p→q

■ Its INVERSE will be:


~p → ~q

■ CONVERSE of the original statement will be:


q→p

■ While its CONTRAPOSITIVE can be written as:


~q → ~p
78
Remarks - INVERSE

79
Remarks - CONVERSE

80
Task - CONTRAPOSITIVE
■ Prove that a conditional statement and its
contrapositive are equivalent.

[Hint: You can use TRUTH TABLES]

81
Example
■ Conditional statement: p→q
If I am elected, I will reduce the taxes.

■ INVERSE: ~p → ~q
If I am NOT elected, I will NOT reduce the taxes.

■ CONVERSE: q→p
If I reduce the taxes, I will be elected.

■ CONTRAPOSITIVE: ~q → ~p
If I DO NOT reduce the taxes, I will NOT be elected.
82
Practice Question
■ Which of the following statements are propositions?
1. Today is Monday.
2. It is raining.
3. x is greater than 2.
4. x is less than y.

■ 1 and 2 are propositions while 3 and 4 are not.


■ Limitation of Propositional Logic

83
Predicate Logic
■ A more powerful type of Logic.

 The statement “x is greater than 2” has two parts.

– Subject: Variable, x

– Predicate: The property that the subject of the


statement can have.

■ Here, “x” is the variable; “is greater than 2” is the predicate.


84
Notation
■ We denote propositions and predicates in a different way.

■ For example:
1. m: Today is Monday.
2. r: It is raining.
3. G(x): x is greater than 2.
4. L(x,y): x is less than y.

85
Propositional Function
■ When the variable is assigned a value, the predicate becomes a proposition. Hence,
G(x) and L(x,y) are called PROPOSITIONAL FUNCTIONS.

■ Let P (x) : x > 3. What are the truth values of P (4) and P (2)?

■ Solution:
We obtain the statement P (x) by substituting the value of x.

P (4), which is the statement “4 > 3,” is TRUE.


P (2), which is the statement “2 > 3,” is FALSE.

86
Example
■ Let A(x) denote the statement “Computer x is under attack by an
intruder.”

Suppose that of the computers on campus, only CS2 and MATH1


are currently under attack by intruders.

What are truth values of A(CS1), A(CS2), and A(MATH1)?

87
Predicate with 2 Variables
■ Let Q(x, y) denote the statement “x = y + 3”

What are the truth values of the propositions


Q(1, 2) and Q(3, 0)?

88
Task
■ Let A(c, n) denote the statement “Computer c is connected to
network n,” where c is a variable representing a computer and n
is a variable representing a network.

Suppose that the computer MATH1 is connected to network


CAMPUS2, but not to network CAMPUS1.

What are the values of A(MATH1, CAMPUS1) and A(MATH1,


CAMPUS2)?
89
Predicate with 3 Variables
■ Let R(x, y, z) denote the statement

‘x + y = z”

What are the truth values of the propositions R(1, 2, 3) and R(0,
0, 1)

90
Remark
■ Consider two examples:
1. P(x): x < x+1; x ∈ N
2. Q(x): x < 3; x ∈ N

■ The first Predicate is TRUE for all values of x; while the second
Predicate is TRUE only if x=1 or x=2.

91
Quantifiers
■ Quantification expresses the extent to which a predicate is TRUE
over a range of elements.

1. P(x): x < x+1; x ∈ N
2. Q(x): x < 3; x ∈ N

■ The first Predicate is TRUE for all values of x; while the second
Predicate is true only if x=1 or x=2 i.e. for some values of x.

92
Notation
1. P(x): x < x+1; x ∈ N
2. Q(x): x < 3; x ∈ N

■ We can write ∀xP(x).


Here ∀x means “for all values of x”, and ∀ is called the UNIVERSAL
QUANTIFIER.

■ The second one can be expressed as ∃xQ(x).


Here ∃x means “for some values of x”, and ∃ is called the EXISTENTIAL
QUANTIFIER.

93
Example
■ Let P (x) be the statement “x + 1 > x”. What is the truth value of the
quantification ∀xP (x),
where the domain consists of all real numbers?

■ Solution: Because P (x) is true for all real numbers x, the quantification
∀xP(x) is TRUE.

94
COUNTEREXAMPLE
■ Let Q(x) be the statement “x < 2”. What is the truth value of the
quantification ∀xQ(x), where the domain consists of all real numbers?

■ Solution: Q(x) is not true for every real number x, because, for instance, Q(3)
is FALSE. That is, x = 3 is a COUNTEREXAMPLE for the statement ∀xQ(x).
Thus, ∀xQ(x) is FALSE.

95
Example
■ Let P (x) denote the statement “x > 3”. What is the truth value of the
quantification ∃xP (x),
where the domain consists of all real numbers?

■ Solution: Because “x > 3” is sometimes true—for instance, when x = 4—the


existential quantification of P (x), which is ∃xP (x), is TRUE.

96
Remark
■ If x1, x2, x3, …, xn is the domain for P(x), then

∀xP(x) means P(x1) ^ P(x2) ^ P(x3) ^ … ^ P(xn)

And

∃xP(x) means P(x1) v P(x2) v P(x3) v … v P(xn)

97
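Over a finite domain, this remark translates directly into Python's all() (a conjunction) and any() (a disjunction); the predicates P and Q are the ones used earlier, restricted to a small domain for illustration.

# Sketch: universal and existential quantifiers over a finite domain.
domain = range(1, 11)

P = lambda x: x < x + 1          # true for every x
Q = lambda x: x < 3              # true only for some x

print(all(P(x) for x in domain))   # "for all x, P(x)"       -> True
print(all(Q(x) for x in domain))   # "for all x, Q(x)"       -> False
print(any(Q(x) for x in domain))   # "there exists x, Q(x)"  -> True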
Task
■ Let Q(x) denote the statement “x = x + 1.”

What is the truth value of the quantification


∃xQ(x)

Where the domain consists of all real numbers?

98
Task
■ What is the truth value of ∃xP(x), where P(x) is the statement “x² >
10” and the universe of discourse consists of the positive integers
not exceeding 4?

Note: Universe of discourse is also known as the domain.

99
Precedence of Quantifiers
■ The quantifiers ∀ and ∃ have higher precedence than all logical
operators from propositional calculus.

■ For example, ∀x P(x) ∨ Q(x) is the disjunction of ∀xP (x) and Q(x).

■ In other words, it means (∀xP (x)) ∨ Q(x) rather than ∀x(P (x) ∨


Q(x)).

100
Binding & Free Variables
■ Consider ∃x(x + y = 1).
Here, the variable x is bound by the existential quantification ∃x,
but the variable y is free because it is not bound by a quantifier.

■ This illustrates that in the statement


∃x(x + y = 1), x is bound, but y is free.

101
Remarks
■ We can distribute a universal quantifier over a conjunction.

∀x(P (x) ∧ Q(x)) ≡ ∀xP (x) ∧ ∀xQ(x)

■ But, we cannot distribute a universal quantifier over a disjunction.

∀x(P (x) v Q(x)) ≢ ∀xP (x) v ∀xQ(x)

102
Remarks
■ We cannot distribute an existential quantifier over a conjunction.

∃x(P (x) ∧ Q(x)) ≢ ∃xP (x) ∧ ∃xQ(x)

■ But, we can distribute an existential quantifier over a disjunction.

∃x(P (x) v Q(x)) ≡ ∃xP (x) v ∃xQ(x)

103
Negating Quantified Expressions
■ Consider “Every student has taken a course in Calculus”.
∀xP(x)

■ NEGATION
“It is NOT the case that every student has taken a course in Calculus”.
OR
■ “There is at least one student who has NOT taken a course in
Calculus.”
∃x ~P(x)
104
Remarks
■ So, ¬∀xP(x) ≡ ∃x ¬P(x)

■ Similarly, ¬∃xP(x) ≡ ∀x ¬P(x)

■ These are referred to as:

De Morgan’s Laws for Quantifiers

105
Task
■ What are the negations of the statements?

1. “There is an honest politician.”

2. “All Pakistanis eat junk food”?

106
Practice Question
■ Use predicates and quantifiers to express the system
specification:
“Every mail message larger than one megabyte will be
compressed”.

107
Practice Question
■ Use predicates and quantifiers to express the system
specification:
“If a user is active, at least one network link will
be available.”

108
Nested Quantifiers
■ Assume that the domain for the variables x and y consists of all
real numbers. The statement
∀x ∀y(x + y = y + x)

says that

x+y=y+x
for all real numbers x and y.

109
Example
■ Similarly,

∀x ∃y(x + y = 0)
says that

For every real number x there is a real number y;


such that x + y = 0.

110
Example
■ Similarly, the statement

∀x ∀y ∀z(x + (y + z) = (x + y) + z)

Says that

The associative law for addition is applicable on all real numbers.

111
Class Task 5 (Graded)
■ Translate the following compound statement into Simple English:

∀x ∀y((x > 0) ∧ (y < 0) → (xy < 0))

■ where the domain for both variables consists of all real numbers.

Note:Submit your answers on LMS by 16 Nov, 2020.


Late submissions will not be entertained.

112
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 7, 8 & 9
ARGUMENTS & PROOF
THEORY
Argument
■ It is a sequence of statements that end with a conclusion.

■ All statements except the conclusion are called premises.

2
Format of Argument
Premise 1
Premise 2
.
. If any
.
Premise n
--------------------------------------
Conclusion
3
Rules of Inference

4
Modus Ponens
■ It is Friday.
■ If it is Friday THEN Jumma Prayer will be offered.
■ -------------------------------------
■ Jumma Prayer will be offered.

5
Modus Tollens
Suppose it is SATURDAY
■ Jumma Prayer will NOT be offered.
■ If it is Friday THEN Jumma Prayer will be offered.
■ -------------------------------------
■ It is NOT Friday.

6
Hypothetical Syllogism
■ If you lie, you commit a sin.
■ If you commit a sin, you will be oppressed.
■ --------------------------------------
■ If you lie, you will be oppressed.

7
Disjunctive Syllogism / Tollendo Syllogism

■ Students come to the class on time OR The teacher comes to the


class on time.
■ Students DO NOT come to the class on time.
■ --------------------------------------
■ The teacher comes to the class on time.

8
Addition
■ “Multan Sultans” has a match today.
■ --------------------------------------
■ “Multan Sultans” or “Lahore Qalandars” has a match today.

9
Simplification
■ “Multan Sultans” and “Karachi Kings” have a match today.
■ --------------------------------------
■ “Multan Sultans” has a match today.

10
Conjunction
■ “Multan Sultans” has a match today.
■ “Karachi Kings” has a match today.
■ --------------------------------------
■ “Multan Sultans” and “Karachi Kings” have a match today.

11
Resolution
■ Today is the first match of Lahore Qalandars OR Today is the first
match of Karachi Kings.
■ Lahore Qalandars WILL NOT play their first match today OR Today
is the first match of Multan Sultans.
■ --------------------------------------
■ Today is the first match of Multan Sultans OR Today is the first
match of Karachi Kings.

12
Valid / Invalid Arguments
■ An argument is valid if the conclusion is true when all the
premises are true.

■ Alternatively, an argument is valid if the conjunction of its premises


implies the conclusion.

■ That is, (P1 ∧ P2 ∧ P3 ∧ ... ∧ Pn) → Con is a tautology.

13
Valid / Invalid Arguments
■ An argument is invalid if the conclusion is false when all the
premises are true.

■ Alternatively, an argument is invalid if the conjunction of its premises


does not imply the
conclusion.

■ That is, (P1 ∧ P2 ∧ P3 ∧ ... ∧ Pn) → Con is not a tautology.

14
Valid Arguments

15
Invalid Arguments

16
Invalid Arguments

17
Fallacies
■ These resemble the rules of inference, but are based on
contingencies.

‫مغالطہ‬

18
Fallacy of Affirming the Conclusion
■ If you do every problem in this book, then you will learn discrete
mathematics.
■ You have learned discrete mathematics.
■ --------------------------------------
■ You did every problem in this book.

p→q
q .
--------
p .

19
Fallacy of Denying the Hypothesis
■ If you do every problem in this book, then you will learn discrete
mathematics.
■ You DIDN’T do every problem in this book.
■ --------------------------------------
■ You have NOT learned discrete mathematics.

p→q
~p .
--------
~q .

20
21
22
Building Arguments
■ Show that the premises
– “It is not sunny this afternoon and it is colder than yesterday,”
– “We will go swimming only if it is sunny,”
– “If we do not go swimming, then we will take a canoe trip,” and
– “If we take a canoe trip, then we will be home by sunset”
■ lead to the conclusion
– “We will be home by sunset”

23
Step-1: Identify Statements
■ Premises
– “It is not sunny this afternoon and it is colder than yesterday,”
– “We will go swimming only if it is sunny,”
– “If we do not go swimming, then we will take a canoe trip,” and
– “If we take a canoe trip, then we will be home by sunset”
■ Conclusion
– “We will be home by sunset”

24
Step 2: Symbolization
■ Premises
– ~s^c ~s^c
– w→s w→s
~w→c
– ~w→c
c→h
– c→h
--------------
■ Conclusion h
– h

25
Step 3: Applying the Rules
~s^c
--------------
~s Simplification Rule

~s
w→s
-------------- ~s^c
w→s
~w Modus Tollens
~w→c
c→h
--------------
h
26
Step 3: Applying the Rules
~w ~s
~w→c ~w
-------------- c
c Modus Ponens

c
c→h
~s^c
-------------- w→s
h Modus Ponens ~w→c
c→h
--------------
h
27
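The same argument can be checked mechanically using the definition of validity from the earlier slide: the conjunction of the premises must imply the conclusion under every truth assignment. A sketch; the slide reuses the letter c, so here c stands for "it is colder than yesterday" and k for "we will take a canoe trip".

# Sketch: brute-force validity check of the argument above.
from itertools import product

def implies(a, b):
    return (not a) or b

valid = all(
    h
    for s, c, w, k, h in product([True, False], repeat=5)
    if (not s) and c and implies(w, s) and implies(not w, k) and implies(k, h)
)
print(valid)    # True: whenever all the premises hold, the conclusion h holds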
Example
■ Show that the premises
– “If you send me an e-mail message, then I will finish writing the
program,”
– “If you do not send me an e-mail message, then I will go to sleep early,”
and
– “If I go to sleep early, then I will wake up feeling refreshed”
■ lead to the conclusion
– “If I do not finish writing the program, then I will wake up feeling
refreshed”

28
Step-1: Identify Statements
■ Premises
– “If you send me an e-mail message, then I will finish writing the
program,”
– “If you do not send me an e-mail message, then I will go to sleep early,”
and
– “If I go to sleep early, then I will wake up feeling refreshed”
■ Conclusion
– “If I do not finish writing the program, then I will wake up feeling
refreshed”

29
Step 2: Symbolization
■ Premises
– e→f e→f
– ~e→s ~e→s
s→r
– s→r
--------------
■ Conclusion ~f→r
– ~f→r

30
Step 3: Applying the Rules
e→f
--------------
≡ ~ e v f Implication Law

~e→s
-------------- e→f
≡ ~(~e) v s Implication Law ~e→s
≡evs Double Negation Law s→r
------------
~f→r
31
Step 3: Applying the Rules
~evf ~evf
evs evs
--------------
fvs Resolution

s→r
e→f
--------------
~e→s
≡ ~s v r Implication Law s→r
------------
~f→r
32
Step 3: Applying the Rules
fvs ~evf
~s v r evs
fvs
--------------
~s v r
fvr Resolution

fvr e→f
-------------- ~e→s
s→r
≡ ~ f → r Implication Law
------------
(Reversed)
~f→r
33
ALTERNATE SOLUTION
■ We can also solve a problem in a number of ways.

■ Here, we present an alternate way to solve the same problem.

34
Step 3: Applying the Rules
~e→s
s→r
--------------
~ e → r Hypothetical Syllogism

~e→r
--------------
e→f
≡ ~(~e) v r Implication Law ~e→s
≡evr Double Negation Law s→r
------------
~f→r
35
Step 3: Applying the Rules
e→f ~e→r
-------------- evr
≡ ~ e v f Implication Law

~evf
evr e→f
-------------- ~e→s
fvr Resolution s→r
------------
~f→r
36
Step 3: Applying the Rules
fvr ~evf
-------------- evs
~evf
≡ ~ f → r Implication Law
fvr
(Reversed)

e→f
~e→s
s→r
------------
~f→r
37
Rules for Quantified Statements

38
Universal Instantiation
■ If P(x) is true for all x and c is a particular member of the domain,
then P(c) is also true.

All women are wise.


-------------------
Lisa is wise.
(Because she belongs to the domain of all women)
Here the member can be either arbitrary or specific
39
Existential Instantiation
■ If P(x) is true for some x then there is at least one
member of the domain c for which P(c) is also true.

Some students have not attended all the classes.


-------------------
Mr. ABC has not attended all the classes.
(Because Mr. ABC belongs to the domain of students of the class
and he did not attend all the classes)
Here the member is specific not arbitrary

40
Universal Generalization
■ Reverse of Universal Instantiation.
■ If P(c) is true for all elements c in the domain, then ∀xP(x) is also true.

(Lisa) has two eyes.


.
.
.
-------------------
Every human has two eyes.
(Because this is true for all elements of the domain)
Here the member is arbitrary.
41
Existential Generalization

■ Reverse of Existential Instantiation.


■ When a particular element c with P(c) is known, then ∃xP(x) is
also TRUE.

Ali has passed the Discrete Mathematics.


-------------------
At least one student has passed Discrete Mathematics.
Here the member is specific.

42
Composite Rules - 1
■ Universal Instantiation & Modus Ponens are often used together.
We combine them as UNIVERSAL MODUS PONENS

43
Composite Rules - 2
■ Similarly, Universal Instantiation & Modus Tollens are also
combined as UNIVERSAL MODUS TOLLENS

44
Example: UNIVERSAL MODUS PONENS

■ Assume that, for all positive integers n,


– “If n is greater than 4, then n² is less than 2ⁿ”
Use UNIVERSAL MODUS PONENS to show that
– 100² < 2^100

45
Step-1: Identify Statements
■ Premises
– “For all positive integers n, if n is greater than 4, then n² is less than 2ⁿ”

■ Conclusion
– 100² < 2^100

■ We know that
– 100 is greater than 4

46
Step 2: Symbolization
■ Premises
– ∀n (G(n)→L(n)) ∀n (G(n)→L(n))
– G(100) G(100)
--------------
L(100)
■ Conclusion
– L(100)

47
Step 3: Applying the Rules
∀n (G(n)→L(n))
G(100)
--------------
L(100) Or: 100² is less than 2^100

By UNIVERSAL MODUS PONENS


∀n (G(n)→L(n))
G(100)
------------
L(100)
48
Proof
■ A PROOF is a valid argument that establishes the truth of a
mathematical statement.

■ A PROOF can use:


– The hypothesis of a theorem
– Axioms
– Previously proved theorems

■ Final Step:
– The truth of statement is proved

49
Basic Types of Proofs
■ Formal Proofs
– All steps are provided
– One rule used at a time
– Rules for each step are stated

■ Informal Proofs
– Some steps may be skipped
– More than one rule can be used at a step
– Some axioms are assumed
– Rules not explicitly stated

50
Basic Terms
■ THEOREM ‫قضيہ کليہ‬

– It is a statement that can be shown to be true.

– A general proposition not self-evident but proved by a chain of


reasoning.

– A truth established by means of accepted truths.

– Also called FACTS or RESULTS after being proved.

51
Basic Terms
■ LEMMA ‫قضيہ‬

– A less important theorem that is helpful in the proof of other results.


– An intermediate theorem in an argument or proof.

■ COROLLARY ‫ منطقی نتيجہ‬٬ ‫ضمنی نتيجہ‬

– A proposition that follows from (and is often appended to) one already
proved.
– A direct or natural consequence or result of a theorem.

52
Basic Terms
■ CONJECTURE ‫ اندازه‬،‫تخمينہ‬
– An opinion or conclusion formed on the basis of
incomplete information.

– Form an opinion or supposition about (something) on


the basis of incomplete information.

– If it is proved, it becomes theorem; otherwise not.

– On the basis of:


■ Partial evidence ‫استقراء ناقص‬
■ Heuristic argument ‫اﻻشباه و النظائر‬
■ Intuition ‫ ذوق‬،‫ وجدان‬of an expert
53
Basic Terms
■ AXIOM / POSTULATE ‫مسلمہ – اصول – بدیهی‬

– A thing suggested or assumed to be true.

– It is used as the basis for reasoning, discussion, or belief.

– These are considered to be self-evident.

– These are assumed to be true without any proof or demonstration.

54
Methods

55
Basic Concepts
■ An integer n can be even if and only if 2 is one of its factors.
– i.e. n=2 * k

■ An integer n can be odd if and only if


n=2 * k + 1 OR n=2*k–1

Here, k represents some integer.

56
Basic Concepts
■ An integer n is prime if and only if it has only two
factors; one of them is 1.
– i.e. if n=r.s; then r = 1 or s = 1

■ An integer n is composite if and only if it has at


least a pair of factors such that none of them is 1.
– i.e. if n=r.s; then r ≠ 1 and s ≠ 1

Here, r and s represent positive integers.

57
Basic Concepts
■ An integer n is divisible by d if and only if n=d.k for some integer k
and d ≠ 0.

■ An integer n is a perfect square if and only if n = k² for some integer


k.

■ A real number r is rational if and only if r = a/b for some integers a and b with b ≠ 0.


58
Direct Proof
■ In order to directly prove a conditional statement of the form "If p,
then q", it is sufficient to consider the situations in which the
statement p is true.

59
Direct Proof
■ The implication p →q can be proved by showing that if p is true,
then q must also be true.
■ This shows that the combination p true and q false never occurs.

No need to consider where


p=FALSE

60
Practice Questions
■ Prove that the sum of two odd integers is even.

61
Practice Questions
■ Prove that if n is any even integer, then (−1)ⁿ = 1.

62
Practice Questions
■ Prove that the product of an even integer and an odd integer is
even.

63
Practice Questions
■ Prove that the square of an even integer is even.

64
Practice Questions
■ Prove that if n is an odd integer, then n³ + n is even.

65
Practice Questions
■ Prove that, if the sum of any two integers is even, then so is their
difference.

66
Practice Questions
■ Prove that the sum of any two rational numbers is
rational.

67
Practice Questions
■ Given any two distinct rational numbers r and s with r < s. Prove
that there is a rational number x such that r < x < s.

68
69
Practice Questions
■ Prove that the sum of any three consecutive integers is divisible
by 3.

70
Practice Questions
■ Prove the statement: There are real numbers a and b such that

71
72
Proof by Contraposition
■ We know that a conditional statement and its contrapositive are
logically equivalent.

■ So, if we show that the contrapositive is true, then the original


statement will also be true.

■ Here, we directly prove the contrapositive of a statement. Thus, it


is an indirect proof in which the negation of consequence is
proved directly.
73
Proof by Contraposition
■ The method of proof by contrapositive may be summarized as:

– Express the statement in the form if p then q.


p→ q

– Rewrite this statement in the contrapositive form if not q then not p. So it


becomes indirect.
~q→~p

– Prove the contrapositive by a direct proof.

74
Example

75
Example

76
Example

Proof:
Contrapositive: if n is odd, then n³ + 5 is even.
As n is odd, n² is also odd, because the product of two odd integers is odd.
Similarly, n³ will also be odd, being the product of two odd numbers.
As the sum of two odd numbers is always even, n³ + 5 will be an even number.

77
Example

78
Example

79
Proof by Contradiction
Reductio ad absurdum: p → q ≡ (p ∧ ~q) → c
(Law of Reduction to Contradiction)

■ The negation of the implication p → q is


~(p → q) ≡ ~(~p ∨ q)
≡ ~~p ∧ ~q De-Morgan’s Law
≡ p ∧ ~q

■ If the implication is true, then its negation must be false, i.e., leads to a
contradiction.

■ Hence p → q ≡ (p ∧ ~q) → c, where c is a contradiction.


80
Proof by Contradiction
■ Thus, to prove an implication p → q by the contradiction
method, we suppose that the condition p and the
negation of the conclusion q, i.e., (p ∧ ~q), is true and
ultimately arrive at a contradiction.

■ The method of proof by contradiction, may be


summarized as follows:
– Suppose the statement to be proved is false.
– Show that this supposition is also false and hence
leads logically to a contradiction.
– Conclude that the statement to be proved was true.

81
Example

82
Example

83
Example

84
Example

85
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 10, 11 & 12


MATHEMATICAL INDUCTION
STRONG INDUCTION
INSTRUCTOR: Sulaman Ahmad Naz
Mathematical Induction
■ Mathematical induction is a mathematical proof technique.

■ It requires two cases to be proved:


1. The first case, called the BASIS STEP, proves that the property holds for
the number 0.

2. The second case, called the INDUCTION STEP, proves that, if the property
holds for one natural number n, then it holds for the next natural
number n + 1.

2
Mathematical Induction
■ These two steps establish the property P(n) for
every natural number n = 0, 1, 2, 3, ...

■ The BASIS STEP need not begin with zero.

■ Often it begins with the number one, and it can


begin with any natural number, establishing the
truth of the property for all natural numbers
greater than or equal to the starting number.

3
Mathematical Induction
■ Logically, it can be expressed as:

Basis Step


Induction Step

4
Example: Summations

5
Example: Summations

6
Task 1

7
Task 1

8
Task 2

9
Task 2

10
Example: Divisibility

11
Example: Divisibility

12
Task 3

13
Task 3

14
Example: Inequality

15
Example: Inequality

16
Task 4

17
Task 5

18
Task 5

19
Example: Property of Sequence
5ak-1

20
Example: Property of Sequence
5ak-1

21
Task 6

22
Task 6

23
Task 6

So

or As k>1, so it is obvious that the equality is true.

24
The Well Ordering Principle
■ It is one of those facts in mathematics that are very obvious.
■ Statement:
Any nonempty subset of natural numbers has a least/minimum
element.

■ This is only true for the natural numbers.


■ It is not true for the Integers, the Rational Numbers, the Real
numbers and the Complex numbers.
The Well Ordering Principle
■ Let A be a set such that:
– A ⊆ N.
– A ≠ ∅.
So, not only does the set have to be a subset of the natural numbers, it also has to be nonempty.
■ Then there is some natural number m such that:
– m ∈ A.
– m ≤ a, for all a ∈ A.
This m is the minimum of (less than or equal to) all the elements in A and, most
important, m is in A.
Example 1
■ If A={2,5,7,10,22}, then it is clear that
– A ⊆ N.
– A ≠ ∅.

■ The well ordering principle says that A has a minimum.


– In this case, m=2 is the minimum of A.
Example 2
■ If A = {n ∈ N | n² + 3n − 200 > 0}, then it is clear that
– A ⊆ N by the definition of A
– A ≠ ∅ needs to be checked

The well ordering principle does not tell us


what “m” is exactly; it just tells us that “m” exists

■ The well ordering principle says that A has a minimum.


– In this case, m=13 is the minimum of A.
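■ As a quick sanity check (not part of the original slides), the least element promised by the well ordering principle can be found by simply scanning upward from 1. The Python sketch below does this for the set A of Example 2; the helper name least_element is just an illustrative choice.

def least_element(predicate, start=1, limit=10**6):
    # Return the smallest natural number n >= start with predicate(n) true.
    # The well ordering principle guarantees such an n exists whenever the
    # set {n : predicate(n)} is a nonempty subset of the naturals; the limit
    # only protects this sketch against an empty (or very sparse) set.
    for n in range(start, limit):
        if predicate(n):
            return n
    return None  # predicate never held up to the limit

# A = {n in N | n^2 + 3n - 200 > 0}
print(least_element(lambda n: n*n + 3*n - 200 > 0))   # prints 13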
Example 3
■ If A = {n ∈ N | n·sin(2n) > 8}, then it is clear that
– A ⊆ N by the definition of A
– A ≠ ∅ needs to be checked

The well ordering principle does not tell us


what “m” is exactly; it just tells us that “m” exists

■ The well ordering principle says that A has a minimum.


– In this case, m=10 is the minimum of A.
Mathematical Induction is a Valid Proving Technique

Premises:
P(1)
∀k (P(k) → P(k+1))
-------------------------------
∴ ∀n P(n)

Proof by Contradiction:
Suppose ∀n P(n) is false. It means there are some positive integers for which P(n) is
false. Let S be the set of these integers and let k be its least element (by the well
ordering principle), so P(k) is false.
We know that 1 ∉ S, since P(1) is the first premise. Hence 1 < k.
In other words, k-1 is also a positive integer.
As k-1 < k, so k-1 ∉ S and P(k-1) must be true.
From the second premise, using modus ponens, P(k) is
also true, which leads to a contradiction.
Hence, ∀n P(n) must be true.
Types of Induction
■ Weak Induction
– Mathematical Induction
– Principle of mathematical Induction
– Incomplete Induction

■ Strong Induction
– Second Principle of Mathematical Induction
– Complete Induction

■ Surprisingly, both are just equivalent.


Weak Induction vs. Strong Induction

Weak Induction:                        Strong Induction:
P(1)                                   P(1)
∀k (P(k) → P(k+1))                     ∀k ((P(1) ∧ P(2) ∧ … ∧ P(k)) → P(k+1))
-------------------------------        -----------------------------------------------------------
∴ ∀n P(n)                              ∴ ∀n P(n)
Example
■ Prove that for all n>1, n is divisible by a prime number.

■ We will use strong mathematical induction to prove it.


■ P(n) : n is divisible by a prime number, for all n>1
P(2)
∀k ((P(2) ∧ P(3) ∧ … ∧ P(k)) → P(k+1))
-----------------------------------------------------------
∴ ∀n P(n)
Example
■ BASE CASE:
P(2) : 2 is divisible by a prime number.
It is TRUE as every number is divisible by itself.
■ INDUCTIVE CASE:
We assume that P(2), P(3), …, P(k) are all true,
Now, if P(2), P(3), …, P(k) are all true,
then it must imply that P(k+1) is also true.
If it happens, then we can conclude that n P(n) is also true
for all n>1.
Example
So, we assume that P(2), P(3), …, P(k) are all true,
Now k+1 can be either a prime or a composite number.
If it is prime, then nothing to prove more as every number is divisible by
itself.
If it is a composite number, then it can be written as a product of two numbers:
k+1 = a·b, where 1 < a, b < k+1
As “a” and “b” are integers between 2 and k, the inductive hypothesis tells us that both of
these are divisible by a prime number, and hence so is their product k+1.
In other words, P(k+1) is TRUE, i.e., k+1 is also divisible by a prime number.
Hence, we can conclude that for all n>1, n is divisible by a prime number.
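■ The strong-induction argument above is constructive, and it can be mirrored by a small program. The following Python sketch (my own illustration, not code from the slides; the name smallest_prime_factor is hypothetical) returns a prime divisor of any n > 1.

def smallest_prime_factor(n):
    # Mirrors the strong-induction argument: either n itself is prime,
    # or n = a*b with 2 <= a <= b < n, and the smallest divisor a > 1 is prime.
    assert n > 1
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d       # the smallest divisor greater than 1 is necessarily prime
        d += 1
    return n               # no proper divisor found, so n itself is prime

for n in range(2, 50):
    assert n % smallest_prime_factor(n) == 0   # every n > 1 has a prime divisor
print(smallest_prime_factor(97), smallest_prime_factor(98))   # 97 2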
Practice Questions
1. If n ∈ Z, n > 1, then n can be written as a product of primes.

2. Show that any amount in cents ≥ 8 cents can be obtained


using 3 cents and 5 cents coins only.
3. Prove the generalized De Morgan law for sets when n ≥ 2, i.e., that the
complement of a union equals the intersection of the complements:
(A1 ∪ A2 ∪ … ∪ An)ᶜ = A1ᶜ ∩ A2ᶜ ∩ … ∩ Anᶜ
4. Prove that postage of any amount ≥ 12 cents can be formed

using only 4 cent and 5 cent stamps.
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 13, 14 & 15


INTRODUCTION TO TIME COMPLEXITY,
TIME COMPLEXITY OF LOOPS
INSTRUCTOR: Sulaman Ahmad Naz
Time Complexity of Algorithms
■ When we talk about time complexity of algorithms, what do we
actually mean?

■ It is not the actual running time, because actual running time is not a standard measure.

– It varies from machine to machine & compiler to compiler.
– It even varies on one machine due to the availability of resources.
Time Complexity of Algorithms
■ We actually calculate the estimation, not actual running time.
■ Complexity can be viewed as the maximum number of primitive
operations that a program may execute.
■ Regular operations are:
– Addition
– Multiplication
– Assignment
– Accessing array element, etc.
■ We may leave some operations uncounted and concentrate on
those that are performed the largest number of times.
Time Complexity of Algorithms
■ We define complexity as a numerical function T(n) - time versus
the input size n.

■ We want to define time taken by an algorithm without depending


on the implementation details.

■ T_algorithm#1(n) = n(n-1) = n² − n    //quadratic time

■ T_algorithm#2(n) = 2(n-1) = 2n − 2    //linear time


Asymptotic Complexity
■ The resulting function gives only an
approximate measure of efficiency of the
original function.

■ However, this approximation is


sufficiently close to the original.

■ This measure of efficiency is called


asymptotic complexity.
Elimination of Lower Order Terms
■ Any terms that do not substantially change the function’s
magnitude should be eliminated from the function.

■ Consider the following quadratic function:

■ For small values of n, the last term, 1,000, is the largest. But
we calculate time complexity to see the behavior on large input
sets.
Elimination of Lower Order Terms
≈ n²
Asymptotic Notations
■ Asymptotic notations are mathematical tools to represent the
complexity of algorithms for asymptotic analysis.

1. Big-Oh (O) Notation


2. Big-Omega (Ω) Notation
3. (Big) Theta (Θ) Notation
4. Small-Oh (o) Notation
5. Small-Omega (ω) Notation
Big-Oh (O) Notation
■ The Big O notation defines an upper bound of an algorithm.

■ It bounds a function only from above.

■ It is the lowest upper bound.

■ It specifically describes the worst case scenario.


Big-Oh (O) Notation
f(n) ≤ c·g(n)   for all n ≥ N₀, with c > 0 and N₀ ≥ 1

Let f(n) be the function of growth of time for some input n.
Then there is a function g(n) such that c·g(n) is the lowest upper bound of f(n).
In other words, after some value N₀, the value of c·g(n) will always be greater than or equal to f(n).
Example
f(n) ≤ c·g(n)   for all n ≥ N₀, with c > 0 and N₀ ≥ 1

■ Let f(n)=3n+2 and g(n)=n. Can we say that f(n)=O(g(n))?

■ Solution:
f(n) ≤ c·g(n)
⇒ 3n+2 ≤ c·n
Now, c can be any number greater than or equal to 4.
⇒ 3n+2 ≤ 4n
and N₀=2. Hence f(n) = O(g(n)) = O(n)
Big-Omega (Ω) Notation
■ The Big Ω notation defines a lower bound of an algorithm.

■ It bounds a function only from below.

■ It is the greatest lower bound.

■ It specifically describes the best case scenario.


Big-Omega (Ω) Notation
f(n) ≥ c·g(n)   for all n ≥ N₀, with c > 0 and N₀ ≥ 1

Let f(n) be the function of growth of time for some input n.
Then there is a function g(n) such that c·g(n) is the greatest lower bound of f(n).
In other words, after some value N₀, the value of c·g(n) will always be less than or equal to f(n).
Example
f(n) ≥ c·g(n)   for all n ≥ N₀, with c > 0 and N₀ ≥ 1

■ Let f(n)=3n+2 and g(n)=n. Can we say that f(n)=Ω(g(n))?

■ Solution:
f(n) ≥ c·g(n)
⇒ 3n+2 ≥ c·n
Now, c can be any number smaller than or equal to 3.
⇒ 3n+2 ≥ 3n
and N₀=1. Hence f(n) = Ω(g(n)) = Ω(n)
(Big)-Theta (Θ) Notation
■ The Theta (Θ) notation defines both an upper and a lower bound of
an algorithm.

■ It bounds a function both from above and below.

■ It is also called a tight bound.

■ It is used when we have the same function as the lower as well as the


upper bound.
(Big)-Theta (Θ) Notation
c₁·g(n) ≤ f(n) ≤ c₂·g(n)   for all n ≥ N₀, with c₁, c₂ > 0 and N₀ ≥ 1

Let f(n) be the function of growth of time for some input n.
Then there is a function g(n) such that c₁·g(n) is the greatest lower bound and c₂·g(n) is the lowest upper bound of f(n).
In other words, after some value N₀, the value of c₁·g(n) will always be less than or equal to f(n) and the value of c₂·g(n) will always be greater than or equal to f(n).
(Figure: f(n) sandwiched between c₁·g(n) and c₂·g(n).)
Example
c₁·g(n) ≤ f(n) ≤ c₂·g(n)   for all n ≥ N₀, with c₁, c₂ > 0 and N₀ ≥ 1

■ Let f(n)=3n+2 and g(n)=n. Can we say that f(n)=Θ(g(n))?

■ Solution:
c₁·g(n) ≤ f(n) ≤ c₂·g(n)
From the previous two examples, it is obvious that:
3n ≤ 3n+2 ≤ 4n
where N₀=2.
Hence f(n) = Θ(g(n)) = Θ(n)
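■ The constants chosen above are easy to verify numerically. The short Python snippet below (an illustrative sketch, not part of the slides) checks that 3n ≤ 3n+2 ≤ 4n for every n ≥ N₀ = 2 over a sample range.

def f(n): return 3 * n + 2     # f(n) from the example
def g(n): return n             # g(n) from the example

c1, c2, N0 = 3, 4, 2           # the constants found in the previous examples
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(N0, 10000))
print("3n <= 3n+2 <= 4n holds for all tested n >= 2")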
Small-Oh (o) Notation
■ The small O notation also defines an upper bound of an
algorithm.

■ It bounds a function only from above.

■ It is just an upper bound.

■ Big-Oh is the least upper bound while it is just any upper bound.
Small-Omega (ω) Notation
■ The small ω notation also defines a lower bound of an algorithm.

■ It bounds a function only from below.

■ It is just a lower bound.

■ Big-Omega is the greatest lower bound while it is just any lower


bound.
Asymptotic Notations
(Figure: growth of running time versus input size n, annotated with asymptotic bounds such as Ω(n) and ω(n).)
Growth of functions
Usually growth of functions
can be categorized into the
following:
– Constant time
– Logarithmic time
– Linear time
– N-logarithmic time
– Polynomial time
– Exponential time
– Factorial time
Remarks
■ Most of the time, we are concerned with the worst
case complexity of algorithms.

■ It means we want to rate the algorithm in the


appropriate category, such that on any input, what will
be the maximum time required to transform input to
output.

■ So, most of the time, we will use the Big-Oh notation.


Input Cases & Analysis Cases
■ There are three types of input cases.
– Best Case
– Worst Case
– Arbitrary/Random Case

■ There are three types of analysis cases.


– Best Case Analysis
– Worst Case Analysis
– Average Case Analysis
Example
■ Lets take the example of Linear Sequential Search.
■ It scans the array elements one by one and searches for the
required value.
■ If there are n elements in the array, and key is the value to be
searched, then its algorithm may be roughly refined as:
– found=false
– For i = 1 to n
– If Array[i]=key
– found=true & stop;
– End if
– End for
Example
(Figure: an array of elements num_1, num_2, num_3, num_4, …, num_n)

■ There are three types of input cases.
– Best Case: key = num_1
– Worst Case: key = num_n, or key not found
– Arbitrary/Random Case: key = any number

■ There are three types of analysis cases.
– Best Case Analysis: 1 comparison = Ω(1)
– Worst Case Analysis: n comparisons = O(n)
– Average Case Analysis: (n+1)/2 comparisons = O(n)
Time Complexity of Algorithms
■ We have already talked about how to get the time complexity function
of any algorithm.

■ We consider the basic operations (computational steps) only:


– Arithmetic operations
– Logical operations
– Relational operations
– Accessing the element of an array
– Assignment operation

■ We do not consider all the operations. We consider only the most


dominant ones as we need the approximation.
Time Complexity of Iterations
■ What if a set of instructions is repeated many times in an
algorithm?

■ Usually a set of code is repeated using either of the two


techniques:
– Loops
■ For Loop
■ While Loop
– Recursions
Iterative vs. Recursive Time

Iterative:
A()
{
  For j = 1 to n
    Print “Hello”
}

Recursive:
A(n)
{
  If (-----)
    A(n/2)
}
Time Complexity (Iterations)
For i=1 to n
  <computational instruction>
T(n) = n = O(n)
-----------------------------------------------------------
For i=1 to n
  For j=1 to n
    <computational instruction>
T(n) = n·n = n² = O(n²)

i | 1 | 2 | 3 | 4 | 5 | 6 | 7 | … | n
j | n | n | n | n | n | n | n | … | n
Time Complexity (Iterations)
For i=1 to n
  For j=1 to n
    For k=1 to n
      <computational instruction>
T(n) = n·n·n = n³ = O(n³)
----------------------------------------------------------------
For i=1 to n
  For j=1 to n
    For k=1 to n
      For l=1 to n
        <computational instruction>
T(n) = n·n·n·n = n⁴ = O(n⁴)
Time Complexity (Iterations)
i=1
s=1
While(s<=n)
  i=i+1
  s=s+i
  <computational instruction>
T(n) = k   (k = number of iterations of the while loop)

s | 1 | 3 | 6 | 10 | 15 | 21 | 28 | … | n
i | 1 | 2 | 3 | 4  | 5  | 6  | 7  | … | k
Things to remember
Sum of 1st m natural numbers=m(m+1)/2

Time Complexity (Iterations)


i=1
s=1
While(s<=n)
  i=i+1
  s=s+i
  <computational instruction>

s | 1 | 3 | 6 | 10 | 15 | 21 | 28 | … | n
i | 1 | 2 | 3 | 4  | 5  | 6  | 7  | … | k

The loop stops after k iterations, when s first exceeds n:
n = k(k+1)/2  ⇒  2n = k² + k  ⇒  k ≈ √(2n)
T(n) = k = O(√n)
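■ The √n behaviour can be confirmed empirically. Here is a small Python sketch (my own illustration, not from the slides) that counts the iterations of this loop and compares the count with √(2n).

import math

def count_iterations(n):
    # Simulate: i=1; s=1; while s <= n: i += 1; s += i; <instruction>
    i, s, count = 1, 1, 0
    while s <= n:
        i += 1
        s += i
        count += 1
    return count

for n in (100, 10000, 1000000):
    print(n, count_iterations(n), round(math.sqrt(2 * n), 1))   # count stays close to sqrt(2n)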
Time Complexity (Iterations)

For i=1; i²<=n; i++
  <computational instruction>

The loop runs while i² <= n, i.e., while i <= √n.
T(n) = O(√n)
Time Complexity (Iterations)
For i=1 to n
  For j=1 to i
    For k=1 to 100
      <computational instruction>

T(n) = 100 + 200 + … + n·100
     = 100(1 + 2 + … + n)
     = 100·n(n+1)/2
     = O(n²)

i | 1           | 2           | 3           | 4           | … | n
j | 1 time      | 2 times     | 3 times     | 4 times     | … | n times
k | 1×100 times | 2×100 times | 3×100 times | 4×100 times | … | n×100 times
Things to remember
Sum of squares of 1st m natural numbers=m(m+1)(2m+1)/6

Time Complexity (Iterations)


For i=1 to n
  For j=1 to i²
    For k=1 to n/2
      <computational instruction>

T(n) = 1(n/2) + 4(n/2) + 9(n/2) + … + n²(n/2)
     = (n/2)(1 + 4 + 9 + … + n²)
     = (n/2)(1² + 2² + 3² + … + n²)
     = (n/2)·n(n+1)(2n+1)/6
     = O(n⁴)

i | 1             | 2             | 3             | 4              | … | n
j | 1 time        | 4 times       | 9 times       | 16 times       | … | n² times
k | (n/2)×1 times | (n/2)×4 times | (n/2)×9 times | (n/2)×16 times | … | (n/2)×n² times
Time Complexity (Iterations)
i 1 2 4 8 … n

For i=1 to n & i=i*2


<computational instruction>
---------------------------
For i=1 to n & i=i*3
<computational instruction>
---------------------------
For i=1 to n & i=i*m
<computational instruction>
Time Complexity (Iterations)
i | 1=2⁰ | 2=2¹ | 4=2² | 8=2³ | … | n=2^k

For i=1 to n & i=i*2
  <computational instruction>
---------------------------
For i=1 to n & i=i*3
  <computational instruction>
---------------------------
For i=1 to n & i=i*m
  <computational instruction>

For the first loop, the counter doubles until it exceeds n, so with n = 2^k:
T(n) = k+1 = log₂n + 1 = O(log₂n)
(n = 2^k  ⇒  log₂n = log₂2^k = k·log₂2 = k, i.e., k = log₂n)
Time Complexity (Iterations)
i | 1=2⁰ | 2=2¹ | 4=2² | 8=2³ | … | n=2^k

For i=1 to n & i=i*2
  <computational instruction>
T(n) = k+1 = log₂n + 1 = O(log₂n)       [n = 2^k ⇒ k = log₂n]
---------------------------
For i=1 to n & i=i*3
  <computational instruction>
T(n) = O(log₃n)
---------------------------
For i=1 to n & i=i*m
  <computational instruction>
T(n) = O(log_m n)
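■ A quick empirical check of the logarithmic behaviour (an illustrative Python sketch, not part of the slides):

import math

def count_multiplying(n, m):
    # Count iterations of: for i = 1 to n with i = i*m after each pass.
    i, count = 1, 0
    while i <= n:
        count += 1
        i *= m
    return count

for n in (16, 1024, 1000000):
    print(n, count_multiplying(n, 2), math.log2(n) + 1)      # roughly log2(n) + 1
    print(n, count_multiplying(n, 3), math.log(n, 3) + 1)    # roughly log3(n) + 1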
Time Complexity (Iterations)
For i=n/2 to n
  For j=1 to n/2
    For k=1 to n & k=k*2
      <computational instruction>
T(n) = (n/2)(n/2)·log₂n = O(n²·log₂n)
-----------------------------------------------------------
For i=n/2 to n
  For j=1 to n & j=2*j
    For k=1 to n & k=k*2
      <computational instruction>
T(n) = (n/2)·log₂n·log₂n = O(n·(log₂n)²)
Time Complexity (Iterations)

While(n>1)
  n=n/2
  <computational instruction>
T(n) = O(log₂n)

“i = 1 to n with i=i*2”  is the same as  “i = n to 1 with i=i/2”

Harmonic number: H_n = 1 + 1/2 + 1/3 + … + 1/n ≈ ln n + γ,
where γ = 0.5772156649… (the Euler–Mascheroni constant)

Time Complexity (Iterations)


For i=1 to n
  For j=1 to n & j=j+i
    <computational instruction>

T(n) = n + n/2 + n/3 + … + n/n
     = n(1 + 1/2 + 1/3 + … + 1/n)
     = n·H_n
     ≈ n·ln n
     = O(n·ln n)

i | 1       | 2         | 3         | … | n
j | n times | n/2 times | n/3 times | … | n/n times
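■ The harmonic-sum behaviour can also be checked empirically; the sketch below (my own, not from the slides) counts the inner-loop iterations and compares them with n·ln n.

import math

def count_harmonic(n):
    # Count iterations of: for i = 1..n, for j = 1..n with j = j + i.
    count = 0
    for i in range(1, n + 1):
        j = 1
        while j <= n:
            count += 1
            j += i
    return count

for n in (100, 1000, 10000):
    print(n, count_harmonic(n), round(n * math.log(n)))   # roughly n * ln(n)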
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 16, 17 & 18


RECURSIVE ALGORITHMS,
RECURSIVE FUNCTIONS
INSTRUCTOR: Sulaman Ahmad Naz
Time Complexity of Iterations
■ What if a set of instructions is repeated many times in an
algorithm?

■ Usually a set of code is repeated using either of the two


techniques:
– Loops
■ For Loop
■ While Loop
– Recursions
Iterative vs. Recursive Time

Iterative:
A()
{
  For j = 1 to n
    Print “Hello”
}

Recursive:
A(n)
{
  If (-----)
    A(n/2)
}
Recursions
■ A function calls itself repeatedly

■ QUESTION: For how long a function will call itself?

■ ANSWER: Slight modification in parameters

■ PURPOSE: To reach the ANCHOR condition

■ ANCHOR CONDITION: The point at which the recursive calls


will come to an end
4
Example
Algo(n)
{
//some lines of code
If n=1
return 1
else
return Algo(n-1)
}
-------------------------------------------------------------------------
T(n)=c+T(n-1) for n>1; otherwise T(n) =c for n=1
-------------------------------------------------------------------------
Here, “c” is the time complexity of “//some lines of code” and the comparison
And, T(n-1) is the time complexity of the next recursive call
5
Example
(Recursion chain: Algo(n) → Algo(n-1) → Algo(n-2) → … → Algo(1), with cost c at each call)

T(n) = c + c + … + c
     = n·c
     = O(n)

6
Methods of Solution

7
Back Substitution Method

■BASIC IDEA:
■ Substitute the value of next iteration into
current iteration and so on.
■ Do this until the last recursive call.
■ Last recursive call is controlled by the
Anchor condition.

8
Back Substitution Method
■ Let’s consider an example:
T(n) =T(n-1)+c, for n>1
T(1) =c  This is the Anchor Condition
-------------------------------------------------------------------------
T(n) =T(n-1)+c …eq(i)
T(n-1) =T(n-1-1)+c
=T(n-2)+c …eq(ii)
T(n-2) =T(n-2-1)+c
=T(n-3)+c …eq(iii)
9
Back Substitution Method
Putting eq(ii) in eq(i): T(n) =T(n-1)+c …eq(i)
T(n-1) =T(n-2)+c …eq(ii)
T(n) =T(n-1)+c T(n-2) =T(n-3)+c …eq(iii)

=T(n-2)+c+c
=T(n-2)+2c …eq(iv)
Putting eq(iii) in eq(iv):
T(n) = T(n-3)+c+2c
=T(n-3)+3c
Suppose we reach the anchor condition after k calls:
T(n) =T(n-k)+kc ..eq(v)
10
Back Substitution Method
As we know that T(1) will be the last recursive call,
so for what value of k will T(n-k) become T(1).
n-k=1  k=n-1
So eq(v) will become:
T(n) =T(n-k)+kc
=T(1)+(n-1)c
= c+nc-c
= O(n)

11
Back Substitution Method
■ Let’s consider another example:
T(n) =T(n/2)+n, for n>1
T(1) =n  This is the Anchor Condition
-------------------------------------------------------------------------
T(n) =T(n/2)+n …eq(i)
T(n/2) =T(n/2/2)+n/2
=T(n/22)+n/2 …eq(ii)
T(n/22) =T(n/22/2)+n/22
= T(n/23)+n/22 …eq(iii)
12
Back Substitution Method
Putting eq(ii) in eq(i): T(n) =T(n/2)+n …eq(i)
T(n/2) =T(n/22)+n/2 …eq(ii)
T(n) =T(n/2)+n T(n/22) = T(n/23)+n/22 …eq(iii)

=T(n/22)+n/2+n
=T(n/22)+n/2+n …eq(iv)
Putting eq(iii) in eq(iv):
T(n) =T(n/23)+n/22+n/2+n
=T(n/23)+n/22+n/2+n
Suppose we reach the anchor condition after k calls:
T(n) =T(n/2k)+n/2k-1+…+n/22+n/2+n ..eq(v)
13
Back Substitution Method
As we know that T(1) will be the last recursive call,
so for what value of 2k will T(n/2k) become T(1).
n/2k=1  n=2k
So eq(v) will become:
T(n) = T(n/2k)+n/2k-1+…+n/22+n/2+n
= T(1)+ n/2k-1+…+n/22+n/2+n
= n + n/2k-1+…+n/22+n/2+n
= n+n(1/2k-1+…+1/22+1/2+1)
 n+n(1+1) = n+2n = 3n = O(n)
14
Explanation

*Pictures taken from Wikipedia


Explanation
Recursion Tree Method

■BASIC IDEA:
■ Draw branches of tree for each recursive
call.
■ Do this until the last recursive call.
■ Last recursive call is controlled by the
Anchor condition.

17
Recursion Tree Method
■ Let’s consider an example:
T(n) = 2T(n/2)+c, for n>1
T(1) = c  This is the Anchor Condition
-------------------------------------------------------------------------
Note that here, at each step, 2 branches will be drawn; decreasing
the input size by a factor of 2 at each step.
And a total of c work will be done at each splitting.

18
T(n) c

T(n/2) T(n/2) 2c

T(n/22) T(n/22) T(n/22) T(n/22) 4c

T(n/23) T(n/23) 8c

T(n/2k) T(n/2k)
=T(1) =T(1)

19
T(n) 2 0c

T(n/2) T(n/2) 21c

T(n/22) T(n/22) T(n/22) T(n/22) 2 2c

T(n/23) T(n/23) 2 3c

T(n/2k) T(n/2k) 2kc


=T(1) =T(1)

20
Recursion Tree Method
As we know that T(1) will be the last recursive call, so for
what value of 2^k will T(n/2^k) become T(1)?
n/2^k = 1  ⇒  n = 2^k
T(n) = 2⁰c + 2¹c + 2²c + … + 2^k·c
     = c(2⁰ + 2¹ + 2² + … + 2^k)
     = c(2⁰ + 2¹ + 2² + … + n)
     = c·(total number of nodes in the tree)
We know that, in a complete binary tree, if there are n
leaf nodes, then the tree has a total of 2n-1 nodes.
T(n) = c(2n-1) = O(n)
Recursion Tree Method
■ Let’s consider another example:
T(n) = T(n-1)+n, for n>1
T(1) = n  This is the Anchor Condition
-------------------------------------------------------------------------
Note that here, at each step, only 1 branches will be drawn;
decreasing the input size by 1 at each step.
And a total of n work will be done at branching.

22
T(n)            n
T(n-1)          n-1
T(n-2)          n-2
T(n-3)          n-3
…
T(n-k) = T(1)   n

T(n) = n + (n-1) + (n-2) + … + 2 + T(1)
     = n + (n-1) + (n-2) + … + 2 + 1 - 1 + n
     = n(n+1)/2 - 1 + n
     = O(n²)
23
Master Theorem
■ The master theorem provides a solution in asymptotic
terms (using Big  or Big O notations) for recurrence
relations of types that occur in the analysis of many
divide and conquer algorithms.
■ The function must be in some specific pattern.
■ Not all the problems can be solved by Master Theorem.
■ It has various versions; we will use an easy one.

24
Master Theorem
■ Function Pattern:

– T(n) = aT(n/b) + Θ(n^c · log^d n)

– Where a, b, c, d are constants


– Constraints:
■ a >= 1
■ b > 1
■ c >= 0
■ d is any real number
25
Master Theorem
■ Now, we have three possibilities:
1. If a > b^c, then T(n) = Θ(n^(log_b a))
2. If a = b^c, then:
– If d > -1, then T(n) = Θ(n^(log_b a) · log^(d+1) n)
– If d = -1, then T(n) = Θ(n^(log_b a) · log log n)
– If d < -1, then T(n) = Θ(n^(log_b a))
3. If a < b^c, then:
– If d >= 0, then T(n) = Θ(n^c · log^d n)
– If d < 0, then T(n) = O(n^c) 26
Master Theorem

Let’s consider an example:


T(n) = 3T(n/2) + n²
Here, a=3, b=2, c=2, and d=0
Now, b^c = 2² = 4
Here a < b^c, so the third possibility is our selection.
As d=0, so the solution will be:
T(n) = Θ(n²·log⁰n) = Θ(n²)
27
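■ The case analysis is mechanical enough to encode. Below is a hedged Python sketch of the simplified master theorem as listed two slides earlier (the function name master and its string output are my own choices; the d = -1 sub-case follows my reading of the slides).

import math

def master(a, b, c, d):
    # Simplified master theorem for T(n) = a*T(n/b) + Theta(n^c * log^d n),
    # assuming a >= 1, b > 1, c >= 0.
    e = math.log(a, b)                    # the exponent log_b(a)
    if a > b ** c:
        return f"Theta(n^{e:g})"
    if a == b ** c:
        if d > -1:
            return f"Theta(n^{e:g} * log^{d + 1:g} n)"
        if d == -1:
            return f"Theta(n^{e:g} * log log n)"
        return f"Theta(n^{e:g})"
    return f"Theta(n^{c:g} * log^{d:g} n)" if d >= 0 else f"O(n^{c:g})"

print(master(3, 2, 2, 0))   # the example above  ->  Theta(n^2 * log^0 n), i.e. Theta(n^2)
print(master(4, 2, 2, 0))   # practice question 1: T(n) = 4T(n/2) + n^2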
Master Theorem
■ Here are some practice questions.
■ Do solve them by yourselves and consult in case of
any problem.
1. T(n) = 4T(n/2) + n²
2. T(n) = T(n/2) + n²
3. T(n) = 2^n·T(n/2) + n^n
4. T(n) = 16T(n/4) + n
5. T(n) = 2T(n/2) + n·log n
6. T(n) = 2T(n/2) + n/log n
7. T(n) = 2T(n/4) + n^0.51

28
Announcement
■ Online Quiz #1

■ Topic:

– Time Complexity of Algorithms


– See Lectures 13 to 18

■ Date:
– Thursday, 17 December, 2020
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 19, 20 & 21


INTRODUCTION TO
SPACE COMPLEXITY
INSTRUCTOR: Sulaman Ahmad Naz
Space Complexity
■ Until now, we mainly focused on the Time Complexity
of algorithms.

■ Now, we will see:


– What is Space Complexity
– How to calculate it?
– What to consider and what to not?

2
Space Complexity
■ Given an algorithm, how much space (memory) is required to
finish the job.

■ Job: Transform Input to Output

■ We define Space Complexity in terms of the input size, n.

■ By space, we mean the extra memory being used. Original input,


n, is not included.

3
Order of Growth
Just like Time Complexity,
the order of growth of
Space Complexity can also
be categorized as:
– Constant space
– Logarithmic space
– Linear space
– N-logarithmic space
– Polynomial space
– Exponential space
– Factorial space
4
Asymptotic Notations
(Figure: growth of space versus input size n, annotated with asymptotic bounds such as Ω(n) and ω(n).)

5
Remarks
■ Most of the time, we are concerned with the worst
case complexity of algorithms.

■ It means we want to rate the algorithm in the


appropriate category, such that on any input, what will
be the maximum space required to transform input to
output.

■ So, most of the time, we will use the Big-Oh notation.


6
Input Cases & Analysis Cases
■ There are three types of input cases.
– Best Case
– Worst Case
– Arbitrary/Random Case

■ There are three types of analysis cases.


– Best Case Analysis
– Worst Case Analysis
– Average Case Analysis

7
Example 1
(Figure: input array with cells 1 2 3 4 5 … n)
Algo( Array[n] )
{
  For i = 1 to n
    Array[i] = i;
}
S(n) ≠ n + 1   (the input array itself is not counted as extra space)
S(n) = 1 = O(1) = O(c)   (only the loop variable i is extra)

8
Example 2
(Figure: input array with cells 1 2 3 4 5 … n)
Algo( Array[n] )
{
  For i = 1 to n
    Array[i] = j;
}
S(n) ≠ n + 2   (the input array itself is not counted as extra space)
S(n) = 2 = O(2) = O(1) = O(c)   (only the two variables i and j are extra)

9
Example 3
(Figure: input array Array[1..n] and a new array B[1..n])
Algo( Array[n] )
{
  New B[n];
  For i = 1 to n
    B[i] = Array[i];
}
S(n) = n + 1   (n cells for B plus the loop variable i)
S(n) = O(n)

10
Example 4
(Figure: input array Array[1..n] and a new 2-D array B[n][n])
Algo( Array[n] )
{
  New B[n,n];
  For i = 1 to n
    For j = 1 to n
      B[i,j] = Array[i] * j;
}
S(n) = n² + 2   (n² cells for B plus the two loop variables i and j)
S(n) = O(n²)
11
Space Complexity of Recursive Algorithms
■ Before starting to analyze the space
complexity of recursive algorithms, we
need to know how recursions actually
work.

■ There is also a concept of the CALL


STACK or RECURSION STACK in this
respect.

12
STACK
■ Suppose you want to put these balls
on this tower.

13
STACK
■ First, we put a red ball.

14
STACK
■ Now, we put a green ball.

15
STACK
■ Next, we put a black ball on top of it.

16
STACK
■ At the end, we put the blue ball on
top of it.

17
STACK
■ Now, please answer the following questions.

– Can we get the black ball before getting the


blue ball?
– Can we insert a new ball between green and
red ball?

■ The answer to both of these questions is:


NO

18
STACK
■ It is because, there is only one way to
remove or insert a ball.
– We can insert any ball only at the top of other
balls (if any)
– We can remove only the topmost ball at a time.
– We have no direct access to any other balls,
except the one at the top.

■ Such a list is called a “Stack”.

19
STACK
■ In a stack, we have two operations:
– Push(X)
– Pop()

■ Push(X) means to insert “X” at the top of the


stack.
■ Pop() means to remove the topmost element
of the stack.

20
STACK
■ In a stack, we have two operations:
– Push(X)
– Pop()

■ Push(X) means to insert “X” at the top of the


stack.
■ Pop() means to remove the topmost element
of the stack.
■ If there is nothing in the stack, it is empty.

21
STACK
■ Similarly, initially stack is empty.

22
STACK
■ Similarly, initially stack is empty.
Last
■ After performing the following five
operations, the stack looks like this. Fourth
– Push(“First”)
– Push(“Second”) Third

– Push(“Third”)
Second
– Push(“Fourth”)
– Push(“Last”) First

23
STACK
■ Now, we will perform the following operation.
Last
– Pop()
Fourth

Third

Second

First

24
STACK
Last
■ Now, we will perform the following operation.
– Pop()
Fourth
■ You will see that the “Last” element has
been popped out first. Third

Second
■ In other words, we can say that the behavior
of Stack is LIFO (Last in First out). First

■ In other words, FILO (First in Last out).


25
STACK
The stack is a simple data structure. We may have been using a
stack in our lives without even realizing it!

26
The Call Stack
■ Your computer uses a stack internally called the call stack. Let’s
see it in action. Here’s a simple function:

■ This function greets you and then calls two other functions. Here
are those two functions:

27
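■ The two slides above refer to code shown as images in the original deck. The Python sketch below is a reconstruction based purely on the behaviour described on the following slides, so the exact wording of the print statements is an assumption.

def greet2(name):
    print(f"how are you, {name}?")

def bye():
    print("ok bye!")

def greet(name):
    print(f"hello, {name}!")                 # first thing greet does
    greet2(name)                             # a frame for greet2 is pushed, then popped
    print("getting ready to say bye...")
    bye()                                    # a frame for bye is pushed, then popped
                                             # finally greet's own frame is popped

greet("maggie")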
The Call Stack

■ Let’s walk through what happens when you call a function.


■ Suppose you call greet(“maggie”).
■ First, your computer allocates a box of memory for that function call.

28
The Call Stack

■ Let’s walk through what happens when you call a function.


■ Suppose you call greet(“maggie”).
■ First, your computer allocates a box of memory for that function call.
■ Now let’s use the memory.
■ The variable name is set to “maggie”.
■ That needs to be saved in memory.

29
The Call Stack

■ Every time you make a function call, your


computer saves the values for all the variables
for that call in memory like this.
■ Next, you print hello, maggie!
■ Then you call greet2(“maggie”).
■ Again, your computer allocates a box of memory
for this function call.

30
The Call Stack

■ Your computer is using a stack for these


boxes.
■ The second box is added on top of the first
one.
■ You print how are you, maggie?
■ Then you return from the function call.
■ When this happens, the box on top of the
stack gets popped off.
31
The Call Stack

■ Your computer is using a stack for these


boxes.
■ The second box is added on top of the first
one.
■ You print how are you, maggie?
■ Then you return from the function call.
■ When this happens, the box on top of the
stack gets popped off.
32
The Call Stack

■ Now the topmost box on the stack is for the greet


function, which means you returned back to the
greet function.
■ When you called the greet2 function, the greet
function was partially completed.
■ This is the big idea behind this section: when you call
a function from another function, the calling function is
paused in a partially completed state.
■ All the values of the variables for that function are
still stored in memory.
33
The Call Stack

■ Now that you’re done with the greet2 function,


you’re back to the greet function, and you pick
up where you left off.
■ First you print getting ready to say bye…. You
call the bye function.
■ A box for that function is added to the top of
the stack.

34
The Call Stack

■ Then you print ok bye! and return from the


function call.

■ Bye function is popped out.

■ And now you’re back to the greet function.

35
The Call Stack

■ In greet function, there’s nothing else to be


done now.

■ So, we will pop it too.

36
The Call Stack

■ The stack is now empty.

■ This stack, used to save the variables for


multiple functions, is called the call stack.

37
The Call Stack with Recursion
■ Recursive functions use the call stack too!

■ Let’s look at this in action with the factorial function.

■ factorial(5) is written as 5!, and it’s defined like this:


5 * 4 * 3 * 2 * 1.
■ Similarly, factorial(3) is:
3 * 2 * 1.
38
The Call Stack with Recursion
■ Here’s a recursive function to calculate the factorial of a number:

■ Now let’s call fact(3) and step through this call line by line and
see how the stack changes.

39
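■ The factorial code itself appears as an image in the original slides; a minimal Python sketch consistent with the description (base case at 1, recursive call on x-1) would be:

def fact(x):
    # assumes x >= 1; each call pushes a new frame onto the call stack
    if x == 1:
        return 1
    return x * fact(x - 1)

print(fact(3))   # 6   -- the stack grows to height 3 before unwinding
print(fact(5))   # 120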
The Call Stack with Recursion

40
The Call Stack with Recursion

41
The Call Stack with Recursion

42
The Call Stack with Recursion

43
A(3)
Example 5
A( n ) A(2) //do something

{
If n >= 1 A(1) //do something
A(n-1);
//do something
} A(0) //do something
A(0) 1 call
A(1) 2 calls
A(2) 3 calls
A(3) 4 calls

A(n) n+1 calls


44
A(3)
Example 5
A( n ) A(2) //do something

{
If n >= 1 A(1) //do something
A(n-1);
//do something
} A(0) //do something

A(0) 1 call
A(1) 2 calls
A(2) 3 calls
A(3) 4 calls

A(n) n+1 calls 45


A(3)
Example 5
A( n ) A(2) //do something

{
If n >= 1 A(1) //do something
A(n-1);
//do something
} A(0) //do something

A(0) 1 call
A(1) 2 calls
A(2) 3 calls
A(3) 4 calls
A(3)

A(n) n+1 calls 46


A(3)
Example 5
A( n ) A(2) //do something

{
If n >= 1 A(1) //do something
A(n-1);
//do something
} A(0) //do something

A(0) 1 call
A(1) 2 calls
A(2)
A(2) 3 calls
A(3) 4 calls
A(3)

A(n) n+1 calls 47


A(3)
Example 5
A( n ) A(2) //do something

{
If n >= 1 A(1) //do something
A(n-1);
//do something
} A(0) //do something
A(1)
A(0) 1 call
A(1) 2 calls
A(2)
A(2) 3 calls
A(3) 4 calls
A(3)

A(n) n+1 calls 48


A(3)
Example 5
A( n ) A(2) //do something

{
If n >= 1 A(1) //do something
A(n-1);
//do something A(0)

} A(0) //do something


A(1)
A(0) 1 call
A(1) 2 calls
A(2)
A(2) 3 calls
A(3) 4 calls
A(3)

A(n) n+1 calls 49


A(3)
Example 5
A( n ) A(2) //do something

{
If n >= 1 A(1) //do something
A(n-1);
//do something
} A(0) //do something
A(1)
A(0) 1 call
A(1) 2 calls
A(2)
A(2) 3 calls
A(3) 4 calls
A(3)

A(n) n+1 calls 50


A(3)
Example 5
A( n ) A(2) //do something

{
If n >= 1 A(1) //do something
A(n-1);
//do something
} A(0) //do something

A(0) 1 call
A(1) 2 calls
A(2)
A(2) 3 calls
A(3) 4 calls
A(3)

A(n) n+1 calls 51


A(3)
Example 5
A( n ) A(2) //do something

{
If n >= 1 A(1) //do something
A(n-1);
//do something
} A(0) //do something

A(0) 1 call
A(1) 2 calls
A(2) 3 calls
A(3) 4 calls
A(3)

A(n) n+1 calls 52


A(3)
Example 5
A( n ) A(2) //do something

{
If n >= 1 A(1) //do something
A(n-1);
//do something
} A(0) //do something

A(0) 1 call
A(1) 2 calls
A(2) 3 calls
A(3) 4 calls

A(n) n+1 calls 53


A(3)
Example 5
A( n ) A(2) //do something

{
If n >= 1 A(1) //do something
A(n-1);
//do something
} A(0) //do something

A(0) 1 call
You can see that for an A(1) 2 calls

Input of size n, we need A(2) 3 calls


A(3) 4 calls
a stack of height n+1.
A(n) n+1 calls 54
A(3)
Example 5
A( n ) A(2) //do something

{
If n >= 1 A(1) //do something
A(n-1);
//do something
} A(0) //do something

A(0) 1 call
If each block of stack A(1) 2 calls

consists of k memory A(2) 3 calls


A(3) 4 calls
locations, then total
space required is k(n+1) A(n) n+1 calls 55
A(3)
Example 5
A( n ) A(2) //do something

{
If n >= 1 A(1) //do something
A(n-1);
//do something
} A(0) //do something

A(0) 1 call
A(1) 2 calls
A(2) 3 calls
S(n) = k(n+1) = O(n) A(3) 4 calls

A(n) n+1 calls 56


A(3)
Example 6
A(2) P(3) A(2)
A( n )
{
A(1) P(2) A(1) A(1) P(2) A(1)
If n >= 1
A(n-1);
print n A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(n-1);
A(0) 1 call
} A(1) 3 calls
A(2) 7 calls
A(3) 15 calls

A(n) m calls
57
A(3)
Example 6
A(2) P(3) A(2)
A( n )
{
A(1) P(2) A(1) A(1) P(2) A(1)
If n >= 1
A(n-1);
print n A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(n-1);
A(0) 1 = 20+1 ― 1 call
} A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
A(3) 15 = 23+1 ― 1 calls
S(n) = k(2n+1-1) = O(2n)

A(n) m = 2n+1 ― 1 calls


58
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
A(3) 15 = 23+1 ― 1 calls

A(n) m = 2n+1 ― 1 calls


59
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
60
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) A(2) 7 = 22+1 ― 1 calls
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
61
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) A(2) 7 = 22+1 ― 1 calls
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
62
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0)
A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) A(2) 7 = 22+1 ― 1 calls
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
63
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0)
A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
1
A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
64
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0)
A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
1
A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
65
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0)
A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
1
A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
66
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0)
A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
12
A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
67
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0)
A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
12
A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
68
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0)


A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
12
A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
69
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0)


A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
121
A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
70
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
121
A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
71
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
121
A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
72
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
121
A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
73
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
1213
A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
74
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
1213
A(2) A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
75
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
1213
A(2) A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
76
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
1213
A(2) A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
77
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
12131
A(2) A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
78
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
12131
A(2) A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
79
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
12131
A(2) A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
80
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
121312
A(2) A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
81
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
121312
A(2) A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
82
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) A(0) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
121312
A(2) A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
83
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) A(0) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
1213121
A(2) A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
84
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) A(0) A(0) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
1213121
A(2) A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
85
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) A(0) A(0) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
1213121
A(2) A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
86
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) A(0) A(0) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
1213121
A(2) A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
87
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) A(0) A(0) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
1213121
A(2) A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
88
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) A(0) A(0) A(0)


A(0) A(0) A(0) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(1) A(1) A(1) A(1) A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
1213121
A(2) A(2)
A(3) 15 = 23+1 ― 1 calls
A(3)
A(n) m = 2n+1 ― 1 calls
89
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(0) 1 = 20+1 ― 1 call
You can see that for an A(1) 3 = 21+1 ― 1 calls
Input of size n, we need A(2) 7 = 22+1 ― 1 calls
A(3) 15 = 23+1 ― 1 calls
a stack of height n+1.
A(n) m = 2n+1 ― 1 calls
90
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(0) 1 = 20+1 ― 1 call
If each block of stack A(1) 3 = 21+1 ― 1 calls
consists of k memory A(2) 7 = 22+1 ― 1 calls
A(3) 15 = 23+1 ― 1 calls
locations, then total
space required is k(n+1) A(n) m = 2n+1 ― 1 calls
91
A(3) A( n )

Example 6 {
If n >= 1
A(2) P(3) A(2) A(n-1);
print n
A(n-1);
A(1) P(2) A(1) A(1) P(2) A(1) }

A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)
A(0) 1 = 20+1 ― 1 call
A(1) 3 = 21+1 ― 1 calls
A(2) 7 = 22+1 ― 1 calls
S(n) = k(n+1) = O(n) A(3) 15 = 23+1 ― 1 calls

A(n) m = 2n+1 ― 1 calls


92
A( n )
{
Analyzing Time Complexity If n >= 1
A(n-1);
print n
T(n) = 2T(n-1) + c; n>=1
A(n-1);
T(0) = c  This is the Anchor Condition }
-------------------------------------------------------------------------
T(n) =2T(n-1)+c …eq(i)
T(n-1) =2T(n-1-1)+c
=2T(n-2)+c …eq(ii)
T(n-2) =2T(n-2-1)+c
=2T(n-3)+c …eq(iii)
93
A( n )
{
Analyzing Time Complexity If n >= 1
Putting eq(ii) in eq(i): T(n) =2T(n-1)+c …eq(i) A(n-1);
T(n-1) =2T(n-2)+c …eq(ii) print n
T(n) =2T(n-1)+c T(n-2) =2T(n-3)+c …eq(iii)
A(n-1);
=2[2T(n-2)+c] +c }
=22T(n-2)+2c+c …eq(iv)
Putting eq(iii) in eq(iv):
T(n) =22[2T(n-3)+c]+2c+c
=23T(n-3)+22c+2c+c
Suppose we reach the anchor condition after k calls:
T(n) =2kT(n-k)+2k-1c+…+22c+2c+c ..eq(v)
94
A( n )
{
Analyzing Time Complexity If n >= 1
As we know that T(0) will be the last recursive call, so A(n-1);
for what value of k will T(n-k) become T(0). print n
A(n-1);
n-k=0  n=k
}
So eq(v) will become:
T(n) = 2kT(n-k)+2k-1c+…+22c+2c+c
= 2nT(0)+2n-1c+…+22c+2c+c
= 2nc+2n-1c+…+22c+2c+c
= c(2n+2n-1+…+22+2+1)
= c(2n+1 ― 1)
= O(2n) 95
A( n )
{
Our Approach If n >= 1
A(n-1);
S(n) = O(n) A(3)
print n
A(n-1);
T(n) = O(2n)
}
A(2) P(3) A(2)

A(1) P(2) A(1) A(1) P(2) A(1)

A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0) A(0) P(1) A(0)

96
A( n )
{
New Approach If n >= 1
A(n-1);
print n
A(3) A(n-1);
}
A(2)

A(1) 0 1 2 3

Ø Ø Ø Ø

A(0)

97
A( n )
{
New Approach If n >= 1
A(n-1);
print n
A(3) A(n-1);
}
A(2)

A(1) 0 1 2 3

<blank> Ø Ø Ø

A(0) P(1) R(0)

98
A( n )
{
New Approach If n >= 1
A(n-1);
print n
A(3) A(n-1);
}
A(2)

A(1) P(2) R(1) 0 1 2 3

<blank> 1 Ø Ø

A(0) P(1) R(0)

99
A( n )
{
New Approach If n >= 1
A(n-1);
print n
A(3) A(n-1);
}
A(2) P(3) R(2)

A(1) P(2) R(1) 0 1 2 3

<blank> 1 121 Ø

A(0) P(1) R(0)

100
A( n )
{
New Approach If n >= 1
A(n-1);
S(n) = O(n) + 2(n+1) print n
= O(n) A(3) A(n-1);

T(n) = (n+1).c }
= O(n) A(2) P(3) R(2)

A(1) P(2) R(1) 0 1 2 3

<blank> 1 121 1213121

A(0) P(1) R(0)

101
Dynamic Programming
■ Dynamic programming is essentially recursion without
repetition.
■ Developing a dynamic programming algorithm generally
involves two separate steps:
– Formulate problem recursively. Write down a formula for the
whole problem as a simple combination of answers to smaller
sub-problems.
– Build solution to recurrence from bottom up. Write an
algorithm that starts with base cases and works its way up to
the final solution.
Dynamic Programming
■ Decompose into sub-problems like divide and conquer.
■ Sub-problems are dependent.
■ Record results of smaller sub-problems. Dynamic
programming algorithms need to store the results of
intermediate sub-problems.
■ Re-use them for further occurrences.
■ Mostly reduces complexity from exponential to polynomial.
In our case, it reduced complexity from exponential time to linear time.
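■ As an illustration of this idea (my own sketch, not code from the slides), the “new approach” for the printing example A(n) can be written bottom-up: each sub-result is computed once, stored in a table, and reused.

def a_dp(n):
    # Bottom-up version of:  A(n) = A(n-1), print n, A(n-1).
    # R[k] holds the full output of A(k); each entry is built once from R[k-1],
    # so only n+1 table-filling steps are needed instead of 2^(n+1)-1 recursive calls.
    # (The stored strings themselves still grow, which the slides do not count.)
    R = [""] * (n + 1)                        # R[0] corresponds to A(0), which prints nothing
    for k in range(1, n + 1):
        R[k] = R[k - 1] + str(k) + R[k - 1]   # reuse the stored sub-result twice
    return R[n]

print(a_dp(3))   # 1213121 -- the same output the recursive version would print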
Assignment #2
■ Find the Time Complexity of two algorithms given
in next slides.

■ Both the algorithms solve the same problem.


Remember that sorting takes Θ(n·log n) time.

■ Also compare the time complexity of both the


algorithms.

■ Date:
– Sunday, 31 December, 2020 before 23:55 HRS
Groups up to 4 students are allowed, but not mandatory. 104
Assignment #2
Suppose you want to buy a car. You want to pick the fastest car. But
fast cars are expensive; you want the cheapest. You cannot decide
which is more important: speed or price. Definitely you do not want a
car if there is another that is both faster and cheaper. We say that
the fast, cheap car dominates the slow, expensive car relative to
your selection criteria. So, given a collection of cars, we want to list
those cars that are not dominated by any other.
Here is how we might model this as a formal problem.
■ Let a point p in 2-dimensional space be given by its integer
coordinates, p = (p.x, p.y).
■ A point p is said to be dominated by point q if p.x ≤ q.x and p.y ≤
q.y.
■ Given a set of n points, P = {p1, p2, . . . , pn} in 2-space, a point is
said to be maximal if it is not dominated by any other point in P.
105
Assignment #2
The car selection problem can be modelled this way: For each car
we associate an (x, y) pair where x is the speed of the car and y is the
negation of the price. A high y value means a cheap car and a low y
means an expensive car. Think of y as the money left in your pocket
after you have paid for the car. Maximal points correspond to the
fastest and cheapest cars.

The 2-dimensional Maxima problem is thus defined as:

■ Given a set of points P = {p1, p2, . . . , pn} in 2-space, output the set
of maximal points of P, i.e., those points pi such that pi is not
dominated by any other point of P.

106
Assignment #2

107
Assignment #2
■ Algo#1

108
Assignment #2
■ Algo#2

109
110
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 22, 23 & 24


DESIGN & ANALYSIS OF BASIC
SEARCHING ALGORITHMS
INSTRUCTOR: Sulaman Ahmad Naz
Searching
■ Searching is the process of determining
whether or not a given value exists in a data
structure or a storage media.

■ We will discuss two searching methods on 1D


(Linear) data structures:
– Linear search
– Binary search.

2
Linear Search
■ It is also called Serial or Sequential Search.
■ The linear (or sequential) search algorithm on
an array is:
– Sequentially scan the array, comparing each array
item with the searched value.
– If a match is found; return the index of the
matched element; otherwise return NULL.
■ Note: linear search can be applied to both
sorted and unsorted arrays.
3
Linear Search
■ Step through array of records, one at a time.

■ Look for record with matching key.

■ Search stops when


– record with matching key is found
– or when search has examined all records without
success.

4
Linear Search
Linear Search ( Array A, KEY)
Step 1: Set i to L
Step 2: While i <= U do
Step 3: if A[i] = KEY then go to step 7
Step 4: Set i to i + 1
Step 5: End While
Step 6: Print “Element not found” & Exit
Step 7: Print “Element x Found at index i”
Step 8: Exit
5
Linear Search
procedure linear_search (Array, KEY)

For each item in the Array


If item == KEY
return the item's location
End If
End For

End procedure
6
Linear Search
Algorithm LinearSearch (Array A[L..U], KEY)
For i = L to U
If A[i] == KEY
return i
End If
End For
return NULL

7
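■ For reference, a direct Python translation of this pseudocode could look as follows (an illustrative sketch; None plays the role of NULL, and indexing is 0-based whereas the slides count from 1).

def linear_search(A, key):
    # Scan A from left to right; return the index of key, or None if absent.
    for i in range(len(A)):
        if A[i] == key:
            return i
    return None

data = [10, 14, 19, 26, 27, 31, 33, 35, 42, 44]   # just a sample array
print(linear_search(data, 33))    # 6
print(linear_search(data, 100))   # None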
Working
KEY=33

8
Working
KEY=33

9
Working
KEY=33

10
Working
KEY=33

11
Working
KEY=33

12
Working
KEY=33

13
Working
KEY=33

14
Working
KEY=33

15
Working
KEY=33

1 2 3 4 5 6 7 8 9 10

return 7
16
Analysis of Linear Search
Algorithm LinearSearch (Array A[L..U], KEY)
For i = L to U
If A[i] == KEY
If n is the size of the array,
return i
End If S(n) = O(1)
End For
return NULL

17
Analysis of Linear Search
Algorithm LinearSearch (Array A[L..U], KEY)
For i = L to U
If A[i] == KEY If n is the size of the array,
the loop will run not more
return i than “n” times.
End If In each iteration, it performs
End For some constant time operations.
return NULL
T(n) = n.c = O(n)
18
Best Case T(n)BestCase = O(1)

KEY=10

1 2 3 4 5 6 7 8 9 10

return 1
19
Worst Case - 1 T(n)WorstCase = O(n)

KEY=44

1 2 3 4 5 6 7 8 9 10

return 10
20
Worst Case - 2 T(n)WorstCase = O(n)

KEY=100

1 2 3 4 5 6 7 8 9 10

return NULL
21
Average Case
■ In the Average Case, the value can be somewhere in the array.
■ Average Case Time Complexity is the average of Time
Complexities of all the possible inputs.

■ Suppose we have an array of n records.


■ If we search for the first record, then it requires 1 comparison; if
the second, then 2 comparisons, and so on.
■ The average of all these searches is:
(1+2+3+4+5+6+7+8+9+…+n)/n = (n+1)/2 = O(n)
22
Advantages of Linear Search
■ The linear search is simple.

■ It is easier to understand & implement.

■ It does not require the data to be stored in any


particular order.

23
Disadvantages of Linear Search
■ It has a very poor efficiency as it takes lots of
comparisons to find a particular record in big
files.

■ It fails in very large databases especially in


real-time systems.

24
Scenario 1
■ Suppose you’re searching for a person in the
phone book. His/Her name starts with K.
■ You could start at the beginning and keep
flipping pages until you get to the Ks.
■ But you’re more likely to start at a page in the
middle, because you know the Ks are going to
be near the middle of the phone book.

25
Scenario 2

■ Suppose you’re searching for a word in a


dictionary, and it starts with O.

■ Again, you’ll start near the middle.

26
Scenario 3
■ Suppose you log on to Facebook. When you do,
Facebook has to verify that you have an account on
the site.
■ So, it needs to search for your username in its
database.
■ Let your username is LARAIB – for instance -
Facebook could start from the As and search for your
name.
2.7 billion monthly active users
■ But it makes more sense for it to begin somewhere in
the middle.
27
Binary Search
■ All these cases use the same algorithm to solve the problem:
BINARY SEARCH.

28
Binary Search
■ Binary search is a searching algorithm.

■ Its input is a sorted list of elements.

■ If an element you’re looking for is in that list, binary search


returns the position where it’s located.

■ Otherwise, binary search returns NULL.

29
Comparison with Linear Search
■ I’m thinking of a number between 1 and 100.

■ You have to try to guess my number in the fewest tries possible.


■ With every guess, I’ll tell you if your guess is too low, too high, or
correct.

30
Comparison with Linear Search
■ Suppose you start guessing like this:
1, 2, 3, 4 ….
■ Here’s how it would go.

31
Comparison with Linear Search
■ Suppose you start guessing like this:
1, 2, 3, 4 ….
■ Here’s how it would go.

32
Comparison with Linear Search
■ This is the sequential search.
■ With each guess, you’re eliminating only one number.
■ If my number was 99, it could take you 99 guesses to get there!

33
Comparison with Linear Search
■ Here’s a better technique.
■ Start with 50.

34
Comparison with Linear Search
■ You just eliminated half the numbers!
■ Now you know that 1–50 are all too low.

35
Comparison with Linear Search
■ Next guess: 75.

■ Too high, but again you cut down half the remaining numbers!
■ With binary search, you guess the middle number and eliminate
half the remaining numbers every time.

36
Comparison with Linear Search
■ Next is 63 (halfway between 50 and 75).

37
Comparison with Linear Search
■ This is binary search.

■ Here’s how many numbers you can eliminate every time.

38
Comparison with Linear Search

■ Whatever number I’m thinking of, you can guess in a maximum of


seven guesses—because you eliminate so many numbers with
every guess!

39
Comparison with Linear Search

■ Suppose you’re looking for a word in the


dictionary. The dictionary has 240,000
words.

■ In the worst case, how many steps do you


think each search will take?

40
Comparison with Linear Search
■ Simple search could take 240,000 steps if the word you’re
looking for is the very last one in the book.
■ With each step of binary search, you cut the number of words
in half until you’re left with only one word.

41
Working
KEY=31

42
Working
KEY=31

43
Working
KEY=31

44
Working
KEY=31

45
Working
KEY=31

46
Working
KEY=31

return 5
47
Binary Search (Iterative Version)
Algorithm BinarySearch (Array A[L..U], KEY)
While L<=U, do
mid = (L+U)/2
If A[mid] == KEY
return mid
ElseIf A[mid] < KEY
L = mid + 1
Else
U = mid - 1
End If
End While
return NULL
48
Binary Search (Recursive Version)
Algorithm BinSearch (Array A, KEY, L, U)
If L>U
return NULL
End If
mid = (L+U)/2
If A[mid] == KEY
return mid
ElseIf A[mid] < KEY
return BinSearch(A, KEY, mid + 1, U)
Else
return BinSearch(A, KEY, L, mid - 1)
End If
49
Binary Search – Side by Side
Algorithm BinarySearch (Array A[L..U], KEY) Algorithm BinSearch (Array A, KEY, L, U)
While L<=U, do If L>U
mid = (L+U)/2 return NULL
If A[mid] == KEY End If
return mid mid = (L+U)/2
ElseIf A[mid] < KEY If A[mid] == KEY
L = mid + 1 return mid
Else ElseIf A[mid] < KEY
U = mid - 1 return BinSearch(A, KEY, mid + 1, U)
End If Else
End While return BinSearch(A, KEY, L, mid - 1)
return NULL End If
50
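■ Both versions translate almost line-for-line into Python. The sketch below is illustrative (0-based indexing, None in place of NULL), with the recursive parameters written in the order (L, U) to match the recursive calls.

def binary_search_iter(A, key):
    # Iterative binary search over a sorted list A; returns an index or None.
    L, U = 0, len(A) - 1
    while L <= U:
        mid = (L + U) // 2
        if A[mid] == key:
            return mid
        elif A[mid] < key:
            L = mid + 1           # key can only lie in the right half
        else:
            U = mid - 1           # key can only lie in the left half
    return None

def binary_search_rec(A, key, L, U):
    # Recursive binary search; each call halves the range [L, U].
    if L > U:
        return None
    mid = (L + U) // 2
    if A[mid] == key:
        return mid
    elif A[mid] < key:
        return binary_search_rec(A, key, mid + 1, U)
    else:
        return binary_search_rec(A, key, L, mid - 1)

data = [10, 14, 19, 26, 27, 31, 33, 35, 42, 44]     # the input must be sorted
print(binary_search_iter(data, 31))                 # 5
print(binary_search_rec(data, 31, 0, len(data) - 1))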
If n is the size of the array,

Space Complexity S(n)Iterative = O(1)


Algorithm BinarySearch (Array A[L..U], KEY) Algorithm BinSearch (Array A, KEY, L, U)
While L<=U, do If L>U
mid = (L+U)/2 return NULL
If A[mid] == KEY End If
return mid mid = (L+U)/2
ElseIf A[mid] < KEY If A[mid] == KEY
L = mid + 1 return mid
Else ElseIf A[mid] < KEY
U = mid - 1 return BinSearch(A, KEY, mid + 1, U)
End If Else
End While return BinSearch(A, KEY, L, mid - 1)
return NULL End If
51
If n is the size of the array,

Space Complexity S(n)Recursive = O(log2n)


Algorithm BinarySearch (Array A[L..U], KEY) Algorithm BinSearch (Array A, KEY, L, U)
While L<=U, do If L>U
mid = (L+U)/2 return NULL
If A[mid] == KEY End If
return mid mid = (L+U)/2
ElseIf A[mid] < KEY If A[mid] == KEY
L = mid + 1 return mid
Else ElseIf A[mid] < KEY
U = mid - 1 return BinSearch(A, KEY, mid + 1, U)
End If Else
End While return BinSearch(A, KEY, L, mid - 1)
return NULL End If
52
A(32)
A(0) 1 call

Space Complexity A(1)


A(2)
2 calls
3 calls A(16) c

Algorithm BinSearch (Array A, KEY, L, U) A(4) 4 calls


If L>U A(8) 5 calls A(8) c
A(16) 6 calls
return NULL
A(32) 7 calls
End If A(4) c
mid = (L+U)/2 A(n) x calls
If A[mid] == KEY A(2) c
return mid
ElseIf A[mid] < KEY A(1) c
return BinSearch(A, KEY, mid + 1, U)
Else A(0) c

return BinSearch(A, KEY, L, mid - 1)


End If c
53
A(32)
A(0) 1 call

Space Complexity A(1)


A(2)
log21+2=2 calls
log22+2=3 calls A(16) c

Algorithm BinSearch (Array A, KEY, L, U) A(4) log24+2=4 calls


If L>U A(8) log28+2=5 calls A(8) c
A(16) log216+2=6 calls
return NULL
A(32) log232+2=7 calls
End If A(4) c
mid = (L+U)/2 A(n) log2n+2=x calls
If A[mid] == KEY A(2) c
return mid
ElseIf A[mid] < KEY A(1) c
return BinSearch(A, KEY, mid + 1, U)
Else A(0) c

return BinSearch(A, KEY, L, mid - 1)


End If c
54
Space Complexity
If each block (stack frame) consists of k memory locations, then the total space required is k(log2(n)+2), since the stack holds log2(n)+2 blocks.

Algorithm BinSearch (Array A, KEY, L, U)
If L>U
return NULL
End If
mid = (L+U)/2
If A[mid] == KEY
return mid
ElseIf A[mid] < KEY
return BinSearch(A, KEY, mid + 1, U)
Else
return BinSearch(A, KEY, L, mid - 1)
End If
55
Space Complexity S(n)_Recursive = O(log2 n)
Algorithm BinSearch (Array A, KEY, L, U)
If L>U
return NULL
End If
mid = (L+U)/2
If A[mid] == KEY
return mid
ElseIf A[mid] < KEY
return BinSearch(A, KEY, mid + 1, U)
Else
return BinSearch(A, KEY, L, mid - 1)
End If
(The recursion stack holds log2(n)+2 blocks.)
56
Time Complexity
Algorithm BinSearch (Array A, KEY, L, U)
If L>U
return NULL
End If
mid = (L+U)/2
If A[mid] == KEY
return mid
ElseIf A[mid] < KEY
return BinSearch(A, KEY, mid + 1, U)
Else
return BinSearch(A, KEY, L, mid - 1)
End If

Recurrence:
T(n) = T(n/2) + c
T(1) = T(0) + c
T(0) = c
57
Back Substitution Method
T(n) =T(n/2)+c, for n>1
T(1) =T(0)+c
T(0) = c ← This is the Anchor Condition
-------------------------------------------------------------------------
T(n) = T(n/2) + c …eq(i)
T(n/2) = T(n/2/2) + c
= T(n/2^2) + c …eq(ii)
T(n/2^2) = T(n/2^2/2) + c
= T(n/2^3) + c …eq(iii) 58
Back Substitution Method
Recall: T(n) = T(n/2) + c …eq(i); T(n/2) = T(n/2^2) + c …eq(ii); T(n/2^2) = T(n/2^3) + c …eq(iii)

Putting eq(ii) in eq(i):
T(n) = T(n/2) + c
= T(n/2^2) + c + c
= T(n/2^2) + 2c …eq(iv)
Putting eq(iii) in eq(iv):
T(n) = T(n/2^3) + c + 2c
= T(n/2^3) + 3c
Suppose we reach T(1) after k calls:
T(n) = T(n/2^k) + kc …eq(v)
59
Back Substitution Method
For what value of k will T(n/2^k) become T(1)?
n/2^k = 1  ⇒  n = 2^k  ⇒  k = log2(n)
So eq(v) will become:
T(n) = T(n/2^k) + kc
= T(1) + kc
= T(0) + c + kc
= c + c + kc
= 2c + c·log2(n)
= O(log2 n)
60
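■ A small Python check of the closed form derived above (a sketch, taking c = 1 and n a power of 2; T evaluates the recurrence directly):

from math import log2

def T(n, c=1):
    # T(n) = T(n//2) + c, with T(0) = c.
    if n == 0:
        return c
    return T(n // 2, c) + c

for n in [1, 2, 4, 8, 16, 32]:
    assert T(n) == 2 + log2(n)   # matches 2c + c*log2(n) with c = 1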
Recursion Tree Method
T(n) =T(n/2)+c, for n>1
T(1) =T(0)+c
T(0) = c ← This is the Anchor Condition
-------------------------------------------------------------------------
Note that here, at each step, only 1 branch will be drawn;
decreasing the input size by a factor of 2 at each step.
And a total of c work will be done at each branching.

61
[Recursion tree (a single chain): A(n) → A(n/2) → A(n/2^2) → A(n/2^3) → … → A(n/2^k) = T(1) → A(0); a cost of c is incurred at each level.]

For what value of k will T(n/2^k) become T(1)?
n/2^k = 1  ⇒  n = 2^k  ⇒  k = log2(n)
T(n) = c(k+2)
= c(log2(n)+2)
= O(log2 n)
62
Master Theorem

T(n) = T(n/2) + c, for n > 1

Here, a = 1, b = 2, c = 0, and d = 0
Now, b^c = 2^0 = 1
Here a = b^c, so the second possibility is our selection.
As d = 0, the solution will be:
T(n) = Θ(n^(log_2 1) · log^(0+1) n) = Θ(n^0 · log^1 n) = Θ(log n)

63
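■ For reference, the general form of the master-theorem case applied above can be written as follows (a sketch; it assumes the driving function has the shape Θ(n^c · log^d n), matching the a = b^c case used on this slide):

$$T(n) = a\,T(n/b) + \Theta\left(n^{c}\log^{d} n\right), \qquad a = 1,\ b = 2,\ c = 0,\ d = 0$$

$$a = b^{c}\ (1 = 2^{0}) \;\Rightarrow\; T(n) = \Theta\left(n^{\log_{b} a}\,\log^{d+1} n\right) = \Theta\left(n^{0}\log^{1} n\right) = \Theta(\log n)$$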
Practice Questions

64
ONLINE QUIZ #1
■ Time Slot: 13:30 to 16:30
■ Duration: 50 minutes

65
66
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 25
INTRODUCTION TO DESIGN OF
SORTING ALGORITHMS
INSTRUCTOR: Sulaman Ahmad Naz
Sorting
■ In this lesson and in this series of lessons we are going to
talk about sorting algorithms.

■ Sorting as a concept is deeply embedded in a lot of things


that we do.

■ It's quite often that we like to arrange things or data in


certain order.

2
Sorting
■ Sometimes to improve the readability of that data, at others to
be able to search or extract some information quickly out of
that data.

■ For example:
We keep currency notes sorted.

3
Sorting
■ Now, have a look at what happens when we go to an online marketplace
such as OLX.
■ The site gives us options of sorting the items.

4
Sorting
■ So, sorting is a really helpful feature here and there are so many
places where we like to keep our data sorted.
■ Be it the language dictionary where we want to keep the words
sorted so that searching a word in the dictionary is easy.

5
Sorting
■ If we want to define sorting formally, then sorting is arranging
the elements in a list or collection in increasing or decreasing
order of some property.

■ The list should be homogeneous, that is all the elements in the


list should be of same type.

■ To study sorting algorithms, most of the times we use a list of


integers, and typically we sort the list of integers in increasing
order of value.
6
Sorting
■ For example, if we have the following list:

12 21 6 8 5

■ Then sorting in increasing order of values will mean rearranging


the above values as:

5 6 8 12 21

7
Sorting
■ For example, if we have the following list:

12 21 6 8 5

■ Similarly, sorting in decreasing order of values will result as


following permutation of the above list:

21 12 8 6 5

8
Sorting
■ For example, if we have the following list:

12 21 6 8 5

■ But we can sort on any property of the data, not just the
magnitude of the values; for example, by increasing number of factors.

9
Sorting
■ For example, if we have the following list:

12 21 6 8 5
1,2,3,4, 1,3,7,21 1,2,3,6 1,2,4,8 1,5
6,12

■ But we can sort on any property of the data, not just the
magnitude of the values; for example, by increasing number of factors.

5 21 6 8 12

10
Sorting
■ A sorted list is a permutation of the original list; when we sort a
list, we just rearrange the elements.

■ Now this was a list of integers. We may have a list of any data
type. We may want to sort a list of strings or words in
lexicographical order, the order in which they will occur in
dictionary.

11
Sorting

“fork” “knife” “screen” “mouse” “key”

■ A list of strings like above in lexicographical order will be arranged


as below:

“fork” “key” “knife” “mouse” “screen”

12
Sorting
■ And we may have a list of complex data type as well.

■ A car object in the OLX website, that we had seen earlier, was a
complex type.

■ Cars will have many properties, like the following, and the list can
be sorted on any of these properties.
– Price
– Relevancy
– Date of inclusion in the database
– Etc.
13
Sorting
■ Sorted data is good not just for presentation or manual retrieval
of information; even when we are using the computational power
of machines, sorted data is really helpful.
■ If a list is stored in computer's memory as unsorted, then to
search something in this list, we will have to run Linear Search.
(We cannot do binary search on unsorted data)
■ In linear search, we start with the first element of the list and keep
scanning until we find the element we are looking for. In the worst
case, for example when the element is not in the list at all, we
compare it with every element, i.e. we scan the whole list.
14
Sorting
■ So, if there are n elements in the list, we will make n comparisons
in the worst case.
■ Just, think about the kind and amount of data, modern day
computers deal with.
■ What if this n is very, very large? If we take n = 2^128 items and
imagine that 1 comparison takes 1 millisecond.
■ Then how much time is required to search a value in the worst
case?

Time = 2^128 milliseconds


15
Sorting
Time = 2^128 milliseconds
Time = 340282366920938000000000000000000000000 milliseconds
Time = 10790283070806000000000000000 years
■ But, if the data was sorted, then by using binary search it would take only
128 milliseconds.
Time = log2(2^128) = 128 milliseconds
Time = 0.128 seconds
16
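■ A quick Python sanity check of these figures (an illustrative sketch; it reproduces the 2^128 ms estimate and the 128-step binary-search bound):

from math import log2

comparisons = 2 ** 128                        # worst case for linear search
ms = comparisons                              # 1 comparison = 1 millisecond
years = ms / 1000 / 60 / 60 / 24 / 365        # milliseconds -> years
print(f"{ms:.3e} ms is about {years:.3e} years")   # ~3.403e+38 ms, ~1.079e+28 years

print(log2(2 ** 128))                         # 128.0 steps for binary search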
Sorting
■ Now, you can imagine how important sorting as a problem is.

■ Sorting as a problem is well studied and a great deal of research


has gone into devising efficient algorithms for sorting.

■ In this series of lessons we are going to study, analyze, and


compare these sorting algorithms.

17
Sorting
■ Some of the sorting algorithms that we will be talking about, that
we will be analyzing about are:

18
Sorting
■ We have so many algorithms for sorting that have been designed over
a period of time.

■ We often classify sorting algorithms based on some parameters.

■ The first parameter that we want to classify upon is time complexity


which is the measure of rate of growth of time taken by an algorithm
with respect to input size.

■ Some algorithms will be relatively faster than others.


19
Sorting w.r.t Time Complexity

Sorting Algorithms
– Polynomial Time: Bubble Sort, Selection Sort, Insertion Sort
– n·Logarithmic Time (n log n): Quick Sort, Merge Sort, Heap Sort
– Linear Time: Counting Sort, Radix Sort, Bucket Sort

20
Sorting
■ The second parameter that we use for classification is space
complexity or memory usage.

■ Some sorting algorithms are in-place: they use a constant, i.e. O(1),
or logarithmic, i.e. O(log n), amount of extra memory to rearrange
the elements in the list.

■ Other sorting algorithms use extra memory to temporarily
store data, and this memory usage grows with the input size. These
are called out-of-place sorting algorithms. Here, S(n) grows faster than O(log n).
21
Sorting w.r.t Space Complexity

Sorting Algorithms
– In-Place: Bubble Sort, Selection Sort, Insertion Sort, Quick Sort
– Out-of-Place: Merge Sort, Counting Sort
22
Sorting
■ The third parameter that we talk about is stability.
■ A stable sorting algorithm guarantees that in the case of duplicate
values, the relative order of elements is preserved.

Unsorted List: 12 21 6a 8 6b (the two 6s are labelled a and b to show their original order)

After stable sorting: 6a 6b 8 12 21 (the relative order of the equal values is preserved)

After unstable sorting: possibly 6b 6a 8 12 21 (the relative order of the equal values may change)


23
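■ Python's built-in sort is stable, which makes the idea easy to demonstrate (a small sketch; each value is tagged with a letter to record its original order):

data = [(12, 'a'), (21, 'b'), (6, 'c'), (8, 'd'), (6, 'e')]

# Sort only by the numeric key; a stable sort keeps (6, 'c') before (6, 'e').
stable = sorted(data, key=lambda pair: pair[0])
print(stable)   # [(6, 'c'), (6, 'e'), (8, 'd'), (12, 'a'), (21, 'b')]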
Sorting w.r.t Stability

Sorting Algorithms
– Stable: Bubble Sort, Merge Sort, Insertion Sort
– Unstable: Quick Sort, Heap Sort, Selection Sort

24
Sorting
■ The next parameter of classification is whether a sort is internal
sort or external sort.

■ When all the records that need to be sorted fit in the main memory
(RAM), then such a sort is an internal sort.

■ And if the records are on auxiliary storage like disks and tapes,
quite often because it is not possible to get all of them into main
memory in one go, then we call such a sort an external sort.
25
Sorting w.r.t Location of Values

Sorting Algorithms
– Internal: Bubble Sort, Quick Sort, Insertion Sort
– External: Merge Sort (hybrid version), Distribution Sort

26
Sorting
■ The next parameter upon which we may classify sorting algorithms
is whether the algorithm is recursive or non-recursive in nature.

27
Sorting w.r.t Design

Sorting Algorithms
– Iterative: Bubble Sort, Selection Sort, Insertion Sort
– Recursive: Merge Sort, Quick Sort

28
Sorting
■ There is one more parameter upon which we may classify sorting
algorithms, and it is whether the algorithm compares the values or
does the sorting in some other way.

■ A comparison sort algorithm sorts items by comparing values


between each other. It can be applied to any input.

■ A non-comparison sort algorithm uses the internal structure of
the values to be sorted (for example, their digits). It can only be applied in some
particular cases, and requires values of a particular kind.

29
Sorting w.r.t Comparison

Sorting Algorithms
– Comparison Sorting: Bubble Sort, Selection Sort, Insertion Sort
– Non-Comparison Sorting: Counting Sort, Radix Sort, Bucket Sort

30
31
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 26 & 27
DESIGN OF SELECTION, INSERTION,
BUBBLE & SHELL SORT
INSTRUCTOR: Sulaman Ahmad Naz
Basic Sorting Block

■ Before starting to learn sorting
algorithms, let's start with a very
basic sorting block.

■ It takes two numbers as an input
and tells which one is smaller and
which one is larger.

[Figure: a sorting block with inputs 4 and 3; the smaller value (3) and the larger value (4) come out on separate outputs.]

2
Basic Sorting Block

If num1 <= num2
Pop_Down num1
Pop_Right num2
Else
Pop_Down num2
Pop_Right num1

[Figure: the block takes num1 and num2; the smaller value is popped downward and the larger value is popped to the right.]

We will combine these sorting blocks in a triangular way and try to sort a list.
3
Constructing Our 1st Sorting Algorithm
After each step, we get/select the minimum element of the (remaining) list.
[Figure: sorting blocks combined in a triangular arrangement; the example values 5, 4, 3, 1, 2 flow through the blocks and the sorted list 1 2 3 4 5 comes out at the bottom.]
4
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 12 21 6 8 5

For i = L to U-1
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

5
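■ The same column-wise idea (i.e. selection sort, as it is named later in this lecture) as a runnable Python sketch:

def selection_sort(lst):
    # Repeatedly select the minimum of the remaining sub-list and swap it into place.
    n = len(lst)
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):
            if lst[j] < lst[min_idx]:
                min_idx = j
        lst[i], lst[min_idx] = lst[min_idx], lst[i]
    return lst

# Example: selection_sort([12, 21, 6, 8, 5]) returns [5, 6, 8, 12, 21]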
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 12 21 6 8 5

For i = L to U-1 i
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

6
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 12 21 6 8 5

For i = L to U-1 i
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

7
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 12 21 6 8 5

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

8
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 12 21 6 8 5

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

9
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 12 21 6 8 5

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

10
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 12 21 6 8 5

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

11
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 12 21 6 8 5

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

12
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 12 21 6 8 5

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

13
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 12 21 6 8 5

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

14
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 12 21 6 8 5

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

15
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 12 21 6 8 5

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

16
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 12 21 6 8 5

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

17
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 12 21 6 8 5

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

18
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 21 6 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

19
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 21 6 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

20
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 21 6 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

21
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 21 6 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

22
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 21 6 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

23
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 21 6 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

24
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 21 6 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

25
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 21 6 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

26
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 21 6 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

27
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 21 6 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

28
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 21 6 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

29
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 21 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

30
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 21 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

31
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 21 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

32
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 21 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

33
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 21 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

34
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 21 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

35
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 21 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

36
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 21 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

37
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 21 8 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

38
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 8 21 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

39
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 8 21 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

40
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 8 21 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

41
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 8 21 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

42
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 8 21 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

43
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 8 21 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

44
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 8 21 12

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

45
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 8 12 21

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

46
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 8 12 21

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

47
1 st Sorting Algorithm
L U
Algorithm ColumnWiseSorting(List[L…U]) 5 6 8 12 21

For i = L to U-1 i j
Min
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

48
Constructing Our 2nd Sorting Algorithm
After each step, we insert the next element in its correct position in the sorted sub-list.
[Figure: sorting blocks combined in a triangular arrangement; the example values 5, 4, 3, 1, 2 are inserted one at a time into a growing sorted sub-list, and the sorted list 1 2 3 4 5 comes out at the bottom.]
49
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 12 21 6 8 5

For i = L+1 to U
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break
List[ j+1 ] = List[ j ]
List[ j+1 ] = value
50
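■ The same row-wise idea (i.e. insertion sort, as it is named later in this lecture) as a runnable Python sketch:

def insertion_sort(lst):
    # Insert each element into its correct position within the sorted prefix.
    for i in range(1, len(lst)):
        value = lst[i]
        j = i - 1
        while j >= 0 and lst[j] > value:   # shift larger elements one slot to the right
            lst[j + 1] = lst[j]
            j -= 1
        lst[j + 1] = value
    return lst

# Example: insertion_sort([12, 21, 6, 8, 5]) returns [5, 6, 8, 12, 21]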
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 12 21 6 8 5

For i = L+1 to U i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break
List[ j+1 ] = List[ j ]
List[ j+1 ] = value
51
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 12 21 6 8 5

For i = L+1 to U i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 21

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
52
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 12 21 6 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 21

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
53
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 12 21 6 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 21

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
54
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 12 21 6 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 21

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
55
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 12 21 6 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 21

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
56
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 12 21 6 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 21

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
57
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 12 21 6 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 6

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
58
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 12 21 6 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 6

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
59
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 12 21 6 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 6

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
60
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 12 21 21 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 6

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
61
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 12 21 21 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 6

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
62
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 12 21 21 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 6

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
63
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 12 12 21 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 6

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
64
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 12 12 21 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 6

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
65
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 12 21 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 6

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
66
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 12 21 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 6

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
67
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 12 21 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 8

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
68
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 12 21 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 8

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
69
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 12 21 8 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 8

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
70
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 12 21 21 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 8

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
71
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 12 21 21 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 8

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
72
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 12 21 21 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 8

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
73
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 12 12 21 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 8

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
74
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 12 12 21 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 8

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
75
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 12 12 21 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 8

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
76
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 12 12 21 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 8

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
77
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 8 12 21 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 8

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
78
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 8 12 21 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 8

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
79
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 8 12 21 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 5

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
80
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 8 12 21 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 5

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
81
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 8 12 21 5

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 5

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
82
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 8 12 21 21

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 5

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
83
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 8 12 21 21

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 5

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
84
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 8 12 21 21

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 5

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
85
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 8 12 12 21

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 5

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
86
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 8 12 12 21

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 5

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
87
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 8 12 12 21

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 5

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
88
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 8 8 12 21

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 5

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
89
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 8 8 12 21

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 5

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
90
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 8 8 12 21

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 5

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
91
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 6 8 12 21

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 5

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
92
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 6 6 8 12 21

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 5

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
93
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 5 6 8 12 21

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 5

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
94
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 5 6 8 12 21

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 5

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
95
2 nd Sorting Algorithm
L U
Algorithm RowWiseSorting(List[L…U]) 5 6 8 12 21

For i = L+1 to U j i
value=List[i]
For j = i-1 to L
If List[ j ] < value value
break 5

List[ j+1 ] = List[ j ]


List[ j+1 ] = value
96
Comparing Both Algorithms
Column-wise: After each step, we get / SELECT the minimum element of the (remaining) list.
ColumnWiseSorting(List[L…U])
For i = L to U-1
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

Row-wise: After each step, we INSERT the next element in its correct position in the sorted sub-list.
RowWiseSorting(List[L…U])
For i = L+1 to U
value=List[i]
For j = i-1 to L
If List[ j ] < value
break
List[ j+1 ] = List[ j ]
List[ j+1 ] = value
97
Comparing Both Algorithms
SELECTION SORT:
ColumnWiseSorting(List[L…U])
For i = L to U-1
Min=i
For j = i+1 to U
If List[ j ] < List[Min]
Min=j
Swap List[i]↔List[Min]

INSERTION SORT:
RowWiseSorting(List[L…U])
For i = L+1 to U
value=List[i]
For j = i-1 to L
If List[ j ] < value
break
List[ j+1 ] = List[ j ]
List[ j+1 ] = value
98
Another Basic Sorting Block

■ Before starting to learn our third
sorting algorithm, let's have a look at
another very basic sorting block.

■ It also takes two numbers in a
circle/bubble and tells which one is
smaller and which one is larger.

[Figure: a bubble comparator with inputs 4 and 3; it outputs them in sorted order as 3, 4.]

99
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] ↔ List[ j+1 ]

L U
12 21 6 8 5

100
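■ A runnable Python sketch of the same bubble sort:

def bubble_sort(lst):
    # After each outer pass, the largest remaining element has bubbled to position i.
    n = len(lst)
    for i in range(n - 1, 0, -1):
        for j in range(i):
            if lst[j] > lst[j + 1]:
                lst[j], lst[j + 1] = lst[j + 1], lst[j]
    return lst

# Example: bubble_sort([12, 21, 6, 8, 5]) returns [5, 6, 8, 12, 21]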
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
12 21 6 8 5

i 101
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
12 21 6 8 5

j
i 102
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
12 21 6 8 5

j
i 103
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
12 21 6 8 5

j
i 104
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
12 21 6 8 5

j
i 105
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
12 6 21 8 5

j
i 106
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
12 6 21 8 5

j
i 107
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
12 6 21 8 5

j
i 108
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
12 6 8 21 5

j
i 109
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
12 6 8 21 5

j
i 110
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
12 6 8 21 5

j
i 111
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
12 6 8 5 21

j
i 112
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
12 6 8 5 21

j
i 113
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
12 6 8 5 21

j
i 114
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
12 6 8 5 21

j
i 115
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
12 6 8 5 21

j
i 116
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 12 8 5 21

j
i 117
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 12 8 5 21

j
i 118
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 12 8 5 21

j
i 119
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 8 12 5 21

j
i 120
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 8 12 5 21

j
i 121
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 8 12 5 21

j
i 122
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 8 5 12 21

j
i 123
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 8 5 12 21

j
i 124
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 8 5 12 21

j
i 125
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 8 5 12 21

j
i 126
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 8 5 12 21

j
i 127
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 8 5 12 21

j
i 128
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 8 5 12 21

j
i 129
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 5 8 12 21

j
i 130
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 5 8 12 21

j
i 131
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 5 8 12 21

j
i 132
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 5 8 12 21

j
i 133
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
6 5 8 12 21

j
i 134
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
5 6 8 12 21

j
i 135
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
5 6 8 12 21

j
i 136
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
5 6 8 12 21

j
i 137
3 rd Sorting Algorithm : Bubble Sort
Algorithm BubbleSort(List[L…U])
For i = U-1 to L
For j = L to i
If List[ j ] > List[ j+1 ]
Swap List[ j ] = List[ j+1 ]

L U
5 6 8 12 21

j
i 138
Shell Sort
■ In the insertion sort, we took an element out and we
compared it to the elements before it.

■ The elements before it formed a sorted sub-list.

■ If the value at the far right was the smallest one, it took
about (n-1) shifts to place it in the correct position.

■ The idea behind the shell sort is to avoid such large shifts.
139
Shell Sort
Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
Swap List[ i+gap ] ↔ List[ i ]

140
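■ A runnable Python sketch of the same gap-based idea (0-indexed; the gap is halved each round, and within a round elements a gap apart are compared and swapped, as in the pseudocode above):

def shell_sort(lst):
    n = len(lst)
    gap = n // 2
    while gap > 0:
        for j in range(gap, n):                 # right element of each gapped pair
            i = j - gap
            while i >= 0:
                if lst[i + gap] > lst[i]:       # pair already in order: stop
                    break
                lst[i], lst[i + gap] = lst[i + gap], lst[i]
                i -= gap
        gap //= 2
    return lst

# Example: shell_sort([2, 7, 9, 5, 23, 29, 15, 9, 31]) returns [2, 5, 7, 9, 9, 15, 23, 29, 31]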
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
141
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
142
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
5

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
143
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
5

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
144
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
5

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
145
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
6

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
146
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
6

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
147
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
6

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
148
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
6

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
149
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
7

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
150
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
7

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
151
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
7

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
152
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
7

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
153
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
8

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
154
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
8

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
155
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
8

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
156
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
8

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
157
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
9

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
158
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
9

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
159
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
9

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
160
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
9

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
161
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 4
2 7 9 5 23 29 15 9 31
j
i i+gap
10

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
162
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 7 9 5 23 29 15 9 31
j
i i+gap
10

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
163
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 7 9 5 23 29 15 9 31
j
i i+gap
3

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
164
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 7 9 5 23 29 15 9 31
j
i i+gap
3

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
165
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 7 9 5 23 29 15 9 31
j
i i+gap
3

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
166
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 7 9 5 23 29 15 9 31
j
i i+gap
3

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
167
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 7 9 5 23 29 15 9 31
j
i i+gap
4

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
168
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 7 9 5 23 29 15 9 31
j
i i+gap
4

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
169
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 7 9 5 23 29 15 9 31
j
i i+gap
4

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
170
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 23 29 15 9 31
j
i i+gap
4

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
171
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 23 29 15 9 31
j
i i+gap
4

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
172
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 23 29 15 9 31
j
i i+gap
5

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
173
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 23 29 15 9 31
j
i i+gap
5

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
174
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 23 29 15 9 31
j
i i+gap
5

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
175
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 23 29 15 9 31
j
i i+gap
5

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
176
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 23 29 15 9 31
j
i i+gap
6

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
177
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 23 29 15 9 31
j
i i+gap
6

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
178
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 23 29 15 9 31
j
i i+gap
6

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
179
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 23 29 15 9 31
j
i i+gap
6

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
180
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 23 29 15 9 31
j
i i+gap
7

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
181
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 23 29 15 9 31
j
i i+gap
7

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
182
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 23 29 15 9 31
j
i i+gap
7

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
183
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 15 29 23 9 31
j
i i+gap
7

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
184
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 15 29 23 9 31
j
i i+gap
7

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
185
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 15 29 23 9 31
j
i i+gap
7

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
186
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 15 29 23 9 31
j
i i+gap
7

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
187
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 15 29 23 9 31
j
i i+gap
8

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
188
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 15 29 23 9 31
j
i i+gap
8

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
189
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 15 29 23 9 31
j
i i+gap
8

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
190
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 15 9 23 29 31
j
i i+gap
8

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
191
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 15 9 23 29 31
j
i i+gap
8

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
192
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 15 9 23 29 31
j
i i+gap
8

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
193
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 15 9 23 29 31
j
i i+gap
8

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
194
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 15 9 23 29 31
j
i i+gap
9

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
195
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 15 9 23 29 31
j
i i+gap
9

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
196
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 15 9 23 29 31
j
i i+gap
9

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
197
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 15 9 23 29 31
j
i i+gap
9

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
198
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 2
2 5 9 7 15 9 23 29 31
j
i i+gap
10

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
199
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 9 7 15 9 23 29 31
j
i i+gap
10

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
200
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 9 7 15 9 23 29 31
j
i i+gap
2

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
201
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 9 7 15 9 23 29 31
j
i i+gap
2

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
202
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 9 7 15 9 23 29 31
j
i i+gap
2

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
203
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 9 7 15 9 23 29 31
j
i i+gap
2

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
204
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 9 7 15 9 23 29 31
j
i i+gap
3

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
205
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 9 7 15 9 23 29 31
j
i i+gap
3

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
206
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 9 7 15 9 23 29 31
j
i i+gap
3

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
207
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 9 7 15 9 23 29 31
j
i i+gap
3

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
208
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 9 7 15 9 23 29 31
j
i i+gap
4

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
209
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 9 7 15 9 23 29 31
j
i i+gap
4

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
210
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 9 7 15 9 23 29 31
j
i i+gap
4

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
211
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 15 9 23 29 31
j
i i+gap
4

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
212
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 15 9 23 29 31
j
i i+gap
4

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
213
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 15 9 23 29 31
j
i i+gap
4

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
214
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 15 9 23 29 31
j
i i+gap
4

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
215
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 15 9 23 29 31
j
i i+gap
5

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
216
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 15 9 23 29 31
j
i i+gap
5

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
217
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 15 9 23 29 31
j
i i+gap
5

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
218
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 15 9 23 29 31
j
i i+gap
5

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
219
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 15 9 23 29 31
j
i i+gap
6

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
220
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 15 9 23 29 31
j
i i+gap
6

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
221
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 15 9 23 29 31
j
i i+gap
6

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
222
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 9 15 23 29 31
j
i i+gap
6

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
223
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 9 15 23 29 31
j
i i+gap
6

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
224
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 9 15 23 29 31
j
i i+gap
6

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
225
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 9 15 23 29 31
j
i i+gap
6

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
226
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 9 15 23 29 31
j
i i+gap
7

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
227
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 9 15 23 29 31
j
i i+gap
7

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
228
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 9 15 23 29 31
j
i i+gap
7

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
229
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 9 15 23 29 31
j
i i+gap
8

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
230
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 9 15 23 29 31
j
i i+gap
8

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
231
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 9 15 23 29 31
j
i i+gap
8

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
232
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 9 15 23 29 31
j
i i+gap
9

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
233
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 9 15 23 29 31
j
i i+gap
9

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
234
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 9 15 23 29 31
j
i i+gap
9

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
235
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 1
2 5 7 9 9 15 23 29 31
j
i i+gap
10

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
236
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 0
2 5 7 9 9 15 23 29 31
j
i i+gap
10

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
237
gap L U
1 2 3 4 5 6 7 8 9
Shell Sort 0
2 5 7 9 9 15 23 29 31
j
i i+gap
10

Algorithm ShellSort(List[L…U])
For gap = U/2 to 1: gap=gap/2
For j = gap+1 to U
For i = j – gap to L: i = i - gap
If List[ i+gap ] > List[ i ]
break
Else
List[ i+gap ] ↔ List[ i ]
238
Time & Space Complexity of Shell Sort

S(n) = O(1)

T(n) = O(n²)       Worst Case
T(n) = O(n·log n)  Best Case
T(n) = O(n·log n)  Average Case

239
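The sketch below is a minimal, runnable Python version of the Shell Sort traced above, using 0-based list indexing; the name shell_sort and the sample call are illustrative additions, not part of the slides.

def shell_sort(lst):
    # Halve the gap each pass; within a pass perform a gapped insertion
    # step that stops early ("break") as soon as the pair is already in
    # order, exactly as in the pseudocode above.
    n = len(lst)
    gap = n // 2
    while gap >= 1:
        for j in range(gap, n):              # slides: j = gap+1 .. U (1-based)
            i = j - gap
            while i >= 0:                    # slides: i = j-gap down to L
                if lst[i + gap] > lst[i]:    # already in order for this gap
                    break
                lst[i + gap], lst[i] = lst[i], lst[i + gap]
                i -= gap
        gap //= 2
    return lst

print(shell_sort([2, 5, 9, 7, 15, 9, 23, 29, 31]))   # [2, 5, 7, 9, 9, 15, 23, 29, 31]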
Assignment #3
■ Find the Time & Space Complexity of:
– Bubble Sort
– Selection Sort
– Insertion Sort

■ Give Best, Average & Worst Case for


Time Complexity

240
Assignment #3

■ Instructions:
– Single .pdf file to be uploaded on LMS.
– Groups are allowed but not mandatory.
– Maximum group size is 5.

■ Deadline: Sunday, 07 January, 2021; before 23:55 HRS

241
242
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 29 & 30
Quick Sort, Merge Sort,
Linear Time Sorting
(Counting Sort, Radix Sort)
INSTRUCTOR: Sulaman Ahmad Naz
Sorting w.r.t Time Complexity

Sorting Algorithms

Polynomial Time     n·Logarithmic Time     Linear Time
Bubble Sort         Quick Sort             Counting Sort
Selection Sort      Merge Sort             Radix Sort
Insertion Sort      Heap Sort              Bucket Sort
Shell Sort
2
Divide & Conquer : Merge Sort

5
Sorting While Merging
[Figure-only slides 6–9: two already-sorted sub-lists are merged into one sorted list, one comparison at a time.]
9
Merge Sort

■ Merge sort is a sorting technique based on the divide-and-conquer approach.

■ Merge sort first divides the array into equal halves and then
combines them in a sorted manner.
– MERGE procedure

10
MERGE Function
Algorithm Merge(A[],p,q,r)
New Array Temp[1…r-p+1]
i=p,j=q+1
for k=1 to (r-p+1)
if i>q, Temp[k]=A[j++]
elseif j>r, Temp[k]=A[i++]
elseif A[i] <= A[j], Temp[k]=A[i++]
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]
11
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
else Temp[k]=A[j++]

q+1 r Copy Temp[1…k] to A[p…r]

3 12 27 28

15 16 17 18

p q q+1 r

A 1 2 9 20 25 3 12 27 28

10 11 12 13 14 15 16 17 18 12
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28

15 16 17 18

13
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28

15 16 17 18

Temp
1 2 3 4 5 6 7 8 9
14
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp
1 2 3 4 5 6 7 8 9
15
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp k
1 2 3 4 5 6 7 8 9
16
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp k
1 2 3 4 5 6 7 8 9
17
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp k
1 2 3 4 5 6 7 8 9
18
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1
k
1 2 3 4 5 6 7 8 9
19
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1
k
1 2 3 4 5 6 7 8 9
20
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1
k
1 2 3 4 5 6 7 8 9
21
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1
k
1 2 3 4 5 6 7 8 9
22
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2
k
1 2 3 4 5 6 7 8 9
23
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2
k
1 2 3 4 5 6 7 8 9
24
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2
k
1 2 3 4 5 6 7 8 9
25
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2
k
1 2 3 4 5 6 7 8 9
26
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3
k
1 2 3 4 5 6 7 8 9
27
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3
k
1 2 3 4 5 6 7 8 9
28
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3
k
1 2 3 4 5 6 7 8 9
29
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3 9
k
1 2 3 4 5 6 7 8 9
30
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3 9
k
1 2 3 4 5 6 7 8 9
31
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3 9
k
1 2 3 4 5 6 7 8 9
32
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3 9
k
1 2 3 4 5 6 7 8 9
33
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3 9 12
k
1 2 3 4 5 6 7 8 9
34
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3 9 12
k
1 2 3 4 5 6 7 8 9
35
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3 9 12
k
1 2 3 4 5 6 7 8 9
36
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3 9 12 20
k
1 2 3 4 5 6 7 8 9
37
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3 9 12 20
k
1 2 3 4 5 6 7 8 9
38
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3 9 12 20
k
1 2 3 4 5 6 7 8 9
39
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3 9 12 20 25
k
1 2 3 4 5 6 7 8 9
40
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3 9 12 20 25
k
1 2 3 4 5 6 7 8 9
41
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3 9 12 20 25 27
k
1 2 3 4 5 6 7 8 9
42
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3 9 12 20 25 27
k
1 2 3 4 5 6 7 8 9
43
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3 9 12 20 25 27 28
k
1 2 3 4 5 6 7 8 9
44
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 9 20 25 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

3 12 27 28
j
15 16 17 18

Temp 1 2 3 9 12 20 25 27 28
k
1 2 3 4 5 6 7 8 9
45
Algorithm Merge(A[],p,q,r)

MERGE Function
New Array Temp[1…r-p+1]
i=p,j=q+1

p q for k=1 to (r-p+1)


if i>q, Temp[k]=A[j++]
1 2 3 9 12 elseif j>r, Temp[k]=A[i++]
i elseif A[i] <= A[j], Temp[k]=A[i++]
10 11 12 13 14
A q+1 r
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]

20 25 27 28
j
15 16 17 18

Temp 1 2 3 9 12 20 25 27 28
k
1 2 3 4 5 6 7 8 9
46
Analysis of “MERGE”

Algorithm Merge(A[],p,q,r)
S(n) = n+c = O(n)
New Array Temp[1…r-p+1]
i=p,j=q+1 T(n) = c+n+n(c)
for k=1 to (r-p+1)
if i>q, Temp[k]=A[j++]
= O(n)
elseif j>r, Temp[k]=A[i++]
elseif A[i] <= A[j], Temp[k]=A[i++]
else Temp[k]=A[j++]
Copy Temp[1…k] to A[p…r]
Merge Sort: Divide & Conquer

48
MERGE SORT

Algorithm MergeSort(A[],p,r)
if p<r
q=(p+r)/2
MergeSort(A,p,q)
MergeSort(A,q+1,r)
Merge(A,p,q,r)

49
Merge Sort: Divide & Conquer
[Recursion-tree figure: MS(1,9) splits into MS(1,5) and MS(6,9); these split further into MS(1,3), MS(4,5), MS(6,7), MS(8,9), and so on down to the single-element calls MS(1,1) … MS(9,9). Each pair of sorted halves is then merged on the way back up.]
50
Analysis of MERGE SORT

Algorithm MergeSort(A[],p,r)
S(n) = O(n) + O(log2n)
if p<r = O(n)
q=(p+r)/2
MergeSort(A,p,q) T(n) = 2T(n/2)+O(n)
MergeSort(A,q+1,r)
T(1) = c = O(1)
Merge(A,p,q,r) By Master’s Theorem
T(n) = O(n.log2n)
51
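As a concrete companion to the MERGE and MergeSort pseudocode above, here is a short runnable Python sketch; it uses 0-based inclusive indices, and the names merge and merge_sort are illustrative rather than taken from the slides.

def merge(A, p, q, r):
    # Merge the sorted runs A[p..q] and A[q+1..r] through a temporary list,
    # then copy the result back into A[p..r], as in the MERGE procedure.
    temp = []
    i, j = p, q + 1
    while i <= q or j <= r:
        if i > q:                       # left run exhausted
            temp.append(A[j]); j += 1
        elif j > r:                     # right run exhausted
            temp.append(A[i]); i += 1
        elif A[i] <= A[j]:
            temp.append(A[i]); i += 1
        else:
            temp.append(A[j]); j += 1
    A[p:r + 1] = temp

def merge_sort(A, p, r):
    if p < r:
        q = (p + r) // 2
        merge_sort(A, p, q)
        merge_sort(A, q + 1, r)
        merge(A, p, q, r)

data = [1, 2, 9, 20, 25, 3, 12, 27, 28]
merge_sort(data, 0, len(data) - 1)
print(data)   # [1, 2, 3, 9, 12, 20, 25, 27, 28]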
Quick Sort
■ Quick sort is also based on the divide and conquer technique.

■ Quick sort takes a value from the array as a pivot element and
divides the array in such a way that all the elements to the
left of the pivot are smaller than the pivot, and all the elements
to its right are greater than or equal to it.
■ Hence, we place the pivot element in its correct sorted
position and we get two sub-lists; one on the left and the other on
the right side of the pivot element.
– PLACE&DIVIDE procedure

52
Divide & Conquer : Quick Sort

53
PLACE&DIVIDE Function
Algorithm Place&Divide(A[],p,r)
x=A[r]
i=p-1
for j=p to (r-1)
if A[j]<=x
i++
if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division
54
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
i++
if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18

55
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18

56
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
i
57
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
i j
58
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
i j
59
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
ij
60
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
ij
61
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
i j
62
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
i j
63
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
ij
64
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
ij
65
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
i j
66
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
i j
67
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
ij
68
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
ij
69
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
i j
70
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
i j
71
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
i j
72
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
i j
73
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
i j
74
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
i j
75
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 8 7 4 6 9 5

10 11 12 13 14 15 16 17 18
i j
76
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 4 7 8 6 9 5

10 11 12 13 14 15 16 17 18
i j
77
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 4 7 8 6 9 5

10 11 12 13 14 15 16 17 18
i j
78
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 4 7 8 6 9 5

10 11 12 13 14 15 16 17 18
i j
79
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 4 7 8 6 9 5

10 11 12 13 14 15 16 17 18
i j
80
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 4 7 8 6 9 5

10 11 12 13 14 15 16 17 18
i j
81
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 4 7 8 6 9 5

10 11 12 13 14 15 16 17 18
i j
82
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 4 5 8 6 9 7

10 11 12 13 14 15 16 17 18
i j
83
Algorithm Place&Divide(A[],p,r)

PLACE&DIVIDE Function x=A[r]


i=p-1
for j=p to (r-1)
if A[j]<=x
x
i++
5 if(i≠j), A[i]↔A[j]
A[i+1]↔A[r]
return i+1 as index of division

p r

A 2 1 3 4 5 8 6 9 7

10 11 12 13 14 15 16 17 18
i j
84
QUICK SORT
Algorithm QuickSort(A[],p,r)
if p<r
q=Place&Divide(A,p,r)
QuickSort(A,p,q-1)
QuickSort(A,q+1,r)

p q-1 q q+1 r
A 2 1 3 4 5 8 6 9 7

85
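A minimal runnable Python sketch of PLACE&DIVIDE and QuickSort as given above is shown next; indices are 0-based, and the names place_and_divide and quick_sort are illustrative.

def place_and_divide(A, p, r):
    # Partition A[p..r] around the last element (the pivot) and
    # return the pivot's final index, as in the slides.
    x = A[r]
    i = p - 1
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            if i != j:
                A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]      # place the pivot in its sorted position
    return i + 1

def quick_sort(A, p, r):
    if p < r:
        q = place_and_divide(A, p, r)
        quick_sort(A, p, q - 1)
        quick_sort(A, q + 1, r)

data = [2, 1, 3, 8, 7, 4, 6, 9, 5]
quick_sort(data, 0, len(data) - 1)
print(data)   # [1, 2, 3, 4, 5, 6, 7, 8, 9]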
Analysis of Place&Divide
Algorithm Place&Divide(A[],p,r)
x=A[r] S(n) = c
i=p-1
for j=p to (r-1)
= O(1)
if A[j]<=x
i++ T(n) = c+(n-1)c
if(i≠j), A[i]↔A[j]
= O(n)
A[i+1]↔A[r]
return i+1 as index of division

86
Analysis of QuickSort
Best Case
Algorithm QuickSort(A[],p,r) S(n) = O(1) + O(log2n)
if p<r = O(log2n)
q=Place&Divide(A,p,r)
QuickSort(A,p,q-1) T(n) = 2T(n/2) + O(n)
QuickSort(A,q+1,r) T(1) = c = O(1)
By Master’s Theorem
T(n) = O(n.log2n)
87
Analysis of QuickSort
Worst Case
Algorithm QuickSort(A[],p,r) S(n) = O(1) + O(n)
if p<r = O(n)
q=Place&Divide(A,p,r)
QuickSort(A,p,q-1) T(n) = T(n-1) + O(n)
QuickSort(A,q+1,r) T(1) = c = O(1)
By Back-Substitution Method
T(n) = O(n2)
88
Practice Question
■ Run Quick Sort Algorithm on the following Lists.

1 2 3 4 5 6 7 8 9 10

5 5 5 5 5 5 5 5 5 5

■ Remember that both the lists are already sorted.

89
Linear Time Sorting
■ The lower bound for comparison-based sorting algorithms (Merge
Sort, Heap Sort, Quick Sort, etc.) is Ω(n·log n), i.e., they cannot do
better than n·log n comparisons in the worst case.
■ However, we do have some non-comparison-based algorithms
that run faster than O(n·log n) time, but they require special
assumptions about the input sequence to be sorted.
■ Examples of sorting algorithms that run in linear time are:
– Counting sort,
– Radix sort and
– Bucket sort
90
Counting Sort
■ Counting sort is a sorting technique that sorts the elements of an
array by counting the number of occurrences of each unique
element in the array.

■ It is very efficient when the range of values is very small.

■ For a larger range of values, it is not so efficient.

91
Counting Sort
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1
New Array Temp[1…Range]
Adjust=1-Min(A[])
for i=L to U
Temp[A[i]+Adjust]++
i=L
for j=1 to Range
for k=1 to Temp[j]
A[i++]=j-Adjust
92
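Below is a small runnable Python sketch of the counting sort above; the minimum value plays the role of the Adjust offset, so negative keys (as in the worked example that follows) are handled. The name counting_sort is illustrative.

def counting_sort(A):
    # Count how many times each value occurs, then rewrite A in order.
    if not A:
        return A
    lo, hi = min(A), max(A)
    counts = [0] * (hi - lo + 1)          # Temp[1..Range] in the slides
    for v in A:
        counts[v - lo] += 1               # Adjust maps the minimum to index 0
    i = 0
    for offset, c in enumerate(counts):
        for _ in range(c):
            A[i] = offset + lo
            i += 1
    return A

print(counting_sort([1, 0, 3, -1, 3, 2]))   # [-1, 0, 1, 2, 3, 3]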
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
for k=1 to Temp[j]
A[i++]=j-Adjust

93
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
for k=1 to Temp[j]
A[i++]=j-Adjust

Range
5

94
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Range
Temp 0 0 0 0 0
5

95
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 0 0 0 0 0
2 5

96
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 0 0 0 0 0
2 5

97
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 0 0 1 0 0
2 5

98
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 0 0 1 0 0
2 5

99
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 0 1 1 0 0
2 5

100
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 0 1 1 0 0
2 5

101
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 0 1 1 0 1
2 5

102
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 0 1 1 0 1
2 5

103
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 0 1
2 5

104
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 0 1
2 5

105
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 0 2
2 5

106
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 0 2
2 5

107
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5

108
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5

109
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5

110
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5
j
111
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5
j
112
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
―1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5
j
113
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
―1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5
j
114
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
―1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5
j
115
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
―1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5
j
116
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
―1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5
j
117
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
―1 0 3 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5
j
118
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
―1 0 1 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5
j
119
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
―1 0 1 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5
j
120
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
―1 0 1 ―1 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5
j
121
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
―1 0 1 2 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5
j
122
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
―1 0 1 2 3 2
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5
j
123
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
―1 0 1 2 3 3
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5
j
124
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
―1 0 1 2 3 3
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5
j
125
Algorithm CountingSort(A[L…U])
Range=Max(A[])-Min(A[])+1

Counting Sort New Array Temp[1…Range]


Adjust=1-Min(A[])
L U for i=L to U

A
Temp[A[i]+Adjust]++
―1 0 1 2 3 3
i=L
for j=1 to Range
i
for k=1 to Temp[j]
A[i++]=j-Adjust

1 2 3 4 5
Adjust Range
Temp 1 1 1 1 2
2 5
j
126
Analysis of Counting Sort
Algorithm CountingSort(A[L…U])
S(n) = k + c
Range=Max(A[])-Min(A[])+1
New Array Temp[1…Range] = O(k)
Adjust=1-Min(A[])
for i=L to U
Temp[A[i]+Adjust]++
T(n) = c+nc+(k+n)+n
i=L = O(n+k)
for j=1 to Range
for k=1 to Temp[j]
A[i++]=j-Adjust

127
Radix Sort
■ Counting sort is a linear-time sorting algorithm that sorts in O(n+k)
time when the elements are in the range from 1 to k.
■ What if the elements are in the range from 1 to n²?
■ We can’t use counting sort, because counting sort would take O(n²),
which is worse than comparison-based sorting algorithms.
■ Can we sort such an array in linear time?
■ Radix Sort will do so. The idea of Radix Sort is to do a digit-by-digit
sort, starting from the least significant digit to the most significant digit.
■ Radix sort uses counting sort as a subroutine to sort.
128
Radix Sort
Input    After 1s digit    After 10s digit    After 100s digit
182      182               005                005
523      192               523                083
005      523               233                182
233      233               182                192
192      083               083                233
083      005               192                523
129
Radix Sort
Algorithm RadixSort(A[L…U],d)
    For each digit position <d> = LSD to MSD
        CountingSort(A[]) w.r.t <d>

S(n) = 10 + c = O(1)
T(n) = d * O(n+k) = d * O(n+10) = O(n·d) = O(n), since the number of digits d is a constant
130
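A minimal Python sketch of LSD radix sort for non-negative integers follows; for brevity it uses ten stable buckets per digit pass instead of calling the counting-sort routine literally, which is a choice of this sketch rather than of the slides. The name radix_sort is illustrative.

def radix_sort(A):
    # One stable pass per decimal digit, from least significant to most.
    if not A:
        return A
    exp = 1
    while max(A) // exp > 0:
        buckets = [[] for _ in range(10)]        # digits 0..9, order preserved
        for v in A:
            buckets[(v // exp) % 10].append(v)
        A = [v for bucket in buckets for v in bucket]
        exp *= 10
    return A

print(radix_sort([182, 523, 5, 233, 192, 83]))   # [5, 83, 182, 192, 233, 523]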
131
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 31, 32 & 33


Trees ADT,
Binary Search Tree,
Heap, Analysis of Heap Sort
INSTRUCTOR: Sulaman Ahmad Naz
Trees
■ A tree is a nonlinear abstract data type that stores elements in a
hierarchy.
■ Examples in real life:
– Family tree
– Table of contents of a book
– Class inheritance hierarchy in OOP
– Computer file system (folders and subfolders)
– Decision trees

2
Example: Computer File System

Root directory of C drive

Documents and Settings Program Files My Music

Desktop Favorites Start Menu Adobe Microsoft Office

3
Tree Definition
■ Tree: a set of elements that either

– it is empty

– or, it has a distinguished element called the root and


zero or more trees (called subtrees of the root)

4
Tree Definition
Root

Subtrees of
the root

5
Tree Terminology
■ Nodes: the elements in the tree

■ Edges: connections between nodes

■ Root: the distinguished element that is the origin of the tree


– There is only one root node in a tree

■ Empty tree has no nodes and no edges

6
Tree Terminology
edges Root

nodes
7
Tree Terminology

■ Parent or predecessor: the node directly above another node in


the hierarchy
– A node can have only one parent

■ Child or successor: a node directly below another node in the


hierarchy

■ Siblings: nodes that have the same parent

8
Tree Terminology
■ Ancestors of a node: its parent, the parent of its parent, etc.

■ Descendants of a node: its children, the children of its children,


etc.

■ Leaf node: a node without children

■ Internal node: a node that is not a leaf node

9
Tree Terminology
Interior or
internal Root
nodes

Leaf
nodes 10
Height of a Tree
■ A path is a sequence of edges leading from one node to another

■ Length of a path: number of edges on the path

■ Height of a (non-empty) tree : length of the longest path from the


root to a leaf
– What is the height of a tree that has only a root node?
– By convention, the height of an empty tree is -1

11
Level of a Node

■ Level of a node: number of edges between root and the node

■ It can be defined recursively:


– Level of root node is 0
– Level of a node that is not the root node is level of its parent + 1

12
Level of a Node
Level 0

Level 1

Level 2

Level 3

13
Subtrees
■ Subtree of a node: consists of a child node and all its
descendants:

– A subtree is itself a tree

– A node may have many subtrees

14
Subtrees

Subtrees of the
node labeled E

15
More Tree Terminology

■ Degree or arity of a node:


– the number of children it has

■ Degree or arity of a tree:


– the maximum of the degrees of the tree’s nodes

16
Binary Trees
■ General tree: a tree each of whose nodes may have any number
of children
■ n-ary tree: a tree each of whose nodes may have no more than n
children
■ Binary tree: a tree each of whose nodes may have no more than 2
children
– i.e. a binary tree is a tree with degree (arity) 2
■ The children (if present) are called the left child and right child

17
Binary Tree
A

B C

D E F G

H I

18
Array Based Implementation of BT
A
LeftChild(k)=k*2
RightChild(k)=k*2+1
B C
Parent(k)=k/2
D E F G

H I

Root
1 2 3 4 5 6 7 8 9 10 11 12

A B C D E F G Ø H Ø Ø I
19
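The index arithmetic on this slide can be checked with a tiny Python sketch; it stores 1-based positions in a list padded with a dummy element at index 0, and the helper names are illustrative.

def left_child(k):  return 2 * k        # LeftChild(k) = k*2
def right_child(k): return 2 * k + 1    # RightChild(k) = k*2+1
def parent(k):      return k // 2       # Parent(k) = k/2 (integer division)

# Index 0 is unused so that the root sits at position 1, as on the slide.
tree = [None, 'A', 'B', 'C', 'D', 'E', 'F', 'G', None, 'H', None, None, 'I']
print(tree[left_child(2)], tree[right_child(2)], tree[parent(9)])   # D E D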
Array Based Implementation of BT
A
LeftChild(k)=k*2
RightChild(k)=k*2+1
B C
Parent(k)=k/2
D E F G

H I

Root
1 2 3 4 5 6 7 8 9 10 11 12

A B C D Ø F G Ø H Ø Ø I
20
Array Based Implementation of BT
A
LeftChild(k)=k*2
RightChild(k)=k*2+1
B C
Parent(k)=k/2
D F G

H I

1 2 3 4 5 6 7 8 9 10 15

A B C D Ø F G Ø H Ø I
21
Linked List Based Implementation of BT
[Figure: each node is stored as a record with Parent, Left, and Right pointers; the root's Parent link and every empty child link are NULL. The example shows a small tree containing nodes A, B, C, and E.]
22
Tree Traversals
■ A traversal of a tree requires that each node of the tree be visited
once
– Example: a typical reason to traverse a tree is to display the data stored
at each node of the tree
■ Standard traversal orderings:
– Pre-order
– In-order
– Post-order
– Level-order

23
Traversals
A

B C

D E F G

H I

We’ll trace the different traversals using this tree; recursive calls,
returns, and “visits” will be numbered in the order they occur

24
Pre-order Traversal
■ Start at the root
■ Visit each node, followed by its children; we will choose to visit left
child before right

■ Recursive algorithm for preorder traversal:


– If tree is not empty,
■ Visit root node of tree
■ Perform preorder traversal of its left subtree
■ Perform preorder traversal of its right subtree

25
Pre-order Traversal
[Trace figure: the recursive calls, returns, and node visits are numbered in the order they occur.]

Nodes are visited in the order ABDHECFIG


26
In-order Traversal
■ Start at the root
■ Visit the left child of each node, then the node, then any
remaining nodes

■ Recursive algorithm for inorder traversal


– If tree is not empty,
■ Perform inorder traversal of left subtree of root
■ Visit root node of tree
■ Perform inorder traversal of its right subtree

27
In-order Traversal
[Trace figure: the recursive calls, returns, and node visits are numbered in the order they occur.]

Nodes are visited in the order DHBEAIFCG


28
Post-order Traversal
■ Start at the root
■ Visit the children of each node, then the node

■ Recursive algorithm for postorder traversal


– If tree is not empty,
■ Perform postorder traversal of left subtree of root
■ Perform postorder traversal of right subtree of root
■ Visit root node of tree

29
Post-order Traversal
[Trace figure: the recursive calls, returns, and node visits are numbered in the order they occur.]

Nodes are visited in the order HDEBIFGCA


30
Level Order Traversal

■ Start at the root

■ Visit the nodes at each level, from left to right

31
Level Order Traversal
A

B C

D E F G

H I

Nodes will be visited in the order ABCDEFGHI

32
Tree Traversals
■ Pre-order
Nodes are visited in the order ABDHECFIG

■ In-order
Nodes are visited in the order DHBEAIFCG

■ Post-order
Nodes are visited in the order HDEBIFGCA

■ Level-order
Nodes are visited in the order ABCDEFGHI

33
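A small runnable Python sketch of the three recursive traversals on the example tree of these slides is given below; the Node class and function names are illustrative.

class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def preorder(node, visit):
    if node is None:
        return
    visit(node.key)                 # root, then left subtree, then right subtree
    preorder(node.left, visit)
    preorder(node.right, visit)

def inorder(node, visit):
    if node is None:
        return
    inorder(node.left, visit)       # left subtree, then root, then right subtree
    visit(node.key)
    inorder(node.right, visit)

def postorder(node, visit):
    if node is None:
        return
    postorder(node.left, visit)     # left subtree, then right subtree, then root
    postorder(node.right, visit)
    visit(node.key)

# The example tree used on the traversal slides.
root = Node('A',
            Node('B', Node('D', None, Node('H')), Node('E')),
            Node('C', Node('F', Node('I')), Node('G')))

for walk in (preorder, inorder, postorder):
    out = []
    walk(root, out.append)
    print(''.join(out))   # ABDHECFIG, DHBEAIFCG, HDEBIFGCA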
Traversal Analysis
■ Consider a binary tree with n nodes.

■ How many recursive calls are there at most?


– For each node, 2 recursive calls at most
– So, 2*n recursive calls at most

■ So, time taken for a complete traversal is O(n).


34
Binary Search Tree
■ A Binary Search Tree (BST) is a tree in
which all the nodes follow the below-
mentioned properties:

– The value of left child is less than the


value of its parent node.

– The value of right child is greater than or


equal to the value of its parent node.

35
Basic Operations

■ Following are the basic operations on a BST:

– Search − Searches an element in a BS tree.

– Insert − Inserts an element in a BS tree.

– Delete − Deletes an element in a BS tree.

– Traversal − Traverses a BS tree.

36
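A minimal Python sketch of BST search and insertion, following the "left child < parent <= right child" convention of the previous slide, is shown below; the class and function names are illustrative, and deletion and traversal are omitted for brevity.

class BSTNode:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def bst_search(root, key):
    # Follow left/right links until the key is found or Current falls off the tree.
    current = root
    while current is not None and current.key != key:
        current = current.left if key < current.key else current.right
    return current                      # None means "not found"

def bst_insert(root, key):
    # Walk the search path and attach the new node at the empty position reached.
    if root is None:
        return BSTNode(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    else:
        root.right = bst_insert(root.right, key)
    return root

root = None
for k in [10, 7, 14, 1, 8, 20]:
    root = bst_insert(root, k)
print(bst_search(root, 14) is not None, bst_search(root, 15) is not None)   # True False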
Search in BST
[Figure-only slides 37–45: a Current pointer starts at the Root and moves left or right at each node, first tracing the search for key=31 and then for key=15; the key=15 search ends with Current=NULL, i.e. the key is not in the tree.]
45
Insertion in BST
[Figure-only slides 46–94: a step-by-step example of inserting keys one by one into a BST, each new key following its search path down to an empty child position.]
94
Deletion in BST
[Figure-only slides 95–115: step-by-step examples of deleting nodes from a BST.]
115
Unbalanced BSTs
■ Suppose we have the list: 1, 5, 7, 10, 14, 20
■ After insertion in BST, we get a completely right-skewed tree (each key becomes the right child of the previous one).
■ To search 20, we will have to make “n” comparisons.
■ Height of the tree is (n-1), so T(n)=O(n)
■ Thus, it will be equivalent to linear search.
116
Balanced BSTs
■ If we re-arrange the same list so that the middle keys are inserted first:
■ Now, after insertion in BST, we get a balanced tree (the figure shows 10 at the root, with 7 and 14 as its children and the remaining keys below them).
■ To search any value, we will have to make “log2n” comparisons.
■ Height of the tree is ≈ log2n, so T(n)=O(log2n)
■ Thus, it will be equivalent to binary search.
117
Types of Balanced Trees
■ Now we understand why binary search trees need to be balanced,
and what happens when we have a tree that is not balanced.
■ There are a number of algorithms that can be used to remedy this
issue. Some of the most popular algorithms are:
– Red black trees
– AVL trees
■ Both of these force a BST to remain balanced, and therefore can
guarantee search performance.

118
Complexity in BST

119
Heap
■ A heap is an almost complete binary tree data structure in which each
node's key is compared with its children's keys and arranged accordingly.
■ If the value of every parent is greater than or equal to the values of its
children, the property generates a Max Heap; the mirror-image property
generates a Min Heap. So, based on this criterion, a heap can be of two
types:
– Min Heap
– Max Heap
■ For the sake of convenience, we will implement the heap using arrays.

120
Complete vs. Almost Complete BT

121
Incomplete vs. Almost Complete BT

122
Min-Heap
For Input → 35 33 42 10 14 19 27 44 26 31
The value of any parent node is always less than or equal to either of
its children.

123
Max-Heap
For Input → 35 33 42 10 14 19 27 44 26 31
The value of any parent node is always greater than or equal to either
of its children.

124
Some Properties of Heap
■ As a heap is a complete or almost complete BT, here are some
properties for your convenience:
– Max number of nodes in a CBT of height h:  n = 2^(h+1) − 1
– Height h of a CBT/ACBT with n nodes:  h = ⌊log₂ n⌋
– Leaf nodes of a CBT/ACBT occupy positions ⌊n/2⌋+1 to n
– Max. number of nodes at height h in a CBT/ACBT:  ⌈n / 2^(h+1)⌉

125
MAXHEAPIFY Function
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
if l<=A.HeapSize & A[l]>A[i]
max=l
else max=i
if r<=A.HeapSize & A[r]>A[max]
max=r
if max≠i
A[i]↔A[max]
MaxHeapify(A[],max) 126
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
max=l
else max=i
if r<=A.HeapSize & A[r]>A[max]
max=r
if max≠i
A[i]↔A[max]
MaxHeapify(A[],max)

A 1 5 9 2 8 7 3 4

1 2 3 4 5 6 7 8

127
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
i max=l
1
else max=i
if r<=A.HeapSize & A[r]>A[max]
5 9 max=r
if max≠i

2 8 7 3 A[i]↔A[max]
MaxHeapify(A[],max)

A 1 5 9 2 8 7 3 4

1 2 3 4 5 6 7 8

128
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
i max=l
1
else max=i
l r if r<=A.HeapSize & A[r]>A[max]
5 9 max=r
if max≠i

2 8 7 3 A[i]↔A[max]
MaxHeapify(A[],max)

A 1 5 9 2 8 7 3 4

1 2 3 4 5 6 7 8

129
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
i max=l
1
else max=i
l r if r<=A.HeapSize & A[r]>A[max]
5 9 max=r
if max≠i

2 8 7 3 A[i]↔A[max]
MaxHeapify(A[],max)

A 1 5 9 2 8 7 3 4

1 2 3 4 5 6 7 8

130
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
i max=l
1
else max=i
l r if r<=A.HeapSize & A[r]>A[max]
max
5 9 max=r
if max≠i

2 8 7 3 A[i]↔A[max]
MaxHeapify(A[],max)

A 1 5 9 2 8 7 3 4

1 2 3 4 5 6 7 8

131
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
i max=l
1
else max=i
l r if r<=A.HeapSize & A[r]>A[max]
max
5 9 max=r
if max≠i

2 8 7 3 A[i]↔A[max]
MaxHeapify(A[],max)

A 1 5 9 2 8 7 3 4

1 2 3 4 5 6 7 8

132
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
i max=l
1
else max=i
l r max if r<=A.HeapSize & A[r]>A[max]
5 9 max=r
if max≠i

2 8 7 3 A[i]↔A[max]
MaxHeapify(A[],max)

A 1 5 9 2 8 7 3 4

1 2 3 4 5 6 7 8

133
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
i max=l
1
else max=i
l r max if r<=A.HeapSize & A[r]>A[max]
5 9 max=r
if max≠i

2 8 7 3 A[i]↔A[max]
MaxHeapify(A[],max)

A 1 5 9 2 8 7 3 4

1 2 3 4 5 6 7 8

134
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
i max=l
9
else max=i
l r max if r<=A.HeapSize & A[r]>A[max]
5 1 max=r
if max≠i

2 8 7 3 A[i]↔A[max]
MaxHeapify(A[],max)

A 9 5 1 2 8 7 3 4

1 2 3 4 5 6 7 8

135
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
max=l
9
else max=i
l r max if r<=A.HeapSize & A[r]>A[max]
5 i 1 max=r
if max≠i

2 8 7 3 A[i]↔A[max]
MaxHeapify(A[],max)

A 9 5 1 2 8 7 3 4

1 2 3 4 5 6 7 8

136
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
max=l
9
else max=i
max if r<=A.HeapSize & A[r]>A[max]
5 i 1 max=r
if max≠i
l r
2 8 7 3 A[i]↔A[max]
MaxHeapify(A[],max)

A 9 5 1 2 8 7 3 4

1 2 3 4 5 6 7 8

137
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
max=l
9
else max=i
max if r<=A.HeapSize & A[r]>A[max]
5 i 1 max=r
if max≠i
l r
2 8 7 3 A[i]↔A[max]
MaxHeapify(A[],max)

A 9 5 1 2 8 7 3 4

1 2 3 4 5 6 7 8

138
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
max=l
9
else max=i
if r<=A.HeapSize & A[r]>A[max]
5 i 1 max=r
if max≠i
l r
2 8 7 3 A[i]↔A[max]
MaxHeapify(A[],max)
max
4

A 9 5 1 2 8 7 3 4

1 2 3 4 5 6 7 8

139
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
max=l
9
else max=i
if r<=A.HeapSize & A[r]>A[max]
5 i 1 max=r
if max≠i
l r
2 8 7 3 A[i]↔A[max]
MaxHeapify(A[],max)
max
4

A 9 5 1 2 8 7 3 4

1 2 3 4 5 6 7 8

140
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
max=l
9
else max=i
if r<=A.HeapSize & A[r]>A[max]
5 i 1 max=r
if max≠i
l r
2 8 7 3 A[i]↔A[max]
MaxHeapify(A[],max)
max
4

A 9 5 1 2 8 7 3 4

1 2 3 4 5 6 7 8

141
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
max=l
9
else max=i
if r<=A.HeapSize & A[r]>A[max]
5 i 7 max=r
if max≠i
l r
2 8 1 3 A[i]↔A[max]
MaxHeapify(A[],max)
max
4

A 9 5 7 2 8 1 3 4

1 2 3 4 5 6 7 8

142
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
max=l
9
else max=i
if r<=A.HeapSize & A[r]>A[max]
5 7 max=r
if max≠i
l i r
2 8 1 3 A[i]↔A[max]
MaxHeapify(A[],max)
max
4

A 9 5 7 2 8 1 3 4

1 2 3 4 5 6 7 8

143
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
max=l
9
else max=i
if r<=A.HeapSize & A[r]>A[max]
5 7 max=r

i if max≠i

2 8 1 3 A[i]↔A[max]
MaxHeapify(A[],max)
max
4

l r
A 9 5 7 2 8 1 3 4

1 2 3 4 5 6 7 8

144
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
max=l
9
else max=i
if r<=A.HeapSize & A[r]>A[max]
5 7 max=r

i if max≠i

2 8 1 3 A[i]↔A[max]
MaxHeapify(A[],max)
max
4

l r
A 9 5 7 2 8 1 3 4

1 2 3 4 5 6 7 8

145
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
max=l
9
else max=i
if r<=A.HeapSize & A[r]>A[max]
5 7 max=r

i if max≠i

2 8 1 3 A[i]↔A[max]
MaxHeapify(A[],max)
max
4

l r
A 9 5 7 2 8 1 3 4

1 2 3 4 5 6 7 8

146
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
max=l
9
else max=i
if r<=A.HeapSize & A[r]>A[max]
5 7 max=r

i if max≠i

2 8 1 3 A[i]↔A[max]
MaxHeapify(A[],max)
max
4

l r
A 9 5 7 2 8 1 3 4

1 2 3 4 5 6 7 8

147
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
max=l
9
else max=i
if r<=A.HeapSize & A[r]>A[max]
5 7 max=r

i if max≠i

2 8 1 3 A[i]↔A[max]
MaxHeapify(A[],max)
max
4

l r
A 9 5 7 2 8 1 3 4

1 2 3 4 5 6 7 8

148
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
MAXHEAPIFY Function if l<=A.HeapSize & A[l]>A[i]
max=l
9
else max=i
if r<=A.HeapSize & A[r]>A[max]
5 7 max=r

i if max≠i

2 8 1 3 A[i]↔A[max]
MaxHeapify(A[],max)
max
4

l r
A 9 5 7 2 8 1 3 4

1 2 3 4 5 6 7 8

149
BUILDMAXHEAP Function
■ If we call MaxHeapify() for each node, we can build our Max Heap
from ordinary array.
■ But there is no need to call it on the leaf nodes, as each leaf node
already satisfies the property of Max Heap.
■ We also know that,
– The leaf nodes of a CBT/ACBT occupy positions ⌊n/2⌋+1 to n.
■ So, we just need to call MaxHeapify() for all the non-leaf nodes,
i.e. from 1 to ⌊n/2⌋.
■ And we will call it from ⌊n/2⌋ down to 1, so that as we move upward,
all the lower nodes already satisfy the Max Heap property (a code sketch follows this slide).
150
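For reference, here is a minimal Python sketch of MaxHeapify() and BuildMaxHeap() as described above (an illustration, not part of the original slides); index 0 of the list is left unused so that the code follows the 1-based indexing of the pseudocode.

def max_heapify(a, i, heap_size):
    # a[1..heap_size] is the heap; a[0] is a dummy so indices match the 1-based slides
    l, r = 2 * i, 2 * i + 1
    largest = l if l <= heap_size and a[l] > a[i] else i
    if r <= heap_size and a[r] > a[largest]:
        largest = r
    if largest != i:
        a[i], a[largest] = a[largest], a[i]   # swap parent with its larger child
        max_heapify(a, largest, heap_size)    # fix the affected subtree

def build_max_heap(a, n):
    # call MaxHeapify on every non-leaf node, from n//2 down to 1
    for i in range(n // 2, 0, -1):
        max_heapify(a, i, n)

A = [None, 1, 5, 9, 2, 8, 7, 3, 4]   # the slide's array, with a dummy at index 0
build_max_heap(A, 8)
print(A[1:])                          # [9, 8, 7, 4, 5, 1, 3, 2], as in the walkthrough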
BUILDMAXHEAP Function
Algorithm BuildMaxHeap(A[])
for i=n/2 to 1 9
MaxHeapify(A[],i)
5 7

2 8 1 3

151
Algorithm BuildMaxHeap(A[])
for i=n/2 to 1
BUILDMAXHEAP Function
MaxHeapify(A[],i)
1

5 9

2 8 7 3

A 1 5 9 2 8 7 3 4

1 2 3 4 5 6 7 8

152
Algorithm BuildMaxHeap(A[])
for i=n/2 to 1
BUILDMAXHEAP Function
MaxHeapify(A[],i)
1

5 9

i
2 8 7 3

A 1 5 9 2 8 7 3 4

1 2 3 4 5 6 7 8
i
153
Algorithm BuildMaxHeap(A[])
for i=n/2 to 1
BUILDMAXHEAP Function
MaxHeapify(A[],i)
1

5 9

i
4 8 7 3

A 1 5 9 4 8 7 3 2

1 2 3 4 5 6 7 8
i
154
Algorithm BuildMaxHeap(A[])
for i=n/2 to 1
BUILDMAXHEAP Function
MaxHeapify(A[],i)
1

i
5 9

4 8 7 3

A 1 5 9 4 8 7 3 2

1 2 3 4 5 6 7 8
i
155
Algorithm BuildMaxHeap(A[])
for i=n/2 to 1
BUILDMAXHEAP Function
MaxHeapify(A[],i)
1

i
5 9

4 8 7 3

A 1 5 9 4 8 7 3 2

1 2 3 4 5 6 7 8
i
156
Algorithm BuildMaxHeap(A[])
for i=n/2 to 1
BUILDMAXHEAP Function
MaxHeapify(A[],i)
1

i
5 9

4 8 7 3

A 1 5 9 4 8 7 3 2

1 2 3 4 5 6 7 8
i
157
Algorithm BuildMaxHeap(A[])
for i=n/2 to 1
BUILDMAXHEAP Function
MaxHeapify(A[],i)
1

i
8 9

4 5 7 3

A 1 8 9 4 5 7 3 2

1 2 3 4 5 6 7 8
i
158
Algorithm BuildMaxHeap(A[])
for i=n/2 to 1
BUILDMAXHEAP Function
MaxHeapify(A[],i)
1
i

8 9

4 5 7 3

A 1 8 9 4 5 7 3 2

1 2 3 4 5 6 7 8
i
159
Algorithm BuildMaxHeap(A[])
for i=n/2 to 1
BUILDMAXHEAP Function
MaxHeapify(A[],i)
9
i

8 1

4 5 7 3

A 9 8 1 4 5 7 3 2

1 2 3 4 5 6 7 8
i
160
Algorithm BuildMaxHeap(A[])
for i=n/2 to 1
BUILDMAXHEAP Function
MaxHeapify(A[],i)
9
i

8 1

4 5 7 3

A 9 8 1 4 5 7 3 2

1 2 3 4 5 6 7 8
i
161
Algorithm BuildMaxHeap(A[])
for i=n/2 to 1
BUILDMAXHEAP Function
MaxHeapify(A[],i)
9
i

8 7

4 5 1 3

A 9 8 7 4 5 1 3 2

1 2 3 4 5 6 7 8
i
162
EXTRACTMAXFROMHEAP Function
Algorithm ExtractMaxFromHeap(A[])
max=A[1] 9
A[1]=A[A.HeapSize--]
MaxHeapify(A[],1) 8 7

return max
4 5 1 3

163
EXTRACTMAXFROMHEAP Function
Algorithm ExtractMaxFromHeap(A[])
9 max=A[1]
A[1]=A[A.HeapSize - -]

8 7 MaxHeapify(A[],1)
return max

4 5 1 3

A 9 8 7 4 5 1 3 2

1 2 3 4 5 6 7 8

164
EXTRACTMAXFROMHEAP Function
Algorithm ExtractMaxFromHeap(A[])
9 max=A[1]
A[1]=A[A.HeapSize - -]

8 7 MaxHeapify(A[],1)
return max

4 5 1 3
max 9 A.HeapSize = 8

A 9 8 7 4 5 1 3 2

1 2 3 4 5 6 7 8

165
EXTRACTMAXFROMHEAP Function
Algorithm ExtractMaxFromHeap(A[])
2 max=A[1]
A[1]=A[A.HeapSize - -]

8 7 MaxHeapify(A[],1)
return max

4 5 1 3
max 9 A.HeapSize = 7

A 2 8 7 4 5 1 3 2

1 2 3 4 5 6 7 8

166
EXTRACTMAXFROMHEAP Function
Algorithm ExtractMaxFromHeap(A[])
2 max=A[1]
A[1]=A[A.HeapSize - -]

8 7 MaxHeapify(A[],1)
return max

4 5 1 3
max 9 A.HeapSize = 7

A 2 8 7 4 5 1 3

1 2 3 4 5 6 7

167
EXTRACTMAXFROMHEAP Function
Algorithm ExtractMaxFromHeap(A[])
2 max=A[1]
A[1]=A[A.HeapSize - -]

8 7 MaxHeapify(A[],1)
return max

4 5 1 3
max 9 A.HeapSize = 7

A 2 8 7 4 5 1 3

1 2 3 4 5 6 7

168
EXTRACTMAXFROMHEAP Function
Algorithm ExtractMaxFromHeap(A[])
2 max=A[1]
A[1]=A[A.HeapSize - -]

8 7 MaxHeapify(A[],1)
return max

4 5 1 3
max 9 A.HeapSize = 7

A 2 8 7 4 5 1 3

1 2 3 4 5 6 7

169
EXTRACTMAXFROMHEAP Function
Algorithm ExtractMaxFromHeap(A[])
8 max=A[1]
A[1]=A[A.HeapSize - -]

2 7 MaxHeapify(A[],1)
return max

4 5 1 3
max 9 A.HeapSize = 7

A 8 2 7 4 5 1 3

1 2 3 4 5 6 7

170
EXTRACTMAXFROMHEAP Function
Algorithm ExtractMaxFromHeap(A[])
8 max=A[1]
A[1]=A[A.HeapSize - -]

2 7 MaxHeapify(A[],1)
return max

4 5 1 3
max 9 A.HeapSize = 7

A 8 2 7 4 5 1 3

1 2 3 4 5 6 7

171
EXTRACTMAXFROMHEAP Function
Algorithm ExtractMaxFromHeap(A[])
8 max=A[1]
A[1]=A[A.HeapSize - -]

5 7 MaxHeapify(A[],1)
return max

4 2 1 3
max 9 A.HeapSize = 7

A 8 5 7 4 2 1 3

1 2 3 4 5 6 7

172
EXTRACTMAXFROMHEAP Function
Algorithm ExtractMaxFromHeap(A[])
8 max=A[1]
A[1]=A[A.HeapSize - -]

5 7 MaxHeapify(A[],1)
return max

4 2 1 3
max 9 A.HeapSize = 7

A 8 5 7 4 2 1 3

1 2 3 4 5 6 7

173
EXTRACTMAXFROMHEAP Function
Algorithm ExtractMaxFromHeap(A[])
8 max=A[1]
A[1]=A[A.HeapSize - -]

5 7 MaxHeapify(A[],1)
return max

4 2 1 3
max 9 A.HeapSize = 7

A 8 5 7 4 2 1 3

1 2 3 4 5 6 7

174
EXTRACTMAXFROMHEAP Function
Algorithm ExtractMaxFromHeap(A[])
8 max=A[1]
A[1]=A[A.HeapSize - -]

5 7 MaxHeapify(A[],1)
return max

4 2 1 3
max 9 A.HeapSize = 7

A 8 5 7 4 2 1 3

1 2 3 4 5 6 7

175
HeapSort
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
New Array Temp[1…A.Size]
for i=A.Size to 1
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]

176
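A runnable sketch of ExtractMaxFromHeap() and the Temp-array HeapSort() above, reusing the max_heapify/build_max_heap helpers sketched earlier. Since a plain Python list has no HeapSize attribute, this sketch passes and returns the heap size explicitly (an assumption of the sketch, not of the slides).

def extract_max_from_heap(a, heap_size):
    # assumes a[1..heap_size] is a max heap; returns (max value, new heap size)
    maximum = a[1]
    a[1] = a[heap_size]            # move the last heap element to the root
    heap_size -= 1                 # shrink the heap (A.HeapSize-- in the pseudocode)
    max_heapify(a, 1, heap_size)   # restore the max-heap property
    return maximum, heap_size

def heap_sort(a, n):
    build_max_heap(a, n)
    temp = [None] * (n + 1)        # Temp[1..n], filled from the right
    heap_size = n
    for i in range(n, 0, -1):
        temp[i], heap_size = extract_max_from_heap(a, heap_size)
    a[1:n + 1] = temp[1:n + 1]     # copy Temp[] back to A[]

A = [None, 1, 5, 9, 2, 8, 7, 3, 4]
heap_sort(A, 8)
print(A[1:])                        # [1, 2, 3, 4, 5, 7, 8, 9]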
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
1
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]
5 9

2 8 7 3

A 1 5 9 2 8 7 3 4

1 2 3 4 5 6 7 8

177
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
9
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]
8 7

4 5 1 3

A 9 8 7 4 5 1 3 2

1 2 3 4 5 6 7 8

178
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
9
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]
8 7

4 5 1 3

2 A 9 8 7 4 5 1 3 2

1 2 3 4 5 6 7 8

Temp
179
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
9
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]
8 7

4 5 1 3

2 A 9 8 7 4 5 1 3 2

1 2 3 4 5 6 7 8

Temp
i
180
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
8
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]
5 7

4 2 1 3

A 8 5 7 4 2 1 3 2

1 2 3 4 5 6 7 8

Temp 9
i
181
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
8
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]
5 7

4 2 1 3

A 8 5 7 4 2 1 3 2

1 2 3 4 5 6 7 8

Temp 9
i
182
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
7
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]
5 3

4 2 1

A 7 5 3 4 2 1 3 2

1 2 3 4 5 6 7 8

Temp 8 9
i
183
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
7
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]
5 3

4 2 1

A 7 5 3 4 2 1 3 2

1 2 3 4 5 6 7 8

Temp 8 9
i
184
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
5
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]
4 3

1 2

A 5 4 3 1 2 1 3 2

1 2 3 4 5 6 7 8

Temp 7 8 9
i
185
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
5
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]
4 3

1 2

A 5 4 3 1 2 1 3 2

1 2 3 4 5 6 7 8

Temp 7 8 9
i
186
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
4
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]
2 3

A 4 2 3 1 2 1 3 2

1 2 3 4 5 6 7 8

Temp 5 7 8 9
i
187
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
4
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]
2 3

A 4 2 3 1 2 1 3 2

1 2 3 4 5 6 7 8

Temp 5 7 8 9
i
188
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
3
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]
2 1

A 3 2 1 1 2 1 3 2

1 2 3 4 5 6 7 8

Temp 4 5 7 8 9
i
189
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
3
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]
2 1

A 3 2 1 1 2 1 3 2

1 2 3 4 5 6 7 8

Temp 4 5 7 8 9
i
190
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
2
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]
1

A 2 1 1 1 2 1 3 2

1 2 3 4 5 6 7 8

Temp 3 4 5 7 8 9
i
191
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
2
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]
1

A 2 1 1 1 2 1 3 2

1 2 3 4 5 6 7 8

Temp 3 4 5 7 8 9
i
192
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
1
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]

A 1 1 1 1 2 1 3 2

1 2 3 4 5 6 7 8

Temp 2 3 4 5 7 8 9
i
193
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
1
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]

A 1 1 1 1 2 1 3 2

1 2 3 4 5 6 7 8

Temp 2 3 4 5 7 8 9
i
194
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]

A 1 1 1 1 2 1 3 2

1 2 3 4 5 6 7 8

Temp 1 2 3 4 5 7 8 9
i
195
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]

A 1 1 1 1 2 1 3 2

1 2 3 4 5 6 7 8

Temp 1 2 3 4 5 7 8 9
i
196
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
HeapSort New Array Temp[1…A.Size]
for i=A.Size to 1
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]

A 1 2 3 4 5 7 8 9

1 2 3 4 5 6 7 8

Temp 1 2 3 4 5 7 8 9
i
197
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
Analysis of Heap Sort New Array Temp[1…A.Size]
for i=A.Size to 1
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]

S(n) = SBuildMaxHeap + n + 1 + SExtractMaxFromHeap

T(n) = TBuildMaxHeap + n(c+TExtractMaxFromHeap) + O(n)

198
Algorithm ExtractMaxFromHeap(A[])
max=A[1]
Analysis of A[1]=A[A.HeapSize - -]

ExtractMaxFromHeap() MaxHeapify(A[],1)
return max

SExtractMaxFromHeap(n) = c + SMaxHeapify

TExtractMaxFromHeap(n) = c + TMaxHeapify

199
Algorithm BuildMaxHeap(A[])
for i=n/2 to 1
Analysis of MaxHeapify(A[],i)
BuildMaxHeap()

SBuildMaxHeap(n) = c + SMaxHeapify

TBuildMaxHeap(n) = c + (TMaxHeapify)*(n/2)

200
Some Properties of Heap
■ As heap is a complete or almost complete BT, here are some
properties for your convenience
– Max number of nodes in a CBT of height h: n = 2^(h+1) - 1
– Height h of a CBT/ACBT with n nodes: h = ⌊log2 n⌋
– Leaf nodes in a CBT/ACBT occupy positions ⌊n/2⌋+1 to n
– Max. no. of nodes at height h in a CBT/ACBT: nh = ⌈n / 2^(h+1)⌉

201
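A quick numeric check of these identities for the 8-node example heap (a sketch; the floor/ceiling placement follows the standard statements of these facts).

import math

n = 8                                   # number of nodes in the example heap
h = math.floor(math.log2(n))            # height of a CBT/ACBT with n nodes -> 3
print(h)
print(2 ** (h + 1) - 1)                 # max nodes in a CBT of height 3 -> 15
print(list(range(n // 2 + 1, n + 1)))   # leaf positions -> [5, 6, 7, 8]
print(math.ceil(n / 2 ** (0 + 1)))      # max nodes at height 0 (the leaves) -> 4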
Algorithm MaxHeapify(A[],i)
l=2*i, r=2*i+1
Analysis of if l<=A.HeapSize & A[l]>A[i]

MaxHeapify() max=l
else max=i
if r<=A.HeapSize & A[r]>A[max]
max=r
SMaxHeapify(n) = c + k.log2n if max≠i

= O(log2n) A[i]↔A[max]
MaxHeapify(A[],max)

TMaxHeapify(n) = c.log2n
= O(log2n)
202
Algorithm BuildMaxHeap(A[])
for i=n/2 to 1
Analysis of MaxHeapify(A[],i)
BuildMaxHeap()

SBuildMaxHeap(n) = c + O(log2n)
= O(log2n)

TBuildMaxHeap(n)* = c + O(log2n)·(n/2)
= O(n.log2n)
*A tighter analysis actually gives TBuildMaxHeap(n) = O(n), but let's use this higher upper bound as it will not affect the final answer. 203
Algorithm ExtractMaxFromHeap(A[])
max=A[1]
Analysis of A[1]=A[A.HeapSize - -]

ExtractMaxFromHeap() MaxHeapify(A[],1)
return max

SExtractMaxFromHeap(n) = c + O(log2n)
= O(log2n)

TExtractMaxFromHeap(n) = c + O(log2n)
= O(log2n)
204
Algorithm HeapSort(A[])
BuildMaxHeap(A[])
Analysis of Heap Sort New Array Temp[1…A.Size]
for i=A.Size to 1
Temp[i]=ExtractMaxFromHeap(A[])
Copy Temp[] to A[]

S(n) = O(log2n) + n + 1 + O(log2n)


= O(n)

T(n) = O(n.log2n) + n(c+O(log2n)) + O(n)


= O(n.log2n)
205
Online Quiz #2
■ Graded Quiz
■ 50 Minutes Quiz
■ Lecture 19 to 33

■ Date: 14 Jan, 2021


■ Time: Within Class hours

206
207
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 34 & 35
BASIC GRAPH ALGORITHMS
& SPANNING TREES
INSTRUCTOR: Sulaman Ahmad Naz
The Graph ADT

■ The Graph Abstract Data Type


– Graph Representations
■ Elementary Graph Operations
– Depth First and Breadth First Search
– Spanning Tree
■ Minimum Cost Spanning Trees
– Kruskal’s, Prim’s and Sollin’s Algorithm

2
The Graph ADT
■ Definitions
– A graph G consists of two sets
■ a finite, nonempty set of vertices V(G)
■ a finite, possible empty set of edges E(G)
– G(V,E) represents a graph
– An undirected graph is one in which the pair of vertices
in an edge is unordered, (v0, v1) = (v1,v0)
– A directed graph is one in which each edge is a
directed pair of vertices, <v0, v1> != <v1, v0>

tail head
3
The Graph ADT
■ Examples for Graph
– complete undirected graph: n(n-1)/2 edges
– complete directed graph: n(n-1) edges
(figures: G1, a complete undirected graph on 4 vertices; G2, an incomplete undirected graph on 7 vertices; G3, a directed graph on 3 vertices)
V(G1)={0,1,2,3} E(G1)={(0,1),(0,2),(0,3),(1,2),(1,3),(2,3)}
V(G2)={0,1,2,3,4,5,6} E(G2)={(0,1),(0,2),(1,3),(1,4),(2,5),(2,6)}
V(G3)={0,1,2} E(G3)={<0,1>,<1,0>,<1,2>}
4
The Graph ADT
■ Restrictions on graphs
– A graph may not have an edge from a vertex, i, back to
itself. Such edges are known as self loops
– A graph may not have multiple occurrences of the
same edge. If we remove this restriction, we obtain a
data structure referred to as a multigraph.

(figures: a graph with self loops; a multigraph)

5
The Graph ADT
■ Adjacent and Incident
■ If (v0, v1) is an edge in an undirected graph,
– v0 and v1 are adjacent
– The edge (v0, v1) is incident on vertices v0 and v1

V0 V1

■ If <v0, v1> is an edge in a directed graph


– v0 is adjacent to v1, and v1 is adjacent from v0
– The edge <v0, v1> is incident from v0 on v1

V0 V1
6
The Graph ADT
■ A subgraph of G is a graph G’ such that V(G’) ⊆ V(G)
and E(G’) ⊆ E(G).
(figure: graphs G1 and G3 with examples of their subgraphs)
7
The Graph ADT
■ Path
– A path from vertex vp to vertex vq in a graph G, is a
sequence of vertices, vp, vi1, vi2, ..., vin, vq, such that (vp,
vi1), (vi1, vi2), ..., (vin, vq) are edges in an undirected graph.
■ A path such as (0, 2), (2, 1), (1, 3) is also written as 0, 2, 1, 3

– The length of a path is the number of edges on it

(figure: three copies of graph G1 with example paths highlighted)
8
The Graph ADT
■ Simple path and cycle
– simple path (simple directed path): a path in which all
vertices, except possibly the first and the last, are
distinct.
– A cycle is a simple path in which the first and last vertices are the same.

9
The Graph ADT
■ Connected graph
– In an undirected graph G, two vertices v0 and v1 are
connected if there is a path in G from v0 to v1.
– An undirected graph is connected if, for every pair of
distinct vertices vi, vj, there is a path from vi to vj.
■ Connected component
– A connected component of an undirected graph is a
maximal connected subgraph.
– A tree is a graph that is connected and acyclic (i.e.,
has no cycle).

10
The Graph ADT
■ Strongly Connected Graph/Component
– A directed graph is strongly connected if there is a
directed path from vi to vj and also from vj to vi
– A strongly connected component is a maximal
subgraph that is strongly connected

(figure: G3 is not strongly connected; its strongly connected components, i.e. maximal strongly connected subgraphs, are {0, 1} and {2})
11
The Graph ADT
■ Degree (undirected graph)
– The degree of a vertex is the number of edges incident to
that vertex.
■ For directed graph
– in-degree (v) : the number of edges that have v as the head
– out-degree (v) : the number of edges that have v as the tail
■ If di is the degree of vertex i in a graph G with n
vertices and e edges, then the number of edges is
e = ( Σ i=0..n-1 di ) / 2
12
The Graph ADT
■ We shall refer to a directed graph as a digraph.
When we use the term graph, we assume that it is
an undirected graph.
(figure: vertex degrees of the undirected graphs G1 and G2, and in-degrees/out-degrees of the digraph G3; e.g. in G3, vertex 0 has in-degree 1 and out-degree 1, vertex 1 has in-degree 1 and out-degree 2, and vertex 2 has in-degree 1 and out-degree 0)
13
The Graph ADT

14
Graph Representations
■ Adjacency Matrix

15
Graph Representations
■ Adjacency Matrix
– Let G = (V,E) be a graph with n vertices.
– The adjacency matrix of G is a two-dimensional
n x n array, say adj_mat
– If the edge (vi, vj) is in E(G), then adj_mat[i][j] = 1; otherwise adj_mat[i][j] = 0
– The adjacency matrix for an undirected graph is
symmetric; the adjacency matrix for a digraph
need not be symmetric
(figures: graph G4 on vertices 0 to 7, and the adjacency matrices of G1, G3 and G4)
16
Graph Representations
■ Merits of Adjacency Matrix
– For an undirected graph, the degree of any
vertex i is its row sum:
degree(vi) = Σ j=0..n-1 adj_mat[i][j]
– For a directed graph, the row sum is the out-
degree, while the column sum is the in-degree:
ind(vi) = Σ j=0..n-1 A[j][i]      outd(vi) = Σ j=0..n-1 A[i][j]
(figures: G1 with its 4 x 4 adjacency matrix and digraph G3 with its 3 x 3 adjacency matrix)
17
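To make the row-sum and column-sum observations concrete, here is a small Python sketch (illustrative, not from the slides) that builds adjacency matrices for G1 and G3 and computes degree, out-degree and in-degree.

def adjacency_matrix(n, edges, directed=False):
    adj = [[0] * n for _ in range(n)]
    for u, v in edges:
        adj[u][v] = 1
        if not directed:
            adj[v][u] = 1          # undirected: the matrix is symmetric
    return adj

# Undirected G1 on vertices {0,1,2,3}: degree(i) is the row sum.
g1 = adjacency_matrix(4, [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)])
print([sum(row) for row in g1])                              # [3, 3, 3, 3]

# Digraph G3: row sum = out-degree, column sum = in-degree.
g3 = adjacency_matrix(3, [(0, 1), (1, 0), (1, 2)], directed=True)
print([sum(g3[i]) for i in range(3)])                        # out-degrees [1, 2, 0]
print([sum(g3[j][i] for j in range(3)) for i in range(3)])   # in-degrees  [1, 1, 1]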
Graph Representations
■ Adjacency Lists

18
Graph Representations
■ Adjacency lists

– There is one list for each vertex in G.


The nodes in list i represent the
vertices that are adjacent from vertex i

– For an undirected graph with n


vertices and e edges, this
representation requires n head nodes
and 2e list nodes

19
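A minimal Python sketch of the adjacency-list representation, using a dict of lists instead of linked head nodes (an idiomatic stand-in for the linked structure shown on the slides).

def adjacency_lists(n, edges, directed=False):
    adj = {v: [] for v in range(n)}       # one list per vertex (the "head nodes")
    for u, v in edges:
        adj[u].append(v)
        if not directed:
            adj[v].append(u)              # undirected: each edge appears in two lists
    return adj

# Undirected graph with n vertices and e edges -> n lists, 2e list entries in total.
g1 = adjacency_lists(4, [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)])
print(g1[0])                                 # vertices adjacent to 0: [1, 2, 3]
print(sum(len(lst) for lst in g1.values()))  # 12 = 2 * 6 edges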
Graph Representations
(figure: adjacency lists for graphs G1, G2 and G3)
20
Graph Representations
■ Interesting Operations
– degree of a vertex in an undirected graph
■ # of nodes in adjacency list
– # of edges in a graph
■ determined in O(n+e)
– out-degree of a vertex in a directed graph
■ # of nodes in its adjacency list
– in-degree of a vertex in a directed graph
■ traverse the whole data structure
(figures: graphs G1 and G3)
Graph Representations
■ Vertices in Any Order
– The order of nodes within each adjacency list is of no significance.
Head nodes and vertex links (complete graph on vertices 0-3):
0 → 3 → 1 → 2 → NULL
1 → 2 → 0 → 3 → NULL
2 → 3 → 0 → 1 → NULL
3 → 2 → 1 → 0 → NULL
22
Graph Representations

23
Graph Representations

■ Weighted edges
– The edges of a graph have weights assigned to them.
– These weights may represent
■ the distance from one vertex to another
■ cost of going from one vertex to an adjacent vertex.
– adjacency matrix: adj_mat[i][j] would keep the weights.
– adjacency lists: add a weight field to the node structure.
– A graph with weighted edges is called a network.

24
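For a network, each adjacency-list node simply carries an extra weight field; a minimal sketch with an illustrative edge set:

def weighted_adjacency_lists(n, weighted_edges, directed=False):
    adj = {v: [] for v in range(n)}
    for u, v, w in weighted_edges:
        adj[u].append((v, w))          # store (neighbour, weight) pairs
        if not directed:
            adj[v].append((u, w))
    return adj

net = weighted_adjacency_lists(3, [(0, 1, 4), (1, 2, 7), (0, 2, 9)])
print(net[0])                          # [(1, 4), (2, 9)]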
Graph Representations

25
Graph Operations

■ Traversal:
Given G=(V,E) and vertex v, find all w ∈ V such that
w is connected to v
– Depth First Search (DFS): preorder traversal
– Breadth First Search (BFS): level order traversal

■ Spanning Trees

26
Depth First Search
Algorithm DepthFirstSearch(Vertex v)
if v is unvisited
push v
print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
pop() & u=peek()
if u≠NULL, DepthFirstSearch(u)
27
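The slide's DepthFirstSearch keeps an explicit stack with push/pop/peek alongside the recursion; the sketch below is the usual purely recursive equivalent, where the call stack plays that role. The adjacency dict is an assumption, chosen so that a traversal from 5 reproduces the output 5 3 8 1 9 2 0 shown in the walkthrough.

def depth_first_search(adj, v, visited=None, order=None):
    # adj: dict mapping vertex -> list of adjacent vertices
    if visited is None:
        visited, order = set(), []
    if v not in visited:
        visited.add(v)                   # mark v as visited
        order.append(v)                  # "print v"
        for u in adj[v]:                 # next unvisited adjacent node of v
            if u not in visited:
                depth_first_search(adj, u, visited, order)
    return order

adj = {5: [3, 1, 9], 3: [5, 8], 1: [5, 8, 9], 9: [5, 1, 2, 0],
       8: [3, 1], 2: [9], 0: [9]}        # illustrative graph
print(depth_first_search(adj, 5))        # [5, 3, 8, 1, 9, 2, 0]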
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
v 5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

OUTPUT:
28
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
v 5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5
29
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
v 5 print v
mark v as visited
u=next unvisited adjacent node of v
u 3 1 9
if u=NULL
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5
30
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
v 5 print v
mark v as visited
u=next unvisited adjacent node of v
u 3 1 9
if u=NULL
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5
31
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited

v u=next unvisited adjacent node of v


u 3 1 9
if u=NULL
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5
32
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited

v u=next unvisited adjacent node of v


u 3 1 9
if u=NULL
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0
3

5
OUTPUT: 5 3
33
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited

v u=next unvisited adjacent node of v


if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

u
8 2 0
3

5
OUTPUT: 5 3
34
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited

v u=next unvisited adjacent node of v


if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

u
8 2 0
3

5
OUTPUT: 5 3
35
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

u v
8 2 0
3

5
OUTPUT: 5 3
36
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 u v
8 2 0
3

5
OUTPUT: 5 3 8
37
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 v
8 2 0
3

u=NULL OUTPUT: 5 3 8
38
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited

u u=next unvisited adjacent node of v


if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

v
8 2 0
3

5
OUTPUT: 5 3 8
39
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited

u u=next unvisited adjacent node of v


if u=NULL
3 1 9
pop() & u=peak()
v if u≠NULL, DepthFirstSearch(u)

8 2 0
3

5
OUTPUT: 5 3 8
40
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited

u u=next unvisited adjacent node of v


if u=NULL
3 1 9
pop() & u=peak()
v if u≠NULL, DepthFirstSearch(u)

8 2 0
3

5
OUTPUT: 5 3 8
41
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
v if u≠NULL, DepthFirstSearch(u)

8 2 0
3

5
u=NULL OUTPUT: 5 3 8
42
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 u print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
v if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5 3 8
43
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited

v push v
5 u print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5 3 8
44
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited

v push v
5 u print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5 3 8
45
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited

v push v
print v
5
mark v as visited
u=next unvisited adjacent node of v
u if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5 3 8
46
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited

v push v
print v
5
mark v as visited
u=next unvisited adjacent node of v
u if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5 3 8
47
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
v u if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5 3 8
48
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
v u if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0
1

5
OUTPUT: 5 3 8 1
49
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
v if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0
1

5
u=NULL OUTPUT: 5 3 8 1
50
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5
u print v
mark v as visited
u=next unvisited adjacent node of v
v if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5 3 8 1
51
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited

v u
push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5 3 8 1
52
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited

v u
push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5 3 8 1
53
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited

v push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
u if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5 3 8 1
54
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited

v push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
u if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5 3 8 1
55
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
u if u=NULL
3 1 v 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5 3 8 1
56
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
u if u=NULL
3 1 v 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0
9

5
OUTPUT: 5 3 8 1 9
57
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 v 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

9
8 2
u 0

5
OUTPUT: 5 3 8 1 9
58
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 v 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

9
8 2
u 0

5
OUTPUT: 5 3 8 1 9
59
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

9
8
v 2
u 0

5
OUTPUT: 5 3 8 1 9
60
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

9
8
v 2
u 0

5
OUTPUT: 5 3 8 1 9 2
61
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

9
8
v 2 0

5
u=NULL OUTPUT: 5 3 8 1 9 2
62
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
u if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

9
8
v 2 0

5
OUTPUT: 5 3 8 1 9 2
63
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
v u if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0
9

5
OUTPUT: 5 3 8 1 9 2
64
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
v u if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0
9

5
OUTPUT: 5 3 8 1 9 2
65
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
v if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

9
8 2 0 u
5
OUTPUT: 5 3 8 1 9 2
66
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
v if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

9
8 2 0 u
5
OUTPUT: 5 3 8 1 9 2
67
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

9
8 2 v 0 u
5
OUTPUT: 5 3 8 1 9 2
68
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

9
8 2 v 0 u
5
OUTPUT: 5 3 8 1 9 2 0
69
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

9
8 2 v 0

5
u=NULL OUTPUT: 5 3 8 1 9 2 0
70
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited

u u=next unvisited adjacent node of v


if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

9
8 2 v 0

5
OUTPUT: 5 3 8 1 9 2 0
71
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited

v u u=next unvisited adjacent node of v


if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0
9

5
OUTPUT: 5 3 8 1 9 2 0
72
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited

v u u=next unvisited adjacent node of v


if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0
9

5
OUTPUT: 5 3 8 1 9 2 0
73
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited


push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
v if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0
9

5
u=NULL OUTPUT: 5 3 8 1 9 2 0
74
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited

u push v
print v
5
mark v as visited
u=next unvisited adjacent node of v
v if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5 3 8 1 9 2 0
75
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited

v u push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5 3 8 1 9 2 0
76
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited

v u push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
OUTPUT: 5 3 8 1 9 2 0
77
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited

v push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

5
u=NULL OUTPUT: 5 3 8 1 9 2 0
78
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited

v push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

u=NULL OUTPUT: 5 3 8 1 9 2 0
79
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited

v push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

u=NULL OUTPUT: 5 3 8 1 9 2 0
80
Algorithm DepthFirstSearch(Vertex v)

Depth First Search if v is unvisited

v push v
5 print v
mark v as visited
u=next unvisited adjacent node of v
if u=NULL
3 1 9
pop() & u=peak()
if u≠NULL, DepthFirstSearch(u)

8 2 0

u=NULL OUTPUT: 5 3 8 1 9 2 0
81
Breadth First Search
Algorithm BreadthFirstSearch(Vertex v)
if v is unvisited
print v
mark v as visited
enqueue(all unvisited adjacent nodes of v)
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

82
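An iterative Python sketch of the same breadth-first traversal using collections.deque as the queue; vertices are marked when they are enqueued so that nothing enters the queue twice. The adjacency dict is the same illustrative one used for the DFS sketch, and a traversal from 5 gives 5 3 1 9 8 2 0 as in the walkthrough.

from collections import deque

def breadth_first_search(adj, start):
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        v = queue.popleft()              # u = dequeue()
        order.append(v)                  # "print v"
        for u in adj[v]:                 # enqueue all unvisited adjacent nodes of v
            if u not in visited:
                visited.add(u)
                queue.append(u)
    return order

adj = {5: [3, 1, 9], 3: [5, 8], 1: [5, 8, 9], 9: [5, 1, 2, 0],
       8: [3, 1], 2: [9], 0: [9]}        # illustrative graph
print(breadth_first_search(adj, 5))      # [5, 3, 1, 9, 8, 2, 0]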
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited

v 5
enqueue(all unvisited adjacent nodes of v)

u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9
FRONT

QUEUE:
8 2 0

OUTPUT: 5
83
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited

v 5
enqueue(all unvisited adjacent nodes of v)

u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9
FRONT

QUEUE:
8 2 0

OUTPUT: 5
84
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited

v 5
enqueue(all unvisited adjacent nodes of v)

u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9
FRONT

QUEUE: 3 1 9
8 2 0

OUTPUT: 5
85
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited

v 5
enqueue(all unvisited adjacent nodes of v)

u=dequeue()
if u≠NULL, BreadthFirstSearch(u)
u
3 1 9
FRONT

QUEUE: 1 9
8 2 0

OUTPUT: 5
86
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)
u
3 1 9
v FRONT

QUEUE: 1 9
8 2 0

OUTPUT: 5
87
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)
u
3 1 9
v FRONT

QUEUE: 1 9
8 2 0

OUTPUT: 5 3
88
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)
u
3 1 9
v FRONT

QUEUE: 1 9 8
8 2 0

OUTPUT: 5 3
89
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)
u
3 1 9
v FRONT

QUEUE: 9 8
8 2 0

OUTPUT: 5 3
90
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)
u
3 1 9
v FRONT

QUEUE: 9 8
8 2 0

OUTPUT: 5 3
91
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)
u
3 1 9
v FRONT

QUEUE: 9 8
8 2 0

OUTPUT: 5 3 1
92
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)
u
3 1 9
v FRONT

QUEUE: 9 8
8 2 0

OUTPUT: 5 3 1
93
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)
u
3 1 9
v FRONT

QUEUE: 8
8 2 0

OUTPUT: 5 3 1
94
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)
u
3 1 9 v
FRONT

QUEUE: 8
8 2 0

OUTPUT: 5 3 1
95
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)
u
3 1 9 v
FRONT

QUEUE: 8
8 2 0

OUTPUT: 5 3 1 9
96
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)
u
3 1 9 v
FRONT

QUEUE: 8 2 0
8 2 0

OUTPUT: 5 3 1 9
97
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9 v FRONT

QUEUE: 2 0
u
8 2 0

OUTPUT: 5 3 1 9
98
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9
FRONT

QUEUE: 2 0
u v
8 2 0

OUTPUT: 5 3 1 9
99
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9
FRONT

QUEUE: 2 0
u v
8 2 0

OUTPUT: 5 3 1 9 8
100
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9
FRONT

QUEUE: 2 0
u v
8 2 0

OUTPUT: 5 3 1 9 8
101
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9
FRONT

QUEUE: 0
v u
8 2 0

OUTPUT: 5 3 1 9 8
102
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9
FRONT

QUEUE: 0
u v
8 2 0

OUTPUT: 5 3 1 9 8
103
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9
FRONT

QUEUE: 0
u v
8 2 0

OUTPUT: 5 3 1 9 8 2
104
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9
FRONT

QUEUE: 0
u v
8 2 0

OUTPUT: 5 3 1 9 8 2
105
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9
FRONT

QUEUE:
v u
8 2 0

OUTPUT: 5 3 1 9 8 2
106
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9
FRONT

QUEUE:
u v
8 2 0

OUTPUT: 5 3 1 9 8 2
107
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9
FRONT

QUEUE:
u v
8 2 0

OUTPUT: 5 3 1 9 8 2 0
108
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9
FRONT

QUEUE:
u v
8 2 0

OUTPUT: 5 3 1 9 8 2 0
109
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9
FRONT

QUEUE:
v
8 2 0

u=NULL OUTPUT: 5 3 1 9 8 2 0
110
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9
FRONT

QUEUE:
v
8 2 0

u=NULL OUTPUT: 5 3 1 9 8 2 0
111
Algorithm BreadthFirstSearch(Vertex v)

Breadth First Search print v


mark v as visited
enqueue(all unvisited adjacent nodes of v)
5
u=dequeue()
if u≠NULL, BreadthFirstSearch(u)

3 1 9
FRONT

QUEUE:
v
8 2 0

u=NULL OUTPUT: 5 3 1 9 8 2 0
112
Graph Operations
■ Spanning trees
– Definition: A tree T is said to be a spanning tree of a
connected graph G if T is a subgraph of G and T
contains all vertices of G.
– E(G): T (tree edges) + N (nontree edges)
■ T: set of edges used during search
■ N: set of remaining edges

113
Graph Operations
■ When graph G is connected, a depth first or breadth first
search starting at any vertex will visit all vertices in G

■ We may use DFS or BFS to create a spanning tree

– Depth first spanning tree when DFS is used

– Breadth first spanning tree when BFS is used

114
Breadth First Spanning Tree
Algorithm BreadthFirstSpanningTree(Vertex v)
Graph ST.AddVertices(V)
BreadthFirstTraversal(v)
Algorithm BreadthFirstTraversal(Vertex v)
if v is unvisited
mark v as visited and added
enqueue(all unvisited unadded adjacent nodes of v)
ST.AddEdges(v,all unvisited unadded adjacent nodes of v)
u=dequeue()
if u≠NULL, BreadthFirstTraversal(u) 115
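A sketch of the breadth-first spanning tree construction above: ST keeps every vertex of G, and the edge (v, u) is recorded whenever u is first discovered from v. The helper names and the adjacency dict are illustrative.

from collections import deque

def breadth_first_spanning_tree(adj, start):
    tree_vertices = set(adj)             # ST.AddVertices(V): ST gets all vertices of G
    tree_edges = []
    visited = {start}
    queue = deque([start])
    while queue:
        v = queue.popleft()
        for u in adj[v]:
            if u not in visited:         # u is unvisited and not yet added
                visited.add(u)
                tree_edges.append((v, u))  # ST.AddEdge(v, u)
                queue.append(u)
    return tree_vertices, tree_edges

adj = {5: [3, 1, 9], 3: [5, 8], 1: [5, 8, 9], 9: [5, 1, 2, 0],
       8: [3, 1], 2: [9], 0: [9]}
print(breadth_first_spanning_tree(adj, 5)[1])
# [(5, 3), (5, 1), (5, 9), (3, 8), (9, 2), (9, 0)]  -- the n - 1 = 6 tree edges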
Breadth First
Spanning Tree

G
5

3 1 9

8 2 0
116
Breadth First
Spanning Tree
5 *
G 3 *
5 ST
5
1 *
9 *
8 *
3 1 9
3 1 9
2 *
0 *
ST
8 2 0 8 2 0
117
Breadth First
Spanning Tree
5 *
V
G 3 *
5 ST
5
1 *
9 *
8 *
3 1 9
3 1 9
2 *
0 *
ST
8 2 0 8 2 0
118
Breadth First Spanning Tree
FRONT

QUEUE:

V ST
G 5 ST
5
5 *
3 *
1 *
3 1 9 3 1 9
9 *
8 *
2 *
8 2 0
8 2 0
0 * 119
Breadth First Spanning Tree
FRONT

QUEUE:

V
G ST ST
5 5
5 *
3 *
1 *
3 1 9 3 1 9
9 *
8 *
2 *
8 2 0
8 2 0
0 * 120
Breadth First Spanning Tree
FRONT

QUEUE: 3 1 9

V
G ST ST
5 5
5 *
3 *
1 *
3 1 9 3 1 9
9 *
8 *
2 *
8 2 0
8 2 0
0 * 121
Breadth First Spanning Tree
FRONT

QUEUE: 3 1 9

V
G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 Ø
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 *
2 *
8 2 0
8 2 0
0 * 122
Breadth First Spanning Tree
FRONT

QUEUE: 1 9

V
G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 Ø
U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 *
2 *
8 2 0
8 2 0
0 * 123
Breadth First Spanning Tree
FRONT

QUEUE: 1 9

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 Ø
V U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 *
2 *
8 2 0
8 2 0
0 * 124
Breadth First Spanning Tree
FRONT

QUEUE: 1 9

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 Ø
V U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 *
2 *
8 2 0
8 2 0
0 * 125
Breadth First Spanning Tree
FRONT

QUEUE: 1 9

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 Ø
V U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 *
2 *
8 2 0
8 2 0
0 * 126
Breadth First Spanning Tree
FRONT

QUEUE: 1 9 8

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 Ø
V U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 *
2 *
8 2 0
8 2 0
0 * 127
Breadth First Spanning Tree
FRONT

QUEUE: 1 9 8

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
V U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 * 3 Ø
2 *
8 2 0
8 2 0
0 * 128
Breadth First Spanning Tree
FRONT

QUEUE: 9 8

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
V U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 * 3 Ø
2 *
8 2 0
8 2 0
0 * 129
Breadth First Spanning Tree
FRONT

QUEUE: 9 8

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
V U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 * 3 Ø
2 *
8 2 0
8 2 0
0 * 130
Breadth First Spanning Tree
FRONT

QUEUE: 9 8

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
V U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 * 3 Ø
2 *
8 2 0
8 2 0
0 * 131
Breadth First Spanning Tree
FRONT

QUEUE: 9 8

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
V U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 * 3 Ø
2 *
8 2 0
8 2 0
0 * 132
Breadth First Spanning Tree
FRONT

QUEUE: 9 8

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
V U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 * 3 Ø
2 *
8 2 0
8 2 0
0 * 133
Breadth First Spanning Tree
FRONT

QUEUE: 9 8

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
V U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 * 3 Ø
2 *
8 2 0
8 2 0
0 * 134
Breadth First Spanning Tree
FRONT

QUEUE: 8

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
V U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 * 3 Ø
2 *
8 2 0
8 2 0
0 * 135
Breadth First Spanning Tree
FRONT

QUEUE: 8

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
V U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 * 3 Ø
2 *
8 2 0
8 2 0
0 * 136
Breadth First Spanning Tree
FRONT

QUEUE: 8

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
V U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 * 3 Ø
2 *
8 2 0
8 2 0
0 * 137
Breadth First Spanning Tree
FRONT

QUEUE: 8

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
V U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 * 3 Ø
2 *
8 2 0
8 2 0
0 * 138
Breadth First Spanning Tree
FRONT

QUEUE: 8 2 0

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
V U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 Ø
8 * 3 Ø
2 *
8 2 0
8 2 0
0 * 139
Breadth First Spanning Tree
FRONT

QUEUE: 8 2 0

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
V U
1 * 5 Ø
3 1 9 3 1 9
9 * 5 2 0 Ø
8 * 3 Ø
2 * 9 Ø
8 0
8 2 0
0 * 9 Ø 2
140
Breadth First Spanning Tree
FRONT

QUEUE: 2 0

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
V
1 * 5 Ø
3 1 9 3 1 9
9 * 5 2 0 Ø
8 * 3 Ø
2 * 9 Ø
U 8 0
8 2 0
0 * 9 Ø 2
141
Breadth First Spanning Tree
FRONT

QUEUE: 2 0

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
1 * 5 Ø
3 1 9 3 1 9
9 * 5 2 0 Ø
8 * 3 Ø
V Ø
U 2 * 9
8 0
8 2 0
0 * 9 Ø 2
142
Breadth First Spanning Tree
FRONT

QUEUE: 2 0

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
1 * 5 Ø
3 1 9 3 1 9
9 * 5 2 0 Ø
8 * 3 Ø
V Ø
U 2 * 9
8 0
8 2 0
0 * 9 Ø 2
143
Breadth First Spanning Tree
FRONT

QUEUE: 2 0

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
1 * 5 Ø
3 1 9 3 1 9
9 * 5 2 0 Ø
8 * 3 Ø
V Ø
U 2 * 9
8 0
8 2 0
0 * 9 Ø 2
144
Breadth First Spanning Tree
FRONT

QUEUE: 2 0

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
1 * 5 Ø
3 1 9 3 1 9
9 * 5 2 0 Ø
8 * 3 Ø
V Ø
U 2 * 9
8 0
8 2 0
0 * 9 Ø 2
145
Breadth First Spanning Tree
FRONT

QUEUE: 0

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
1 * 5 Ø
3 1 9 3 1 9
9 * 5 2 0 Ø
8 * 3 Ø
V U Ø
2 * 9
8 0
8 2 0
0 * 9 Ø 2
146
Breadth First Spanning Tree
FRONT

QUEUE: 0

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
1 * 5 Ø
3 1 9 3 1 9
9 * 5 2 0 Ø
8 * 3 Ø
U Ø
V 2 * 9
8 0
8 2 0
0 * 9 Ø 2
147
Breadth First Spanning Tree
FRONT

QUEUE: 0

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
1 * 5 Ø
3 1 9 3 1 9
9 * 5 2 0 Ø
8 * 3 Ø
U Ø
V 2 * 9
8 0
8 2 0
0 * 9 Ø 2
148
Breadth First Spanning Tree
FRONT

QUEUE: 0

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
1 * 5 Ø
3 1 9 3 1 9
9 * 5 2 0 Ø
8 * 3 Ø
U Ø
V 2 * 9
8 0
8 2 0
0 * 9 Ø 2
149
Breadth First Spanning Tree
FRONT

QUEUE:

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
1 * 5 Ø
3 1 9 3 1 9
9 * 5 2 0 Ø
8 * 3 Ø
V U 2 * 9 Ø
8 0
8 2 0
0 * 9 Ø 2
150
Breadth First Spanning Tree
FRONT

QUEUE:

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
1 * 5 Ø
3 1 9 3 1 9
9 * 5 2 0 Ø
8 * 3 Ø
U 2 * 9 Ø
V 8 0
8 2 0
0 * 9 Ø 2
151
Breadth First Spanning Tree
FRONT

QUEUE:

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
1 * 5 Ø
3 1 9 3 1 9
9 * 5 2 0 Ø
8 * 3 Ø
U 2 * 9 Ø
V 8 0
8 2 0
0 * 9 Ø 2
152
Breadth First Spanning Tree
FRONT

QUEUE:

G ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
1 * 5 Ø
3 1 9 3 1 9
9 * 5 2 0 Ø
8 * 3 Ø
U 2 * 9 Ø
V 8 0
8 2 0
0 * 9 Ø 2
153
Breadth First Spanning Tree
FRONT

QUEUE:

G U=NULL ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
1 * 5 Ø
3 1 9 3 1 9
9 * 5 2 0 Ø
8 * 3 Ø
2 * 9 Ø
V 8 0
8 2 0
0 * 9 Ø 2
154
Breadth First Spanning Tree
FRONT

QUEUE:

G U=NULL ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
1 * 5 Ø
3 1 9 3 1 9
9 * 5 2 0 Ø
8 * 3 Ø
2 * 9 Ø
V 8 0
8 2 0
0 * 9 Ø 2
155
Breadth First Spanning Tree
FRONT

QUEUE:

G U=NULL ST ST
5 5
5 * 3 1 9 Ø
3 * 5 8 Ø
1 * 5 Ø
3 1 9 3 1 9
9 * 5 2 0 Ø
8 * 3 Ø
2 * 9 Ø
V 8 0
8 2 0
0 * 9 Ø 2
156
Breadth First Spanning Tree
Algorithm BreadthFirstSpanningTree(Vertex v)
Graph ST.AddVertices(V)
BreadthFirstTraversal(v)
return ST
Algorithm BreadthFirstTraversal(Vertex v)
if v is unvisited
mark v as visited and added
enqueue(all unvisited unadded adjacent nodes of v)
ST.AddEdges(v,all unvisited unadded adjacent nodes of v)
u=dequeue()
if u≠NULL, BreadthFirstTraversal(u)
157
158
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 36, 37 & 38


THE GREEDY APPROACH
& INTERVAL SCHEDULING
INSTRUCTOR: Sulaman Ahmad Naz
ONLINE QUIZ #2
■ Conducted on 14TH January, 2021
■ Time Slot: 13:30 to 16:30
■ Duration: 50 minutes

2
Greedy Algorithms

■ Hard to define exactly but can give general properties


• Solution is built in small steps
• Decisions on how to build the solution are made to maximize
some criterion without looking to the future
■ Want the ‘best’ current partial solution as if the current
step were the last step

■ May be more than one greedy algorithm using different


criteria to solve a given problem.

3
Greedy Strategy
Goal: Given currency denominations: 1, 5, 10, 25, 100,
give change to customer using fewest number of coins.

Ex: Rs. 34
– 25, 5, 1,1,1,1

Cashier's algorithm: At each iteration, give the largest


coin valued ≤ the amount to be paid.

Ex: Rs. 30
– 25, 5

4
Greedy is not always Optimal
Observation: Greedy algorithm is sub-optimal for say postal denominations:
1, 10, 21, 34, 70, 100, 350, 1225, 1500.

Counterexample. Rs. 140.


– Greedy: 100, 34, 1, 1, 1, 1, 1, 1.
– Optimal: 70, 70.

Lesson: Greedy is short-sighted. Always chooses the most attractive choice


at the moment. But this may lead to a dead-end later.

5
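A sketch of the cashier's greedy rule on both examples: it is optimal for the denominations 1, 5, 10, 25, 100, but on the postal denominations it returns eight coins for Rs. 140 where two coins of 70 suffice.

def cashiers_change(amount, denominations):
    coins = []
    for d in sorted(denominations, reverse=True):   # largest coin first
        while amount >= d:                          # give coin d while it still fits
            coins.append(d)
            amount -= d
    return coins          # amount reaches 0 whenever 1 is a denomination

print(cashiers_change(34, [1, 5, 10, 25, 100]))     # [25, 5, 1, 1, 1, 1]
print(cashiers_change(30, [1, 5, 10, 25, 100]))     # [25, 5]
print(cashiers_change(140, [1, 10, 21, 34, 70, 100, 350, 1225, 1500]))
# greedy gives [100, 34, 1, 1, 1, 1, 1, 1]; the optimum is [70, 70]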
Interval Scheduling

6
Compatibility
• Two jobs are compatible if they don’t overlap.

7
Greedy Strategy for Interval Scheduling
1. Use a simple rule to select a request i.
2. Reject all requests that are incompatible with i.
3. Repeat until all requests are processed.

■ This is just a template. We can substitute the underlined phrase


with a greedy rule.
■ There are many possible candidates for this greedy rule.

8
Heuristic 1
■ From the given set of intervals, select a request that STARTS
EARLIEST; that is, the one with the least start time (minimum
s(i)).

9
Heuristic 2
■ Accept the request that has the SMALLEST INTERVAL OF TIME,
i.e., the request for which f(i) − s(i) is minimum.

10
Heuristic 3
■ For each request, we count the number of other requests that
are not compatible, and accept the request that has the fewest
number of non-compatible requests.
■ In other words, we select the interval with the FEWEST
’CONFLICTS’.

11
Heuristic 4
■ Consider the requests in their NUMERIC ORDER.

12
Heuristic 5
■ Accept first the request that finishes first, that is, the request i
for which f(i) is as small as possible.
■ Consider the request with EARLIEST FINISH TIME.

Not finding a counterexample for a heuristic does not prove that the heuristic is correct.
13
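A sketch of Heuristic 5: sort the requests by finish time and greedily accept every request that is compatible with the last accepted one. The (start, finish) pairs in the example are illustrative.

def interval_scheduling(requests):
    # requests: list of (start, finish) pairs
    accepted = []
    last_finish = float("-inf")
    for s, f in sorted(requests, key=lambda r: r[1]):   # earliest finish time first
        if s >= last_finish:          # compatible with everything accepted so far
            accepted.append((s, f))
            last_finish = f
    return accepted

jobs = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
print(interval_scheduling(jobs))      # [(1, 4), (5, 7), (8, 11)]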
Is Heuristic 5 Correct i.e. OPTIMAL?
■ In general, to prove that a greedy algorithm always yields an
optimum solution, we compare the solution output by algorithm
and any optimum solution.
■ Let ALG = { i1, i2, … , ik } be the set of requests output by our
algorithm in order.
■ Let OPT = { j1, j2, … , jm } be an ordered OPTIMAL solution in which
all the request are compatible.
■ Our goal is to prove that k = m. i.e. we need to prove that
|ALG| = |OPT|
14
Are requests in ALG COMPATIBLE?
■ Template of ALG:

■ The intervals in the set ALG returned by the algorithm are


compatible. Thus, heuristic 5 guarantees that for all indices r ≤ k:
f(ir) ≤ f(ik)

1 2 3 r … k

15
ALG = { i1, i2, … , ik }
OPT = { j1, j2, … , jm }
Lemma: f(ir) ≤ f(jr), 1 ≤ r ≤ k
■ For each r ≥ 1, the r th accepted request in the algorithm’s
schedule finishes no later than the rth request in the optimal
schedule.

P(r): f(ir) ≤ f(jr), 1 ≤ r ≤ k

i1 i2 i3 i4 … ik

j1 j2 j3 j4 … jm

16
ALG = { i1, i2, … , ik }
OPT = { j1, j2, … , jm }
Proof of Lemma

P(r): f(ir) ≤ f(jr), 1 ≤ r ≤ k i1


■ We shall prove this by mathematical induction on r.
j1

BASE CASE:
P(1): f(i1) ≤ f(j1) is TRUE.
For r = 1, the algorithm starts by selecting the request i1 with the
minimum finish time among all requests. In particular, whichever request
OPT selects as j1, we have f(i1) ≤ f(j1). Therefore, f(ir) ≤ f(jr) when r = 1. 17
ALG = { i1, i2, … , ik }
OPT = { j1, j2, … , jm }
Proof of Lemma
HYPOTHESIS:
We assume that P(r-1): f(ir-1) ≤ f(jr-1) is TRUE for r>1.

INDUCTION STEP:
P(r-1)P(r)
or f(ir-1) ≤ f(jr-1)  f(ir) ≤ f(jr)

ir-1 ir

jr-1 jr
18
ALG = { i1, i2, … , ik }
OPT = { j1, j2, … , jm }
Proof of Lemma
ir-1 ir
P(r-1)P(r)
or f(ir-1) ≤ f(jr-1)  f(ir) ≤ f(jr) jr-1 jr

Now, suppose f(ir)>f(jr) is TRUE, as shown in the diagram below.


We will prove that our supposition is not possible.
f(ir-1) ≤ f(jr-1) By Induction Hypothesis ir-1 ir

f(jr-1) ≤ s(jr) OPT contains compatible requests


jr-1 jr
 f(ir-1) ≤ s(jr) By Hypothetical Syllogism

Thus if f(jr) < f(ir), then jr is compatible with i1, ..., ir-1 and finishes
earlier, so according to the heuristic the algorithm would have chosen jr rather
than ir as its rth request, i.e. jr ∈ ALG and ir ∉ ALG at position r. The latter
is contrary to what is GIVEN and hence not possible.
19
ALG = { i1, i2, … , ik }
OPT = { j1, j2, … , jm }
Proof of Lemma
So, by the trichotomy property, if f(ir) > f(jr) is FALSE, then f(ir) ≤ f(jr)
must be TRUE.

Therefore, for each r, the r th interval the ALG selects, finishes no


later than the r th interval in OPT.

ir-1 ir

jr-1 jr
PROOF: Heuristic 5 is Correct i.e. Optimal
ALG = { i1, i2, … , ik } ik-1 ik

OPT = { j1, j2, … , jm } jk-1 jm

Statement: The greedy algorithm ALG is optimal. i.e., k = m.


Proof by contradiction: If ALG is not optimal, then k ≠ m. Since
k > m is not possible (|ALG| cannot be greater than |OPT|), by the
trichotomy property m > k.
This means that there is a request jk+1 in OPT.

ALG = { i1, i2, … , ik } ik-1 ik

OPT = { j1, j2, … , jk, jk+1, … , jm } jk-1 jk jk+1 … jm

21
PROOF: Heuristic 5 is Correct i.e. Optimal
ALG = { i1, i2, … , ik } ik-1 ik

OPT = { j1, j2, … , jk, jk+1, … , jm } jk-1 jk jk+1 … jm

From above diagram and results from previous slides, it is obvious


that:
f(ik) ≤ f(jk) ≤ s(jk+1) ≤ f(jk+1), or simply f(ik) ≤ f(jk+1)
So after deleting all requests that are not compatible with ik, the set
of possible requests, say R, still contains jk+1.
But the greedy algorithm stops with request ik, and it is only
supposed to stop when R is empty.
Therefore, jk+1 cannot exist and our assumption that m > k is wrong.
Hence, m = k.
22
Class Task 6
■ What is the time complexity of solving the Interval Scheduling
Problem if BRUTE FORCE TECHNIQUE would have been used?

– Upload a single .pdf file on LMS


– Deadline: 28 January, 2021 before midnight.

23
Class Task 7
■ What is the time complexity of solving the Interval Scheduling
Problem using GREEDY HEURISTIC 5?

– Upload a single .pdf file on LMS


– Deadline: 28 January, 2021 before midnight.

24
Online Quiz #3
■ Graded Quiz
■ 50 Minutes Quiz
■ Lecture 31 to 39

■ Date: 30 Jan, 2021


■ Time: Will be announced later

25
26
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 39
MINIMUM SPANNING TREES (MST),
PRIM'S & KRUSKAL'S ALGORITHM
INSTRUCTOR: Sulaman Ahmad Naz
Minimum Cost Spanning Trees

■ Introduction
– The cost of a spanning tree of a weighted, undirected graph is
the sum of the costs (weights) of the edges in the spanning tree.
– A minimum-cost spanning tree is a spanning tree of least cost.
– Two different algorithms can be used to obtain a minimum cost
spanning tree.
■ Kruskal’s algorithm
■ Prim’s algorithm
– All these algorithms use a design strategy called the greedy
approach.
– MST of a graph may not be unique.
Minimum Cost Spanning Trees
■ Greedy Strategy
– Construct an optimal solution in stages
– At each stage, we make the best decision (selection) possible at
this time.
■ using least-cost criterion for constructing minimum-cost spanning trees
– Make sure that the decision will result in a feasible solution
– A feasible solution works within the constraints specified by the
problem
■ Our solution must satisfy the following constrains
– Must use only edges within the graph.
– Must use exactly n - 1 edges.
– May not use edges that produce a cycle
Kruskal’s Algorithm
Algorithm Kruskal_MST(Graph G)
New Graph MST
MST.AddVertices(G.V[])
Sort G.E[] in ascending order
For min(u,v) in G.E[]
MST.AddEdges((u,v))
If MST.hasCycles()=TRUE
MST.RemoveEdge((u,v))
if |MST.E[]|≠|MST.V[]|-1
return NULL
return MST
4
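A runnable sketch of Kruskal's algorithm on the example graph from the next slides. Instead of the MST.hasCycles() check, it uses the standard union-find (disjoint set) structure: an edge is rejected exactly when both endpoints already lie in the same component.

def kruskal_mst(vertices, edges):
    # edges: list of (weight, u, v); returns MST edges, or None if G is disconnected
    parent = {v: v for v in vertices}

    def find(x):                      # representative of x's component
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):     # ascending order of weight
        ru, rv = find(u), find(v)
        if ru != rv:                  # adding (u, v) does not create a cycle
            parent[ru] = rv           # union the two components
            mst.append((u, v, w))
    return mst if len(mst) == len(vertices) - 1 else None

E = [(3, 'A', 'B'), (5, 'A', 'C'), (4, 'A', 'D'), (7, 'B', 'E'), (1, 'C', 'D'),
     (5, 'C', 'E'), (1, 'C', 'F'), (3, 'E', 'F'), (10, 'D', 'G'), (3, 'E', 'G')]
print(kruskal_mst('ABCDEFG', E))
# [('C','D',1), ('C','F',1), ('A','B',3), ('E','F',3), ('E','G',3), ('A','D',4)]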
Kruskal's Algorithm: Worked Example (slides 5-44)

[The original slides step through Kruskal's algorithm graphically on an example graph G with vertices A-G; only the edge tables survive the text extraction, so the run is summarized here.]

G.E[ ] in original order (edge = weight):
A,B = 3   A,C = 5   A,D = 4   B,E = 7   C,D = 1   C,E = 5   C,F = 1   E,F = 3   D,G = 10   E,G = 3

G.E[ ] sorted in ascending order of weight:
C,D = 1   C,F = 1   A,B = 3   E,F = 3   E,G = 3   A,D = 4   A,C = 5   C,E = 5   B,E = 7   D,G = 10

Edges are examined in sorted order and added to MST, and an edge is removed again whenever it closes a cycle. For this graph the accepted edges are C-D, C-F, A-B, E-F, E-G and A-D; the remaining edges (A-C, C-E, B-E, D-G) are rejected because they would create cycles. The resulting MST has 6 = |V| - 1 edges and total cost 15.
Prim’s Algorithm
Algorithm Prims_MST(Graph G, Vertex r)
    New Set A[], Temp[] = G.V[]
    For each v in G.V[]
        v.cost = ∞ and v.previous = NULL
    r.cost = 0
    While Temp[] is not Empty
        u = Min(Temp[]) by cost
        Temp[].Remove(u)
        if u.previous ≠ NULL
            A[].Add( (u, u.previous) )
        for each v in adjacent(u)
            if Temp[].Search(v) = TRUE and w(u,v) < v.cost
                v.cost = w(u,v) and v.previous = u
45
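A minimal C++ sketch of Prim's algorithm follows. Instead of scanning Temp[] for the minimum-cost vertex, it uses a min-priority queue, a common optimization that is not shown in the slide's pseudocode; the names and the hard-coded example graph are illustrative assumptions.

#include <functional>
#include <iostream>
#include <queue>
#include <vector>
using namespace std;

const int INF = 1e9;

// Prim's algorithm from root r; adj[u] holds pairs (v, weight).
// Returns the MST edges as (previous[v], v) pairs.
vector<pair<int,int>> primMST(const vector<vector<pair<int,int>>>& adj, int r) {
    int n = adj.size();
    vector<int> cost(n, INF), previous(n, -1);
    vector<bool> inTree(n, false);
    // Min-heap ordered by (cost, vertex).
    priority_queue<pair<int,int>, vector<pair<int,int>>, greater<>> pq;
    cost[r] = 0;
    pq.push({0, r});
    vector<pair<int,int>> mstEdges;
    while (!pq.empty()) {
        int u = pq.top().second; pq.pop();
        if (inTree[u]) continue;                  // stale heap entry
        inTree[u] = true;
        if (previous[u] != -1) mstEdges.push_back({previous[u], u});
        for (auto [v, w] : adj[u])
            if (!inTree[v] && w < cost[v]) {      // found a cheaper connection to the tree
                cost[v] = w;
                previous[v] = u;
                pq.push({cost[v], v});
            }
    }
    return mstEdges;
}

int main() {
    // Example graph from the slides, vertices A..G as 0..6, root r = A.
    vector<vector<pair<int,int>>> adj(7);
    auto addEdge = [&](int u, int v, int w) { adj[u].push_back({v,w}); adj[v].push_back({u,w}); };
    addEdge(0,1,3); addEdge(0,2,5); addEdge(0,3,4); addEdge(1,4,7); addEdge(2,3,1);
    addEdge(2,4,5); addEdge(2,5,1); addEdge(4,5,3); addEdge(3,6,10); addEdge(4,6,3);
    for (auto [u, v] : primMST(adj, 0))
        cout << char('A'+u) << "-" << char('A'+v) << "\n";   // AB, AD, DC, CF, FE, EG
}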
Prim's Algorithm: Worked Example (slides 46-111)

[The original slides trace Prim's algorithm graphically on the same example graph G (vertices A-G), starting from root r = A; only the running sets survive the text extraction, so the run is summarized here.]

Initially A[ ] = { }, Temp[ ] = {A,B,C,D,E,F,G}, every vertex has cost ∞ and no previous vertex, and r.cost = 0.
Vertices are extracted from Temp[ ] in order of minimum cost: A, B, D, C, F, E, G. Each extraction (except the root) contributes the edge (u, u.previous), so the edge set grows as
A[ ] = {AB} → {AB,AD} → {AB,AD,DC} → {AB,AD,DC,CF} → {AB,AD,DC,CF,EF} → {AB,AD,DC,CF,EF,EG}.

Finally the MST is assembled from A[ ]:
    New Graph MST
    MST.AddVertices(G.V[])
    MST.AddEdges(A[])
    if |MST.E[]| ≠ |MST.V[]| - 1
        return NULL
    return MST

The resulting MST has edges {A-B, A-D, D-C, C-F, E-F, E-G} and total cost 3 + 4 + 1 + 1 + 3 + 3 = 15, the same tree found by Kruskal's algorithm.
112
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 40, 41 & 42

DYNAMIC PROGRAMMING
& SHORTEST PATH PROBLEM
INSTRUCTOR: Sulaman Ahmad Naz
Example 1 (slides 2-37)

A( n )
{
    If n >= 1
        A(n-1);
        print n
        A(n-1);
}

[The original slides animate the recursion tree of A(3) node by node; only the call counts and the concluding observations survive the text extraction, so they are summarized here.]

A(3) calls A(2) twice, each A(2) calls A(1) twice, and each A(1) calls A(0) twice; unwinding the tree prints the output 1 2 1 3 1 2 1.

Number of calls:
A(0): 1  = 2^1 - 1
A(1): 3  = 2^2 - 1
A(2): 7  = 2^3 - 1
A(3): 15 = 2^4 - 1
A(n): 2^(n+1) - 1 calls, so the total work is k(2^(n+1) - 1) = O(2^n).

For an input of size n, the recursion stack only reaches height n + 1 (one frame per active call on the current path). If each block of the stack consists of k memory locations, the total space required is k(n+1), i.e. S(n) = k(n+1) = O(n).
Analyzing Time Complexity (slides 38-40)

For A(n), each call makes two recursive calls on input n-1 plus a constant amount of work:
    T(n) = 2T(n-1) + c,  n >= 1
    T(0) = c                (the anchor condition)

    T(n)   = 2T(n-1) + c        ...eq(i)
    T(n-1) = 2T(n-2) + c        ...eq(ii)
    T(n-2) = 2T(n-3) + c        ...eq(iii)

Putting eq(ii) in eq(i):
    T(n) = 2[2T(n-2) + c] + c = 2^2 T(n-2) + 2c + c        ...eq(iv)
Putting eq(iii) in eq(iv):
    T(n) = 2^2 [2T(n-3) + c] + 2c + c = 2^3 T(n-3) + 2^2 c + 2c + c

Suppose we reach the anchor condition after k calls:
    T(n) = 2^k T(n-k) + 2^(k-1) c + ... + 2^2 c + 2c + c        ...eq(v)

T(0) is the last recursive call, and T(n-k) becomes T(0) when n - k = 0, i.e. k = n. So eq(v) becomes:
    T(n) = 2^n T(0) + 2^(n-1) c + ... + 2^2 c + 2c + c
         = c(2^n + 2^(n-1) + ... + 2^2 + 2 + 1)
         = c(2^(n+1) - 1)
         = O(2^n)
Our Approach (slide 41)

For Example 1 solved by plain recursion:
    S(n) = O(n)       (stack space)
    T(n) = O(2^n)     (time)
New Approach (slides 42-46)

[The original slides animate the memoized version of Example 1; the intermediate frames are summarized here.]

Idea: keep a memo table indexed 0 … n. The first recursive call A(n-1) is executed normally, its printed output M(n-1) is stored in the table, and the second call is replaced by simply re-printing the stored block.

The recursion therefore becomes a single chain A(3) → A(2) → A(1) → A(0), and the memo fills up as:

    index:   0         1     2       3
    memo:    <blank>   1     121     1213121

Complexity of the new approach:
    S(n) = O(n) + 2(n+1) = O(n)
    T(n) = (n+1)·c = O(n)

A sketch of this memoized version is given below.
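The following is a minimal C++ sketch of the memoized "new approach" (it is not from the slides; the function and variable names are illustrative). The output of each A(k) is computed once and stored, so only n+1 calls are made; re-using a stored block is treated as a single step, as in the slides' analysis.

#include <iostream>
#include <string>
#include <vector>
using namespace std;

vector<string> memo;   // memo[k] holds the text printed by A(k); "" means not computed yet

// Returns the output of the original A(n) instead of printing it directly.
string A(int n) {
    if (n < 1) return "";                  // A(0) prints nothing
    if (!memo[n].empty()) return memo[n];  // reuse a previously computed result
    string left = A(n - 1);                // only ONE real recursive call per value of n
    memo[n] = left + to_string(n) + left;  // A(n) = A(n-1), print n, A(n-1)
    return memo[n];
}

int main() {
    int n = 3;
    memo.assign(n + 1, "");
    cout << A(n) << "\n";   // prints 1213121, using only n+1 calls instead of 2^(n+1)-1
}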
Dynamic Programming
■ Dynamic programming is essentially recursion without
repetition.
■ Developing a dynamic programming algorithm generally
involves two separate steps:
– Formulate problem recursively. Write down a formula for the
whole problem as a simple combination of answers to smaller
sub-problems.
– Build solution to recurrence from bottom up. Write an
algorithm that starts with base cases and works its way up to
the final solution.
Dynamic Programming
■ Decompose into sub-problems, like divide and conquer.
■ Sub-problems are dependent.
■ Record the results of smaller sub-problems: dynamic programming algorithms need to store the results of intermediate sub-problems.
■ Re-use them for further occurrences.
■ Mostly reduces complexity from exponential to polynomial.
In our case, it reduced the complexity from exponential time to linear time.
Dynamic Programming
■ It is especially good for, and intended for, optimization problems.
■ In optimization problems, you want to minimize or maximize something, e.g. shortest paths.
■ In optimization problems, you want to find the best way to do something.
■ Typically, good algorithms to solve them involve dynamic programming.

49
DP is Careful Brute Force
■ You can also think of dynamic programming as a kind of exhaustive search, which is usually a bad thing to do because it leads to exponential time.
■ But if you do it in a clever way, via dynamic programming, you typically get polynomial time.
■ So, according to this perspective, dynamic programming is approximately 'careful brute force'.

50
Literal Meaning of DP
■ The notion 'Dynamic' means 'opposite of static'.
■ The word 'Programming' refers to 'Optimization' or 'Planning'.

    Dynamic Programming  ≈  planning to dynamically optimize something
51
Example 2

[Figure: a dependency graph of subproblems; the node labels are omitted.]

    V(n) = f ( V(n-3), V(n-4), V(n-5) )

Properties:
1. Recursive in nature
2. Dividing into subproblems
3. Overlapping subproblems
4. Optimal Substructure
52
Example 3
■ Fibonacci Numbers:
– 1 1 2 3 5 8 13 21 34 55 …
■ Formula:
    F_n = F_(n-1) + F_(n-2),   F_1 = F_2 = 1
  or
    F(n) = F(n-1) + F(n-2),   F(1) = F(2) = 1
53
Example 3:
Naïve Recursive Algorithm
fib(n):
    if n < 3: f = 1
    else: f = fib(n-1) + fib(n-2)
    return f

■ Time Complexity:
    T(n) = T(n-1) + T(n-2) + Θ(1)
    T(1) = T(2) = Θ(1)
54
Example 3:
Naïve Recursive Algorithm
■ Time Complexity:
    T(n) = T(n-1) + T(n-2) + Θ(1)
    T(1) = T(2) = Θ(1)

■ Solution:
    T(n) > T(n-2) + T(n-2) + Θ(1)
    ⇒ T(n) > 2T(n-2) + Θ(1)
    ⇒ T(n) = Ω(2^(n/2))

So, growth is exponential.
55
Example 3:
Memoized DP Algorithm
initiate memo as EMPTY

fib(n):
    if n in memo: return memo[n]
    if n < 3: f = 1
    else: f = fib(n-1) + fib(n-2)
    memo[n] = f
    return f
56
Example 3:
Memoized DP Algorithm

■ Time Complexity:
[The original slides draw the recursion tree of fib(n) and prune it: with memoization, every repeated subtree is replaced by a constant-time memo lookup, leaving a single chain fib(n) → fib(n-1) → … → fib(2).]

    T(n) = T(n-1) + Θ(1)
    T(1) = T(2) = Θ(1)

■ Solution:
    T(n) = T(n-1) + Θ(1) = Θ(n)

So, growth is linear.
61
Example 3:
Bottom-up DP Algorithm

fib(n):
    initiate array[1…n] to EMPTY
    for i = 1 to n
        if i < 3: f = 1
        else: f = array[i-1] + array[i-2]
        array[i] = f
    return array[n]

62
Example 3:
Bottom-up DP Algorithm
■ Time Complexity:
fib(n):
    initiate array[1…n] to EMPTY          ---> O(n)
    for i = 1 to n                        ---> O(1) + O(n) + O(n) + c = O(n)
        if i < 3: f = 1                   ---> 2c = O(1)
        else: f = array[i-1] + array[i-2] ---> c(n-2) = O(n)
        array[i] = f                      ---> cn = O(n)
    return array[n]                       ---> O(1)

Total: T(n) = O(n), using O(n) extra space for the array.

63
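A minimal C++ sketch of the memoized and bottom-up versions above; the names and the use of long long are illustrative assumptions, not part of the slides.

#include <iostream>
#include <vector>
using namespace std;

// Memoized (top-down) DP: each fib(i) is computed once and stored.
long long fibMemo(int n, vector<long long>& memo) {
    if (memo[n] != 0) return memo[n];             // already computed (all Fibonacci values are >= 1)
    long long f = (n < 3) ? 1 : fibMemo(n - 1, memo) + fibMemo(n - 2, memo);
    memo[n] = f;
    return f;
}

// Bottom-up DP: fill the table from the base cases upward.
long long fibBottomUp(int n) {
    vector<long long> table(n + 1);
    for (int i = 1; i <= n; ++i)
        table[i] = (i < 3) ? 1 : table[i - 1] + table[i - 2];
    return table[n];
}

int main() {
    int n = 50;
    vector<long long> memo(n + 1, 0);
    cout << fibMemo(n, memo) << "\n";   // 12586269025
    cout << fibBottomUp(n) << "\n";     // same value; O(n) time, O(n) space
}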
Class Tasks 8 & 9
8. Write C++ code to implement the BOTTOM-UP version of Example 1 discussed at the beginning of the lecture.

9. Write the pseudocode of the MEMOIZED version of the same example.

■ Instructions:
– Upload a single ".CPP" file for Class Task 8 and a single ".DOCX" file for Class Task 9 by 04 February, 2021 on LMS.
64
Shortest Path Problem (Single Source)

[Figure: an example weighted directed graph G (vertices A-G) with source s and destination d.]

■ Here, again, it is an optimization problem.
■ We can write at least three algorithms for it:
– Naïve Recursive Algorithm
– Memoized DP Algorithm
– Bottom-up DP Algorithm
■ I will discuss the necessary basics here, and leave the rest to you as an ungraded assignment.
65
Shortest Path Problem (Single Source)
■ NAÏVE Algorithm
– Tool: GUESSING
– We will guess the right answer.
– Concept:
    ■ Try guessing the right answer.
    ■ Don't just try any guess.
    ■ Try all of them.
    ■ Take the best one.
– This is the BRUTE FORCE part.
– In the DP version of this problem, you will CAREFULLY use it by avoiding unnecessary recursions through memoization.
66
Shortest Path Problem (Single Source)
■ Applying the GUESSING principle to the shortest path problem:
– Look at all the nodes from which we can reach 'd', and then look at the shortest path from 's' to each of them.
– So, there is some hypothetical shortest path from 's' to 'd'.
– We don't know through which edge we arrive at 'd'; it may be any incoming edge of 'd'. What to do then?
– Answer: We will GUESS. Or, more precisely, we will try all possible guesses.
67
Shortest Path Problem (Single Source)
■ Applying the GUESSING principle (continued):
– After that, recursively compute the shortest path from 's' to 'd' through the guessed predecessor 'u':

    D(s,d) = D(s,u) + w(u,d)

– Here, (u,d) is our current GUESS, and 'u' is a node adjacent to 'd' from which 'd' can be reached.
– But maybe our guess is wrong! So, we will take the minimum over all possible guesses:

    D(s,d) = Min[ D(s,u) + w(u,d) : (u,d) ∈ E ]
68
Shortest Path Problem (Single Source)

    D(s,d) = Min[ w(u,d) + D(s,u) : (u,d) ∈ E ]
    D(s,s) = 0

Example (source s = A, destination d = G):
    D(A,G) = Min[ 10 + D(A,D) ]
    D(A,D) = Min[ 4 + D(A,A), 1 + D(A,C) ]
    D(A,A) = 0
    D(A,C) = Min[ 5 + D(A,A) ] = 5
    D(A,D) = Min[ 4, 6 ] = 4
    D(A,G) = 10 + 4 = 14
69
Shortest Path Problem (Single Source)
■ The same recurrence is applied for every node.
■ In the worst case, there can be (n-1) incoming edges to each node.
■ Hence, the time complexity of the naïve recursion is exponential.

NOT PRACTICAL FOR BIG GRAPHS.
70
Shortest Path Problem (Single Source)
■ MEMOIZED Algorithm
– To make the algorithm better, we memoize.
– The memoized function is the same brute-force recursive algorithm:

    D(s,d) = Min[ w(u,d) + D(s,u) : (u,d) ∈ E ]
    D(s,s) = 0

– But at the beginning of the function definition, we check whether the value of D(s,d) is already in the memo or not.
– So, do we have a better algorithm now? (See the sketch below.)
71
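A minimal C++ sketch of this memoized recurrence is given below, assuming the graph is a DAG stored by incoming edges. The names are illustrative, and the hard-coded example contains only the edges used in the D(A,G) calculation from the earlier slide.

#include <iostream>
#include <vector>
using namespace std;

const long long INF = 1e18;

int s;                                          // source vertex
vector<vector<pair<int,int>>> incoming;         // incoming[v] = list of (u, w(u,v))
vector<long long> memo;                         // memo[v] = shortest distance s -> v, or -1 if unknown

// D(s,v) = min over incoming edges (u,v) of D(s,u) + w(u,v); D(s,s) = 0.
// Works only on DAGs: on a cyclic graph this recursion would never terminate.
long long D(int v) {
    if (v == s) return 0;
    if (memo[v] != -1) return memo[v];          // reuse an earlier result
    long long best = INF;                       // INF means "unreachable from s"
    for (auto [u, w] : incoming[v])
        if (D(u) < INF)
            best = min(best, D(u) + w);
    return memo[v] = best;
}

int main() {
    // Small DAG with only the edges needed for the D(A,G) example; A..G as 0..6.
    int n = 7;
    incoming.assign(n, {});
    memo.assign(n, -1);
    s = 0;                                       // source A
    incoming[3].push_back({0,4});                // A -> D (4)
    incoming[2].push_back({0,5});                // A -> C (5)
    incoming[3].push_back({2,1});                // C -> D (1)
    incoming[6].push_back({3,10});               // D -> G (10)
    cout << "D(A,G) = " << D(6) << "\n";         // 14, matching the slides' calculation
}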
Shortest Path Problem (Single Source)
■ Limitation of our Algorithm
– Remember, we are only considering Directed Acyclic Graphs (DAGs) here.
– It will not work for cyclic graphs.

■ Ungraded Class Task
– Considering the above statement, show that our algorithm is not correct for Directed Cyclic Graphs (DCGs); on such graphs the recursion never terminates.
72
73
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 43 & 44
SHORTEST PATH PROBLEM,
DIJKSTRA'S ALGORITHM
INSTRUCTOR: Sulaman Ahmad Naz
Shortest Path Problem
■ The shortest path problem is about finding a path between 2 vertices in a graph such that the total sum of the edge weights is minimum.

■ Dijkstra's Algorithm finds the minimal distance from the source vertex to all other vertices in the graph.

■ This information can be used to get the shortest path from the source to any other node.
2
Dijkstra’s Algorithm
Algorithm Dijkstra(Graph G, Vertex s)
    New Set Temp[] = G.V[]
    For each v in G.V[]
        v.distance = ∞ and v.previous = NULL
    s.distance = 0
    While Temp[] is not Empty
        u = Min(Temp[]) by distance
        Temp[].Remove(u)
        for each v in adjacent(u)
            if v.distance > u.distance + w(u,v)
                v.distance = u.distance + w(u,v)
                v.previous = u
3
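A minimal C++ sketch of Dijkstra's algorithm follows. Instead of scanning Temp[] for the minimum-distance vertex, it uses a min-priority queue, a common optimization not shown in the pseudocode; the previous[] array is kept so that the shortest path can be reconstructed, as on the later slides. Names and the hard-coded example graph are illustrative assumptions.

#include <functional>
#include <iostream>
#include <queue>
#include <vector>
using namespace std;

const long long INF = 1e18;

// Dijkstra from source s; adj[u] holds (v, weight) pairs. Returns (distance, previous).
pair<vector<long long>, vector<int>> dijkstra(const vector<vector<pair<int,int>>>& adj, int s) {
    int n = adj.size();
    vector<long long> dist(n, INF);
    vector<int> previous(n, -1);
    priority_queue<pair<long long,int>, vector<pair<long long,int>>, greater<>> pq;   // min-heap
    dist[s] = 0;
    pq.push({0, s});
    while (!pq.empty()) {
        auto [d, u] = pq.top(); pq.pop();
        if (d > dist[u]) continue;                   // stale heap entry
        for (auto [v, w] : adj[u])
            if (dist[u] + w < dist[v]) {             // relaxation step
                dist[v] = dist[u] + w;
                previous[v] = u;
                pq.push({dist[v], v});
            }
    }
    return {dist, previous};
}

int main() {
    // Example graph from the slides, vertices A..G as 0..6, source s = C.
    vector<vector<pair<int,int>>> adj(7);
    auto addEdge = [&](int u, int v, int w) { adj[u].push_back({v,w}); adj[v].push_back({u,w}); };
    addEdge(0,1,3); addEdge(0,2,5); addEdge(0,3,4); addEdge(1,4,7); addEdge(2,3,1);
    addEdge(2,4,5); addEdge(2,5,1); addEdge(4,5,3); addEdge(3,6,10); addEdge(4,6,3);
    auto [dist, previous] = dijkstra(adj, 2);
    cout << "Distance(C,G) = " << dist[6] << "\nPath: ";
    vector<int> path;
    for (int v = 6; v != -1; v = previous[v]) path.push_back(v);     // walk back via previous[]
    for (int i = (int)path.size() - 1; i >= 0; --i) cout << char('A' + path[i]) << " ";
    cout << "\n";
}

For the example graph with source C, this prints Distance(C,G) = 7 and the path C F E G, matching the table at the end of this lecture.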
Dijkstra's Algorithm: Worked Example (slides 4-57)

[The original slides trace Dijkstra's algorithm graphically on the same example graph G (vertices A-G), with source s = C; only the running sets and labels survive the text extraction, so the run is summarized here.]

Initially Temp[ ] = {A,B,C,D,E,F,G}, every vertex has distance ∞, and s.distance = 0.
Vertices are extracted from Temp[ ] in order of minimum distance: C (0), D (1), F (1), E (4), A (5), G (7), B (8). Each extraction relaxes the edges incident to the extracted vertex, updating the (distance, previous) labels, e.g. D: (1, C), F: (1, C), E: (4, F), A: (5, C), G: (7, E), B: (8, A).
The final distance/previous table and the reconstructed path C → F → E → G are shown on the next slide.
Dijkstra's Algorithm: Result (source s = C)

Vertex | Distance | Previous
  A    |    5     |    C
  B    |    8     |    A
  C    |    0     |    —
  D    |    1     |    C
  E    |    4     |    F
  F    |    1     |    C
  G    |    7     |    E

Distance(C,G) = 7, path: C → F → E → G
58
QUIZ #3
■ Time: 03:50 pm to 04:40 pm
■ Duration: 50 minutes

59
60
DESIGN & ANALYSIS OF
ALGORITHMS

Lecture # 45 & 46
INTRODUCTION TO COMPUTATIONAL
COMPLEXITY CLASSES
INSTRUCTOR: Sulaman Ahmad Naz
Topics: Computational Complexity
■ P, EXP & R Classes
■ Most problems are not computable
■ NP Class
■ P vs. NP?
■ Hardness & Completeness
■ Reductions
2
P, EXP & R Classes

P   = {problems solvable in polynomial time, n^c}
EXP = {problems solvable in exponential time, 2^(n^c)}
R   = {problems solvable in some finite time}

[Figure: nested classes P ⊆ EXP ⊆ R; problems outside R are not solvable.]
3
Examples
■ Negative-weight cycle detection ∈ P (and hence ∈ EXP)
■ n x n Chess ∈ EXP but ∉ P
■ Tetris ∈ EXP, and it is not known whether Tetris ∈ P
■ The Halting Problem ∉ R
4
Most Decision Problems ∉ R
PROOF:
■ A program ≈ a finite binary string ≈ a natural number ∈ ℕ
■ A decision problem ≈ a function from input to Yes/No
    Input: a binary string ≈ a natural number ∈ ℕ
    Output: Yes/No, i.e. 0/1
  so a decision problem ≈ a function from ℕ to {0,1} ≈ an infinite binary string ≈ a real number ∈ ℝ

    n:    0 1 2 3 4 5 6 7 8 …
    f(n): 0 0 1 0 1 1 0 0 1 …

■ Therefore a decision problem corresponds to a real number, while a program corresponds to a natural number, and |ℝ| >> |ℕ|.
Hence, almost every decision problem is unsolvable by any program.
7
NP Class
NP = {decision problems solvable in polynomial time via a 'lucky' algorithm}
A nondeterministic model: the algorithm makes guesses and then says Yes/No.
Guesses are guaranteed to lead to a 'Yes' answer, if one is possible.
[Figure: P ⊆ NP ⊆ EXP ⊆ R, with Tetris placed in NP and the Halting Problem outside R.]
8
NP = {decision problems solvable in polynomial time via a 'lucky' algorithm}
[Figure: a tree of nondeterministic guesses; several branches end in 'No', but a lucky branch leads to 'Yes'.]
9

Tetris ∈ NP
The decision problem "Can I survive?" is in NP.
10
NP Class
NP = {decision problems with "solutions" that can be "checked" or "verified" in polynomial time}
Whenever the answer is 'Yes', it can be proved, and its proof can be verified in polynomial time.
[Figure: P ⊆ NP ⊆ EXP ⊆ R, with Tetris placed in NP and the Halting Problem outside R.]
11
NP = {decision problems with "solutions" that can be "checked" or "verified" in polynomial time}
Verify !!??
Consider the following scenario:

Khalid: I am 'lucky'. I am really good at playing Tetris.
Anwar: I am not convinced.
Khalid: Give me a problem and I can prove it.
Anwar gives him a really hard Tetris problem.
Anwar: Can you survive these pieces?
Khalid: Yes, I can survive.
12
NP = {decision problems with "solutions" that can be "checked" or "verified" in polynomial time}
Verify !!??
So, the PROOF is the sequence of moves that Khalid makes.
Anwar can then CHECK/VERIFY whether that sequence works by implementing the rules of Tetris, in polynomial time.

NP = {decision problems with "solutions" that can be "checked" or "verified" in polynomial time}
Whenever the answer is 'Yes', it can be proved, and its proof can be verified in polynomial time.
13
P = NP ???

It is just a conjecture; it has not been proved yet.
General consensus: P ≠ NP
Some mathematicians: P = NP
It is a Millennium Prize Problem worth $1,000,000.
[Figure: P ⊆ NP ⊆ EXP ⊆ R.]
14
NP-HARD

If P ≠ NP,
then Tetris ∈ NP − P,
so Tetris ∉ P.
So, it is one of the hardest problems in NP:
Tetris is NP-HARD,
i.e. at least as hard as every problem in NP.

15
NP-HARD

If P = NP,
then Tetris ∈ P.
But it still sits at the hard extreme of P and NP,
so it is still one of the hardest problems in P and NP.
Thus, Tetris is still NP-HARD,
i.e. at least as hard as every problem in NP.

16
NP-Complete

Tetris is NP-COMPLETE:
■ Tetris is NP-HARD (as hard as every problem in NP).
■ Tetris is in NP.

NP ∩ NP-Hard = NP-COMPLETE
17
NP-Complete
[Figure: the landscape P ⊆ NP ⊆ EXP ⊆ R with the NP-Hard / NP-Complete and EXP-Hard / EXP-Complete regions marked; the open questions P = NP? and NP = EXP? sit at the boundaries; Tetris lies at the NP-Complete level, Chess in EXP, and the Halting Problem outside R (not solvable).]
18
Reductions: Problem A → Problem B

Reductions are a way to design algorithms: convert an instance of problem A into an instance of a problem B that we already know how to solve (e.g. graph reductions).

Examples:
■ Unweighted shortest path → Dijkstra's weighted shortest path problem (give every edge weight 1)
■ <see next slide>
19
Reductions: Problem A → Problem B

Let us consider a game called "Number Scrabble".
First, player 1 picks a number from 1 to 9; then player 2 picks a number, and the players keep alternating.
The aim of the game is to collect 3 numbers that add up to exactly 15.
Whoever does this first wins. If all numbers are picked and no player holds three numbers summing to 15, the game is a draw.
20
Reductions: Problem A → Problem B

Let us reframe the game now and arrange the numbers 1 to 9 in a 3 x 3 square, in a very specific order, so that every row, column and diagonal adds up to 15 (for example: 2 7 6 / 9 5 1 / 4 3 8).
This is called a "Magic Square".
This type of arrangement looks very familiar: remember the "Tic-Tac-Toe" game.
21
Reductions: Problem A → Problem B

[Figure: a Tic-Tac-Toe board; picking a number in Number Scrabble corresponds to marking the matching cell of the magic square, so Number Scrabble reduces to Tic-Tac-Toe.]

Similarly, a lot of games can be reduced to the exact same game.
22

Reductions: Problem A → Problem B

And a lot of problems in mathematics and computer science can be reduced to the exact same problem.
23
‫‪End of COURSE‬‬
‫لی ٰذلِ َ‬
‫ک‬ ‫َ‬
‫ع‬ ‫ِ‬ ‫ﱣ‬ ‫ِ‬ ‫ُ‬
‫د‬ ‫ْ‬
‫م‬ ‫َ‬
‫ح‬ ‫ْ‬
‫ال‬ ‫َ‬
‫ف‬
‫ٰ‬

‫‪24‬‬
