
UNIT I

INTRODUCTION
Contents:
 Algorithm
- Definition
- Criteria (properties of an algorithm)
- Different study areas of algorithms
 Analysis of algorithms
 Asymptotic notations
 Recurrence relations
- Substitution method
- Recursion tree
- Master's Theorem
 General rules to find complexity, with examples
 Practical complexities
Prerequisites & Objectives
Prerequisites: Before beginning this subject, you should be able to
 Read and create algorithms.
 Read and create recursive algorithms.
 Identify comparison and arithmetic operations.
 Use basic algebra (geometric series and progressions, recurrence relations).

Objectives: At the end of this chapter you should be able to


 Analyze the asymptotic performance of algorithms.
 Understand how to choose the operations that are counted and why
others are not.
 Learn how to do a best-case, worst-case, and average-case analysis.
 Use important algorithmic design strategies.
 Synthesize efficient algorithms in common engineering situations.
 Convert a simple recurrence relation into its closed form.
Algorithm 1/3

Definition:
An algorithm is a finite set of instructions that, if followed, accomplishes
a particular task.
OR
An algorithm is a sequence of unambiguous instructions for solving a
problem, i.e., for obtaining a required output for any legitimate input in a
finite amount of time.
OR
An algorithm is a finite set of instructions that accomplishes a particular
task.
OR
A clearly specified set of instructions to solve a problem.
Algorithm 2/3

Criteria : All algorithms must satisfy the following criteria.

Input : Zero or more quantities are externally supplied.

Output : At least one quantity is produced.

Definiteness : Each instruction/statement/step is clear and unambiguous.

Finiteness : The algorithm must terminate after a finite number of steps.

Effectiveness : Each instruction must be basic enough that it can be carried
out.
Algorithm 3/3

Different study areas of Algorithm:


 How to devise an algorithm?

 How to validate an algorithm?

 How to analyze an algorithm?

 How to test an algorithm?


Analysis of algorithms/Performance Analysis

How good is the algorithm?

 Time efficiency/complexity: The time complexity of an algorithm
is the amount of computer time it needs to run to completion.

 Space efficiency/complexity: The space complexity of an
algorithm is the amount of memory it needs to run to completion.

 N/W consumption (WB / CB)

 CPU registers (DD / SP)

 Power consumption
 Correctness: ignored in this course.
Analysis of algorithms
How to Analyse an Algorithm:
Example : 1

Algorithm                  Time Complexity    Space Complexity
Algorithm Swap(a, b)                          a    - 1
{                                             b    - 1
  temp := a;               1                  temp - 1
  a := b;                  1
  b := temp;               1
}
                           f(n) = 3           S(n) = 3 words

Time complexity : f(n) = 3          O(1) constant
Space complexity: S(n) = 3 words    O(1) constant
Analysis of algorithms
Frequency count method:
Example : 1

Algorithm                  Time Complexity    Space Complexity
Algorithm Sum(A, n)                           s - 1
{                                             A - n
  s := 0;                  1                  n - 1
  for (i:=0; i<n; i++)     n+1                i - 1
    s := s + A[i];         n
  return s;                1
}
                           f(n) = 2n+3        S(n) = n+3 words

Time complexity : f(n) = 2n+3         O(n) linear
Space complexity: S(n) = n+3 words    O(n) linear
Analysis of algorithms
Frequency count method:
Example : 2

Algorithm                        Time Complexity    Space Complexity
Algorithm Add(A, B, n)                              A - n²
{                                                   B - n²
  for (i:=0; i<n; i++)           n+1                C - n²
    for (j:=0; j<n; j++)         n*(n+1)            n - 1
      C[i, j] := A[i, j]+B[i, j];  n*n              i - 1
}                                                   j - 1
                                 f(n) = 2n²+2n+1    S(n) = 3n²+3 words

Time complexity : f(n) = 2n²+2n+1       O(n²) quadratic
Space complexity: S(n) = 3n²+3 words    O(n²) quadratic
Analysis of algorithms
Frequency count method:
Example : 3

Algorithm Test(a, b)            Time Complexity
{                               T(n)
  a := 1;                       1
  while (a < b)
  {
    statement 1;
    a := a*2;
  }
}

The value of a doubles on every iteration: 1, 1*2 = 2^1, 2*2 = 2^2,
2^2*2 = 2^3, ..., 2^k. The while loop terminates when a ≥ b,
i.e. when 2^k ≥ b, so k = log2 b iterations are executed.

T(b) = log2 b, i.e. T(n) = O(log n)
Priori Analysis Vs Posteriori Testing

Priori Analysis                       Posteriori Testing
1. Algorithm                          1. Program
2. Independent of programming         2. Programming-language
   language.                             dependent.
3. H/W & S/W independent.             3. H/W & S/W dependent.
4. Time & space functions.            4. Time in some unit &
                                         no. of bytes.
Asymptotic Complexity

 Running time of an algorithm as a function of input size n


for large n.

 Expressed using only the highest-order term in the


expression for the exact running time.

 Instead of exact running time, say (n2).

 Describes behaviour of function in the limit.

 Written using Asymptotic Notation.


Asymptotic notations

 Θ, O, Ω, o, ω
 Defined for functions over the natural numbers.
o Ex: f(n) = Θ(n²).
o Describes how f(n) grows in comparison to n².
 Each notation defines a set of functions; in practice it is used to
compare the sizes of two functions.
 The notations describe different rate-of-growth relations
between the defining function and the defined set of
functions.
Asymptotic notations cont….

Big "oh": The function f(n) = O(g(n)) (read as "f of n is
big oh of g of n") iff (if and only if) there exist positive constants
c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.
Intuitively: the set of all functions whose
rate of growth is the same as or lower
than that of g(n).
g(n) is an asymptotic upper
bound for f(n).
Example:
3n+2 = O(n)         /* 3n+2 ≤ 4n for n ≥ 2 */
10n²+4n+2 = O(n²)   /* 10n²+4n+2 ≤ 11n² for n ≥ 5 */
Asymptotic notations cont….

Θ (Theta): The function f(n) = Θ(g(n)) (read as "f of n is
theta of g of n") iff there exist positive constants c1, c2 and
n0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.

g(n) is an asymptotically tight
bound for f(n).
Asymptotic notations cont….

Ω (Omega): The function f(n) = Ω(g(n)) (read as "f of n is
omega of g of n") iff (if and only if) there exist positive
constants c and n0 such that c·g(n) ≤ f(n) for all n ≥ n0.

Intuitively: the set of all functions
whose rate of growth is the same
as or higher than that of g(n).

g(n) is an asymptotic lower
bound for f(n).
Relations Between Θ, O, Ω

 f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)),
i.e., Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).

 In practice, asymptotically tight bounds are obtained from
asymptotic upper and lower bounds.
Analysis of Recursive Algorithms
Recurrence Relation:
An equation that defines each term of a sequence in terms of one or more of its
previous terms is called a recurrence relation.
OR
A recurrence relation is an equation that recursively defines a sequence, where the
next term is a function of the previous term(s).

Example : 1) Fibonacci series : < 1, 1, 2, 3, 5, 8, 13, ………>

Fn = Fn-1 + Fn-2 , n ≥ 2   Initial values: F0 = F1 = 1

2) Series : < 3, 9, 27, 81, ……>

Fn = 3*Fn-1 , n ≥ 1   Initial value: F0 = 3

F225 = 3*F224 = 3*(3*F223) = ?   Closed form: Fn = 3^(n+1)


Recurrence Relation 1/8
Substitution Method : Example 1:
T(n) = T(n-1) + 1

Algorithm Test(n)        Time Complexity
{                        T(n)
  if (n > 0)             1
  {
    print n;             1
    Test(n-1);           T(n-1)
  }
}

Counting both operations gives T(n) = T(n-1) + 2; for the
analysis we take T(n) = T(n-1) + 1, with T(0) = 1.

Call tree: Test(3) -> Test(2) -> Test(1) -> Test(0), i.e. n+1 calls.

T(n) = T(n-1) + 1
     = [T(n-2) + 1] + 1 = T(n-2) + 2
     = [T(n-3) + 1] + 2 = T(n-3) + 3
     ------ continue for k times
     = T(n-k) + k

Assume n - k = 0, therefore k = n:
T(n) = T(0) + n = n + 1

T(n) = n+1        T(n) = O(n)
Recurrence Relation 2/8
Substitution Method : Example 2

Algorithm Test(n)                Time Complexity
{                                T(n)
  if (n > 0)                     1
  {
    for (i:=0; i<n; i:=i+1)      n+1
      print i;                   n
    Test(n-1);                   T(n-1)
  }
}

Counting all operations gives T(n) = T(n-1) + 2n + 2; for the
analysis we take T(n) = T(n-1) + n, with T(0) = 1.

T(n) = T(n-1) + n
     = [T(n-2) + n-1] + n = T(n-2) + (n-1) + n
     = [T(n-3) + n-2] + (n-1) + n = T(n-3) + (n-2) + (n-1) + n
     ------ continue for k times
     = T(n-k) + (n-(k-1)) + (n-(k-2)) + --- + (n-1) + n

Assume n - k = 0, therefore k = n:
T(n) = T(0) + 1 + 2 + 3 + ----- + (n-1) + n = 1 + n(n+1)/2

T(n) = (n²+n+2)/2        T(n) = O(n²)
Recurrence Relation 3/8
Recursion Tree : Example 1

Algorithm Test(n)               Time Complexity
{                               T(n)
  if (n > 0)                    1
  {
    for (i:=1; i<n; i:=i*2)     log n
      print i;                  log n
    Test(n-1);                  T(n-1)
  }
}

T(n) = T(n-1) + 2 log n + 1, simplified to T(n) = T(n-1) + log n + 1.

Recursion tree: the levels T(n), T(n-1), ..., T(2), T(1) contribute
log n, log (n-1), ..., log 2, log 1 work respectively, down to T(0).

T(n) = log n + log (n-1) + ------- + log 2 + log 1
     = log [ n * (n-1) * - - - - * 2 * 1]
     = log n!      (no simple closed form)
Use the upper bound log n! ≤ n log n:
T(n) = O(n log n)
Recurrence Relation 4/8
Masters Theorem :
A) For decreasing functions:
Let a > 0, b > 0 and k ≥ 0 be constants, let f(n) = O(n^k) be a function, and let
T(n) be defined on the nonnegative integers by the recurrence relation
T(n) = a·T(n-b) + f(n),
then
Case :
1: if a < 1 then T(n) = O(n^k) OR O(f(n)).
2: if a = 1 then T(n) = O(n^(k+1)) OR O(n * f(n)).
3: if a > 1 then T(n) = O(n^k * a^(n/b)) OR O(f(n) * a^(n/b)).

Examples:
1) T(n) = T(n-1) + 1          T(n) = O(n)
2) T(n) = T(n-1) + n          T(n) = O(n²)
3) T(n) = T(n-1) + log n      T(n) = O(n log n)
4) T(n) = 2T(n-2) + 1         T(n) = O(2^(n/2))
5) T(n) = 2T(n-1) + n         T(n) = O(n * 2^n)
Recurrence Relation 5/8
Masters Theorem :
B) For dividing functions:
Let a ≥ 1 and b > 1 be constants, let f(n) = Θ(n^k log^p n) with k ≥ 0 be a function,
and let T(n) be defined on the nonnegative integers by the recurrence relation
T(n) = a·T(n/b) + f(n),
then
Case 1: if log_b a > k, then T(n) = Θ(n^(log_b a)).
Case 2: if log_b a = k, then
  a) if p > -1: T(n) = Θ(n^k log^(p+1) n)
  b) if p = -1: T(n) = Θ(n^k log log n)
  c) if p < -1: T(n) = Θ(n^k)
Case 3: if log_b a < k, then
  a) if p ≥ 0: T(n) = Θ(n^k log^p n)
  b) if p < 0: T(n) = O(n^k)
Recurrence Relation 6/8
Masters Theorem :
B) For dividing functions: Examples for case 1 (log_b a > k)

1) T(n) = 2T(n/2) + 1
a = 2, b = 2, k = 0, p = 0;  log_2 2 = 1 > 0
Then T(n) = Θ(n^(log_2 2)) = Θ(n)

2) T(n) = 4T(n/2) + n
a = 4, b = 2, k = 1, p = 0;  log_2 4 = 2 > 1
Then T(n) = Θ(n^(log_2 4)) = Θ(n²)

3) T(n) = 8T(n/2) + n log n
a = 8, b = 2, k = 1, p = 1;  log_2 8 = 3 > 1
Then T(n) = Θ(n^(log_2 8)) = Θ(n³)
Recurrence Relation 7/8
Masters Theorem :
B) For dividing functions: Examples for case 2 (log_b a = k)

1) T(n) = 4T(n/2) + n²
a = 4, b = 2, k = 2, p = 0
Since p > -1, case a applies.
Then T(n) = Θ(n² log^(0+1) n) = Θ(n² log n)

2) T(n) = 2T(n/2) + n/log n
a = 2, b = 2, k = 1, p = -1
Since p = -1, case b applies.
Then T(n) = Θ(n log log n)

3) T(n) = 2T(n/2) + n/log² n
a = 2, b = 2, k = 1, p = -2
Since p < -1, case c applies.
Then T(n) = Θ(n)
Recurrence Relation 8/8
Masters Theorem :
B) For dividing functions: Examples for case 3 (log_b a < k)

1) T(n) = T(n/2) + n²
a = 1, b = 2, k = 2, p = 0;  log_2 1 = 0 < 2
Since p ≥ 0, case a applies.
Then T(n) = Θ(n² log^0 n) = Θ(n²)

2) T(n) = 4T(n/2) + n³/log n
a = 4, b = 2, k = 3, p = -1;  log_2 4 = 2 < 3
Since p < 0, case b applies.
Then T(n) = O(n³)
General Rules 1/2

Rule 1- for Loop: The running time of a for loop is at most the
running time of the statements inside the for loop (including tests) times
the number of iterations.
As an example: The following program fragment is O (n) .

for (i=0; i<n; i++)


k++;

 Rule 2- if / else: For the following code fragment

if (condition)
    s1
else
    s2

The running time of an if/else statement is never more than the running
time of the test plus the larger of the running times of s1 and s2.
General Rules 2/2

 Rule 3- Nested loops: Analyze these inside out. The total running time of a
statement inside a group of nested loops is the running time of the statement
multiplied by the product of the sizes of all the loops.
As an example: The following program fragment is O(n²):
for (i=0; i<n; i++)
    for (j=0; j<n; j++)
        k++;
 Rule 4- Consecutive Statements: These just add (which means that the
maximum is the one that counts).
As an example: The following program fragment is O(n²):
for (i=0; i<n; i++)
    a[i] = 0;
for (i=0; i<n; i++)
    for (j=0; j<n; j++)
        k++;
Practical complexities

 O(1) : constant
 O(n) : linear
 O(n^2) : quadratic
 O(n^3) : cubic
 O(2^n) : exponential
 O(log n) : logarithmic
 O(n log n) :
Plot of Function values
Plot of Practical complexities
That’s it.
CPU Time calculation

 CPU Time = CPU Clock cycles / Clock rate

CPU time is the time the CPU spends computing for some task/set of
instructions/program, excluding I/O delays and any other delays.

 CPU Clock cycles = No. of instructions * CPI#

CPI – Cycles per instruction.

# assuming all instructions take the same number of cycles.
CPU Time calculation
Ex: Consider three different processors P1, P2, P3 executing the same instruction set
(program).
P1 has a 3 GHz clock rate and a CPI of 1.5
P2 has a 2.5 GHz clock rate and a CPI of 1.0
P3 has a 4 GHz clock rate and a CPI of 2.2
Which processor has the highest performance expressed in instructions per second?
Solution: We know Performance = 1 / Execution time

CPU time = CPU Clock cycles / Clock rate = (I * CPI) / Clock rate

Performance (instructions per second) = Clock rate / CPI:
P1: 3 * 10^9 / 1.5 = 2 * 10^9
P2: 2.5 * 10^9 / 1.0 = 2.5 * 10^9
P3: 4 * 10^9 / 2.2 ≈ 1.8 * 10^9

Processor P2 has the highest performance.


CPU Time calculation
Expt No 1: Write a C/C++ program to find the actual time of a program.

Functions to be used:

clock(); returns a clock_t value.

Steps:
1. Record/copy the start time into some variable.
2. Call the function that solves the problem.
3. Record/copy the end time into some variable.
4. Calculate the actual time, i.e. (end time - start time):

time_taken = ((double)(end - start)) / CLOCKS_PER_SEC;

You might also like