
CSA06

Design and Analysis of Algorithms

Unit 1 – Introduction

Saveetha School of Engineering


SIMATS, Chennai
Date : 20.04.2021
Day 2
Topics

S.No  Topic Name
1     Fundamentals of the Analysis of Algorithm Efficiency
2     Analysis Framework
3     Asymptotic Notations and Basic Efficiency Classes


1. Fundamentals of the Analysis of Algorithm
Efficiency
 Time efficiency (time complexity): indicates how fast an algorithm runs
 Space efficiency (space complexity): refers to the amount of memory
units required by the algorithm in addition to the space needed for its input and output
 Measuring an Input Size - The time efficiency of an algorithm is typically
measured as a function of the input size
 Units for Measuring Running time
 Order of Growth
 Best-case, Average-case, Worst-case
Units for Measuring Running Time
•The running time of an algorithm should be
measured in a unit that is independent of
extraneous factors such as processor speed,
quality of implementation, compiler, etc.
•We will count the number of times the algorithm’s basic
operation is executed on inputs of size n.
Example
Example
•Algorithm
1. Input n = 20; sum = 0
2. For (i = 0; i < n; i++)
3.     sum = sum + i
4. Display sum

Find the Basic operations
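The basic operation here is the addition `sum = sum + i` inside the loop, executed once per iteration. A minimal Python sketch of the algorithm above, with a counter added (the counter is illustrative, not part of the original algorithm):

```python
# Sketch of the algorithm above. The basic operation is the addition
# "sum = sum + i" inside the loop; we count how many times it runs.
def sum_to_n(n):
    count = 0            # number of times the basic operation executes
    total = 0
    for i in range(n):   # i = 0, 1, ..., n-1
        total = total + i   # basic operation
        count += 1
    return total, count

print(sum_to_n(20))  # the basic operation executes n = 20 times
```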


Time Complexity
•1 – constant
•log n – logarithmic
•n – linear
•n² – quadratic
•n³ – cubic
•2ⁿ – exponential
Example 1
•Algorithm Sum(A, n)
•{
•    s = 0;                 1
•    for i = 1 to n do      n + 1
•    {
•        s = s + A[i];      n
•    }
•    return s;              1
•}
•Total: T(n) = 1 + (n + 1) + n + 1 = 2n + 3, which is O(n)
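The per-line counts in Example 1 can be tallied directly; a small sketch to check that the total cost is 2n + 3:

```python
# Tallying the per-line counts from Example 1: the assignment s = 0 runs
# once, the loop test runs n + 1 times, the loop body runs n times, and
# the return runs once, for a total of 2n + 3 basic steps.
def cost(n):
    return 1 + (n + 1) + n + 1   # = 2n + 3, which grows linearly: O(n)

print(cost(10))  # 23
```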
Order of Growth
The order of growth of the running time of an
algorithm gives a simple characterization of the
algorithm's efficiency.
We are more interested in the order of growth of the
number of times the basic operation is executed as the
input size of an algorithm grows.
This is because, for small inputs, it is difficult to
distinguish efficient algorithms from inefficient ones.
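A quick numerical sketch of this point: at a small input size the common growth functions are close together, but at a large input size they differ enormously, which is why order of growth, not exact step counts, is what distinguishes algorithms.

```python
import math

# Compare log2(n), n, and n^2 at a small and a large input size.
for n in (10, 1_000_000):
    print(f"n={n}: log2(n)={math.log2(n):.1f}, n={n}, n^2={n**2}")
```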
2. Analysis Framework
 Time complexity:
 Compilation Time: Time taken to compile the algorithm; during compilation, the compiler checks for syntax and semantic errors.
 Run Time: Time taken to execute the compiled program. It depends on the number of instructions in the
algorithm. We count 1 unit for executing 1 instruction.

 Space complexity:
 Fixed amount of memory
 Variable amount of memory

 Time Space Tradeoff - Reduction of time increases the space


and vice versa.
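A standard illustration of the time-space tradeoff (the Fibonacci example is my own, not from the slides): memoization spends extra memory on a table of stored results to avoid recomputation, turning exponential time into linear time.

```python
from functools import lru_cache

# Plain recursion: almost no extra space, but exponential running time.
def fib_slow(n):
    return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

# Memoized recursion: stores up to n results (extra space) in exchange
# for linear running time -- time reduced, space increased.
@lru_cache(maxsize=None)
def fib_fast(n):
    return n if n < 2 else fib_fast(n - 1) + fib_fast(n - 2)

print(fib_fast(30))  # 832040, computed in ~n steps instead of ~2^n
```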
Best-case, Average-case, Worst-case

Sequential Search
Best Case
k = 7, n = 5, A = [7, 8, 9, 1, 2]
Comparisons = 1
Cbest(n) = 1
Worst Case
k = 2, n = 5, A = [7, 8, 9, 1, 2]
Comparisons = 5 = n
Cworst(n) = n
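A minimal sequential search that returns the number of key comparisons it makes, reproducing the best-case and worst-case counts above:

```python
# Sequential search instrumented to count key comparisons.
def seq_search(a, k):
    comparisons = 0
    for x in a:
        comparisons += 1
        if x == k:
            return comparisons   # found: comparisons made so far
    return comparisons           # not found: n comparisons

print(seq_search([7, 8, 9, 1, 2], 7))  # best case: key first, 1 comparison
print(seq_search([7, 8, 9, 1, 2], 2))  # worst case: key last, n = 5 comparisons
```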
Average Case
k = 7, n = 5, A = [1, 2, 3, 7, 8]
Let p be the probability of a successful search (0 ≤ p ≤ 1), with the key
equally likely to be at any of the n positions.
Cavg(n) = [1·p/n + 2·p/n + … + i·p/n + … + n·p/n] + n·(1 − p)
        = p/n [1 + 2 + … + n] + n(1 − p)
        = p/n · n(n + 1)/2 + n(1 − p)
        = p(n + 1)/2 + n(1 − p)
Successful search: p = 1 ⇒ Cavg(n) = (n + 1)/2
Unsuccessful search: p = 0 ⇒ Cavg(n) = n
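A quick numerical check of the successful-search case (p = 1): if the key is equally likely to be at each of the n positions, the average number of comparisons is (1 + 2 + … + n)/n = (n + 1)/2.

```python
# Average comparisons for a successful sequential search, averaging
# over all n equally likely key positions: (1 + 2 + ... + n) / n.
def avg_comparisons(n):
    return sum(range(1, n + 1)) / n

print(avg_comparisons(5))  # 3.0, which equals (5 + 1) / 2
```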
Best-case, Average-case, Worst-case

(a) Best Case
- Shortest time that an algorithm will take on a given
problem to produce the desired result.
T(n) = Ω(f(n))
(b) Average Case
- Average time that an algorithm will take on a given
problem.
(c) Worst Case
- Longest time that an algorithm will take on an input
of size n to produce the desired result.
T(n) = O(f(n))
Asymptotic Analysis

 Analysis of a given algorithm for large values of the
input data
 Based on the theory of approximation
 An asymptote of a curve is a line that closely
approximates the curve but does not touch it at any
point
 Specifies the behavior of the algorithm as the input
size increases
3. Asymptotic Notation

Asymptotic order is concerned with how the running
time of an algorithm increases as the input size grows
from small to large values
1. Big-Oh notation (O)
2. Big-Omega notation (Ω)
3. Theta notation (θ)
4. Little-oh notation (o)
5. Little-omega notation (ω)
Big-Oh Notation (O)
 Big-Oh notation is used to define the worst-case
running time of an algorithm and is concerned with large
values of n.
Definition: A function f(n) is said to be in O(g(n)), denoted
f(n) ϵ O(g(n)), if f(n) is bounded above by some
constant multiple of g(n) for all large n, i.e., if there exist
some positive constant c and some non-negative integer
n0 such that
O(g(n)) = { f(n) : 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }, where c > 0
and n0 ≥ 1.
 f(n) is Big ‘O’ of g(n)  f(n) = O(g(n))
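The definition can be checked numerically for concrete witnesses. As an illustration (the example f(n) = 3n + 2 and the constants c = 4, n0 = 2 are my own, not from the slides), 3n + 2 ϵ O(n) because 3n + 2 ≤ 4n for all n ≥ 2:

```python
# Check f(n) <= c * g(n) for all n in [n0, up_to), i.e. the Big-Oh
# definition for the witness constants c and n0 over a finite range.
def is_upper_bounded(f, g, c, n0, up_to=1000):
    return all(f(n) <= c * g(n) for n in range(n0, up_to))

# 3n + 2 <= 4n holds for every n >= 2, so 3n + 2 is in O(n).
print(is_upper_bounded(lambda n: 3 * n + 2, lambda n: n, c=4, n0=2))  # True
```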
Big-Omega notation (Ω)
 This notation is used to describe the best-case running
time of an algorithm and is concerned with large values of n.
Definition: A function f(n) is said to be in Ω(g(n)), denoted
f(n) ϵ Ω(g(n)), if f(n) is bounded below by some positive
constant multiple of g(n) for all large n, i.e., there exist
some positive constant c and some non-negative integer
n0 such that
f(n) ≥ c·g(n) for all n ≥ n0, where c > 0 and n0 ≥ 1
 It denotes the best-case complexity.
 It represents the lower bound of the resources required
to solve a problem.
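The lower-bound definition can be checked the same way. Reusing the illustrative example f(n) = 3n + 2 (my own choice, not from the slides): 3n + 2 ≥ 3n for all n ≥ 1, so 3n + 2 ϵ Ω(n) with witnesses c = 3, n0 = 1:

```python
# Check f(n) >= c * g(n) for all n in [n0, up_to), i.e. the Big-Omega
# definition for the witness constants c and n0 over a finite range.
def is_lower_bounded(f, g, c, n0, up_to=1000):
    return all(f(n) >= c * g(n) for n in range(n0, up_to))

# 3n + 2 >= 3n holds for every n >= 1, so 3n + 2 is in Omega(n).
print(is_lower_bounded(lambda n: 3 * n + 2, lambda n: n, c=3, n0=1))  # True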
Theta notation (θ)
 Also called asymptotically equal
Definition: A function f(n) is said to be in θ(g(n)), denoted
f(n) ϵ θ(g(n)), if f(n) is bounded both above and below by
some positive constant multiples of g(n) for all large n, i.e., if
there exist some positive constants c1 and c2 and some non-
negative integer n0 such that
c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0, where c1, c2 > 0 and n0 ≥ 1
 f(n) = θ(g(n))
 It denotes the average-case complexity
 f(n) is bounded by g(n) both above and below
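Theta combines both bounds: f(n) must be sandwiched between c1·g(n) and c2·g(n). With the same illustrative example f(n) = 3n + 2 (my own, not from the slides), c1 = 3 and c2 = 4 with n0 = 2 show 3n + 2 ϵ θ(n):

```python
# Check c1 * g(n) <= f(n) <= c2 * g(n) for all n in [n0, up_to),
# i.e. the Theta definition for witnesses c1, c2, n0 over a finite range.
def is_tightly_bounded(f, g, c1, c2, n0, up_to=1000):
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, up_to))

# 3n <= 3n + 2 <= 4n holds for every n >= 2, so 3n + 2 is in Theta(n).
print(is_tightly_bounded(lambda n: 3 * n + 2, lambda n: n, c1=3, c2=4, n0=2))  # True
```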
Properties of O, Ω and θ
General property:
If f(n) is O(g(n)), then a·f(n) is O(g(n)) for any constant
a > 0. Similarly for Ω and θ.
Transitive Property :
If f (n) ϵ O(g(n)) and g(n) ϵ O(h(n)), then f (n) ϵ O(h(n));
that is O is transitive. Also Ω, θ, o and ω are transitive.
Reflexive Property
If f(n) is given then f(n) is O(f(n))
Symmetric Property
If f(n) is θ(g(n)) then g(n) is θ(f(n))
Transpose Property
If f(n) = O(g(n)) then g(n) is Ω(f(n))
Summary

• Asymptotic analysis estimates an algorithm's complexity
• Based on the theory of approximation
• Effective in specifying the behavior of an algorithm as
the input size increases
• Big-Oh notation – upper bound
• Big-Omega notation – lower bound
• Theta notation – tight bound
Thank You
