
Data Structure & Algorithms

CS-201
Lecture 02-Algorithm Complexity

Dr. Muhammad Mobeen Movania


Outline
• Algorithm Complexity
• Time and Space Complexity
• Running Time
• Asymptotic Notation
• Big O, Big Omega, and Big Theta Notation
• Standard measures of efficiency
Algorithm Complexity
• Two ways to measure
– Time complexity
• How much time is required for an algorithm
– Space complexity
• How much space is required for an algorithm

• Usually we work with estimates rather than exact values
Time Complexity
• Generally considered more important than
space complexity

• 3–4 GHz processors are on the market
– still …
– researchers estimate that computing the various transformations
of a single DNA chain for a single protein on a 1 THz computer
would take about a year to run to completion
Running time

[Figure: running times (1–5 ms) for inputs A–G, marking the worst-case,
average-case, and best-case running times]

Suppose the program includes an if-then statement that may execute
or not: variable running time
Typically algorithms are measured by their worst-case performance
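A hypothetical illustration of variable running time (not from the slides, and in Python rather than any language the course prescribes): linear search finishes after one comparison in the best case but must scan the whole input in the worst case.

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if absent."""
    for i, value in enumerate(items):
        if value == target:
            return i          # best case: target at index 0 -> 1 comparison
    return -1                 # worst case: target absent -> n comparisons

data = [3, 1, 4, 1, 5, 9, 2, 6]
print(linear_search(data, 3))   # best case: found at the first position
print(linear_search(data, 7))   # worst case: all n elements are examined
```

Because the number of comparisons depends on the input, the algorithm is usually described by its worst-case count, n.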
Use a Theoretical Approach
• Based on a high-level description of the
algorithm, rather than language-dependent
implementations
• Makes possible an evaluation of the
algorithms that is independent of the
hardware and software environments
⇒ Generality
Asymptotic Notation
• Goal: We want to simplify analysis by getting
rid of unneeded information (like “rounding”
1,000,001 ≈ 1,000,000) or how data is
represented in memory
• We want to say in a formal way 3n² ≈ n²
• The “Big-Oh” Notation:
– given functions f(n) and g(n), we say that f(n) is
O(g(n)) if and only if there are positive constants c
and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀
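The definition above can be sketched as a small Python check (an illustration, not from the slides): it verifies the inequality f(n) ≤ c·g(n) over a finite range for chosen witnesses c and n₀, here showing that 3n² is O(n²).

```python
# Demo of the Big-Oh definition: f(n) is O(g(n)) if there exist
# constants c > 0 and n0 such that f(n) <= c * g(n) for all n >= n0.
# Checking a finite range is a sanity check, not a proof.
def bounded_by(f, g, c, n0, n_max=1000):
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 3 * n * n     # f(n) = 3n^2
g = lambda n: n * n         # g(n) = n^2
print(bounded_by(f, g, c=3, n0=1))   # True: 3n^2 <= 3 * n^2 for all n >= 1
```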
“Relatives” of Big-Oh

• O(f(n)): Big O – asymptotic upper bound

• “Relatives” of the Big-Oh
– Ω(f(n)): Big Omega – asymptotic lower bound
– Θ(f(n)): Big Theta – asymptotic tight bound

• Big-Omega – think of it as the inverse of Big-Oh
– g(n) is Ω(f(n)) if f(n) is O(g(n))

• Big-Theta – combines both Big-Oh and Big-Omega
– f(n) is Θ(g(n)) if f(n) is O(g(n)) and g(n) is O(f(n))
Asymptotic Notation
• f(x) = O(g(x)) (big-oh) means that the growth rate of f(x) is
asymptotically less than or equal to the growth rate of g(x).

• f(x) = Ω(g(x)) (big-omega) means that the growth rate of f(x) is
asymptotically greater than or equal to the growth rate of g(x).

• f(x) = o(g(x)) (small-oh) means that the growth rate of f(x) is
asymptotically less than the growth rate of g(x).

• f(x) = ω(g(x)) (small-omega) means that the growth rate of f(x) is
asymptotically greater than the growth rate of g(x).

• f(x) = Θ(g(x)) (theta) means that the growth rate of f(x) is
asymptotically equal to the growth rate of g(x).
Big O Notation
• Can be derived from f(n) as follows:
1. For each term of f(n), set the coefficient to 1
2. Keep the largest term in the function and
discard the others.
• Terms are ranked from lowest (left) to highest
(right):
log₂n   n   n·log₂n   n²   n³   …   nᵏ   2ⁿ   n!
Fastest (left) → Slowest (right)
Example
• Calculate the Big O notation for a dependent
quadratic loop with efficiency f(n) = n(n+1)/2
• First expand all terms: n²/2 + n/2
• Remove the coefficients: n² + n
• Take the larger term: n²
• So the asymptotic upper bound is O(n²)
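A short Python sketch (an illustration added here, not from the slides) counts the iterations of such a dependent quadratic loop and checks them against the closed form n(n+1)/2:

```python
# A dependent quadratic loop: the inner loop bound depends on the
# outer loop variable, so the inner body runs 1 + 2 + ... + n times.
def count_iterations(n):
    count = 0
    for i in range(1, n + 1):       # outer loop: i = 1..n
        for j in range(1, i + 1):   # inner loop runs i times
            count += 1
    return count

n = 10
print(count_iterations(n))          # 55
print(n * (n + 1) // 2)             # 55 -- matches f(n) = n(n+1)/2
```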
Example
• Matrix addition has asymptotic upper bound O(n²)
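A minimal Python sketch of the addition of two n × n matrices (an illustration, not code from the slides): two nested loops over n rows and n columns give n² element-wise additions.

```python
# Matrix addition over n x n matrices: the doubly nested loop performs
# n * n additions, hence the O(n^2) upper bound.
def matrix_add(A, B):
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):          # n iterations
        for j in range(n):      # n iterations each -> n^2 total
            C[i][j] = A[i][j] + B[i][j]
    return C

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matrix_add(A, B))         # [[6, 8], [10, 12]]
```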
Example
• Matrix multiplication has asymptotic upper
bound O(n³)
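The classical n × n matrix multiplication algorithm can be sketched in Python as three nested loops (an illustration, not code from the slides), giving n³ multiply-add steps:

```python
# Classical matrix multiplication: for each of the n^2 output cells,
# an inner loop of n multiply-adds runs -> n^3 steps, hence O(n^3).
def matrix_multiply(A, B):
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):              # innermost loop: n^3 total
                C[i][j] += A[i][k] * B[k][j]
    return C

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matrix_multiply(A, B))    # [[19, 22], [43, 50]]
```

Faster algorithms (e.g. Strassen's) exist, but this triple loop is the standard baseline the O(n³) bound refers to.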
Standard Measures of Efficiency
• Scientists have identified the following seven
categories of algorithm efficiencies (n = 10,000)
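The slides' table of values is not reproduced in this extraction; assuming the usual seven categories (log n, n, n log n, n², n³, 2ⁿ, n!), the following Python sketch prints their magnitudes for n = 10,000 to show how quickly they diverge:

```python
import math

# Values of the standard efficiency categories at n = 10,000.
# 2^n and n! are far too large to print in full; we report digit counts.
n = 10_000
print(f"log2 n  = {math.log2(n):.2f}")        # about 13.29
print(f"n       = {n}")
print(f"n log n = {n * math.log2(n):.0f}")    # about 132,877
print(f"n^2     = {n ** 2}")                  # 100,000,000
print(f"n^3     = {n ** 3}")                  # 1,000,000,000,000
print(f"2^n has {len(str(2 ** n))} digits")   # 3,011 digits
# n! is larger still; even 2^n is already astronomically beyond reach.
```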
Plot of Efficiency Measures
Next Lecture
• Lecture 03-Searching Algorithms
Dependent Quadratic – Inner loop

i    j values      Times j loop runs
1    1             1
2    1,2           2
3    1,2,3         3
4    1,2,3,4       4
5    1,2,…,5       5
6    1,2,…,6       6
7    1,2,…,7       7
8    1,2,…,8       8
9    1,2,…,9       9
10   1,2,…,10      10

Total: 1+2+3+4+5+6+7+8+9+10 = 55
Average: (1+10)/2 = 5.5
Average generalized: (n+1)/2
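The table's totals can be checked with a few lines of Python (an illustration, not from the slides): for n = 10 the inner loop runs 55 times in total, an average of (n+1)/2 = 5.5 runs per outer iteration.

```python
# Verifying the table: at outer step i, the inner loop runs i times,
# so the total over i = 1..n is 1 + 2 + ... + n.
n = 10
runs = [i for i in range(1, n + 1)]   # inner-loop runs at each outer step
total = sum(runs)
print(total)                # 55
print(total / n)            # 5.5 -- average runs per outer iteration
print((n + 1) / 2)          # 5.5 -- matches the generalized (n+1)/2
```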
