Asymptotic Analysis
David Kaplan
Omega — lower bound: T(n) = Ω(f(n)) iff there exist positive constants c and n0 such that T(n) ≥ c·f(n) for all n ≥ n0.
Theta — tight bound: T(n) = Θ(f(n)) iff T(n) = O(f(n)) AND T(n) = Ω(f(n)).
little-o — strict upper bound: T(n) = o(f(n)) iff T(n) = O(f(n)) AND T(n) ≠ Θ(f(n)).
n^2 + 100n = Ω(n^2) — follows from (n^2 + 100n) ≥ 1·n^2 for n ≥ 0
n^2 + 100n = Θ(n^2) — by definition, since it is both O(n^2) and Ω(n^2)
n log n = O(n^2)
n log n = Θ(n log n)
n log n = Ω(n)
Asymptotic Analysis CSE 326 Autumn 2001 6
Big-O Usage
Order notation is not symmetric: we can say 2n^2 + 4n = O(n^2), but never O(n^2) = 2n^2 + 4n. The right-hand side is a crudification of the left.
Races:
n^0.1 vs. log n
n + 100·n^0.1 vs. 2n + 10 log n
5n^5 vs. n!
n^(−15)·2^n/100 vs. 1000·n^15
8^(2 log n) vs. 3n^7 + 7n

In the race between n^(−15)·2^n/100 and 1000·n^15, the crossover point is very late! So, which algorithm is really better???
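The late crossover can be located numerically. A minimal sketch, comparing n^(−15)·2^n/100 against 1000·n^15 in log space to avoid overflow; the helper names (logF, logG, crossover) are mine, not from the slides:

```cpp
#include <cassert>
#include <cmath>

// f(n) = n^(-15) * 2^n / 100 and g(n) = 1000 * n^15, compared via their
// natural logs so the huge values never overflow a double.
double logF(double n) { return n * std::log(2.0) - 15.0 * std::log(n) - std::log(100.0); }
double logG(double n) { return std::log(1000.0) + 15.0 * std::log(n); }

// Return the first integer n at which the exponential finally overtakes
// the polynomial, or -1 if it never does within the scanned range.
int crossover() {
    for (int n = 1; n < 10000; ++n) {
        if (logF(n) > logG(n)) return n;
    }
    return -1;
}
```

Scanning n upward shows the exponential only wins somewhere past n = 200, which is why eyeballing small inputs can mislead.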
Race C: n + 100·n^0.1 vs. 2n + 10 log n
Race D: 5n^5 vs. n! — 5n^5 wins in the long run, since it is O(n^5) while n! outgrows every polynomial
Distinguish:
Worst case — your worst enemy chooses the input
Average case — assume a probability distribution over the inputs
Amortized — average time per operation over many consecutive operations
Best case — not too useful
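Amortized cost is easiest to see with the classic doubling-array example (an illustration of the idea, not something this slide specifies): one push may trigger an O(n) copy, yet the total copying over n pushes stays under 2n, so the amortized cost per push is constant. countCopies is a hypothetical helper:

```cpp
#include <cassert>
#include <cstddef>

// Simulate n pushes onto a dynamic array that doubles its capacity when
// full, and count how many element copies the regrowth operations perform.
std::size_t countCopies(std::size_t pushes) {
    std::size_t capacity = 1, size = 0, copies = 0;
    for (std::size_t i = 0; i < pushes; ++i) {
        if (size == capacity) {   // full: double the capacity, copy all
            copies += size;       // `size` elements move to the new storage
            capacity *= 2;
        }
        ++size;
    }
    return copies;
}
```

For 1000 pushes the copies total 1 + 2 + 4 + … + 512 = 1023 < 2·1000, i.e. amortized O(1) per push despite occasional O(n) spikes.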
Analyzing Code
C++ operations — constant time
Consecutive statements — sum of the individual times
Conditionals — time of the test plus the larger branch
Loops — sum of the cost of each iteration
Function calls — cost of the function body
Recursive functions — solve a recurrence
Σ_{i=1}^{n} Σ_{j=i}^{n} 1 = Σ_{i=1}^{n} (n − i + 1) = Σ_{i=1}^{n} (n + 1) − Σ_{i=1}^{n} i = n(n+1) − n(n+1)/2 = n(n+1)/2 = (n^2 + n)/2 = O(n^2)
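The nested-loop sum can be checked by actually running the loop and counting body executions; nestedLoopCount is an illustrative name, not from the slides:

```cpp
#include <cassert>

// Count iterations of the doubly nested loop being analyzed:
// for i in 1..n, for j in i..n, do an O(1) body.
// The inner loop runs n - i + 1 times for each i, so the total is n(n+1)/2.
int nestedLoopCount(int n) {
    int count = 0;
    for (int i = 1; i <= n; ++i)
        for (int j = i; j <= n; ++j)
            ++count;
    return count;
}
```

For n = 10 the count is 55 = 10·11/2, matching the closed form.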
One subproblem
Linear reduction in size (decrease by 1)
Combining: constant (cost of 1 add)
T(0) = b
T(n) = c + T(n−1) for n > 0

Solution:
T(n) = c + T(n−1)
     = c + c + T(n−2)
     = c + c + c + T(n−3)
     = kc + T(n−k) for all k
     = nc + T(0) for k = n
     = cn + b = O(n)
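A sketch matching the recurrence T(n) = c + T(n−1): a recursive sum in which a counter records the number of invocations, which grows linearly in n. The names sumTo and calls are mine, not from the slides:

```cpp
#include <cassert>

// Sum 1..n by reducing the problem size by 1 each call:
// one add (cost c) plus one recursive call on a problem of size n - 1.
int calls = 0;  // counts invocations, exposing the linear call depth

int sumTo(int n) {
    ++calls;
    if (n == 0) return 0;      // base case: constant cost b
    return n + sumTo(n - 1);   // combining: constant (cost of 1 add)
}
```

Computing sumTo(n) makes exactly n + 1 calls, mirroring the nc + b = O(n) solution.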
Binary search over a sorted array, e.g.: 7 12 30 35 75 83 87 90 97 99
Recurrence: T(1) = b, and T(n) = T(n/2) + c for n > 1.

Solution:
T(n) = T(n/2) + c
     = T(n/4) + c + c
     = T(n/8) + c + c + c
     = T(n/2^k) + kc
     = T(1) + c log n, where k = log n
     = b + c log n = O(log n)
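The recurrence T(n) = T(n/2) + c describes binary search on a sorted array like the one above. A sketch with a step counter showing the range is halved at most about log2(n) + 1 times; the function and variable names are mine, not from the slides:

```cpp
#include <cassert>
#include <vector>

// Binary search over a sorted vector. `steps` counts how many times the
// search range is halved, which stays within floor(log2(n)) + 1.
int binarySearch(const std::vector<int>& a, int key, int& steps) {
    int lo = 0, hi = static_cast<int>(a.size()) - 1;
    steps = 0;
    while (lo <= hi) {
        ++steps;
        int mid = lo + (hi - lo) / 2;   // avoids overflow of lo + hi
        if (a[mid] == key) return mid;
        if (a[mid] < key) lo = mid + 1; // key is in the upper half
        else              hi = mid - 1; // key is in the lower half
    }
    return -1;  // not found
}
```

On the 10-element array above, any search finishes in at most 4 halvings, consistent with b + c log n = O(log n).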