Lecture 02
Asymptotic Notation
Asymptotic Complexity
Running time of an algorithm as a function of
input size n, for large n.
Expressed using only the highest-order term in
the expression for the exact running time.
Instead of the exact running time, we say Θ(n²).
Describes the behavior of the function in the limit.
Written using asymptotic notation.
Asymptotic Notation
Θ, O, Ω, o, ω
Defined for functions over the natural numbers.
Ex: f(n) = Θ(n²).
Describes how f(n) grows in comparison to n².
Each notation defines a set of functions; in practice it is
used to compare the sizes of two functions.
The notations describe different rate-of-growth
relations between the defining function and the
defined set of functions.
Θ-notation
For a function g(n), we define Θ(g(n)),
big-Theta of g(n), as the set:
Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0
such that ∀ n ≥ n0,
we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
Intuitively: the set of all functions that
have the same rate of growth as g(n).
Θ-notation
For a function g(n), we define Θ(g(n)),
big-Theta of g(n), as the set:
Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0
such that ∀ n ≥ n0,
we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
Technically, f(n) ∈ Θ(g(n)).
Older usage: f(n) = Θ(g(n)).
I’ll accept either…
10n² - 3n = Θ(n²)
What constants for n0, c1, and c2 will work?
Make c1 a little smaller than the leading
coefficient, and c2 a little bigger.
To compare orders of growth, look at the
leading term.
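A sketch of one possible choice for this example (the constants are not unique):

$$10n^2 - 3n \;\le\; 10n^2 \quad \text{for all } n \ge 1, \qquad \text{so } c_2 = 10 \text{ works};$$
$$10n^2 - 3n \;\ge\; 9n^2 \iff n^2 \ge 3n \iff n \ge 3, \qquad \text{so } c_1 = 9,\ n_0 = 3 \text{ work}.$$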
Exercise: Prove that n²/2 - 3n = Θ(n²).
Example
Θ(g(n)) = { f(n) : ∃ positive constants c1, c2, and n0,
such that ∀ n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) }
Is 3n³ ∈ Θ(n⁴)?
How about 2²ⁿ ∈ Θ(2ⁿ)?
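For reference, a sketch of the answers (using the ratio of the two functions as a quick test):

$$\frac{3n^3}{n^4} = \frac{3}{n} \to 0: \quad 3n^3 \text{ is } O(n^4) \text{ but not } \Omega(n^4), \text{ so } 3n^3 \notin \Theta(n^4).$$
$$\frac{2^{2n}}{2^n} = 2^n \to \infty: \quad 2^{2n} \text{ is } \Omega(2^n) \text{ but not } O(2^n), \text{ so } 2^{2n} \notin \Theta(2^n).$$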
O-notation
For a function g(n), we define O(g(n)),
big-O of g(n), as the set:
O(g(n)) = { f(n) : ∃ positive constants c and n0
such that ∀ n ≥ n0,
we have 0 ≤ f(n) ≤ c·g(n) }
Intuitively: the set of all functions
whose rate of growth is the same as
or lower than that of g(n).
g(n) is an asymptotic upper bound for f(n).
f(n) = Θ(g(n)) ⇒ f(n) = O(g(n)).
Θ(g(n)) ⊂ O(g(n)).
Ω-notation
For a function g(n), we define Ω(g(n)),
big-Omega of g(n), as the set:
Ω(g(n)) = { f(n) : ∃ positive constants c and n0
such that ∀ n ≥ n0,
we have 0 ≤ c·g(n) ≤ f(n) }
Intuitively: the set of all functions
whose rate of growth is the same
as or higher than that of g(n).
g(n) is an asymptotic lower bound for f(n).
f(n) = Θ(g(n)) ⇒ f(n) = Ω(g(n)).
Θ(g(n)) ⊂ Ω(g(n)).
Example
Ω(g(n)) = { f(n) : ∃ positive constants c and n0, such
that ∀ n ≥ n0, we have 0 ≤ c·g(n) ≤ f(n) }
Relations Between Θ, O, Ω
Relations Between Θ, Ω, O
Theorem: For any two functions g(n) and f(n),
f(n) = Θ(g(n)) iff
f(n) = O(g(n)) and f(n) = Ω(g(n)).
I.e., Θ(g(n)) = O(g(n)) ∩ Ω(g(n))
Example
Insertion sort takes Θ(n²) in the worst case, so
sorting (as a problem) is O(n²). Why?
Asymptotic Notation in Equations
Can use asymptotic notation in equations to
replace expressions containing lower-order terms.
For example,
4n³ + 3n² + 2n + 1 = 4n³ + 3n² + Θ(n)
= 4n³ + Θ(n²) = Θ(n³). How to interpret?
In equations, Θ(f(n)) always stands for an
anonymous function g(n) ∈ Θ(f(n)).
In the example above, Θ(n²) stands for
3n² + 2n + 1.
o-notation
For a given function g(n), the set little-o:
o(g(n)) = { f(n) : ∀ c > 0, ∃ n0 > 0 such that
∀ n ≥ n0, we have 0 ≤ f(n) < c·g(n) }.
f(n) becomes insignificant relative to g(n) as n
approaches infinity:
lim (n→∞) [f(n) / g(n)] = 0
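For example (a quick check against the limit form):

$$\frac{n}{n^2} = \frac{1}{n} \to 0, \text{ so } n = o(n^2); \qquad \frac{2n^2}{n^2} = 2 \not\to 0, \text{ so } 2n^2 \ne o(n^2).$$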
Comparison of Functions
f(n) vs. g(n)        a vs. b
f(n) = O(g(n))   ≈   a ≤ b
f(n) = Ω(g(n))   ≈   a ≥ b
f(n) = Θ(g(n))   ≈   a = b
f(n) = o(g(n))   ≈   a < b
f(n) = ω(g(n))   ≈   a > b
Algorithm Definition
Good Algorithms?
Measuring Efficiency
Is it correct?
Factors
Hardware
Operating System
Compiler
Size of input
Nature of Input
Algorithm
RUNNING TIME OF AN ALGORITHM
Depends upon
• Input size
• Nature of input
Generally, time grows with the size of the input, so the
running time of an algorithm is usually measured
as a function of the input size.
Running time is measured in terms of the number of
steps/primitive operations performed.
Independent of machine and OS.
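As an illustration of counting primitive operations (a hypothetical C snippet, not one of the slides' own examples): summing the n elements of an array.

int sum_array(const int a[], int n) {
    int i, sum = 0;            /* 2 assignments                             */
    for (i = 0; i < n; i++)    /* 1 init, n+1 comparisons, n increments     */
        sum += a[i];           /* n additions and n assignments             */
    return sum;                /* total ≈ 4n + 4 steps, depending on what   */
}                              /* counts as one step => linear, i.e. Θ(n)   */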
Finding running time of an Algorithm / Analyzing an
Algorithm
Simple Example
Simple Example
What Dominates in Previous Example?
Asymptotic Complexity
COMPARING FUNCTIONS: ASYMPTOTIC NOTATION
Big Oh Notation
Big Oh Notation [2]
Big-Oh Notation
Simple rule:
Drop lower-order terms and constant factors.
7n - 3 is O(n)
8n² log n + 5n² + n is O(n² log n)
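A sketch of why these hold, with one possible choice of constants in each case (assuming log means log base 2):

$$7n - 3 \;\le\; 7n \quad \text{for all } n \ge 1, \qquad \text{so } c = 7,\ n_0 = 1 \text{ work};$$
$$8n^2\log n + 5n^2 + n \;\le\; 14\,n^2\log n \quad \text{for all } n \ge 2 \ (\text{where } \log n \ge 1), \qquad \text{so } c = 14,\ n_0 = 2 \text{ work}.$$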
Big Omega Notation
If we want to say “the running time is at least…”, we use Ω.
Big Omega notation, Ω, is used to express a lower bound on a function.
If f(n) and g(n) are two complexity functions, then we can say:
f(n) is Ω(g(n)) if there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0.
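For instance, a sketch with one possible choice of constants (not the only one): 5n² - 3n is Ω(n²).

$$5n^2 - 3n \;\ge\; 2n^2 \iff 3n^2 \ge 3n \iff n \ge 1, \qquad \text{so } c = 2,\ n_0 = 1 \text{ work}.$$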
Big Theta Notation
What does this all mean?
Which Notation do we use?
Performance Classification
f(n)       Classification
1          Constant: run time is fixed and does not depend on n. Most instructions are executed once,
           or only a few times, regardless of the amount of information being processed.
log n      Logarithmic: when n increases, so does the run time, but much more slowly. Common in programs
           that solve large problems by transforming them into smaller problems. Ex: binary search.
n          Linear: run time varies directly with n. Typically, a small amount of processing is done on
           each element. Ex: linear search.
n log n    When n doubles, the run time slightly more than doubles. Common in programs that break a problem
           into smaller sub-problems, solve them independently, then combine the solutions. Ex: merge sort.
n²         Quadratic: when n doubles, the run time increases fourfold. Practical only for small problems;
           typically the program processes all pairs of inputs (e.g. in a doubly nested loop). Ex: insertion sort.
2ⁿ         Exponential: when n doubles, the run time squares. Often the result of a natural “brute force”
           solution. Ex: brute-force search.
Note: log n, n, n log n, and n² grow slowly enough to remain practical for large inputs;
n³ and especially 2ⁿ grow so quickly that they are practical only for small inputs.
Size does matter[1]
N      log₂N   5N      N log₂N   N²       2^N
8      3       40      24        64       256
16     4       80      64        256      65536
32     5       160     160       1024     ~10⁹
64     6       320     384       4096     ~10¹⁹
128    7       640     896       16384    ~10³⁸
256    8       1280    2048      65536    ~10⁷⁶
Complexity Classes
[Figure: growth rates of the complexity classes, time (steps) vs. input size]
Analyzing Loops[1]
Any loop has two parts:
How many iterations are performed?
How many steps per iteration?
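For example (a hypothetical C loop, as a sketch): the number of iterations is not always the loop bound itself.

int i = N, count = 0;
while (i > 1) {      /* i is halved each pass: about log2(N) iterations */
    count++;         /* constant number of steps per iteration          */
    i = i / 2;
}
/* total ≈ c * log2(N) steps => O(log N) */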
Analyzing Nested Loops[1]
Treat just like a single loop and evaluate each
level of nesting as needed:
int j, k, sum = 0;
for (j = 0; j < N; j++)       /* outer loop: N iterations          */
    for (k = N; k > 0; k--)   /* inner loop: N iterations per pass */
        sum += k + j;         /* constant work in the body         */
/* total: N * N = N² inner-body executions => O(N²) */
Sequence of Statements
Details
A Big Warning
Do NOT ignore constants that are not
multipliers:
n³ is O(n²) is FALSE.
3ⁿ is O(2ⁿ) is FALSE.
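A quick sketch of why, via the ratio of the two functions:

$$\frac{n^3}{n^2} = n \to \infty \quad\text{and}\quad \frac{3^n}{2^n} = \left(\tfrac{3}{2}\right)^{n} \to \infty,$$

so no single constant c can satisfy n³ ≤ c·n² or 3ⁿ ≤ c·2ⁿ for all large n.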
Reflexivity
f(n) = Θ(f(n))
f(n) = O(f(n))
f(n) = Ω(f(n))
Properties (Homework)
Symmetry
f(n) = Θ(g(n)) iff g(n) = Θ(f(n))
Complementarity
f(n) = O(g(n)) iff g(n) = Ω(f(n))
f(n) = o(g(n)) iff g(n) = ω(f(n))
Common Functions
Reading Assignment
Chapters 1-2 of the recommended textbook.