L2-Mathematical Background
Overview
• Rate of growth of functions.
• When we look at input sizes large enough to make only the order of
growth of the running time relevant, we are studying the asymptotic
efficiency of algorithms. That is, we are concerned with how the
running time of an algorithm increases with the size of the input in the
limit, as the size of the input increases without bound.
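• A minimal sketch (an added illustration, not from the slides): a Θ(n log n) algorithm with a large constant factor loses to a Θ(n²) algorithm for small n, but the order of growth decides the winner once n is large enough:

    import math

    # Hypothetical running-time formulas: 100*n*lg(n) vs. n**2.
    # The crossover is near n = 1000; beyond it, order of growth wins.
    for n in (10, 100, 1_000, 10_000, 100_000):
        t_nlogn = 100 * n * math.log2(n)
        t_quad = n ** 2
        print(f"n={n:>7}: 100 n lg n = {t_nlogn:>14.0f}   n^2 = {t_quad:>14.0f}")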
• If f(x) and g(x) are two functions, then the symbols o, O, Θ, and Ω
are used to compare the rapidity of growth of f and g.
o – little oh
• We say that f(x) = o(g(x)) if f(x)/g(x) → 0 as x → ∞.
• Alternative definition: f(x) = o(g(x)) if for every constant c > 0 there
is an x0 such that |f(x)| ≤ c·|g(x)| for all x ≥ x0.
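• A quick worked instance of the definition (an added example, not from the original slides), in LaTeX:

    x = o(x^2) \text{ as } x \to \infty,\ \text{since } \tfrac{x}{x^2} = \tfrac{1}{x} \to 0;
    \qquad x \neq o(2x),\ \text{since } \tfrac{x}{2x} \to \tfrac{1}{2} \neq 0.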
• When f(x) = Θ(g(x)), f(x) and g(x) are of the same growth rate; only
multiplicative constants are uncertain.
Θ - notation
• The figure accompanying this slide (not reproduced here) shows
functions f(n) and g(n), where we have f(n) = Θ(g(n)). For all values
of n to the right of n0, the value of f(n) lies at or above c1·g(n)
and at or below c2·g(n). In other words, for all n ≥ n0,
the function f(n) is equal to g(n) to within a constant
factor. We say that g(n) is an asymptotically tight
bound for f(n).
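• For reference, the set definition behind the constants c1, c2, and n0 above (the standard CLRS form, written here in LaTeX):

    \Theta(g(n)) = \{\, f(n) : \exists\, c_1, c_2, n_0 > 0 \text{ such that }
    0 \le c_1 g(n) \le f(n) \le c_2 g(n) \text{ for all } n \ge n_0 \,\}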
Θ – notation: Problem 1
• Show that
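• The formula for this problem did not survive in the source; as an assumed stand-in, here is the classic CLRS exercise of the same kind, worked with explicit constants in LaTeX:

    \tfrac{1}{2}n^2 - 3n = \Theta(n^2): \text{ take } c_1 = \tfrac{1}{14},\ c_2 = \tfrac{1}{2},\ n_0 = 7;
    \text{then } 0 \le c_1 n^2 \le \tfrac{1}{2}n^2 - 3n \le c_2 n^2 \text{ for all } n \ge 7.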
O-notation (big oh)
• Since O-notation describes an upper bound, when we use it to bound the worst-case
running time of an algorithm, we have a bound on the running time of the algorithm
on every input.
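• For reference, the standard set definition of O (CLRS form, written here in LaTeX):

    O(g(n)) = \{\, f(n) : \exists\, c, n_0 > 0 \text{ such that }
    0 \le f(n) \le c\, g(n) \text{ for all } n \ge n_0 \,\}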
• Prove the following:
a) sin x = O(x)
b) sin x = O(1)
c) x^3 + 5x^2 + 77·cos x = O(x^5)
d) 1/(1 + x^2) = O(1)
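• A proof sketch for parts (a) and (b), straight from the definition (added example, in LaTeX):

    |\sin x| \le 1 \text{ for all } x \ \Rightarrow\ \sin x = O(1) \quad (c = 1);
    \qquad |\sin x| \le |x| \text{ for all } x \ \Rightarrow\ \sin x = O(x) \quad (c = 1).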
Ω - notation
• We use Ω-notation when we have only an asymptotic lower bound.
• Definition: For a given function g(n), we denote by Ω(g(n))
(pronounced big omega of g of n) the set of functions
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that
0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }.
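• A quick worked instance of the definition (added example, in LaTeX):

    n = \Omega(\sqrt{n}): \text{ take } c = 1,\ n_0 = 1;
    \text{ then } 0 \le c\sqrt{n} \le n \text{ for all } n \ge 1.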
Comparison of functions
• Reflexivity property: f(n) = Θ(f(n)); likewise f(n) = O(f(n)) and f(n) = Ω(f(n)).
• Symmetry property: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).
• Transpose symmetry property: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)).
• Because many of the relational properties hold for asymptotic
notations, one can draw an analogy between the asymptotic
comparison of two functions f and g and the comparison of two real
numbers a and b:
  f(n) = O(g(n)) is like a ≤ b
  f(n) = Ω(g(n)) is like a ≥ b
  f(n) = Θ(g(n)) is like a = b
  f(n) = o(g(n)) is like a < b
Trichotomy property
• For any two real numbers a and b, exactly one of a < b, a = b, or a > b
holds. This trichotomy property does not carry over to asymptotic notations:
two functions need not be asymptotically comparable. For example, n and
n^(1+sin n) cannot be compared, since the exponent 1 + sin n oscillates
between 0 and 2.
Functions that grow without bound as x → ∞
1. The slowest-growing ones are functions like
f(x) = log log x
log log x → ∞ as x → ∞
2. f(x) = log x
log x → ∞ as x → ∞, but grows a bit faster than log log x.
Hence, as a running time, log x is worse than log log x.
3. f(x) = x^0.01
x^0.01 → ∞ as x → ∞, but grows faster than log x.
Assignment: Prove that f(x) = x^0.01 grows faster than log x.
Hint: use L'Hôpital's rule; see the sketch below.
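• A sketch of the limit computation the hint points to (added; ε = 0.01 is the case in the assignment), in LaTeX:

    \lim_{x \to \infty} \frac{\log x}{x^{\varepsilon}}
      = \lim_{x \to \infty} \frac{1/x}{\varepsilon x^{\varepsilon - 1}}
      = \lim_{x \to \infty} \frac{1}{\varepsilon x^{\varepsilon}} = 0
      \qquad (\varepsilon > 0),

so log x = o(x^ε) for every fixed ε > 0; in particular, x^0.01 grows faster than log x.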
Functions that grow without bound as x → ∞
4. Polynomial functions
f(x) = x^n, such as x, x^2, x^15, and combinations like x^16·log x, x^16·log^2 x, …
x^2 → ∞ as x → ∞
5. Then we encounter functions which grow faster than every fixed power
of x, just as log x grows slower than every fixed power of x.
Consider f(x) = e^(log² x) (e raised to the square of the log of x).
Since this is the same as x^(log x), it obviously grows faster than x^1000:
it is larger than x^1000 as soon as log x > 1000, that is, as soon as
x > e^1000. Therefore, e^(log² x) is an example of a function that grows
faster than every fixed power of x.
Another example is e^√x.
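• A minimal numeric sketch (added; not from the slides) of the hierarchy above. For the two fastest growers we print log10 of the value, so nothing overflows:

    import math

    # Growth of the functions discussed above. For x^(log x) and e^(sqrt x)
    # we print log10(f(x)): log10(x^ln x) = (ln x)^2 / ln 10 and
    # log10(e^sqrt(x)) = sqrt(x) / ln 10.
    # Note: x**0.01 overtakes log x only for astronomically large x;
    # asymptotic statements are about the limit, not a small table.
    print(f"{'x':>8} {'loglog x':>9} {'log x':>7} {'x^0.01':>7} "
          f"{'lg10 x^logx':>12} {'lg10 e^sqrt(x)':>15}")
    for x in (1e2, 1e4, 1e6, 1e8):
        lnx = math.log(x)
        print(f"{x:>8.0e} {math.log(lnx):>9.2f} {lnx:>7.1f} {x**0.01:>7.2f} "
              f"{lnx*lnx/math.log(10):>12.1f} {math.sqrt(x)/math.log(10):>15.1f}")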
Kinds of analyses
Worst-case: (usually)
• T(n) = maximum time of algorithm on any input
of size n.
Average-case: (sometimes)
• T(n) = expected time of algorithm over all
inputs of size n.
• Need assumption of statistical distribution of
inputs.
Best-case: (NEVER)
• Cheat with a slow algorithm that works fast on
some input.
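• A minimal sketch (added; linear search is used as a stand-in example, not taken from the slides) showing how one algorithm has different best-, average-, and worst-case costs, and why the average case needs a distribution assumption:

    import random

    def comparisons(a, target):
        # Number of comparisons linear search makes on this input.
        for i, x in enumerate(a):
            if x == target:
                return i + 1
        return len(a)

    n = 1000
    a = list(range(n))
    print("best case:    ", comparisons(a, 0))    # first element: 1 comparison
    print("worst case:   ", comparisons(a, -1))   # absent target: n comparisons
    # Average case under an assumed uniform distribution over elements:
    trials = 10_000
    avg = sum(comparisons(a, random.choice(a)) for _ in range(trials)) / trials
    print("average case: ~", avg)                 # about (n + 1) / 2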