
Mathematical Background

Overview
• Rate of growth of functions.

• Symbols of asymptotics that are used to describe rates of growth.

• We need a way to compare the speeds with which different algorithms do the same job, or the amounts of memory they use.
Order of Magnitude
• The order of growth of the running time of an algorithm gives a
simple characterization of the algorithm’s efficiency and also allows
us to compare the relative performance of alternative algorithms.

• When we look at input sizes large enough to make only the order of
growth of the running time relevant, we are studying the asymptotic
efficiency of algorithms. That is, we are concerned with how the
running time of an algorithm increases with the size of the input in the
limit, as the size of the input increases without bound.

• Usually, an algorithm that is asymptotically more efficient will be the best choice for all but very small inputs.
Asymptotic notation
• The notations we use to describe the asymptotic running time of an algorithm are defined in terms of functions whose domains are the set of natural numbers.

• Such notations are convenient for describing the worst-case running-time function T(n), which is usually defined only on integer input sizes.
Symbols
• o – little oh of …
• O – big oh of
• Θ – theta of
• Ω – omega of
• ~ – asymptotically equal

• If f(x) and g(x) are two functions, then the symbols above
are used to compare the rapidity of growth of f and g.
o – little oh
• We say that

f(x) = o(g(x)) (x → ∞)

if lim_{x→∞} f(x)/g(x) exists and is equal to 0.

• f(x) = o(g(x)) means f grows more slowly than g when x is very large.

• Prove the following:

a) x^2 = o(x^5)
b) sin x = o(x)
c) 14.709√x = o(x/2 + 7 cos x)
d) 1/x = o(1)
e) 23 log x = o(x^0.02)
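
• As a numerical illustration of part (a): x^2/x^5 = 1/x^3 → 0 as x → ∞. Below is a minimal Python sketch (the names f and g are placeholders for this example, not anything from the lecture) that tabulates the ratio for growing x:

    def f(x):
        return x ** 2            # the slower-growing function

    def g(x):
        return x ** 5            # the faster-growing function

    # If f(x) = o(g(x)), the ratio f(x)/g(x) should tend to 0 as x grows.
    for x in (10, 100, 1000, 10000):
        print(x, f(x) / g(x))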
Θ - notation
• For a given function g(n), we denote by Θ(g(n)) the set of functions

Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that
0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }.

• Alternative definition: f(x) = Θ(g(x)) (x → ∞) if there exist positive constants c1, c2 such that c1 < f(x)/g(x) < c2 for all sufficiently large x.

• f(x) and g(x) are of the same growth rate; only the multiplicative constants are uncertain.
Θ - notation
• Picture functions f(n) and g(n) where we have f(n) = Θ(g(n)): for all values of n to the right of n0, the value of f(n) lies at or above c1·g(n) and at or below c2·g(n). In other words, for all n ≥ n0, the function f(n) is equal to g(n) to within a constant factor. We say that g(n) is an asymptotically tight bound for f(n).
Θ – notation: Problem 1
• Show that n^2/2 − 3n = Θ(n^2).

• We will use the formal definition of Θ-notation.

• We must determine positive constants c1, c2, and n0 such that

c1·n^2 ≤ n^2/2 − 3n ≤ c2·n^2 for all n ≥ n0.
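• Dividing through by n^2 gives c1 ≤ 1/2 − 3/n ≤ c2. The right-hand inequality holds for any n ≥ 1 with c2 = 1/2, and the left-hand inequality holds for any n ≥ 7 with c1 = 1/14. Thus choosing c1 = 1/14, c2 = 1/2, and n0 = 7 verifies the claim.
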
Θ – notation: Problem 2
• Use the formal definition to verify that 6n^3 ≠ Θ(n^2).

• For any polynomial p(n) = a_d·n^d + a_{d−1}·n^{d−1} + … + a_1·n + a_0, where the a_i are constants and a_d > 0, we have p(n) = Θ(n^d).
• Since any constant is a degree-0 polynomial, we can express any constant function as Θ(n^0) or Θ(1).
• We use the notation Θ(1) to mean a constant or a constant function with respect to some variable.
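• For example, 3n^3 − 2n^2 + 5 = Θ(n^3): for large n the leading term dominates, so the polynomial stays between c1·n^3 and c2·n^3 for suitable constants.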
Asymptotic Notation
• Θ-notation asymptotically bounds the function from above and below.
• O-notation asymptotically bounds the function from above
(asymptotic upper bound).
• Ω-notation asymptotically bounds the function from below
(asymptotic lower bound).
Θ - theta of
• Θ is more precise than either o or O.
• If we know that f(x) = Θ(x^2), then we know that f(x)/x^2 stays between two nonzero constants for all sufficiently large values of x; i.e., with g(x) = x^2,
c1·g(x) < f(x) < c2·g(x)
c1 < f(x)/g(x) < c2
• The rate of growth of f(x) is established: it grows quadratically with x.

• Problems. Prove the following:

a) (x+1)^2 = Θ(3x^2)
b) (x^2 + 5x + 7)/(5x^3 + 7x + 2) = Θ(1/x)
c) (1 + 3/x)^x = Θ(1)
d) n^2/2 − 3n = Θ(n^2)
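• For (a): (x+1)^2/(3x^2) = (1 + 1/x)^2/3 → 1/3 as x → ∞, so for large x the ratio stays between, say, 1/4 and 1/2, which supplies the two constants required by the definition.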
O-notation (big oh)
• We use O-notation when we have only an asymptotic upper bound.
• Definition: For a given function g(n), we denote by O(g(n)) the set of functions

O(g(n)) = { f(n) : there exist positive constants c and n0 such that
0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }.

• Alternative definition: We say that

f(n) = O(g(n)) (n → ∞)

if ∃ c, n0 such that |f(n)| < c·g(n) (∀ n > n0).

• f(n) = O(g(n)) means that f does not grow at a faster rate than g. It may grow at the same rate or more slowly than g.
O-notation (big oh)
• O-notation gives an upper bound on a function, to within a constant factor. For all values of n at and to the right of n0, the value of the function f(n) is on or below c·g(n).

• Since O-notation describes an upper bound, when we use it to bound the worst-case running time of an algorithm, we have a bound on the running time of the algorithm on every input.
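
• For example, saying that insertion sort's worst-case running time is O(n^2) guarantees that its running time on every input of size n is at most c·n^2 for some constant c and all sufficiently large n.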


O-notation (big oh)
• Prove the following:
a) sin x = O(x)
b) sin x = O(1)
c) x^3 + 5x^2 + 77 cos x = O(x^5)
d) 1/(1 + x^2) = O(1)
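• For (a) and (b): |sin x| ≤ 1 for all x, so sin x = O(1) with c = 1; and since 1 ≤ x for x ≥ 1, we also get sin x = O(x) with c = 1 and n0 = 1.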
Ω - notation
• We use Ω-notation when we have only an asymptotic lower bound.
• Definition: For a given function g(n), we denote by Ω(g(n)) (pronounced big omega of g of n) the set of functions

Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that
0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }.

• Alternative definition: We say that

f(n) = Ω(g(n)) (n → ∞)

if ∃ c, n0 such that |f(n)| > c·g(n) (∀ n > n0).

• Ω-notation gives a lower bound on a function, to within a constant factor. For all values of n at and to the right of n0, the value of the function f(n) is on or above c·g(n).
Ω – notation
• Ω-notation is the negative of o-notation, i.e. f(n) = Ω(g(n)) means that it is not true that f(n) = o(g(n)).

• Ω is used when we want to say that a certain computation takes at least so-and-so long to do. For example,
– Any program that multiplies two n × n matrices must at least produce the n^2 entries of the result, so no such program can take fewer than about n^2 steps; thus a computing time of c·n^2 is a lower bound on the speed of any matrix multiplication program. In other words, multiplying two n × n matrices requires Ω(n^2) time.
– If the running time of an algorithm is Ω(g(n)), then no matter what particular input of size n is chosen for each value of n, the running time on that input is at least c·g(n), for sufficiently large n.

• Ω gives a lower bound on the best-case running time of an algorithm.
– The best-case running time of insertion sort is Ω(n), which implies that the running time of insertion sort on every input is Ω(n).
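• Intuition: even on an already-sorted input, insertion sort must examine each of the n elements at least once, so no input of size n can be handled in fewer than c·n steps.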
Ω - notation
• Theorem
For any two functions f(n) and g(n), we have f(n) = Θ(g(n)) if
and only if f(n) =O(g(n)) and f(n) = Ω(g(n)).

~ - Asymptotically equal
• Definition:
We say that f(x) ~ g(x) if lim_{x→∞} f(x)/g(x) = 1.

• The most precise of the symbols of asymptotics is ~, which tells us that not only do f and g grow at the same rate, but also that f/g approaches 1 as x tends to infinity.

• Prove the following:
a) x^2 + x ~ x^2
b) (3x + 1)^4 ~ 81x^4
c) sin(1/x) ~ 1/x
d) (2x^3 + 5x + 7)/(x^2 + 4) ~ 2x
e) 2x + 7 log x + cos x ~ 2x
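• For (b): (3x + 1)^4 = 81x^4·(1 + 1/(3x))^4, and (1 + 1/(3x))^4 → 1 as x → ∞, so the ratio of the left side to 81x^4 tends to 1.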
~ - Asymptotically equal
• Note that it is important to get the multiplicative constants right when ‘~’ is used. For example:
2x^2 = Θ(x^2) … this is true
2x^2 ~ x^2 … this is not true
o - notation
• The asymptotic upper bound provided by O-notation may or may not be asymptotically tight. The bound 2n^2 = O(n^2) is asymptotically tight, but the bound 2n = O(n^2) is not.
• o-notation denotes an upper bound that is not asymptotically tight.
• o(g(n)) (little-oh of g of n) is formally defined as the set

o(g(n)) = { f(n) : for any positive constant c > 0, there exists
a constant n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 }.

For example, 2n = o(n^2), but 2n^2 ≠ o(n^2).
• O-notation and o-notation are similarly defined. The main difference is:
In f(n) = O(g(n)), the bound 0 ≤ f(n) ≤ c·g(n) holds for some constant c > 0.
In f(n) = o(g(n)), the bound 0 ≤ f(n) < c·g(n) holds for all constants c > 0.
• In o-notation, the function f(n) becomes insignificant relative to g(n) as n approaches infinity, that is,

lim_{n→∞} f(n)/g(n) = 0.
ω - notation
• By analogy, ω-notation is to Ω-notation as o-notation is to O-notation. One way to define it is by

f(n) ∈ ω(g(n)) if and only if g(n) ∈ o(f(n)).

• ω-notation denotes a lower bound that is not asymptotically tight.
• ω(g(n)) (little-omega of g of n) is formally defined as the set

ω(g(n)) = { f(n) : for any positive constant c > 0, there exists
a constant n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0 }.

For example, n^2/2 = ω(n), but n^2/2 ≠ ω(n^2).

• The relation f(n) = ω(g(n)) implies that

lim_{n→∞} f(n)/g(n) = ∞,

if the limit exists; that is, f(n) becomes arbitrarily large relative to g(n) as n approaches infinity.
Comparison of functions
• Many of the relational properties of real numbers apply to asymptotic comparisons as well. Assume that f(n) and g(n) are asymptotically positive. Then we have:
• Transitivity property:
f(n) = Θ(g(n)) and g(n) = Θ(h(n)) imply f(n) = Θ(h(n)),
and the same holds with Θ replaced by O, Ω, o, or ω.

• Reflexivity property:
f(n) = Θ(f(n)), f(n) = O(f(n)), f(n) = Ω(f(n)).
Comparison of functions
• Symmetry property:
f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).

• Transpose symmetry property:
f(n) = O(g(n)) if and only if g(n) = Ω(f(n));
f(n) = o(g(n)) if and only if g(n) = ω(f(n)).
• Because many of the relational properties hold for asymptotic notations, one can draw an analogy between the asymptotic comparison of two functions f and g and the comparison of two real numbers a and b:
f(n) = O(g(n)) is like a ≤ b
f(n) = Ω(g(n)) is like a ≥ b
f(n) = Θ(g(n)) is like a = b
f(n) = o(g(n)) is like a < b
f(n) = ω(g(n)) is like a > b
Trichotomy property
• For two real numbers a and b, exactly one of a < b, a = b, or a > b must hold. This trichotomy property does not carry over to asymptotic notations: two functions need not be asymptotically comparable. For example, neither n = O(n^(1+sin n)) nor n = Ω(n^(1+sin n)) holds, because the exponent 1 + sin n oscillates between 0 and 2.
Functions that grow without bound as x→∞
1. The slowest growing ones are functions like
f(x) = log log x
log log x → ∞ as x → ∞.

2. f(x) = log x
log x → ∞ as x → ∞, but grows a bit faster than log log x.
Hence, an algorithm whose running time grows like log log x is better than one whose running time grows like log x.

3. f(x) = x^0.01
x^0.01 → ∞ as x → ∞, but grows faster than log x.
Assignment: Prove that f(x) = x^0.01 grows faster than log x.
Hint: use L'Hôpital's rule.
Functions that grow without bound as x→∞
4. Power functions
f(x) = x^n, such as x, x^2, x^15, and relatives such as x^16·log x, x^16·log^2 x, …
x^2 → ∞ as x → ∞.

5. Then we encounter functions which grow faster than every fixed power of x, just as log x grows slower than every fixed power of x.
Consider f(x) = e^(log^2 x) (e to the power log squared of x).
Since this is the same as x^(log x), it obviously grows faster than x^1000: it will be larger than x^1000 as soon as log x > 1000, i.e. as soon as x > e^1000. Therefore, e^(log^2 x) is an example of a function that grows faster than every fixed power of x. Another example is e^(√x).
Kinds of analyses

Worst-case: (usually)
• T(n) = maximum time of algorithm on any input
of size n.
Average-case: (sometimes)
• T(n) = expected time of algorithm over all
inputs of size n.
• Need assumption of statistical distribution of
inputs.
Best-case: (NEVER)
• Cheat with a slow algorithm that works fast on
some input.
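
• Below is a minimal Python sketch (the function name and input choices are ours, not from the lecture) that counts the comparisons insertion sort makes on sorted, random, and reverse-sorted inputs, illustrating best, average, and worst cases respectively:

    import random

    def insertion_sort_comparisons(values):
        # Sort a copy of `values` and return the number of key comparisons made.
        a = list(values)
        comparisons = 0
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            while j >= 0:
                comparisons += 1        # one comparison of key against a[j]
                if a[j] > key:
                    a[j + 1] = a[j]     # shift the larger element right
                    j -= 1
                else:
                    break
            a[j + 1] = key
        return comparisons

    n = 1000
    print("sorted  :", insertion_sort_comparisons(range(n)))                    # about n (best case)
    print("random  :", insertion_sort_comparisons(random.sample(range(n), n)))  # about n^2/4 (average case)
    print("reversed:", insertion_sort_comparisons(range(n - 1, -1, -1)))        # about n^2/2 (worst case)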
