Lecture Notes 2 Asymptotic Notation
Analysis of Algorithms
Lecture Notes 2
Asymptotic Notation
(Big-Oh Complexity As a Basic Tool)
Theoretical analysis of time efficiency
• Time efficiency is analyzed by determining the number of repetitions of
  the basic operation as a function of input size
• Basic operation: the operation that contributes most towards the running
  time of the algorithm

    T(n) ≈ c_op · C(n)

where T(n) is the running time, c_op is the execution time of the basic
operation, and C(n) is the number of times the basic operation is executed
for an input of size n.
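The relation above can be sketched in code: time a small run to estimate c_op, count the basic operation directly to get C(n), and predict T(n) for a larger input. This is a minimal illustration; the function name `count_basic_ops` and the input sizes are made up for the example.

```python
import time

def count_basic_ops(n):
    """C(n): number of basic operations for a simple linear scan of n items."""
    count = 0
    for _ in range(n):
        count += 1  # one basic operation per element
    return count

# Estimate c_op (seconds per basic operation) from a small timed run,
# then predict T(n) for a larger input via T(n) ≈ c_op * C(n).
n_small, n_large = 10_000, 1_000_000
start = time.perf_counter()
count_basic_ops(n_small)
c_op = (time.perf_counter() - start) / count_basic_ops(n_small)
predicted_time = c_op * count_basic_ops(n_large)
```

The prediction is only approximate: c_op absorbs hardware and implementation details, which is exactly why the analysis focuses on C(n).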
Input size and basic operation examples

Problem                            Input size measure         Basic operation
Checking primality of a            n's size = number of       Division
given integer n                    digits
Typical graph problem              # vertices and/or edges    Visiting a vertex or
                                                              traversing an edge
Empirical analysis of time efficiency
• Select a specific (typical) sample of inputs
Best-case, average-case, worst-case

For some algorithms, efficiency depends on the form of input:
• Worst case: Cworst(n) – maximum over inputs of size n
• Best case: Cbest(n) – minimum over inputs of size n
• Average case: Cavg(n) – “average” over inputs of size n
  – Number of times the basic operation will be executed on typical input
  – NOT the average of worst and best case
  – Expected number of basic operations, considered as a random variable
    under some assumption about the probability distribution of all
    possible inputs
Example: Sequential search
• Worst case ?
• Best case ?
• Average case ?
Example: Sequential search
• Average case
• The standard assumptions are that
  (a) the probability of a successful search is equal to p (0 ≤ p ≤ 1), and
  (b) the probability of the first match occurring in the i-th position of
      the list is the same for every i.
• Under these assumptions, Cavg(n) = p(n + 1)/2 + n(1 − p): each position i
  contributes i·(p/n) comparisons for a successful search, and an
  unsuccessful search takes all n comparisons.
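The standard average-case result under these assumptions, Cavg(n) = p(n + 1)/2 + n(1 − p), can be checked empirically. The simulation below is a sketch: the helper name and the parameter values (n = 20, p = 0.5) are chosen only for illustration.

```python
import random

def comparisons_to_find(lst, key):
    """Count key comparisons made by a sequential search."""
    c = 0
    for item in lst:
        c += 1
        if item == key:
            break
    return c

# Simulate assumptions (a) and (b): with probability p the key is present,
# and its position is uniform over the n slots; otherwise all n items are
# compared. The empirical mean should approach p(n+1)/2 + n(1-p).
random.seed(0)
n, p, trials = 20, 0.5, 100_000
total = 0
for _ in range(trials):
    lst = [0] * n
    if random.random() < p:
        lst[random.randrange(n)] = 1   # successful search, uniform position
    total += comparisons_to_find(lst, 1)

empirical = total / trials
theoretical = p * (n + 1) / 2 + n * (1 - p)   # 15.25 for n = 20, p = 0.5
```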
Sequential Search Variation
• Consider a variation of sequential search that scans a list to return the
  number of occurrences of a given search key in the list.
• Does its efficiency differ from the efficiency of classic sequential
  search?
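A sketch of the variation makes the answer visible: since every element must be examined to count all occurrences, the comparison count is n regardless of input, so best, worst, and average cases coincide (unlike classic sequential search).

```python
def count_occurrences(lst, key):
    """Variation of sequential search: return (count, comparisons)."""
    count = comparisons = 0
    for item in lst:
        comparisons += 1   # every element is examined -- no early exit
        if item == key:
            count += 1
    return count, comparisons
```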
Types of formulas for basic operation’s count
• Exact formula
  e.g., C(n) = n(n − 1)/2
• Formula indicating order of growth with a specific multiplicative constant
  e.g., C(n) ≈ 0.5 n²
• Formula indicating order of growth with an unknown multiplicative constant
  e.g., C(n) ≈ cn²
Values of some important functions as n → ∞
Asymptotic notation
• Big O notation (also written Big Oh), also known as Landau notation or
  asymptotic notation
• A mathematical notation used to describe the asymptotic behavior of
  functions
• Characterizes a function’s behavior for very large (or very small) inputs
  in a simple but rigorous way that enables comparison to other functions
Asymptotic notation
• More precisely, the symbol O is used to describe an asymptotic upper bound
  for the magnitude of a function in terms of another, usually simpler,
  function
• In computer science, it is useful in the analysis of the complexity of
  algorithms.
Formal Definition of Big Oh
Definition: f(n) is in O(g(n)) if the order of growth of f(n) ≤ the order of
growth of g(n) (within a constant multiple), i.e., if there exist a positive
constant c and a non-negative integer n0 such that
    f(n) ≤ c·g(n) for every n ≥ n0
Examples:
• 10n is O(n) and also O(n²)
• 5n + 20 is O(n)
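The definition can be spot-checked numerically: pick witnesses c and n0 and verify the inequality over a finite range. The helper below is illustrative only; a finite check cannot prove a bound, but a failing n does disprove a candidate pair (c, n0).

```python
def witnesses_big_oh(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c*g(n) for every integer n in [n0, n_max].
    A finite spot check, not a proof."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# 5n + 20 is O(n): c = 25, n0 = 1 work, since 5n + 20 <= 25n for n >= 1.
ok1 = witnesses_big_oh(lambda n: 5 * n + 20, lambda n: n, c=25, n0=1)
# 10n is also O(n^2): c = 10, n0 = 1.
ok2 = witnesses_big_oh(lambda n: 10 * n, lambda n: n * n, c=10, n0=1)
# n^2 is NOT O(n): no fixed c works; c = 100 already fails at n = 101.
bad = witnesses_big_oh(lambda n: n * n, lambda n: n, c=100, n0=1)
```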
Which running time is better?
Simplification
• Big-O notation lets us focus on the big picture.
• When faced with a complicated function like 3n² + 4n + 5, we just replace
  it with O(f(n)), where f(n) is as simple as possible.
• In this particular example we’d use O(n²), because the quadratic term of
  the sum dominates the rest.
Simplification
Simplification Rules
1. Multiplicative constants can be omitted: 14n² becomes n².
2. n^a dominates n^b if a > b: for instance, n² dominates n.
3. Any exponential dominates any polynomial: 3ⁿ dominates n⁵ (it even
   dominates 2ⁿ).
4. Likewise, any polynomial dominates any logarithm: n dominates (log n)³.
   This also means, for example, that n² dominates n log n.
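The dominance rules above can be seen numerically: the ratio of the dominated function to the dominant one shrinks toward 0 as n grows. A quick sketch at n = 1000 (the sample size is arbitrary):

```python
import math

n = 1000
r2 = n / n**2             # rule 2: n^2 dominates n        -> 0.001
r3 = n**5 / 3**n          # rule 3: 3^n dominates n^5      -> effectively 0
r4 = math.log(n)**3 / n   # rule 4: n dominates (log n)^3  -> ~0.33, still shrinking
```

Rule 4 illustrates why the crossover can take a while: at n = 1000 the ratio is still around a third, but it keeps falling as n grows.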
Common plots of O(T(n))
Source: medium.com
Basic asymptotic efficiency classes

1        constant
log n    logarithmic
n        linear
n²       quadratic
n³       cubic
2ⁿ       exponential
n!       factorial
Basic asymptotic efficiency classes
Remember a search algorithm?
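The search algorithm the slide likely alludes to for the logarithmic class is binary search, which halves the remaining range on each comparison. A standard sketch:

```python
def binary_search(sorted_lst, key):
    """Return the index of key in a sorted list, or -1 if absent.
    Each iteration halves the search range, so the number of
    comparisons is O(log n)."""
    lo, hi = 0, len(sorted_lst) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_lst[mid] == key:
            return mid
        elif sorted_lst[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```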
Important Note
• Although ignored in Big-O notation, constants are very important!
• Programmers and algorithm developers care deeply about constants and are
  ready to spend nights making an algorithm run faster by a factor of 2.
• But understanding algorithms would be impossible without the simplicity
  afforded by Big-O notation.
Overview
• Both time and space efficiencies are measured as functions of the
  algorithm’s input size.
• Time efficiency: measured by counting the number of times the algorithm’s
  basic operation is executed.
• Space efficiency: measured by counting the number of extra memory units
  consumed by the algorithm.
• The efficiencies of some algorithms may differ significantly for inputs of
  the same size, so distinguishing between the worst-case, average-case, and
  best-case efficiencies may be required!