Lecture Notes 2 Asymptotic Notation


CMP3005

Analysis of Algorithms
Lecture Notes 2
Asymptotic Notation
(Big-Oh Complexity As a Basic Tool)

1
Theoretical analysis of time efficiency
• Time efficiency is analyzed by determining the
number of repetitions of the basic operation
as a function of input size
• Basic operation: the operation that contributes
most towards the running time of the algorithm

T(n) ≈ cop·C(n)
where T(n) is the running time, cop is the execution
time of the basic operation, and C(n) is the number
of times the basic operation is executed
2
Input size and basic operation examples

Problem                                  | Input size measure                            | Basic operation
Searching for a key in a list of n items | Number of list's items, i.e., n               | Key comparison
Multiplication of two matrices           | Matrix dimensions or total number of elements | Multiplication of two numbers
Checking primality of a given integer n  | n's size = number of digits                   | Division
Typical graph problem                    | # vertices and/or edges                       | Visiting a vertex or traversing an edge

3
Empirical analysis of time efficiency
• Select a specific (typical) sample of inputs

• Use a physical unit of time (e.g., milliseconds)
or
count the actual number of executions of the
basic operation

• Analyze the empirical data

4
Best-case, average-case, worst-case

For some algorithms, efficiency depends on form of input:

• Worst case: Cworst(n) – maximum over inputs of size n


• Best case: Cbest(n) – minimum over inputs of size n
• Average case: Cavg(n) – “average” over inputs of size n

5
Best-case, average-case, worst-case
For some algorithms, efficiency depends on form of input:
• Average case: Cavg(n) – “average” over inputs of size n
– Number of times the basic operation will be executed on typical
input
– NOT the average of worst and best case
– Expected number of basic operations considered as a random
variable under some assumption about the probability
distribution of all possible inputs

6
Example: Sequential search

• Worst case ?

• Best case ?

• Average case ?
7
Example: Sequential search
• Average case
• The standard assumptions are that
(a) the probability of a successful search is
equal to p (0 ≤ p ≤ 1) and
(b) the probability of the first match
occurring in the i-th position of the list is
the same for every i.
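Under these assumptions, the standard result is Cavg(n) = p(n + 1)/2 + n(1 − p): for a guaranteed successful search (p = 1) this gives (n + 1)/2, and for a guaranteed failure (p = 0) it gives n. A minimal Python sketch of classic sequential search that counts key comparisons (function and variable names are illustrative):

```python
def sequential_search(lst, key):
    """Return (index of key or -1, number of key comparisons made)."""
    comparisons = 0
    for i, item in enumerate(lst):
        comparisons += 1           # the basic operation: key comparison
        if item == key:
            return i, comparisons  # best case: 1 comparison (key is first)
    return -1, comparisons         # worst case: n comparisons (key absent)
```

The early return is exactly what makes the best, worst, and average cases differ.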

8
Sequential Search Variation
• Consider a variation of sequential search
that scans a list to return the number of
occurrences of a given search key in the
list.
• Does its efficiency differ from the
efficiency of classic sequential search?
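Yes, it differs in an instructive way: the counting variation must examine every element regardless of input, so its comparison count is C(n) = n in the best, worst, and average case alike, and no case analysis is needed. A sketch (names illustrative):

```python
def count_occurrences(lst, key):
    """Return (occurrence count of key, number of key comparisons made)."""
    count = 0
    comparisons = 0
    for item in lst:       # never stops early: always exactly n comparisons
        comparisons += 1
        if item == key:
            count += 1
    return count, comparisons
```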

9
Types of formulas for basic operation’s count

• Exact formula
e.g., C(n) = n(n-1)/2

• Formula indicating order of growth with specific


multiplicative constant
e.g., C(n) ≈ 0.5n²

• Formula indicating order of growth with


unknown multiplicative constant
e.g., C(n) ≈ cn²
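As an illustration of an exact formula, a brute-force uniqueness check performs exactly n(n − 1)/2 element comparisons in the worst case (all elements distinct), which is ≈ 0.5n². A sketch (names illustrative):

```python
def all_distinct(lst):
    """Return (True/False, number of element comparisons made)."""
    n = len(lst)
    comparisons = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            comparisons += 1       # basic operation: element comparison
            if lst[i] == lst[j]:
                return False, comparisons
    return True, comparisons

# worst case (all elements distinct): exactly n(n-1)/2 comparisons
```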
10
Order of growth
• Most important: Order of growth within a
constant multiple as n→∞

• Example: an algorithm with running time cn²

– How much faster will the algorithm run on a computer
that is twice as fast?

– How much longer does it take to solve a problem of half
the input size?
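For a cn² algorithm both questions have clean answers: a machine twice as fast halves the running time but leaves the order of growth unchanged, and halving the input size cuts the operation count by a factor of four, since c(n/2)² = cn²/4. A quick numerical check:

```python
def quadratic_cost(n, c=1.0):
    """Operation count for an algorithm with running time c * n^2."""
    return c * n * n

n = 1000
# halving the input divides the cost by (n / (n/2))^2 = 4
assert quadratic_cost(n) / quadratic_cost(n // 2) == 4.0
# a twice-as-fast machine halves wall-clock time but not the growth class
assert quadratic_cost(n, c=0.5) == quadratic_cost(n) / 2
```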

11
Values of some important functions as n → ∞

12
Asymptotic notation
• Big O notation (Big Oh notation), also known as
Landau notation or asymptotic notation
• A mathematical notation used to describe the
asymptotic behavior of functions
• Characterizes a function's behavior for very large
(or very small) inputs in a simple but rigorous way
that enables comparison to other functions

13
Asymptotic notation
• More precisely, the symbol O is used to describe
an asymptotic upper bound for the magnitude of
a function in terms of another, usually simpler,
function
• In computer science, it is useful in analyzing the
complexity of algorithms.

14
Formal Definition of Big Oh
Definition: f(n) is in O(g(n))
if order of growth of f(n) ≤ order of growth of g(n)
(within constant multiple),
i.e., if there exist a positive constant c and a
non-negative integer n0 such that
f(n) ≤ c·g(n) for every n ≥ n0

Examples:
• 10n is O(n) and also O(n²)
• 5n + 20 is O(n)
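The definition can be sanity-checked numerically: pick witness values c and n0 and verify f(n) ≤ c·g(n) over a finite range (this is evidence, not a proof). For 5n + 20 ≤ 25n it suffices that n ≥ 1. A sketch with a hypothetical helper name:

```python
def bound_holds(f, g, c, n0, upto=10_000):
    """Check f(n) <= c * g(n) for all n in [n0, upto] (finite evidence only)."""
    return all(f(n) <= c * g(n) for n in range(n0, upto + 1))

assert bound_holds(lambda n: 10 * n, lambda n: n, c=10, n0=1)      # 10n is O(n)
assert bound_holds(lambda n: 10 * n, lambda n: n * n, c=10, n0=1)  # ...and O(n^2)
assert bound_holds(lambda n: 5 * n + 20, lambda n: n, c=25, n0=1)  # 5n+20 is O(n)
```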

15
Which running time is better?

16
Simplification
• Big-O notation lets us focus on the big picture.
• When faced with a complicated function like
3n² + 4n + 5, we just replace it with O(f(n)),
where f(n) is as simple as possible.
• In this particular example we'd use O(n²),
because the quadratic term of the sum
dominates the rest.

17
Simplification
Simplification Rules
1. Multiplicative constants can be omitted:
14n² becomes n².
2. nᵃ dominates nᵇ if a > b: for instance,
n² dominates n.
3. Any exponential dominates any polynomial:
3ⁿ dominates n⁵ (it even dominates 2ⁿ).
4. Likewise, any polynomial dominates any
logarithm:
n dominates (log n)³.
This also means, for example, that
n² dominates n log n.
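Rules 3 and 4 are really statements about ratios: if g dominates f, then g(n)/f(n) grows without bound. A small table of ratios makes the trend visible even at modest n (helper name illustrative):

```python
import math

def dominance_ratios(ns=(10, 50, 100)):
    """For each n, the ratio 3^n / n^5 and the ratio n / (log n)^3."""
    return [(n, 3**n / n**5, n / math.log(n)**3) for n in ns]

rows = dominance_ratios()   # both ratio columns increase with n
```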
18
Common plots of O(T(n))

19
Common plots of O(T(n))

Source: medium.com

20
Basic asymptotic efficiency classes
1 constant

log n logarithmic

n linear

n log n n-log-n or linearithmic

n² quadratic

n³ cubic

2ⁿ exponential

n! factorial
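The gap between these classes can be seen by tabulating each function at a single small input size; already at n = 20, 2ⁿ exceeds n² by a factor of more than 2,500. A quick tabulation:

```python
import math

n = 20  # basic efficiency classes evaluated at n = 20
values = {
    "1":       1,
    "log n":   math.log2(n),
    "n":       n,
    "n log n": n * math.log2(n),
    "n^2":     n**2,
    "n^3":     n**3,
    "2^n":     2**n,
    "n!":      math.factorial(n),
}
for name, v in values.items():
    print(f"{name:8} {v:,.0f}")
```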

21
Basic asymptotic efficiency classes

Remember a search algorithm?

22
Important Note
• Although ignored in Big-O notation, constants are
very important!
• Programmers and algorithm developers are very
interested in constants and are ready to spend nights
making an algorithm run faster by a factor of 2.
• But understanding algorithms would be impossible
without the simplicity afforded by Big-O notation.

23
Overview
• Both time and space efficiencies are measured as
functions of the algorithm’s input size.
• Time efficiency: Measured by counting the number of
times the algorithm’s basic operation is executed.
• Space efficiency: Measured by counting the number of
extra memory units consumed by the algorithm.
• The efficiencies of some algorithms may differ
significantly for inputs of the same size. So,
distinguishing between the worst-case, average-case,
and best-case efficiencies may be required!

24
