
The Efficiency of Algorithms

Chapter 9
Chapter Contents
Motivation
Measuring an Algorithm's Efficiency
• Big Oh Notation
Formalities
Picturing Efficiency
The Efficiency of Implementations of the ADT List
• The Array-Based Implementation
• The Linked Implementation
• Comparing Implementations

Motivation
Even a simple program can be noticeably inefficient:

long firstOperand = 7562;
long secondOperand = 423;
long product = 0;

// Multiply by repeated addition: the loop body runs secondOperand times
for (; secondOperand > 0; secondOperand--)
    product = product + firstOperand;
System.out.println(product);

When the 423 is changed to 100,000,000, there is a significant delay in seeing the result.

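For contrast, here is a minimal sketch of a constant-time alternative (my own, not from the text): the same product computed with a single multiplication, so the running time no longer depends on the value of secondOperand.

long firstOperand = 7562;
long secondOperand = 100000000;

// One multiplication instead of secondOperand additions: O(1) rather than O(n)
long product = firstOperand * secondOperand;
System.out.println(product);
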
Measuring Algorithm Efficiency
Types of complexity
• Space complexity
• Time complexity

Analysis of algorithms
• Measuring the complexity of an algorithm
• We cannot compute an algorithm's actual running time
• We usually measure worst-case time

Measuring Algorithm Efficiency

Fig. 9-1 Three algorithms for computing 1 + 2 + … + n for an integer n > 0

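The figure itself is not reproduced here; the sketch below assumes the usual three versions (a single loop, a nested counting loop, and the closed-form formula n(n + 1)/2).

long n = 10;  // any positive integer

// Algorithm A: one addition per pass over 1..n, so O(n)
long sumA = 0;
for (long i = 1; i <= n; i++)
    sumA = sumA + i;

// Algorithm B: an inner loop adds 1 exactly i times for each i, so O(n^2)
long sumB = 0;
for (long i = 1; i <= n; i++)
    for (long j = 1; j <= i; j++)
        sumB = sumB + 1;

// Algorithm C: the closed-form formula, O(1)
long sumC = n * (n + 1) / 2;
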
Measuring Algorithm Efficiency

Fig. 9-2 The number of operations required by the algorithms in Fig. 9-1

Measuring Algorithm Efficiency

Fig. 9-3 The number of operations required by the algorithms in Fig. 9-1 as a function of n

Big Oh Notation

To say "Algorithm A has a worst-case time requirement proportional to n"
• We say A is O(n)
• Read "Big Oh of n"

For the other two algorithms
• Algorithm B is O(n²)
• Algorithm C is O(1)

Big Oh Notation

Fig. 9-4 Typical growth-rate functions evaluated at increasing values of n

Big Oh Notation

Fig. 9-5 The number of digits in an integer n compared with the integer portion of log₁₀ n

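The figure is not shown here; below is a small sketch of my own (not from the text) of the relationship the caption describes: the number of decimal digits in a positive integer n is the integer portion of log₁₀ n plus 1.

long n = 75620;

// Integer portion of log10(n): 4 for n = 75620
int logPart = (int) Math.floor(Math.log10(n));

// Number of decimal digits: 5 for n = 75620
int digits = String.valueOf(n).length();

System.out.println(logPart + 1 == digits);  // true (within floating-point limits)
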
Big Oh Notation

Fig. 9-6 The values of two logarithmic growth-rate functions for various ranges of n

Formalities
Formal definition of Big Oh
An algorithm's time requirement f(n) is of order at most g(n), written f(n) = O(g(n)), if a positive real number c and a positive integer N exist such that

f(n) ≤ c•g(n) for all n ≥ N

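As a concrete illustration (my own example, not from the text): if f(n) = 5n + 3, then f(n) = O(n), because 5n + 3 ≤ 6n whenever n ≥ 3, so c = 6 and N = 3 satisfy the definition. A quick numeric check:

// Verify f(n) <= c*g(n) for all n >= N over a sample range,
// with f(n) = 5n + 3, g(n) = n, c = 6, N = 3
for (int n = 3; n <= 1000; n++) {
    long f = 5L * n + 3;
    long cg = 6L * n;
    if (f > cg)
        System.out.println("Definition violated at n = " + n);
}
System.out.println("5n + 3 <= 6n held for every n checked");
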
Formalities

Fig. 9-7 An illustration of the definition of Big Oh

Formalities

The following identities hold for Big Oh notation:
• O(k·f(n)) = O(f(n)) for a constant k
• O(f(n)) + O(g(n)) = O(f(n) + g(n))
• O(f(n))·O(g(n)) = O(f(n)·g(n))

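A brief sketch of my own (not from the text) showing how these identities are used when analyzing code: a loop that does three constant-time operations per pass, followed by a pair of nested loops, costs O(3n) + O(n²), which simplifies to O(n²).

int n = 100;
long a = 0, b = 0, c = 0, count = 0;

// Phase 1: about 3 operations per pass, O(3n) = O(n) by the first identity
for (int i = 0; i < n; i++) {
    a = a + 1;
    b = b + a;
    c = c + b;
}

// Phase 2: nested loops, O(n) * O(n) = O(n^2) by the third identity
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)
        count++;

// Whole sequence: O(n) + O(n^2) = O(n + n^2) by the second identity,
// which is O(n^2) because the n^2 term dominates for large n
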
Picturing Efficiency

Fig. 9-8 An O(n) algorithm.

Picturing Efficiency

Fig. 9-9 An O(n²) algorithm.

Picturing Efficiency

Fig. 9-10 Another O(n²) algorithm.


Picturing Efficiency

Fig. 9-11 The effect of doubling the problem size on an algorithm's time requirement.

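The figure is not reproduced here; the snippet below is a sketch of my own (not the figure's data) of the general pattern it illustrates: when n doubles, an O(log n) time barely changes, an O(n) time roughly doubles, an O(n²) time grows about fourfold, and an O(n³) time about eightfold.

// Work ratio when the problem size doubles, for several growth-rate functions
int n = 1000;
System.out.println("log n: " + Math.log(2.0 * n) / Math.log(n));        // about 1.1
System.out.println("n:     " + (2.0 * n) / n);                          // 2.0
System.out.println("n^2:   " + Math.pow(2.0 * n, 2) / Math.pow(n, 2));  // 4.0
System.out.println("n^3:   " + Math.pow(2.0 * n, 3) / Math.pow(n, 3));  // 8.0
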
Picturing Efficiency

Fig. 9-12 The time to process one million items by algorithms of various orders at the rate of one million operations per second.

Comments on Efficiency
A programmer can use an O(n²), O(n³), or even O(2ⁿ) algorithm as long as the problem size is small
At one million operations per second it would take about 1 second …
• For a problem size of 1000 with O(n²)
• For a problem size of 100 with O(n³)
• For a problem size of 20 with O(2ⁿ)

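A quick arithmetic check of those figures (a sketch of my own, assuming each basic step counts as one operation):

// Seconds of work at one million operations per second
double opsPerSecond = 1e6;
System.out.println(Math.pow(1000, 2) / opsPerSecond);  // 1.0 second for O(n^2) with n = 1000
System.out.println(Math.pow(100, 3) / opsPerSecond);   // 1.0 second for O(n^3) with n = 100
System.out.println(Math.pow(2, 20) / opsPerSecond);    // about 1.05 seconds for O(2^n) with n = 20
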
Efficiency of Implementations of the ADT List

For the array-based implementation
• Add to end of list: O(1)
• Add to list at a given position: O(n)

For the linked implementation
• Add to end of list: O(n)
• Add to list at a given position: O(n)
• Retrieving an entry: O(n)

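A minimal sketch of my own (with an illustrative Node class and the assumed names firstNode, givenPosition, and newEntry, not the book's list code) of why adding at a given position is O(n) for the linked implementation: reaching the node just before the position requires walking the chain from the head, while the insertion itself is only a constant number of reference changes.

class Node {                              // illustrative singly linked node
    int data;
    Node next;
    Node(int data) { this.data = data; }
}

// Walk to the node just before givenPosition (assumes 1 < givenPosition <= length + 1):
// up to n - 1 steps, so O(n)
Node nodeBefore = firstNode;
for (int i = 1; i < givenPosition - 1; i++)
    nodeBefore = nodeBefore.next;

// Splice in the new node: a constant number of reference changes, O(1)
Node newNode = new Node(newEntry);
newNode.next = nodeBefore.next;
nodeBefore.next = newNode;
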
Comparing Implementations

Fig. 9-13 The time efficiencies of the ADT list operations for two implementations, expressed in Big Oh notation
