
CSCI 1933: Introduction to Algorithms and Data Structures
Spring 2022
Announcements

• Project 2 due Thursday (3/3)

• Lab 6 assigned Monday (2/28)


Intro to analysis of algorithm complexity
How do we decide which algorithm is “best”?

• Possible metrics:
– Correctness!
– How much time does it take? (time complexity)
– How much memory does it require? (space complexity)
(we can precisely quantify these)

– How easy is it to implement?
(this is often just as, or more, important)
Measuring time/space complexity

• Sometimes the speed/memory used by an algorithm varies (e.g., it depends on the input data)

• Some useful measures:


– worst-case (most typical)
– best-case
– average-case
Measuring time/space complexity

• We don’t usually precisely estimate the time (e.g., in milliseconds) it takes an algorithm to complete

• Instead: we want to characterize, in general terms, how the speed of the algorithm changes with the size of the problem (e.g., the number of items you store in your sorted list)

• We want: a function of the problem size whose value is proportional to the time the algorithm will take to finish
Example: measuring algorithm efficiency
Problem: we want an algorithm that computes the sum of
integers from 1 to n for n > 0.
S(n) = 1+2+3+ … + n

How many operations will each of these take as a function of n? Count variable assignments, additions, multiplications, and divisions.
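The textbook's figure with the three algorithms is not reproduced in these slides; a plausible sketch of the three usual approaches (class and method names are my own, chosen for illustration):

```java
// Three ways to compute S(n) = 1 + 2 + ... + n, with very
// different operation counts as a function of n.
public class SumAlgorithms {
    // Algorithm A: single loop -- about n additions, O(n)
    public static long sumA(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++) {
            sum += i;
        }
        return sum;
    }

    // Algorithm B: nested loops counting by 1 -- O(n^2) operations
    public static long sumB(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= i; j++) {
                sum += 1;
            }
        }
        return sum;
    }

    // Algorithm C: closed-form formula -- a constant number of
    // operations regardless of n, O(1)
    public static long sumC(int n) {
        return (long) n * (n + 1) / 2;
    }
}
```

All three return the same value; they differ only in how the work grows with n.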

Carrano and Henry, Data Structures and Abstractions with Java.


Example: measuring algorithm efficiency

The number of operations required by the three algorithms


Example: measuring algorithm efficiency

The number of operations required by the three algorithms, plotted as a function of n


“Big Oh” Notation

• To say “Algorithm A has a worst-case time requirement proportional to n”:
– We say A is O(n)
– Read “Big Oh of n”
• For the other two algorithms:
– Algorithm B is O(n²)
– Algorithm C is O(1)


Growth with respect to input size

Typical growth-rate functions evaluated at increasing values of n


Formalities of Big Oh notation
• What does it mean to say an algorithm’s complexity is O(g(n))?
• Formal definition:
– Assume the algorithm requires f(n) operations/space (e.g., f(n) = 2n + 1)
– The algorithm is of order at most g(n), written O(g(n)), if there exist a positive real number c and a positive integer N such that
f(n) ≤ c · g(n) for all n ≥ N
Intuitive interpretation: c · g(n) is an upper bound on f(n) when n is large enough
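The slide's example f(n) = 2n + 1 can be checked directly against the definition: with g(n) = n, the witnesses c = 3 and N = 1 work, since 2n + 1 ≤ 3n whenever n ≥ 1. A small sketch (class and method names are illustrative):

```java
// Verifies the Big-Oh witness for f(n) = 2n + 1 and g(n) = n:
// with c = 3 and N = 1, f(n) <= c * g(n) holds for all n >= N.
public class BigOhCheck {
    public static boolean witnessHolds(int n) {
        long f = 2L * n + 1;   // f(n) = 2n + 1
        long cg = 3L * n;      // c * g(n) with c = 3, g(n) = n
        return f <= cg;
    }
}
```

Note the inequality can fail below N (here, at n = 0); the definition only requires it for all n ≥ N.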
Formalities of Big Oh notation

An illustration of the formal definition of Big Oh



More formalities of Big Oh notation

• The following identities hold for Big Oh notation:

– O(k f(n)) = O(f(n)) (for a constant k)


– O(f(n)) + O(g(n)) = O(f(n) + g(n))
– O(f(n)) O(g(n)) = O(f(n) g(n))
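As a concrete instance of the sum identity: a method that runs an O(n) loop followed by an O(n²) nested loop does O(n) + O(n²) = O(n² + n) = O(n²) work. A sketch (the operation-counting method is my own illustration):

```java
// Counts the operations performed by an O(n) loop followed by an
// O(n^2) nested loop: exactly n + n^2 increments, which is
// proportional to n^2 for large n.
public class IdentityExample {
    public static long opCount(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++) {      // O(n) part
            ops++;
        }
        for (int i = 0; i < n; i++) {      // O(n^2) part
            for (int j = 0; j < n; j++) {
                ops++;
            }
        }
        return ops;
    }
}
```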



Other standard notations for describing complexity

• Big Oh: O(g(n))
– f(n) is of order at most g(n)
(upper bound)
• Big Omega: Ω(g(n))
– f(n) is of order at least g(n)
(lower bound)
• Big Theta: Θ(g(n))
– f(n) is of order exactly g(n)
– c · g(n) is both a lower and an upper bound
(tightest characterization of complexity)
Exercise: estimating time complexity
int sum = 0;
for (int i = 1; i <= n; i++) {
    for (int j = 1; j <= n; j++) {
        sum = sum + 1;
    }
}
Exercise: estimating time complexity

Answer: O(n²) algorithm
Exercise: estimating time complexity
int sum = 0;
for (int k = 1; k <= n; k++) {
    for (int i = 1; i <= n; i++) {
        for (int j = 1; j <= n; j++) {
            sum = sum + 1;
        }
    }
}
Exercise: estimating time complexity
public static double loopFactorial(int n) {
    double result = 1;
    for (int i = 1; i <= n; i++) {
        result *= i;
    }
    return result;
}
Complexity of common sorting algorithms
Sorting algorithms
• Algorithm #1: “Selection” sort
• Find the minimum value in the list
• Swap it with the value in the first position
• Repeat the steps above for the remainder of the list (starting at the second position and advancing each time)

• Algorithm #2: “Insertion” sort
• Iterate from the 1st to the last element of the array
• Compare the current element to its predecessor
• If the current element is smaller, compare it to the elements before it; move greater elements one position to the right to make space for the smaller element

• Algorithm #3: “Merge sort”
– Divide the array into halves
– Sort the two halves (recursively)
– Merge them into one sorted array

• Algorithm #4: “Quicksort”
– Pick an element from the list (the pivot).
– Reorder the list so that all elements less than the pivot come before it and all elements greater than the pivot come after it (equal values can go either way).
– Repeat the process on the sub-list of lesser elements and the sub-list of greater elements.
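The insertion sort steps above can be sketched directly in Java (class and method names are my own):

```java
// Insertion sort: walk left to right; for each element, shift
// greater predecessors one position right, then drop the element
// into the gap. O(n^2) worst case, O(n) on already-sorted input.
public class InsertionSort {
    public static void sort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int curr = a[i];
            int j = i - 1;
            // Move elements greater than curr one position right
            while (j >= 0 && a[j] > curr) {
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = curr;  // insert curr into the freed slot
        }
    }
}
```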
Summary of various sorting algorithms

Bubble sort: O(n) best case; O(n²) average and worst case

The time efficiency of various algorithms in Big Oh notation


Selection Sort
• Find the minimum value in the array

• Swap it with the value in the first position

• Repeat the steps above for the remainder of the array (starting at the second position and advancing each time)
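A sketch of the steps above in Java (class and method names are my own):

```java
// Selection sort: on each pass, scan the unsorted remainder for
// its minimum, then swap it into the next position.
public class SelectionSort {
    public static void sort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int minIndex = i;
            // Scan: find the minimum of a[i..a.length-1]
            for (int j = i + 1; j < a.length; j++) {
                if (a[j] < a[minIndex]) {
                    minIndex = j;
                }
            }
            // Swap: move the minimum into position i
            int tmp = a[i];
            a[i] = a[minIndex];
            a[minIndex] = tmp;
        }
    }
}
```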

[Figure: trace of selection sort passes, annotated with the scan steps and swap steps]


Selection Sort Complexity
• Each pass: one scan + one swap

– Scan steps: n + (n−1) + (n−2) + … + 1 = 1 + 2 + … + n = n(n+1)/2 → O(n²)
– Swap steps: one swap per position = n−1 total swaps → O(n)

– Total: O(n²) + O(n) = O(n²)


Selection sort questions to think about
• We’ve discussed the worst-case time complexity; what about the best case?

• What’s the worst-case space complexity of selection sort?
