
RAIK 283: Data Structures & Algorithms

Divide and Conquer (I)

Dr. Ying Lu
ylu@cse.unl.edu



RAIK 283: Data Structures & Algorithms
 Giving credit where credit is due:
• Most of the lecture notes are based on the slides from
the Textbook’s companion website
– http://www.aw-bc.com/info/levitin
• Some examples and slides are based on lecture notes
created by Dr. Ben Choi, Louisiana Tech University,
and Dr. Chuck Cusack, Hope College
• I have modified many of their slides and added new
slides.



Divide and Conquer

The most well-known algorithm design strategy:


1. Divide an instance of the problem into two or more smaller instances

2. Solve the smaller instances recursively

3. Obtain a solution to the original (larger) instance by combining these solutions (see the sketch below)
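
A minimal, self-contained sketch of this pattern, using the toy problem of summing an array (this example is not from the slides; it is only meant to make the three steps concrete):

    // Toy divide-and-conquer example: sum of a[lo..hi]
    static long sum(int[] a, int lo, int hi) {
        if (lo == hi) return a[lo];          // smallest instance: solve it directly
        int mid = (lo + hi) / 2;             // 1. divide into two smaller instances
        long left  = sum(a, lo, mid);        // 2. solve each smaller instance recursively
        long right = sum(a, mid + 1, hi);
        return left + right;                 // 3. combine the two sub-solutions
    }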



Divide-and-conquer Technique

                        a problem of size n

     subproblem 1 of size n/2          subproblem 2 of size n/2

     a solution to subproblem 1        a solution to subproblem 2

                 a solution to the original problem

Divide and Conquer Examples

 Sorting: mergesort and quicksort

 Tree traversals

 Matrix multiplication: Strassen’s algorithm

 Closest pair problem

 Convex hull: QuickHull algorithm



Review Mergesort



Mergesort

Algorithm:
 Split array A[1..n] in two and make copies of each half in arrays B[1..⌊n/2⌋] and C[1..⌈n/2⌉]
 Sort arrays B and C
 Merge sorted arrays B and C into array A



Using Divide and Conquer: Mergesort
 Mergesort Strategy
   [first .. ⌊(first+last)/2⌋]          [⌊(first+last)/2⌋+1 .. last]
   sort recursively by Mergesort        sort recursively by Mergesort
             Sorted                                Sorted
                               Merge
                               Sorted

Mergesort
Algorithm:
 Split array A[1..n] in two and make copies of each half in arrays B[1..⌊n/2⌋] and C[1..⌈n/2⌉]
 Sort arrays B and C
 Merge sorted arrays B and C into array A as follows:
• Repeat the following until no elements remain in one of the arrays:
– compare the first elements in the remaining unprocessed
portions of the arrays
– copy the smaller of the two into A, while incrementing the index
indicating the unprocessed portion of that array
• Once all elements in one of the arrays are processed, copy the remaining unprocessed elements from the other array into A (see the merge sketch below).
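
A minimal sketch of this merge step, assuming a plain int[] array and the merge(E, first, mid, last) signature used on the next slide (an illustration, not the textbook’s code):

    // Merge the sorted halves E[first..mid] and E[mid+1..last] back into E
    static void merge(int[] E, int first, int mid, int last) {
        int[] B = java.util.Arrays.copyOfRange(E, first, mid + 1);     // copy of the left half
        int[] C = java.util.Arrays.copyOfRange(E, mid + 1, last + 1);  // copy of the right half
        int i = 0, j = 0, k = first;
        while (i < B.length && j < C.length) {        // both halves still have elements
            if (B[i] <= C[j]) E[k++] = B[i++];        // copy the smaller front element into E
            else              E[k++] = C[j++];
        }
        while (i < B.length) E[k++] = B[i++];         // copy any leftovers from B
        while (j < C.length) E[k++] = C[j++];         // ... or from C
    }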



Algorithm: Mergesort

Input: Array E and indices first and last, such that the elements E[i] are defined for first ≤ i ≤ last.
Output: E[first], …, E[last] is a sorted rearrangement of
the same elements
void mergeSort(Element[] E, int first, int last)
{
    if (first < last) {
        int mid = (first + last) / 2;
        mergeSort(E, first, mid);
        mergeSort(E, mid + 1, last);
        merge(E, first, mid, last);
    }
    return;
}
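
For the whole array, the initial call would be mergeSort(E, 0, E.length - 1), assuming the 0-based indexing of the Java-style code above (the problem statements elsewhere use 1-based A[1..n]).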



In-Class Exercise
 P175 Problem 5.1.6
Apply Mergesort to sort the list E, X, A, M, P, L, E in
alphabetical order.



Time Efficiency

 What is the running time of mergeSort?


Mergesort Complexity
 Mergesort always partitions the array equally.
 Thus, the recursive depth is always Θ(lg n)
 The amount of work done at each level is Θ(n)
 Intuitively, the complexity should be Θ(n lg n)

 We have,
• T(n) = 2T(n/2) + Θ(n) for n > 1, T(1) = 0  ⇒  T(n) ∈ Θ(n lg n)

 The amount of extra memory used is Θ(n)



Efficiency of Mergesort

 All cases have same efficiency: Θ(n log n)

 Number of comparisons is close to theoretical minimum for comparison-based sorting:
• ⌈lg n!⌉ ≈ n lg n - 1.44n

 Space requirement: Θ(n) (NOT in-place)

 Can be implemented without recursion (bottom-up), as sketched below

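A sketch of one such bottom-up implementation, assuming an int[] array and a merge(E, first, mid, last) routine like the one sketched earlier (an illustration, not the textbook’s code):

    // Bottom-up mergesort: merge runs of width 1, 2, 4, ... until the array is one sorted run
    static void bottomUpMergeSort(int[] a) {
        int n = a.length;
        for (int width = 1; width < n; width *= 2) {
            for (int first = 0; first + width < n; first += 2 * width) {
                int mid  = first + width - 1;                      // end of the left run
                int last = Math.min(first + 2 * width - 1, n - 1); // end of the right run
                merge(a, first, mid, last);
            }
        }
    }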


Animation

http://math.hws.edu/TMCM/java/xSortLab/index.html



The master theorem
 Given: a divide and conquer algorithm

 Then, the Master Theorem gives us a cookbook for the algorithm’s running time:



A general divide-and-conquer recurrence

T(n) = aT(n/b) + f(n),  where f(n) ∈ Θ(n^d)

T(1) = c

 a < b^d :  T(n) ∈ Θ(n^d)
 a = b^d :  T(n) ∈ Θ(n^d lg n)
 a > b^d :  T(n) ∈ Θ(n^(log_b a))

Please refer to Appendix B for the proof
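
 Example: Mergesort’s recurrence T(n) = 2T(n/2) + Θ(n) has a = 2, b = 2, d = 1; since a = b^d, the theorem gives T(n) ∈ Θ(n^d lg n) = Θ(n lg n).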



In-class exercise
 Page 175: Problem 5
 5. Find the order of growth for solutions of the following
recurrences.
• a. T(n) = 4T(n/2) + n,   T(1) = 1
• b. T(n) = 4T(n/2) + n^2,  T(1) = 1
• c. T(n) = 4T(n/2) + n^3,  T(1) = 1



In-Class Exercise
 5.1.9 Let A[0..n-1] be an array of n distinct real numbers. A pair (A[i], A[j]) is said to be an inversion if these numbers are out of order, i.e., i < j but A[i] > A[j]. Design an O(n log n) algorithm for counting the number of inversions.



Review Quicksort



Quicksort by Hoare (1962)
 Select a pivot (partitioning element)
 Rearrange the list so that all the elements in the positions
before the pivot are smaller than or equal to the pivot and
those after the pivot are larger than or equal to the pivot
 Exchange the pivot with the last element in the first (i.e., ≤)
sublist – the pivot is now in its final position
 Sort the two sublists recursively

A[i] ≤ p        p        A[i] ≥ p

 Apply quicksort to sort the list 7 2 9 10 5 4

Quicksort Example

 Recursive implementation with the leftmost array entry selected as the pivot element.



Quicksort

Animated Example:
http://math.hws.edu/TMCM/java/xSortLab/index.html



Quicksort Algorithm
 Input: Array E and indices first and last, such that the elements E[i] are defined for first ≤ i ≤ last
 Output: E[first], …, E[last] is a sorted rearrangement of the array

void quickSort(Element[] E, int first, int last)
{
    if (first < last) {
        Element pivotElement = E[first];
        int splitPoint = partition(E, pivotElement, first, last);
        quickSort(E, first, splitPoint - 1);
        quickSort(E, splitPoint + 1, last);
    }
    return;
}
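
The partition routine is not shown on the slides; below is one simple way it could be written (a Lomuto-style scan, assuming an int[] array; the textbook’s two-pointer Hoare-style partition differs in detail, but both leave the pivot in its final position and return that position):

    // Partition E[first..last] around pivot = E[first]; return the pivot's final index
    static int partition(int[] E, int pivot, int first, int last) {
        int splitPoint = first;                  // E[first+1..splitPoint] holds elements < pivot
        for (int i = first + 1; i <= last; i++) {
            if (E[i] < pivot) {                  // grow the "smaller than pivot" sublist
                splitPoint++;
                int tmp = E[splitPoint]; E[splitPoint] = E[i]; E[i] = tmp;
            }
        }
        int tmp = E[first]; E[first] = E[splitPoint]; E[splitPoint] = tmp;  // put the pivot in place
        return splitPoint;
    }

For example, on the list 7 2 9 10 5 4 with pivot 7, this produces 4 2 5 7 9 10 and returns index 3: the pivot is in its final position, with smaller elements to its left and larger ones to its right.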



Quicksort Analysis

 Partition can be done in O(n) time, where n is the size of the array
 Let T(n) be the number of comparisons required by Quicksort
 If the pivot ends up at position k, then we have
• T(n) = T(n-k) + T(k-1) + n
 To determine best-, worst-, and average-case
complexity we need to determine the values of k
that correspond to these cases.



Best-Case Complexity
 The best case is clearly when the pivot always
partitions the array equally.
 Intuitively, this would lead to a recursive depth of
at most lg n calls
 We can actually prove this. In this case
• T(n) T(n/2)  T(n/2)  n  (n lg n)



Worst-Case and Average-Case Complexity
 The worst case is when the pivot always ends up in the first or last position, that is, when it partitions the array as unequally as possible.
 In this case
• T(n) = T(n-1) + T(1-1) + n = T(n-1) + n
        = n + (n-1) + … + 1
        = n(n+1)/2 ∈ Θ(n^2)
 The average case is rather complex, but it is where the algorithm earns its name. The bottom line is:
• A(n) ≈ 1.386 n lg n ∈ Θ(n lg n)


Summary of Quicksort
 Best case: split in the middle — Θ(n log n)
 Worst case: sorted array! — Θ(n^2)
 Average case: random arrays — Θ(n log n)

 Memory requirement?
 In-place sorting algorithm
 Considered the method of choice for internal sorting of large files (n ≥ 10000)

 Improvements:
• better pivot selection: median of three partitioning avoids worst
case in sorted files
• switch to insertion sort on small subfiles
• elimination of recursion
These combine to give a 20-25% improvement.
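
A sketch of median-of-three pivot selection, one common way to realize the first improvement above (the details here are an illustration, not the textbook’s code):

    // Return the index of the median of E[first], E[mid], E[last]
    static int medianOfThreeIndex(int[] E, int first, int last) {
        int mid = (first + last) / 2;
        if (E[first] <= E[mid]) {
            if (E[mid] <= E[last]) return mid;              // E[first] <= E[mid] <= E[last]
            return (E[first] <= E[last]) ? last : first;    // E[mid] is the largest of the three
        } else {
            if (E[first] <= E[last]) return first;          // E[mid] < E[first] <= E[last]
            return (E[mid] <= E[last]) ? last : mid;        // E[first] is the largest of the three
        }
    }

Before partitioning, swap E[medianOfThreeIndex(E, first, last)] into E[first]; the quickSort and partition code can then be used unchanged, and an already-sorted input no longer triggers the worst case.
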
In-class exercises

