
Sorting Algorithms

Quicksort

Shubham Jindal
CSE-A
Quicksort Design
• Follows the divide-and-conquer paradigm.
• Divide: Partition (separate) the array A[p..r] into two (possibly empty) subarrays A[p..q–1] and A[q+1..r].
  • Each element in A[p..q–1] ≤ A[q].
  • A[q] < each element in A[q+1..r].
  • Index q is computed as part of the partitioning procedure.
• Conquer: Sort the two subarrays by recursive calls to quicksort.
• Combine: The subarrays are sorted in place – no work is needed to combine them.
• In-place sorting algorithm
• Not a stable sorting algorithm
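The "not stable" claim is easy to demonstrate with a small sketch (Python used purely for illustration; `partition`/`quicksort` follow the Lomuto scheme of the slides' pseudocode, extended with a hypothetical `key` parameter so that records with equal keys can be told apart):

```python
def partition(a, s, e, key):
    """Lomuto partition on key(a[j]); the pivot is the last element."""
    x = key(a[e])
    i = s - 1
    for j in range(s, e + 1):
        if key(a[j]) <= x:
            i += 1
            a[i], a[j] = a[j], a[i]   # in place: only swaps, no extra array
    return i

def quicksort(a, s, e, key=lambda v: v):
    if s < e:
        q = partition(a, s, e, key)
        quicksort(a, s, q - 1, key)
        quicksort(a, q + 1, e, key)

# Two records share the key 1; after sorting by key their relative
# order is reversed, so this quicksort is not stable.
items = [(1, 'a'), (1, 'b'), (0, 'c')]
quicksort(items, 0, len(items) - 1, key=lambda t: t[0])
print(items)  # [(0, 'c'), (1, 'b'), (1, 'a')] -- 'b' now precedes 'a'
```

The only extra memory used is the recursion stack, which is what makes the algorithm in-place.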
Pseudocode

Quicksort(A, s, e)
  if s < e then
    pivot_idx := Partition(A, s, e);
    Quicksort(A, s, pivot_idx – 1);
    Quicksort(A, pivot_idx + 1, e)

Partition(A, s, e)
  x := A[e];
  i := s – 1;
  for j := s to e do
    if A[j] ≤ x then
      i := i + 1;
      A[i] ↔ A[j]
  return i;

(Diagram: Partition splits A[s..e] around the pivot into A[s..pivot_idx – 1] and A[pivot_idx + 1..e].)
Partitioning in Quicksort

• A key step in the Quicksort algorithm is partitioning the array
• We choose some (any) number p in the array to use as a pivot
• We partition the array into three parts:

  numbers less than p | p | numbers greater than or equal to p
Example of partition

initially:      2 5 8 3 9 4 1 7 10 6    note: pivot (x) = 6
next iteration: 2 5 8 3 9 4 1 7 10 6
next iteration: 2 5 8 3 9 4 1 7 10 6
next iteration: 2 5 8 3 9 4 1 7 10 6
next iteration: 2 5 3 8 9 4 1 7 10 6

(i marks the end of the ≤-pivot region; j is the element being examined.)
Example (Continued)

next iteration:  2 5 3 8 9 4 1 7 10 6
next iteration:  2 5 3 4 9 8 1 7 10 6
next iteration:  2 5 3 4 1 8 9 7 10 6
next iteration:  2 5 3 4 1 8 9 7 10 6
next iteration:  2 5 3 4 1 8 9 7 10 6
final iteration: 2 5 3 4 1 6 9 7 10 8    (pivot 6 is in its final position, index 5)
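The trace above can be replayed in code; a direct Python transliteration of the Partition pseudocode (assuming 0-based indices):

```python
def partition(a, s, e):
    """Lomuto partition: x = a[e] is the pivot, i sweeps the <=-pivot region."""
    x = a[e]
    i = s - 1
    for j in range(s, e + 1):
        if a[j] <= x:
            i += 1
            a[i], a[j] = a[j], a[i]
    return i

a = [2, 5, 8, 3, 9, 4, 1, 7, 10, 6]
q = partition(a, 0, len(a) - 1)
print(q, a)  # pivot 6 ends at index 5; everything left of it is <= 6
```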
Complexity of Partition

• Partition Time(n) is given by the number of iterations in the for loop.
• Θ(n), where n = e – s + 1.
Implementation
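A runnable sketch of the full algorithm (Python, following the Quicksort/Partition pseudocode above with 0-based indices; not necessarily the code on the original slide):

```python
def partition(a, s, e):
    x = a[e]                      # pivot: the last element
    i = s - 1                     # boundary of the <=-pivot region
    for j in range(s, e + 1):
        if a[j] <= x:
            i += 1
            a[i], a[j] = a[j], a[i]
    return i                      # final pivot position

def quicksort(a, s=0, e=None):
    if e is None:
        e = len(a) - 1
    if s < e:
        q = partition(a, s, e)
        quicksort(a, s, q - 1)    # sort left of the pivot
        quicksort(a, q + 1, e)    # sort right of the pivot

data = [2, 5, 8, 3, 9, 4, 1, 7, 10, 6]
quicksort(data)
print(data)  # [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
```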
Analysis of quicksort – best case

• Suppose each partition operation divides the array almost exactly in half
• Then the depth of the recursion is log2(n)
  • Because that's how many times we can halve n
• We note that
  • Each partition is linear over its subarray
  • All the partitions at one level cover the array
• Time Complexity
  • O(log2(n)) * O(n) = O(n log2(n))
Partitioning at various levels

N=24
N=12
N=6
N=3
N=1 or 2

log2(24) ≈ 4.58, so about 5 levels
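The level count above can be checked numerically (illustrative Python):

```python
import math

n = 24
depth = math.log2(n)               # how many times n can be halved
levels = math.ceil(depth)          # round up to whole levels
print(round(depth, 2), levels)     # 4.58 -> 5 levels
```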


Worst case

• In the worst case, partitioning always divides the size-n array into these three parts:
  • A length-one part, containing the pivot itself
  • A length-zero part, and
  • A length n–1 part, containing everything else
• We don't recur on the zero-length part
• Recurring on the length n–1 part requires (in the worst case) recurring to depth n–1
• The worst case occurs when the array is already sorted or reverse sorted.
• Time Complexity
  • O(n) * O(n) = O(n²)
Worst case partitioning
Time Complexity

• If the array is sorted to begin with, Quicksort is terrible: O(n²) -> worst case
• However, Quicksort is usually O(n log2(n)) -> average and best case
Space Complexity

• Best and Average Case: there will be at most log(n) recursive calls on the stack at any time -> O(log(n))
• Worst Case: there will be n recursive calls on the stack -> O(n)
Merge sort vs Quicksort

• Both run in O(n log(n))
  • Merge sort – always.
  • Quicksort – on average

• Compared with Quicksort, Merge sort performs fewer comparisons but moves elements more often
  • In Java, an element comparison is expensive but moving elements is cheap. Therefore, Merge sort is used in the standard Java library for generic sorting
  • In C++, copying objects can be expensive while comparing objects is often relatively cheap. Therefore, IntroSort, which uses quicksort, is the sorting routine commonly used in C++ libraries
Thank You
