
Sorting Techniques-II

Concepts

● Sorting Techniques
○ Internal Sorting
■ Quick Sort
■ Radix Sort
○ External Sorting
■ Merge Sort
Questions for this session

We will answer the following questions in this session-

● How can we sort elements in a 1-D array?


Quick Sort

● Divide and Conquer Technique


○ Divide
○ Conquer
○ Combine
● The array to be sorted is recursively split into sub-arrays
● Any array element (first, last, middle, or random) can be chosen as the pivot
● The array is then partitioned around the pivot such that
○ All elements smaller than the pivot are to the left of the pivot
○ All elements greater than the pivot are to the right of the pivot
Example: Quick Sort
Algorithm/Pseudocode: Quick Sort

Algorithm quickSort(arr, low, high)


if (low < high)
    pivotIndex <- partition(arr, low, high)
    quickSort(arr, low, pivotIndex - 1)
    quickSort(arr, pivotIndex + 1, high)
Algorithm/Pseudocode: Quick Sort (Contd…)
Algorithm partition(arr[], low, high) {
    // Last element is chosen as the pivot
    pivot = arr[high];
    i = low - 1;                      // Index of smaller element
    for (j = low; j <= high - 1; j++) {
        // If current element is smaller than the pivot
        if (arr[j] < pivot) {
            i++;                      // Increment index of smaller element
            swap arr[i] and arr[j]
        }
    }
    swap arr[i + 1] and arr[high]     // Place pivot at its final position
    return (i + 1)
}
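A minimal, runnable C sketch of the above pseudocode, using the Lomuto partition with the last element as the pivot. The swap helper and the main() driver are illustrative additions, not part of the session material:

#include <stdio.h>

/* Swap two integers through pointers */
static void swap(int *a, int *b) {
    int t = *a;
    *a = *b;
    *b = t;
}

/* Lomuto partition: places arr[high] (the pivot) at its final
   position and returns that index */
static int partition(int arr[], int low, int high) {
    int pivot = arr[high];
    int i = low - 1;                    /* index of smaller element */
    for (int j = low; j <= high - 1; j++) {
        if (arr[j] < pivot) {           /* current element smaller than pivot */
            i++;
            swap(&arr[i], &arr[j]);
        }
    }
    swap(&arr[i + 1], &arr[high]);      /* put the pivot in place */
    return i + 1;
}

static void quickSort(int arr[], int low, int high) {
    if (low < high) {
        int pivotIndex = partition(arr, low, high);
        quickSort(arr, low, pivotIndex - 1);    /* left of pivot */
        quickSort(arr, pivotIndex + 1, high);   /* right of pivot */
    }
}

int main(void) {
    int arr[] = {10, 80, 30, 90, 40, 50, 70};
    int n = sizeof(arr) / sizeof(arr[0]);
    quickSort(arr, 0, n - 1);
    for (int i = 0; i < n; i++)
        printf("%d ", arr[i]);          /* prints 10 30 40 50 70 80 90 */
    printf("\n");
    return 0;
}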
Example: Quick Sort (Contd…)
arr[] = {10, 80, 30, 90, 40, 50, 70}
Indexes: 0 1 2 3 4 5 6
low = 0, high = 6, pivot = arr[high] = 70
Initialize index of smaller element, i = -1
Traverse elements from j = low to high-1
j = 0 : Since arr[j] < pivot, do i++ and swap(arr[i], arr[j])
i = 0
arr[] = {10, 80, 30, 90, 40, 50, 70} // No change as i and j are the same
j = 1 : Since arr[j] > pivot, do nothing (No change in i and arr[])
j = 2 : Since arr[j] < pivot, do i++ and swap(arr[i], arr[j])
i = 1
arr[] = {10, 30, 80, 90, 40, 50, 70} // We swap 80 and 30
j = 3 : Since arr[j] > pivot, do nothing (No change in i and arr[])
Example: Quick Sort (Contd…)
j = 4 : Since arr[j] < pivot, do i++ and swap(arr[i], arr[j])
i = 2
arr[] = {10, 30, 40, 90, 80, 50, 70} // 80 and 40 swapped
j = 5 : Since arr[j] < pivot, do i++ and swap(arr[i], arr[j])
i = 3
arr[] = {10, 30, 40, 50, 80, 90, 70} // 90 and 50 swapped
We exit the loop because j has now reached high.
Finally, we place the pivot at its correct position by swapping arr[i + 1] and arr[high] (the pivot).
arr[] = {10, 30, 40, 50, 70, 90, 80} // 80 and 70 swapped

Now 70 is at its correct place. All elements smaller than 70 are before it and all elements
greater than 70 are after it.
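The partition step returns index 4 (the final position of 70), so quick sort then recurses on the two sub-arrays on either side of it: quickSort(arr, 0, 3) sorts {10, 30, 40, 50} and quickSort(arr, 5, 6) sorts {90, 80}. The recursion stops whenever low >= high.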
Time Complexity Analysis: Quick Sort

Time Complexity of Quick Sort-

● Best Case Complexity- T(n)= 2T(n/2) + n


● Worst Case Complexity- T(n)= T(n-1) + n
Identify the solution: Quick Sort (10 mins)

Given the recurrence relations, can you


quickly figure out the time complexity for the best-case,
worst-case, and average-case scenarios?

Best Case- O(n log n)


Worst Case- O(n²)
Average Case- O(n log n)
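To see why: in the worst case (for example, an already sorted array with the last element as pivot), the recurrence T(n) = T(n-1) + n expands to n + (n-1) + ... + 1 = n(n+1)/2, which is O(n²). In the best and average cases, T(n) = 2T(n/2) + n gives about log n levels of partitioning with O(n) work per level, which is O(n log n).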
Merge Sort

● Divide and Conquer Algorithm


● Recursively divides the array to be
sorted into two halves
● Merges the sorted halves back to get
the final sorted array
Algorithm/ Pseudocode: Merge Sort

Algorithm Merge_Sort(arr[], l, r)


If r > l
1. Find the middle point to divide the array into two halves:
   middle m = (l + r) / 2
2. Call Merge_Sort for the first half:
   Call Merge_Sort(arr, l, m)
3. Call Merge_Sort for the second half:
   Call Merge_Sort(arr, m + 1, r)
4. Merge the two halves sorted in steps 2 and 3:
   Call merge(arr, l, m, r)
Algorithm/ Pseudocode: Merge Sort (Contd…)
Algorithm merge(arr[], l, m, r)

    int n1 = m - l + 1;
    int n2 = r - m;
    int L[n1], R[n2]; // create temp arrays

    /* Copy data to temp arrays L[] and R[] */
    for (i = 0; i < n1; i++)
        L[i] = arr[l + i];
    for (j = 0; j < n2; j++)
        R[j] = arr[m + 1 + j];

    i = 0; // Initial index of first subarray
    j = 0; // Initial index of second subarray
    k = l; // Initial index of merged subarray

    while (i < n1 && j < n2) {
        if (L[i] <= R[j])
            arr[k++] = L[i++];
        else
            arr[k++] = R[j++];
    }

    /* Copy the remaining elements of L[], if there are any */
    while (i < n1)
        arr[k++] = L[i++];

    /* Copy the remaining elements of R[], if there are any */
    while (j < n2)
        arr[k++] = R[j++];
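A minimal, runnable C sketch following the pseudocode above. The example array, the main() driver, and the use of C99 variable-length arrays for the temporaries are assumptions made for illustration:

#include <stdio.h>

/* Merge the two sorted runs arr[l..m] and arr[m+1..r] */
static void merge(int arr[], int l, int m, int r) {
    int n1 = m - l + 1;
    int n2 = r - m;
    int L[n1], R[n2];                  /* C99 variable-length temp arrays */

    for (int i = 0; i < n1; i++) L[i] = arr[l + i];
    for (int j = 0; j < n2; j++) R[j] = arr[m + 1 + j];

    int i = 0, j = 0, k = l;
    while (i < n1 && j < n2)           /* pick the smaller head each time */
        arr[k++] = (L[i] <= R[j]) ? L[i++] : R[j++];
    while (i < n1) arr[k++] = L[i++];  /* copy any leftovers of L[] */
    while (j < n2) arr[k++] = R[j++];  /* copy any leftovers of R[] */
}

static void mergeSort(int arr[], int l, int r) {
    if (l < r) {
        int m = (l + r) / 2;           /* middle point */
        mergeSort(arr, l, m);          /* sort first half */
        mergeSort(arr, m + 1, r);      /* sort second half */
        merge(arr, l, m, r);           /* merge the sorted halves */
    }
}

int main(void) {
    int arr[] = {38, 27, 43, 3, 9, 82, 10};
    int n = sizeof(arr) / sizeof(arr[0]);
    mergeSort(arr, 0, n - 1);
    for (int i = 0; i < n; i++)
        printf("%d ", arr[i]);         /* prints 3 9 10 27 38 43 82 */
    printf("\n");
    return 0;
}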
Complexity Analysis: Merge Sort

Time Complexity of Merge Sort-

● Recurrence relation in all three cases-

T(n) = 2T(n/2) + n

which solves to O(n log n)
In-Class Activity: Radix Sort
Instructions:

Activity Type- Exploration


Students can divide themselves into groups of 2
Time allotted for this activity is 20 minutes

Question:
Develop and analyze the algorithm for implementing the Radix Sort technique
Solution: Radix Sort
This sorting technique sorts the elements using the following
steps-
● Step 1- Group the digits of the keys by place value, starting
from the least significant digit
● Step 2- Sort the elements on each place value using a stable
counting sort, repeating the pass for every digit (a C sketch is
given after the resource links below)

Refer to the following resources-


● https://www.geeksforgeeks.org/radix-sort/
● https://www.programiz.com/dsa/radix-sort
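A minimal C sketch of one possible LSD radix sort for non-negative integers, using a stable counting sort on each decimal digit as described above. The function names and the example array are illustrative assumptions, not taken from the session material:

#include <stdio.h>

/* Stable counting sort of arr[] by the decimal digit at place value exp */
static void countingSortByDigit(int arr[], int n, int exp) {
    int output[n];
    int count[10] = {0};

    for (int i = 0; i < n; i++)                  /* count digit occurrences */
        count[(arr[i] / exp) % 10]++;
    for (int d = 1; d < 10; d++)                 /* prefix sums -> positions */
        count[d] += count[d - 1];
    for (int i = n - 1; i >= 0; i--)             /* build output stably */
        output[--count[(arr[i] / exp) % 10]] = arr[i];
    for (int i = 0; i < n; i++)
        arr[i] = output[i];
}

/* LSD radix sort for non-negative integers */
static void radixSort(int arr[], int n) {
    int max = arr[0];
    for (int i = 1; i < n; i++)
        if (arr[i] > max) max = arr[i];
    for (int exp = 1; max / exp > 0; exp *= 10)  /* one pass per digit */
        countingSortByDigit(arr, n, exp);
}

int main(void) {
    int arr[] = {170, 45, 75, 90, 802, 24, 2, 66};
    int n = sizeof(arr) / sizeof(arr[0]);
    radixSort(arr, n);
    for (int i = 0; i < n; i++)
        printf("%d ", arr[i]);                   /* 2 24 45 66 75 90 170 802 */
    printf("\n");
    return 0;
}

Each counting-sort pass runs in O(n + 10), and there is one pass per digit of the largest key, so the overall running time is O(d·n) for d-digit keys.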
Learning Outcomes
In this session, you have learnt to:

1. Explain the algorithms for internal sorting techniques

2. Describe the algorithm to implement external sorting technique

3. Design and implement the appropriate sorting technique for developing solutions to real-world
problems and applications in C

4. Analyze the complexity of sorting algorithms

5. Sort a list of elements by deciding on and choosing the appropriate sorting
algorithm

Go through the following learning resources on the platform

● Sorting Techniques-II
Q&A

If you have more questions, please post them in the community on


the platform.
What Next?

In the next session the following concepts will be covered

● Lab Project- Movie Ratings Manager

Go through the following learning resources on the platform

● Sorting Techniques-II
● Searching Algorithms
