Data Structures Unit 2 SPJ Sorting
SWATI JAGTAP
Syllabus
Unit II Searching and Sorting Algorithms (06 Hrs)
Implementation
Simple implementations are relatively slow: O(n²)
More complex implementations are more efficient: O(n log n)
Sorting Terminology
2) External sorting
External sorting is required when the amount of data is too large to fit in main memory, so the data is stored on secondary devices (external memory).
When sorting is carried out on data lying on secondary devices, it is called external sorting.
Advantage:
No data is lost on power off, since the data resides on secondary storage.
Disadvantage:
More complex, and less efficient, because comparisons involve data on secondary devices.
Stability in sorting algorithms
A sorting algorithm is said to be stable if two objects with equal keys appear in the sorted output in the same order as they appear in the input array. In other words, after sorting, identical elements appear in the same sequence as in the original list.
Fig a) Original List             Fig b) Sorted List (Stable)
Name   Subject    Marks          Name   Subject    Marks
Amit   Chemistry  74             Mohan  Physics    65
Mohan  Physics    65             Mohan  Chemistry  68
Sohan  Chemistry  70             Sohan  Chemistry  70
Mohan  Chemistry  68             Amit   Chemistry  74
Sohan  Physics    75             Sohan  Physics    75

Note that the two Mohan records and the two Sohan records retain their original relative order in the sorted list.
Time Complexities of all Sorting Algorithms

Algorithm        Best         Average      Worst
Bubble Sort      O(n²)        O(n²)        O(n²)
Insertion Sort   O(n)         O(n²)        O(n²)
Selection Sort   O(n²)        O(n²)        O(n²)
Merge Sort       O(n log n)   O(n log n)   O(n log n)
Quick Sort       O(n log n)   O(n log n)   O(n²)
Sorting
Sorting rearranges a[0], a[1], …, a[n-1] into ascending order. When done:
a[0] <= a[1] <= … <= a[n-1]
Example: 8, 6, 9, 4, 3 => 3, 4, 6, 8, 9
General Sorting
Assumptions
data is stored in a linear data structure
a comparator is available for the elements
a swap routine (or shift) is available
no prior knowledge about the data values
General Sorting
Swap
temp = a;
a = b;
b = temp;
Passes:
While sorting the elements into a specific order, the elements are rearranged many times. Each phase in which the elements move toward their proper position is called a pass.
Bubble Sort
Slowest of the simple sorts, but one of the most popular.
It is called bubble sort because, during the comparisons, values gradually "bubble" up to their proper position in the array.
Takes multiple passes over the array.
Swaps adjacent elements when their values are out of order.
Invariant: each pass guarantees that the largest remaining element is placed in its correct position at the end of the unsorted portion.
Bubble Sort trace (figures omitted):
Pass 1 – start with the unsorted array; adjacent pairs are compared and swapped when out of order, until the largest element, 99, reaches its final position.
Pass 2 – swap at (0, 1), no swap, no swap, swap at (3, 4); 21 reaches its position.
Pass 3 – the remaining elements settle into their positions.
#include <stdio.h>

// Swap the values pointed to by xp and yp
void swap(int *xp, int *yp)
{
    int temp = *xp;
    *xp = *yp;
    *yp = temp;
}

// A function to implement bubble sort
void bubbleSort(int arr[], int n)
{
    int i, j;
    for (i = 0; i < n - 1; i++)
        // Last i elements are already in place
        for (j = 0; j < n - i - 1; j++)
            if (arr[j] > arr[j + 1])
                swap(&arr[j], &arr[j + 1]);
}
#Compares
Two nested for loops give (n-1) + (n-2) + (n-3) + … + 2 + 1 comparisons.
Sum = n(n-1)/2, i.e. O(n²).
#Swaps
The swap sits inside a conditional, so the number of swaps is data dependent.
Best case: 0 swaps, i.e. O(1).
Worst case: (n-1) + (n-2) + (n-3) + … + 1, i.e. O(n²).
Space
Only the array itself plus one temporary variable is used, so bubble sort is an in-place algorithm.
Space Complexity: O(1)
Sorting Methods
Bubble
Insertion
Selection
Merge
Quick Sort
Insertion Sort
Algorithm
To sort an array of size n in ascending order:
1: Iterate from arr[1] to arr[n-1] over the array.
2: Compare the current element (key) to its predecessor.
3: If the key element is smaller than its predecessor, compare it to the elements before it. Move the greater elements one position up to make space for the key.
Insertion Sort: Example
Selection Sort
https://www.youtube.com/watch?v=R_f3PJtRqUQ
Selection sort is a simple, in-place, comparison-based sorting algorithm in which the list is divided into two parts: the sorted part at the left end and the unsorted part at the right end. Initially, the sorted part is empty and the unsorted part is the entire list.
The smallest element is selected from the unsorted part and swapped with its leftmost element, after which that element becomes part of the sorted part. The process continues, moving the boundary of the unsorted part one element to the right.
This algorithm is not suitable for large data sets, as its average and worst case complexities are O(n²), where n is the number of items.
Selection Sort
Algorithm
Step 1 − Set MIN to location 0
Step 2 − Search the minimum element in the list
Step 3 − Swap with value at location MIN
Step 4 − Increment MIN to point to next element
Step 5 − Repeat until list is sorted
Selection Sort: Example
Merge Sort
https://www.youtube.com/watch?v=4VqmGXwpLqc&t=24s
➢ Merge Sort is a Divide and Conquer algorithm. It divides the input array into two halves, calls itself for the two halves, and then merges the two sorted halves.
https://www.youtube.com/watch?v=cAv-4ltj1go

/* Merges the two sorted subarrays arr[l..m] and arr[m+1..r] */
void merge(int arr[], int l, int m, int r)
{
    int i, j, k;
    int n1 = m - l + 1;
    int n2 = r - m;
    int L[n1], R[n2];           /* temporary arrays */
    for (i = 0; i < n1; i++)
        L[i] = arr[l + i];
    for (j = 0; j < n2; j++)
        R[j] = arr[m + 1 + j];
    /* Merge the temp arrays back into arr[l..r] */
    i = 0;
    j = 0;
    k = l;
    while (i < n1 && j < n2) {
        if (L[i] <= R[j]) {
            arr[k] = L[i];
            i++;
        }
        else {
            arr[k] = R[j];
            j++;
        }
        k++;
    }
    /* Copy the remaining elements of L[], if there are any */
    while (i < n1) {
        arr[k] = L[i];
        i++;
        k++;
    }
    /* Copy the remaining elements of R[], if there are any */
    while (j < n2) {
        arr[k] = R[j];
        j++;
        k++;
    }
}

/* l is for left index and r is right index of the
   sub-array of arr to be sorted */
void mergeSort(int arr[], int l, int r)
{
    if (l < r) {
        // Same as (l+r)/2, but avoids overflow for large l and r
        int m = l + (r - l) / 2;
        mergeSort(arr, l, m);
        mergeSort(arr, m + 1, r);
        merge(arr, l, m, r);
    }
}

/* Utility function to print an array of given size */
void printArray(int A[], int size)
{
    int i;
    for (i = 0; i < size; i++)
        printf("%d ", A[i]);
    printf("\n");
}

Example trace: for arr = {9, 7, 3, 6, 2} with indices 0..4, the first call mergeSort(arr, 0, 4) computes m = 0 + (4 - 0) / 2 = 2; recursing on the left half (l = 0, r = 2) gives m = 0 + (2 - 0) / 2 = 1.
Time complexity comparison: bubble, selection, and insertion sort run in O(n²) time, while merge sort runs in O(n log n) time.
Applications of Merge Sort
Merge Sort is useful for sorting linked lists in O(n log n) time. The case of linked lists is different mainly because of the difference in memory allocation between arrays and linked lists: unlike array elements, linked list nodes may not be adjacent in memory. On the other hand, unlike in an array, we can insert an item in the middle of a linked list in O(1) time with O(1) extra space. Therefore, the merge operation of merge sort can be implemented for linked lists without extra space.
➢ Inversion Count Problem
1. The inversion count of an array indicates how far (or close) the array is from being sorted. If the array is already sorted, the inversion count is 0; if the array is sorted in reverse order, the inversion count is the maximum possible.
➢ Used in External Sorting
Algorithm Analysis
Quick sort
Like Merge Sort, QuickSort is a Divide and Conquer algorithm. It picks an element as pivot and
partitions the given array around the picked pivot. There are many different versions of
quickSort that pick pivot in different ways.
1. Always pick first element as pivot.
2. Always pick last element as pivot (implemented below)
3. Pick a random element as pivot.
4. Pick median as pivot.
The key process in quickSort is partition(). The target of partition() is: given an array and an element x of the array as pivot, put x at its correct position in the sorted array, put all elements smaller than x before x, and put all elements greater than x after x. All this should be done in linear time.
https://www.youtube.com/watch?v=PgBzjlCcFvc
How QuickSort Works?
The array is partitioned around a pivot into sub-parts; the sub-parts are again divided into smaller sub-parts until each sub-part consists of a single element.
At this point, the array is already sorted.
#include<stdio.h>
// A utility function to swap two elements
void swap(int* a, int* b)
{
int t = *a;
*a = *b;
*b = t;
}
/* This function takes last element as pivot, places
the pivot element at its correct position in sorted
array, and places all smaller (smaller than pivot)
to left of pivot and all greater elements to right
of pivot */
int partition (int arr[], int low, int high)
{
int pivot = arr[high]; // pivot
int i = (low - 1); // Index of smaller element