
Assignment 1

Sorting Algorithm Analysis


Chris Gabriel Alisasis, 21-01371

Date: Jan 28, 2023

Objective:

The objective of this assignment is to gain hands-on experience in analyzing the time and space complexity of
sorting algorithms. You will implement and compare the performance of different sorting algorithms, and
analyze their time and space complexity using asymptotic notation.

Instructions:

● Implement the following sorting algorithms: Bubble sort, insertion sort, selection sort, merge sort, and
quicksort.
● Write a program that will generate a random list of integers of varying sizes (e.g. 1000, 10000,
100000) and sort them using each of the above algorithms.
● Measure the execution time of each algorithm for each list size and create a graph to illustrate the
results.
● Analyze the time and space complexity of each algorithm using asymptotic notation and explain the
results.
● Write a report that includes your implementation, results, and analysis.

Random List generation

Python

import random

random_list = []
n = 10

for i in range(n):
    random_list.append(random.randint(1, 1000))
print(random_list)
OUTPUT:
[148, 697, 883, 921, 677, 794, 795, 978, 658, 786]

This snippet generates a list of 10 random integers between 1 and 1000. Lists like this will be used
as test inputs for the following sorting algorithms.
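The instructions also call for measuring execution time for each list size. A minimal sketch of such a harness, using `time.perf_counter`; the helper name `time_sort` is hypothetical, and Python's built-in `sorted` is used here as a stand-in for the algorithms implemented below:

```python
import random
import time

def time_sort(sort_fn, n, low=1, high=1000):
    """Generate a random list of length n and time sort_fn on it."""
    data = [random.randint(low, high) for _ in range(n)]
    start = time.perf_counter()
    result = sort_fn(data)
    elapsed = time.perf_counter() - start
    return result, elapsed

# Built-in sorted as a stand-in; substitute bubble_sort, insertion_sort, etc.
for n in (1000, 10000, 100000):
    result, elapsed = time_sort(sorted, n)
    print(f"n={n:>6}: {elapsed:.6f} s")
```

The timings printed for each size can then be plotted to produce the graph the instructions ask for.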

Bubble sort

Python
# Bubble sort function
def bubble_sort(arr1):
    # Outer loop to traverse the entire list
    for i in range(0, len(arr1) - 1):
        for j in range(len(arr1) - 1):
            if arr1[j] > arr1[j + 1]:
                temp = arr1[j]
                arr1[j] = arr1[j + 1]
                arr1[j + 1] = temp
    return arr1

arr1 = [148, 697, 883, 921, 677, 794, 795, 978, 658, 786]
print("Bubble Sorted Array:", bubble_sort(arr1))

OUTPUT:
Bubble Sorted Array: [148, 658, 677, 697, 786, 794, 795, 883,
921, 978]

Analysis:

Based on my analysis, bubble sort arranges a set of values in ascending order by repeatedly
swapping adjacent out-of-order pairs, much like calling students in alphabetical order during
classroom attendance. Its time complexity is O(n²) in the average and worst cases and O(n) in the
best case (for the optimized variant that stops early), and its space complexity is O(1). The
number of swaps bubble sort performs equals the number of inversion pairs in the given array.
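The O(n) best case comes from an optimized variant that stops as soon as a full pass makes no swaps. A sketch of that variant (the name `bubble_sort_optimized` is added here for illustration):

```python
def bubble_sort_optimized(arr):
    """Bubble sort with an early-exit flag: O(n) on already-sorted input."""
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:  # no swaps in this pass => list is already sorted
            break
    return arr

print(bubble_sort_optimized([148, 697, 883, 921, 677, 794, 795, 978, 658, 786]))
# → [148, 658, 677, 697, 786, 794, 795, 883, 921, 978]
```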

Insertion sort

Python
# Insertion sort function
def insertion_sort(arr2):
    # Outer loop to traverse indices 1 to len(arr2) - 1
    for i in range(1, len(arr2)):
        value = arr2[i]
        # Move elements of arr2[0..i-1] that are greater than
        # value one position ahead of their current position
        j = i - 1
        while j >= 0 and value < arr2[j]:
            arr2[j + 1] = arr2[j]
            j -= 1
        arr2[j + 1] = value
    return arr2

arr2 = [148, 697, 883, 921, 677, 794, 795, 978, 658, 786]
print("Insertion Array:", insertion_sort(arr2))

OUTPUT:
Insertion Array: [148, 658, 677, 697, 786, 794, 795, 883, 921,
978]
Analysis:

Based on my analysis, insertion sort is, like bubble sort, easy to implement and understand. Unlike
bubble sort, it builds the sorted list one element at a time by comparing each item with the
already-sorted portion and inserting it into its correct position. Insertion sort is a stable
algorithm with a time complexity of O(n²) in the average and worst cases and O(n) in the best case.

Its space complexity is O(1), since only a single extra variable (`value` in the code above) is
used. Insertion sort works well when the array has a small number of elements or when only a few
elements remain to be sorted.
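The adaptive behaviour, O(n) on sorted input versus O(n²) on reversed input, can be made concrete by counting element shifts. A sketch (the `insertion_sort_count` helper is hypothetical, added for illustration):

```python
def insertion_sort_count(arr):
    """Insertion sort that also counts how many element shifts it performs."""
    shifts = 0
    for i in range(1, len(arr)):
        value = arr[i]
        j = i - 1
        while j >= 0 and value < arr[j]:
            arr[j + 1] = arr[j]
            shifts += 1
            j -= 1
        arr[j + 1] = value
    return arr, shifts

_, best = insertion_sort_count(list(range(10)))          # already sorted
_, worst = insertion_sort_count(list(range(10, 0, -1)))  # reversed
print(best, worst)  # 0 shifts vs n*(n-1)/2 = 45 shifts
```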

Selection sort

Python
# Defining a selection sort function
def selection_sort(arr3):
    length = len(arr3)
    for i in range(length - 1):
        minIndex = i
        for j in range(i + 1, length):
            if arr3[j] < arr3[minIndex]:
                minIndex = j
        arr3[i], arr3[minIndex] = arr3[minIndex], arr3[i]
    return arr3

arr3 = [148, 697, 883, 921, 677, 794, 795, 978, 658, 786]
print("Selection Sort Array:", selection_sort(arr3))

OUTPUT:
Selection Sort Array: [148, 658, 677, 697, 786, 794, 795, 883,
921, 978]

Analysis:

Based on what I learned, selection sort repeatedly selects the smallest element from the remaining
unsorted elements and exchanges it with the element at the current position. It is mainly used when
the list is small, when the cost of swapping does not matter, when checking all the elements is
unavoidable, or when the cost of writing to memory matters, as in flash memory (the number of
writes/swaps is O(n), compared to O(n²) for bubble sort).

As for the time complexity of selection sort, the worst case occurs, for example, when we want to
sort in ascending order and the array is in descending order. The best case is when the array is
already sorted, and the average case is when the elements are in jumbled order (neither ascending
nor descending); in all three cases selection sort performs O(n²) comparisons.
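The claim that selection sort performs only O(n) writes, versus O(n²) for bubble sort, can be checked by counting swaps. A sketch (the `selection_sort_count` helper is hypothetical, added for illustration):

```python
def selection_sort_count(arr):
    """Selection sort that counts swaps: at most n - 1 regardless of input."""
    swaps = 0
    n = len(arr)
    for i in range(n - 1):
        min_index = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_index]:
                min_index = j
        if min_index != i:  # only write when a swap is actually needed
            arr[i], arr[min_index] = arr[min_index], arr[i]
            swaps += 1
    return arr, swaps

_, swaps = selection_sort_count(list(range(100, 0, -1)))  # worst-case order
print(swaps)  # never more than 99 swaps for 100 elements
```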

Merge sort

Python
def merge_sort(list1, left_index, right_index):
    if left_index >= right_index:
        return

    middle = (left_index + right_index) // 2
    merge_sort(list1, left_index, middle)
    merge_sort(list1, middle + 1, right_index)
    merge(list1, left_index, right_index, middle)

# Function to merge the two sorted halves
def merge(list1, left_index, right_index, middle):
    # Copies of the two halves of the list
    left_sublist = list1[left_index:middle + 1]
    right_sublist = list1[middle + 1:right_index + 1]

    # Initial values for the variables that track where
    # we are in each sublist
    left_sublist_index = 0
    right_sublist_index = 0
    sorted_index = left_index

    # Traverse both copies until we run out of elements in one
    while (left_sublist_index < len(left_sublist)
           and right_sublist_index < len(right_sublist)):
        # If the left sublist has the smaller element, put it in the
        # sorted part and move forward in the left sublist
        if left_sublist[left_sublist_index] <= right_sublist[right_sublist_index]:
            list1[sorted_index] = left_sublist[left_sublist_index]
            left_sublist_index = left_sublist_index + 1
        # Otherwise take the element from the right sublist
        else:
            list1[sorted_index] = right_sublist[right_sublist_index]
            right_sublist_index = right_sublist_index + 1
        # Move forward in the sorted part
        sorted_index = sorted_index + 1

    # Go through the remaining elements and add them
    while left_sublist_index < len(left_sublist):
        list1[sorted_index] = left_sublist[left_sublist_index]
        left_sublist_index = left_sublist_index + 1
        sorted_index = sorted_index + 1
    while right_sublist_index < len(right_sublist):
        list1[sorted_index] = right_sublist[right_sublist_index]
        right_sublist_index = right_sublist_index + 1
        sorted_index = sorted_index + 1

list1 = [148, 697, 883, 921, 677, 794, 795, 978, 658, 786]
merge_sort(list1, 0, len(list1) - 1)
print("Merge Sort Array:", list1)

OUTPUT:
Merge Sort Array: [148, 658, 677, 697, 786, 794, 795, 883, 921,
978]

Analysis:

Merge sort is a divide-and-conquer algorithm. It works by dividing an array into smaller subarrays,
sorting each subarray, and then merging the sorted subarrays back together to form the final sorted
array. Concretely: first find the middle index of the array, then divide the array at the middle,
then call merge sort recursively on the first and second halves, and finally merge the two sorted
halves into a single sorted array.

Time Complexity

Best Case Complexity: O(n log n)

Worst Case Complexity: O(n log n)

Average Case Complexity: O(n log n)

The space complexity of merge sort is O(n). Merge sort is used for the inversion count problem, in
e-commerce applications, and for external sorting.
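The "inversion count problem" means counting pairs (i, j) with i < j and arr[i] > arr[j]. Merge sort solves it in O(n log n): at each merge, whenever an element is taken from the right half, every element still pending in the left half forms an inversion with it. A sketch under that idea (the `count_inversions` helper is added here for illustration):

```python
def count_inversions(arr):
    """Return (sorted_copy, inversion_count) via merge sort, O(n log n)."""
    if len(arr) <= 1:
        return arr[:], 0
    mid = len(arr) // 2
    left, inv_left = count_inversions(arr[:mid])
    right, inv_right = count_inversions(arr[mid:])
    merged, inv_split = [], 0
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            # right[j] jumps ahead of every remaining left element
            inv_split += len(left) - i
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, inv_left + inv_right + inv_split

print(count_inversions([148, 697, 883, 921, 677, 794, 795, 978, 658, 786]))
```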
Quicksort

Python
def QuickSort(arr5):
    elements = len(arr5)
    if elements < 2:
        return arr5

    # Position of the partitioning element
    current_position = 0

    # Partitioning loop
    for i in range(1, elements):
        if arr5[i] <= arr5[0]:
            current_position += 1
            temp = arr5[i]
            arr5[i] = arr5[current_position]
            arr5[current_position] = temp

    # Brings the pivot to its appropriate position
    temp = arr5[0]
    arr5[0] = arr5[current_position]
    arr5[current_position] = temp

    # Sorts the elements to the left of the pivot
    left = QuickSort(arr5[0:current_position])

    # Sorts the elements to the right of the pivot
    right = QuickSort(arr5[current_position + 1:elements])

    # Merge everything together
    arr5 = left + [arr5[current_position]] + right
    return arr5

array_to_be_sorted = [148, 697, 883, 921, 677, 794, 795, 978, 658, 786]
print("Original Array:", array_to_be_sorted)
print("Quick Sorted Array:", QuickSort(array_to_be_sorted))

OUTPUT:
Original Array: [148, 697, 883, 921, 677, 794, 795, 978, 658,
786]
Quick Sorted Array: [148, 658, 677, 697, 786, 794, 795, 883,
921, 978]

Analysis:

Like merge sort, quicksort is a divide-and-conquer algorithm. It partitions the array around a
pivot value, placing elements less than or equal to the pivot before it and greater elements after
it, and then recursively sorts the two sub-arrays. Quicksort is used when the programming language
handles recursion well and when both time and space complexity matter.

Time and Space Complexities of the Quicksort Algorithm

- Worst Case Complexity, O(n²): occurs when the pivot element picked is either the greatest or
the smallest element, so the pivot ends up at an extreme end of the sorted array. One sub-array is
then always empty while the other contains n - 1 elements, and quicksort recurses only on that
sub-array.
- Best Case Complexity, O(n log n): occurs when the pivot element is always the middle element or
close to the middle element.
- Average Case Complexity, O(n log n): occurs when the above conditions do not hold.
- The space complexity of quicksort is O(log n), for the recursion stack.
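One common way to avoid the O(n²) worst case on already-sorted input is to pick the pivot at random instead of always using the first element, as the implementation above does. A sketch of that variant (not the author's implementation; `quicksort_random` is added for illustration):

```python
import random

def quicksort_random(arr):
    """Quicksort with a random pivot; expected O(n log n) on any input order."""
    if len(arr) < 2:
        return arr[:]
    pivot = arr[random.randrange(len(arr))]
    less = [x for x in arr if x < pivot]
    equal = [x for x in arr if x == pivot]
    greater = [x for x in arr if x > pivot]
    return quicksort_random(less) + equal + quicksort_random(greater)

# A sorted or reverse-sorted input no longer triggers the worst case
print(quicksort_random([148, 697, 883, 921, 677, 794, 795, 978, 658, 786]))
```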
Comparison
Bubble Sort
Advantage:
- It is the simplest sorting approach.
- With the optimized early-exit approach, the best case is O(N): an already-sorted array is
detected in the first pass.
- Stable sort: does not change the relative order of elements with equal keys.
- In-place sort.
Disadvantage:
- Bubble sort is a comparatively slow algorithm.
- Poor efficiency for large arrays.

Insertion sort
Advantage:
- It is easy to implement.
- Best case complexity is O(N) when the array is already sorted.
- It performs fewer swaps than bubble sort.
- For small values of N, insertion sort performs as efficiently as other quadratic sorting
algorithms.
- Stable sort.
- Adaptive: the total number of steps is reduced for partially sorted arrays.
- In-place sort.
Disadvantage:
- It is generally used only when N is small; for larger values of N it is inefficient.
- Like selection sort, it requires on the order of n² steps to sort n elements.

Selection Sort
Advantage:
- It can also be used on list structures that make adding and removing elements efficient, such as
a linked list: just remove the smallest element of the unsorted part and append it to the end of
the sorted part.
- The number of swaps is reduced: O(N) swaps in all cases.
- In-place sort.
Disadvantage:
- Time complexity in all cases is O(N²); there is no faster best-case scenario.
- It requires on the order of n² steps to sort n elements.
- It does not scale well.

Comparing merge sort to bubble sort

- An advantage of merge sort over bubble sort is that it is much faster, so it takes less time to
sort large lists and lists that are heavily unordered. However, bubble sort can actually be
quicker than merge sort on small lists and lists that are mostly in order.

Quicksort has good performance in the average case and good space complexity (O(log n)), so it is
usually chosen over merge sort (which needs O(n) extra space) because of its more efficient use of
memory. However, its space complexity is still not as good as that of bubble or insertion sort
(O(1)), and with very large data sets this could be a reason to favour a different approach.
