Ass1 21-01371
Objective:
The objective of this assignment is to gain hands-on experience in analyzing the time and space complexity of
sorting algorithms. You will implement and compare the performance of different sorting algorithms, and
analyze their time and space complexity using asymptotic notation.
Instructions:
● Implement the following sorting algorithms: Bubble sort, insertion sort, selection sort, merge sort, and
quicksort.
● Write a program that will generate a random list of integers of varying sizes (e.g. 1000, 10000,
100000) and sort them using each of the above algorithms.
● Measure the execution time of each algorithm for each list size and create a graph to illustrate the
results.
● Analyze the time and space complexity of each algorithm using asymptotic notation and explain the
results.
● Write a report that includes your implementation, results, and analysis.
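The time-measurement step above can be sketched with `time.perf_counter`; the harness below is an illustrative outline (the `time_algorithm` helper and the small sizes are assumptions for demonstration, not part of the assignment code), with bubble sort standing in for any of the five algorithms.

```python
import random
import time

def bubble_sort(arr):
    # Simple O(n^2) reference implementation used only for timing here
    n = len(arr)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr

def time_algorithm(sort_fn, size):
    # Build a fresh random list, sort it, and return the elapsed seconds
    data = [random.randint(1, 1000) for _ in range(size)]
    start = time.perf_counter()
    sort_fn(data)
    return time.perf_counter() - start

for size in (1000, 2000):
    print(size, time_algorithm(bubble_sort, size))
```

The measured times for each size can then be plotted (e.g. with matplotlib) to produce the required graph.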
Python
import random

random_list = []
n = 10
for i in range(n):
    random_list.append(random.randint(1, 1000))
print(random_list)
OUTPUT:
[148, 697, 883, 921, 677, 794, 795, 978, 658, 786]
This snippet generates a list of 10 random integers between 1 and 1000. Lists like this will be used as test inputs for the following sorting algorithms.
Bubble sort
Python
#Bubble sort function
def bubble_sort(arr1):
    n = len(arr1)
    for i in range(n - 1):
        # After each pass the largest remaining value "bubbles" to the end
        for j in range(n - 1 - i):
            if arr1[j] > arr1[j + 1]:
                arr1[j], arr1[j + 1] = arr1[j + 1], arr1[j]
    return arr1

arr1 = [148, 697, 883, 921, 677, 794, 795, 978, 658, 786]
print("Bubble Sorted Array:", bubble_sort(arr1))
OUTPUT:
Bubble Sorted Array: [148, 658, 677, 697, 786, 794, 795, 883,
921, 978]
Analysis:
Based on my analysis, the bubble sort algorithm arranges a set of values in ascending order by repeatedly swapping adjacent out-of-order elements, much like lining students up alphabetically for attendance. It has a time complexity of O(n²) in the average and worst cases and O(n) in the best case (an already-sorted array), and a space complexity of O(1). The number of swaps bubble sort performs equals the number of inversion pairs in the given array.
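The claim that the number of swaps equals the number of inversion pairs can be checked directly. The two counting helpers below are illustrative, not part of the assignment code: every swap of an adjacent out-of-order pair removes exactly one inversion, so the counts must agree.

```python
# Count the swaps bubble sort performs (works on a copy)
def bubble_sort_swaps(arr):
    arr = list(arr)
    swaps = 0
    for i in range(len(arr) - 1):
        for j in range(len(arr) - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swaps += 1
    return swaps

# Count inversion pairs (i < j but arr[i] > arr[j]) by brute force
def count_inversions(arr):
    return sum(
        1
        for i in range(len(arr))
        for j in range(i + 1, len(arr))
        if arr[i] > arr[j]
    )

sample = [148, 697, 883, 921, 677, 794, 795, 978, 658, 786]
print(bubble_sort_swaps(sample), count_inversions(sample))  # the two counts match
```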
Insertion sort
Python
# Insertion Sort function
def insertion_sort(arr2):
    for i in range(1, len(arr2)):
        value = arr2[i]
        j = i - 1
        # Shift larger elements right, then insert value into place
        while j >= 0 and arr2[j] > value:
            arr2[j + 1] = arr2[j]
            j -= 1
        arr2[j + 1] = value
    return arr2

arr2 = [148, 697, 883, 921, 677, 794, 795, 978, 658, 786]
print("Insertion Array:", insertion_sort(arr2))
OUTPUT:
Insertion Array: [148, 658, 677, 697, 786, 794, 795, 883, 921,
978]
Analysis:
Based on my analysis, insertion sort, like bubble sort, is easy to implement and understand. Unlike bubble sort, it builds the sorted list one element at a time, comparing each item with the already-sorted portion and inserting it into its correct position. Insertion sort is a stable algorithm with a time complexity of O(n²) in the average and worst cases and O(n) in the best case. Its space complexity is O(1), since only one extra variable (the key being inserted) is used. Insertion sort is used when the array has a small number of elements and when only a few elements remain to be sorted.
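The O(n) best case can be observed by counting comparisons. The instrumented version below is an illustrative helper (not part of the assignment code): a sorted input of length n needs only n − 1 comparisons, while a reversed input needs n(n − 1)/2.

```python
# Insertion sort instrumented to count comparisons (works on a copy)
def insertion_sort_comparisons(arr):
    arr = list(arr)
    comparisons = 0
    for i in range(1, len(arr)):
        value = arr[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if arr[j] > value:
                arr[j + 1] = arr[j]  # shift the larger element right
                j -= 1
            else:
                break
        arr[j + 1] = value
    return comparisons

n = 10
print(insertion_sort_comparisons(list(range(n))))         # sorted input: 9 comparisons
print(insertion_sort_comparisons(list(range(n, 0, -1))))  # reversed input: 45 comparisons
```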
Selection sort
Python
#Defining a selection sort function
def selection_sort(arr3):
    length = len(arr3)
    for i in range(length-1):
        minIndex = i
        # Find the smallest element in the unsorted part
        for j in range(i + 1, length):
            if arr3[j] < arr3[minIndex]:
                minIndex = j
        arr3[i], arr3[minIndex] = arr3[minIndex], arr3[i]
    return arr3

arr3 = [148, 697, 883, 921, 677, 794, 795, 978, 658, 786]
print("Selection Sort Array:", selection_sort(arr3))
OUTPUT:
Selection Sort Array: [148, 658, 677, 697, 786, 794, 795, 883,
921, 978]
Analysis:
Based on what I learned, the selection sort algorithm repeatedly selects the smallest element from the remaining unsorted elements and exchanges it with the element at the current position. It is mainly used when the list of elements is small, when the cost of swapping does not matter, when checking every element is compulsory anyway, or when the cost of writing to memory matters, as in flash memory (its number of writes/swaps is O(n), compared with O(n²) for bubble sort).
Selection sort has a worst-case time complexity of O(n²); the worst case occurs, for example, when we want to sort in ascending order and the array is in descending order. The best case occurs when the array is already sorted, and the average case when the elements are in jumbled order (neither ascending nor descending), though all three cases still require O(n²) comparisons.
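The O(n) bound on writes can be demonstrated by counting swaps. This instrumented version is an illustrative helper (not part of the assignment code): even on a fully reversed list of length n, selection sort performs at most n − 1 swaps.

```python
# Selection sort instrumented to count swaps/writes (works on a copy)
def selection_sort_swaps(arr):
    arr = list(arr)
    swaps = 0
    for i in range(len(arr) - 1):
        min_index = i
        for j in range(i + 1, len(arr)):
            if arr[j] < arr[min_index]:
                min_index = j
        if min_index != i:  # swap only when a smaller element was found
            arr[i], arr[min_index] = arr[min_index], arr[i]
            swaps += 1
    return swaps

n = 10
print(selection_sort_swaps(list(range(n, 0, -1))))  # reversed list: at most n - 1 swaps
print(selection_sort_swaps(list(range(n))))         # sorted list: 0 swaps
```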
Merge sort
Python
def merge_sort(list1, left_index, right_index):
    if left_index >= right_index:
        return
    middle = (left_index + right_index) // 2
    merge_sort(list1, left_index, middle)
    merge_sort(list1, middle + 1, right_index)
    # Subparts of the list
    left_sublist = list1[left_index:middle + 1]
    right_sublist = list1[middle + 1:right_index + 1]
    # Initial values for variables that we use to keep
    # track of where we are in each sublist
    left_sublist_index = 0
    right_sublist_index = 0
    sorted_index = left_index
    # Merge the two sorted sublists back into list1
    while left_sublist_index < len(left_sublist) or right_sublist_index < len(right_sublist):
        take_left = right_sublist_index >= len(right_sublist) or (
            left_sublist_index < len(left_sublist)
            and left_sublist[left_sublist_index] <= right_sublist[right_sublist_index]
        )
        if take_left:
            list1[sorted_index] = left_sublist[left_sublist_index]
            left_sublist_index += 1
        else:
            list1[sorted_index] = right_sublist[right_sublist_index]
            right_sublist_index += 1
        sorted_index += 1

list1 = [148, 697, 883, 921, 677, 794, 795, 978, 658, 786]
merge_sort(list1, 0, len(list1) - 1)
print("Merge Sort Array:", list1)
OUTPUT:
Merge Sort Array: [148, 658, 677, 697, 786, 794, 795, 883, 921,
978]
Analysis:
Merge sort is a divide-and-conquer algorithm. It works by dividing an array into smaller subarrays, sorting each subarray, and then merging the sorted subarrays back together to form the final sorted array: first find the middle index of the array, then divide the array at the middle, then call merge sort recursively on the first and second halves, and finally merge the two sorted halves into a single sorted array.
Its time complexity is O(n log n) in the best, average, and worst cases, and its space complexity is O(n). Merge sort is applied to the inversion-count problem, in e-commerce applications, and in external sorting.
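The inversion-count application mentioned above can be sketched by counting inversions during the merge step: whenever an element is taken from the right half, every element still remaining in the left half forms one inversion with it. The `sort_and_count` name below is illustrative.

```python
# Merge sort that also counts inversion pairs; returns (sorted list, count)
def sort_and_count(lst):
    if len(lst) <= 1:
        return lst, 0
    mid = len(lst) // 2
    left, left_inv = sort_and_count(lst[:mid])
    right, right_inv = sort_and_count(lst[mid:])
    merged, inv = [], left_inv + right_inv
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            # Every remaining left element forms an inversion with right[j]
            inv += len(left) - i
            merged.append(right[j])
            j += 1
    merged += left[i:] + right[j:]
    return merged, inv

print(sort_and_count([2, 4, 1, 3, 5]))  # ([1, 2, 3, 4, 5], 3)
```

This keeps the O(n log n) running time of merge sort, whereas the brute-force pairwise count takes O(n²).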
Quicksort
Python
def QuickSort(arr5):
    elements = len(arr5)
    if elements < 2:
        return arr5
    current_position = 0  # index of the last element found to be <= pivot
    #Partitioning loop (the first element is the pivot)
    for i in range(1, elements):
        if arr5[i] <= arr5[0]:
            current_position += 1
            temp = arr5[i]
            arr5[i] = arr5[current_position]
            arr5[current_position] = temp
    # Move the pivot into its final sorted position
    temp = arr5[0]
    arr5[0] = arr5[current_position]
    arr5[current_position] = temp
    # Recursively sort the two partitions
    left = QuickSort(arr5[0:current_position])
    right = QuickSort(arr5[current_position + 1:elements])
    return left + [arr5[current_position]] + right

arr5 = [148, 697, 883, 921, 677, 794, 795, 978, 658, 786]
print("Original Array:", arr5)
print("Quick Sorted Array:", QuickSort(arr5))
OUTPUT:
Original Array: [148, 697, 883, 921, 677, 794, 795, 978, 658,
786]
Quick Sorted Array: [148, 658, 677, 697, 786, 794, 795, 883,
921, 978]
Analysis:
Like merge sort, the quicksort algorithm is a divide-and-conquer algorithm. One common formulation creates two arrays to hold the elements less than the pivot value and the elements greater than the pivot value, and then recursively sorts the subarrays. Quicksort is used when the programming language handles recursion well, and when average-case time and space complexity matter.
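The "two arrays around a pivot" description can be sketched directly as an out-of-place variant; this is illustrative and differs from the in-place partitioning loop shown earlier (the `quick_sort_simple` name is an assumption).

```python
# Out-of-place quicksort: partition into two lists around the first element
def quick_sort_simple(arr):
    if len(arr) < 2:
        return arr
    pivot = arr[0]
    smaller = [x for x in arr[1:] if x <= pivot]  # elements <= pivot
    larger = [x for x in arr[1:] if x > pivot]    # elements > pivot
    return quick_sort_simple(smaller) + [pivot] + quick_sort_simple(larger)

print(quick_sort_simple([148, 697, 883, 921, 677, 794, 795, 978, 658, 786]))
```

This version is easier to read but uses O(n) extra space per level of recursion, which is why the in-place partition is preferred when space matters.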
Insertion sort
Advantage:
- It can be easily computed.
- Best case complexity is O(N) when the array is already sorted.
- Number of swaps reduced than bubble sort.
- For smaller values of N, insertion sort performs efficiently compared with the other quadratic sorting
algorithms.
- Stable sort.
- Adaptive: total number of steps is reduced for partially sorted arrays.
- In-Place sort.
Disadvantage:
- It is generally used when the value of N is small; for larger values of N, it is inefficient.
- Like selection sort, it requires on the order of n² steps to sort n elements.
Selection Sort
Advantage:
- It can also be used on list structures that make adding and removing elements efficient, such as a linked
list: just remove the smallest element of the unsorted part and append it to the end of the sorted part.
- The number of swaps is reduced: O(N) swaps in all cases.
- In-Place sort.
Disadvantage:
- Time complexity in all cases is O(N²), so there is no faster best-case scenario.
- It requires on the order of n² steps to sort n elements.
- It is not scalable.