
Basics of Algorithms

• An algorithm is a set of steps or operations to solve a problem by performing calculation, data processing, and automated reasoning tasks.

• An algorithm is an efficient method that can be expressed within a finite amount of time and space.

• An algorithm is the best way to represent the solution of a particular problem in a very simple and efficient way.

• If we have an algorithm for a specific problem, then we can implement it in any programming language, meaning that the algorithm is independent of any particular programming language.


Algorithm Design

• The most important aspect of algorithm design is creating an algorithm that solves the problem efficiently, using minimum time and space.

• To solve a problem, different approaches can be followed. Some of them can be efficient with respect to time consumption, whereas other approaches may be memory efficient.

• However, one has to keep in mind that time consumption and memory usage usually cannot both be optimized simultaneously.

• If we require an algorithm to run in less time, we may have to invest in more memory, and if we require an algorithm to run with less memory, we may need to allow more time.
Problem Development Steps
The following steps are involved in solving computational problems:
• Problem definition
• Development of a model
• Specification of an Algorithm
• Designing an Algorithm
• Checking the correctness of an Algorithm
• Analysis of an Algorithm
• Implementation of an Algorithm
• Program testing
• Documentation
Characteristics of Algorithms
The main characteristics of algorithms are as follows:

• Algorithms must have a unique name

• Algorithms should have an explicitly defined set of inputs and outputs

• Algorithms are well-ordered with unambiguous operations

• Algorithms halt in a finite amount of time

• Algorithms should not run for infinity, i.e., an algorithm must end at some point
Pseudocode

• Pseudocode gives a high-level description of an algorithm without the ambiguity associated with plain text, but also without the need to know the syntax of a particular programming language.

• The running time can be estimated in a more general manner by using pseudocode to represent the algorithm as a set of fundamental operations, which can then be counted (a small sketch of this idea follows below).
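As a hedged illustration (not part of the original slides), the sketch below writes a simple "find the largest element" routine in pseudocode-like Python, with the fundamental operations annotated so they can be counted; the function name and the sample array are assumptions made for illustration.

def find_max(arr):
    # Pseudocode-style sketch: each fundamental operation is annotated so it can be counted.
    largest = arr[0]              # 1 assignment
    for value in arr[1:]:         # n - 1 loop iterations
        if value > largest:       # 1 comparison per iteration
            largest = value       # at most 1 assignment per iteration
    return largest

print(find_max([13, 32, 26, 35, 10]))   # prints 35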
Difference between Algorithm and Pseudocode

• An algorithm is a formal definition with some specific characteristics that describes a process which could be executed by a computer to perform a specific task.

• On the other hand, pseudocode is an informal, often rudimentary, human-readable description of an algorithm that leaves out many of its granular details.

• Writing pseudocode has no restriction of style; its only objective is to describe the high-level steps of the algorithm in natural language, in a realistic manner.
For example, the following is an algorithm for Insertion Sort (shown as a figure in the original slides; a Python sketch follows).

Working of Insertion Sort algorithm
The original slides illustrate each pass of Insertion Sort (first pass, second pass, and so on) with figures.
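Since the algorithm and its pass-by-pass working appear only as figures in the original slides, the following is a standard Python sketch of insertion sort, offered as an assumption of what those figures show; the function name and sample array are illustrative.

def insertion_sort(arr):
    # Grow a sorted prefix one element at a time by inserting each key into place.
    for i in range(1, len(arr)):
        key = arr[i]                       # element to insert into the sorted prefix
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]            # shift larger elements one position right
            j -= 1
        arr[j + 1] = key                   # place the key in its correct position
    return arr

print(insertion_sort([12, 31, 25, 8, 32, 17, 40, 42]))   # [8, 12, 17, 25, 31, 32, 40, 42]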
Design and Analysis of Algorithm
• Algorithm analysis is an important part of computational complexity theory, which provides a theoretical estimate of the resources an algorithm requires to solve a specific computational problem.

• Most algorithms are designed to work with inputs of arbitrary length.

• Analysis of an algorithm is the determination of the amount of time and space resources required to execute it.

• Usually, the efficiency or running time of an algorithm is stated as a function relating the input length to the number of steps, known as time complexity, or to the volume of memory, known as space complexity.
The Need for Analysis

• Algorithms are often quite different from one another, though the objective of these algorithms is the same.
• For example, a set of numbers can be sorted using different algorithms.
• The number of comparisons performed by one algorithm may differ from that of another for the same input.
• Hence, the time complexity of those algorithms may differ.
• At the same time, we need to calculate the memory space required by each
algorithm.
• Analysis of an algorithm is the process of analyzing its problem-solving capability in terms of the time and memory required (the size of memory needed for storage during execution).
Generally, we perform the following types of analysis (a small illustration follows this list):

• Worst-case − the maximum number of steps taken on any instance of size a.

• Best-case − the minimum number of steps taken on any instance of size a.

• Average case − the average number of steps taken on any instance of size a.

• Amortized − a sequence of operations applied to an input of size a, averaged over time.
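As a small illustration (not from the slides; the function and inputs are assumptions), linear search shows how the same algorithm takes a different number of steps in the best and worst cases on inputs of the same size.

def linear_search(arr, target):
    # Return (index, comparisons) so the number of steps can be inspected.
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

data = [13, 32, 26, 35, 10]
print(linear_search(data, 13))   # best case: found at the first position, 1 comparison
print(linear_search(data, 35))   # an intermediate case, 4 comparisons
print(linear_search(data, 99))   # worst case: not present, 5 comparisons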
• To solve a problem, we need to consider space complexity as well as time complexity, because the program may run on a system where memory is limited but adequate time is available, or vice versa.
• In this context, let us compare bubble sort and merge sort.
• Bubble sort does not require additional memory, but merge sort requires additional space.
• Though the time complexity of bubble sort is higher compared to merge sort, we may need to apply bubble sort if the program needs to run in an environment where memory is very limited.
Bubble sort Algorithm
• The working procedure of bubble sort is the simplest.

• Bubble sort works by repeatedly swapping adjacent elements until they are in the intended order (a short Python sketch follows this list).

• It is called bubble sort because the movement of array elements is just like the movement of air bubbles in water.

• Although it is simple to use, it is primarily used as an educational tool, because the performance of bubble sort is poor in the real world.

• It is not suitable for large data sets.

• The average and worst-case complexity of bubble sort is O(n²), where n is the number of items.

• Bubble sort is mainly used where:
  - complexity does not matter
  - simple and short code is preferred
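A minimal Python sketch of the procedure described above; the function name and the sample array are assumptions for illustration.

def bubble_sort(arr):
    # Repeatedly swap adjacent out-of-order elements until the array is sorted.
    n = len(arr)
    for i in range(n - 1):
        # After pass i, the i largest elements have bubbled to the end of the array.
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]   # swap the adjacent pair
    return arr

print(bubble_sort([13, 32, 26, 35, 10]))   # [10, 13, 26, 32, 35]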
Working of Bubble sort Algorithm

• Now, let's see the working of the Bubble sort algorithm.

• To understand the working of the bubble sort algorithm, let's take an unsorted array.

• We are taking a short array, as we know the complexity of bubble sort is O(n²).

Let the elements of the array be:

• First Pass
Sorting will start from the initial two elements. Let us compare them to check which is greater.

Here, 32 is greater than 13 (32 > 13), so these two are already in order. Now, compare 32 with 26.
Here, 26 is smaller than 32. So, swapping is required. After swapping, the new array will look like:

Now, compare 32 and 35.
Here, 35 is greater than 32, so no swapping is required as they are already in order.
Now, the comparison will be between 35 and 10.
Here, 10 is smaller than 35, so they are not in order and swapping is required.
Now, we reach the end of the array. After the first pass, the array will be:
• Second Pass
The same process will be followed for the second iteration.

Here, 10 is smaller than 32. So, swapping is required. After swapping, the array will be:
• Third Pass

Here, 10 is smaller than 26. So, swapping is required. After swapping, the array will be:
• Fourth pass
Similarly, after the fourth iteration, the array will be:

After this pass, no further swapping is required, so the array is completely sorted.
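The per-pass snapshots above are shown as figures in the original slides. The short trace below reproduces them, assuming the example array is [13, 32, 26, 35, 10], which is consistent with the comparisons described in the walkthrough.

def bubble_sort_trace(arr):
    # Print the array after each pass so the walkthrough above can be followed.
    n = len(arr)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
        print("after pass", i + 1, ":", arr)

bubble_sort_trace([13, 32, 26, 35, 10])
# after pass 1 : [13, 26, 32, 10, 35]
# after pass 2 : [13, 26, 10, 32, 35]
# after pass 3 : [13, 10, 26, 32, 35]
# after pass 4 : [10, 13, 26, 32, 35]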


Bubble sort complexity

• Let's see the time complexity of bubble sort in the best case, average case,
and worst case. We will also see the space complexity of bubble sort.
1. Time Complexity
• Best Case Complexity - It occurs when no sorting is required, i.e. the array is already sorted. The best-case time complexity of bubble sort is O(n) (this best case is achieved by the optimized version described later, which stops after a pass with no swaps).

• Average Case Complexity - It occurs when the array elements are in jumbled order, i.e., neither properly ascending nor properly descending. The average case time complexity of bubble sort is O(n²).

• Worst Case Complexity - It occurs when the array elements are required to be sorted in reverse order. That is, suppose you have to sort the array elements in ascending order, but the elements are in descending order. The worst-case time complexity of bubble sort is O(n²).
2. Space Complexity

• The space complexity of bubble sort is O(1), because only one extra variable is required for swapping.

• The optimized bubble sort uses two extra variables (the swap temporary and the swapped flag), but its space complexity is still constant, O(1).
Optimized Bubble sort Algorithm

• In the basic bubble sort algorithm, comparisons are made even when the array is already sorted.

• Because of that, the execution time increases.

• To solve this, we can use an extra variable, swapped.

• It is set to true if swapping is required; otherwise, it is set to false.

• This is helpful because, if no swapping is required after an iteration, the value of the variable swapped will be false.

• It means that the elements are already sorted, and no further iterations are required (a sketch using this flag follows below).
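A sketch of the optimization described above, using a Boolean flag named swapped; the function name and sample input are illustrative assumptions.

def optimized_bubble_sort(arr):
    # Bubble sort that stops early once a full pass completes with no swaps.
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:            # no swaps in this pass: the array is already sorted
            break
    return arr

print(optimized_bubble_sort([10, 13, 26, 32, 35]))   # already sorted: finishes after one pass

On an already sorted array, this version performs a single pass of n − 1 comparisons, which is the O(n) best case mentioned earlier.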
Merge Sort Algorithm
• Merge sort is a sorting technique that follows the divide and conquer approach.

• Merge sort is similar to the quick sort algorithm in that it uses the divide and conquer approach to sort the elements.

• It is one of the most popular and efficient sorting algorithms.

• It divides the given list into two equal halves, calls itself for the two halves, and then merges the two sorted halves (a Python sketch follows this list).

• The sub-lists are divided again and again into halves until the list cannot be divided further.

• Then we combine the pairs of one-element lists into two-element lists, sorting them in the process.

• The sorted two-element lists are merged into four-element lists, and so on, until we get the fully sorted list.
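A Python sketch of the divide and conquer procedure described above; the function names are assumptions, offered as an illustration rather than the slides' own code.

def merge_sort(arr):
    # Divide the list into halves, sort each half recursively, then merge them.
    if len(arr) <= 1:                  # a list of 0 or 1 elements is already sorted
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])       # sort the left half
    right = merge_sort(arr[mid:])      # sort the right half
    return merge(left, right)

def merge(left, right):
    # Merge two sorted lists into one sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])            # append whatever remains of either half
    merged.extend(right[j:])
    return merged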
Working of Merge sort Algorithm
• To understand the working of the merge sort algorithm, let's take an unsorted array.
• It will be easier to understand merge sort via an example.

• According to merge sort, first divide the given array into two equal halves.
• Merge sort keeps dividing the list into equal parts until it cannot be divided further.
• As there are eight elements in the given array, it is divided into two arrays of size 4.
• Now, again divide these two arrays into halves. As they are of size 4, divide them into new arrays of size 2.
• Now, again divide these arrays to get the atomic values that cannot be divided further.

• Now, combine them in the same manner in which they were broken apart.

• In combining, first compare the elements of each array and then combine them into another array in sorted order.
• So, first compare 12 and 31; both are already in sorted positions.
• Then compare 25 and 8, and in the resulting list of two values, put 8 first followed by 25.
• Then compare 32 and 17, sort them, and put 17 first followed by 32.
• After that, compare 40 and 42, and place them sequentially.

• In the next iteration of combining, compare the arrays with two data values and merge them into arrays of four values in sorted order.

• Now, there is a final merging of the arrays. After the final merging of the above arrays, the array will look like:
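The division and merging steps above are shown as figures in the original slides. The trace below reproduces the merging order, assuming the example array is [12, 31, 25, 8, 32, 17, 40, 42], which is consistent with the comparisons described; the function name is an illustrative assumption.

def merge_sort_trace(arr):
    # Merge sort that prints every merge step, mirroring the walkthrough above.
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort_trace(arr[:mid])
    right = merge_sort_trace(arr[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged += left[i:] + right[j:]     # append whatever remains of either half
    print("merge", left, "+", right, "->", merged)
    return merged

merge_sort_trace([12, 31, 25, 8, 32, 17, 40, 42])
# final result: [8, 12, 17, 25, 31, 32, 40, 42]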
Merge sort complexity

• Now, let's see the time complexity of merge sort in best case, average case,
and in worst case.
• We will also see the space complexity of the merge sort.
1. Time Complexity
• Best Case Complexity - It occurs when no sorting is required, i.e. the array is already sorted. The best-case time complexity of merge sort is O(n log n).

• Average Case Complexity - It occurs when the array elements are in jumbled order, i.e., neither properly ascending nor properly descending. The average case time complexity of merge sort is O(n log n).

• Worst Case Complexity - It occurs when the array elements are required to be sorted in reverse order. That is, suppose you have to sort the array elements in ascending order, but the elements are in descending order. The worst-case time complexity of merge sort is O(n log n).
2. Space Complexity

• The space complexity of merge sort is O(n), because merge sort needs an auxiliary array of size n to hold the elements while merging the sorted halves.
