
Design & Analysis of Algorithms

Lecture 3
Types of Algorithms
 Recursive
 Brute force
 Divide and Conquer
 Dynamic Programming
 Greedy
 Heuristic
 Search, etc.
Recursive algorithms

 A recursive algorithm is an algorithm which calls itself with "smaller (or simpler)" input values.

 It obtains the result for the current input by applying simple operations to the returned value for the smaller (or simpler) input.
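As a minimal illustrative sketch (not from the slides), a recursive factorial in Python shows both ideas: the call on the smaller input n - 1, and the simple operation (multiplication by n) applied to the returned value.

def factorial(n: int) -> int:
    # Base case: no further recursion needed.
    if n <= 1:
        return 1
    # Recursive case: call on the smaller input, then combine with n.
    return n * factorial(n - 1)

print(factorial(5))   # 120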
Brute force algorithm

 A brute force algorithm simply tries all possibilities until a satisfactory solution is found.

 These algorithms stop as soon as a solution is found that is good enough.
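A minimal brute-force sketch in Python (the task and values are illustrative assumptions): try every pair of numbers until one pair sums to a given target, stopping at the first pair that is good enough.

def pair_with_sum(values, target):
    # Try all pairs; return the first one whose sum equals the target.
    for i in range(len(values)):
        for j in range(i + 1, len(values)):
            if values[i] + values[j] == target:
                return values[i], values[j]   # stop at the first satisfactory solution
    return None                               # no pair works

print(pair_with_sum([8, 4, 6, 9, 2, 3, 1], 11))   # (8, 3)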
Divide and Conquer

 A divide and conquer algorithm consists of two parts:

 Divide the problem into smaller subproblems of the same type, and solve these subproblems recursively.

 Combine the solutions to the subproblems into a solution to the original problem.
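A sketch of divide and conquer in Python, using merge sort as the classic example (illustrative only): split the array, sort each half recursively, then combine the two sorted halves.

def merge_sort(a):
    # A list of 0 or 1 elements is already sorted.
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # divide and solve the subproblems recursively
    right = merge_sort(a[mid:])
    # Combine: merge the two sorted halves into one sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([8, 4, 6, 9, 2, 3, 1]))   # [1, 2, 3, 4, 6, 8, 9]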
Dynamic programming algorithms

 A dynamic programming algorithm remembers past results and uses them to find new results.

 Dynamic programming is generally used for optimization problems:

 Multiple solutions exist; we need to find the "best" one.
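A minimal dynamic-programming sketch in Python (memoized Fibonacci, chosen here only as an illustration): past results are cached, so each subproblem is computed once and reused.

from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n: int) -> int:
    # Remember past results: each fib(k) is computed only once and then reused.
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(40))   # 102334155

Without the cache, the plain recursive version recomputes the same subproblems exponentially many times.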


Greedy algorithms
• An optimization problem is one in which you want to find,
not just a solution, but the best solution
• A “greedy algorithm” sometimes works well for optimization
problems
• A greedy algorithm works in phases. At each phase:
– You take the best you can get right now, without regard for
future consequences
– You hope that by choosing a local optimum at each step, you will
end up at a global optimum
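A small illustrative sketch in Python: greedy coin change, which at each phase takes the largest coin that still fits, without regard for future consequences. The denominations used here are an assumption for the example.

def greedy_change(amount, coins=(25, 10, 5, 1)):
    # Coins are assumed to be given in decreasing order.
    result = []
    for coin in coins:
        while amount >= coin:      # take the best (largest) coin available right now
            result.append(coin)
            amount -= coin
    return result

print(greedy_change(63))   # [25, 25, 10, 1, 1, 1]

Note that the local optimum does not always lead to a global optimum: with denominations (4, 3, 1), this greedy routine turns 6 into [4, 1, 1] even though [3, 3] uses fewer coins.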
Heuristic Algorithms
 The term heuristic is used for algorithms which find solutions among all possible ones, but do not guarantee that the best will be found; they may therefore be considered approximate rather than exact algorithms.

 These algorithms usually find a solution close to the best one, and they find it fast and easily.
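An illustrative heuristic sketch in Python (not from the slides): a nearest-neighbour tour for a small round-trip problem. It is fast and usually close to the best tour, but it does not guarantee the best one. The distance matrix is made up for the example.

def nearest_neighbour_tour(dist, start=0):
    # Always visit the closest unvisited city next (no optimality guarantee).
    n = len(dist)
    tour, visited = [start], {start}
    while len(tour) < n:
        last = tour[-1]
        nxt = min((c for c in range(n) if c not in visited),
                  key=lambda c: dist[last][c])
        tour.append(nxt)
        visited.add(nxt)
    return tour

# Symmetric distance matrix for 4 cities (illustrative numbers).
d = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 8],
     [10, 4, 8, 0]]
print(nearest_neighbour_tour(d))   # [0, 1, 3, 2]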

Search algorithms

 A search algorithm is an algorithm for finding an item with specified properties among a collection of items.
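A minimal search sketch in Python: a linear search that returns the first item satisfying a given property (the property used below is just an example).

def linear_search(items, predicate):
    # Scan the collection and return the first item with the specified property.
    for item in items:
        if predicate(item):
            return item
    return None   # no such item exists

print(linear_search([8, 4, 6, 9, 2, 3, 1], lambda x: x % 3 == 0))   # 6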
Algorithm Analysis
• The amount of resources used by the algorithm
– Space
– Computational time
• Running time:
– The number of primitive operations (steps) executed before termination
• Order of growth
– The leading term of a formula
– Expresses the behavior of a function toward infinity (e.g., the order of growth of 3n² + 10n + 5 is n²)

Types of Analysis
• Worst case (e.g., cards reversely ordered)
– Provides an upper bound on running time

• Best case (e.g., cards already ordered)
– Input is the one for which the algorithm runs the fastest

• Average case (general case)
– Provides a prediction about the running time
– Assumes that the input is random

Asymptotic Notations

• A way to describe behavior of functions in the limit

– How we indicate running times of algorithms
– Describe the running time of an algorithm as n grows to ∞

• O notation: asymptotic "less than": f(n) "≤" g(n)

• Ω notation: asymptotic "greater than": f(n) "≥" g(n)

• Θ notation: asymptotic "equality": f(n) "=" g(n)

Additional Reading
 https://www.codecademy.com/learn/cspath-asymptotic-notation/modules/cspath-asymptotic-notation/cheatsheet
 https://www.geeksforgeeks.org/analysis-of-algorithms-set-3asymptotic-notations/
 https://www.tutorialspoint.com/data_structures_algorithms/asymptotic_analysis.htm
 https://www.youtube.com/watch?v=7dz8Iaf_weM

Logarithms
 In algorithm analysis we often use the notation "log n" without specifying the base.

 Binary logarithm: lg n = log₂ n
 Natural logarithm: ln n = logₑ n
 lg^k n = (lg n)^k
 lg lg n = lg(lg n)

 log x^y = y · log x
 log(xy) = log x + log y
 log(x/y) = log x − log y
 log_a x = log_a b · log_b x
 a^(log_b x) = x^(log_b a)

Asymptotic Notations - Examples
• For each of the following pairs of functions, either f(n) is O(g(n)), f(n) is Ω(g(n)), or f(n) = Θ(g(n)):

– f(n) = log n²; g(n) = log n + 5          f(n) = Θ(g(n))

– f(n) = n; g(n) = log n²                  f(n) = Ω(g(n))

– f(n) = log log n; g(n) = log n           f(n) = O(g(n))

– f(n) = n; g(n) = log² n                  f(n) = Ω(g(n))

– f(n) = n log n + n; g(n) = log n         f(n) = Ω(g(n))
Examples (find initial values)

– 2n² = O(n³): 2n² ≤ cn³ ⇒ 2 ≤ cn ⇒ c = 1 and n₀ = 2

– n² = O(n²): n² ≤ cn² ⇒ c ≥ 1 ⇒ c = 1 and n₀ = 1

– n = O(n²): n ≤ cn² ⇒ cn ≥ 1 ⇒ c = 1 and n₀ = 1
Examples

– 5n² = Ω(n)
 ∃ c, n₀ such that: 0 ≤ cn ≤ 5n²
 cn ≤ 5n² ⇒ c = 1 and n₀ = 1

Exercise: for each of the following, find c and n₀:

– n = Ω(2n)
– n³ = Ω(n²)

The Sorting Problem

• Input:
– A sequence of n numbers a₁, a₂, . . . , aₙ

• Output:
– A permutation (reordering) a₁′, a₂′, . . . , aₙ′ of the input sequence such that a₁′ ≤ a₂′ ≤ · · · ≤ aₙ′

Bubble Sort
• Idea:
– Repeatedly pass through the array
– Swap adjacent elements that are out of order

[Diagram: array 8 4 6 9 2 3 1, with i marking the current pass (1, 2, 3, ..., n) and j the adjacent pair being compared]

• Easier to implement, but slower than insertion sort

Summary of Steps of Bubble Sort
 Compare each pair of adjacent elements from the beginning of the array and, if they are in reversed order, swap them.

 If at least one swap has been done, repeat step 1.

Bubble Sort
Alg.: BUBBLESORT(A)
  for i ← 1 to length[A]
    do for j ← 1 to length[A] − i
      do if A[j] > A[j + 1]
        then exchange A[j] ↔ A[j + 1]
Bubble Sort
 Worst case: O(n²)
 Best case: O(n²)
 Average case: O(n²)

Example
 Sort the following data using Bubble Sort

8 4 6 9 2 3 1

Selection Sort
 Summary of Steps

 Find the smallest element in the array
 Exchange it with the element in the first position
 Find the second smallest element and exchange it with the element in the second position
 Continue until the array is sorted

Selection Sort
The algorithm works as follows:

1. Find the minimum value in the list

2. Swap it with the value in the first position

3. Repeat the steps above for the remainder of the list (starting at the second position and advancing each time)

Example

Step 1: 8 4 6 9 2 3 1
Step 2: 1 4 6 9 2 3 8
Step 3: 1 2 6 9 4 3 8
Step 4: 1 2 3 9 4 6 8
Step 5: 1 2 3 4 9 6 8
Step 6: 1 2 3 4 6 9 8
Step 7: 1 2 3 4 6 8 9

Sorted Array: 1 2 3 4 6 8 9
Selection Sort
Alg.: SELECTION-SORT(A)
  n ← length[A]
  for j ← 1 to n − 1
    do smallest ← j
      for i ← j + 1 to n
        do if A[i] < A[smallest]
          then smallest ← i
      exchange A[j] ↔ A[smallest]
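A runnable Python translation of the pseudocode above (0-indexed, shown only as an illustration):

def selection_sort(a):
    n = len(a)
    for j in range(n - 1):
        smallest = j                     # assume the first unsorted element is smallest
        for i in range(j + 1, n):
            if a[i] < a[smallest]:
                smallest = i             # remember the index of the smaller element
        a[j], a[smallest] = a[smallest], a[j]   # move it to the front of the unsorted part
    return a

print(selection_sort([8, 4, 6, 9, 2, 3, 1]))   # [1, 2, 3, 4, 6, 8, 9]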

Selection Sort
 Worst case: O(n²)
 Best case: O(n²)
 Average case: O(n²)
