
Recurrence Relations
Previously…
For Exam 1, we found f(n) by counting operations:

int factorial( int n ) {
    int result = 1;                // 1 operation
    for( int i = n; i > 0; --i ) { // n iterations
        result = result * i;      // 2 operations
    }
    return result;                 // 1 operation
}

f(n) = 2 + 2n
~ O(n)
However…
What if the function is recursive?

int factorial( int n ) {
    if ( n <= 1 )                      // 1 operation
        return 1;                      // 1 operation
    else
        return n * factorial( n - 1 ); // ? operations
}
However…
Finding f(n) is no longer straightforward.
We need a recurrence relation.
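As a preview (a short worked sketch added here, not part of the original slides): each call to the recursive factorial does a constant amount of work and makes one recursive call on an input of size n - 1, so

T( n ) = T( n - 1 ) + c,   with base case T( 1 ) = c

Unrolling gives T( n ) = c * n, which is O(n), matching the loop version.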
Writing A Recurrence Relation
Given an algorithm’s description, 5 facts are needed to write its
recurrence relation.
Writing A Recurrence Relation
• How many sub-problems is the problem split into?
o Refer to this number as a
• By what number is the input-size divided?
o b
• How long does it take to form the sub-problems?
o c( n )
• How long does it take to combine the solutions of the sub-problems?
o d( n )
Writing A Recurrence Relation
T(n) = a * T(n/b) + c( n ) + d( n )

Lastly, you need a base-case, which is the input-size where the algorithm stops splitting.
Usually T(0) or T(1).
Example: Merge Sort
65 22 15 40 104 50 19

Split in half
Example: Merge Sort
65 22 15 40 104 50 19

2 sub-problems are formed…


Example: Merge Sort
65 22 15 40 104 50 19

Hence, a == 2
Example: Merge Sort
65 22 15 40 104 50 19

Each sub-problem receives half of the input (approximately) …
Example: Merge Sort
65 22 15 40 104 50 19

Hence, b == 2
Example: Merge Sort
65 22 15 40 104 50 19

65 22 15 40 104 50 19

Setting up these sub-problems took linear time
Example: Merge Sort

65 22 15 40 104 50 19

Hence, c(n) is O(n)


Example: Merge Sort

65 22 15 40 104 50 19

65 22 15 40 104 50 19
Example: Merge Sort

65 22 15 40 104 50 19

65 22 40 104 50 19

The algorithm stops splitting when the input-size is 1
Example: Merge Sort

15

65 22 40 104 50 19

Hence, the base-case is T(1)


Example: Merge Sort

15

65 22 40 104 50 19

Now … merge and sort


Example: Merge Sort

15

65 22 40 104 50 19
Example: Merge Sort

22 65 15

65 22 40 104 50 19
Example: Merge Sort

22 65 15 40 104

40 104 50 19
Example: Merge Sort

22 65 15 40 104 19 50

50 19
Example: Merge Sort

22 65 15 40 104 19 50
Example: Merge Sort

15 22 65

22 65 15 40 104 19 50
Example: Merge Sort

15 22 65 19

40 104 19 50
Example: Merge Sort

15 22 65 19 40

40 104 50
Example: Merge Sort

15 22 65 19 40 50 104

104 50
Example: Merge Sort

15 22 65 19 40 50 104

We are back at the top level


Example: Merge Sort

15 22 65 19 40 50 104

d(n) = The time it takes to re-join the two sub-problems
Example: Merge Sort
15

15 22 65 19 40 50 104
Example: Merge Sort
15 19

22 65 19 40 50 104
Example: Merge Sort
15 19 22

22 65 40 50 104
Example: Merge Sort
15 19 22 40

65 40 50 104
Example: Merge Sort
15 19 22 40 50

65 50 104
Example: Merge Sort
15 19 22 40 50 65 104

65 104
Example: Merge Sort
15 19 22 40 50 65 104

d(n) is O(n)
Example: Merge Sort
15 19 22 40 50 65 104

In total:

T( n ) = a * T( n/b ) + c( n ) + d( n )

= 2 * T( n/2 ) + O(n) + O(n)

≈ 2 * T( n/2 ) + O(n)
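To connect this recurrence back to code, here is a minimal Merge Sort sketch (added for illustration, not from the original slides). It recurses on index ranges of the same array instead of copying sub-arrays, so the split itself is cheap here; the slides' version, which copies the two halves, makes c(n) linear instead. Either way, the merge step d(n) is the O(n) term:

#include <cstdio>
#include <vector>

// Sketch of Merge Sort on the index range [lo, hi), annotated with where
// a, b, c(n) and d(n) come from in T(n) = 2*T(n/2) + O(n).
void mergeSort(std::vector<int>& v, int lo, int hi) {
    if (hi - lo <= 1) return;                 // base case: T(1)

    int mid = lo + (hi - lo) / 2;             // c(n): splitting is O(1) here
                                              // (O(n) if the halves are physically copied)
    mergeSort(v, lo, mid);                    // a == 2 recursive calls,
    mergeSort(v, mid, hi);                    // each on n/b == n/2 elements

    // d(n): merging the two sorted halves takes O(n)
    std::vector<int> merged;
    merged.reserve(hi - lo);
    int i = lo, j = mid;
    while (i < mid && j < hi)
        merged.push_back(v[i] <= v[j] ? v[i++] : v[j++]);
    while (i < mid) merged.push_back(v[i++]);
    while (j < hi)  merged.push_back(v[j++]);
    for (int k = 0; k < (int)merged.size(); ++k)
        v[lo + k] = merged[k];
}

int main() {
    std::vector<int> v = { 65, 22, 15, 40, 104, 50, 19 };  // the slides' example input
    mergeSort(v, 0, (int)v.size());
    for (int x : v) std::printf("%d ", x);                 // 15 19 22 40 50 65 104
    std::printf("\n");
}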
Example: Merge Sort
15 19 22 40 50 65 104

Best case: O(nlogn)

Worst case: O(nlogn)

Average case: O(nlogn)

Merge Sort performs the same number of operations for all input-types.
Example: Merge Sort
15 19 22 40 50 65 104

Can Merge Sort be done in-place (without allocating additional memory)?

Yes, by recursively calling Merge Sort on a sub-set of the input array.
Note…
Not all algorithms divide the input evenly.
Example: Quick Sort
5 10 15 20 25 30 35

Let’s make the first element the pivot
Example: Quick Sort
5 10 15 20 25 30 35

Normally, Quick Sort creates two sub-arrays, L and R,
Example: Quick Sort
5 10 15 20 25 30 35

where L contains all values less than the pivot …
Example: Quick Sort
5 10 15 20 25 30 35

and R contains all values greater than or equal to the pivot.
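A minimal sketch of the partitioning just described (added for illustration, not from the original slides; this quickSort() helper is one simple way to realize it). The pivot is the first element; one O(n) pass builds L and R, so c(n) is O(n), and re-joining L + pivot + R is another O(n) pass, so d(n) is O(n):

#include <cstdio>
#include <vector>

void quickSort(std::vector<int>& v) {
    if (v.size() <= 1) return;                 // base case: T(1)

    int pivot = v[0];
    std::vector<int> L, R;
    for (size_t i = 1; i < v.size(); ++i)      // c(n): one O(n) pass to form L and R
        (v[i] < pivot ? L : R).push_back(v[i]);

    quickSort(L);                              // how evenly the pivot splits the input
    quickSort(R);                              // determines a and b in the recurrence

    // d(n): re-join as L + pivot + R, another O(n) pass
    v.clear();
    v.insert(v.end(), L.begin(), L.end());
    v.push_back(pivot);
    v.insert(v.end(), R.begin(), R.end());
}

int main() {
    std::vector<int> v = { 5, 10, 15, 20, 25, 30, 35 };   // the slides' worst-case input
    quickSort(v);
    for (int x : v) std::printf("%d ", x);
    std::printf("\n");
}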
Example: Quick Sort
5 10 15 20 25 30 35

However, since the pivot is the smallest value in this case, …
Example: Quick Sort
5 10 15 20 25 30 35

only one sub-array, R, is formed
Example: Quick Sort
5 10 15 20 25 30 35

As a result, a == 1
Example: Quick Sort
5 10 15 20 25 30 35

Since the input was not split evenly, …
Example: Quick Sort
5 10 15 20 25 30 35

‘b’ will have no value (the sub-problem has size n - 1, not n/b)


Example: Quick Sort
5 10 15 20 25 30 35

As a result, so far:
T(n) = T(n-1) + c(n) + d(n)
Example: Quick Sort
5 10 15 20 25 30 35

10 15 20 25 30 35

c(n) is O(n)
Example: Quick Sort
5

10 15 20 25 30 35

15 20 25 30 35
Example: Quick Sort
5

10

15 20 25 30 35

20 25 30 35
Example: Quick Sort
5

10

15

20 25 30 35

25 30 35
Example: Quick Sort
5

10

15

20

25 30 35

30 35
Example: Quick Sort
5

10

15

20

25

30 35

35
Example: Quick Sort
5

10

15

20

25

30
The base-case is T(1)
35
Example: Quick Sort
5

10

15

20

25

30
Once the base-case is reached, …

35
Example: Quick Sort
5

10

15

20

25

30

Quick Sort combines L, R and the pivot to form L + pivot + R

35
Example: Quick Sort
5

10

15

20

25

30

Since there was no L for this example, we only combine pivot + R

35
Example: Quick Sort
5

10

15

20

25

30 35

35
Example: Quick Sort
5

10

15

20

25 30 35

30 35
Example: Quick Sort
5

10

15

20 25 30 35

25 30 35
Example: Quick Sort
5

10

15 20 25 30 35

20 25 30 35
Example: Quick Sort
5

10 15 20 25 30 35

15 20 25 30 35
Example: Quick Sort
5

10 15 20 25 30 35

We are back at the top level


Example: Quick Sort
5

10 15 20 25 30 35

d(n) = The time it takes to re-join the sub-problem
Example: Quick Sort
5 10 15 20 25 30 35

10 15 20 25 30 35

d(n) is O(n)
Example: Quick Sort
5 10 15 20 25 30 35

In total:

T( n ) = T( n - 1 ) + c( n ) + d( n )
       = T( n - 1 ) + O(n) + O(n)
       ≈ T( n - 1 ) + O(n)

→ O( n^2 )
Example: Quick Sort
5 10 15 20 25 30 35

This reflects the worst case for Quick Sort (sorted input with the first or last element as the pivot), since the input-size decreases by only one per call, resulting in more recursive calls.
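As a quick check (a worked expansion added here, not from the original slides), unrolling the worst-case relation with c( n ) + d( n ) ≈ c*n gives:

T( n ) = c*n + c*(n-1) + c*(n-2) + … + c*1 = c * n(n+1)/2, which is O(n^2)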
Example: Quick Sort
5 10 15 20 25 30 35

The best case occurs when the input is sorted and the middle element is the pivot.
Quick Sort
5 10 15 20 25 30 35
Quick Sort
5 10 15 20 25 30 35

Form L and R
Quick Sort
5 10 15 20 25 30 35

5 10 15 25 30 35

Since 2 sub-problems are formed, …
Quick Sort
20

5 10 15 25 30 35

a == 2
Quick Sort
20

5 10 15 25 30 35

Since each sub-problem will receive half of the input (approx.), …
Quick Sort
20

5 10 15 25 30 35

b == 2
Quick Sort
20

5 10 15 25 30 35

So far,
T(n) = 2*T(n/2) + c(n) + d(n)
Quick Sort
20

5 10 15 25 30 35

Forming the sub-problems took linear time
Quick Sort
20

5 10 15 25 30 35

Hence, c(n) is O(n)


Quick Sort
20

5 10 15 25 30 35

5 15 25 35

The base case is T(1)


Quick Sort
20

10 30

5 15 25 35

Combine L, R and the pivot to form L + pivot + R
Quick Sort
20

5 10 15 25 30 35

5 15 25 35
Quick Sort
20

5 10 15 25 30 35
Quick Sort
5 10 15 20 25 30 35

5 10 15 25 30 35

d(n) is O(n)
Quick Sort
5 10 15 20 25 30 35

In total:

T( n ) = a * T( n/b ) + c( n ) + d( n )
= 2 * T( n/2 ) + O(n) + O(n)
≈ 2 * T( n/2 ) + O(n)

→ O(nlogn)
Quick Sort
25 20 35 5 30 10 15

The average case


Quick Sort
25 20 35 5 30 10 15

Let’s make the last element the pivot
Quick Sort
25 20 35 5 30 10 15

Form L and R
Quick Sort
25 20 35 5 30 10 15

5 10 25 20 35 30
Quick Sort
15

5 10 25 20 35 30

5 25 20 35
Quick Sort
15

10 30

5 25 20 35

25

Join L + pivot + R
Quick Sort
15

10 30

5 20 25 35

25

Join L + pivot + R
Quick Sort
15

5 10 30

5 20 25 35

Join L + pivot + R
Quick Sort
15

5 10 20 25 30 35

20 25 35

Join L + pivot + R
Quick Sort
5 10 15 20 25 30 35

5 10 20 25 30 35

Join L + pivot + R
Quick Sort
5 10 15 20 25 30 35

Join L + pivot + R
The Average Case
On average, the input is not sorted, and the pivot is randomly
chosen.

Hence, one way the average case can be described is:

T( n ) = T( n * a ) + T( n * (1 - a) ) + O(n)
0 < a < 1
Running time will be somewhere between the best and worst
case.
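For intuition (an added note, not from the original slides): if the pivot lands near the middle every time, a ≈ 1/2 and the relation becomes T( n ) = 2 * T( n/2 ) + O(n), the same O(nlogn) recurrence as the best case. Any fixed split ratio 0 < a < 1 still gives O(logn) levels of O(n) work each, which is why Quick Sort's average case is O(nlogn).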
Describing Recurrence Relations
Given an equation of this form:

T( n ) = a * T( n/b ) + f( n )

where f(n) = c( n ) + d( n ), we could say:

“T(n) splits itself into a sub-problems, with each receiving 1/b of the input. The time
it takes to form these sub-problems and combine their solutions is f(n).”
Describing Recurrence Relations
Example:

T( n ) = 4 * T( n/4 ) + n^2

“T(n) splits itself into 4 sub-problems, with each receiving 1/4 of the input. The time
it takes to form these sub-problems and combine their solutions is n^2.”
Classifying Recurrence Relations
Below are four methods for classifying recurrence relations:

• Substitution

• Iterative

• Tree

• Master Theorem
Substitution
T( n ) = 2 * T( n/2 ) + n

Substitution == “Guess and check”


Substitution
T( n ) = 2 * T( n/2 ) + n

My guess:
T(n) is O(n)
Substitution
T( n ) = 2 * T( n/2 ) + n

Let T(n) == c*n


Substitution
T( n ) = 2 * T( n/2 ) + n

= 2 * c * ( n/2 ) + n

= c * n + n

= ( c + 1 ) * n

≤ c * n ???
Substitution
T( n ) = 2 * T( n/2 ) + n

= 2 * c * ( n/2 ) + n

= c * n + n

= ( c + 1 ) * n

≤ c * n ???

FALSE
Substitution
T( n ) = 2 * T( n/2 ) + n

New guess:
T(n) is O(nlog2n)
Substitution
T( n ) = 2 * T( n/2 ) + n

Let T(n) == c*nlog2n


Substitution
T( n ) = 2 * T( n/2 ) + n

= 2 * ( c * n/2 * log2( n/2 ) ) + n

= c * n * log2( n/2 ) + n

= c * n * ( log2( n ) – log2( 2 ) ) + n

log2( 2 ) == 1
Substitution
T( n ) = 2 * T( n/2 ) + n

= 2 * ( c * n/2 * log2( n/2 ) ) + n

= c * n * log2( n/2 ) + n

= c * n * ( log2n – 1 ) + n

= c * nlog2n – c * n + n

≤ c * nlog2n ???

TRUE, when c ≥ 1
Substitution
T( n ) = 2 * T( n/2 ) + n

= 2 * ( c * n/2 * log2( n/2 ) ) + n

= c * n * log2( n/2 ) + n

= c * n * ( log2n – 1 ) + n

= c * nlog2n – c * n + n

≤ c * nlog2n ???

Hence, T(n) is O(nlog2n)


Iterative
T( n ) = 2 * T( n/2 ) + n

Repeatedly plug in T(n)


Iterative
T( n ) = 2 * T( n/2 ) + n

= 2 * ( 2 * T( n/4 ) + n/2 ) + n

= 2 * ( 2 * ( 2 * T( n/8 ) + n/4 ) + n/2 ) + n

= 2^i * T( n/2^i ) + i * n

Max value of i == logn, since that is when n/2^i reaches the base case of 1. Hence, …
Iterative
T( n ) = 2 * T( n/2 ) + n

= 2^logn * T( n/2^logn ) + nlogn

2^logn == n
n/2^logn == 1
Iterative
T( n ) = 2 * T( n/2 ) + n

= 2^logn * T( n/2^logn ) + nlogn

= n * T( 1 ) + nlogn

≈ n*c + nlogn

≈ nlogn

Hence, T(n) is O(nlogn)
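As a sanity check (an added sketch, not from the original slides), the recurrence can also be evaluated numerically; the hypothetical helper T() below computes T( n ) = 2 * T( n/2 ) + n with T( 1 ) = 1 directly, and the printed ratio T(n) / (n * log2 n) levels off as n grows, which is consistent with O(nlogn):

#include <cstdio>
#include <cmath>

// Evaluates T(n) = 2*T(n/2) + n with T(1) = 1 directly from the definition.
long long T(long long n) {
    if (n <= 1) return 1;
    return 2 * T(n / 2) + n;
}

int main() {
    // Compare T(n) against n*log2(n) for growing powers of two; the ratio
    // settles toward a constant, as expected for an O(n log n) function.
    for (long long n = 2; n <= (1LL << 20); n *= 4) {
        double ratio = (double)T(n) / (n * std::log2((double)n));
        std::printf("n = %8lld   T(n)/(n log2 n) = %.3f\n", n, ratio);
    }
    return 0;
}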


Tree Method
T( n ) = 2 * T( n/2 ) + n     T( 1 ) = 1

Level 0:  1 node of size n       → work = n
Level 1:  2 nodes of size n/2    → work = 2 * n/2 = n
Level 2:  4 nodes of size n/4    → work = 4 * n/4 = n
…
Last level:  n nodes of size 1   → work = n * 1 = n

Number of levels = logn + 1

Total time = n + n + … + n = n * (logn + 1)
T(n) is O(nlogn)
Tree Method
T( n ) = T( n-1 ) + n     T( 1 ) = 1

Level 0:  T( n )     → work = n
Level 1:  T( n-1 )   → work = n-1
Level 2:  T( n-2 )   → work = n-2
…
Last level:  T( 1 )  → work = 1

Number of levels = n
Total time = n + (n-1) + (n-2) + … + 1 = n(n+1)/2 ≈ n^2
T(n) is O(n^2)
Master Theorem
Most straightforward. Given an equation of this form:

T( n ) = a * T( n/b ) + f( n )

where f(n) = c(n) + d(n), the property that returns TRUE gives you your answer:

• T( n ) = Θ( n^(log_b a) )          if f( n ) = O( n^(log_b a − ε) ) for some constant ε > 0

• T( n ) = Θ( n^(log_b a) * logn )   if f( n ) = Θ( n^(log_b a) )

• T( n ) = Θ( f( n ) )               if f( n ) = Ω( n^(log_b a + ε) ) for some constant ε > 0,
                                     and a * f( n/b ) ≤ c * f( n ) for some constant c < 1
Master Theorem
Example:

T( n ) = 2 * T( n/2 ) + 5

Here a = 2, b = 2, so n^(log_b a) = n^1 = n.

Is f( n ) = O( n^(log_b a − ε) ) for some constant ε > 0?

Yes: f( n ) = 5 is O( n^(1 − ε) ) for ε = 1, so T( n ) = Θ( n^(log_b a) ) = Θ( n ).
Master Theorem
Example:

T( n ) = 2 * T( n/2 ) + n

Is f( n ) = O( n^(log_b a − ε) ) for some constant ε > 0?  No: f( n ) = n is not O( n^(1 − ε) ).

Is f( n ) = Θ( n^(log_b a) )?  Yes: f( n ) = n = Θ( n ), so T( n ) = Θ( n^(log_b a) * logn ) = Θ( nlogn ).
Master Theorem
Example:

T( n ) = 4 * T( n/2 ) + n^3

Here a = 4, b = 2, so n^(log_b a) = n^(log_2 4) = n^2.

Is f( n ) = O( n^(log_b a − ε) ) for some constant ε > 0?  No: n^3 is not O( n^(2 − ε) ).

Is f( n ) = Θ( n^(log_b a) )?  No: n^3 ≠ Θ( n^2 ).
Master Theorem
Example:

T( n ) = 4 * T( n/2 ) + n^3

Is f( n ) = Ω( n^(log_b a + ε) ) for some constant ε > 0?

Yes: n^3 = Ω( n^(2 + ε) ) for ε = 1. (First test passed)

Is a * f( n/b ) ≤ c * f( n ) for some constant c < 1?

Yes: 4 * ( n/2 )^3 = n^3 / 2 ≤ c * n^3 for c = 1/2. (Both tests passed)
Master Theorem
Example:

T( n ) = 4 * T( n/2 ) + n^3

Hence, T( n ) is Θ( f( n ) ) = Θ( n^3 )
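To tie the three cases together, here is a small classifier sketch (added for illustration, not from the original slides; the classify() helper is hypothetical). It assumes the driving function is a plain polynomial f( n ) = n^k, computes the critical exponent log_b(a), and picks the matching case; for polynomial f(n) in Case 3 the regularity condition a * f( n/b ) ≤ c * f( n ) automatically holds with c = a / b^k < 1:

#include <cmath>
#include <cstdio>
#include <string>

// Classify T(n) = a*T(n/b) + n^k with the Master Theorem, assuming f(n) = n^k (k >= 0).
std::string classify(double a, double b, double k) {
    double crit = std::log(a) / std::log(b);   // log_b(a), the critical exponent
    const double eps = 1e-9;                   // tolerance for comparing exponents
    if (std::fabs(k - crit) < eps)             // Case 2: f(n) = Theta( n^(log_b a) )
        return "Theta( n^" + std::to_string(crit) + " * logn )";
    if (k < crit)                              // Case 1: f(n) = O( n^(log_b a - eps) )
        return "Theta( n^" + std::to_string(crit) + " )";
    // Case 3: k > log_b(a); regularity holds for polynomial f(n) as noted above.
    return "Theta( n^" + std::to_string(k) + " )";
}

int main() {
    std::printf("T(n) = 2*T(n/2) + 5   -> %s\n", classify(2, 2, 0).c_str()); // Theta(n)
    std::printf("T(n) = 2*T(n/2) + n   -> %s\n", classify(2, 2, 1).c_str()); // Theta(n*logn)
    std::printf("T(n) = 4*T(n/2) + n^3 -> %s\n", classify(4, 2, 3).c_str()); // Theta(n^3)
    return 0;
}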
