Chapter 3 Dynamic Programming

- The Binomial Coefficient
- Floyd’s Algorithm for Shortest Paths
- Dynamic Programming and Optimization Problems
- Chained Matrix Multiplication
- The Traveling Salesperson Problem

Divide-and-Conquer
- top-down approach
- divides an instance of a problem into smaller instances
- works best when the smaller instances are unrelated
  - e.g., in mergesort, the two halves are sorted independently
- not suitable when the smaller instances are related (overlapping subproblems)
  - e.g., Fibonacci: F(n) = F(n-1) + F(n-2)
  - computing F(4) and F(3) both requires F(2), so it gets recomputed

Dynamic Programming
- bottom-up approach
- divide an instance into smaller instances
- solve the small instances first, store their solutions, and look them up
  later instead of recomputing them
- use an array (table) to store the solutions
  - e.g., Fibonacci: F[n] = F[n-1] + F[n-2] (see the sketch below)
- steps:
  1. Establish a recursive property that relates an instance of the problem
     to smaller instances.
  2. Solve an instance of the problem in a bottom-up fashion, solving the
     smaller instances first.
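A minimal Python sketch of the contrast (my own illustration, not from the text):

# Top-down divide-and-conquer: the two recursive calls overlap, so values
# such as F(2) are recomputed over and over.
def fib_recursive(n):
    if n <= 1:
        return n
    return fib_recursive(n - 1) + fib_recursive(n - 2)

# Bottom-up dynamic programming: each F[i] is computed once and stored.
def fib_bottom_up(n):
    F = [0] * (n + 1)
    if n >= 1:
        F[1] = 1
    for i in range(2, n + 1):
        F[i] = F[i - 1] + F[i - 2]
    return F[n]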


3.1 The Binomial Coefficient
 n n! for 0<=k<=n
   
 k  k!n  k !
 15 
   ?
7

  n  1  n  1
 n    , 0k n
    k  1  k 
 k  1, k  0 or k  n


3.1 The Binomial Coefficient


- Algorithm 3.1 (Binomial Coefficient using Divide-and-Conquer) pp. 96
  - bin(n, k) = bin(n-1, k-1) + bin(n-1, k)
  - inefficient: bin(n-2, k-1) is recomputed when computing both
    bin(n-1, k-1) and bin(n-1, k)


3.1 The Binomial Coefficient


- using dynamic programming
  1. Establish a recursive property:
     B[i][j] = B[i-1][j-1] + B[i-1][j]   if 0 < j < i
     B[i][j] = 1                         if j = 0 or j = i
  2. Solve an instance of the problem in a bottom-up fashion, solving the
     smaller instances first: loop over the rows i and, within each row, over
     the columns j, computing B[i][j] (see the sketch below)
- Fig. 3.1 pp. 98
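A minimal Python sketch of the bottom-up computation (my own rendering of the
recurrence above, in the spirit of Algorithm 3.2):

def bin_coefficient(n, k):
    # B[i][j] holds C(i, j); only columns 0..min(i, k) are ever needed
    B = [[0] * (k + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        for j in range(min(i, k) + 1):
            if j == 0 or j == i:
                B[i][j] = 1                               # base cases
            else:
                B[i][j] = B[i - 1][j - 1] + B[i - 1][j]   # recursive property
    return B[n][k]

# e.g., bin_coefficient(15, 7) == 6435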


3.1 The Binomial Coefficient


- Example 3.1 pp. 98
- Algorithm 3.2 pp. 99
- Analysis
  - total number of passes of the for-j loop, pp. 100:

    i:      0   1   2   ...   k-1    k      k+1    ...    n
    passes: 1 + 2 + 3 + ... +  k  + (k+1) + (k+1) + ... + (k+1)
    total = k(k+1)/2 + (n-k+1)(k+1) ∈ Θ(nk)
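For instance, with n = 5 and k = 2 the passes are 1 + 2 + 3 + 3 + 3 + 3 = 15,
which matches k(k+1)/2 + (n-k+1)(k+1) = 3 + 4·3 = 15.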


- Improvements
  - use only a one-dimensional array: B[i][j] depends only on B[i-1][j-1] and
    B[i-1][j] in the previous row, so a single row can be updated in place
3.1 The Binomial Coefficient
- another improvement: use the symmetry \binom{n}{k} = \binom{n}{n-k}
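A Python sketch combining both improvements (my own code, not the textbook's):
because B[i][j] depends only on row i-1, a single row updated right-to-left
suffices, and the symmetry above keeps that row at most min(k, n-k)+1 entries
long.

def bin_coefficient_1d(n, k):
    k = min(k, n - k)                    # symmetry: C(n, k) = C(n, n-k)
    B = [0] * (k + 1)
    B[0] = 1
    for i in range(1, n + 1):
        # go right-to-left so B[j-1] still holds the previous row's value
        for j in range(min(i, k), 0, -1):
            B[j] = B[j] + B[j - 1]
    return B[k]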


3.2 Floyd’s Algorithm for Shortest Paths
- graph G = (V, E)
  - vertices
  - edges
- digraph (directed graph)
- weights (weighted graph)
- path (a sequence of vertices), e.g., [v1, v4, v3]


3.2 Floyd’s Algorithm for Shortest Paths


- cycle, e.g., [v1, v4, v5, v1]
- cyclic: G contains a cycle
- acyclic: G contains no cycle
- simple path: never passes through the same vertex twice
- length of a path: sum of the weights on its edges
  - length of [v1, v4, v3] = 1 + 2 = 3
- shortest path
  - length of [v1, v2, v3] = 1 + 3 = 4
  - length of [v1, v4, v3] = 1 + 2 = 3 → shortest
  - length of [v1, v2, v4, v3] = 1 + 2 + 2 = 5
3.2 Floyd’s Algorithm for Shortest Paths
- Shortest Path Problem
  - an optimization problem (minimize or maximize some objective)
  - the optimal solution may not be unique
- an obvious algorithm
  - for each vertex, determine the lengths of all paths from it to every
    other vertex and select the minimum
  - worse than exponential time: (n-2)(n-3)·...·1 = (n-2)! paths between a
    pair of vertices pass through all the remaining vertices (n-2 choices
    for the 2nd vertex, n-3 for the 3rd, and so on)


3.2 Floyd’s Algorithm for Shortest Paths


- a dynamic programming approach with cubic time
- adjacency matrix W[i][j]
  - weight of the edge from vi to vj; ∞ if there is no edge; 0 if i = j


3.2 Floyd’s Algorithm for Shortest Paths


- shortest path lengths matrix D[i][j]
  - e.g., in Fig. 3.2, D[3][5] = 7
- goal: find a way to calculate D from W
- D(k)[i][j]
  - length of a shortest path from vi to vj that uses only vertices in the
    set {v1, v2, ..., vk} as intermediate vertices
  - e.g., Example 3.2 pp. 103

3.2 Floyd’s Algorithm for Shortest Paths
- compute D(k)[i][j] by dynamic programming
  - D(0) = W
  - D(n) = D
  1. Establish a recursive property: compute D(k) from D(k-1)
  2. Solve in a bottom-up fashion: D(0) (= W), D(1), D(2), ..., D(n) (= D)


3.2 Floyd’s Algorithm for Shortest Paths


- Step 1: two cases
  (1) at least one shortest path from vi to vj does not pass through vk
      D(k)[i][j] = D(k-1)[i][j]
      e.g., D(5)[1][3] = D(4)[1][3] = 3
      → [v1, v4, v3] is still shortest even when v5 may be used
  (2) all shortest paths from vi to vj pass through vk
      Fig. 3.4 pp. 105 (a subpath of a shortest path is itself shortest)
      D(k)[i][j] = D(k-1)[i][k] + D(k-1)[k][j]

3.2 Floyd’s Algorithm for Shortest Paths
- combining the two cases:
  D(k)[i][j] = min( D(k-1)[i][j],                     ← Case 1
                    D(k-1)[i][k] + D(k-1)[k][j] )     ← Case 2
- Step 2:
  - create the sequence of arrays D(0), D(1), ..., D(n), computing D(n)
    from D(0) (see the sketch below)
  - e.g., Example 3.3 pp. 106
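A minimal Python sketch in the spirit of Algorithm 3.3 (the in-place update of
a single D array is a standard simplification: row k and column k do not
change during pass k, so keeping all of D(0), ..., D(n) is unnecessary):

INF = float('inf')                         # entry for "no edge" in W

def floyd(W):
    n = len(W)
    D = [row[:] for row in W]              # D(0) = W
    for k in range(n):                     # allow vk as an intermediate vertex
        for i in range(n):
            for j in range(n):
                D[i][j] = min(D[i][j],                  # case 1: avoid vk
                              D[i][k] + D[k][j])        # case 2: go through vk
    return D                               # D(n) = D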

3.2 Floyd’s Algorithm for Shortest Paths
- Every-case time complexity of Algorithm 3.3, pp. 107: Θ(n³)
- Algorithm 3.4 (Floyd’s Algorithm for Shortest Paths 2): also records the
  information needed to output a shortest path
  - e.g., Fig. 3.5 pp. 108
- Algorithm 3.5 (Print Shortest Path)
  - e.g., in Fig. 3.5, for q = 5 and r = 3 the path is [v5, v1, v4, v3]
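A sketch in the spirit of Algorithms 3.4 and 3.5 (my own 0-indexed code, so
vertex vi is index i-1): P[q][r] records an intermediate vertex on a shortest
path from the vertex at index q to the one at index r, or -1 if none is needed.

def floyd2(W):
    n = len(W)
    D = [row[:] for row in W]
    P = [[-1] * n for _ in range(n)]        # -1 means "no intermediate vertex"
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if D[i][k] + D[k][j] < D[i][j]:
                    D[i][j] = D[i][k] + D[k][j]
                    P[i][j] = k             # this vertex lies on the path
    return D, P

def print_path(P, q, r):
    # prints the intermediate vertices on a shortest path from q to r
    if P[q][r] != -1:
        k = P[q][r]
        print_path(P, q, k)
        print("v" + str(k + 1))
        print_path(P, k, r)

For the graph of Fig. 3.5, print_path(P, 4, 2) (q = 5, r = 3 in 1-based terms)
should print the intermediate vertices v1 and v4 of the path [v5, v1, v4, v3],
assuming the same shortest path is recorded as in the figure.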

3.3 Dynamic Programming and Optimization Problems
1. Establish a recursive property that gives the optimal solution to an
   instance of the problem.
2. Compute the value of an optimal solution in a bottom-up fashion.
3. Construct an optimal solution in a bottom-up fashion.

3.3 Dynamic Programming and Optimization Problems
- Principle of Optimality:
  - an optimal solution to an instance always contains optimal solutions to
    all of its subinstances
  - e.g., Example 3.4 pp. 110


3.4 Chained Matrix Multiplication

 2×3×4 multiplications (standard method)


 (i×j) * (j×k) matrix
  i×j ×k multiplications


3.4 Chained Matrix Multiplication


A × B × C × D
(20×2) (2×30) (30×12) (12×8)
- A(B(CD)) = 30×12×8 + 2×30×8 + 20×2×8 = 3680
- (AB)(CD) = ........... = 8880
- A((BC)D) = ........... = 1232
- ((AB)C)D = ........... = 10320
- (A(BC))D = ........... = 3120
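The elided costs can be checked mechanically; a small brute-force Python
sketch (my own helper, not from the text) over the dimension list
d = [20, 2, 30, 12, 8]:

def all_orders(d, i, j, names):
    # every parenthesization of A_i ... A_j with its multiplication count,
    # where A_m is a d[m] x d[m+1] matrix (0-indexed)
    if i == j:
        return [(0, names[i])]
    results = []
    for k in range(i, j):                   # split between A_k and A_(k+1)
        for lcost, left in all_orders(d, i, k, names):
            for rcost, right in all_orders(d, k + 1, j, names):
                cost = lcost + rcost + d[i] * d[k + 1] * d[j + 1]
                results.append((cost, "(" + left + right + ")"))
    return results

for cost, expr in sorted(all_orders([20, 2, 30, 12, 8], 0, 3, "ABCD")):
    print(expr, cost)                       # 1232, 3120, 3680, 8880, 10320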

3.4 Chained Matrix Multiplication
- how to determine the optimal order for A1 × A2 × ... × An?
- brute force: consider every possible order and take the minimum
  (exponential time)
- tn: the number of different orders for A1 × A2 × ... × An
  - A1 × (A2 × A3 × ... × An) → tn-1 orders
  - (A1 × A2 × A3 × ...) × An → tn-1 orders
  - tn ≥ tn-1 + tn-1 = 2 tn-1 ; t2 = 1
  - tn ≥ 2^(n-2)
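Unrolling the recurrence gives the stated bound:
t_n ≥ 2 t_(n-1) ≥ 2² t_(n-2) ≥ ... ≥ 2^(n-2) t_2 = 2^(n-2).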


3.4 Chained Matrix Multiplication


- the principle of optimality holds for A1 × A2 × ... × An
  - if A1((((A2A3)A4)A5)A6) is an optimal order,
    then (A2A3)A4 must be an optimal order for A2 × A3 × A4 as well
- so dynamic programming can be used to solve the problem

3.4 Chained Matrix Multiplication
- let d0 = number of rows in A1
  and dk = number of columns in Ak, 1 <= k <= n
  (so Ak is a d(k-1) × dk matrix)
- e.g., Figure 3.7:
  A1 × A2 × A3 × A4
  (d0×d1) (d1×d2) (d2×d3) (d3×d4)
- Example 3.5 pp. 109


3.4 Chained Matrix Multiplication


- M[i][j] = minimum number of multiplications needed to compute Ai × ... × Aj
- M[i][i] = 0
- Example 3.5 pp. 114

3.4 Chained Matrix Multiplication
- consider multiplying six matrices (i.e., computing M[1][6])
  1. A1 × (A2×A3×A4×A5×A6)
  2. (A1×A2)(A3×A4×A5×A6)
  ..............................
- if the split (A1×A2)(A3×A4×A5×A6) is optimal, then both (A1×A2) and
  (A3×A4×A5×A6) must themselves be multiplied optimally:
  cost = M[1][2] + M[3][6] + d0·d2·d6
- in general, for a split after Ak:
  M[1][6] = min over 1 ≤ k ≤ 5 of ( M[1][k] + M[k+1][6] + d0·dk·d6 )

3.4 Chained Matrix Multiplication


M i  j   min M i k   M k  1 j   d i 1d k d j , if i  j
i  k  j 1

M i i   0

3.4 Chained Matrix Multiplication
- in what order should the entries M[i][j] be computed?
  - M[i][j] is computed from the entries on row i to its left and the
    entries in column j beneath it
  - so compute the entries diagonal by diagonal: diagonal 0, 1, ..., n-1
    (see the sketch below)
- See Example 3.6 pp. 115
- See Fig. 3.8 pp. 116
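A compact Python sketch of this diagonal-by-diagonal computation, in the
spirit of Algorithm 3.6 (the split-point table P, used to reconstruct the
optimal order, is a standard companion and an assumption here):

def minmult(d):
    # A_i is a d[i-1] x d[i] matrix; there are n = len(d) - 1 matrices
    n = len(d) - 1
    M = [[0] * (n + 1) for _ in range(n + 1)]    # M[i][j], 1-indexed
    P = [[0] * (n + 1) for _ in range(n + 1)]    # best split point for A_i..A_j
    for diag in range(1, n):                     # diagonal 1, 2, ..., n-1
        for i in range(1, n - diag + 1):
            j = i + diag
            cost, k = min((M[i][s] + M[s + 1][j] + d[i - 1] * d[s] * d[j], s)
                          for s in range(i, j))
            M[i][j], P[i][j] = cost, k
    return M[1][n], P

# e.g., minmult([20, 2, 30, 12, 8])[0] == 1232, matching the A, B, C, D example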


3.4 Chained Matrix Multiplication


- Every-Case Time Complexity of Algorithm 3.6, pp. 118: Θ(n³)

3.4 Chained Matrix Multiplication
- See Fig. 3.9, pp. 119


3.6 The Traveling Salesperson Problem
- find a shortest simple cycle that passes through every vertex
- tour (Hamiltonian circuit)
  - a path from a vertex back to itself that passes through every other
    vertex exactly once
- optimal tour
  - a tour of minimum length
- See Fig. 3.16 pp. 131


3.6 Traveling Salesperson Problem


- Length of an optimal tour (v1 → ... → v1):
  \min_{2 \le j \le n} ( W[1][j] + D[v_j][V - \{v_1, v_j\}] )
  where
  D[v_i][A] = \min_{j : v_j \in A} ( W[i][j] + D[v_j][A - \{v_j\}] ),  if A \ne \emptyset
  D[v_i][\emptyset] = W[i][1]

3.6 Traveling Salesperson Problem
- See Example 3.11, pp. 133
- Algorithm 3.11
- Every-Case Time and Space Complexity of Algorithm 3.11, pp. 135:
  Θ(n² 2ⁿ) time and Θ(n 2ⁿ) memory

The End
