
Analysis and Synthesis of Algorithms

Design of Algorithms

Dynamic Programming
Introduction
Principle of Optimality and Recurrences
Graph Shortest Path Example

Copyright 2024, Pedro C. Diniz, all rights reserved.


Students enrolled in the DA class at Faculdade de Engenharia da Universidade do Porto (FEUP)
have explicit permission to make copies of these materials for their personal use.
Motivational Example
• Multi-Stage Graph
– A Labeled and Directed Graph G = (V,E)
– Vertices V can be partitioned into k ≥ 1 disjoint sets V1,…, Vk
– Edge <u,v> ∈ E ⇒ u ∈ Vi and v ∈ Vi+1, 1 ≤ i < k
– Each edge e ∈ E has a cost, c: E → Int
– Special source and sink nodes, |V1| = |Vk| = 1

[Figure: the example multi-stage graph, nodes 1-12, with a cost on each edge]
Analysis and Synthesis of Algorithms 2


Motivational Example

[Figure: the same multi-stage graph with its stages labeled V1 … V5; V1 contains the source, V5 the sink]
Minimum Cost Source to Sink Path

[Figure: the example multi-stage graph; source is node 1, sink is node 12]

• What Approach to Take?


– Will a Divide-and-Conquer approach work?
– Will a Greedy approach work?



Divide-and-Conquer Approach?

[Figure: the example multi-stage graph]



Divide-and-Conquer Approach?

[Figure: the example multi-stage graph]

• Problems:
– Concatenating the best paths of the sub-problems may not yield a valid path
– The best overall path may be suboptimal within individual sub-problems
Greedy Approach?

[Figure: the example multi-stage graph]

• Problems
– Guaranteed to always find a feasible path, but…
– Decisions are based on local information (a single stage at a time)
– Cannot “look ahead” to anticipate good or bad follow-on paths
Minimum Cost Source to Sink Path

[Figure: the example multi-stage graph]

• What Approach to Take?



Minimum Cost Source to Sink Path
• Enumerate All Paths?
– Recursively perform a Depth-First Traversal
– Compute each path’s cost, keeping a stack of the nodes on the current path
– Record the lowest cost; the nodes on the stack are the nodes of that path



Minimum Cost Source to Sink Path
• Enumerate All Paths?
– Recursively perform a Depth-First Traversal
– Compute each path’s cost, keeping a stack of the nodes on the current path
– Record the lowest cost; the nodes on the stack are the nodes of that path

[Figure: exhaustive enumeration of all source-to-sink paths, with m vertices at each of the s stages]

• Problem?
– Worst Case Complexity is O(m^s) for all Possible Paths
– Exponential in the Input Problem Size
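As a sketch of this exhaustive strategy, the depth-first enumeration described above can be written out directly. The `EDGES` and `STAGES` data below are read off the deck's worked example (nodes 1-12 in five stages), reconstructed from the min() expressions on the table-building slides, so treat them as illustrative:

```python
# Edge costs of the deck's example graph, reconstructed from its table slides.
EDGES = {
    (1, 2): 9, (1, 3): 7, (1, 4): 3, (1, 5): 2,
    (2, 6): 4, (2, 7): 2, (2, 8): 1,
    (3, 6): 2, (3, 7): 7,
    (4, 8): 11,
    (5, 7): 11, (5, 8): 8,
    (6, 9): 6, (6, 10): 5,
    (7, 9): 4, (7, 10): 3,
    (8, 10): 5, (8, 11): 6,
    (9, 12): 4, (10, 12): 2, (11, 12): 5,
}
STAGES = [[1], [2, 3, 4, 5], [6, 7, 8], [9, 10, 11], [12]]

def enumerate_paths(edges, stages):
    """Depth-first enumeration of every source-to-sink path: O(m^s) paths."""
    sink = stages[-1][0]
    best = (float("inf"), None)        # (lowest cost seen so far, its path)
    stack = [stages[0][0]]             # nodes on the current path

    def dfs(stage, cost_so_far):
        nonlocal best
        node = stack[-1]
        if node == sink:               # a complete path: record it if cheapest
            if cost_so_far < best[0]:
                best = (cost_so_far, list(stack))
            return
        for k in stages[stage + 1]:    # try every next-stage neighbor
            if (node, k) in edges:
                stack.append(k)
                dfs(stage + 1, cost_so_far + edges[(node, k)])
                stack.pop()

    dfs(0, 0)
    return best

print(enumerate_paths(EDGES, STAGES))  # (16, [1, 2, 7, 10, 12])
```

Every path is costed from scratch, which is exactly the recomputation the table-based approach on the following slides avoids.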
Principle of Optimality
Principle: If a solution is optimal, any portion of it should be optimal
with respect to all the other possible choices of that particular portion.

[Figure: an optimal source-to-sink path split at an intermediate node i]

– 1st portion: source → i should be optimal (w.r.t. all other paths from source to i)
– 2nd portion: i → sink should be optimal (w.r.t. all other paths from i to sink)

Corollary: If a partial solution is suboptimal, it need not be explored further.


How Does This Help Us?
• Suppose an optimal path (e1, e2, …, en, en+1) has been found.

This means that, by the optimality principle, the suffix

e3,…, en, en+1

is optimal with respect to all paths from the start of e3 to the sink

• To find an optimal path starting with e’1, e’2


e’1, e’2, e3,…, en, en+1

we can reuse the best path from e3 onwards.

• Thus, we can record partial solutions in a table, avoiding their
recomputation
Back to Our Example
• Principle of Optimality
– Shown by Contradiction
– Recurrence Relation uses Previous Stage Results
• Forward Approach
– cost(i,j) is cost of optimal path from node j at stage i to sink
– c(j,k) is the cost associated with the (j,k) edge

cost(i, j) = min { c(j, k) + cost(i+1, k) : k ∈ Vi+1, (j, k) ∈ E }

• Computation:
– In reverse from the sink backwards to the source.
– Recurrence is forward though…
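A minimal sketch of this backward table-filling in Python; the `EDGES` and `STAGES` data are reconstructed from the deck's table slides (nodes 1-12 in five stages) and are only illustrative:

```python
# The deck's example graph: edge costs reconstructed from its table slides.
EDGES = {
    (1, 2): 9, (1, 3): 7, (1, 4): 3, (1, 5): 2,
    (2, 6): 4, (2, 7): 2, (2, 8): 1,
    (3, 6): 2, (3, 7): 7,
    (4, 8): 11,
    (5, 7): 11, (5, 8): 8,
    (6, 9): 6, (6, 10): 5,
    (7, 9): 4, (7, 10): 3,
    (8, 10): 5, (8, 11): 6,
    (9, 12): 4, (10, 12): 2, (11, 12): 5,
}
STAGES = [[1], [2, 3, 4, 5], [6, 7, 8], [9, 10, 11], [12]]

def min_cost(edges, stages):
    """cost[j] = cheapest cost from node j to the sink, filled back to front."""
    sink = stages[-1][0]
    cost = {sink: 0}                           # primitive case: sink costs 0
    for i in range(len(stages) - 2, -1, -1):   # stages k-1 down to 1
        for j in stages[i]:
            # The recurrence: min over next-stage neighbors k of c(j,k) + cost(k).
            cost[j] = min(edges[(j, k)] + cost[k]
                          for k in stages[i + 1] if (j, k) in edges)
    return cost

cost = min_cost(EDGES, STAGES)
print(cost[1])   # minimum source-to-sink cost: 16
```

Note the loop runs from the stage before the sink back to the source, while the recurrence itself looks forward, exactly as the slide observes.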
Back to Our Example
• Principle of Optimality
– Shown by Contradiction
– Recurrence Relation uses Previous Stage Results
• Forward Approach
– cost(i,j) is cost of optimal path from node j at stage i to sink
– c(j,k) is the cost associated with the (j,k) edge

cost(i, j) = min { c(j, k) + cost(i+1, k) : k ∈ Vi+1, (j, k) ∈ E }

where j ∈ Vi, k ∈ Vi+1, and cost(i+1, k) is the cost of k to the sink

• Computation:

– In reverse from the sink backwards to the source.


– Recurrence is forward though…
Building the Table for the Example
[Figure: the example multi-stage graph]



Building the Table for the Example
[Figure: the example multi-stage graph]

Stage 4 (costs to the sink, node 12):
– cost(9) = 4 = min(4)   (9-12)
– cost(10) = 2 = min(2)  (10-12)
– cost(11) = 5 = min(5)  (11-12)



Building the Table for the Example
[Figure: the example multi-stage graph]

Stage 3:
– cost(6) = 7 = min(6+4, 5+2)  (6-12)
– cost(7) = 5 = min(4+4, 3+2)  (7-12)
– cost(8) = 7 = min(5+2, 6+5)  (8-12)
Stage 4:
– cost(9) = 4, cost(10) = 2, cost(11) = 5



Building the Table for the Example
[Figure: the example multi-stage graph]

Stage 2:
– cost(2) = 7 = min(4+7, 2+5, 1+7)  (2-12)
– cost(3) = 9 = min(2+7, 7+5)       (3-12)
– cost(4) = 18 = min(11+7)          (4-12)
– cost(5) = 15 = min(11+5, 8+7)     (5-12)
Stage 3: cost(6) = 7, cost(7) = 5, cost(8) = 7
Stage 4: cost(9) = 4, cost(10) = 2, cost(11) = 5
Building the Table for the Example
[Figure: the example multi-stage graph]

Stage 1:
– cost(1) = 16 = min(9+7, 7+9, 3+18, 2+15)  (1-12)
Stage 2: cost(2) = 7, cost(3) = 9, cost(4) = 18, cost(5) = 15
Stage 3: cost(6) = 7, cost(7) = 5, cost(8) = 7
Stage 4: cost(9) = 4, cost(10) = 2, cost(11) = 5
Determining the Paths
• Trace Back Table Values
– Keep a link to the node that produced the current minimum cost
– When the computation reaches the source, trace back the links to
recover the path
– Ties mean some branches may have multiple optimal sub-paths

• Example:
– Minimum Cost is 16
– Two paths:
• 1 → 2 → 7 → 10 → 12
• 1 → 3 → 6 → 10 → 12
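The traceback can be sketched on top of the same backward computation: recording every cost-minimizing next hop (not just one) and then expanding the links recovers both tied optimal paths. The graph data are again reconstructed from the deck's table slides and are only illustrative:

```python
# The deck's example graph: edge costs reconstructed from its table slides.
EDGES = {
    (1, 2): 9, (1, 3): 7, (1, 4): 3, (1, 5): 2,
    (2, 6): 4, (2, 7): 2, (2, 8): 1,
    (3, 6): 2, (3, 7): 7,
    (4, 8): 11,
    (5, 7): 11, (5, 8): 8,
    (6, 9): 6, (6, 10): 5,
    (7, 9): 4, (7, 10): 3,
    (8, 10): 5, (8, 11): 6,
    (9, 12): 4, (10, 12): 2, (11, 12): 5,
}
STAGES = [[1], [2, 3, 4, 5], [6, 7, 8], [9, 10, 11], [12]]

def optimal_paths(edges, stages):
    """Backward DP that records every cost-minimizing next hop, then
    expands all optimal source-to-sink paths by following those links."""
    sink = stages[-1][0]
    cost, succ = {sink: 0}, {sink: []}
    for i in range(len(stages) - 2, -1, -1):
        for j in stages[i]:
            choices = [(edges[(j, k)] + cost[k], k)
                       for k in stages[i + 1] if (j, k) in edges]
            cost[j] = min(c for c, _ in choices)
            succ[j] = [k for c, k in choices if c == cost[j]]  # keep ties

    def expand(node):                 # follow the links down to the sink
        if node == sink:
            return [[sink]]
        return [[node] + tail for k in succ[node] for tail in expand(k)]

    source = stages[0][0]
    return cost[source], expand(source)

best, paths = optimal_paths(EDGES, STAGES)
print(best, paths)   # 16 [[1, 2, 7, 10, 12], [1, 3, 6, 10, 12]]
```

Because the tie occurs at the source (9+7 = 7+9 = 16), the expansion yields exactly the two paths listed on this slide.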



Dynamic Programming Characteristics

• A Sequence of Decisions
– Impossible to make an optimal decision based only on “local” information
• Greedy Approaches Fail
– Decisions are Interdependent
• Divide-and-Conquer Approaches Fail
– Enumerates All Possible Alternatives
• But Avoids Recomputing them using a Table (memoization)

• Principle of Optimality Applies


– Different Enumeration Operations usually Share Sub-problems
– Store partial Solutions thus Avoiding recomputation
– Recursive Formulation
– Computation amounts to filling in a Table



Dynamic Programming Procedure

• Derive a Recurrence Equation


• Solve the Recurrence Equation by:
– Identifying Primitive Cases for Optimal Solutions
– Building Solutions Iteratively using the Optimality Criteria
– Building a Table and Filling it using the Recurrence Equation and
the Primitive Cases

• Time and Space Complexity


1. How Many Tables?
2. What is the Size of Each Table?
3. What is the Effort to Fill in the Entries of the Table(s)?
– Space Complexity: 1+2
– Time Complexity: 1+2+3
Complexity of the Example

• Time Complexity
– Nodes are Visited Once per Stage
– Edges are Visited Once to Compute the Recurrence Equation
cost(i, j) = min { c(j, k) + cost(i+1, k) : k ∈ Vi+1, (j, k) ∈ E }

– O(V+E) where V is the Number of Nodes and E the Number of Edges

• Compare with the O(m^s) exponential case!



Summary
• Introduction to Dynamic Programming
– Why Divide-and-Conquer and Greedy Approaches Won’t Work.
• Principle of Optimality
• Solving a Dynamic Programming Problem
– Recurrence Equation
– Table Building
• Example
– Graph Shortest Path for Stage Graphs

