ISYE 6210 Theory of Production Scheduling Constraint Generation Methods

Thomas Sharkey

April 23, 2013

Background
We have previously seen the core idea behind constraint generation methods:
We have a large number of constraints in the problem. We solve a relaxation of the problem with a reduced set of constraints, and then determine whether the optimal solution to the relaxation satisfies all of the constraints in the original problem.
If so, then the current solution is optimal for the original problem. Otherwise, we add a subset of the violated constraints to our relaxation.
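As a rough illustration, the loop can be sketched as follows. This is only a schematic, not anything from the lecture: the helpers solve_relaxation and find_violated are hypothetical placeholders whose implementation depends entirely on the problem at hand.

```python
# A minimal sketch of the generic constraint generation loop described above.
# solve_relaxation() and find_violated() are hypothetical placeholders.

def constraint_generation(initial_constraints, solve_relaxation, find_violated,
                          max_iterations=1000):
    """Solve a relaxation, check for violated constraints, add them, repeat."""
    active = list(initial_constraints)
    for _ in range(max_iterations):
        solution = solve_relaxation(active)   # solve over the reduced constraint set
        violated = find_violated(solution)    # check against the full constraint set
        if not violated:
            return solution                   # feasible for the original problem, hence optimal
        active.extend(violated)               # add a subset of the violated constraints
    raise RuntimeError("constraint generation did not converge within the iteration limit")
```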

Note that it may actually be beneficial to formulate a problem with a large number of constraints.
This is a similar idea to the set-partitioning formulation of the GAP. We will see a very similar method in Dantzig-Wolfe decomposition.

Benders Decomposition: Traditional Motivation


Suppose that we are running a scheduling system where we have a set of customers j = 1, . . . , n and a set of facilities i = 1, . . . , m. However, unlike previous models discussed in class, the processing times of the jobs are uncertain. In certain situations, the assignment of the customers to facilities (i.e., machines) must be done well before the characteristics of the jobs are known with certainty. The actual scheduling of the jobs associated with the customers assigned to a facility, though, can be done after the characteristics of the jobs are known. In this situation, we wish to allocate the customers to the facilities in such a way as to minimize our expected cost over all possible scenarios of the job characteristics.

Benders Decomposition: Stochastic Scheduling


We will let x denote the first-stage decisions, i.e., the assignment of customers to facilities. Note that the cost of the second-stage decisions will be a function of (i) the first-stage decisions x and (ii) the characteristics of the jobs. We will denote by y_k the second-stage scheduling decisions for scenario k (where k = 1, . . . , K). We will have first-stage constraints and second-stage constraints. Our objective function will be equal to:
$$\min\; c^\top x + \sum_{k=1}^{K} \alpha_k f^\top y_k,$$

where α_k denotes the probability (weight) of scenario k.

Benders Decomposition Formulation

The formulation of our problem becomes:


$$\begin{aligned}
\text{minimize}\quad & c^\top x + \sum_{k=1}^{K} \alpha_k f^\top y_k && \text{(P)}\\
\text{subject to}\quad & Ax = b\\
& B_k x + D y_k = d_k && \text{for } k = 1, \dots, K\\
& x \in X,\; y_k \in Y.
\end{aligned}$$

The Idea

We will try to remove the variables y_k from the formulation. This is accomplished by introducing constraints on the x variables that bound the objective function (or some portion of it) from below. For now, let's assume that the y variables are continuous and that there always exists a feasible second-stage solution for any first-stage solution x. We will see how we can reformulate (P). We will define the function z_k(x) to be equal to the optimal objective function value in scenario k given first-stage decisions x.
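Written out explicitly in the notation of (P), and under the assumptions above, z_k(x) is the optimal value of the scenario-k second-stage linear program (this is the subproblem (SP(k)) stated later):

$$z_k(x) \;=\; \min\bigl\{\, f^\top y_k \;:\; D y_k = d_k - B_k x,\ y_k \ge 0 \,\bigr\}.$$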

First Reformulation

The problem (P) now becomes:


$$\begin{aligned}
\text{minimize}\quad & c^\top x + \sum_{k=1}^{K} \alpha_k z_k(x) && \text{(P1)}\\
\text{subject to}\quad & Ax = b\\
& x \in X.
\end{aligned}$$

The Second-Stage Problem

Given first-stage decisions x, the second-stage problem for scenario k is:

$$\begin{aligned}
\text{minimize}\quad & f^\top y_k && \text{(SP($k$))}\\
\text{subject to}\quad & D y_k = d_k - B_k x\\
& y_k \ge 0.
\end{aligned}$$

Its dual problem is:

$$\begin{aligned}
\text{maximize}\quad & p_k^\top (d_k - B_k x) && \text{(D($k$))}\\
\text{subject to}\quad & p_k^\top D \le f^\top.
\end{aligned}$$

Getting to the Second Reformulation

By the definition of z_k(x) and linear programming duality, we can express z_k(x) as the optimal value of (D(k)). We define EP to be the set of extreme points of the feasible region of (D(k)) and note that:

$$z_k(x) = \max_{i \in EP}\; (p^i)^\top (d_k - B_k x).$$

Alternatively, z_k(x) is the smallest value of z_k such that:

$$(p^i)^\top (d_k - B_k x) \le z_k \quad \text{for all } i \in EP.$$

Second Reformulation

The problem (P1) now becomes:


$$\begin{aligned}
\text{minimize}\quad & c^\top x + \sum_{k=1}^{K} \alpha_k z_k && \text{(P2)}\\
\text{subject to}\quad & Ax = b\\
& (p^i)^\top (d_k - B_k x) \le z_k && \text{for all } i \in EP \text{ and } k = 1, \dots, K\\
& x \in X.
\end{aligned}$$

Constraint Generation Approach/Cutting Plane Algorithm

We will relax (P2) by removing the extreme point constraints; we refer to this relaxation as (R-P2). We then solve (R-P2) over this reduced set of constraints and arrive at a solution (x*, z*). We then need to either verify that this solution is optimal or determine which constraints (i.e., cuts) need to be added to the problem.
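The sketch below puts the pieces together for the linear case. It is purely illustrative and not from the lecture: it assumes X = {x ≥ 0}, assumes (as above) that every second-stage problem is feasible and bounded for any first-stage x, and assumes a crude lower bound z_lower on each scenario cost so that the initial master problem is bounded. The data names (c, A, b, alpha, f, B, D, d) follow the notation of (P).

```python
# A sketch of the Benders cutting-plane loop for (P2)/(R-P2).
# c, A, b, alpha, f are NumPy arrays; B and d are lists indexed by scenario k.
# (Error handling of the LP solves is omitted for brevity.)

import numpy as np
from scipy.optimize import linprog

def benders(c, A, b, alpha, f, B, D, d, z_lower=0.0, max_iter=100, tol=1e-8):
    K, n = len(alpha), len(c)
    cuts = [[] for _ in range(K)]          # cuts[k] holds dual extreme points p^i

    for _ in range(max_iter):
        # --- master problem (R-P2): variables (x, z_1, ..., z_K) ---
        obj = np.concatenate([c, alpha])
        A_eq = np.hstack([A, np.zeros((A.shape[0], K))])
        A_ub, b_ub = [], []
        for k in range(K):
            for p in cuts[k]:
                # cut p'(d_k - B_k x) <= z_k  rewritten as  -p'B_k x - z_k <= -p'd_k
                row = np.zeros(n + K)
                row[:n] = -(p @ B[k])
                row[n + k] = -1.0
                A_ub.append(row)
                b_ub.append(-(p @ d[k]))
        bounds = [(0, None)] * n + [(z_lower, None)] * K
        master = linprog(obj,
                         A_ub=np.array(A_ub) if A_ub else None,
                         b_ub=np.array(b_ub) if b_ub else None,
                         A_eq=A_eq, b_eq=b, bounds=bounds, method="highs")
        x, z = master.x[:n], master.x[n:]

        # --- pricing: solve each dual subproblem (D(k)) at the current x ---
        added_cut = False
        for k in range(K):
            rhs = d[k] - B[k] @ x
            # maximize p'rhs s.t. D'p <= f, p free  ==  minimize -rhs'p
            dual = linprog(-rhs, A_ub=D.T, b_ub=f,
                           bounds=[(None, None)] * D.shape[0], method="highs")
            p = dual.x
            if p @ rhs > z[k] + tol:       # extreme-point constraint violated by (x, z)
                cuts[k].append(p)
                added_cut = True
        if not added_cut:
            return x, z                    # all constraints of (P2) satisfied: optimal
    raise RuntimeError("iteration limit reached")
```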

The Pricing Problem

We have the option of solving either the primal or the dual problem associated with z_k(x*). The dual problem will immediately give us the constraint to be added to (R-P2), i.e., one that is violated by (x*, z*). The primal problem will give us an optimal solution y_k*.
It is easy to verify whether this solution has an objective function value strictly greater than z_k*. We would then know that a constraint needs to be added to (R-P2), but it is not immediately clear which one. Recall that there is a complementary dual solution to y_k* that is optimal to (D(k)).
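Here is a small illustration of the two options for a single scenario, with made-up data (not from the lecture). The dual solve hands us the cut coefficients p directly, while the primal solve gives y_k* and its cost; the complementary dual solution can typically be recovered from the solver's equality-constraint marginals, though sign conventions vary and are worth checking against an explicit dual solve as done here.

```python
# One scenario k with the first-stage x fixed; f, D and rhs = d_k - B_k x are
# illustrative data only.

import numpy as np
from scipy.optimize import linprog

f = np.array([2.0, 4.0, 1.0])
D = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
rhs = np.array([4.0, 3.0])                    # plays the role of d_k - B_k x

# Dual route (D(k)): maximize p'rhs s.t. D'p <= f, p free.
# The optimal p gives the cut  p'(d_k - B_k x) <= z_k  to add to (R-P2).
dual = linprog(-rhs, A_ub=D.T, b_ub=f,
               bounds=[(None, None)] * 2, method="highs")
p = dual.x
print("z_k(x) from the dual:  ", p @ rhs)     # 11.0 with this data

# Primal route (SP(k)): minimize f'y s.t. Dy = rhs, y >= 0. Its optimal value
# tells us whether the current z_k is too small; the equality-constraint
# marginals reported by HiGHS recover the complementary dual solution
# (compare against p above to confirm the sign convention).
primal = linprog(f, A_eq=D, b_eq=rhs,
                 bounds=[(0, None)] * 3, method="highs")
print("z_k(x) from the primal:", primal.fun)  # 11.0 with this data
print("marginals:", primal.eqlin.marginals)
```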

The Relaxation at Any Point

$$\begin{aligned}
\text{minimize}\quad & c^\top x + \sum_{k=1}^{K} \alpha_k z_k && \text{(R-P2)}\\
\text{subject to}\quad & Ax = b\\
& \text{constraints on (portions of) the objective function}\\
& x \in X.
\end{aligned}$$

A Generic Problem for Benders Decomposition

$$\begin{aligned}
\text{minimize}\quad & f(x, y) && \text{(P)}\\
\text{subject to}\quad & \text{the constraints } C(x, y) \text{ are satisfied}\\
& x \in D_x,\; y \in D_y.
\end{aligned}$$

Generic Relaxation

$$\begin{aligned}
\text{minimize}\quad & z && \text{(R-P2)}\\
\text{subject to}\quad & z \ge B_{x^h}(x) && \text{for } h = 1, \dots, H\\
& x \in D_x,
\end{aligned}$$

where x^h is a previous optimal solution to the relaxation and B_{x^h}(x) is a cut on the objective function.

The Cuts are Important

The driving factor in the success of a Benders decomposition is the quality of the cuts. Good cuts are readily available when the subproblems for a fixed x are continuous problems. But scheduling problems are often discrete (i.e., they involve integer variables) and/or combinatorial. The question now becomes how we can develop quality cuts for these types of problems.
