Global Optimization Methods For Chemical Process Design: Deterministic and Stochastic Approaches
School of Chemical Engineering and Technology, Chonbuk National University, Jeonju 561-756, S. Korea
*Chemical Engineering Department, University of California Los Angeles, Los Angeles, CA 90095, USA
(Received 24 September 2001; accepted 22 November 2001)
Abstract: Process optimization often leads to nonconvex nonlinear programming problems, which may have multiple local optima. There are two major approaches to the identification of the global optimum: the deterministic approach and the stochastic approach. Algorithms based on the deterministic approach guarantee the global optimality of the obtained solution, but are usually applicable to small problems only. Algorithms based on the stochastic approach, which do not guarantee the global optimality, are applicable to large problems, but are inefficient when nonlinear equality constraints are involved. This paper reviews representative deterministic and stochastic global optimization algorithms in order to evaluate their applicability to process design problems, which are generally large and have many nonlinear equality constraints. Finally, modified stochastic methods are investigated, which use a deterministic local algorithm and a stochastic global algorithm together to be suitable for such problems.
Key words: Global Optimization, Deterministic, Stochastic Approach, Nonconvex, Nonlinear Program
S. H. Choi and V. Manousiouthakis
[Figure: illustration of feasible points and the feasible region]
March, 2002
L̄(y; v) = min_x v^T g(x, y) ≤ 0   for all v ∈ {v | v ≥ 0, Σ_i v_i = 1}

For subsets of u and v, the solution to this problem is a lower bound for the global minimum.

This technique is useful when x and y are separable, because the solution x and the Lagrange multipliers u obtained from the primal can directly be used in the first set of constraints in the master, in which the minimization with respect to x is unnecessary.

If the primal is infeasible at a given point y, an infeasibility minimization problem can be solved, such as the following:

min_x α
subject to
g(x, y) ≤ α1

where 1 = [1 ... 1]^T. The solution x and the Lagrange multipliers v obtained from this problem can be used to construct the second set of constraints in the master, in which the minimization with respect to x is again unnecessary.

The algorithm iterates between the primal and the master until the upper and lower bounds converge together. Like outer approximation, this technique also needs constraint dropping strategies, and is suitable for low rank nonconvex problems only.

3. Branch and Bound
This is the most widely used technique for global optimization of various problems. Let us consider the following type of problem:

min f(x)

The region of x is partitioned into subregions (branching), and a relaxed convex subproblem is solved over each; the solution to each subproblem is a lower bound in its region. The lowest one of these is the lower bound for the global minimum of the original problem. If a solution satisfies the original constraints as well, the value of the original objective function at that point is an upper bound. The lowest upper bound is stored as a candidate for the global minimum. Meanwhile, every subproblem is discarded if it is infeasible or its solution is higher than the upper bound (bounding). The algorithm stops when the lower bound converges to the upper bound.

The efficiency of the branch and bound algorithm mainly depends on the tightness of the convex envelopes. The most commonly used convex envelopes, which are also called underestimators, can be classified as follows.

3-1. Linear Underestimators
A reverse convex term in a separable function f(x) = Σ_j f_j(x_j) can be replaced by a linear underestimator. For example, if f_j(x_j) = -x_j^2, a_j ≤ x_j ≤ b_j, the tightest convex envelope is the following linear function:

φ_j(x_j) = -(a_j + b_j) x_j + a_j b_j

Note that this approach can be applied to separable problems only. However, any problem can be converted to a separable problem, because a nonseparable term x_1 x_2 can be replaced by a separable function w_1^2 - w_2^2, where w_1 and w_2 are defined by the following linear equality constraints:

w_1 = (x_1 + x_2)/2
w_2 = (x_1 - x_2)/2
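As an illustration of branch and bound with a linear underestimator, the following is a minimal one-dimensional sketch. The objective 0.25·x^4 - x^2 is a hypothetical d.c. test function, not one from the paper: its convex term 0.25·x^4 is kept as-is, and its reverse convex term -x^2 is replaced on each box [a, b] by the chord -(a+b)x + ab, i.e., the tightest linear envelope discussed above.

```python
def f(x):
    # hypothetical d.c. test objective: convex 0.25*x^4 plus reverse convex -x^2
    # global minima f = -1 at x = +/- sqrt(2)
    return 0.25 * x**4 - x**2

def underestimator(x, a, b):
    # replace -x^2 by its convex envelope (the chord) -(a+b)*x + a*b on [a, b]
    return 0.25 * x**4 - (a + b) * x + a * b

def argmin_convex(phi, a, b, tol=1e-10):
    # ternary search: valid because phi is convex on [a, b]
    while b - a > tol:
        m1, m2 = a + (b - a) / 3, b - (b - a) / 3
        if phi(m1) < phi(m2):
            b = m2
        else:
            a = m1
    return 0.5 * (a + b)

def branch_and_bound(a, b, tol=1e-6):
    best_val, best_x = min((f(a), a), (f(b), b))
    boxes = [(a, b)]
    while boxes:
        a0, b0 = boxes.pop()
        xr = argmin_convex(lambda x: underestimator(x, a0, b0), a0, b0)
        lower = underestimator(xr, a0, b0)   # lower bound for f on this box
        if f(xr) < best_val:                 # xr is feasible, so f(xr) is an upper bound
            best_val, best_x = f(xr), xr
        if lower < best_val - tol:           # otherwise the box is fathomed (bounding)
            mid = 0.5 * (a0 + b0)
            boxes.append((a0, mid))
            boxes.append((mid, b0))          # branching
    return best_val, best_x

val, x = branch_and_bound(-2.0, 2.0)
```

Because the gap between -x^2 and its chord shrinks quadratically with the box width, every box is eventually fathomed and the loop terminates with the incumbent within the tolerance of the global minimum.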
Therefore, the branch and bound procedure that uses interval analysis in some cases requires extremely many subproblems. However, each subproblem can be solved very efficiently because it only requires simple interval arithmetic.

5. Handling Equalities
Many deterministic algorithms are applicable to specific types of problems only. For example, the generalized Benders decomposition algorithm described in this paper allows inequality constraints only, and the underestimator branch and bound allows inequality constraints and linear equality constraints only. Generally, algorithms can be modified to accept linear equality constraints because they do not cause nonconvexity. However, problems should be modified, except for algorithms such as interval branch and bound, if they have nonlinear equality constraints.

The simplest method is to convert h(x) = 0 into h(x) ≤ 0 and h(x) ≥ 0. Note that h(x) ≤ 0 and Σ_i h_i(x) ≥ 0 are also equivalent to h(x) = 0. Therefore, m equality constraints can be converted to 2m or m+1 inequality constraints. The generalized Benders decomposition can now be applied to equality constrained problems. The linear underestimator branch and bound can also be applied to any problem, because h(x) can be converted to a separable d.c. function c(x) + r(x), where c(x) is convex and r(x) is reverse convex. In this case, 2m inequality constraints c(x) + r(x) ≤ 0 and -c(x) - r(x) ≤ 0 are preferred to m+1 inequalities, because summation of reverse convex terms results in a larger gap between the original function and its convex envelope.

The following procedure applies to general d.c. functions c(x) + r(x), where c(x) is convex and r(x) is reverse convex. Let us define new variables u = c(x) and v = r(x). Then, c(x) + r(x) = 0 are equivalent to linear equality constraints u + v = 0, convex inequality constraints c(x) - u ≤ 0 and -r(x) + v ≤ 0, and reverse convex inequality constraints -c(x) + u ≤ 0 and r(x) - v ≤ 0. Note that the last 2m reverse convex constraints can be summed to form a single reverse convex constraint Σ_i [-c_i(x) + r_i(x) + u_i - v_i] ≤ 0. Therefore, using 2m extra variables, m nonlinear equality constraints can be converted to m linear equality constraints, 2m convex inequality constraints, and one reverse convex inequality constraint. As a result, all we need, theoretically, is an algorithm that can solve a convex problem with a single reverse convex constraint, and all of the deterministic algorithms reviewed in this paper can do it.

STOCHASTIC APPROACH

Stochastic algorithms, when run sufficiently long, are virtually guaranteed to find the global optimum according to the following convergence theorem [Bäck et al., 1991].

For minimization of objective function f(x),
1) Let x^(t+1) = x^t + N(0, σ).
2) If f(x^(t+1)) < f(x^t), accept x^(t+1).
   Otherwise, x^(t+1) = x^t.
3) Repeat for next t.
Then, for σ > 0 and f^min > -∞, lim_(t→∞) P{f(x^t) = f^min} = 1.

This means that for a random search based on a normal distribution, the probability of global optimality of the obtained solution will eventually approach one. For stochastic algorithms to be efficient, however, balancing is required between exploiting the best solution (local search) and exploring the search space (global search) [Booker, 1987]. The above algorithm is biased towards local search, and two representative methods that can be balanced are summarized as follows.

1. Simulated Annealing
Let us consider a collection of atoms in equilibrium at a given temperature T. Displacement of an atom causes a change ΔE in the energy of the system. If ΔE < 0, the displacement is accepted. If ΔE > 0, the probability that the displacement is accepted is exp(-ΔE/kT), where k is the Boltzmann constant. This process can be simulated in optimization as follows.

For minimization of objective function f(x),
1) Take x^new randomly.
2) If Δf = f(x^new) - f(x^old) < 0, accept x^new.
   Otherwise,
   a) Take a random number w ∈ [0, 1].
   b) If w ≤ exp(-Δf/T), then accept x^new.
      Otherwise, x^new = x^old.
3) Control T, and repeat.

This algorithm is mostly applied to combinatorial optimization problems, but suitable for unconstrained function optimization also.

2. Genetic Algorithm
The theory of evolution can also be employed in optimization as follows.

For optimization of fitness function f(x),
1) Select a population of a given size {x_i}, where x_i is a chromosome (binary vector).
2) At a given crossover probability, crossover x_p and x_q to generate x_p' and x_q'.
3) At a given mutation rate, mutate x_i.
4) Repeat.

This algorithm is based on the assumption that the best solutions will be found in regions of the search space containing relatively high proportions of good solutions, and that these regions can be identified by judicious and robust sampling of the space [Booker, 1987]. It is being widely applied case by case to special data structures in which problem-specific knowledge is incorporated. Such modified genetic algorithms are referred to as evolution programs [Michalewicz, 1996].

The original genetic algorithm uses binary representation of chromosomes to be suitable for combinatorial optimization. However, this algorithm can also be applied to unconstrained function optimization. In this case, floating point representation is more efficient, in which the following operators can be used.

For randomly selected j ∈ {1, ..., n},
1) Simple crossover: x_p' = [x_p1, ..., x_pj, x_q(j+1), ..., x_qn] and x_q' = [x_q1, ..., x_qj, x_p(j+1), ..., x_pn]
2) Arithmetical crossover: x_p' = w x_p + (1-w) x_q and x_q' = (1-w) x_p + w x_q
3) Uniform mutation: x_j ∈ [a_j, b_j]
4) Boundary mutation: x_j = a_j or b_j
5) Non-uniform mutation: fine tune x_j.
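The floating-point crossover and mutation operators listed above can be sketched as plain Python functions. Here a chromosome is a list of floats; the shrinking-perturbation form of the non-uniform mutation is an assumed implementation, since the paper does not give its formula.

```python
import random

rng = random.Random(1)

def simple_crossover(xp, xq, j):
    # keep the first j genes, swap the tails (j as in "randomly selected j")
    return xp[:j] + xq[j:], xq[:j] + xp[j:]

def arithmetical_crossover(xp, xq, w):
    # convex combinations of the two parents with weight w in [0, 1]
    xp2 = [w * p + (1 - w) * q for p, q in zip(xp, xq)]
    xq2 = [(1 - w) * p + w * q for p, q in zip(xp, xq)]
    return xp2, xq2

def uniform_mutation(x, j, bounds):
    # resample gene j anywhere in its range [a_j, b_j]
    a, b = bounds[j]
    y = list(x)
    y[j] = rng.uniform(a, b)
    return y

def boundary_mutation(x, j, bounds):
    # push gene j to one of its bounds
    a, b = bounds[j]
    y = list(x)
    y[j] = a if rng.random() < 0.5 else b
    return y

def nonuniform_mutation(x, j, bounds, t, t_max):
    # "fine tune": perturbation range shrinks as generation t approaches t_max
    # (this shrinking form is an assumption, not the paper's formula)
    a, b = bounds[j]
    span = (b - a) * (1.0 - t / t_max)
    y = list(x)
    y[j] = min(b, max(a, x[j] + rng.uniform(-span, span)))
    return y
```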
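The simulated annealing loop described above translates almost directly into code. The following is a minimal sketch on a hypothetical two-basin test function 0.25·x^4 - x^2; the step size, initial temperature, and geometric cooling schedule are illustrative choices, not prescriptions from the paper.

```python
import math
import random

def f(x):
    # hypothetical two-basin test function; global minima f = -1 at x = +/- sqrt(2)
    return 0.25 * x**4 - x**2

def simulated_annealing(f, x0, step=0.5, T0=1.0, cooling=0.995, n_iter=20000, seed=0):
    rng = random.Random(seed)
    x, fx, T = x0, f(x0), T0
    best_x, best_f = x, fx
    for _ in range(n_iter):
        x_new = x + rng.uniform(-step, step)     # random displacement
        df = f(x_new) - fx
        # accept improvements; accept uphill moves with probability exp(-df/T)
        if df < 0 or rng.random() <= math.exp(-df / T):
            x, fx = x_new, fx + df
            if fx < best_f:
                best_x, best_f = x, fx
        T *= cooling                             # "control T": geometric cooling
    return best_x, best_f

best_x, best_f = simulated_annealing(f, 0.0)
```

Early on, the high temperature lets the search cross barriers between basins; as T decays, the loop degenerates into a purely downhill random search that fine-tunes the incumbent.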
3. Handling Equalities
Stochastic algorithms do not suffer from the NP-hardness of the problem, and thus they are considered to be suitable for large problems. However, these algorithms still have difficulties when applied to chemical process design. The stochastic algorithms are based on the random search technique. Since the evaluation of the objective function is meaningful only at feasible points, they are suitable for unconstrained or inherently inequality constrained optimization problems only. However, most chemical process design problems have many equality constraints. Therefore, the problem or the algorithm should be modified.

3-1. Problem Modification
3-1-1. Penalty Function Method
Constrained optimization problems can be converted to unconstrained problems as follows:

min_x F(x, r) = f(x) + 1/(2r) c(x)^T c(x)

where r is a penalty parameter (>0), and c(x) is a vector of all active constraint functions. As r → 0, x converges to a local minimum x*, but the Hessian matrix H[F(x, r)] becomes ill-conditioned. Furthermore, when there are too many equality constraints, as in chemical process design problems, it is difficult to keep the reformulated problem numerically stable.

3-1-2. Feasible Point Strategy
In order to avoid dealing with equality constraints in a stochastic algorithm, a feasible point strategy can be adopted, in which feasible points can be found by solving an infeasibility minimization problem such as the following:

min c(x)^T c(x)

where c(x) is a vector of all violated constraint functions. Another form of infeasibility minimization problem is as follows:

min max_i {g_i(x), h_i(x), -h_i(x)}

which can be written as the following constrained problem, with a constraint requiring improvement of the objective added:

min α
subject to
g(x) ≤ α1
h(x) ≤ α1
-h(x) ≤ α1
f(x) ≤ f* - ε + α

where 1 = [1 ... 1]^T, and ε is an optimality tolerance (>0). Note that α < ε means that x is a feasible point at which the value of the objective function is lower than the previously found local minimum f*. Implementation of this strategy is schematically described in Fig. 5.

3-2. Algorithm Modification
3-2-1. Decoding Strategy
Let us convert all inequality constraints g(x) ≤ 0 in problem (P) into equality constraints g(x) + s = 0, where s is a vector of nonnegative slack variables. Then we have the following type of problem:

min_{x,y} f(x, y)    (E)
subject to
h(x, y) = 0
a ≤ x ≤ b
c ≤ y ≤ d

where x represents n-m design (independent) variables, y represents m state (dependent) variables, and h: R^n → R^m (n > m). Assuming that x can be decoded to y by an equation solver, this problem can be viewed as follows:

min_x f(x, y(x))
subject to
a ≤ x ≤ b

This type of problem is suitable for stochastic algorithms.
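As a toy illustration of the decoding strategy, the sketch below uses one design variable x and one state variable y with the hypothetical model equation h(x, y) = y^3 + y - x = 0 (monotone in y, so the decoded y is unique). The state is decoded by Newton iteration, and a plain random search then works in x only, i.e., on the reduced problem min f(x, y(x)) above; the objective and bounds are illustrative.

```python
import random

def h(x, y):
    # hypothetical model equation h(x, y) = 0; monotone in y, so y(x) is unique
    return y**3 + y - x

def decode(x, y0=0.0, n=50):
    # Newton iteration solves h(x, y) = 0 for the state variable y
    y = y0
    for _ in range(n):
        y -= h(x, y) / (3 * y**2 + 1)
    return y

def objective(x):
    # f(x, y(x)): evaluate the objective only at decoded (feasible) points
    y = decode(x)
    return (x - 2.0)**2 + y**2

def random_search(a, b, n=3000, seed=0):
    # stochastic search over the design variable only
    rng = random.Random(seed)
    best_x, best_val = a, objective(a)
    for _ in range(n):
        x = rng.uniform(a, b)
        v = objective(x)
        if v < best_val:
            best_val, best_x = v, x
    return best_x, best_val

x_star, v_star = random_search(0.0, 4.0)
```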
Fig. 5. A stochastic method based on feasible point strategy.
Fig. 6. A stochastic method based on decoding strategy.
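A minimal sketch of the penalty function method of section 3-1-1 follows, on a hypothetical toy problem min x1^2 + x2^2 subject to x1 + x2 - 1 = 0 (solution (0.5, 0.5)). The inner unconstrained minimization here is a crude fixed-step gradient descent with finite-difference gradients; the step is shrunk together with r because the penalty term stiffens as r → 0, which is exactly the ill-conditioning the text warns about.

```python
def _grad(F, x, h=1e-6):
    # central finite differences, to keep the sketch generic
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((F(xp) - F(xm)) / (2.0 * h))
    return g

def penalty_method(f, c, x, r=1.0, shrink=0.1, outer=4, inner=1000):
    # minimize F(x, r) = f(x) + c(x)^T c(x) / (2 r), driving r -> 0
    for _ in range(outer):
        def F(z):
            viol = c(z)
            return f(z) + sum(v * v for v in viol) / (2.0 * r)
        lr = 0.4 * r          # smaller steps as the penalty stiffens
        for _ in range(inner):
            g = _grad(F, x)
            x = [xi - lr * gi for xi, gi in zip(x, g)]
        r *= shrink
    return x

# hypothetical example: min x1^2 + x2^2  s.t.  x1 + x2 - 1 = 0
x = penalty_method(lambda z: z[0]**2 + z[1]**2,
                   lambda z: [z[0] + z[1] - 1.0],
                   [0.0, 0.0])
```

Each outer pass warm-starts the next, so the iterate tracks the minimizer of F(·, r) as r decreases; with r reaching 1e-3 the result sits within about 1e-3 of the constrained solution.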
Korean J. Chem. Eng. (Vol. 19, No. 2)
Furthermore, chemical process design problems generally have small degrees of freedom, and thus the above problem is expected to be small in most cases. However, stochastic algorithms are inefficient for fine local tuning, and thus a deterministic local algorithm is to be incorporated.

The decoding strategy can easily be implemented when h is linear with respect to y, and is being widely used in algorithms devoted to specific problems. For general problems, a robust equation solver is required, and a Newton type algorithm can be used if a good initial guess generator is available. A rough solution to an infeasibility minimization problem can serve as a good initial guess, and a stochastic global optimizer can generate it with a large optimality tolerance. Note that decoding is valid even if there are multiple solutions for y at a given x, because the result is stochastic. Implementation of this strategy is schematically described in Fig. 6.

DISCUSSION AND CONCLUSION

Outer approximation and generalized Benders decomposition are suitable for low rank nonconvex problems only. Generally, however, chemical process optimization problems are high rank nonconvex problems. Therefore, branch and bound is the most efficient deterministic method currently available, especially when linear underestimators and interval analysis are incorporated to tighten the subproblem boxes. However, the guarantee of global optimality is still computationally too expensive.

Stochastic algorithms inevitably take forever to obtain a solution of which the global optimality is guaranteed. Therefore, we have to adopt and use the currently best solution at some stage of the procedure, and, if necessary, keep the procedure running for a long time for a possibility of existence of a better solution. As mentioned before, the deterministic algorithms guarantee the global optimality of the obtained solution, but they don't give any useful information but the lower bound on the global minimum until the procedure converges and stops. Stochastic algorithms do not guarantee the global optimality of the obtained solution, but continually improve tentative solutions, and thus can give us useful results in a reasonable time span.

Studies on global optimization indicate that most chemical process design problems are still too tough targets. Deterministic algorithms take too much computation time even for moderately sized problems. Stochastic algorithms have difficulties in dealing with equality constraints. Therefore, stochastic methods based on the feasible point strategy or the decoding strategy are considered useful. Further research on feasible point finding and equation solving is suggested as future work.

REFERENCES

Adjiman, C. S., Androulakis, I. P., Maranas, C. D. and Floudas, C. A., "A Global Optimization Method, αBB, for Process Design," Computers & Chem. Eng., 20, Suppl., S419 (1996).
Bäck, T., Hoffmeister, F. and Schwefel, H.-P., "A Survey of Evolution Strategies," Proceedings of the Fourth International Conference on Genetic Algorithms, R. K. Belew and L. B. Booker, eds., Morgan Kaufmann, San Mateo, CA, 2 (1991).
Bagajewicz, M. and Manousiouthakis, V., "On the Generalized Benders Decomposition," Computers & Chem. Eng., 15, 691 (1991).
Booker, L. B., "Improving Search in Genetic Algorithms," Genetic Algorithms and Simulated Annealing, L. Davis, ed., Pitman, London, 61 (1987).
Choi, S. H., Ko, J. W. and Manousiouthakis, V., "A Stochastic Approach to Global Optimization of Chemical Processes," Computers & Chem. Eng., 23, 1351 (1999).
Floudas, C. A. and Visweswaran, V., "A Global Optimization Algorithm (GOP) for Certain Classes of Nonconvex NLPs-I. Theory," Computers & Chem. Eng., 14, 1397 (1990).
Geoffrion, A. M., "Generalized Benders Decomposition," J. Opt. Theory Applic., 10, 237 (1972).
Goldberg, D. E., "Genetic Algorithms in Search, Optimization, and Machine Learning," Addison-Wesley, Reading, MA (1989).
Han, J. R., Manousiouthakis, V. and Choi, S. H., "Global Optimization of Chemical Processes Using the Interval Analysis," Korean J. Chem. Eng., 14, 270 (1997).
Horst, R. and Tuy, H., "Global Optimization: Deterministic Approaches," 2nd ed., Springer-Verlag, Berlin, Germany (1993).
Kirkpatrick, S., Gelatt, C. D., Jr. and Vecchi, M. P., "Optimization by Simulated Annealing," Science, 220, 671 (1983).
Konno, H., Thach, P. T. and Tuy, H., "Optimization on Low Rank Nonconvex Structures," Kluwer Academic Publishers, Dordrecht, The Netherlands (1997).
Michalewicz, Z., "Genetic Algorithms + Data Structures = Evolution Programs," 3rd ed., Springer-Verlag, New York (1996).
Ratschek, H. and Rokne, J., "New Computer Methods for Global Optimization," Ellis Horwood, Chichester, England (1988).
Ryoo, H. S. and Sahinidis, N. V., "Global Optimization of Nonconvex NLPs and MINLPs with Applications in Process Design," Computers & Chem. Eng., 19, 551 (1995).
Soland, R. M., "An Algorithm for Separable Nonconvex Programming Problems II: Nonconvex Constraints," Management Science, 17, 759 (1971).
Vaidyanathan, R. and El-Halwagi, M., "Global Optimization of Nonconvex Nonlinear Programs via Interval Analysis," Computers & Chem. Eng., 18, 889 (1994).