Optimization Theory - A Modern Face of Applied Mathematics
given set of finite or infinite alternatives. A new impetus to the development of optimization began with the dramatic development of linear programming in the late 1940's. Convexity also brought in a completely new feature to optimization.
Observe that this function is not differentiable at x = 0, i.e. it does not have a derivative there. Thus the notion of the subgradient was developed in order to imitate the notion of the derivative at points of non-differentiability. For a real-valued convex function f defined over an open convex set, at any point x0 there exists a vector v, called a subgradient of f at x0, which satisfies the following inequality:

f(x) ≥ f(x0) + ⟨v, x − x0⟩ for all x.

Let us note that the subgradient need not be unique (loss of differentiability at a point corresponds to the fact that more than one tangent plane can pass through the same point). The collection of all subgradients at a given point is called the subdifferential of f at x0 and is denoted as ∂f(x0). Further, a point x is a global minimum if and only if 0 ∈ ∂f(x). This can indeed be thought of as the generalization of Fermat's rule in the differentiable case. In fact, consider the following simple convex problem:

min f(x) subject to g(x) ≤ 0,

where f and g are real-valued convex functions on Rⁿ. Let x0 be a minimum of the above problem, and also assume that there exists a point x̄ such that g(x̄) < 0 (the Slater condition). Then one can show that there exists λ ≥ 0 such that

0 ∈ ∂f(x0) + λ∂g(x0)
and
λg(x0) = 0.

The above conditions can be thought of as KKT type conditions for a convex problem in a nonsmooth, or general, setting. Thus, with a powerful tool like the subdifferential, convexity became a very amenable object even in the nonsmooth setting. In fact, Tucker coined the term Convex Analysis in order to bring out the importance of convexity, and the publication of the classic monograph titled Convex Analysis (Princeton, 1970) by R. T. Rockafellar established convex analysis as an important branch of modern mathematics. It is important to note that linear programming problems are a subclass of convex programming problems. As Rockafellar has recently pointed out, the great watershed in optimization is not really between linearity and nonlinearity but between convexity and nonconvexity. Since then convexity continues to play a central role in optimization theory. In fact, so deep rooted is the notion of convexity that nonconvex problems are usually tackled by devising suitable convex approximations.

3. Smoothness of Functions
Though convexity has such power, it was observed that many application problems are devoid of convexity yet are non-differentiable. This led Francis Clarke to study locally Lipschitz nonsmooth functions in detail in the early 1970's. He developed the notion of the generalized gradient, or the Clarke subdifferential. Clarke used his subdifferential to develop KKT type conditions for mathematical programming problems as well as in developing necessary conditions for optimal control problems. This new approach to tackling nonsmoothness in optimization gave rise to the subjects of Nonsmooth Analysis and Nonsmooth Optimization. However, Clarke's approach depended heavily on the tools of convex analysis. A complete move away from convexity was championed by Mordukhovich through the development of the now famous sequential nonsmooth analysis. Mordukhovich succeeded in developing a robust subdifferential for lower-semicontinuous functions which is nonconvex in general and is contained in the Clarke subdifferential. Motivated by optimal control, Mordukhovich introduced the notion of the Extremal Principle, which turned out to have far reaching consequences and which has applications to areas that go beyond traditional optimization.

4. Numerical Methods for Optimization
Meanwhile, on the other side, during the mid 1960's and 1970's there was tremendous growth in the development of algorithms for solving nonlinear programming problems. It first began with efforts to develop efficient and implementable algorithms to solve the unconstrained problem

min f(x), x ∈ Rⁿ.

The most efficient methods developed for solving this problem used the principle of line search, i.e. one moves from the current iterate x to the next one, denoted as x⁺, in the following manner:

x⁺ = x + td,

where d is called the descent direction, satisfying ⟨∇f(x), d⟩ < 0, and t > 0 is called the steplength.
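The subgradient inequality and the nonsmooth Fermat rule can be checked numerically. The sketch below uses f(x) = |x| as a stand-in example (an assumption; the article's own example function is not reproduced in this copy), for which the subdifferential at the kink x = 0 is the whole interval [−1, 1]:

```python
def subdifferential_abs(x0, tol=1e-12):
    """Subdifferential of f(x) = |x|, returned as an interval (lo, hi):
    {+1} for x0 > 0, {-1} for x0 < 0, and [-1, 1] at the kink x0 = 0."""
    if x0 > tol:
        return (1.0, 1.0)
    if x0 < -tol:
        return (-1.0, -1.0)
    return (-1.0, 1.0)          # many subgradients at the nonsmooth point

def subgradient_inequality_holds(f, x0, v, xs):
    """Check f(x) >= f(x0) + v*(x - x0) at the sample points xs."""
    return all(f(x) >= f(x0) + v * (x - x0) - 1e-12 for x in xs)

f = abs
xs = [x / 10.0 for x in range(-50, 51)]

# Every v in [-1, 1] is a subgradient of |x| at 0 ...
print(subgradient_inequality_holds(f, 0.0, 0.5, xs))   # True
# ... while v = 2 violates the inequality (e.g. at x = 1).
print(subgradient_inequality_holds(f, 0.0, 2.0, xs))   # False

# Fermat's rule: 0 lies in the subdifferential [-1, 1] at x = 0,
# so x = 0 is the global minimum of |x|.
lo, hi = subdifferential_abs(0.0)
print(lo <= 0.0 <= hi)                                  # True
```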
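The line-search scheme x⁺ = x + td can be sketched in a few lines. This is only an illustration, assuming the simplest choices: steepest descent d = −∇f(x) (so that ⟨∇f(x), d⟩ < 0 automatically) and a backtracking rule for the steplength t; the Newton and Quasi-Newton methods discussed in the text differ only in how d is chosen.

```python
def grad_descent_backtracking(f, grad, x, iters=100, beta=0.5, c=1e-4):
    """Line search x+ = x + t*d with d = -grad f(x), a descent direction
    since <grad f(x), d> = -||grad f(x)||^2 < 0. The steplength t is
    shrunk by backtracking until f decreases sufficiently (Armijo rule)."""
    for _ in range(iters):
        g = grad(x)
        d = [-gi for gi in g]                          # descent direction
        fx = f(x)
        slope = sum(gi * di for gi, di in zip(g, d))   # <grad f, d> < 0
        t = 1.0
        while f([xi + t * di for xi, di in zip(x, d)]) > fx + c * t * slope:
            t *= beta                                  # shrink steplength
        x = [xi + t * di for xi, di in zip(x, d)]
    return x

# Example problem: f(x, y) = (x - 1)^2 + 2*(y + 3)^2, minimum at (1, -3).
f = lambda v: (v[0] - 1) ** 2 + 2 * (v[1] + 3) ** 2
grad = lambda v: [2 * (v[0] - 1), 4 * (v[1] + 3)]

xmin = grad_descent_backtracking(f, grad, [0.0, 0.0])
print([round(coord, 4) for coord in xmin])   # close to [1.0, -3.0]
```

Each accepted step satisfies f(x⁺) < f(x), which is exactly the improvement property the descent condition guarantees.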
This leads to the fact that f(x⁺) < f(x) for sufficiently small t, and thus by moving along d one is able to improve the objective function. Various methods have been developed using this line search philosophy. Further, finding a minimum amounts to finding a root of the equation ∇f(x) = 0. Thus various approaches like the Newton method or Quasi-Newton methods used in equation solving can be applied to optimization. The Newton method has the following scheme of iteration:

x⁺ = x − [∇²f(x)]⁻¹ ∇f(x).

Major contributions in this area have been made by Davidon, Powell, Fletcher, Broyden, Goldfarb, and Shanno. In solving constrained problems, the penalty method, the barrier method and the sequential quadratic programming scheme are important classical approaches.

It is important to note that the simplex method for linear programming, which has been effective in solving many practical problems, is not a polynomial time algorithm. Thus there have been efforts to develop a polynomial time algorithm for linear programming which would be, if not better, at least as efficient as the simplex method in practice. The major breakthrough came in 1984 when Karmarkar announced his now famous algorithm. In Karmarkar's method, in contrast to the simplex method, one moves through the interior of the feasible region towards a vertex. Karmarkar's work gave rise to the now important interior point methods of convex optimization. The interior point methods remain one of the most fertile fields of modern optimization research. The growth of interior point methods has also led to the growth of the modern areas of semidefinite programming and conic programming. Apart from these, the important areas associated with modern optimization are semi-infinite programming, vector optimization, global optimization, bi-level programming, mathematical programming under equilibrium constraints, and Abstract Convexity, just to name a few.

5. My Interests in IITK
My research initially centered around nonsmooth optimization: trying to understand and design subdifferentials for nonconvex functions and applying them to develop necessary optimality conditions. Relations of subdifferentials to generalized convex functions have also been one of the areas I have been interested in. Later on, during my post-doctoral days, I worked on the notion of a convexity factor, which is a very general object and possesses many generic properties of most of the well known subdifferentials. I also got interested in an emerging area of optimization called Abstract Convexity during my postdoctoral days, and it remains a major interest. Abstract Convexity has grown out of generalizing the well known global property of a convex function, namely, that every convex function is the upper envelope of the family of affine functions majorized by it (i.e. pointwise smaller than itself). The sub-area of Abstract Convexity where I have worked most is called Monotonic Analysis, wherein one studies functions which are increasing and have some additional properties like positive homogeneity or convexity along rays. This area has important applications to global optimization. In this area I collaborate with Professor A. Rubinov (Ballarat, Australia) and Professor J. E. Martinez-Legaz (Barcelona, Spain). Apart from this I am interested in studying various theoretical aspects of vector optimization. Recently my interest has also shifted to bi-level programming and mathematical programming with equilibrium constraints. I plan to study optimality conditions in this particular area, since the most well known constraint qualifications fail. Further, I plan to have a major project in the area of interior point method theory: I intend to study how to increase the speed of interior point algorithms for conic programs by using suitable self-concordant barriers. In fact, there have been a few encouraging recent results in this direction from some leading researchers. I also plan to develop software with a user friendly front end for the interior point methods of convex programming, and to build a library of such programs. Apart from the computational aspects, there are also interesting theoretical aspects that can be studied in interior point theory, for example the detailed study of self-scaled cones and the development of faster path-following algorithms.
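The barrier idea underlying interior point methods can be illustrated on a toy one-dimensional problem. The sketch below is only an illustration of the principle (a logarithmic barrier with Newton steps on the barrier function, under my own choice of parameters), not any of the conic or self-concordant algorithms discussed above:

```python
def barrier_method_1d(df, ddf, x, mu=1.0, shrink=0.1, outer=8, newton_steps=20):
    """Minimize a smooth convex f(x) subject to x > 0 via the log barrier
    F(x) = f(x) - mu*log(x), driving mu -> 0. Each outer iteration runs a
    few damped Newton steps on F, keeping the iterate strictly interior."""
    for _ in range(outer):
        for _ in range(newton_steps):
            g = df(x) - mu / x             # F'(x)
            h = ddf(x) + mu / x ** 2       # F''(x) > 0, so Newton is safe
            step = g / h
            while x - step <= 0:           # damp to stay inside x > 0
                step *= 0.5
            x -= step
        mu *= shrink                       # tighten the barrier
    return x

# Example: minimize f(x) = (x + 2)^2 subject to x >= 0. The unconstrained
# minimizer -2 is infeasible; the constrained minimum is at x = 0, and the
# interior-point iterates approach it from inside the feasible region.
df  = lambda x: 2 * (x + 2)    # f'(x)
ddf = lambda x: 2.0            # f''(x)

x_star = barrier_method_1d(df, ddf, x=1.0)
print(round(x_star, 3))   # close to 0, approached through the interior
```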
References
1. C. R. Bector, S. Chandra and J. Dutta, Principles of Optimization
Theory, Narosa Publishers, New Delhi, 2004.
2. D. P. Bertsekas, Convex Analysis and Optimization, Athena Scientific
Publication, 2003.
3. D. P. Bertsekas, Nonlinear Programming, Athena Scientific
Publication, 1999.
4. J. Borwein and A. S. Lewis, Convex Analysis and Nonlinear
Optimization, Springer, 2000.
5. F. H. Clarke, Optimization and Nonsmooth Analysis, Wiley
Interscience, 1983.
6. B. D. Craven, Control and Optimization, Chapman and Hall, 1995.
7. S. Dempe, Foundations of Bi-level Programming, Kluwer Academic
Publishers, 2002.
8. J. Jahn, Vector Optimization, Springer, 2004.
9. J. Renegar, A Mathematical View of Interior-Point Methods in Convex
Optimization, SIAM, 2001.
10. R. T. Rockafellar, Convex Analysis, Princeton University Press, 1970.
11. R. T. Rockafellar and R. J. B. Wets, Variational Analysis, Springer,
1998.
12. A. Rubinov, Abstract Convexity and Global Optimization, Kluwer
Academic Publishers, 2000.
About the author: Joydeep Dutta received his Ph.D. in Mathematics in 1998
from I.I.T. Kharagpur with specialization in Nonsmooth Optimization.
Subsequently he did his post-doctoral studies at the Indian Statistical Institute,
New Delhi, and the Autonomous University of Barcelona, Spain. His main
research interests are in Abstract Convexity, Nonsmooth Optimization and
Vector Optimization. He is currently an Assistant Professor in the Department
of Mathematics at I.I.T. Kanpur.