[Figure: plot of f(x) versus x]
• Optimization Problems
. Variables
. Objective functions
. Constraints
. Discrete vs Continuous
. Available information
. Optimality
. Problem size
• Local Methods
. Steepest Descent
. Newton
. Quasi-Newton
. Conjugate Gradient
. Simplex
• Exact methods
. Enumeration
. Branch and Bound
. Interval Methods
• Monte-Carlo Methods
. Random points
. Random starting points
. Quasi-Monte Carlo methods
. Sparse Grids
• Simulated Annealing
. Accept larger function values with a certain probability
. Annealing schedule
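The last topic above can be made concrete in a few lines. The uniform neighbourhood move, the geometric cooling schedule, and the multimodal test function below are illustrative assumptions for this sketch, not a prescribed implementation:

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.95, iters=1000):
    """Minimize f by simulated annealing: always accept downhill moves,
    accept larger function values with probability exp(-delta/T), and
    lower the temperature T by a geometric annealing schedule."""
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        y = x + random.uniform(-step, step)      # random neighbouring point
        fy = f(y)
        if fy <= fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy                        # move accepted
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling                             # annealing schedule
    return best, fbest

# Illustrative multimodal test function (an assumption for this sketch)
random.seed(1)
xmin, fmin = simulated_annealing(lambda x: x**2 + math.sin(5 * x), 2.0)
```

Early on, when T is large, uphill moves are accepted often and the iterate explores; as T shrinks the method behaves like a greedy local search.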
• Choosing a course
. Variables: which courses to take (discrete rather than continuous)
. Objective: compulsory, interesting (many, hard to quantify)
. Constraints: one place at a time, pre-requisites
• How much should you invest in the bank, shares, property, ...
. Variables: fraction of money in each asset (many variables)
. Objective: maximize return, minimize risk (several competing objectives)
. Constraints: money available, non-negative amounts, fractions in [0, 1]
. Integrality constraints: xi ∈ Z
xi ∈ { 0, 1 } (zero-one variables)
xi ∈ { 0, 1, 2, . . . } (nonnegative integer variables)
. A "≥" constraint can always be rewritten as "≤": c̄i(x) ≥ 0 ⇐⇒ −c̄i(x) ≤ 0
• Atoms/electrons/molecules
. Particles on a surface M: xj ∈ M
. Bonds between particles
. Geometry
• Objective is a function of x
. Final state x(T)
. ∫₀ᵀ |x″(t)| dt
• Stationary points classified by the eigenvalues of the Hessian:
. Local minimum: eigenvalues 1.3820, 3.6180
. Local maximum: eigenvalues −3.6180, −1.3820
. Saddle point: eigenvalues −1.5414, 4.5414
[Figure: three surface plots over (x1, x2): a local minimum, a local maximum, and a saddle point]
• Number of variables n, x ∈ Rⁿ
• Limitations
. Compute time
. Memory
• Example: What is the largest Hessian (an n by n symmetric matrix) that can
be stored in IEEE double precision on 32-bit Windows (maximum 2 GB
addressable block)?
. Ans: n ≈ 10⁵
• Steepest descent
. d(k) = −∇f (x(k))
. Global convergence: x(k) → x∗, x∗ stationary point (∇f (x∗) = 0) from
any starting point
. Arbitrarily slow linear rate of convergence |x(k+1) − x∗| ≈ β|x(k) − x∗|,
0 < β < 1.
. Requires gradient ∇f (x), and O(n) storage and work per iteration
• Quasi-Newton methods
. Solve B(k)d(k) = −∇f (x(k)) for the search direction d(k)
. Update B(k+1) ≈ ∇²f (x(k))
. Superlinear rate |x(k+1) − x∗| ≈ |x(k) − x∗|^τ, 1 < τ < 2, under suitable conditions
. Requires ∇f (x), O(n²) storage and O(n²) work per iteration
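Both local methods above can be sketched briefly. The Armijo backtracking line search and the two test problems are illustrative assumptions (a production quasi-Newton code would typically use a Wolfe line search instead of plain backtracking):

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=10000):
    """d(k) = -grad f(x(k)) with backtracking; linear rate of convergence."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:              # stationary point reached
            break
        d = -g
        alpha = 1.0
        while f(x + alpha * d) > f(x) - 1e-4 * alpha * (g @ g):
            alpha *= 0.5                         # backtrack until decrease
        x = x + alpha * d
    return x

def quasi_newton(f, grad, x0, tol=1e-8, max_iter=500):
    """BFGS: maintain an inverse-Hessian approximation H, step d = -H g."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(x.size)                           # initial approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                               # quasi-Newton direction
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        s = alpha * d
        g_new = grad(x + s)
        y = g_new - g
        if s @ y > 1e-12:                        # curvature condition holds
            rho = 1.0 / (s @ y)
            I = np.eye(x.size)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x + s, g_new
    return x

# Illustrative test problems (assumptions): a poorly scaled quadratic for
# steepest descent, and the Rosenbrock function for the quasi-Newton method.
quad = lambda x: 0.5 * (x[0]**2 + 10.0 * x[1]**2)
quad_grad = lambda x: np.array([x[0], 10.0 * x[1]])
x_sd = steepest_descent(quad, quad_grad, [1.0, 1.0])

rosen = lambda x: (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array(
    [-2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
     200.0 * (x[1] - x[0]**2)])
x_qn = quasi_newton(rosen, rosen_grad, [-1.2, 1.0])
```

The poorly scaled quadratic exposes the slow linear rate of steepest descent, while BFGS reaches the Rosenbrock minimizer at (1, 1) in far fewer iterations, reflecting the superlinear rate claimed above.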
• There can be many local minima which are not global minima
• In the context of combinatorial problems, global optimization is NP-hard
• Special properties (e.g. convexity) of the feasible region Ω and the objective
function f imply that any local solution is also a global solution.
• References: Pinter [20]
[Figure: plot of f(x) versus x]
[Figure: contour plot over (x1, x2)]
A salesman must visit every one of n cities exactly once and return
to their starting city. If the cost of going from city i to city j is
Cij , find the route that minimizes the total cost.
• Variables
. Xij = 1 if the tour goes from city i to city j; 0 otherwise
. xi = ith city visited; Permutation of { 1, 2, . . . , n }
• Objective
. f (X) = Σ(i=1..n) Σ(j=1..n) Cij Xij
. f (x) = Σ(i=1..n) C(xi, xi+1), with xn+1 = x1
• Combinatorial optimization problem
. (n − 1)! possible tours
. Enumerating all tours, comparing costs =⇒ n! operations
. Impossible except for small numbers of cities
. NP-hard
. References [14]
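For small n, the enumeration mentioned above can be written directly; the 4-city cost matrix is an illustrative assumption:

```python
from itertools import permutations

def tsp_brute_force(C):
    """Fix city 0 as the start and enumerate the (n-1)! remaining tours,
    keeping the cheapest; the cost sums C[i][j] along the closed tour."""
    n = len(C)
    best_tour, best_cost = None, float("inf")
    for perm in permutations(range(1, n)):
        tour = (0,) + perm
        cost = sum(C[tour[i]][tour[(i + 1) % n]] for i in range(n))
        if cost < best_cost:
            best_tour, best_cost = tour, cost
    return best_tour, best_cost

# Illustrative symmetric cost matrix for n = 4 cities (an assumption)
C = [[0, 1, 4, 3],
     [1, 0, 2, 5],
     [4, 2, 0, 1],
     [3, 5, 1, 0]]
tour, cost = tsp_brute_force(C)
```

Fixing the starting city removes the rotational symmetry, so only (n − 1)! tours are checked, matching the count above; the factorial growth still makes this impossible beyond a small number of cities.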
[Figure: four plots of f(r) versus distance]
• Enumeration
. Only possible for combinatorial problems
. The exponential explosion means only very small problems are feasible
• Interval methods
• References
. Hansen [7], [9, 10]
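As an interval-method sketch, the branch-and-bound loop below uses a minimal interval arithmetic to bound f over boxes, discarding any box whose lower bound exceeds the best value found at a midpoint. The polynomial test function and the tolerance are illustrative assumptions, not taken from the references:

```python
class Interval:
    """Minimal interval arithmetic: enough for a natural interval
    extension of polynomial expressions."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, o):
        return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        ps = (self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi)
        return Interval(min(ps), max(ps))

def interval_minimize(f, lo, hi, tol=1e-6):
    """Interval branch and bound in 1-D: keep only boxes whose lower bound
    (from the interval extension of f) could still beat the best midpoint
    value found so far; split surviving boxes until they are narrower
    than tol."""
    boxes = [(lo, hi)]
    best = f(Interval((lo + hi) / 2, (lo + hi) / 2)).lo   # upper bound on min
    while boxes:
        a, b = boxes.pop()
        if f(Interval(a, b)).lo > best:    # box cannot contain the minimum
            continue
        m = (a + b) / 2
        best = min(best, f(Interval(m, m)).lo)
        if b - a > tol:
            boxes += [(a, m), (m, b)]      # bisect and keep searching
    return best

# Illustrative test: f(x) = x^2 (x^2 - 2) on [-2, 2], global minimum -1 at x = ±1
f = lambda X: X * X * (X * X - Interval(2.0, 2.0))
fmin = interval_minimize(f, -2.0, 2.0)
```

Unlike the Monte-Carlo methods, the discarded boxes constitute a proof: the global minimum cannot lie in any region whose interval lower bound exceeds the best known function value.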
[Figure: points in the unit square (x2 versus x1)]
• Issues
. Convergence/number of iterations
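The simplest Monte-Carlo method from the outline (evaluate f at random points and keep the best) takes only a few lines; the test function, bounds, and sample count here are illustrative assumptions:

```python
import random

def random_search(f, bounds, n_samples=10000, seed=0):
    """Pure random search: evaluate f at uniformly sampled points in the
    given box and keep the best point found."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_samples):
        p = [rng.uniform(lo, hi) for lo, hi in bounds]
        fp = f(p)
        if fp < best_f:
            best_x, best_f = p, fp
    return best_x, best_f

# Illustrative 2-D quadratic on [-1, 2] x [-1, 2] (an assumption)
x, fx = random_search(lambda p: (p[0] - 1.0)**2 + (p[1] - 1.0)**2,
                      [(-1.0, 2.0), (-1.0, 2.0)])
```

This makes the convergence issue above concrete: the number of samples needed to cover the feasible region grows rapidly with the dimension, which motivates the quasi-Monte Carlo points and sparse grids listed earlier.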
[Figure: iterates in the (x1, x2) plane; convergence over 200 iterations]
From: http://wissrech.iam.uni-bonn.de/research/projects/zumbusch/fd.html
References
[1] AD, Automatic differentiation, tech. report, University of Aachen and Argonne National Laboratory, 2005. http://www.autodiff.org/.