Metaheuristic Methods and Their Applications
Lin-Yu Tseng (曾怜玉)
Institute of Networking and Multimedia
Department of Computer Science and Engineering
National Chung Hsing University
OUTLINE
I. Optimization Problems
II. Strategies for Solving NP-hard Optimization Problems
III. What is a Metaheuristic Method?
IV. Trajectory Methods
V. Population-Based Methods
VI. Multiple Trajectory Search
VII. The Applications of Metaheuristic Methods
VIII. Conclusions
I. Optimization Problems
Computer Science
Traveling Salesman Problem
Maximum Clique Problem
Operational Research
Flow Shop Scheduling Problem
P – Median Problem
Many optimization problems are NP-hard.
Optimization problems
• Calculus-based method
• Hill climbing
How about this?
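Hill climbing, mentioned above, can be sketched in a few lines. This is a minimal illustration (function names and parameters are mine, not from the slides): keep a current solution and move to a random neighbor only when it improves the objective.

```python
import random

def hill_climb(f, x, step=0.1, iterations=1000):
    """Simple hill climbing: repeatedly move to a random neighbor
    if it improves the objective (maximization)."""
    best = f(x)
    for _ in range(iterations):
        neighbor = x + random.uniform(-step, step)
        value = f(neighbor)
        if value > best:          # accept only improving moves
            x, best = neighbor, value
    return x, best

# Maximize f(x) = -(x - 3)^2 + 5; the unique optimum is at x = 3.
x, fx = hill_climb(lambda x: -(x - 3) ** 2 + 5, x=0.0)
```

Because it accepts only improving moves, hill climbing gets trapped in local optima on multimodal landscapes, which is exactly the weakness the metaheuristics below address.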
An example: TSP (Traveling Salesman Problem)
[Figure: a 5-city instance (x1–x5) with edge weights]
A solution is a sequence of cities:
• 12345: tour length = 31
• 13452: tour length = 63
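Evaluating a TSP solution means summing the edges of the closed tour. A small sketch of that evaluation follows; the distance matrix here is a hypothetical one for illustration (the exact edge weights of the figure's instance are not fully recoverable from the slide).

```python
def tour_length(tour, dist):
    """Length of a closed tour: sum of consecutive edges plus the
    edge returning to the start city."""
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

# A hypothetical symmetric 5-city distance matrix (illustrative only,
# not the instance in the figure). Cities are numbered 0..4.
dist = [
    [0, 2, 9, 10, 7],
    [2, 0, 6, 4, 3],
    [9, 6, 0, 8, 5],
    [10, 4, 8, 0, 6],
    [7, 3, 5, 6, 0],
]
length = tour_length([0, 1, 2, 3, 4], dist)  # tour "12345" in slide notation
```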
2. Simulated Annealing – Kirkpatrick (Science 1983)
• Acceptance probability: p(T, s', s) = exp(−(f(s') − f(s)) / T)
• Temperature schedule: T_{k+1} = α·T_k, 0 < α < 1
• Random walk + iterative improvement
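The two formulas above can be combined into a short sketch (minimization; names and parameter values are illustrative): improving moves are always accepted, worsening moves are accepted with probability exp(−(f(s') − f(s))/T), and the temperature cools geometrically.

```python
import math
import random

def simulated_annealing(f, x, step=0.5, t0=10.0, alpha=0.95, iterations=2000):
    """Minimize f: accept worse neighbors with probability
    exp(-(f(s') - f(s)) / T); cool with T_{k+1} = alpha * T_k."""
    t = t0
    fx = f(x)
    best_x, best_f = x, fx
    for _ in range(iterations):
        neighbor = x + random.uniform(-step, step)
        fn = f(neighbor)
        # worsening moves pass with probability exp(-(fn - fx) / t)
        if fn < fx or random.random() < math.exp(-(fn - fx) / t):
            x, fx = neighbor, fn
            if fx < best_f:
                best_x, best_f = x, fx
        t *= alpha  # geometric cooling schedule
    return best_x, best_f

best_x, best_f = simulated_annealing(lambda x: (x - 3) ** 2, x=0.0)
```

At high T the search behaves like a random walk; as T shrinks it degenerates into iterative improvement, matching the last bullet above.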
3. Tabu Search – Glover 1986
• Simple Tabu Search
• Tabu list
• Tabu tenure: the length of the tabu list
• Aspiration condition
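A minimal sketch of simple Tabu Search over bit strings, illustrating the tabu list, tabu tenure, and aspiration condition named above (the problem, move structure, and parameter values are mine, chosen for illustration):

```python
from collections import deque

def tabu_search(f, x, tenure=3, iterations=50):
    """Simple tabu search over bit strings (maximization). A move flips
    one bit; recently flipped positions stay tabu for `tenure` moves,
    unless the move beats the best solution so far (aspiration)."""
    best, best_f = x[:], f(x)
    tabu = deque(maxlen=tenure)  # tabu list; its length is the tabu tenure
    for _ in range(iterations):
        candidates = []
        for i in range(len(x)):
            y = x[:]
            y[i] = 1 - y[i]
            fy = f(y)
            # aspiration condition: a tabu move is allowed if it beats best_f
            if i not in tabu or fy > best_f:
                candidates.append((fy, i, y))
        if not candidates:
            break
        fy, i, y = max(candidates)  # best admissible neighbor (may be worse)
        x = y
        tabu.append(i)
        if fy > best_f:
            best, best_f = y[:], fy
    return best, best_f

# Maximize the number of ones in a 6-bit string (optimum: all ones).
best, best_f = tabu_search(lambda b: sum(b), [0, 1, 0, 0, 1, 0])
```

Unlike hill climbing, the search always moves to the best admissible neighbor even when it is worse, and the tabu list keeps it from immediately undoing that move.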
4. Variable Neighborhood Search – Hansen and Mladenović 1999
• Composed of three phases: ① shaking ② local search ③ move
• A set of neighborhood structures N_1, N_2, ..., N_k_max
Initialization:
Select the set of neighborhood structures N_k, k = 1, ..., k_max;
Find an initial solution x;
Choose a stopping condition;
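The three phases can be sketched as follows (a simplified continuous-domain version with illustrative names and parameters; in the method above the neighborhoods N_k are problem-specific structures, here they are perturbations whose size grows with k):

```python
import random

def shake(x, k, step=0.5):
    """① Shaking: draw a random point in the k-th neighborhood of x
    (here: a perturbation whose size grows with k)."""
    return [xi + random.uniform(-k * step, k * step) for xi in x]

def local_search(f, x, step=0.1, tries=200):
    """② First-improvement local search around x (minimization)."""
    fx = f(x)
    for _ in range(tries):
        y = [xi + random.uniform(-step, step) for xi in x]
        fy = f(y)
        if fy < fx:
            x, fx = y, fy
    return x, fx

def vns(f, x, k_max=3, iterations=100):
    """Basic VNS loop: shake, local search, then ③ move to x'' only
    if it improves; otherwise try the next neighborhood."""
    fx = f(x)
    for _ in range(iterations):
        k = 1
        while k <= k_max:
            x1 = shake(x, k)              # ① shaking in N_k
            x2, f2 = local_search(f, x1)  # ② local search
            if f2 < fx:                   # ③ move: restart from N_1
                x, fx, k = x2, f2, 1
            else:
                k += 1                    # try a larger neighborhood
    return x, fx

best, best_f = vns(lambda v: sum(vi ** 2 for vi in v), [2.0, -2.0])
```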
[Figure: shaking x to x' in a neighborhood N_k, local search from x' to x'', and moving to x'' when it improves Best]
V. Population-Based Methods
1. Genetic Algorithm – Holland 1975
• Coding of a solution --- chromosome
• Fitness function --- related to the objective function
• Initial population
An example: TSP (Traveling Salesman Problem)
[Figure: the same 5-city instance as before]
A solution is a sequence of cities:
• 12345: tour length = 31
• 13452: tour length = 63
Genetic Algorithm
• Reproduction (Selection)
• Crossover
• Mutation
Example 1
Maximize f(x) = x² where x is an integer and 0 ≤ x ≤ 31
1. Coding of a solution : A five-bit integer,
e.g. 01101
2. Fitness function : F(x) = f(x) = x2
3. Initial population : (Randomly generated)
01101
11000
01000
10011
• Reproduction: roulette-wheel selection
• Crossover
• Mutation
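Example 1 above can be put together into a complete GA sketch: 5-bit chromosomes, roulette-wheel reproduction, one-point crossover, and bit-flip mutation. Population size and other parameter values are illustrative (the population size must be even so parents pair up for crossover).

```python
import random

def ga_maximize_x_squared(pop_size=8, generations=50, p_mut=0.05):
    """GA for Example 1: maximize f(x) = x^2 for 5-bit integers x."""
    fitness = lambda bits: int(bits, 2) ** 2
    pop = [''.join(random.choice('01') for _ in range(5))
           for _ in range(pop_size)]
    for _ in range(generations):
        # Reproduction: roulette-wheel selection, proportional to fitness
        weights = [fitness(c) + 1 for c in pop]   # +1 avoids all-zero weights
        parents = random.choices(pop, weights=weights, k=pop_size)
        # One-point crossover on consecutive pairs of parents
        next_pop = []
        for a, b in zip(parents[::2], parents[1::2]):
            point = random.randint(1, 4)
            next_pop += [a[:point] + b[point:], b[:point] + a[point:]]
        # Mutation: flip each bit with small probability
        pop = [''.join(bit if random.random() > p_mut else str(1 - int(bit))
                       for bit in c) for c in next_pop]
    return max(pop, key=fitness)

best = ga_maximize_x_squared()
```

Selection pressure quickly favors chromosomes with the high-order bits set, so the population tends toward 11111 (x = 31, f = 961), though a run this small gives no guarantee.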
2. Ant Colony Optimization – Dorigo 1992
• Pheromone
Initialization;
Loop
    Loop
        Each ant applies a state transition rule to incrementally build a solution and applies a local updating rule to the pheromone;
    Until every ant has built a complete solution
Until End_Condition
Example: TSP
A simple heuristic – a greedy method: choose the shortest edge out of the current node
[Figure: the same 5-city instance as before]
Solution: 15432
• Each ant builds a solution using a step-by-step constructive decision policy.
• How does ant k choose the next city s to visit from city r?
s = argmax_{u ∈ J_r} { τ(r,u)·[η(r,u)]^β }   if q ≤ q0
s = S                                        otherwise
where η(r,u) = 1/δ(r,u), δ(r,u) is a distance measure associated with edge (r,u), J_r is the set of cities ant k has not yet visited, and S is a city drawn according to the probability
p_k(r,s) = τ(r,s)·[η(r,s)]^β / Σ_{u ∈ J_r} τ(r,u)·[η(r,u)]^β   if s ∈ J_r
p_k(r,s) = 0                                                   otherwise
Local update of pheromone:
τ(r,s) ← (1 − ρ)·τ(r,s) + ρ·Δτ(r,s), 0 < ρ < 1
where Δτ(r,s) = τ_0, and τ_0 > 0 is the initial pheromone level
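The state transition rule and local update above can be sketched for TSP as follows. This is a simplified Ant Colony System: the distance matrix, τ_0 heuristic, and parameter values are illustrative, and the global pheromone update here is a basic "reinforce the best tour" rule rather than the exact published one.

```python
import random

def acs_tsp(dist, n_ants=10, iterations=50, beta=2.0, rho=0.1, q0=0.9):
    """Ant Colony System sketch for TSP: state transition rule with
    exploitation parameter q0, local update tau <- (1-rho)*tau + rho*tau0,
    and a simple global update reinforcing the best tour found so far."""
    n = len(dist)
    tau0 = 1.0 / (n * sum(map(sum, dist)))       # rough initial pheromone level
    tau = [[tau0] * n for _ in range(n)]
    eta = [[1.0 / dist[r][s] if r != s else 0.0 for s in range(n)]
           for r in range(n)]
    best_tour, best_len = None, float('inf')
    for _ in range(iterations):
        for _ in range(n_ants):
            r = random.randrange(n)
            tour, unvisited = [r], set(range(n)) - {r}
            while unvisited:
                score = lambda u: tau[r][u] * eta[r][u] ** beta
                if random.random() < q0:         # exploitation: argmax rule
                    s = max(unvisited, key=score)
                else:                            # biased exploration: p_k(r,s)
                    total = sum(score(u) for u in unvisited)
                    s = random.choices(list(unvisited),
                                       [score(u) / total for u in unvisited])[0]
                # local pheromone update on the traversed edge
                tau[r][s] = (1 - rho) * tau[r][s] + rho * tau0
                tour.append(s)
                unvisited.remove(s)
                r = s
            length = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
            if length < best_len:
                best_tour, best_len = tour, length
        # global pheromone update: reinforce edges of the best tour so far
        for i in range(n):
            a, b = best_tour[i], best_tour[(i + 1) % n]
            tau[a][b] = (1 - rho) * tau[a][b] + rho / best_len
            tau[b][a] = tau[a][b]
    return best_tour, best_len

# A hypothetical symmetric 5-city instance (illustrative only).
dist = [[0, 2, 9, 10, 7],
        [2, 0, 6, 4, 3],
        [9, 6, 0, 8, 5],
        [10, 4, 8, 0, 6],
        [7, 3, 5, 6, 0]]
tour, length = acs_tsp(dist)
```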
[Figure: a particle at x_t moves with velocity v to a new position x_{t+1}, attracted toward pBest and gBest]
3. Particle Swarm Optimization – Kennedy and Eberhart 1995
Modification of velocities and positions:
v_{t+1} = v_t + c_1·r_1·(pBest − x_t) + c_2·r_2·(gBest − x_t)
x_{t+1} = x_t + v_{t+1}
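The velocity and position updates can be sketched as below (minimization; names and parameter values are illustrative). One deviation from the 1995 update rule: an inertia weight w scales the previous velocity, a common later refinement that keeps this small run stable.

```python
import random

def pso(f, dim=2, n_particles=20, iterations=200, w=0.7, c1=1.5, c2=1.5):
    """PSO sketch (minimization) with inertia weight w:
    v <- w*v + c1*r1*(pBest - x) + c2*r2*(gBest - x);  x <- x + v."""
    xs = [[random.uniform(-5, 5) for _ in range(dim)]
          for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]                  # each particle's best position
    pbest_f = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]    # swarm's best position
    for _ in range(iterations):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]            # x_{t+1} = x_t + v_{t+1}
            fx = f(xs[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = xs[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = xs[i][:], fx
    return gbest, gbest_f

gbest, gbest_f = pso(lambda v: sum(vi ** 2 for vi in v))
```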
VI. Multiple Trajectory Search (MTS)
1.MTS begins with multiple initial
solutions, so multiple trajectories will
be produced in the search process.
2.Initial solutions will be classified as
foreground solutions and background
solutions, and the search will be focused
mainly on foreground solutions.
3.The search neighborhood begins with a
large-scale neighborhood, then it
contracts step by step until it reaches the
pre-defined tiny size. At this time, it is
reset to its original size.
4.Three candidate local search methods were designed. When we want to search the neighborhood of a solution, they are tested first so that we can choose the one that best fits the landscape of the solution.
• The MTS focuses its search mainly on foreground solutions. It does an iterated local search using one of three candidate local search methods. By choosing the local search method that best fits the landscape of a solution's neighborhood, the MTS may find its way to a local optimum or the global optimum.
• Step 1 : Find M initial solutions that are
uniformly distributed over the solution space.
• Step 2 : Iteratively apply local search to all
these M initial solutions.
• Step 3 : Choose K foreground solutions based
on the grades of the solutions.
• Step 4: Iteratively apply local search to K
foreground solutions.
• Step 5 : If termination condition is met then
stop else go to Step 3.
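The five steps above can be sketched as a simplified skeleton. Be aware of what is simplified: the actual MTS generates initial solutions with a simulated orthogonal array, grades solutions by their local-search performance, and chooses among the three local search methods per solution; here initialization is uniform random, the grade is the objective value, and a single dimension-wise local search stands in for all three.

```python
import random

def mts_sketch(f, dim=2, m=10, k=3, iterations=40):
    """Simplified MTS skeleton (minimization): M initial solutions,
    iterated local search focused on the K best-graded (foreground)
    solutions, with a contracting-then-resetting search range."""
    def local_search(x, fx, step):
        # search along each dimension in a randomly permuted order
        for d in random.sample(range(dim), dim):
            for delta in (-step, step):
                y = x[:]
                y[d] += delta
                fy = f(y)
                if fy < fx:
                    x, fx = y, fy
        return x, fx

    # Step 1: M initial solutions (uniform random stands in for the
    # simulated orthogonal array of the real MTS)
    solutions = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(m)]
    scores = [f(x) for x in solutions]          # Step 2 grades (simplified)
    step = 2.0
    for _ in range(iterations):
        # Steps 3-4: pick the K foreground solutions and search them
        foreground = sorted(range(m), key=lambda i: scores[i])[:k]
        for i in foreground:
            solutions[i], scores[i] = local_search(solutions[i], scores[i], step)
        step /= 2                               # neighborhood contracts
        if step < 1e-6:
            step = 2.0                          # reset to the original size
    best = min(range(m), key=lambda i: scores[i])
    return solutions[best], scores[best]

best_x, best_f = mts_sketch(lambda v: sum(vi ** 2 for vi in v))
```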
• The three local search methods begin their search in a very large "neighborhood". The neighborhood then contracts step by step until it reaches a pre-defined tiny size, after which it is reset to its original size.
Three Local Search Methods
• Local Search 1
This local search method searches
along each of n dimensions in a
randomly permuted order.
• Local Search 2
This local search method searches
along a direction by considering
about n/4 dimensions.
Three Local Search Methods
• Local Search 3
This local search method also searches along each of the n dimensions in a randomly permuted order, but it evaluates multiple solutions along each dimension within a pre-specified range.
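The idea behind Local Search 3 can be sketched as follows: along each dimension, in a randomly permuted order, evaluate several equally spaced candidates within a range and keep the best. This is only an illustration of the sampling pattern; the candidate spacing and acceptance details of the actual MTS differ.

```python
import random

def local_search3_sketch(f, x, radius=1.0, samples=5):
    """Sketch of the Local Search 3 idea (minimization): along each of
    the n dimensions, in a randomly permuted order, evaluate several
    equally spaced candidates within [x[d]-radius, x[d]+radius]."""
    n = len(x)
    fx = f(x)
    for d in random.sample(range(n), n):
        center = x[d]
        for i in range(samples):
            # equally spaced candidates across the pre-specified range
            cand = center - radius + i * (2 * radius / (samples - 1))
            y = x[:]
            y[d] = cand
            fy = f(y)
            if fy < fx:
                x, fx = y, fy
    return x, fx

x, fx = local_search3_sketch(lambda v: sum(vi ** 2 for vi in v), [0.8, -0.6])
```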
Multiple Trajectory Search for
Large Scale Global Optimization
Winner of the 2008 IEEE World Congress on Computational Intelligence Competition on Large Scale Global Optimization
Multiple Trajectory Search for
Unconstrained/Constrained
Multi-Objective Optimization
Second and Third Place in the 2009 IEEE Congress on Evolutionary Computation Competition on Performance Assessment of Constrained/Bound-Constrained Multi-Objective Optimization Algorithms
VII. The Applications of Metaheuristics
1. Solving NP-hard Optimization Problems
• Traveling Salesman Problem
• Maximum Clique Problem
• Flow Shop Scheduling Problem
• P-Median Problem
2. Search Problems in Many Applications
• Feature Selection in Pattern Recognition
• Automatic Clustering
• Machine Learning (e.g. Neural Network
Training)
VIII. Conclusions
• For NP-hard optimization problems and complicated search problems, metaheuristic methods are very good choices.
• They are usually more efficient than exact methods such as branch-and-bound.
• They usually obtain better-quality solutions than simple heuristic methods.
• Hybridization of metaheuristics
• How to make the search more systematic ?
• How to make the search more controllable ?
• How to make the performance scalable?
References
1. Osman, I. H., and Laporte, G. “Metaheuristics: A bibliography”. Ann. Oper. Res. 63, 513–623, 1996.
2. Blum, C., and Roli, A. “Metaheuristics in combinatorial optimization: Overview and conceptual comparison”. ACM Computing Surveys, 35(3), 268–308, 2003.
3. Kirkpatrick, S., Gelatt, C. D., and Vecchi, M. P. “Optimization by simulated annealing”. Science, 220(4598), 671–680, 1983.
4. Glover, F. “Future paths for integer programming and links to artificial intelligence”. Comput. Oper. Res. 13, 533–549, 1986.
5. Hansen, P., and Mladenović, N. “An introduction to variable neighborhood search”. In Metaheuristics: Advances and Trends in Local Search Paradigms for Optimization, S. Voß, S. Martello, I. Osman, and C. Roucairol, Eds. Kluwer Academic Publishers, Chapter 30, 433–458, 1999.
6. Holland, J. H. Adaptation in Natural and Artificial Systems. The University of Michigan Press, Ann Arbor, MI, 1975.
7. Dorigo, M. Optimization, Learning and Natural Algorithms (in Italian). Ph.D. thesis, DEI, Politecnico di Milano, Italy, 1992.
8. Kennedy, J., and Eberhart, R. “Particle swarm optimization”. Proceedings of the 1995 IEEE International Conference on Neural Networks, pp. 1942–1948, IEEE Press, 1995.
9. Tseng, L.-Y., and Chen, C. “Multiple Trajectory Search for Large Scale Global Optimization”. Proceedings of the 2008 IEEE Congress on Evolutionary Computation, June 2–6, 2008, Hong Kong.
10. Tseng, L.-Y., and Chen, C. “Multiple Trajectory Search for Unconstrained/Constrained Multi-Objective Optimization”. Proceedings of the 2009 IEEE Congress on Evolutionary Computation, May 18–21, 2009, Trondheim, Norway.
Thank you!
Wishing you all good health
and happiness!