Simulated Annealing Algorithm

Abstract—Simulated annealing (SA) is a solo-search algorithm that tries to simulate the cooling process of molten metals through annealing in order to find the optimum solution of an optimization problem. SA selects a feasible starting solution, produces a new solution in its vicinity, and decides by some rules whether or not to move to the new solution. However, the results found by SA depend on the selection of the starting point and on the decisions SA makes. In this paper, in order to ameliorate these drawbacks of the algorithm, a population-based simulated annealing (PSA) algorithm is proposed. PSA uses the population's ability to seek different parts of the search space, thus hedging against bad luck in the initial solution or in the decisions. A set of benchmark functions was used in order to evaluate the performance of the PSA algorithm. Simulation results accentuate the superior capability of PSA in comparison with other optimization algorithms.

Keywords—optimization, solo-searcher, population-based searcher, simulated annealing.

I. INTRODUCTION

Thanks to their flexibility in solving complex problems, meta-heuristic optimization algorithms have received significant attention and undergone remarkable growth over the past few decades. The most common way to classify meta-heuristic algorithms is into solo-searchers (simulated annealing (SA), tabu search (TS), hill climbing (HC), etc.) and population-based searchers, such as the genetic algorithm (GA), particle swarm optimization (PSO), and ant colony optimization (ACO), among others. The former employ a single solution during the search process, while in the latter a population of solutions is used and evolved over a given number of iterations. Population-based algorithms have been found to perform well on many real-world problems. This has led to an effort by researchers to understand and explain this behavior. Five distinct mechanisms by which a population-based algorithm might have an advantage over a solo-algorithm have been presented in [1].

Simulated annealing (SA) is a popular generic probabilistic solo-algorithm used for global optimization problems. Its name and inspiration stem from annealing in metallurgy, a process involving the heating and controlled cooling of a material to increase the size of its crystals and reduce its defects. Heating makes the atoms become unstuck from their initial positions and wander randomly through states of higher energy, while slow cooling gives them a better chance of finding configurations with lower internal energy than the initial one. In analogy with this physical process, in SA each feasible solution corresponds to a state of a physical system, and the fitness function that needs to be minimized corresponds to the internal energy of the system in that state. The ultimate goal is to bring the system from an arbitrary (random) initial state to a state in which the energy of the system is minimal. In each stage, SA replaces the current solution by a random nearby solution with a probability that depends both on the difference between the corresponding fitness values and on a parameter named temperature.

Its ease of implementation makes SA an extremely popular method for solving large, practical problems such as the travelling salesman problem [2], job-shop scheduling [3], communication systems [4], and continuous optimization [5], among others. However, SA suffers from two main drawbacks: it can become trapped in local minima, and it can take a long computational time to find a reasonable solution. Because SA is a solo-searcher, its success depends strongly on the selection of the starting point and on the decisions it makes. Hence, any bad luck affects the nature of the results, and instead of a global minimum a local one may be reached, especially when the problem dimension is high and there are many local minima. Moreover, seeking a search space with a
2016 IEEE International Conference on Systems, Man, and Cybernetics • SMC 2016 | October 9-12, 2016 • Budapest, Hungary
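The acceptance rule described above, and the population-plus-memory idea behind PSA, can be sketched in a few lines. This is an illustration, not the authors' implementation: the Gaussian neighborhood, the geometric cooling schedule, and the rule by which the shared memory re-seeds the worst searcher are all assumptions made for the sketch, since the paper's exact update rules are not given in this excerpt.

```python
import math
import random

def sa_step(f, x, fx, t, step, rng):
    """One SA stage: propose a random nearby solution; accept it if it is
    better, otherwise with the Metropolis probability exp(-(fy - fx) / t)."""
    y = [xi + rng.gauss(0.0, step) for xi in x]
    fy = f(y)
    if fy <= fx or rng.random() < math.exp(-(fy - fx) / t):
        return y, fy
    return x, fx

def simulated_annealing(f, x0, t0=10.0, alpha=0.99, step=0.5, iters=3000, seed=0):
    """Plain solo-searcher SA with geometric cooling (illustrative settings)."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        x, fx = sa_step(f, x, fx, t, step, rng)
        if fx < fbest:
            best, fbest = x, fx
        t *= alpha  # slow cooling
    return best, fbest

def psa(f, pop, t0=10.0, alpha=0.99, step=0.5, iters=3000, seed=0):
    """Population-based SA sketch: several SA searchers run side by side and a
    shared memory keeps the best solution found so far. As one plausible
    reading of the paper's 'memory', the currently worst searcher is
    occasionally restarted from that memory (an assumption of this sketch)."""
    rng = random.Random(seed)
    xs = [list(x) for x in pop]
    fs = [f(x) for x in xs]
    fmem = min(fs)
    mem = list(xs[fs.index(fmem)])  # shared memory: best solution so far
    t = t0
    for it in range(iters):
        for i in range(len(xs)):
            xs[i], fs[i] = sa_step(f, xs[i], fs[i], t, step, rng)
            if fs[i] < fmem:
                mem, fmem = list(xs[i]), fs[i]
        if it % 100 == 99:  # occasional information sharing via the memory
            worst = fs.index(max(fs))
            xs[worst], fs[worst] = list(mem), fmem
        t *= alpha
    return mem, fmem

# Toy demonstration on the sphere function.
sphere = lambda x: sum(xi * xi for xi in x)
x_sa, f_sa = simulated_annealing(sphere, [5.0, -5.0])
rng = random.Random(1)
pop = [[rng.uniform(-5.0, 5.0) for _ in range(2)] for _ in range(10)]
x_psa, f_psa = psa(sphere, pop)
```

On a simple convex test function both variants drive the objective near zero; the point of the population is that a single unlucky starting point no longer determines the outcome, which is exactly the motivation the paper gives for PSA.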
A. Benchmark Functions

It has been proven that, under certain assumptions, no single search method is the best on average for all problems [15]. In order to evaluate the optimization power of the PSA algorithm without biasing the conclusions toward a few selected problems, a large set of standard benchmark functions has been employed. Table I lists these seventeen benchmark functions and their descriptions. A more detailed description of the functions can be found in [7].
y_i = 1 + (x_i + 1)/4

u(x_i, a, k, m) = { k(x_i − a)^m,   x_i > a
                    0,              −a ≤ x_i ≤ a
                    k(−x_i − a)^m,  x_i < −a }

f12(x) = 0.1{10 sin^2(3π x_1) + Σ_{i=1}^{n−1} (x_i − 1)^2 [1 + 10 sin^2(3π x_{i+1})] + (x_n − 1)^2 [1 + sin^2(2π x_n)]} + Σ_{i=1}^{n} u(x_i, 5, 100, 4),   n = 30, [−50, 50]^n, f_min = 0

f13(x) = x_1^2 + 2x_2^2 − 0.3 cos(3π x_1) − 0.4 cos(4π x_2) + 0.7,   n = 2, [−100, 100]^n, f_min = 0

The PSA algorithm has been tested on a set of unimodal functions and compared with the other algorithms. Table II summarizes the mean and standard deviation of the function values over 30 runs found by PSA, GSA, and nonu-SA, the rank of each algorithm in solving the unimodal functions f1-f7, as well as the best performance of each algorithm among the runs. The results of the GA, PSO, and GSO algorithms, obtained over 1000 runs, are also listed in Table III in comparison with the results found by our PSA.
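For concreteness, the function definitions above translate directly into code. The following is a plain transcription of u, f12, and f13 as given in Table I; the variable names are ours, and the minima noted in the comments follow the table's f_min = 0 entries.

```python
import math

def u(x, a, k, m):
    """Penalty term u(x_i, a, k, m) used in f12, as defined in Table I."""
    if x > a:
        return k * (x - a) ** m
    if x < -a:
        return k * (-x - a) ** m
    return 0.0

def f12(x):
    """Penalized multimodal function f12 (n = 30, domain [-50, 50]^n,
    minimum value 0, attained at x = (1, ..., 1))."""
    n = len(x)
    s = 10 * math.sin(3 * math.pi * x[0]) ** 2
    s += sum((x[i] - 1) ** 2 * (1 + 10 * math.sin(3 * math.pi * x[i + 1]) ** 2)
             for i in range(n - 1))
    s += (x[n - 1] - 1) ** 2 * (1 + math.sin(2 * math.pi * x[n - 1]) ** 2)
    return 0.1 * s + sum(u(xi, 5, 100, 4) for xi in x)

def f13(x):
    """Bohachevsky-type function f13 (n = 2, domain [-100, 100]^2,
    minimum value 0, attained at the origin)."""
    x1, x2 = x
    return (x1 ** 2 + 2 * x2 ** 2 - 0.3 * math.cos(3 * math.pi * x1)
            - 0.4 * math.cos(4 * math.pi * x2) + 0.7)
```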
TABLE II. COMPARISON OF PSA WITH GSA AND NONU-SA ON BENCHMARK FUNCTIONS F1-F7. ALL RESULTS HAVE BEEN AVERAGED OVER 30 RUNS.

Function | Index | PSA      | GSA     | nonu-SA
f1       | Mean  | 1.17e-45 | 15.88   | 1.95e-8
f1       | Std.  | 2.56e-45 | 2.28    | 2.58e-8
f1       | rank  | 1        | 3       | 2
f1       | Best  | 1.45e-49 | 10.86   | 1.73e-9
f2       | Mean  | 1.44e-18 | 7.51e-5 | 3.32e-4
f2       | Std.  | 7.73e-18 | 4.07e-6 | 9.63e-3
f2       | rank  | 1        | 2       | 3
f2       | Best  | 5.41e-26 | 14.69   | 1.78e-2
f3       | Mean  | 5.19     | 84.31   | 10.65
f3       | Std.  | 3.35     | 20.56   | 5.89
f3       | rank  | 1        | 3       | 2
f3       | Best  | 1.09     | 57.07   | 2.24
f4       | Mean  | 5.46e-8  | 5.70    | 0.38
f4       | Std.  | 1.08e-7  | 4.29    | 0.20
f4       | rank  | 1        | 3       | 2
f4       | Best  | 1.59e-9  | 1.98    | 0.073
f5       | Mean  | 26.04    | 4.75e3  | 53.38
f5       | Std.  | 0.13     | 1.24e3  | 56.86
f5       | rank  | 1        | 3       | 2
f5       | Best  | 25.71    | 2.82e3  | 19.53
f6       | Mean  | 0        | 17.67   | 1.37
f6       | Std.  | 0        | 2.17    | 1.27
f6       | rank  | 1        | 3       | 2
f6       | Best  | 0        | 12      | 0
f7       | Mean  | 4.82e-5  | 27.87   | 4.89e-2
f7       | Std.  | 2.40e-5  | 6.19    | 2.49e-2
f7       | rank  | 1        | 3       | 2
f7       | Best  | 1.82e-5  | 14.28   | 2.01e-2
Average rank |   | 1.00     | 2.86    | 2.14
Final rank   |   | 1        | 3       | 2

TABLE III. COMPARISON OF PSA WITH GA, PSO, AND GSO ON BENCHMARK FUNCTIONS F1-F7.

Function | Index | PSA      | GA        | PSO        | GSO
f1       | Mean  | 1.17e-45 | 3.1711    | 3.6927e-37 | 1.9481e-8
f1       | Std.  | 2.56e-45 | 1.6621    | 2.4598e-36 | 1.1629e-8
f1       | rank  | 1        | 4         | 2          | 3
f2       | Mean  | 1.44e-18 | 0.5771    | 2.9168e-24 | 3.7039e-5
f2       | Std.  | 7.73e-18 | 0.1306    | 1.1362e-23 | 8.6185e-5
f2       | rank  | 1        | 4         | 2          | 3
f3       | Mean  | 5.19     | 9749.9145 | 1.1979e-3  | 5.7829
f3       | Std.  | 3.35     | 2594.9593 | 2.1109e-3  | 3.6813
f3       | rank  | 2        | 4         | 1          | 3
f4       | Mean  | 5.46e-8  | 7.9610    | 0.4123     | 0.1078
f4       | Std.  | 1.08e-7  | 1.5063    | 0.2500     | 3.9981e-2
f4       | rank  | 1        | 4         | 3          | 2
f5       | Mean  | 26.04    | 338.5616  | 37.3582    | 49.8359
f5       | Std.  | 0.13     | 361.497   | 32.1436    | 30.1771
f5       | rank  | 1        | 4         | 2          | 3
f6       | Mean  | 0        | 3.6970    | 0.146      | 1.6000e-2
f6       | Std.  | 0        | 1.9517    | 0.4182     | 0.1333
f6       | rank  | 1        | 4         | 3          | 2
f7       | Mean  | 4.82e-5  | 0.1045    | 9.9024e-3  | 7.3773e-2
f7       | Std.  | 2.40e-5  | 3.6217e-2 | 3.5380e-2  | 9.2557e-2
f7       | rank  | 1        | 4         | 2          | 3
Average rank |   | 1.14     | 4.00      | 2.14       | 2.71
Final rank   |   | 1        | 4         | 2          | 3

D. Multimodal Functions

D.1. Multimodal Functions with Many Local Minima

In the multimodal functions f8-f12, the number of local minima increases exponentially with the problem dimension; such functions are therefore frequently considered difficult to optimize. Table IV presents the results of PSA in comparison with GSA and nonu-SA on f8-f12 over 30 runs. For four of the benchmark functions (f9-f12), PSA finds noticeably better results than the other algorithms. In solving function f10, PSA reaches the global minimum in all runs. In optimizing f8, PSA outperforms nonu-SA and is outperformed by GSA. As a result, the search power of the algorithms can be ordered as PSA > nonu-SA > GSA.

TABLE IV. COMPARISON OF PSA WITH GSA AND NONU-SA ON BENCHMARK FUNCTIONS F8-F12. ALL RESULTS HAVE BEEN AVERAGED OVER 30 RUNS.

Function | Index | PSA      | GSA     | nonu-SA
f8       | Mean  | 1.69     | 2.78e-2 | 44.58
f8       | Std.  | 1.09     | 2.43e-1 | 16.66
f8       | rank  | 2        | 1       | 3
f8       | Best  | 2.46e-8  | 2.42e-2 | 15.92
f9       | Mean  | 4.44e-15 | 19.84   | 0.72
f9       | Std.  | 0        | 0.13    | 0.89
f9       | rank  | 1        | 3       | 2
f9       | Best  | 4.44e-15 | 19.56   | 3.76e-6
f10      | Mean  | 0        | 0.60    | 1.39e-2
f10      | Std.  | 0        | 7.80e-2 | 1.30e-2
f10      | rank  | 1        | 3       | 2
f10      | Best  | 0        | 0.30    | 1.98e-7
f11      | Mean  | 1.57e-32 | 13.28   | 0.44
f11      | Std.  | 5.47e-48 | 4.96    | 0.69
f11      | rank  | 1        | 3       | 2
f11      | Best  | 1.57e-32 | 4.48    | 1.24e-10
f12      | Mean  | 1.35e-31 | 2.29    | 4.03e-3
f12      | Std.  | 6.57e-47 | 0.28    | 5.39e-3
f12      | rank  | 1        | 3       | 2
f12      | Best  | 1.35e-31 | 1.49    | 0.69e-9
Average rank |   | 1.20     | 2.60    | 2.20
Final rank   |   | 1        | 3       | 2

A comparison between the results obtained by PSA and those of GA, PSO, and GSO is presented in Table V. The results of GA, PSO, and GSO have been averaged over 1000 runs. PSA markedly outperforms GA, PSO, and GSO on f9-f12. On f8, PSA produces better results than PSO and is outperformed by GA and GSO. In this case the overall search performance is PSA > GSO > PSO > GA.

TABLE V. COMPARISON OF PSA WITH GA, PSO, AND GSO ON BENCHMARK FUNCTIONS F8-F12.

Function | Index | PSA      | GA        | PSO       | GSO
f8       | Mean  | 1.69     | 0.6509    | 20.7863   | 1.0179
f8       | Std.  | 1.09     | 0.3594    | 5.9400    | 0.9509
f8       | rank  | 3        | 1         | 4         | 2
f9       | Mean  | 4.44e-15 | 0.8678    | 1.3404e-3 | 2.6548e-5
f9       | Std.  | 0        | 0.2805    | 4.2388e-2 | 3.0820e-5
f9       | rank  | 1        | 4         | 3         | 2
f10      | Mean  | 0        | 1.0038    | 0.2323    | 3.0792e-2
f10      | Std.  | 0        | 6.7545e-2 | 0.4434    | 3.0867e-2
f10      | rank  | 1        | 4         | 3         | 2
f11      | Mean  | 1.57e-32 | 4.3572e-2 | 3.9503e-2 | 2.7648e-11
f11      | Std.  | 5.47e-48 | 5.0579e-2 | 9.1424e-2 | 9.1674e-11
f11      | rank  | 1        | 4         | 3         | 2
f12      | Mean  | 1.35e-31 | 0.1681    | 5.0519e-2 | 4.6948e-5
f12      | Std.  | 6.57e-47 | 7.0681e-2 | 0.5691    | 7.001e-4
f12      | rank  | 1        | 4         | 3         | 2
Average rank |   | 1.40     | 3.40      | 3.20      | 2.00
Final rank   |   | 1        | 4         | 3         | 2
D.2. Multimodal Functions with a Few Local Minima

Table VI describes the results of PSA in comparison with GSA and nonu-SA on functions f13-f17 over 30 runs. It is obvious that PSA performs better than the other algorithms on all of these benchmark functions. In solving f13, PSA discovers the global minimum in all runs. The performance of PSA and nonu-SA is similar on f13 and f16. From Table VI, the final ranking of the algorithms in solving multimodal functions with a few local minima is PSA > nonu-SA > GSA.

TABLE VI. COMPARISON OF PSA WITH GSA AND NONU-SA ON BENCHMARK FUNCTIONS F13-F17 (only the closing rows survive: rank 1 / 3 / 1; Best 0 / 7.29e-4 / 0; Average rank 1.00 / 3.00 / 1.60; Final rank 1 / 3 / 2, for PSA / GSA / nonu-SA).

From Table VII we can observe that, in comparison with GA, PSO, and GSO, PSA achieves better results on benchmark functions f14, f16, and f17. On f15, PSA outperforms GA and is […] algorithm. The results of GA, PSO, and GSO have been averaged over 50 runs. Because function f13 has not been solved in [14], the comparison has been made without considering this function. From Table VII we can see that the order of the search performance of the four algorithms is GSO > PSA > GA > PSO.

TABLE VII. COMPARISON OF PSA WITH GA, PSO, AND GSO ON BENCHMARK FUNCTIONS F14-F17.

In order to investigate the convergence rate of the PSA algorithm, the seventeen benchmark test functions were used and the results were averaged over 30 runs. For simplicity, only the performance of PSA on high-dimensional functions over the first 2000 iterations, and on low-dimensional functions over the first 200 iterations, is presented in Figs. 2 to 10. The rapid convergence of PSA can be seen in Figs. 2 to 10, for example on the benchmark functions f1, f3, f14, and f17.

Fig. 3. Average results over 30 runs obtained by PSA on f3.
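Convergence curves of the kind shown in Figs. 2 to 10 are obtained by recording the best objective value found up to each iteration and averaging these traces point-wise over independent runs. A generic helper for this, which is our own illustration and not the authors' experiment code, might look as follows (the toy `random_search_trace` optimizer exists only to demonstrate the helper):

```python
import random

def average_convergence(optimizer, runs=30, iters=200):
    """Call optimizer(seed, iters) several times; each call must return a list
    holding the best objective value found up to every iteration. The traces
    are then averaged point-wise, as in the paper's convergence figures."""
    traces = [optimizer(seed, iters) for seed in range(runs)]
    return [sum(t[i] for t in traces) / runs for i in range(iters)]

def random_search_trace(seed, iters):
    """Toy optimizer: pure random search on a 1-D quadratic, returning the
    best-so-far objective value at every iteration."""
    rng = random.Random(seed)
    best = float("inf")
    trace = []
    for _ in range(iters):
        x = rng.uniform(-5.0, 5.0)
        best = min(best, x * x)  # best value seen so far
        trace.append(best)
    return trace

curve = average_convergence(random_search_trace, runs=30, iters=200)
```

Since each per-run trace is non-increasing by construction, the averaged curve is non-increasing as well, which is why the figures show monotonically descending mean values.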
Fig. 5. Average results over 30 runs obtained by PSA on f6.

Fig. 6. Average results over 30 runs obtained by PSA on f8.

Fig. 8. Average results over 30 runs obtained by PSA on f10.

Fig. 9. Average results over 30 runs obtained by PSA on f14.

Fig. 10. Average results over 30 runs obtained by PSA on f17.
Fig. 7. Average results over 30 runs obtained by PSA on f9.

The promising results obtained by PSA in comparison with both solo-searchers and population-based searchers accentuate the high optimization capability of the proposed population-based SA algorithm. It is clear that using a population can greatly improve the performance of SA and considerably overcome its deficiencies. The success of PSA comes not only from using a population to seek the search space but also from its memory, which provides a social behavior among the solutions. Providing a memory gives an opportunity
for the solutions to know the search space better and to decide about their new solutions.

IV. CONCLUSION AND FUTURE RESEARCH

The solo-search nature of the SA algorithm makes it easily trapped in local minima. In order to hedge against being unlucky in the starting point or in the decisions that SA makes, a population-based simulated annealing (PSA) algorithm has been proposed in this paper. PSA utilizes the ability of a population to efficiently seek different parts of the search space. It has a memory by which solutions share their information and determine their new probable positions. In order to evaluate the optimization power of the proposed algorithm, experiments have been conducted on a set of unimodal and multimodal benchmark functions. In comparison with solo-search algorithms, PSA produces promising results. PSA also shows competitive performance when compared with other population-based algorithms. It can be concluded that PSA is a powerful optimization algorithm with which we may obtain better results than with other algorithms. Further research related to PSA is being carried out to efficiently optimize other large-scale cases (see the benchmarks in [17], [18]).

REFERENCES

[1] A. Prügel-Bennett, “Benefits of a population: five mechanisms that advantage population-based algorithms,” IEEE Transactions on Evolutionary Computation, vol. 14, pp. 500-517, 2010.
[2] V. Černý, “Thermodynamical approach to the traveling salesman problem: an efficient simulation algorithm,” Journal of Optimization Theory and Applications, vol. 45, pp. 41-51, 1985.
[3] C. Low, “Simulated annealing heuristic for flow shop scheduling problems with unrelated parallel machines,” Computers & Operations Research, vol. 32, pp. 2013-2025, 2005.
[4] C. Paik and S. Soni, “A simulated annealing based solution approach for the two-layered location registration and paging areas partitioning problem in cellular mobile networks,” European Journal of Operational Research, vol. 178, pp. 579-594, 2007.
[5] M. Locatelli, “Convergence of a simulated annealing algorithm for continuous global optimization,” Journal of Global Optimization, vol. 18, pp. 219-233, 2000.
[6] X. Yao, “A new simulated annealing algorithm,” International Journal of Computer Mathematics, vol. 56, pp. 161-168, 1995.
[7] Z. Xinchao, “Simulated annealing algorithm with adaptive neighborhood,” Applied Soft Computing, vol. 11, pp. 1827-1836, 2010.
[8] L. Ingber, “Adaptive simulated annealing (ASA): Lessons learned,” 1996.
[9] A. Bevilacqua, “A methodological approach to parallel simulated annealing on an SMP system,” Journal of Parallel and Distributed Computing, vol. 62, pp. 1548-1570, 2002.
[10] O. Cordon, F. Moya, and C. Zarco, “A new evolutionary algorithm combining simulated annealing and genetic programming for relevance feedback in fuzzy information retrieval systems,” Soft Computing, vol. 6, pp. 308-319, 2002.
[11] G. Pajares and J. M. De La Cruz, “On combining support vector machines and simulated annealing in stereovision matching,” IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 34, pp. 1646-1657, 2004.
[12] S. Salcedo-Sanz, R. Santiago-Mozos, and C. Bousono-Calzon, “A hybrid Hopfield network-simulated annealing approach for frequency assignment in satellite communications systems,” IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 34, pp. 1108-1116, 2004.
[13] L. Wang and L. Zhang, “Stochastic optimization using simulated annealing with hypothesis test,” Applied Mathematics and Computation, vol. 174, pp. 1329-1342, 2006.
[14] S. He, Q. H. Wu, and J. R. Saunders, “Group search optimizer: an optimization algorithm inspired by animal searching behavior,” IEEE Transactions on Evolutionary Computation, vol. 13, pp. 973-990, 2009.
[15] D. H. Wolpert and W. G. Macready, “No free lunch theorems for optimization,” IEEE Transactions on Evolutionary Computation, vol. 1, pp. 67-82, 1997.
[16] H. P. Schwefel, Evolution and Optimum Seeking, New York: Wiley, 1995.
[17] S. Mahdavi, M. E. Shiri, and S. Rahnamayan, “Metaheuristics in large-scale global continuous optimization: A survey,” Information Sciences, vol. 295, pp. 407-428, 2015.
[18] M. N. Omidvar, X. Li, and K. Tang, “Designing benchmark problems for large-scale continuous optimization,” Information Sciences, vol. 316, pp. 419-436, 2015.