
2015 IEEE 2nd International Conference on Recent Trends in Information Systems (ReTIS)

A New Adaptive Cuckoo Search Algorithm

Manoj Naik, Kushalin Kashyap, Maheshwari Rashmita Nath, Aneesh Wunnava, Siddharth Sahany, Rutuparna Panda

School of Electronics, ITER, Siksha 'O' Anusandhan University, Bhubaneswar – 751030 (India)
e-mail: manojnaik@soauniversity.ac.in, kushalinkashyap1@soauniversity.ac.in, mahive2009@gmail.com, aneeshwunnava@soauniversity.ac.in, siddharthsahany@soauniversity.ac.in

Indian Institute of Technology, Delhi Main Rd, IIT Campus, Hauz Khas, New Delhi, Delhi 110016 (India)

Department of Electronics and Telecommunication Engineering, Veer Surendra Sai University of Technology, Burla-768018 (India)
Phone: 91-663-2431857, Fax: 91-663-2430204, e-mail: r_ppanda@yahoo.co.in

Abstract—This paper presents a new adaptive Cuckoo search (ACS) algorithm, based on the Cuckoo search (CS), for optimization. The main thrust is to decide the step size adaptively from the fitness value, without using the Levy distribution. The other idea is to enhance the performance in terms of both time and global minima. The performance of ACS on standard benchmark functions shows that the proposed algorithm converges to the best solution in less time than the Cuckoo search.

Keywords—Cuckoo Search, Parameter-free algorithm

I. INTRODUCTION

In the recent past, evolutionary algorithms (EAs) have gained attention in the field of research; they are basically inspired by nature. The main idea behind the EAs is to find the globally optimal solution, or a near-global-optimum solution, in a minimal amount of time [1]. The EAs have been successfully applied in the domains of image segmentation [2], image fusion [3], pattern recognition [4-6], filter design [7, 8], and optimization of an objective function [9-12]. Sometimes it happens that modified EAs provide better solutions than the standard EAs. Here, an attempt has been made to propose a new adaptive algorithm for function optimization.

There are several evolutionary algorithms proposed by researchers. The genetic algorithm (GA) [13] was inspired by the crossover and mutation principles of genetics. The bacterial foraging optimization (BFO) algorithm [14] was proposed by considering the evolution principle behind the E. coli bacterium present in the human intestine. The search for nutrition by ants in a colony led to the development of ant colony optimization (ACO) [15]. Particle swarm optimization (PSO) [16] was developed by taking inspiration from the swarming behavior of a congregation of birds. The social thinking of Cuckoo birds was modeled as Cuckoo search (CS) optimization [17-21]. Similarly, nature-inspired algorithms such as BAT [22], Honey Bee [23], and Coral reef [24] can be used for various optimization problems. However, for a particular problem, a particular algorithm may not be suitable for getting the optimal solution. Sometimes, we cannot get the global solution for a particular problem from a standard evolutionary algorithm. However, Ong in 2014 proposed an adaptive Cuckoo search algorithm (ACSA) [25] with the help of an adaptive step size adjustment strategy. It has been shown that the ACSA performs better than the CS. The major drawback of ACSA is the requirement of fixed, predefined constant parameters to decide the adaptive step size. This may be time consuming and affects the convergence. This has motivated us to form a step size that is automatically decided within the iterative process of the algorithm.

Here, we propose a new adaptive optimization algorithm. When one wants to get a solution in the least amount of time, Cuckoo search optimization is a good choice because it requires fewer parameters in the evolution process. One constraint is the parameter selection beforehand, which affects the performance of the optimization algorithm. The performance may relate to the speed of convergence, reaching the global minima, or falling into local minima. Therefore, we made an attempt to minimize the parameters and to decide the search path adaptively, with less time to reach an optimal value.

II. CUCKOO SEARCH (CS) ALGORITHM

The Cuckoo search (CS) algorithm was developed by Yang & Deb [17, 18], inspired by the social thinking of Cuckoo birds. The Cuckoo bird lays its eggs in the nests of other birds. The host bird discovers the egg laid by the Cuckoo with a probability pa ∈ [0, 1]. The egg is discovered by the host birds either through the eggs or while abandoning the nest to build a new one [26]. For simplicity of mathematical modeling, the number of host nests is assumed to be fixed. The mathematical model of the Cuckoo search (CS) algorithm is described as follows.

Assume there are N Cuckoos present in the environment, where each Cuckoo represents a nest. The nest relates to the solution of an optimization problem. Let us initialize the n-dimensional search space for the ith Cuckoo as X_i = (x_i1, ..., x_id, ..., x_in) for i = 1, 2, ..., N. Then, at time t, the new search space X_i(t+1) for the ith Cuckoo will be calculated as

978-1-4799-8349-0/15/$31.00 ©2015 IEEE
X_i(t+1) = X_i(t) + α × Levy(γ)   (1)

where α is a constant parameter, related to the dimension of the search space, that helps in deciding the step size, and Levy(γ) is the random walk through a Levy flight [27]. In most cases α is taken as a constant value equal to 1.

In general, a random walk's next location depends only on the current location, as derived from the Markov chain, and on the Levy step. In the long run, the Levy step gives larger steps while exploring the search space. Generally, the Levy step is taken from the Levy distribution, which is most commonly obtained from the Mantegna algorithm. So the Levy step size can be obtained from the Mantegna algorithm as

Levy(γ) = u / |z|^(1/(γ−1))   (2)

where u and z are obtained from a normal distribution, γ is considered in the range [1, 3], and the standard deviations related to the normal distributions are

σ_u(γ) = [ Γ(1+γ) sin(πγ/2) / ( Γ((1+γ)/2) γ 2^((γ−1)/2) ) ]^(1/γ), and σ_z(γ) = 1.   (3)

Finally, Levy(γ) multiplied by a factor β gives the step size. The value of β is generally chosen such that the Levy step is not aggressive. Conceptually, the Levy distribution applies a power law for large steps, and thus has an infinite variance, depicted as

Levy ~ u = t^(−γ).   (4)

III. ADAPTIVE CUCKOO SEARCH (ACS) ALGORITHM

The CS algorithm is a heuristic search algorithm which generally explores the search space using the Levy step. The Levy step is taken from the Levy distribution given by either the Mantegna algorithm or McCulloch's algorithm. In [21], the authors suggested that the Levy distribution generated using McCulloch's algorithm is more potent than Mantegna's algorithm. Either way, the Cuckoo search algorithm follows the Levy distribution. Here we make an attempt to make the Cuckoo search adaptive without using the Levy distribution.

The standard Cuckoo search algorithm does not have any control over the step size during the iteration process of reaching the global minima or maxima. Here we make the step size proportional to the fitness of the individual nest in the search space and to the current generation. On the other hand, in some literature α has been taken as a fixed parameter; here we omit the α parameter. Then the adaptive Cuckoo search algorithm step can be modeled as

step_i(t+1) = (1/t)^( (bestfit(t) − worstfit(t)) / (bestfit(t) − fit_i(t)) )   (5)

where
t = generation of the Cuckoo search,
fit_i(t) = fitness value of the ith nest in the tth generation,
bestfit(t) = best fitness value in the tth generation,
worstfit(t) = worst fitness value in the tth generation.

The step size is initially high, but as the generation count increases the step size decreases. This indicates that when the algorithm approaches the global optimal solution, the step size is small. Eq. (5) clearly indicates that the step size is decided adaptively from the fitness values. Then the adaptive Cuckoo search (ACS) algorithm is modeled as

X_i(t+1) = X_i(t) + randn × step_i(t+1).   (6)

Eq. (6) leads to the new search space of the adaptive Cuckoo search (ACS) algorithm from the current solution. Another advantage of the ACS is that it does not require any initial parameter to be defined. As it requires fewer parameters, it is faster than the Cuckoo search algorithm.

Adaptive Cuckoo search algorithm
1. Initialization
   Randomly initialize the N host nests X_i = (x_i1, ..., x_id, ..., x_in) for i = 1, 2, ..., N for an n-dimensional problem and define the fitness function fit(X). Initially take t = 1 and evaluate the fitness function of the host nests fit(X_i) for i = 1, 2, ..., N for the first time.
2. Iterative algorithm
   A. Find the bestfit and worstfit of the current generation among the host nests.
   B. Calculate the step size using Eq. (5).
   C. Then calculate the new positions of the Cuckoo nests using Eq. (6).
   D. Evaluate the objective function of the host nests fit(X_i) for i = 1, 2, ..., N.
   E. Then choose a nest j randomly among the N nests.
      If (fit_i > fit_j), update the jth nest with the new solution. End
   F. The worst nests are abandoned with a probability (pa) and new ones are built.
   G. t = t + 1.
   H. Verify (t <= tmax) or (end criterion not satisfied); if yes then go to A, otherwise end.
Then report the best solution by ranking them.
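As an illustration, the Mantegna Lévy step of Eqs. (2)-(3) and the adaptive step rule of Eqs. (5)-(6) with the loop of steps A-H can be sketched in Python. This is our own minimal reconstruction, not the authors' code: the function names (`mantegna_levy_step`, `acs_step_sizes`, `acs_minimise`), the bound clipping, the tie handling for the best nest, and the small demo on the Sphere function (F1) are all assumptions added for illustration.

```python
# Illustrative sketch only (our reconstruction, not the authors' code).
import math
import numpy as np

def mantegna_levy_step(n, gam=1.5):
    """Mantegna sampling of Levy(gam) per Eqs. (2)-(3):
    Levy = u / |z|**(1/(gam-1)), u ~ N(0, sigma_u^2), z ~ N(0, 1)."""
    num = math.gamma(1 + gam) * math.sin(math.pi * gam / 2)
    den = math.gamma((1 + gam) / 2) * gam * 2 ** ((gam - 1) / 2)
    sigma_u = (num / den) ** (1 / gam)
    u = np.random.normal(0.0, sigma_u, n)
    z = np.random.normal(0.0, 1.0, n)
    return u / np.abs(z) ** (1.0 / (gam - 1.0))

def acs_step_sizes(fits, t):
    """Adaptive steps per Eq. (5), for a minimisation problem.
    The exponent is >= 1, so every step is <= 1/t and shrinks as the
    generation t grows; the best nest's step tends to 0 (set explicitly)."""
    best, worst = fits.min(), fits.max()
    if best == worst:                       # degenerate generation
        return np.full(fits.shape, 1.0 / t)
    denom = best - fits                     # <= 0, equals 0 at the best nest
    expo = np.full(fits.shape, np.inf)
    mask = denom != 0
    expo[mask] = (best - worst) / denom[mask]
    return np.where(np.isinf(expo), 0.0, (1.0 / t) ** expo)

def sphere(x):
    """Benchmark F1, the Sphere model."""
    return float(np.sum(x ** 2))

def acs_minimise(f, dim=5, n=25, pa=0.25, t_max=200, lo=-5.0, hi=5.0, seed=7):
    """One run of the iterative ACS loop (steps A-H above)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (n, dim))
    fits = np.array([f(x) for x in X])
    for t in range(1, t_max + 1):
        steps = acs_step_sizes(fits, t)                             # steps A-B
        X_new = X + rng.standard_normal((n, dim)) * steps[:, None]  # Eq. (6), step C
        X_new = np.clip(X_new, lo, hi)
        fits_new = np.array([f(x) for x in X_new])                  # step D
        for i in range(n):                  # step E: compare with a random nest j
            j = rng.integers(n)
            if fits_new[i] < fits[j]:       # keep the better (lower) fitness
                X[j], fits[j] = X_new[i], fits_new[i]
        n_bad = int(pa * n)                 # step F: abandon the worst nests
        worst_idx = np.argsort(fits)[-n_bad:]
        X[worst_idx] = rng.uniform(lo, hi, (n_bad, dim))
        fits[worst_idx] = np.array([f(x) for x in X[worst_idx]])
    return float(fits.min())
```

Note that with Eq. (5) as reconstructed here, every step is bounded above by 1/t, which matches the text's claim that the step size starts high and decays as the generations advance, while the best nest is left nearly untouched.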
IV. RESULTS AND DISCUSSION

To test our proposed adaptive Cuckoo search (ACS) algorithm, we carried out a performance study with the well-known 23 benchmark functions [28]. The benchmark functions used are categorized in three parts: unimodal, multimodal with variable dimension, and multimodal with fixed dimension. The unimodal test functions are the Sphere model (F1), Schwefel's problem 2.22 (F2), Schwefel's problem 1.2 (F3), Schwefel's problem 2.21 (F4), the generalized Rosenbrock's function (F5), the Step function (F6) and the quartic function with noise (F7). The multimodal functions with variable dimension are the generalized Schwefel's problem 2.26 (F8), the generalized Rastrigin's function (F9), Ackley's function (F10), the generalized Griewank function (F11), the generalized penalized function 1 (F12) and the generalized penalized function 2 (F13). The unimodal (F1-F7) and multimodal functions with variable dimension (F8-F13) have dimension d = 30 for the performance evaluation. The multimodal functions with fixed dimension are Shekel's foxholes function (F14, d = 2), Kowalik's function (F15, d = 4), the Six-hump Camel-back function (F16, d = 2), the Branin function (F17, d = 2), the Goldstein-Price function (F18, d = 2), Hartman's family (F19, d = 3; F20, d = 6) and Shekel's family (F21, d = 4, m = 5; F22, d = 4, m = 7; F23, d = 4, m = 10). In the above, d represents the dimension and m represents the number of local minima. Here, the main focus is to enhance the Cuckoo search algorithm for finding the global minima within less time.

TABLE I. Performance Evaluation.

Fn    Alg.  Best          Mean          Std.          ATime (Sec.)
F1    ACS   3.0916e-011   4.4757e-010   3.9038e-010   4.8842
      CS    3.9298e-011   5.0806e-010   3.7159e-010   6.0579
F2    ACS   7.3860e-005   3.2220e-004   1.6471e-004   5.2306
      CS    1.0205e-005   8.0336e-005   7.5415e-005   6.8499
F3    ACS   1.9564        9.2915        3.9668        12.7819
      CS    2.9970        14.6907       7.1782        14.4177
F4    ACS   0.0114        0.0470        0.0196        7.1576
      CS    2.2423        5.6273        2.0234        11.7317
F5    ACS   19.1351       23.0209       1.5189        5.9156
      CS    5.7358        27.8085       17.5555       8.6053
F6    ACS   0             0             0             6.1039
      CS    0             0             0             6.3473
F7    ACS   0.0093        0.0184        0.0050        6.1452
      CS    0.0109        0.0379        0.0152        7.4226
F8    ACS   -1.0418e+004  -9.2415e+003  474.6379      5.6441
      CS    -9.8816e+003  -9.3129e+003  283.1838      6.8740
F9    ACS   53.8043       96.0495       18.7348       5.4994
      CS    30.3207       51.8353       9.8126        6.7492
F10   ACS   2.0147e-005   0.0021        0.0108        5.4060
      CS    1.9633e-005   0.4419        0.5616        6.6746
F11   ACS   1.3959e-009   5.8510e-005   2.3119e-004   6.0132
      CS    3.3070e-010   4.2978e-004   0.0015        7.2643
F12   ACS   6.5561e-009   0.3271        0.5553        9.4660
      CS    8.0016e-007   0.5407        0.5789        10.7105
F13   ACS   1.8026e-010   2.1975e-004   0.0016        9.2508
      CS    6.6602e-010   0.1801        0.7391        10.4896
F14   ACS   0.9980        0.9980        0             16.9906
      CS    0.9980        0.9980        0             17.9274
F15   ACS   3.0749e-004   3.0749e-004   3.2713e-018   5.2583
      CS    3.0749e-004   3.0749e-004   5.3632e-019   6.0415
F16   ACS   -1.0316       -1.0316       2.2430e-016   4.3653
      CS    -1.0316       -1.0316       2.2430e-016   5.1281
F17   ACS   0.3979        0.3979        3.3645e-016   4.3166
      CS    0.3979        0.3979        3.3645e-016   5.0432
F18   ACS   3.0000        3.0000        3.5820e-015   5.0302
      CS    3.0000        3.0000        3.4252e-015   5.0583
F19   ACS   -3.8628       -3.8628       3.1402e-015   6.7429
      CS    -3.8628       -3.8628       3.1402e-015   7.7172
F20   ACS   -3.3220       -3.3220       0             6.9482
      CS    -3.3220       -3.3220       6.3441e-017   7.9412
F21   ACS   -10.1532      -10.1532      8.7097e-015   7.4554
      CS    -10.1532      -10.1532      8.9396e-015   8.1611
F22   ACS   -10.4029      -10.4029      7.7221e-015   8.1677
      CS    -10.4029      -10.4029      8.2659e-015   9.3168
F23   ACS   -10.5364      -10.5364      9.2442e-015   9.4424
      CS    -10.5364      -10.5364      9.0113e-015   10.2875

The parameters for the CS and ACS are N = 25, pa = 0.25, α = 1, γ = 1.5, β = 10 and tmax = 2000. Each algorithm was run 100 times, and in each run the algorithm performs 100000 function evaluations. For comparison, we use the best minima/maxima ('Best'), the mean ('Mean'), the standard deviation ('Std.'), and the average time ('ATime') to get the best result in an evaluation of 100000 functions, as shown in TABLE I. For every benchmark test function, the performance is compared between ACS and CS. The best solution is shown in bold face. The convergence curves of some benchmark functions are shown in Fig. 1-3.

The results show a significant improvement in the average time to reach the near-global solution on all benchmark functions when we compare ACS with CS. When we consider the unimodal functions (F1-F7), ACS outperforms the CS. For the multimodal functions (F8-F13), except F9, our proposed algorithm ACS outperforms CS. The results of

the multimodal test functions with fixed dimension (F14-F23) do not vary much, but ACS still outperforms CS when we consider the average time for the evaluation of 100000 functions.

Fig. 1(a). Performance comparison of unimodal benchmark function F1.
Fig. 1(b). Performance comparison of unimodal benchmark function F7.
Fig. 2(a). Performance comparison of multimodal benchmark function F10 with varied dimension.
Fig. 2(b). Performance comparison of multimodal benchmark function F13 with varied dimension.
Fig. 3(a). Performance comparison of multimodal benchmark function F20 with fixed dimension.
Fig. 3(b). Performance comparison of multimodal benchmark function F23 with fixed dimension.

The convergence curves of six benchmark functions are presented in Fig. 1-3. From Fig. 1 and Fig. 2, we can say our proposed algorithm

ACS has better convergence than CS. But for the multimodal functions with fixed dimension, both algorithms converge in the same way, as shown in Fig. 3.

V. CONCLUSION

Different modified Cuckoo search algorithms have been proposed for improving the search pattern and the rate of convergence. However, this paper proposes a new adaptive Cuckoo search algorithm that thinks beyond the Levy flight. Another advantage of the ACS is that it is a parameter-free algorithm. Twenty-three benchmark test functions are considered for the performance evaluation of the proposed ACS algorithm. From Fig. 1-3 and TABLE I, it can be easily recognized that the proposed ACS outperforms CS. Further, this algorithm can be used for multi-objective optimization applications with various input constraints. Finally, we conclude that our proposed algorithm ACS has better convergence than the CS in all respects.

REFERENCES

[1] K. Steer, A. Wirth, and S. Halgamuge, "The rationale behind seeking inspiration from nature," in Nature-Inspired Algorithms for Optimisation, ser. Studies in Computational Intelligence, R. Chiong, Ed. Springer, 2009, vol. 193, pp. 51-76.
[2] R. Panda, S. Agrawal, and S. Bhuyan, "Edge magnitude based multilevel thresholding using Cuckoo search technique," Expert Systems with Applications, vol. 40, no. 18, pp. 7617-7628, Dec. 2013.
[3] R. Panda and M. K. Naik, "Fusion of Infrared and Visual Images Using Bacterial Foraging Strategy," WSEAS Trans. on Signal Processing, vol. 8, no. 4, pp. 145-156, 2012.
[4] R. Panda, M. K. Naik, and B. K. Panigrahi, "Face recognition using bacterial foraging strategy," Swarm and Evolutionary Computation, vol. 1, no. 3, pp. 138-146, Sept. 2011.
[5] C. Liu and H. Wechsler, "Evolutionary pursuit and its application to face recognition," IEEE Trans. Pattern Anal. Mach. Intell., vol. 22, no. 6, pp. 570-582, 2000.
[6] W. S. Zheng, J. H. Lai, and P. C. Yuen, "GA-Fisher: a new LDA-based face recognition algorithm with selection of principal components," IEEE Transactions on Systems, Man, and Cybernetics – Part B, vol. 35, no. 5, pp. 1065-1078, 2005.
[7] N. E. Mastorakis, I. F. Gonos, and M. N. S. Swamy, "Design of two-dimensional recursive filters using Genetic algorithm," IEEE Trans. on Circuits and Systems-I: Fundamental Theory and Applications, vol. 50, pp. 634-639, May 2003.
[8] R. Panda and M. K. Naik, "Design of two-dimensional recursive filters using bacterial foraging optimization," Proc. 2013 IEEE Symposium on Swarm Intelligence (SIS), pp. 188-193, April 2013.
[9] W. Du and B. Li, "Multi-strategy ensemble particle swarm optimization for dynamic optimization," Information Sciences, vol. 178, pp. 3096-3109, 2008.
[10] R. Panda and M. K. Naik, "A crossover bacterial foraging optimization algorithm," Applied Computational Intelligence and Soft Computing, Hindawi Publication, vol. 2012, pp. 1-7, 2012.
[11] R. Panda and M. K. Naik, "A novel adaptive crossover bacterial foraging optimization algorithm for linear discriminant analysis based face recognition," Applied Soft Computing, vol. 30, pp. 722-736, 2015.
[12] M. A. Munoz, S. K. Halgamuge, W. Alfonso, and E. F. Caicedo, "Simplifying the Bacteria Foraging Optimization Algorithm," Proc. 2010 IEEE Congress on Evolutionary Computation, pp. 1-7, 18-23 July 2010.
[13] M. Mitchell, An Introduction to Genetic Algorithms, MIT Press, Cambridge MA, USA, 1998.
[14] V. Gazi and K. M. Passino, "Stability analysis of social foraging swarms," IEEE Transactions on Systems, Man, and Cybernetics – Part B, vol. 34, no. 1, pp. 539-557, 2004.
[15] M. Dorigo, V. Maniezzo, and A. Colorni, "The ant system: optimization by a colony of cooperating agents," IEEE Transactions on Systems, Man, and Cybernetics – Part B, vol. 26, no. 1, pp. 29-41, 1996.
[16] J. Kennedy and R. C. Eberhart, "Particle swarm optimization," Proc. IEEE International Conference on Neural Networks, vol. 4, pp. 1942-1948, 1995.
[17] X. S. Yang and S. Deb, "Cuckoo search via Lévy flights," Proc. World Congress on Nature & Biologically Inspired Computing (NaBIC 2009), pp. 210-214, 2009.
[18] X. S. Yang and S. Deb, "Cuckoo search: recent advances and applications," Neural Computing and Applications, vol. 24, no. 1, pp. 169-174, 2013.
[19] Cuckoo Search and Firefly Algorithm. http://link.springer.com/book/10.1007%2F978-3-319-02141-6.
[20] J. F. Chen and Q. H. Do, "Training neural networks to predict student academic performance: A comparison of Cuckoo search and gravitational search algorithms," International Journal of Computational Intelligence and Applications, vol. 13, no. 1, 2014.
[21] H. Soneji and R. C. Sanghvi, "Towards the improvement of Cuckoo search algorithm," Proc. 2012 World Congress on Information and Communication Technologies (WICT), pp. 878-883.
[22] X. S. Yang, "A New Metaheuristic Bat-Inspired Algorithm," Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), Studies in Computational Intelligence, vol. 284, pp. 65-74, 2010.
[23] D. T. Pham, M. Castellani, and H. A. L. Thi, "Nature-Inspired Intelligent Optimisation Using the Bees Algorithm," Transactions on Computational Intelligence XIII, Lecture Notes in Computer Science, vol. 8342, pp. 38-69, 2014.
[24] S. Salcedo-Sanz, J. Del Ser, I. Landa-Torres, S. Gil-López, and J. A. Portilla-Figueras, "The Coral Reefs Optimization Algorithm: A Novel Metaheuristic for Efficiently Solving Optimization Problems," The Scientific World Journal, vol. 2014, Article ID 739768, 15 pages, 2014.
[25] P. Ong, "Adaptive Cuckoo search algorithm for unconstrained optimization," The Scientific World Journal, Hindawi Publication, vol. 2014, pp. 1-8, 2014.
[26] S. Chakraverty and A. Kumar, "Design optimization for reliable embedded system using Cuckoo search," Proc. International Conference on Electronics, Computer Technology, vol. 1, pp. 164-268, 2011.
[27] P. Barthelemy, J. Bertolotti, and D. S. Wiersma, "A Lévy flight for light," Nature, vol. 453, pp. 495-498, 2008.
[28] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, pp. 82-102, 1999.
