Modified Bat Algorithm With Cauchy Mutation and Elite Opposition-Based Learning
Abstract—Metaheuristics can be used to solve complex optimization problems because they offer approximate and acceptable solutions. In recent years, nature has been a source of inspiration for many computer scientists when proposing new metaheuristics, such as the algorithms inspired by swarm intelligence. They are based on the behavior of animals that live in groups, such as birds, fishes and bats. In this context, the Bat Algorithm (BA) is a recent metaheuristic inspired by the echolocation of bats during their flights. However, a problem that this algorithm faces is the loss of the ability to generate diversity and, consequently, the chances of finding the global solution are reduced. This paper proposes a modification to the original BA using two methods known as the Cauchy mutation operator and Elite Opposition-Based Learning. The new variant aims to generate diversity in the algorithm and to increase its convergence velocity. It was compared to the original BA and to another variant found in the literature. For this comparison, the proposed variant used four benchmark functions, during 30 independent runs. After the experiments, the superiority of the new variant is highlighted when the results are compared to the original BA.

Index Terms—Metaheuristic, Swarm Intelligence, Bat Algorithm, Elite Opposition-Based Learning, Cauchy Mutation

I. INTRODUCTION

Optimization problems are present in many real-world domains such as the localization problem [1], parallel processing [2], multiprocessor task scheduling [3], air traffic management [4] and many others. Optimization is a process that aims to find the best solution for a given problem. Therefore, there is a constant need to develop efficient and robust algorithms capable of solving real problems. Metaheuristic optimization algorithms have been widely used to solve these types of problems.

Metaheuristics can be used to guide a subordinate heuristic by combining different concepts for exploring and exploiting the search space. Some metaheuristics are inspired by nature, for example, by swarm intelligence; they are therefore known as algorithms based on swarm intelligence. Some examples of such algorithms are Particle Swarm Optimization (PSO) [5], Artificial Bee Colony (ABC) [6], Firefly Algorithm (FA) [7] and others.

In recent years, various metaheuristic algorithms based on swarm intelligence have been proposed. For instance, the Bat Algorithm (BA) [8] is inspired by the echolocation of bats, using a frequency adjustment technique to balance exploration and exploitation. BA has been used to solve many real-world problems such as image processing [9], data classification [10] and distribution systems [11].

A problem commonly faced by metaheuristic algorithms is known as premature convergence, and it occurs when the algorithm stays "trapped" in a local optimum. BA is powerful in local searches, but at times it may be trapped in some local optimum, so that it cannot perform global search well.

Various papers have proposed improvements to the performance of the original BA. The Hybrid Bat Algorithm (HBA) [12] was obtained by hybridizing the original BA with Differential Evolution [13] strategies. In [14], a variant called Bat Algorithm with Self-Adaptive Mutation (BA–SAM) was proposed. Differently from the original BA, which generates new solutions from random walks only, BA–SAM produces the mutation step size using both Gaussian and Cauchy distributions. In [15], an Improved BA (IBA) is presented, proposing three modifications: Inertia Weight Factor Modification, Adaptive Frequency Modification and Scout Bee Modification.

This paper proposes a BA variant that uses two approaches, called Cauchy mutation and elite opposition-based learning, to increase the diversity of the population. They can be used to improve the convergence velocity of the original BA. The novel variant and the original BA are compared by means of four benchmark functions. The results show that the new variant achieves better performance than the original BA.

The paper is organized as follows. Section II presents a brief background about BA and elite opposition-based learning. Section III describes the Cauchy mutation and presents a novel BA variant to improve the performance of the original algorithm. In Section IV, the experiments are presented and the obtained results are analyzed. Finally, Section V presents conclusions and future work.
978-1-5386-3734-0/17/$31.00 © 2017 IEEE
II. BACKGROUND

This section presents some information needed to understand the algorithm proposed in Section III, namely the Bat Algorithm and the concept of Elite Opposition-Based Learning.

A. Bat Algorithm

Bio-inspired metaheuristic algorithms have been receiving much attention. Yang [8] proposed one of these algorithms, which is inspired by echolocation and is known as the Bat Algorithm (BA). Echolocation is a sophisticated biological capability that bats use to detect their prey and to avoid obstacles.

Bats make calls as they fly and listen to the returning echoes to build up a sonic map of their surroundings. Thus, a bat can compute how far it is from an object. Furthermore, bats can distinguish the difference between an obstacle and a prey even in complete darkness [16].

In BA, all bats use echolocation to perceive and calculate distances. The ith bat has a position x_i (solution), velocity v_i and frequency f_i. For each iteration t, the ith bat moves toward the current global best solution. In order to implement global search, the position, the velocity and the frequency of the ith bat are updated according to Equation (1):

    f_i = f_min + (f_max − f_min) · β
    v_i^t = v_i^{t−1} + (x_i^{t−1} − x_g^{t−1}) · f_i        (1)
    x_i^t = x_i^{t−1} + v_i^t,

where f_min and f_max are the minimum and maximum frequencies, respectively, β is a random value generated from a uniform distribution ∈ [0, 1] and x_g^{t−1} represents the current global best solution.

In order to implement local search, a new candidate solution is generated apart from or close to the current best solution as follows:

    x_new = x_g^{t−1} + ε · Ā^t,        (2)

where ε is a random value ∈ [−1, 1] and Ā^t is the average value of the loudness of all bats.

According to the proximity of the target, the bats decrease the loudness A_i ∈ [A_0, A_min] and increase the rate of pulse emission r_i ∈ [0, 1] as follows:

    A_i^{t+1} = α A_i^t
    r_i^{t+1} = r_i^0 [1 − exp(−γt)],        (3)

where α and γ are constants. For any 0 < α < 1 and γ > 0,

    A_i^t → 0,  r_i^t → r_i^0,  as t → ∞.        (4)

The pseudocode of BA can be summarized as shown in Algorithm 1. It is initialized with a random generation of the population (line 1). In line 2, the initial frequency f_i is determined for the position X_i. Line 3 initializes the pulse rate r_i and the loudness A_i. Lines 4–17 represent the evolution of the bats over time. In this evolution, new solutions are generated and the frequencies, velocities and positions are updated (line 6). In the next line, the pulse emission rate is compared to a random real number drawn from a uniform distribution ∈ [0, 1]. When the random number is greater than r_i, a new solution is generated around the best solution (line 9). In line 11, it is evaluated whether this solution will be accepted. If so, r_i is increased and A_i is decreased (line 13). Finally, the bats are ranked and the best one is selected (line 16).

Algorithm 1 BA Pseudocode
 1: Initialize the bat population X_i (i = 1, 2, ..., N) and V_i
 2: Define pulse frequency f_i at X_i
 3: Initialize pulse rates r_i and the loudness A_i
 4: while (t ≤ num_max_iter) do
 5:   for i = 1 : N do
 6:     Generate new solutions by adjusting frequency, and updating velocities and locations/solutions
 7:     if (rand > r_i) then
 8:       Select the best solution
 9:       Generate a local solution around the best solution
10:     end if
11:     if (rand < A_i & fit(x_i) < fit(x_g^{t−1})) then
12:       Accept the new solution
13:       Increase r_i and reduce A_i
14:     end if
15:   end for
16:   Rank the bats and find the current best
17: end while

B. Elite Opposition-Based Learning

In general, random solutions are used to initialize metaheuristic algorithms. Over time, the algorithm evolves towards the global optimum, and it is common that the solutions are distant from the one that is considered optimal. The worst case occurs when the solutions are in the position opposite to the optimal one. An alternative is to search simultaneously in all directions or to search in the opposite direction. Elite Opposition-Based Learning (EOBL) [17] can be applied to deal with this situation.

Before introducing EOBL, the concept of Opposition-Based Learning (OBL) [18] will be explained. In the computational intelligence field, OBL has been used to increase the efficiency of algorithms. For a given problem, OBL simultaneously evaluates a solution x and its respective opposite solution x̃, which allows a candidate solution to be found close to the global optimum.

OBL can be applied not only to generate random initial solutions, but also at each iteration to improve the diversity during the exploration process [19]. Next, the definitions of Opposite Number and Opposite Point are presented.
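As a concrete illustration of the opposition idea formalized in the definitions that follow, the sketch below computes an opposite number and an opposite point and keeps the fitter of the pair. The names, the bounds and the Sphere stand-in fitness are illustrative assumptions, not taken from the paper's code:

```python
import numpy as np

def opposite(x, a, b):
    """Opposite of x within [a, b] (component-wise for arrays): a + b - x."""
    return a + b - x

def sphere(x):
    """Stand-in fitness to be minimized (illustrative choice)."""
    return float(np.sum(np.square(x)))

# Opposite number: x = 2 in [0, 10] has the opposite 0 + 10 - 2 = 8.
print(opposite(2.0, 0.0, 10.0))    # -> 8.0

# Opposite point with per-dimension bounds [a_i, b_i].
a = np.array([-5.0, 0.0, 1.0])
b = np.array([5.0, 10.0, 3.0])
P = np.array([2.0, 7.0, 1.5])
P_opp = opposite(P, a, b)          # component-wise a_i + b_i - x_i

# OBL keeps whichever of the pair has the better (lower) fitness.
better = P_opp if sphere(P_opp) <= sphere(P) else P
```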
Definition 1: Opposite Number – Let x ∈ ℜ, x ∈ [a, b]; the opposite number x̃ is defined by Equation (5):

    x̃ = a + b − x.        (5)

Similarly, the same definition can be extended to multidimensional problems.

Definition 2: Opposite Point – Let P = (x_1, x_2, ..., x_n) be a point in the n-dimensional space, where x_1, x_2, ..., x_n ∈ ℜ and x_i ∈ [a_i, b_i], ∀i ∈ {1, 2, ..., n}. Then the opposite point P̃ = (x̃_1, x̃_2, ..., x̃_n) is defined by its components:

    x̃_i = a_i + b_i − x_i.        (6)

Considering an optimization problem, P is given as a point in the n-dimensional space (i.e., a candidate solution) and f(·) is a function used to measure the fitness value of the candidate solution. According to Definition 2, P̃ = (x̃_1, x̃_2, ..., x̃_n) is the opposite point of P = (x_1, x_2, ..., x_n). Therefore, in an optimization problem, if f(P̃) ≤ f(P) when minimizing f(·), then P will be replaced by P̃; otherwise, the solution P remains.

Once the OBL definition has been presented, an example will be used to explain EOBL. In this paper, the bat with the best fitness value is viewed as the elite individual. Supposing that the elite bat is X_e = (x_{e1}, x_{e2}, ..., x_{en}), the elite opposition-based solution for the individual X_i = (x_{i1}, x_{i2}, ..., x_{in}) can be defined as X̂_i = (x̂_{i1}, x̂_{i2}, ..., x̂_{in}). It can be obtained by Equation (7):

    x̂_{ij} = δ · (da_j + db_j) − x_{ej},  i = 1, 2, ..., N;  j = 1, 2, ..., n,        (7)

where N is the population size, n is the dimension of X, δ ∈ (0, 1) is a generalized coefficient used to control the opposition magnitude, x_{ej} ∈ [a_j, b_j] and (da_j, db_j) represents the dynamic bound obtained by the following equation:

    da_j = min(x_{ij}),  db_j = max(x_{ij}).        (8)

It is possible that a transformed solution x̂_{ij} will jump out of [a_j, b_j]. If that happens, Equation (9) will be used to reset x̂_{ij}:

    x̂_{ij} = rand(da_j, db_j),  if x̂_{ij} < da_j || x̂_{ij} > db_j.        (9)

III. MODIFIED BA WITH CAUCHY MUTATION AND EOBL

This paper proposes a novel BA variant that uses two techniques known as Cauchy mutation and EOBL. They can be used to increase the exploration of the search space and, consequently, to diversify the population. The proposed variant is an alternative for the original BA to achieve better performance and a higher convergence velocity.

Before presenting the new variant, it is important to highlight the Cauchy mutation. It is characterized by a one-dimensional probability density function centered on the origin, defined by:

    f_T(x) = (1/π) · T / (T² + x²),  −∞ < x < ∞,        (10)

where T > 0 is a scalar parameter. The corresponding distribution function is given by:

    F_T(x) = 1/2 + (1/π) arctan(x/T).        (11)

Therefore, it can be observed that the variance of the Cauchy distribution is infinite. In theory, this allows a more efficient global search [20] and, as a consequence, better results.

The proposed variant is presented in Algorithm 2. The main differences with respect to the original BA are in lines 15–21. In line 15, a bat is randomly drawn and is called bat_r. Next, EOBL is applied to bat_r and it is verified whether its fitness value is better than the original value. If this condition is true, the changes to bat_r will be committed (line 17).

In line 19, Cauchy mutation is applied to bat_r and, after the mutation, it is verified whether its new fitness value is better than the current fitness value. If this condition is true, bat_r will receive the changes performed through the Cauchy mutation (line 20). The rest of the algorithm is similar to the original BA.

Algorithm 2 Modified BA Pseudocode
 1: Initialize the bat population X_i (i = 1, 2, ..., N) and V_i
 2: Define pulse frequency f_i at X_i
 3: Initialize pulse rates r_i and the loudness A_i
 4: while (t ≤ num_max_iter) do
 5:   for i = 1 : N do
 6:     Generate new solutions by adjusting frequency, and updating velocities and locations/solutions
 7:     if (rand > r_i) then
 8:       Select the best solution
 9:       Generate a local solution around the best solution
10:     end if
11:     if (rand < A_i & fit(x_i) < fit(x_g^{t−1})) then
12:       Accept the new solution
13:       Increase r_i and reduce A_i
14:     end if
15:     bat_r ← random_draw()
16:     if fit(EOBL(bat_r)) ≤ fit(bat_r) then
17:       bat_r ← EOBL(bat_r)
18:     end if
19:     if fit(cauchy_mutation(bat_r)) < fit(bat_r) then
20:       bat_r ← cauchy_mutation(bat_r)
21:     end if
22:   end for
23:   Rank the bats and find the current best
24: end while

IV. COMPUTATIONAL EXPERIMENTS

This section presents the benchmark functions used to validate the proposed algorithm. It also presents the configuration of the experiments, as well as the values defined for the algorithm parameters. Finally, several simulations are presented and the analysis of the results is performed.
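The EOBL and Cauchy-mutation steps of lines 15–21 of Algorithm 2 can be sketched as follows. This is a minimal Python illustration, not the authors' MATLAB code: the function names, the Sphere stand-in fitness and the random draws are assumptions. Note that Equation (7), as written, builds the opposite solution from the elite bat's components:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit(x):
    """Stand-in fitness: the Sphere function (to be minimized)."""
    return float(np.sum(x ** 2))

def eobl(elite, pop):
    """Elite opposition, Eqs. (7)-(9): x_hat_j = delta*(da_j + db_j) - x_e_j."""
    delta = rng.uniform(0, 1)                  # generalized coefficient in (0, 1)
    da, db = pop.min(axis=0), pop.max(axis=0)  # dynamic bounds, Eq. (8)
    x_hat = delta * (da + db) - elite
    out = (x_hat < da) | (x_hat > db)          # reset out-of-bound components, Eq. (9)
    x_hat[out] = rng.uniform(da[out], db[out])
    return x_hat

def cauchy_mutation(bat, scale=1.0):
    """Add a heavy-tailed Cauchy-distributed step to each component."""
    return bat + scale * rng.standard_cauchy(bat.shape)

pop = rng.uniform(-5.0, 5.0, (20, 5))
elite = pop[np.argmin([fit(p) for p in pop])]

i = rng.integers(len(pop))                     # line 15: draw a random bat
bat = pop[i].copy()
f_before = fit(bat)

cand = eobl(elite, pop)                        # lines 16-18: keep EOBL result if not worse
if fit(cand) <= fit(bat):
    bat = cand
cand = cauchy_mutation(bat)                    # lines 19-21: keep mutation if strictly better
if fit(cand) < fit(bat):
    bat = cand
pop[i] = bat
print(f_before, fit(pop[i]))                   # greedy acceptance: fitness never gets worse
```

Because both replacements are greedy, the drawn bat's fitness is monotonically non-increasing through the two steps.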
A. Benchmark Functions and Experimental Setup
It is common to use benchmark functions with the as-
sumption that their difficulty corresponds to the difficulties
encountered in real-world applications. Therefore, they are
used to validate and compare the optimization algorithms, as
well as to validate any global optimization approach [21].
Four benchmark functions were chosen to perform the
experiments. They are often applied in minimization problems
and have been used in various BA studies [12][14][22]. For
each one of them, the formulation, the search space, the
optimal solution and the modality are presented in Table I.
Sphere and Rosenbrock functions are unimodal. Often they
are used to test the ability of the algorithm in local searches.
On the other hand, Griewank and Rastrigin have several local
minima and they are used to verify the ability of the algorithm
in global searches.

Figure 1. Convergence mean behavior for f1: 50 bats, 30D, 900 iterations.
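The four functions above can be written compactly. The sketch below uses their standard textbook formulations (the paper's Table I gives the exact forms used); each has a global minimum of 0, at the origin for Sphere, Griewank and Rastrigin and at (1, ..., 1) for Rosenbrock:

```python
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))

def rosenbrock(x):
    return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2))

def griewank(x):
    i = np.arange(1, x.size + 1)
    return float(1.0 + np.sum(x ** 2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i))))

def rastrigin(x):
    return float(10.0 * x.size + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x)))

# Sanity check at the known optima: all four values should be 0.
z, o = np.zeros(5), np.ones(5)
print(sphere(z), rosenbrock(o), griewank(z), rastrigin(z))
```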
All routines were implemented in the MATLAB program-
ming language. The experiments were run on a computer that
uses Intel Core i7 processor with 2.4 GHz frequency, 8 GB
of RAM and Windows 10 Home Single Language (64 bits).
Multiprocessing techniques were not used.
The experiments use various configurations to compare
the original BA to the proposed variant. Dimensions of the
problem and the number of iterations are varied. For each
run, the number of iterations is set to 300, 600 and 900 for 10
dimensions, 20 dimensions and 30 dimensions respectively.
The variant is also compared to another one from the literature using 30
dimensions and 900 iterations. The algorithms are tested with
30 independent runs and the number of bats is fixed to 50.
The values defined for the parameters used in the experiments were the same as those used by [23]: loudness Ai and pulse rate
ri are fixed to 0.5, initial loudness A0 equals 2 and minimum
loudness Amin equals 0. The values defined for the constants
γ and α are 0.05 and 0.95, respectively.

Figure 2. Convergence mean behavior for f2: 50 bats, 30D, 900 iterations.
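With these settings, the schedules of Equation (3) behave as Equation (4) predicts: the loudness decays toward 0 while the pulse rate rises toward its initial value. A quick numerical check (illustrative Python, not the authors' code):

```python
import math

alpha, gamma = 0.95, 0.05      # constants from the experimental setup
A0, r0 = 2.0, 0.5              # initial loudness and pulse rate

A = A0
for _ in range(900):           # 900 iterations, as in the 30-dimension runs
    A *= alpha                 # Eq. (3): A_i^{t+1} = alpha * A_i^t
r = r0 * (1.0 - math.exp(-gamma * 900))  # Eq. (3): r_i^t at t = 900

print(round(A, 6), round(r, 6))  # -> 0.0 0.5, i.e., A -> 0 and r -> r_i^0 (Eq. (4))
```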
Table II
Comparison between original BA and modified BA with Cauchy mutation and EOBL (best results in bold).