Improved Harris Hawks Optimization Based On Adaptive Cooperative Foraging and Dispersed Foraging Strategies
ABSTRACT Harris hawks optimization (HHO) is a new swarm intelligence optimization technique. Because of its simple structure and ease of implementation, HHO has attracted research interest from scholars in different fields. However, low population diversity and a single search method in the exploration phase weaken the global search capability of the HHO algorithm. In response to these defects, this paper proposes an improved HHO algorithm based on adaptive cooperative foraging and dispersed foraging strategies. First, the adaptive cooperative foraging strategy uses three random individuals to guide the position update, which achieves cooperation between individuals. The cooperation behavior is then embedded in a one-dimensional update framework, and the one-dimensional or total-dimensional update operation is selected adaptively. This allows the algorithm to update a specific dimension of an individual's position vector with a certain probability, which improves population diversity. Second, a dispersed foraging strategy is introduced into HHO, forcing a portion of the Harris hawks to leave their current positions and search for more prey, yielding better candidate solutions. This effectively prevents the algorithm from falling into local optima. Finally, a randomly shrinking exponential function is used to simulate the energy change of the prey, so that the algorithm retains exploration ability during the later exploitation process, effectively balancing its exploration and exploitation. The performance of the proposed ADHHO algorithm is evaluated using Wilcoxon's test on unimodal, multimodal and CEC 2014 benchmark functions. Numerical results and statistical experiments show that ADHHO provides better solution quality, convergence accuracy and stability than other state-of-the-art algorithms.
INDEX TERMS Harris hawks optimization, adaptive cooperative foraging, dispersed foraging, Wilcoxon's test, CEC 2014 benchmark functions.
This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/
VOLUME 8, 2020 160297
X. Zhang et al.: Improved HHO Based on Adaptive Cooperative Foraging and Dispersed Foraging Strategies
behavior of bird or fish swarms. It has the advantages of a simple structure and few parameter settings. However, the PSO algorithm easily falls into local optima [12]. The Artificial Bee Colony (ABC) algorithm [13] simulates a particular behavior of honeybees known as foraging. The Ant Colony Optimization (ACO) algorithm is inspired by the way a foraging ant colony finds the optimal path [14]. The pheromone matrix of the ant colony evolves continuously during the iteration of the candidate solutions and eventually yields the optimal solution; it is very effective for path planning problems [15]. The Cuckoo Search (CS) algorithm [16] is another swarm intelligence algorithm, based on the obligate brood parasitism of certain cuckoo species [17]. However, such algorithms are not very popular because of their high time complexity. The fruit fly optimization algorithm (FOA) [18], proposed in 2011, was inspired by the cooperative foraging behavior of fruit flies. Besides, many new SI algorithms appear every year, such as the grey wolf optimizer (GWO) [19], the firefly algorithm (FF) [20], the dragonfly algorithm (DA) [21] and the squirrel search algorithm (SSA) [22], [23]. It is worth mentioning that some recently proposed meta-heuristics, such as Henry gas solubility optimization (HGS) [24], the slime mould algorithm (SMA) [25], the equilibrium optimizer (EO) [26] and the quasi-affine transformation evolutionary (QUATRE) algorithm [27], show excellent performance in solving optimization problems. HGS, recently developed by Hashim et al., simulates the behavior governed by Henry's law: population agents are divided into clusters, one per gas type; the gas equilibrium state is updated by updating each cluster's solubility at different temperatures and pressures; and the gas in the highest equilibrium state is the optimal solution. The advantage of HGS is its search mechanism around the worst gas's neighborhood, which improves the global search capability. SMA mainly simulates the behavior and morphological changes of slime mould during foraging: when food sources differ in quality, the slime mould establishes moving routes toward higher concentrations, thereby securing the maximum concentration of nutrients. SMA strikes an excellent balance between exploration and exploitation, so it can effectively avoid local optima. The inspiration for EO comes from dynamic mass balance on a control volume: the mass balance equation measures the amount of mass generated in the volume over time, and the algorithm seeks the state that achieves system equilibrium. EO is easy to implement and maintains high population diversity, so it has been applied to many real-world optimization problems such as multi-thresholding image segmentation [28] and optimal structural design [29]. QUATRE is a co-evolution framework based on quasi-affine transformation; it has been shown to obtain excellent performance on large-scale optimization problems [27], [30], [31]. Due to its simple structure and high flexibility, the QUATRE algorithm has been successfully applied to text feature extraction with good results [32].

As a new class of bionic optimization algorithms, swarm intelligence has developed rapidly in recent years. However, according to the no free lunch (NFL) theorem, no single algorithm can serve as a general optimizer for all optimization problems [33]. The NFL theorem encourages scholars to propose new optimization algorithms, or to improve classical ones, in pursuit of better optimization performance. Against this background, Heidari et al. proposed a new swarm intelligence algorithm in 2019, the Harris hawks optimization (HHO) algorithm [34], which simulates the cooperative hunting behavior of Harris hawks. Simulation experiments on 29 benchmark functions and several engineering optimization problems proved the effectiveness of HHO. At the same time, the HHO algorithm has the advantages of simple operation, few adjustable parameters and ease of implementation, so it has been applied to practical optimization problems in many disciplines, for example image segmentation [35], structure optimization [36], image denoising [37], parameter identification [38], layout optimization [39] and power load distribution [40].

Although HHO has been successfully applied to various practical optimization problems, the algorithm itself still has some shortcomings. Since the HHO algorithm introduces four different attack strategies, it has a significant advantage in exploitation capability. However, the choice among these four attack strategies is driven by random parameters, resulting in an imbalance between the exploration and exploitation capabilities of the algorithm: when dealing with multimodal or modern, highly complex optimization tasks, the convergence accuracy is low and the algorithm easily falls into local optima. Besides, the HHO algorithm neglects global search in the later iterations. More specifically, the escape energy E in the later stage of the iteration is always less than 1, so the hawks are always in the attacking stage. There is thus no guarantee that the population has gathered around the optimum at the end of the exploration phase, which causes premature convergence. In response to these problems, some scholars have proposed improvement strategies from different perspectives. For example, literature [41] introduced long-term memory into the HHO algorithm, allowing individuals to move based on experience and increasing the population's diversity; however, it ignored the running time of the algorithm and was less effective on high-dimensional problems. Reference [35] used dynamic control parameters to reduce the probability of the HHO algorithm falling into a local optimum, and used mutation operators to further improve the global search capability. Literature [42] added interference terms to the escape energy to control the location of the disturbance peaks, which increased the global search capability in the later stage. Besides, some scholars have combined the exploration ability of other algorithms with HHO, such as combining the sine and cosine algorithm [43], simulated annealing
algorithm [44] and the dragonfly algorithm [45]. However, the above improvements are generally aimed at strengthening exploration, and the lack of a method for balancing the two search capabilities leaves the robustness and search results on multimodal or modern, highly complex optimization tasks generally weak. In addition to the above improvement measures, TABLE 1 summarizes the advantages, disadvantages and applications of other HHO variants.

Given the above discussion, this paper proposes a Harris hawks optimization algorithm based on adaptive cooperative foraging and dispersed foraging strategies, with three improvements addressing the above deficiencies. The main contributions of this article are summarized as follows:

1) Introduced an adaptive cooperative foraging strategy that randomly selects some Harris hawk individuals to forage cooperatively. This foraging strategy is embedded in a one-dimensional position update framework, and the one-dimensional update operation or the traditional total-dimensional update operation is selected adaptively according to the conversion factor. This strategy effectively improves the population diversity of the HHO algorithm.

2) Proposed a dispersed foraging strategy that randomly distributes some Harris hawk individuals to other areas for foraging, to prevent the algorithm from falling into local optima.

3) Used a randomly shrinking exponential function to simulate the energy consumption process of the prey, which effectively resolves the imbalance between exploration and exploitation in the later stage of the algorithm.

4) Compared the proposed algorithm with other well-known metaheuristic algorithms based on convergence curves and statistical measures (e.g., mean, best, worst, standard deviation).

The rest of the paper is organized as follows. Section II briefly introduces the HHO algorithm. Section III details the proposed ADHHO algorithm based on adaptive cooperative foraging and dispersed foraging strategies. Section IV gives the experimental results and analysis. The proposed method is applied to a practical problem in Section V. Finally, Section VI concludes the paper.

II. AN OVERVIEW OF HARRIS HAWKS OPTIMIZATION
The HHO algorithm was inspired by the Harris hawk's foraging behavior and attack strategy. The search process consists of three parts: the exploration phase, the transformation between exploration and exploitation, and the exploitation phase. The first stage involves waiting, finding and discovering prey. Xprey is regarded as the prey position, and the hawk's position is updated as follows:

Xi^(t+1) = Xrand^t − r1 |Xrand^t − 2 r2 Xi^t|,           q ≥ 0.5
Xi^(t+1) = (Xprey^t − Xm^t) − r3 (lb + r4 (ub − lb)),    q < 0.5        (1)

where Xrand^t represents a randomly selected hawk in the current iteration, q is a random value in the range [0, 1], and r1, r2, r3 and r4 are random numbers in [0, 1]. Xm^t denotes the average position of all hawks and is calculated as follows:

Xm^t = (1/N) Σ_{i=1}^{N} Xi^t        (2)

where Xi^t is the position of the ith Harris hawk in iteration t and N is the population size.

At the second stage, the transition from exploration to exploitation depends on the escape energy E of the prey. The mathematical model of the prey's escape energy is:

E = 2E0 × (1 − t/T)        (3)
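The exploration-phase equations above can be sketched in a few lines of code. The snippet below is a minimal illustration of Eqs. (1)–(3), not the authors' implementation; the function names and array layout are choices made here:

```python
import numpy as np

def hho_exploration_step(X, X_prey, lb, ub, rng):
    """One exploration-phase update of the whole population, Eq. (1).

    X      : (N, D) array of current hawk positions
    X_prey : (D,)   best position found so far (the prey)
    lb, ub : scalars or (D,) arrays giving the search-space bounds
    """
    N, _ = X.shape
    X_m = X.mean(axis=0)                      # average position Xm^t, Eq. (2)
    X_new = np.empty_like(X)
    for i in range(N):
        q, r1, r2, r3, r4 = rng.random(5)
        if q >= 0.5:                          # perch based on a random hawk
            X_rand = X[rng.integers(N)]
            X_new[i] = X_rand - r1 * np.abs(X_rand - 2.0 * r2 * X[i])
        else:                                 # perch relative to prey and mean
            X_new[i] = (X_prey - X_m) - r3 * (lb + r4 * (ub - lb))
    return X_new

def escape_energy(E0, t, T):
    """Linearly decaying escape energy of the prey, Eq. (3)."""
    return 2.0 * E0 * (1.0 - t / T)
```

In the full algorithm, the step above is applied while |E| ≥ 1; once |E| < 1, HHO switches to one of its attack (exploitation) strategies.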
III. THE PROPOSED ADHHO ALGORITHM
In this section, a new ADHHO algorithm is proposed to improve the performance of the original HHO algorithm. There are three main innovations. First, an adaptive cooperative foraging strategy is introduced into the original HHO algorithm; it uses the average distance of some randomly selected individuals to guide each search agent's position update, realizing a cooperative mechanism. Then, according to the conversion factor CF, one-dimensional and total-dimensional update operations are selected adaptively to improve population diversity. Second, a dispersed foraging strategy is proposed, distributing some Harris hawk individuals to other regions to explore the search space and prevent the algorithm from falling into local optima. The third innovation is a randomly shrinking exponential function that simulates the escape energy of the prey, so that part of the exploration capability is retained in the late exploitation stage, thereby achieving a balance between exploration and exploitation.

A. ADAPTIVE COOPERATIVE FORAGING STRATEGY
In the original HHO, a single global optimal position (Xprey) is used to guide the hawk swarm's position updates. However, this ignores the possibility that an optimal solution may lie in the vicinity of individuals with poor fitness, leading to premature convergence of the algorithm [47], [49].
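As a concrete reading of this strategy, the sketch below shows a cooperative candidate built from three random individuals being applied either to one randomly chosen dimension or to the whole vector, depending on the conversion factor CF. The paper's exact update equation (Eq. (8)) and the definition of CF are not reproduced in this excerpt, so the candidate rule and the CF threshold here are illustrative assumptions:

```python
import numpy as np

def cooperative_candidate(X, rng):
    """Illustrative cooperative term built from three distinct random hawks.

    The paper's Eq. (8) guides the update with three random individuals;
    the difference-vector form used here is an assumption for illustration.
    """
    a, b, c = rng.choice(len(X), size=3, replace=False)
    return X[a] + rng.random() * (X[b] - X[c])

def adaptive_update(x_i, candidate, CF, rng):
    """Apply a one-dimensional or a total-dimensional update to hawk x_i.

    With probability CF only one randomly chosen dimension takes the
    candidate's value; otherwise the whole vector is replaced.
    """
    x_new = x_i.copy()
    if rng.random() < CF:                 # one-dimensional update
        j = rng.integers(x_i.size)
        x_new[j] = candidate[j]
    else:                                 # traditional total-dimensional update
        x_new[:] = candidate
    return x_new
```

Updating a single dimension perturbs the population only slightly per step, which is the mechanism the paper credits for the improved population diversity.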
TABLE 2. The effect of attenuation factor (δ) on the performance of ADHHO algorithm on four benchmark functions.
FIGURE 5. Comparison between original HHO and proposed ADHHO on escape energy, (a) The original escape energy at HHO, (b) The
randomly shrinking exponential escape energy at ADHHO (δ = 1.5).
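The contrast shown in FIGURE 5 can be reproduced numerically. Eq. (3) gives the original linear decay; the exact expression of ADHHO's randomly shrinking exponential energy is not reproduced in this excerpt, so the exponential envelope below (with attenuation factor δ = 1.5, the value used in the figure) is an illustrative assumption:

```python
import numpy as np

def energy_hho(E0, t, T):
    """Original HHO escape energy, Eq. (3): linear decay to zero."""
    return 2.0 * E0 * (1.0 - t / T)

def energy_adhho(E0, t, T, rng, delta=1.5):
    """Randomly shrinking exponential escape energy (illustrative form).

    A random value is drawn inside an exponentially shrinking envelope,
    mimicking FIGURE 5(b); the paper's exact formula is not shown in this
    excerpt, so this is an assumed sketch.
    """
    envelope = 2.0 * E0 * np.exp(-delta * t / T)
    return envelope * (2.0 * rng.random() - 1.0)
```

Unlike the linear decay, the exponential envelope never collapses to zero before t = T, so nonzero energies survive into the late iterations, matching the balancing effect described in the text.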
the exploitation stage. Therefore, δ helps the algorithm to achieve a balance between exploration and exploitation and to avoid falling into local optima.

D. ALGORITHM FLOW
The primary process of the ADHHO algorithm is shown in FIGURE 6. First, the exploration or exploitation phase is selected according to the escape energy of the prey in each iteration. The adaptive cooperative strategy is introduced into the exploration process, and the solution search equation is the modified location update equation (formula (8)). Then, one-dimensional or total-dimensional update operations are performed according to the conversion factor CF, which improves the diversity of the Harris hawk population and helps to find better solutions. Next, the fitness of the population is evaluated, and better individuals are selected to enter the dispersed foraging stage. The dispersed foraging strategy forces some individuals to leave the current optimal position and explore the target space further, which prevents the algorithm from falling into local optima. In addition, as an auxiliary method, the randomly shrinking exponential escape energy not only ensures strong exploration ability in the early iterations but also retains exploration capability in the later exploitation stage, effectively balancing the exploration and exploitation of the algorithm. The pseudo-code of the ADHHO algorithm is shown in Algorithm 4.

E. COMPUTATIONAL COMPLEXITY
To analyze the computational complexity of the proposed ADHHO algorithm and the original HHO algorithm, we calculate the complexity of both step by step, following the worst case. In the initialization stage, after initializing the positions of N individuals, ADHHO and HHO have the same computational complexity O(N). In the main loop of the HHO algorithm, the fitness of each search agent is evaluated first, so the computational complexity of this stage is O(T · N). Then, in the population update stage, the position vector of each search agent is updated, so the computational complexity of this stage is O(T · N · D). After comprehensive analysis, the computational complexity of HHO is calculated as follows:

O(HHO) = O(N) + O(T · N) + O(T · N · D) = O(T · N · D)        (14)

In the main loop of ADHHO, the proposed adaptive cooperative foraging strategy adaptively switches between the one-dimensional update operation and the total-dimensional update operation according to CF. In the one-dimensional update operation, only one dimension of the solution vector is changed, so the computational complexity is O(T · N). In the total-dimensional update operation, all dimensions of the solution vector are updated, so the computational complexity
is O(T · N · D). According to the worst-case complexity rule, the computational complexity of this stage is O(T · N · D). In the dispersed foraging strategy, the Harris hawk individuals that meet the dispersal condition are allocated to other areas of the search space; the total number of individuals whose positions are updated does not change, so the computational complexity is O(T · N · D). By comprehensive analysis, the computational complexity of ADHHO is calculated as follows:

O(ADHHO) = O(N) + O(T · N · D) + O(T · N · D) = O(T · N · D)        (15)

According to the above analysis, the ADHHO and HHO algorithms have the same computational complexity.

Algorithm 4 (fragment):
    Xi^(t+1) = Xi^t + µ × ∇i^t × Pi^t   % update Xi^(t+1) using the dispersed foraging strategy (Eq. (9))
26: Sort fitness and then update the location of the prey Xprey
27: The location of the prey Xprey is the final optimal solution
28: Iter = Iter + 1
29: end for
30: End

IV. EXPERIMENTAL RESULTS AND ANALYSIS
A. BENCHMARK FUNCTION
In this study, 20 benchmark functions, including unimodal [55], multimodal [56] and CEC 2014 benchmark functions [57], were selected to evaluate the performance of the ADHHO algorithm, and three experiments were carried out. The expressions, dimensions, search ranges, and theoretical optima of the benchmark functions are shown in TABLE 3. Among them, the unimodal benchmark functions (F1–F6) were selected to test the convergence rate and local exploitation ability of the algorithm. F7–F12 are multimodal benchmark functions; they contain multiple locally optimal solutions, whose number increases exponentially with the dimension, so they were used to evaluate the exploration ability and global optimization ability of the algorithm. F13–F20 are modern numerical optimization problems proposed in the IEEE CEC 2014 special session and competition on single-objective real-parameter numerical optimization [57]. These benchmark functions have shifted, rotated, expanded and combined the most complicated mathematical optimization problems
TABLE 3. Description of the unimodal, multimodal and CEC 2014 benchmark functions.
presented in the literature. Therefore, they are usually used to evaluate the comprehensive abilities of an algorithm.

B. PARAMETER SETTING
In each experimental test, the performance of the improved ADHHO algorithm was compared with classical SI algorithms and other improved versions of HHO, including the original HHO [34], PSO [11], FF [20], long-term memory Harris hawks optimization (LMHHO) [41] and dynamic Harris hawks optimization with mutation mechanism (DHHO/M) [35]. TABLE 4 lists the parameter settings of all algorithms. For the fairness of the experiment, the maximum iterations (T) and population size (N) were set to 1000 and 50, respectively, in the first two experiments (benchmark problems). In the third experiment (CEC 2014 problems), the maximum iterations (T) was set to 6000 to obtain 300000 function evaluations (NFEs). For each benchmark function, all algorithms were run 30 times independently. In addition,
TABLE 5. Statistical results obtained by PSO, FF, HHO, DHHO/M, LMHHO and ADHHO through 30 independent runs on functions F1 - F6 with unimodal
benchmark functions.
FIGURE 7. Comparison of the convergence curves of ADHHO and the other algorithms on functions F1 -F6 .
all the algorithms were implemented in MATLAB 2014a. All computations were run on an Intel Core i5-4200M CPU at 2.5 GHz with 8 GB RAM under the Windows 7 (64-bit) operating system.

C. EXPERIMENTAL RESULTS
1) EXPERIMENTAL TEST 1: UNIMODAL BENCHMARK FUNCTIONS
Unimodal functions have only one global optimum, and they are often used to evaluate the convergence rate and exploitation ability of algorithms. It can be seen from TABLE 5 that the proposed ADHHO algorithm achieves better results than the other algorithms on most unimodal functions. For example, on F1, F2, F3, F4 and F6, the average and standard deviation of ADHHO are better than those of LMHHO, DHHO/M, PSO and FF. For F5, the performance of the ADHHO algorithm is the same as that of LMHHO and HHO, but much better than the other algorithms. FIGURE 7 compares the convergence rates of the six algorithms; ADHHO converges faster than the other algorithms on most unimodal functions. Although the search steps of LMHHO and DHHO/M are improved compared with HHO, they still cannot exceed the proposed ADHHO. Given the characteristics of unimodal functions, this demonstrates that ADHHO has good exploitation ability.

2) EXPERIMENTAL TEST 2: MULTIMODAL BENCHMARK FUNCTIONS
The multimodal benchmark functions contain multiple local minimum points and are often used to evaluate the exploratory
TABLE 6. Statistical results obtained by PSO, FF, HHO, DHHO/M, LMHHO and ADHHO through 30 independent runs on functions F7 - F12 with multimodal
benchmark functions.
FIGURE 8. Comparison of the convergence curves of ADHHO and the other algorithms on functions F7 -F12 .
ability of the algorithm. It can be observed from TABLE 6 that the mean and standard deviation of ADHHO on F7, F8, F10, and F12 are clearly better than those of the other algorithms. It is worth noting that the ADHHO algorithm obtains the global optimum on F7 and F8, while the other algorithms cannot. On F11, the ADHHO algorithm ranks second among all algorithms. FIGURE 8 shows the convergence curves of all algorithms on the high-dimensional multimodal functions. From this figure, compared with the other algorithms, ADHHO has a better convergence rate and sufficient accuracy. The LMHHO algorithm defeats ADHHO only on F9. The experiment shows that the ADHHO algorithm still performs excellently in exploration ability, which is attributed to the high population diversity provided by the cooperative relationship of the Harris hawks and the dispersed foraging behavior.

3) EXPERIMENTAL TEST 3: CEC 2014 BENCHMARK FUNCTIONS
In this experiment, the CEC 2014 benchmark functions are selected to evaluate the comprehensive performance of ADHHO. From TABLE 7, it can be seen that among all the 8 CEC 2014 test functions, the performance of ADHHO on seven functions is better than that of the other algorithms. The average value of ADHHO is second only to the LMHHO algorithm on F14. Also, as can be seen from FIGURE 9, ADHHO has the best convergence performance among all
FIGURE 9. Comparison of the convergence curves of ADHHO and the other algorithms on functions F13 -F20 .
TABLE 7. Statistical results obtained by PSO, FF, HHO, DHHO/M, LMHHO and ADHHO through 30 independent runs on functions F13 - F20 with the CEC
2014 benchmark functions.
TABLE 8. Results of Wilcoxon’s test for ADHHO against the other five algorithms for 20 benchmark functions.
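The pairwise comparisons behind TABLE 8 follow the standard recipe: collect the 30 independent run results of two algorithms on one function and apply a rank-sum test at the 5% significance level. The sketch below uses the normal approximation of the Wilcoxon rank-sum statistic without tie correction, which is a simplification of the exact procedure; the synthetic arrays stand in for per-run best fitness values:

```python
import numpy as np
from math import erf, sqrt

def rank_sum_test(x, y):
    """Two-sided Wilcoxon rank-sum test via the normal approximation.

    Returns (z, p): the standardized rank-sum statistic of sample x and
    the two-sided p-value. No tie correction is applied.
    """
    n1, n2 = len(x), len(y)
    combined = np.concatenate([x, y])
    order = combined.argsort()
    ranks = np.empty(n1 + n2)
    ranks[order] = np.arange(1, n1 + n2 + 1)   # ranks 1..n over both samples
    W = ranks[:n1].sum()                       # rank sum of the first sample
    mu = n1 * (n1 + n2 + 1) / 2.0              # mean of W under H0
    sigma = sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (W - mu) / sigma
    p = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))
    return z, p
```

A p-value below 0.05 marks the difference between the two result samples as statistically significant, which is how the win/tie/loss entries of such tables are typically decided.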
the impact of three strategies on algorithm performance. Since the third strategy is auxiliary to the first two strategies, in short, a part of the first two strategies, we no longer compare the third strategy separately. Based on this, the improved algorithm (ADHHO), the HHO that only introduces adaptive cooperative foraging
TABLE 9. Average ranking of the six algorithms using MAE for 24 benchmark functions.
TABLE 10. Simulation comparison of improved strategies.
FIGURE 11. Comparison of the convergence curves of improved strategies on some benchmark functions.
g3(x) = −π x3² x4 − (4/3)π x3³ + 1296000 ≤ 0,
g4(x) = x4 − 240 ≤ 0,
Variable range: 0 ≤ x1 ≤ 99; 0 ≤ x2 ≤ 99; 10 ≤ x3 ≤ 200; 10 ≤ x4 ≤ 200.

The algorithms used to test this type of problem are the same as those in Section IV, with the parameter settings shown in TABLE 4. The optimal solutions and corresponding variable results are shown in TABLE 11. From this table, the optimization result of ADHHO is the best among all algorithms. Besides, to test the stability of the algorithm, the experimental results of

VI. CONCLUSION
This paper proposed three strategies to improve the global search performance of the HHO algorithm. First, an adaptive cooperative foraging strategy was introduced, and some Harris hawk individuals were randomly selected for cooperative foraging. Simultaneously, the method was embedded in the one-dimensional update operation, and one-dimensional or traditional total-dimensional cooperative foraging was selected adaptively. Second, a dispersed foraging strategy was introduced, forcing some Harris hawks to scatter into different search spaces to find potential prey. Finally, a randomly shrinking exponential function was used to simulate the energy decay process of the prey. Twenty benchmark functions, including unimodal, multimodal, and CEC 2014 benchmark functions, were used to test the proposed ADHHO algorithm's robustness. The algorithm's performance was compared with other advanced swarm
intelligence algorithms and other improved variants of the HHO algorithm. Wilcoxon's rank-sum test and MAE were used to analyze the experimental results. Statistical results show that ADHHO outperforms HHO and other advanced algorithms in solution quality, stability, and local optimum avoidance. Besides, this paper also applied the ADHHO algorithm to the design of pressure vessels. Experimental results have shown that ADHHO performs well in solving real-world optimization problems with unknown search spaces.

However, on some convergence curves, it can be seen that the convergence speed of ADHHO is slow. Therefore, in future research, it is necessary to study the convergence rate of the algorithm further. Also, this work only provides the basic framework of ADHHO for low-dimensional optimization problems, and the performance on high-dimensional optimization problems is not yet clear; it can be further extended to high-dimensional optimization problems in future research. Besides, we also intend to use the ADHHO algorithm to solve multi-objective optimization, constrained optimization, and NP-hard problems in future work.

REFERENCES
[1] M. N. Omidvar, X. Li, Y. Mei, and X. Yao, "Cooperative co-evolution with differential grouping for large scale optimization," IEEE Trans. Evol. Comput., vol. 18, no. 3, pp. 378–393, Jun. 2014.
[2] S. Bubeck, "Convex optimization: Algorithms and complexity," Found. Trends Mach. Learn., vol. 8, no. 4, pp. 231–357, May 2014. [Online]. Available: https://arxiv.org/abs/1405.4980
[3] Y. Zhong, L. Wang, M. Lin, and H. Zhang, "Discrete pigeon-inspired optimization algorithm with metropolis acceptance criterion for large-scale traveling salesman problem," Swarm Evol. Comput., vol. 48, pp. 134–144, Aug. 2019.
[4] J. Liang, W. Xu, C. Yue, K. Yu, H. Song, O. D. Crisalle, and B. Qu, "Multimodal multiobjective optimization with differential evolution," Swarm Evol. Comput., vol. 48, pp. 1028–1059, Feb. 2019.
[5] A. Slowik and H. Kwasnicka, "Nature inspired methods and their industry applications—Swarm intelligence algorithms," IEEE Trans. Ind. Informat., vol. 14, no. 3, pp. 1004–1015, Mar. 2018.
[6] M. Mavrovouniotis, C. Li, and S. Yang, "A survey of swarm intelligence for dynamic optimization: Algorithms and applications," Swarm Evol. Comput., vol. 33, pp. 1–17, Apr. 2017.
[7] H. Li, F. He, Y. Liang, and Q. Quan, "A dividing-based many-objective evolutionary algorithm for large-scale feature selection," Soft Comput., vol. 24, no. 9, pp. 6851–6870, May 2020.
[8] N. Hou, F. He, Y. Zhou, and Y. Chen, "An efficient GPU-based parallel tabu search algorithm for hardware/software co-design," Frontiers Comput. Sci., vol. 14, no. 5, Mar. 2020, Art. no. 145316.
[9] M. Le Quiniou, P. Mandel, and L. Monier, "Optimization of drinking water and sewer hydraulic management: Coupling of a genetic algorithm and two network hydraulic tools," Procedia Eng., vol. 89, pp. 710–718, Jan. 2014.
[10] Z. Sai, Y. Fan, T. Yuliang, X. Lei, and Z. Yifong, "Optimized algorithm of sensor node deployment for intelligent agricultural monitoring," Comput. Electron. Agricult., vol. 127, pp. 76–86, Sep. 2016.
[11] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proc. Int. Conf. Neural Netw. (ICNN), Perth, WA, Australia, Nov./Dec. 1995, pp. 1942–1948.
[12] B. Yan, Z. Zhao, Y. Zhou, W. Yuan, J. Li, J. Wu, and D. Cheng, "A particle swarm optimization algorithm with random learning mechanism and levy flight for optimization of atomic clusters," Comput. Phys. Commun., vol. 219, pp. 79–86, Oct. 2017.
[13] K. Z. Gao, P. N. Suganthan, T. J. Chua, C. S. Chong, T. X. Cai, and Q. K. Pan, "A two-stage artificial bee colony algorithm scheduling flexible
[15] H. Wang, F. Guo, H. Yao, S. He, and X. Xu, "Collision avoidance planning method of USV based on improved ant colony optimization algorithm," IEEE Access, vol. 7, pp. 52964–52975, Mar. 2019.
[16] X. S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proc. World Congr. Nature Biol. Inspired Comput., Dec. 2009, pp. 210–214.
[17] L. Wang, Y. Zhong, and Y. Yin, "Nearest neighbour cuckoo search algorithm with probabilistic mutation," Appl. Soft Comput., vol. 49, pp. 498–509, Dec. 2016.
[18] W.-T. Pan, "A new fruit fly optimization algorithm: Taking the financial distress model as an example," Knowl.-Based Syst., vol. 26, pp. 69–74, Feb. 2012.
[19] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Adv. Eng. Softw., vol. 69, pp. 46–61, Mar. 2014.
[20] X. S. Yang, "Firefly algorithm, stochastic test functions and design optimisation," Int. J. Bio-Inspired Comput., vol. 2, no. 2, pp. 78–84, Mar. 2010. [Online]. Available: http://arxiv.org/abs/1003.1409
[21] S. Mirjalili, "Dragonfly algorithm: A new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems," Neural Comput. Appl., vol. 27, no. 4, pp. 1053–1073, May 2016.
[22] M. Jain, V. Singh, and A. Rani, "A novel nature-inspired algorithm for optimization: Squirrel search algorithm," Swarm Evol. Comput., vol. 44, pp. 148–175, Feb. 2019.
[23] X. Zhang, K. Zhao, L. Wang, Y. Wang, and Y. Niu, "An improved squirrel search algorithm with reproductive behavior," IEEE Access, vol. 8, pp. 101118–101132, May 2020.
[24] F. A. Hashim, E. H. Houssein, M. S. Mabrouk, W. Al-Atabany, and S. Mirjalili, "Henry gas solubility optimization: A novel physics-based algorithm," Future Gener. Comput. Syst., vol. 101, pp. 646–667, Dec. 2019.
[25] S. Li, H. Chen, M. Wang, A. A. Heidari, and S. Mirjalili, "Slime mould algorithm: A new method for stochastic optimization," Future Gener. Comput. Syst., vol. 111, pp. 300–323, Oct. 2020.
[26] A. Faramarzi, M. Heidarinejad, B. Stephens, and S. Mirjalili, "Equilibrium optimizer: A novel optimization algorithm," Knowl.-Based Syst., vol. 191, Mar. 2020, Art. no. 105190.
[27] Z. Meng, J.-S. Pan, and H. Xu, "QUasi-affine transformation evolutionary (QUATRE) algorithm: A cooperative swarm based algorithm for global optimization," Knowl.-Based Syst., vol. 109, pp. 104–121, Oct. 2016.
[28] M. Abdel-Basset, V. Chang, and R. Mohamed, "A novel equilibrium optimization algorithm for multi-thresholding image segmentation problems," Neural Comput. Appl., Mar. 2020, pp. 1–34, doi: 10.1007/s00521-020-04820-y.
[29] H. Özkaya, M. Yıldız, A. R. Yıldız, S. Bureerat, B. S. Yıldız, and S. M. Sait, "The equilibrium optimization algorithm and the response surface-based metamodel for optimal structural design of vehicle components," Mater. Test., vol. 62, no. 5, pp. 492–496, May 2020.
[30] J.-S. Pan, Z. Meng, H. Xu, and X. Li, "Quasi-affine transformation evolution (QUATRE) algorithm: A new simple and accurate structure for global optimization," in Proc. Int. Conf. Ind., Eng. Appl. Appl. Intell. Syst., Morioka, Japan, Aug. 2016, pp. 657–667.
[31] Z. Meng and J.-S. Pan, "QUasi-affine TRansformation evolution with external ARchive (QUATRE-EAR): An enhanced structure for differential evolution," Knowl.-Based Syst., vol. 155, pp. 35–53, Sep. 2018.
[32] K. Matsumoto, F. Ren, M. Matsuoka, M. Yoshida, and K. Kita, "Slang feature extraction by analysing topic change on social media," CAAI Trans. Intell. Technol., vol. 4, no. 1, pp. 64–71, Mar. 2019.
[33] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Trans. Evol. Comput., vol. 1, no. 1, pp. 67–82, Apr. 1997.
[34] A. A. Heidari, S. Mirjalili, H. Faris, I. Aljarah, M. Mafarja, and H. Chen, "Harris hawks optimization: Algorithm and applications," Future Gener. Comput. Syst., vol. 97, pp. 849–872, Aug. 2019.
[35] H. Jia, C. Lang, D. Oliva, W. Song, and X. Peng, "Dynamic Harris hawks optimization with mutation mechanism for satellite image segmentation," Remote Sens., vol. 11, no. 12, p. 1421, Jun. 2019.
[36] A. R. Yıldız, B. S. Yıldız, S. M. Sait, and X. Li, "The Harris hawks, grasshopper and multi-verse optimization algorithms for the selection of optimal machining parameters in manufacturing operations," Mater. Test., vol. 61, no. 8, pp. 725–733, Aug. 2019.
job-shop scheduling problem with new job insertion,’’ Expert Syst. Appl., [37] N. Amiri Golilarz, H. Gao, and H. Demirel, ‘‘Satellite image de-noising
vol. 42, no. 21, pp. 7652–7663, Nov. 2015. with Harris hawks meta heuristic optimization algorithm and improved
[14] M. Dorigo, M. Birattari, and T. Stutzle, ‘‘Ant colony optimization,’’ IEEE adaptive generalized Gaussian distribution threshold function,’’ IEEE
Comput. Intell. Mag., vol. 1, no. 4, pp. 28–39, Nov. 2006. Access, vol. 7, pp. 57459–57468, Apr. 2019.
[38] H. Chen, S. Jiao, M. Wang, A. A. Heidari, and X. Zhao, ‘‘Parameters iden- [49] F. Zhong, H. Li, and S. Zhong, ‘‘A modified ABC algorithm based
tification of photovoltaic cells and modules using diversification-enriched on improved-global-best-guided approach and adaptive-limit strategy
Harris hawks optimization with chaotic drifts,’’ J. Cleaner Prod., vol. 244, for global optimization,’’ Appl. Soft Comput., vol. 46, pp. 469–486,
Jan. 2020, Art. no. 118778. Sep. 2016.
[39] E. H. Houssein, M. R. Saad, K. Hussain, W. Zhu, H. Shaban, and [50] M. S. Kiran, H. Hakli, M. Gunduz, and H. Uguz, ‘‘Artificial bee colony
M. Hassaballah, ‘‘Optimal sink node placement in large scale wireless algorithm with variable search strategy for continuous optimization,’’ Inf.
sensor networks based on Harris’ hawk optimization algorithm,’’ IEEE Sci., vol. 300, pp. 140–157, Apr. 2015.
Access, vol. 8, pp. 19381–19397, Jan. 2020. [51] D. Karaboga and B. Akay, ‘‘A comparative study of artificial bee
[40] M. S. Mehta, M. B. Singh, and M. Gagandeep, ‘‘Harris Hawks optimization colony algorithm,’’ Appl. Math. Comput., vol. 214, no. 1, pp. 108–132,
for solving optimum load dispatch problem in power system,’’ Int. J. Eng. Aug. 2009.
Res. Technol., vol. 8, no. 6, pp. 962–968, 2019. [52] J. C. Bednarz, ‘‘Cooperative hunting Harris’ hawks (parabuteo unicinc-
[41] K. Hussain, W. Zhu, and M. N. M. Salleh, ‘‘Long-term memory Harris’ tus),’’ Science, vol. 239, no. 4847, pp. 1525–1527, Mar. 1988.
hawk optimization for high dimensional and optimal power flow prob- [53] X. Wang and X. Zou, ‘‘Modeling the fear effect in predator–prey inter-
lems,’’ IEEE Access, vol. 7, pp. 147596–147616, Oct. 2019. actions with adaptive avoidance of predators,’’ Bull. Math. Biol., vol. 79,
[42] Q. Fan, Z. Chen, and Z. Xia, ‘‘A novel quasi-reflected Harris hawks no. 6, pp. 1325–1359, Jun. 2017.
optimization algorithm for global optimization problems,’’ Soft Comput., [54] M. Moustafa, M. H. Mohd, A. I. Ismail, and F. A. Abdullah, ‘‘Dynam-
vol. 10, no. 2, pp. 1–19, Mar. 2020. ical analysis of a fractional-order Rosenzweig–MacArthur model incor-
[43] V. K. Kamboj, A. Nandi, A. Bhadoria, and S. Sehgal, ‘‘An intensify Harris porating a prey refuge,’’ Chaos, Solitons Fractals, vol. 109, pp. 1–13,
hawks optimizer for numerical and engineering optimization problems,’’ Apr. 2018.
Appl. Soft Comput., vol. 89, Apr. 2020, Art. no. 106018. [55] M. Jamil and X. S. Yang, ‘‘A literature survey of benchmark func-
[44] E. Kurtuluş, A. R. Yıldız, S. M. Sait, and S. Bureerat, ‘‘A novel hybrid tions for global optimization problems,’’ J. Math. Model. Numer. Optim.,
Harris hawks-simulated annealing algorithm and RBF-based metamodel vol. 4, no. 2, pp. 150–194, Aug. 2013. [Online]. Available: http://arxiv.org/
for design optimization of highway guardrails,’’ Mater. Test., vol. 62, no. 3, abs/1308.4008
pp. 251–260, Mar. 2020. [56] F. Wang, H. Zhang, K. Li, Z. Lin, J. Yang, and X.-L. Shen, ‘‘A hybrid
[45] H. Moayedi, M. M. Abdullahi, H. Nguyen, and A. S. A. Rashid, ‘‘Com- particle swarm optimization algorithm using adaptive learning strategy,’’
parison of dragonfly algorithm and Harris hawks optimization evolutionary Inf. Sci., vols. 436–437, pp. 162–177, Apr. 2018.
data mining techniques for the assessment of bearing capacity of footings [57] J. J. Liang, B. Y. Qu, and P. N. Suganthan, ‘‘Problem definitions and eval-
over two-layer foundation soils,’’ Eng. Comput., pp. 1–11, Aug. 2019, doi: uation criteria for the CEC 2014 special session and competition on single
10.1007/s00366-019-00834-w. objective real-parameter numerical optimization,’’ Comput. Intell. Lab.,
[46] A. S. Menesy, H. M. Sultan, A. Selim, M. G. Ashmawy, and S. Kamel, Zhengzhou Univ., Zhengzhou, China, Nanyang Technol. Univ., Singapore,
‘‘Developing and applying chaotic Harris hawks optimization technique Tech. Rep. 201311, 2013.
for extracting parameters of several proton exchange membrane fuel cell [58] S. García, D. Molina, M. Lozano, and F. Herrera, ‘‘A study on the use of
stacks,’’ IEEE Access, vol. 8, pp. 1146–1159, Dec. 2020. non-parametric tests for analyzing the evolutionary algorithms’ behavior:
[47] D. Yousri, D. Allam, and M. B. Eteiba, ‘‘Optimal photovoltaic array A case study on the CEC’2005 special session on real parameter optimiza-
reconfiguration for alleviating the partial shading influence based on a tion,’’ J. Heuristics, vol. 15, no. 6, p. 617, May 2008.
modified Harris hawks optimizer,’’ Energy Convers. Manage., vol. 206, [59] A. Kaveh and V. R. Mahdavi, ‘‘Colliding bodies optimization: A novel
Feb. 2020, Art. no. 112470. meta-heuristic method,’’ Comput. Struct., vol. 139, pp. 18–27, Jul. 2014.
[48] H. M. Ridha, A. A. Heidari, M. Wang, and H. Chen, ‘‘Boosted mutation- [60] A. Kaveh, T. Bakhshpoori, and E. Afshari, ‘‘An efficient hybrid particle
based Harris hawks optimizer for parameters identification of single- swarm and swallow swarm optimization algorithm,’’ Comput. Struct.,
diode solar cell models,’’ Energy Convers. Manage., vol. 209, Apr. 2020, vol. 143, pp. 40–59, Sep. 2014.
Art. no. 112660.