Review
Nature-Inspired Algorithms from Oceans to Space: A
Comprehensive Review of Heuristic and Meta-Heuristic
Optimization Algorithms and Their Potential Applications
in Drones
Shahin Darvishpoor 1, Amirsalar Darvishpour 2, Mario Escarcega 3 and Mostafa Hassanalian 3,*
1 Department of Aerospace Engineering, K.N. Toosi University of Technology, Tehran 16569-83911, Iran;
darvishpoor@email.kntu.ac.ir
2 Department of Computer Engineering, University of Tehran, Tehran 14179-35840, Iran; salar.darvish@ut.ac.ir
3 Department of Mechanical Engineering, New Mexico Tech, Socorro, NM 87801, USA;
mario.escarcega@student.nmt.edu
* Correspondence: mostafa.hassanalian@nmt.edu
Abstract: This paper reviews a majority of the nature-inspired algorithms, including heuristic and
meta-heuristic bio-inspired and non-bio-inspired algorithms, focusing on their source of inspiration
and studying their potential applications in drones. About 350 algorithms have been studied, and
a comprehensive classification is introduced based on the sources of inspiration, including bio-
based, ecosystem-based, social-based, physics-based, chemistry-based, mathematics-based, music-
based, sport-based, and hybrid algorithms. The performance of 21 selected algorithms considering
calculation time, max iterations, error, and the cost function is compared by solving 10 benchmark
functions of different types. A review of the applications of nature-inspired algorithms
in aerospace engineering is provided, which gives a general view of the optimization problems
currently found in drones and of potential algorithms for solving them.
Keywords: bio-inspired; drones; heuristics; meta-heuristics; nature-inspired; optimization

Citation: Darvishpoor, S.; Darvishpour, A.; Escarcega, M.; Hassanalian, M. Nature-Inspired
Algorithms from Oceans to Space: A Comprehensive Review of Heuristic and Meta-Heuristic
Optimization Algorithms and Their Potential Applications in Drones. Drones 2023, 7, 427.
https://doi.org/10.3390/drones7070427

Academic Editor: Diego González-Aguilera
Received: 15 May 2023; Revised: 17 June 2023; Accepted: 25 June 2023; Published: 27 June 2023
Copyright: © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open
access article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (https://creativecommons.org/licenses/by/4.0/).

1. Introduction

Optimization is a practical and essential part of engineering and science with an
increasing number of applications [1]. Many researchers around the world are working on
the development of optimization methods. Among these different optimization algorithms,
nature-inspired or bio-inspired algorithms are prevalent due to their excellent performance
and simplicity [2]. Since their inception, nature-inspired algorithms have experienced
exponential growth. Hundreds of animals, insects, and natural phenomena have been
used as a source of inspiration for developing optimization algorithms [3]. Researchers
have developed algorithms based on underwater, terrestrial, and flying animals. This
list includes natural phenomena such as rain, the water cycle, hurricanes, and even stars
and galaxies. In addition, many behaviors of humans and animals and other topics such
as music and sports have also been used to develop optimization algorithms. There are
also hybrid algorithms that combine other nature-inspired algorithms. About
100 different species of animals, insects, plants, and micro- and nano-organisms have been
used so far to develop optimization algorithms. Based on the current paper, there are at
least 350 different nature-inspired algorithms in various categories. Figure 1 symbolically
illustrates a small portion of the sources of inspiration in nature-inspired algorithms, from
oceans to space.
Figure 1. The sources of inspiration in nature-inspired algorithms vary from underwater to space.
Some similar research can be found in the literature on the latest progress of nature-inspired
optimization and the challenges in this field. Molina et al. have proposed comprehensive
taxonomies of nature-inspired optimization algorithms. They have focused on the source of
inspiration in these algorithms and have shown the similarity of a big group of algorithms
with classic approaches regarding their core computation process. However, their research
lacks a performance analysis for these algorithms [4]. Discussions on the novelty and
importance of nature-inspired optimization algorithms are still going on; some researchers
see no value in these algorithms, while others believe the production of new methods should
be stopped and the majority of efforts should be dedicated to more promising research
directions in the meta-heuristic literature. While they confirm the power of nature-inspired
optimization, they believe only a few algorithms can really be used for solving problems
with high accuracy in a short time [5]. Evaluating the performance of these algorithms in
solving different problems therefore seems to be necessary research that has not been studied
much. The next section will review similar review papers on nature-inspired optimization
and their efforts to manage and classify these algorithms for better understanding and study.
In this paper, we try to study the latest developments in nature-inspired optimization
and challenges in this field. First of all, we classified all of these algorithms based on
their source of inspiration and provided a comprehensive classification. Considering other
classifications in similar papers, we have tried to provide a more comprehensive classi-
fication with detailed sub-categories. In each category, the most popular algorithms are
studied in detail to provide a good view of their challenges and benefits. Next, in Section 3,
the performance of a group of selected algorithms is evaluated in solving 10 different
problems. Critical parameters such as mean iterations, average computation time, and
mean error are calculated by solving each problem 500 times. A database of sample codes
for nature-inspired algorithms is also provided. In addition to this, all of the project’s
codes are provided in the GitHub repositories. Most algorithms in this paper have been
extracted from a recent work by Tzanetos et al., in addition to classical algorithms and
newly developed ones [6]. In the last section, we have studied the different applications of
nature-inspired algorithms in drones and aerospace systems. After providing a classifica-
tion of different applications, we have studied the papers published in this area to find out
which algorithms are being used most. This study reveals: (1) the areas in drones in which
Figure 3. Classification of optimization methods based on Muller [8].
Nature-inspired algorithms have many parts in common with the well-known field of
meta-heuristic algorithms. Although some researchers consider nature-inspired algorithms
to be a sub-category of meta-heuristic algorithms, they contain a few different algorithms,
such as math-inspired algorithms, which are considered non-nature-inspired algorithms.
Much research has been conducted on meta-heuristic algorithms, including work by
Osman [9], Gendreau et al. [10], Fister [2], and others [11]. Abdel-basset et al. have studied
different reviews on meta-heuristic algorithms. Based on their work, there are some popular
classifications of meta-heuristic algorithms. One of them is trajectory-based versus
population-based: in a trajectory-based algorithm, a solution is considered at first, and in
each iteration, the best solution is replaced by a new, better solution, while a population-based
algorithm starts with a random population of solutions, and this population is refined through
each search iteration. Some researchers have also classified meta-heuristic algorithms based
on the usage of memory.

Another popular classification is based on being nature-inspired or not. Nature-inspired
algorithms are divided into swarm-intelligence-based, non-swarm, and physics/chemistry-based
algorithms in these classifications. Ruiz-Vanoye et al. have also classified meta-heuristic
algorithms based on animal groups. Based on them, meta-heuristic algorithms can be divided
into swarm, school, flock, and herd algorithms [11]. Abdel-basset et al. have also introduced
a new classification of meta-heuristic algorithms based on their inspiration type into
metaphor-based and non-metaphor-based algorithms. Unlike non-metaphor-based algorithms,
metaphor-based algorithms simulate a natural phenomenon, human or animal behavior, or
even mathematics. Abdel-basset et al. have divided metaphor-based algorithms into
biology-based (evolutionary, swarm intelligence, and artificial immune systems), physics-based,
swarm-based, social-based, music-based, chemistry-based, sport-based, and math-based. They
did not introduce any sub-category for the non-metaphor-based category, but it includes
algorithms such as Iterated Local Search (ILS), Variable Neighborhood Search (VNS), Greedy
Randomized Adaptive Search Procedure (GRASP), and Partial Optimization Meta-Heuristic
Under Special Intensification Condition (POPMUSIC). A similar classification is used by
Espinosa, who classified nature-inspired methods as biological, chemical, or physical [12].
Figure 4 illustrates the classification of meta-heuristic algorithms studied by Abdel-basset et al.

This paper introduces a classification based on the inspiration of algorithms to study
nature-based algorithms. This classification is an extended version of the classification
of Abdel-basset et al. There are nine main categories: bio-based, ecosystem-based, social-based,
physics-based, chemistry-based, music-based, sport-based, hybrid, and math-based.
Math-based algorithms are considered nature-inspired algorithms, although they are
not necessarily nature-based. The biology-based category is divided into 10 categories:
evolution-based, organ-based, behavior-based, microorganism-based, insect-based,
avian-animal-based, aquatic-based, terrestrial-animal-based, and plant-based, among others.
Figure 5. Inspiration-based classification of nature-inspired algorithms.

The most popular category is bio-based, which contains 66% of the total algorithms.
Second place belongs to physics-based algorithms at 15.6%. Figure 6 illustrates the share of
each category from the total number (about 360) of algorithms studied in this paper.

Figure 6. The share of each category from the total number of nature-inspired algorithms.
Figure 7 illustrates the divisions of each category from the total number of nature-
inspired algorithms, including subcategories of the bio-based category.
Figure 7. The share of each category from the total number of nature-inspired algorithms.
Another factor that shows the popularity of an algorithm is the count of citations of
the papers in each category. Although the number of citations is not an accurate factor,
and more citations do not necessarily mean more applications, it shows the greater number
of developments and research based on each algorithm or category. This factor approximately
shows the applicability of the algorithms. For each category, we have measured the number
of citations for the first publication introducing each algorithm, based on Google Scholar
metrics. Figure 8 illustrates a comparison of the total citations for each category.
Figure 8. Number of citations for each category.
Based on Figure 8, bio-based and physics-based algorithms are also the most popular
and applicable algorithms considering the total citations, but music-based algorithms are
more popular than math-based or ecosystem-based algorithms, while there are fewer
music-based algorithms.
The following sections study different nature-inspired algorithms based on the above
classification. There are about 360 different algorithms classified into the above-mentioned
categories. In each category, a number of the most popular algorithms are studied in
detail, and the rest are just briefly mentioned. The Git repository of this research contains
the sample codes of a group of algorithms in MATLAB, Python, or C/C#/C++. In the
absence of sample codes, the pseudocode or flowchart of the algorithms is included
(https://github.com/shahind/Nature-Inspired-Algorithms, accessed on 20 June 2023).
2.1. Bio-Based
Bio-based algorithms are generally inspired by living species such as animals, humans,
insects, etc. While the majority of these algorithms are inspired by animals and insects, some
of them are developed based on processes, organs, or behavior of animals in general. About
80% of the bio-based algorithms which are studied in this research (about 230 algorithms)
are based on living species. Among the living species-based algorithms, the most popular
sub-category is terrestrial animal-based algorithms with about 19.7% of all algorithms; the
second place belongs to insect-based algorithms with about 18% of bio-based algorithms;
and the next most popular subcategories are the aquatic-based, avian-animal-based, and
plant-based algorithms. Figure 9 illustrates the distribution of each sub-category from the
total number of bio-based algorithms.
A comparison of the citations of each bio-based sub-category reveals other results.
Based on Figure 10, evolutionary algorithms are 3 times more popular than insect-based
algorithms and 13 times more popular than terrestrial-animal-based algorithms, while they
are about 9 times fewer in number. Of course, the citation count is not an accurate factor
in calculating the popularity of an algorithm because some have been introduced recently.
Nevertheless, almost all sub-categories have algorithms from the 1900s to recent times.
Almost 100 different species of animals, insects, plants, and micro- and nano-organisms
can be found among the bio-based algorithms. Figure 11 illustrates the different species
and organisms by which the algorithms studied in this paper are inspired.
Figure 10. Comparison of total citations of bio-based sub-categories.

Figure 11. Different species in bio-based algorithms and their share of the total number of algorithms.
2.1.1. Evolution-Based

There are less than 10 main evolution-based algorithms among the bio-based algorithms,
but as mentioned before, they are the most popular and most used algorithms compared to
other categories. Among the evolutionary algorithms, the genetic algorithm (GA) is the
most well-known.

Figure 13. Flowchart of the GA [15].
Multiple types of crossover, mutation, and even selection operators have been used
by researchers so far. The most used crossover functions are single-point, two-point (or
k-point), and uniform crossovers; moreover, among mutation functions, we can mention
uniform, Gaussian, bit, and flip-bit mutation. It is also possible to use various selection
functions in GA; common functions are tournament, uniform, and rank selections [16–18].
Figure 14 illustrates some common genetic operators.
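As an illustration only (this sketch is ours, not code from the paper's repository), the operator families mentioned above — tournament selection, single-point crossover, and flip-bit mutation — can be combined into a minimal GA loop; all parameter values below are arbitrary choices:

```python
import random

def single_point_crossover(a, b):
    """Cut both parents at one random point and swap the tails."""
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:], b[:point] + a[point:]

def flip_bit_mutation(bits, rate=0.02):
    """Flip each bit independently with a small probability."""
    return [1 - g if random.random() < rate else g for g in bits]

def tournament_selection(population, fitness_fn, k=3):
    """Return the fittest of k randomly chosen individuals."""
    return max(random.sample(population, k), key=fitness_fn)

def genetic_algorithm(fitness_fn, n_bits=20, pop_size=30, generations=50):
    """Minimal GA loop combining the three operators above."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness_fn)
    for _ in range(generations):
        children = []
        while len(children) < pop_size:
            p1 = tournament_selection(pop, fitness_fn)
            p2 = tournament_selection(pop, fitness_fn)
            c1, c2 = single_point_crossover(p1, p2)
            children += [flip_bit_mutation(c1), flip_bit_mutation(c2)]
        pop = children[:pop_size]
        best = max(pop + [best], key=fitness_fn)  # keep the best-so-far
    return best
```

For example, `genetic_algorithm(sum)` evolves a 20-bit string toward the one-max optimum (all ones), a standard toy fitness function used here purely for illustration.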
Figure 14. Different genetic operators [16–18].

The genetic algorithm is one of the most popular nature-inspired algorithms. About
70% of the total citations of the evolution-based algorithms belong to genetic algorithms.
Numerous improved versions of the genetic algorithm have been introduced by researchers.
Genetic algorithms have been used in a wide range of applications, including path planning,
image processing, and optimal control. They are effective algorithms for finding the
global optimum solution for many other problems [14]. GA has been used in numerous
applications, including different fields of engineering, artificial intelligence and computer
science, finance, social sciences, multimedia, and networks [18]. Numerous academics from
a variety of disciplines are concentrating on the development of feasible strategies based
on GA. Various businesses are also trying to develop commercial products with the aid of
GA [19].
Table 1. Common mutation strategies of DE.

Function                  Formula
DE/rand/1                 v_i = x_i + F(x_j − x_k)
DE/best/1                 v_i = x_best + F(x_i − x_j)
DE/rand/2                 v_i = x_i + F(x_j − x_k) + F(x_l − x_m)
DE/best/2                 v_i = x_best + F(x_i − x_j) + F(x_k − x_l)
DE/current-to-best/1      v_i = x_i + F(x_best − x_i) + F(x_j − x_k)
DE/current-to-rand/1      v_i = x_i + rand(x_j − x_i) + F(x_k − x_l)
Population size determines the ability of the algorithm to explore the search space. In
problems with a large number of dimensions, the parameter n should be large to provide
the capability of searching the multi-dimensional design space. Small values of F lead
to small mutation step sizes and result in longer convergence time, while large values
of F decrease the exploration time but can lead to the overshooting of good optima. The
crossover probability CR controls the number of changing elements. Larger values of CR
result in more variation in the new population [21].
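To make the roles of the population size, the scale factor F, and the crossover probability CR concrete, here is a minimal sketch (ours, not the paper's code) of one DE generation using the common DE/rand/1 mutation with binomial crossover and greedy selection:

```python
import random

def de_step(population, objective, F=0.8, CR=0.9):
    """One generation of DE/rand/1 with binomial crossover (illustrative sketch)."""
    dim = len(population[0])
    new_pop = []
    for i, x in enumerate(population):
        # Mutation: combine three mutually distinct random vectors (DE/rand/1).
        j, k, l = random.sample([n for n in range(len(population)) if n != i], 3)
        v = [population[j][d] + F * (population[k][d] - population[l][d])
             for d in range(dim)]
        # Binomial crossover: take each trial element with probability CR;
        # jrand guarantees at least one element comes from the mutant vector.
        jrand = random.randrange(dim)
        u = [v[d] if (random.random() < CR or d == jrand) else x[d]
             for d in range(dim)]
        # Greedy selection: keep the better of target and trial (minimization).
        new_pop.append(u if objective(u) <= objective(x) else x)
    return new_pop
```

Because selection is greedy, the best objective value in the population can never worsen from one generation to the next, which is one reason standard DE is so robust despite its small number of control parameters.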
Notably, although DE is an evolutionary algorithm, it lacks a real natural paradigm
and is not an exact replica of natural evolution, unlike other evolutionary algorithms. DE
has demonstrated outstanding performance in a wide range of optimization problems
from diverse scientific domains, including constrained and multi-objective optimization
problems [23]. It belongs to the stochastic population-based evolutionary group and, like
other evolutionary algorithms, uses a population of candidate solutions and stochastic
mutation, crossover, and selection operators to move the population toward superior
solutions in the design space. The key advantage of standard DE is that it requires the
adjustment of only one control parameter, although it has two other control parameters.
The performance of DE in a certain optimization problem is highly dependent on both
the trial vector generation scheme and the chosen control parameters [21]. As is clear in
Table 1—although the main version of DE uses three candidate solutions—firstly, different
mutation functions may use more population, and secondly, the main process can be done
x′_i = x_i + σ_i · z_i    (1)

σ_i = β_i F(x) + γ_i    (2)

where the fitness value F(x) is the objective function, which is scaled to positive values
using function G.
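A minimal sketch of this mutation step (our illustrative code; the `beta` and `gamma` values are arbitrary, and the objective is assumed to be already scaled to positive values) could look like:

```python
import random

def ep_mutate(x, objective, beta=0.1, gamma=0.01):
    """EP mutation (Equation (1)): Gaussian perturbation of each component,
    with a standard deviation sigma that scales linearly with the fitness
    value of the parent (Equation (2))."""
    sigma = beta * objective(x) + gamma
    return [xi + sigma * random.gauss(0.0, 1.0) for xi in x]
```

The linear coupling means that solutions far from the optimum (large F(x)) take large exploratory steps, while near-optimal solutions are perturbed only slightly.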
Selecting appropriate values for the parameters β_i and γ_i could be challenging in
high-dimensional objective functions. Some research was done on solving this problem,
including meta-EP that, like evolution strategies, self-adapts the required variables [29].
Unlike GA, EP does not use crossover; however, it may combine candidate solutions
with other methods. It is also based on a continuous representation of candidates instead
of GA's binary representation. Figure 15 compares the main features of standard forms of
GA and EP.

Figure 15. Comparison between main features of standard forms of GA and EP [30].
Other Algorithms

There are other evolution-based algorithms, such as evolutionary strategies (ES),
which is a popular algorithm developed by Rechenberg in 1973 that uses mutation,
recombination, and selection applied to a population of individuals [31]. Gene expression
programming (GEP) is another well-known genotype/phenotype genetic algorithm introduced
by Ferreira in 2001 which employs character linear chromosomes made of genes structurally
organized in a head and tail [32]. The memetic algorithm (MA), which is considered an
extension of GA, was first introduced by Moscato in 1989 [33]. Grammatical evolution
(GE) is also a genetic algorithm, introduced by Ryan et al., which uses a variable-length
linear genome to determine how a Backus–Naur form grammar definition is mapped to an
expression or program of arbitrary complexity [34].
2.1.2. Organ-Based
Some nature-inspired algorithms are based on the internal organs of humans or the
bodies of other animals. There are different algorithms inspired by the immune system,
kidney, heart, neural system, coronary circulation system, and so on. Artificial neural
networks (ANN) are probably the most popular organ-based algorithm. ANNs are a class
of machine-learning algorithms that are trained to learn the relation between some input
and output data. This class of learning algorithms was so popular in the last decades that
many versions of them have been developed, including conventional neural networks,
deep neural networks, and Bayesian neural networks. The study of ANNs and machine-
learning algorithms is the subject of another study. In this paper, learning algorithms are
ignored. In Figure 16, some of the popular organ-based algorithms are presented.
Figure 17. An antibody and a B-lymphocyte with antibodies on its surface [35].
B-lymphocytes are the cells responsible for the production of antibodies. On the
surface of each cell are around 10^5 antibodies with identical paratopes, which serve as
sensors to detect the presence of an epitope to which this antibody type can respond. When
the correct epitope is identified, the lymphocyte is driven to create more lymphocytes
(clone) and to secrete free antibodies.
Clonal selection is the process of amplifying only those cells that produce a desirable
antibody type. The diversity of the immune system is maintained by the daily replacement
of around five percent of the B-lymphocytes with newly produced lymphocytes in the bone
marrow. As cells are produced in the bone marrow, they will produce various antibodies.
In addition to the generation of new cells in the bone marrow, the reproduction of B-
lymphocytes stimulated by the recognition of an epitope generates additional diversity.
During this process, it is believed that the mutation rate for antibody genes is substantially
greater than that of non-antibody genes [35].
AIS implements genetic operators (such as inversion, point mutation, and crossover)
on the epitope and paratope strings to mimic the reproduction of real lymphocytes. Inversion
is simulated by inverting a segment of the string randomly. Point mutation is simulated by
changing a bit in a string randomly. Crossover is simulated by interchanging two randomly
selected pieces of two antibody types to create two entirely new antibodies [35]. In this
terminology, an antibody cell represents the candidate solutions, and the ability of the
cell to recognize the input pattern, or alternatively, the affinity function, represents the
cost function.
AIS has a variety of applications in different fields of engineering and science. Research
shows that it has been used in computer security, antivirus software, anomaly detection,
fault diagnosis, pattern recognition, and data analysis. It is one of the widely used
algorithms in optimization [36]. It can also be parallelized for faster computation [37].
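A minimal sketch of the three string operators described above, applied to Python bit strings; the fixed string lengths and uniform random choices are illustrative assumptions:

```python
import random

def inversion(bits):
    # Invert (reverse) a randomly chosen segment of the string.
    i, j = sorted(random.sample(range(len(bits) + 1), 2))
    return bits[:i] + bits[i:j][::-1] + bits[j:]

def point_mutation(bits):
    # Flip one randomly chosen bit.
    i = random.randrange(len(bits))
    return bits[:i] + ('1' if bits[i] == '0' else '0') + bits[i + 1:]

def crossover(a, b):
    # Interchange the tails of two antibody strings at a random cut point,
    # producing two entirely new antibodies.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:], b[:cut] + a[cut:]
```

For example, `crossover('00000000', '11111111')` with a cut point of 3 would yield `('00011111', '11100000')`; all three operators preserve string length.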
Clonal Selection Algorithm
The immune system finds the response features to an antigen stimulus event via the
clonal selection algorithm (CSA). Only cells that recognize the antigens are allowed to
multiply. The cells are matured based on their affinity for selective antigens. Complicated
learning algorithms, such as multimodal optimizations and pattern recognition, were
solved using a clonal-selection-based computational system developed by Castro et al. [38].
Antibodies (Ab) are produced in the B lymphocytes in the bone marrow when an
animal is exposed to antigens. The cell creates a semi-specific antibody to a certain antigen.
Terminal plasma cells are secreted as a result of the proliferation and maturation of B cells
after an antigen binds to the antibodies. Clones of the cells are generated through mitosis.
Both plasma cells and large B lymphocytes secrete Ab, but plasma does so at a higher rate.
T cells are critical in immune responses and regulate B cells. Lymphocytes can develop
into older B memory cells in addition to differentiating into plasma cells. In the event of
a second antigen exposure, memory cells, which circulate through the body, transform
into large lymphocytes. These newly differentiated lymphocytes produce pre-selected and
high-affinity antibodies for the initial antigen responsible for the primary response [38].
Figure 18 depicts the clonal selection principle.
There are three main features of clonal selection theory that CSA is based upon. The
first feature is a diverse antibody pattern formed via accelerated somatic mutations as
a result of randomly generated genetic changes. The second trait is the retention and
restriction of a pattern to a cloned cell. The third trait is the multiplication and maturity
of cells when in contact with antigens [38]. The aspects of immunity modeled in CSA
are: cloning of the productive cells from a stimulated viewpoint; decommissioning of
non-stimulated cells; upkeep of the memory cells efficiency that have left the repertoire;
selection; generation; genetic diversity; hypermutation based on the affinity of a cell; and
the maturity affinity and selection of high-affinity cloned cells.
CSA has a fixed number of generations—the max number of generations determined
by the user. In each generation, a set of candidate solutions pool (P) is generated, which
is the sum of the remaining existing population (Pr ) and of a group of memory cells (M)
(P = Pr + M). From population (Pn ), the highest-affinity n individuals are selected. The
population is then reproduced and creates a temporary group of clones (C). The size of
the clone is a function of the antigen affinity. Next, the clones are hypermutated, which is
proportional to antibody–antigen affinity from which a new and mature antibody group
(C*) emerges. A new memory set M is then comprised of cells from C*. Improved cells
in C* can also replace cells of P. Lastly, novel antibodies replace d antibodies as a form of
diversity introduction. The low-affinity cells are more likely to be replaced [38]. Figure 19
illustrates the abstract flowchart of CSA.
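The generation loop described above can be sketched as follows. The scalar affinity function, population sizes, and the rank-based hypermutation step are illustrative assumptions for the sketch rather than the exact formulation in [38]:

```python
import random

def affinity(x):
    # Illustrative affinity: higher is better (optimum at x = 0).
    return -x * x

def csa(pop_size=20, n_select=10, clones_per_cell=5, d_replace=4, generations=50):
    population = [random.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        # Select the n highest-affinity antibodies from the pool P.
        selected = sorted(population, key=affinity, reverse=True)[:n_select]
        # Clone each one and hypermutate: lower-affinity ranks mutate more.
        matured = []
        for rank, cell in enumerate(selected):
            step = 0.1 * (rank + 1)  # mutation strength grows as affinity drops
            matured += [cell + random.gauss(0, step) for _ in range(clones_per_cell)]
        # Keep the best matured clones (C*), then inject d novel antibodies to
        # replace the low-affinity tail and maintain diversity.
        population = sorted(matured, key=affinity, reverse=True)[:pop_size - d_replace]
        population += [random.uniform(-10, 10) for _ in range(d_replace)]
    return max(population, key=affinity)

best = csa()
```

With these settings, the maintained clone pool concentrates around the optimum while the d random newcomers keep the search diverse.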
CSA is capable of solving multimodal and combinatorial optimization, as well as
maintaining effective memory and learning. While GA oftentimes converges to the best
candidate solution, CSA obtains a diverse set of optimal solutions. CSA and GA differ in
the sequence, vocabulary, and inspiration of their evolutionary search process. CSA and
GA still have similar evaluation and coding processes and exhibit fine tractability related
to computational cost [38].
Figure 18. Clonal selection principle [38].
Figure 19. Flowchart of CSA.
Examination of NC using various benchmark functions shows that, in comparison to a
majority of methods, the average number of iterations for 50 independent runs of functions
has been decreased by using NC [43].
Another method for global optimization is proposed by Raouf et al., based on the
fertilization process in humans, called the sperm motility algorithm (SMA). The search for
the ovum is initiated by the random diffusion of sperm inside the female vagina. Stokes
equations were selected as a mathematical model basis when considering the typical
movement of sperm flow. The ovum secretes a chemoattractant that behaves as a mechanism
to guide the sperm and ensure the sperm approaches the ovum. A search method to find an
optimization algorithm was achieved by Raouf et al. by mimicking the fertilization process.
The SMA has been tested using several standard benchmark functions and engineering
problems, and the results validate and verify the efficiency of SMA [44].
Enciso et al. have also developed an algorithm based on allostasis. Allostasis explains
the process that internal organs follow to reach a steady state when presented with an
unbalanced condition based on the internal state of the organs. Each individual is enhanced
using the biological foundation of the allostasis mechanism. Numerical operations in
allostatic optimization (AO) mimic the IS of other organs. The results indicate the
satisfactory performance of AO in regard to the search for an optimum when compared to
other well-documented optimization algorithms [45].
2.1.3. Behavior-Based
We have classified algorithms whose main function is related to the behavior of animals
or insects in the behavior-based category. Behavior-based algorithms focus on animals'
and insects' tactics to survive and communicate with each other or other species. These
algorithms may be inspired by migration, hunting, competitions, and social behaviors in
animals. The most popular algorithms in this category are biogeography-based optimization
(BBO), symbiotic organisms search (SOS), and group search optimizer (GSO). In the
coming sections, we will study two popular behavior-based algorithms, BBO and SOS, in
detail. Figure 20 illustrates the most popular behavior-based algorithms.
Figure 20. Most popular behavior-based algorithms.
Biogeography-Based Optimization
The study of the geographical spread of biological organisms is called biogeography. In
the 1960s, the governing mathematical equations for organism distribution were developed.
The idea of using biogeography in optimization was first introduced by Simon. BBO has
some features in common with other bio-based optimization methods, such as GA and
PSO [46].
The rise of new species, species extinction, and the migration of species from one
island to another are described with biogeography mathematical models. The term “island”
denotes a geographically isolated habitat from other habitats. The amount of a species
in a habitat directly influences the emigration rate, µ, and the immigration rate, λ. The
emigration rate grows as the habitat is filled and species expand. As such, species are likely
to seek another suitable habitat. As the number of species reaches the limit a habitat can
sustain, the maximum emigration rate is achieved. It becomes more difficult for a species
to survive as the habitat becomes more crowded and the immigration rate slows [46].
The biogeography of a species can be modeled in a simple way. The probability of the
habitat to contain exactly S species, Ps, changes in each time step. The process of finding the
optimal solution starts with generating a set of habitats, with each habitat corresponding
to a potential solution. Then, the fitness function, called the habitat suitability index (HSI),
is calculated for each solution. A high value for the HSI represents a habitat with more
species. Afterward, the habitats should be modified based on the immigration rate and
emigration rate. A habitat's HSI can change suddenly due to apparently random events
modeled as suitability index variables (SIVs) mutation. Population diversity tends to
increase due to the mutation scheme, as the rate of mutation is inversely proportional to
the probability of a solution. Solutions with high probability will dominate in a model
without population diversity mediated by mutations. Solutions with a low HSI are
more likely to mutate, and thus, have a greater probability of improving. Despite having
a high potential to improve already, high HSI solutions can also improve, given a higher
mutation rate. Figure 21 illustrates the flowchart of the biogeography-based algorithm.
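The migration and mutation steps described above can be sketched as follows. The linear rank-based migration rates, the mutation probability, and the sphere cost are illustrative assumptions for the sketch rather than Simon's exact formulation [46]:

```python
import random

def cost(h):
    # Illustrative cost; lower cost corresponds to a higher HSI.
    return sum(v * v for v in h)

def bbo_step(habitats, bounds=(-5.0, 5.0), p_mutate=0.05):
    """One BBO generation with rank-based linear migration rates."""
    n = len(habitats)
    ranked = sorted(habitats, key=cost)       # rank 0 = highest HSI
    lam = [(i + 1) / n for i in range(n)]     # immigration: worse -> higher
    mu = [1 - l for l in lam]                 # emigration: better -> higher
    new = []
    for i in range(n):
        h = list(ranked[i])
        for d in range(len(h)):
            if random.random() < lam[i]:
                # Immigrate this SIV from a habitat chosen by emigration rate.
                j = random.choices(range(n), weights=mu)[0]
                h[d] = ranked[j][d]
            if random.random() < p_mutate:
                # SIV mutation: a sudden random event keeps diversity up.
                h[d] = random.uniform(*bounds)
        new.append(h)
    new[0] = list(ranked[0])  # elitism: the best habitat always survives
    return new

habitats = [[random.uniform(-5, 5) for _ in range(4)] for _ in range(30)]
initial_best = min(cost(h) for h in habitats)
for _ in range(100):
    habitats = bbo_step(habitats)
best_cost = min(cost(h) for h in habitats)
```

The elitism line guarantees the best solution found so far is never lost, so the best cost is non-increasing over generations.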
Symbiotic relationships can be classified as either obligatory or facultative. Obligatory
symbiotic relations describe the necessary relationship between two species, while
facultative relationships describe non-necessary but mutually beneficial relations. The
symbiotic relationships called commensalism, parasitism, and mutualism are the most
common types found in nature. Commensalism describes a symbiotic relationship where
one species directly benefits from another, but the other is neutrally affected. Parasitism
describes a symbiotic relationship where one party benefits and another is actively and
negatively affected. A mutually beneficial symbiotic relationship where two species
mutually benefit is called mutualism. Figure 22 illustrates various symbiotic relations in an
ecosystem [47].
The SOS aims to find an optimal global solution via iterative processes involving
populations of candidate solutions in promising search spaces similar to other well-
documented population-based algorithms. In SOS, the ecosystem is designated as the
initial population. To populate the search space, random organisms are generated from the
initial ecosystem. For every organism, there exists a corresponding problem and associated
fitness value that indicates how adaptive the organism is to the desired objective. In SOS,
biological interaction is mimicked to generate new solutions, as opposed to all other
meta-heuristic algorithms.
Mutual_vector = (Xi + Xj)/2 (4)
Some mutualism relationships may benefit one organism more than another. Such an
unequal beneficial relationship is modeled using benefit factors (BF1 and BF2 ) randomly
determined as 1 or 2, which represent how beneficial a mutualistic relationship is to each
organism. The Mutual_vector is a representation of the relationship between the organisms
Xi and Xj . The highest value of adaptation is represented by the value Xbest . If an organism’s
fitness is greater after the interaction, the organisms are updated [47].
Remora fish and sharks are an example of commensalism in nature. While the shark
receives very little benefit from the remora, a remora fish can consume leftover scraps
produced by the shark. The commensal symbiosis model is used to mathematically cal-
culate the candidate solution of Xi between organism Xi and Xj , the equation for which is
shown below:
Xi,new = Xi + rand(−1, 1) × (Xbest − Xj) (5)
The value (Xbest − Xj ) describes the increased survivability rating of Xi as the advan-
tage provided by Xj . The highest degree of survivability in an organism is Xbest [47].
The plasmodium parasite, which mosquitoes pass between human hosts, is an example
of parasitism in nature. The parasites use human hosts to survive, but the human may
develop malaria and die. In the SOS, an organism Xi is duplicated, modified randomly, and
tagged the Parasite_vector in the search space. An organism Xj is randomly chosen and
designated as a host for the parasite. The Parasite_vector will replace organism Xj in the
ecosystem if the parasite has a better fitness value. The Parasite_vector will no longer exist
in the ecosystem if Xj has a higher fitness value [47]. Figure 23 illustrates the flowchart of
the SOS.
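The three phases above can be sketched for a continuous minimization problem as follows. The population size, search bounds, and sphere fitness are illustrative assumptions, not the benchmark setup of [47]:

```python
import random

def fitness(x):
    # Illustrative objective to minimize (lower is fitter).
    return sum(v * v for v in x)

def sos_step(eco, bounds=(-5.0, 5.0)):
    """One SOS generation: mutualism, commensalism, parasitism per organism."""
    n, dim = len(eco), len(eco[0])
    for i in range(n):
        best = min(eco, key=fitness)
        j = random.choice([k for k in range(n) if k != i])
        xi, xj = eco[i], eco[j]
        # Mutualism (Equation (4)): both organisms gain from the mutual vector;
        # benefit factors BF1, BF2 are randomly 1 or 2.
        mutual = [(a + b) / 2 for a, b in zip(xi, xj)]
        bf1, bf2 = random.choice([1, 2]), random.choice([1, 2])
        xi_new = [a + random.random() * (g - bf1 * m) for a, g, m in zip(xi, best, mutual)]
        xj_new = [a + random.random() * (g - bf2 * m) for a, g, m in zip(xj, best, mutual)]
        if fitness(xi_new) < fitness(xi):
            eco[i] = xi = xi_new
        if fitness(xj_new) < fitness(xj):
            eco[j] = xj = xj_new
        # Commensalism (Equation (5)): Xi benefits from Xj; Xj is unaffected.
        xi_new = [a + random.uniform(-1, 1) * (g - c) for a, g, c in zip(xi, best, xj)]
        if fitness(xi_new) < fitness(xi):
            eco[i] = xi = xi_new
        # Parasitism: a mutated copy of Xi replaces a random host if it is fitter.
        host = random.choice([k for k in range(n) if k != i])
        parasite = list(xi)
        parasite[random.randrange(dim)] = random.uniform(*bounds)
        if fitness(parasite) < fitness(eco[host]):
            eco[host] = parasite
    return eco

eco = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(20)]
initial_best = min(map(fitness, eco))
for _ in range(50):
    eco = sos_step(eco)
final_best = min(map(fitness, eco))
```

Every update is accepted only if it improves fitness, so each organism, and hence the ecosystem's best solution, can only improve over iterations.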
SOS is capable of generating higher-quality solutions than existing meta-heuristic algo-
rithms based on the performance of sample problems. SOS outperformed GA, DE, BA, PSO,
and PBA by identifying 22 out of 26 mathematical function solutions in the benchmarking
phase. SOS outperformed other algorithms when benchmarked against structural design
problems by achieving optimal results with less iteration. SOS does not require a tuning
phase for stable performance, unlike other algorithms, and the three biological models for
symbiotic relationships are modeled in simple mathematical expressions [47]. Despite its
algorithmic complexity, SOS is relatively easy to implement; however, the number of
parameters in SOS that directly determine its performance is high compared to other
algorithms.
schools of fish. Some benefits that animals derive from collective behaviors are avoiding
predators, increased aerodynamics, more efficient migratory routes, and increased har-
vesting efficiency. In the collective animal behavior (CAB) algorithm, the searcher agents
emulate a group of animals that interact with one another based on the biological laws of
collective motion. Benchmarking tests show that the CAB algorithm performs well in
global optimum searches [56].
The chromosome amount or combination of gametes is unnecessary in asexual re-
production. A uni- or multicellular parent organism endows its offspring with a copy of
its genetic makeup through asexual reproduction. Single-celled bacteria, archaea, plants,
animals, and fungi primarily reproduce asexually. Based on this biological phenomenon,
Farasat et al. have developed an optimization algorithm called asexual reproduction opti-
mization (ARO). In ARO, offspring are produced by an individual through reproduction.
The fitter individual is determined via a performance index based on the objective function.
The ARO performance is tested with several benchmark functions frequently used in the
area of optimization and is compared with that of PSO. Results of the simulation illustrate
that ARO remarkably outperforms PSO [57]. Another similar algorithm is developed by
Kaveh et al. called the cyclical parthenogenesis algorithm (CPA). CPA mimics the social be-
havior and propagation of species that can reproduce sexually or asexually, such as aphids.
Displacement and reproduction mechanisms are used to iteratively improve the
quality of a solution, which is an organism of a species. The benchmarking results indicate
that the performance of CPA is comparable to similar meta-heuristic algorithms [58].
The hierarchical system that is responsible for the creation of complex, problem-solving
intelligence was used by Chen et al. to develop the hierarchical swarm model (HSM) [59].
Parpinelli et al. have also developed the ecology-inspired optimization algorithm (ECO),
which is an optimization algorithm based on the ecological concepts of habitats, ecological
relationships, and ecological successions. ECO performs significantly better than ABC,
especially as the dimensionality of the functions increases [60].
Other algorithms focus on the methods that animals use to survive. Competition
over resources (COR) is an optimization algorithm developed by Mohseni et al., which
mimics the competitive behavior found within animal groups. In the COR algorithm, less
searching-efficient individuals will be eliminated from the animal group while the best
searching agent in a group spreads its children within the animal group. Convergence to
an optimal solution is quickly reached since highly competitive search agent populations
compete against each other. Based on benchmarking tests, COR converges faster
and more accurately than other optimization algorithms, such as PSO [61]. Nguyen et al.
have also developed a similar algorithm based on the foraging behavior of zombies. In
zombie survival optimization (ZSO), the fitness of the exploration agents, zombies, is based
on their ability to exploit the search space to find an airborne antidote that transforms the
zombie back into a human. ZSO is productive as a search tool to find an image, for example.
Benchmarking using the CAVIAR dataset indicates that ZSO is more efficient than BFO
and PSO [62].
2.1.4. Disease-Based
Some research is based on the behavior and models of diseases and treatments. In-
fectious and viral disease transmission is usually modeled using established ordinary
differential equations (ODE) or partial differential equations (PDE) based on epidemio-
logical models such as SIR, MSIR, MSEIRS, etc. Some researchers have focused on using
the spreading models of diseases to solve optimization problems. Some work is based on
tumor growth and chemotherapy. Figure 24 illustrates the most popular disease-based
algorithms based on their citations.
Swine influenza models-based optimization (SIMBO) is a disease-based optimization
algorithm that leverages the well-known ODE susceptible–infectious–recovered (SIR)
models of swine flu developed by Pattnaik et al. The development of SIMBO follows a
probability-based treatment phase (SIMBO-T), vaccination phase (SIMBO-V), and
quarantine phase (SIMBO-Q). The SIMBO variants can be used to optimize complex
multimodal functions with improved convergence and accuracy. First, a test based on the
dynamic threshold identifies a confirmed case of swine flu. Susceptible parties are advised
to inoculate themselves from swine flu after a confirmed case in the community. The swine
flu-infected individuals were quarantined from the population. The suspected cases are
treated with antiviral medication. The number of antiviral drugs given to individuals is
dependent on patient health and susceptibility or existing complications. In SIMBO-V and
SIMBO-Q, treatment and quarantine/vaccination status are used to update the individual's
state. An individual's health cannot be queried every day due to the restrictions created by
the nonlinear momentum factors. SIMBO variants can easily be implemented on parallel
computer architecture without overburdening or modifications. SIMBO-T, SIMBO-V, and
SIMBO-Q have been tested with 13 standard benchmark functions, and the results have
been compared with other optimization techniques.
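The SIR dynamics that SIMBO builds on can be sketched with a simple forward-Euler integration of the standard SIR ODEs; the rate constants and step size below are illustrative assumptions:

```python
def sir_step(s, i, r, beta=0.3, gamma=0.1, dt=0.1):
    """One forward-Euler step of the SIR model:
    dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I."""
    ds = -beta * s * i
    di = beta * s * i - gamma * i
    dr = gamma * i
    return s + ds * dt, i + di * dt, r + dr * dt

# Start with 1% infected in a normalized population (S + I + R = 1).
s, i, r = 0.99, 0.01, 0.0
for _ in range(2000):  # integrate to t = 200
    s, i, r = sir_step(s, i, r)
```

Since the three derivatives sum to zero, the total population is conserved at every step; with these rates the outbreak peaks and then dies out, leaving most of the population in the recovered compartment.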
information exchange achieved by infectious disease. The operators for SEIQRA are modeled as
such state transitions. Currently, there are 13 state transitions, and as such, there are 13 operators.
The SEIQRA model controls the state transitions through the use of transmission as a
synergistic logic control scheme, which allows for the implementation of many opera-
tions (reflection, crossover, differential, etc.). The algorithm is stabilized by allowing the
13 operators an equal opportunity to occur. The use of the part variables iteration strategy
(PVI) gives the algorithm the capability to solve high-dimensional optimization problems.
Benchmarking results reveal that SEIQRA has a high convergence speed when searching
for global optima [64].
Invasive tumor growth optimization (ITGO) is another disease-based algorithm de-
veloped by Tang et al. Tumor growth is mediated by the drive of tumor cells to grow and
proliferate by targeting nutrients in their immediate environment. The three categories of
tumor cells in the ITGO algorithm are quiescent cells, proliferative cells, and dying cells.
Tumor cells rely on intercellular interaction and random motion to travel. The interaction
between all three cell types is simulated. Quiescent and proliferative cells are mimicked
in their invasive behavior by Lévy flight. The results of testing ITGO on problem sets
such as CEC2005, CEC2008, and CEC2010 reveal that ITGO is better at solving global
optimization problems in comparison to other meta-heuristic algorithms such as ABC, DE,
and PSO [65].
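The Lévy-flight motion mentioned above can be sketched with the Mantegna scheme commonly used in such algorithms; the exponent and step scaling here are illustrative assumptions, not the settings of [65]:

```python
import math
import random

def levy_step(beta=1.5):
    """Draw one Lévy-distributed step length via Mantegna's algorithm:
    step = u / |v|^(1/beta), with u ~ N(0, sigma_u^2) and v ~ N(0, 1)."""
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma_u = (num / den) ** (1 / beta)
    u = random.gauss(0, sigma_u)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

# A cell position updated by many short moves and occasional long jumps.
x = 0.0
steps = [levy_step() for _ in range(1000)]
x += 0.01 * sum(steps)
```

The heavy-tailed step distribution is what lets the invasive cells combine local exploitation with rare long-range exploration.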
Salmani et al. have proposed a chemotherapy-based meta-heuristic population al-
gorithm for searching purposes. The chemotherapy science algorithm (ChSA) eliminates
unwanted cells (solutions) at the risk of destroying normal cells (possible solutions) [66].
2.1.6. Insect-Based
Insects are a popular species in developing optimization algorithms. More than
40 algorithms have been developed based on the behavior and life of insects, ignoring
different variants of similar algorithms. Some algorithms in this group, such as the artificial
bee colony and ant colony optimization, are highly used by engineers and researchers.
Many species, such as ants, bees, flies, spiders, and cockroaches, have been considered in
developing optimization algorithms. Figure 26 illustrates the most popular insect-based
algorithms and a simple categorization of them.
Figure 26. Most popular insect-based algorithms.
Figure 27. Flowchart of the ACO [81].
ACO has been used to solve a range of famous optimization problems, including
traveling salesman, vehicle routing, sequence ordering, and so forth. It has been shown
that ACO can be used to solve stochastic, dynamic, multi-objective, and continuous
problems [81]. Recent studies show that rather than academic applications, some companies
are applying ACO for real-world industrial problems in which multiple objectives, stochastic
information, and time-varying data are readily available [82]. ACO is reported to be fast in
solving complex problems, but it is sensitive to parameter adjustment like the pheromone
evaporation rate during the pheromone update process.
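The sensitivity to the evaporation rate can be seen directly in the pheromone update rule. The following is a minimal Python sketch of one evaporate-and-deposit step for a TSP-style tour; the function and parameter names (`update_pheromone`, `rho`, `Q`) are illustrative, not taken from the source.

```python
def update_pheromone(tau, ants, rho=0.5, Q=1.0):
    """Evaporate and deposit pheromone on a symmetric tour graph.

    tau  : dict mapping an edge (i, j), i < j, to its pheromone level
    ants : list of (tour, tour_length) pairs from the current iteration
    rho  : evaporation rate -- the parameter ACO is sensitive to
    Q    : deposit constant
    """
    # Evaporation: every trail decays, so poor edges are gradually forgotten.
    for edge in tau:
        tau[edge] *= (1.0 - rho)
    # Deposit: each ant reinforces the edges of its tour; shorter tours deposit more.
    for tour, length in ants:
        for a, b in zip(tour, tour[1:] + tour[:1]):
            edge = (min(a, b), max(a, b))
            tau[edge] = tau.get(edge, 0.0) + Q / length
    return tau

# One ant completed the 3-city cycle 0-1-2 with length 10.
tau = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 1.0}
tau = update_pheromone(tau, ants=[([0, 1, 2], 10.0)], rho=0.5)
```

A larger `rho` forgets old trails faster (more exploration); a smaller one lets early tours dominate, which is why this parameter usually needs tuning.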
Artificial Bee Colony (ABC)
ABC works based on the behavior of honeybees. In ABC, three groups of artificial
bees (employed, onlookers, and scouts) comprise the entire colony. The colony is evenly
divided between employed and onlooker bees. The number of employed bees is equal to
the number of food sources around the colony. Scouts are bees whose food source has
been depleted. The three steps in a cycle are: onlooker and employed bees are directed to
relocate the food source, the nectar amount is calculated, and scout bees are designated to
locate more food sources. Solutions are thus possible locations of food, and the nectar
amounts assign the quality of the food source. A probability-based process is used to
select which onlookers will retrieve the food source. The onlookers show preferential
attention to food sources that have a relatively high amount of nectar. Scouts are
characterized as having low food source quality and low search costs since scouts are
concerned with finding any food source without prior knowledge of known food source
locations. Scouts can sometimes discover abundant food sources that were previously
undiscovered. Employed bees are selected to be scouts via the “limit” control parameter.
Employed bees turn into scout bees if a food source is not improved after various solution
trials. The parameter “limit” sets the number of trials before a food source is abandoned
[83,84]. Figure 28 illustrates the flowchart of ABC.
Figure 28. Flowchart of ABC [84].
The performance of ABC is better than similar algorithms such as PSO, GA, and DE
when solving nonlinear unimodal and multimodal benchmark functions [84,85]. ABC uses
a simple operation that quantifies the differences between the parent and a random solution
from the bee population to find new candidate solutions, as opposed to DE and GA, which
use crossover operations to generate new solutions. Local minima convergence speed is
thus increased. GA and DE find the optimal solution in the population, which can be used
to generate new solutions or new velocities in PSO. In ABC, the best solution is chosen from
the pool of existing solutions and those solutions found by scout bees, which may not create
new trial solutions. A greedy selection process between parent and candidate solution is
used in DE and ABC. In ABC, like in DE, test solutions are produced for all population
solutions disregarding the quality of the solution. Trial solutions are produced by onlooker
bees favoring the new solutions with higher fitness levels, so favorable solutions are
searched more thoroughly and in less time. The mentioned attributes are similar to the
selection criteria employed in GA, namely seeded and natural selection. The ABC algorithm
only has one control parameter (limit) apart from colony size and maximum cycle number.
The value of the “limit” will be determined based on colony size and the dimension of the
problem. Therefore, ABC has only two common control parameters [86].
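The neighbor-search step described above (perturb the parent toward a random peer in one dimension, then apply greedy selection) can be sketched as follows; the function name `abc_candidate` and the sphere objective are illustrative, not from the source.

```python
import random

def abc_candidate(foods, i, fitness):
    """Generate one ABC trial solution and greedily select parent vs. trial.

    foods   : list of food sources (lists of floats), the current population
    i       : index of the parent food source
    fitness : objective function to minimize
    """
    dim = len(foods[i])
    j = random.randrange(dim)                                   # one random dimension
    k = random.choice([m for m in range(len(foods)) if m != i]) # a random peer
    phi = random.uniform(-1.0, 1.0)
    trial = list(foods[i])
    # Perturb the parent toward/away from the peer in dimension j.
    trial[j] = foods[i][j] + phi * (foods[i][j] - foods[k][j])
    # Greedy selection: keep whichever of parent/trial is fitter.
    return trial if fitness(trial) < fitness(foods[i]) else foods[i]

sphere = lambda x: sum(v * v for v in x)
pop = [[1.0, 2.0], [0.5, -1.0], [3.0, 0.2]]
best = abc_candidate(pop, 0, sphere)
```

Because selection is greedy, the accepted solution is never worse than the parent, which is the property the text attributes to ABC's fast local convergence.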
Figure 29. Transverse orientation in moths: (a) using moonlight as a reference; (b) encirclement
traps in the presence of artificial lights [87].

The moth-flame optimization (MFO) algorithm is inspired by the transverse orientation
of moths in the presence of artificial light, as shown in Figure 29b. The moth eventually
converges toward the light, which is used as the mathematical basis of the MFO
algorithm [87].

In the MFO algorithm, moths serve as the candidate solutions, and the moth positions
serve as the problem’s variables. Therefore, the moths may traverse one-, two-, three-, or
hyper-dimensional space without the positional vector changing. The MFO algorithm is
population-based, so the moths and flames are presented in matrix form. The flames and
moths are both solutions, but the way they are updated is different. The flames, which can
be considered as flags tagged for search, are the optimal solutions that have been found by
the moths, which are the search space agents. Moths update the flags after searching for fire
and update the fitness of the solution, so the moth always tracks the best solution. Moths’
positions are updated based on a mathematical model of transverse orientation with the
following equation:

Mi = S(Mi, Fj) (6)

where Mi represents the ith agent/moth, Fj represents the jth flag/flame, and S represents
the spiral function. In this case, a logarithmic spiral is used, but any spiral can be utilized
for these purposes. The logarithmic spiral is shown below:

S(Mi, Fj) = Di · e^(bt) · cos(2πt) + Fj (7)

where Di is the distance between moth i and flame j, b is a value that defines the spiral
shape, and t is a value in the range [−1, 1] chosen randomly [87].

The MFO has been used to solve 19 different unimodal, multimodal, and composite
benchmarks, and its performance has been compared to PSO, GSA, BA, GA, FA, SMS, and
FPA. The results have shown that MFO provides the best results in four of the test functions;
thus, the superior efficiency of MFO is statistically significant. MFO competes with GSA in
the results of some problems. The selection of flames for the positioning of the moths is
the reason why MFO does not provide better results in three of the multimodal tests. The
local solutions are avoided since the moths mainly explore the search space. MFO cannot
approximate the global optimum very well since unimodal test functions do not feature
local solutions, which is not a major concern since MFO achieved superior results in two
unimodal tests. Accuracy and convergence speed are maximized by the MFO algorithm
due to the updating position vector of the moths, but the avoidance of local solutions is
one of the largest drawbacks [87].
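Equation (7) can be sketched directly in code. In the sketch below, Di is taken as the per-dimension absolute distance |Fj − Mi| (a common reading of the MFO spiral); the function name `mfo_update` and parameter defaults are illustrative.

```python
import math
import random

def mfo_update(moth, flame, b=1.0):
    """One logarithmic-spiral move of a moth around a flame, per Eq. (7):

        S(Mi, Fj) = Di * exp(b*t) * cos(2*pi*t) + Fj,   t in [-1, 1]

    moth, flame : position vectors (lists of floats)
    b           : constant defining the spiral shape
    """
    t = random.uniform(-1.0, 1.0)  # t near -1 brings the moth close to the flame
    return [abs(f - m) * math.exp(b * t) * math.cos(2 * math.pi * t) + f
            for m, f in zip(moth, flame)]

new_pos = mfo_update([0.0, 0.0], [1.0, 1.0])
```

Because the flame term Fj anchors the spiral, every move stays in an envelope around the best-known solutions, which matches the exploration/exploitation trade-off discussed above.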
Other Algorithms
Some other algorithms have been developed based on bees and their behaviors, such
as the bee colony optimization (BCO) and fuzzy bee systems (FBS) algorithms developed
by Teodorovic et al. BCO and FBS can solve deterministic combinatorial problems and
uncertain combinatorial problems [88]. Honeybees mating optimization (HBMO) is another
swarm-based algorithm that was developed by Haddad et al. based on the process of
honeybee mating. The performance of HBMO is comparable to the results of the well-developed
genetic algorithm in problems such as highly non-linear constrained and unconstrained
real-valued mathematical models [89]. Another similar algorithm is marriage in honeybees
(MHB), which mimics the development of honeybees that begins with a solitary colony, cat-
egorized as a solitary queen with no family, to a eusocial colony, characterized as multiple
queens with families. MHB displays good performance in solving satisfiability problems
(SAT) [90]. Queen-bee evolution (QBE) is another example of a bee-inspired algorithm,
which enhances the capability of GA, where the queen bee has a direct influence over the re-
production process. QBE enhances genetic algorithm performance in converging to a global
optimum by improving the exploration and exploitation of the search environment [91].
Bee swarm optimization (BSO) is based on the foraging behavior of honeybees by using
population-based optimization, where bees adjust their flight trajectories. Experimental
results have shown that BSO is comparable to ABC and bee and foraging algorithm (BFA)
in solving nonlinear unimodal and multimodal multivariable benchmark functions [92].
Bee collecting pollen algorithm (BCPA) is also a global convergence searching algo-
rithm that mimics honeybees’ swarm intelligence and pollen collection behaviors. BCPA
has comparable performance with ACO in benchmarking function performance [93]. An-
other bee-based algorithm is OptBees, which is inspired by the processes of collective
decision-making by bee colonies. It has been shown that OptBees exploits the multimodal-
ity of problems. Additionally, it creates and maintains diversity and achieves desired results
in global optimization [94]. Fitness dependent optimizer (FDO) is another bee swarm algo-
rithm inspired by the bee-swarming reproductive process and collective decision-making.
FDO is a PSO-based algorithm that uses velocity to update the position of the search
agents. Velocity is calculated via the fitness function that processes weights, which provide
guidance to the search agents in the exploration and exploitation phase. Experimental
results have shown that FDO performs better than PSO, GA, DA, WOA, and SpSO in
some nonlinear unimodal and multimodal benchmark functions, including CEC-C06 [95].
Bumblebees (BB) is a multiagent optimization algorithm inspired by the collective behavior
of social insects. Experimental results have shown that BB is faster and outperforms other
algorithms, such as GA, in solving the k-coloring of a graph problem for a range of random
graphs with different orders and densities [96]. Another algorithm in this group is bumble
bees mating optimization (BBMO), which simulates the mating behavior of the bumble
bees for solving global unconstrained optimization problems. It has been shown that the
BBMO has high performance in solving some nonlinear multivariable benchmark functions
in comparison to algorithms such as GA, DE, PSO, and HBMO [97].
Some algorithms in this category are inspired by the antlion (order Neuroptera) and the
dragonfly (order Odonata). Antlion optimizer (AlO) mimics the hunting
mechanism of antlions in nature. It has been shown that AlO competes with algorithms
such as PSO, GA, CS, and FPA in benchmarking functions in terms of local optima avoid-
ance, exploration, exploitation, and convergence [98]. Dragonfly algorithm (DA) is also
another swarm intelligence optimization technique that originates from the static and
dynamic swarming behaviors of dragonflies in nature. In DA, dragonfly behaviors such as
foraging, navigation, and predator avoidance were used to develop two distinct phases of
exploration/exploitation and optimization. DA can effectively solve highly constrained
CFD problems [99].
Some other algorithms are inspired based on flies; for example, fruit fly optimization
(FFO) is based on the food-seeking behavior of fruit flies. Fruit flies can detect food sources
from over 40 km away using their olfactory organs. Fruit flies also employ their sensitive
vision to detect food and swarm flocking locations. The stability of the search route for the
fruit flies is related to the population of the flies. FFO can normally find correct solutions to
optimization problems [100]. The dispersive flies optimization (DFO) algorithm is based
on the swarming behavior of flies over food sources in nature [101].
Another group of insect-based algorithms is based on Orthoptera, an order of insects
including grasshoppers, locusts, and crickets. Grasshopper optimization algorithm (GOA)
Drones 2023, 7, 427 32 of 134
mimics the behavior of grasshopper swarms in nature for solving optimization problems.
The simulation results have shown that GOA is able to provide superior results compared
to well-known algorithms such as GA, PSO, CS, SM, and FPA and verify the merits of
GOA in solving real problems with unknown search spaces [102]. Locust swarms (LS) is a
multi-optima search technique explicitly designed for non-globally convex search spaces,
which uses PSO as part of its coarse-mesh search to find starting points for a paired greedy
search technique [103]. Another algorithm is the cricket algorithm (CrA) which is based on
the behavior of crickets in nature. The CrA is a population-based algorithm similar to PSO,
in which the candidates try to provide solutions as they converge to the optimum result. CrA
applies some aspects of BA and FA [104].
There are some algorithms based on fireflies, such as the firefly algorithm (FA). The
FA is a swarm-intelligent algorithm based on the flashing pattern of tropical fireflies. It has
been confirmed that FA can provide a good balance of exploitation and exploration and
requires far fewer function evaluations [105]. Another algorithm is the glowworm swarm-
based optimization algorithm (GSOA), which is a swarm intelligence-based algorithm for
optimizing multimodal functions. The GSOA aims to encapsulate all of a function’s local
maxima. The glowworms, or agents, of GSOA move in a direction based on the signal
strength, picked up from their neighbors after using a decision domain to select them. The
GSOA is based on glowworm behavior used to attract mates and prey, namely the luciferin
glow intrinsic to the worms [106]. Jumper firefly algorithm (JFA) is also another algorithm
based on FA. The JFA finds efficient solutions by evaluating the efficiency of the agents.
Inefficient agents are then relocated to maximize the likelihood of finding the best solutions
by using the jump option [107].
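The attraction step at the heart of FA is commonly written as an attractiveness term beta0·exp(−γ·r²) pulling a dimmer firefly toward a brighter one, plus a small random term. The sketch below follows that standard formulation from the FA literature (the text above does not spell it out); all names are illustrative.

```python
import math
import random

def firefly_move(xi, xj, beta0=1.0, gamma=1.0, alpha=0.2):
    """Move firefly i toward a brighter firefly j (standard FA step).

    Attractiveness decays with distance: beta = beta0 * exp(-gamma * r^2),
    so nearby bright fireflies pull harder than distant ones.  The term
    alpha * (rand - 0.5) keeps some random exploration alive.
    """
    r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
    beta = beta0 * math.exp(-gamma * r2)
    return [a + beta * (b - a) + alpha * (random.random() - 0.5)
            for a, b in zip(xi, xj)]

# With alpha = 0 the move is deterministic: a pull of beta0*exp(-gamma*r^2).
moved = firefly_move([0.0, 0.0], [1.0, 1.0], alpha=0.0)
```

The gamma parameter controls how local the attraction is, which is the lever behind the exploitation/exploration balance credited to FA above.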
Another class of algorithms is inspired by spiders. Social spider optimization (SSO) is
a swarm algorithm based on the simulation of the cooperative behavior of social spiders.
SSO mimics colony cooperation by featuring computational mechanisms and considering
two genders to provide a better exploration and exploitation behavior and avoid premature
convergence, which are issues that plague PSO and ABC algorithms [108]. Another spider-
based algorithm is black widow optimization (BWO) which draws inspiration from black
widow mating behaviors, such as cannibalism. The cannibalism stage allows for early
convergence due to the rejection of inefficient search agents [109]. Another example is
the water strider optimization algorithm (WSOA), a population-based optimizer that
mimics the water strider bug life cycles. The WSOA includes water strider behaviors
such as mating style, feeding mechanisms, succession, territorial behavior, and intelligent
ripple communication. WSOA has been applied to classical constrained, unconstrained,
continuous, and discrete engineering design problems, two structural optimizations of
double-layer barrel vaults, and a challenging bridge damage detection problem [110].
In the group of Lepidoptera, there are some algorithms inspired by butterflies. The
monarch butterfly optimization method (MBOA) was developed based on the migration
of monarch butterflies. In MBOA, the location of the butterflies, which are located in
two different environments, is updated by two methods. The migration operator gener-
ates offspring. Then, the butterfly adjusting operator updates the position of the agents.
The number of agents remains unchanged by these two methods so as to avoid extra fitness
evaluations [111]. Another example is the butterfly optimization algorithm (BOA), which
uses the concept of emitting fragrances from flowers and the ability of butterflies to sense
those fragrances from long distances. Butterflies are agents which are used to search for
optimal solutions in the search space. It has been shown that BOA has efficient performance
compared to PSO, GA, ABC, and FA [112].
Another popular algorithm is the artificial butterfly optimization algorithm (ABOA)
which is based on the mate-finding strategy of some butterfly species. In the ABOA
algorithm, all virtual butterflies are divided into two groups, one called the sunspot
butterfly group, and the other called the canopy butterfly group. The fitness of sunspot
butterflies is better than those of canopy butterflies. In an optimization process, exploration
Figure 31. An example of Lévy flight in 2D space [128].
The CS algorithm is governed by three main principles. First, a cuckoo bird can only
lay one egg at a time, which is randomly deposited in an existing nest. Second, a higher-
quality nest will propagate. Third, host nest amounts are fixed, and the probability of the
parasitic egg being discovered is pa ∈ [0, 1]. If discovered, the host species will abandon
the nest or evict the cuckoo egg, which is approximated by pa of n replaced nests. Fitness
is proportional to the objective function in a maximization problem. Fitness can also be
defined as it is in existing genetic algorithms. Essentially, a new cuckoo egg generates
a potentially better solution that can replace a less optimal solution [127]. A Lévy flight is
performed to generate a new solution xi(t + 1) for a cuckoo i as follows:

xi(t + 1) = xi(t) + α ⊕ Lévy(λ)

where α > 0 controls the step size that is dependent on the scale of the problem (in most
cases α = 1), and the product ⊕ represents entry-wise multiplication. A random walk is thus
generated by the above stochastic equation. The Lévy distribution gives the random length
to the random walk as follows:

Lévy ∼ u = t^(−λ), (1 < λ ≤ 3)

which features an infinite variance and mean. A step-length power-law distribution with
a thick tail random walk process is thus formed. The local search can be sped up by
generating Lévy walks in the proximity of the best solutions. To ensure that the system does
not converge on a local optimum, far-field randomization should be employed to scatter
the new solutions away from the best solution. CS is most similar to hill-climbing due to
the randomization process, yet distinct in many ways. CS finds the best solution similar
to harmony search while maintaining a population-based structure such as GA and PSO.
Additionally, due to the thick-tailed distribution of the step size, randomization is more
efficient. Lastly, CS is likely easier to implement than GA and PSO due to the diminished
number of parameters. CS can be used as a meta-population algorithm if each nest is
treated as a set of solutions [127].
The performance of CS in optimizing multimodal objective functions has been shown
to be superior to other algorithms in part due to the diminished number of parameters (n
and pa ). Parameter pa does not affect the convergence rate, so the parameter does not need
to be tuned on a case-by-case basis. CS is more robust than other nature-inspired algorithms
as a result. CS can be extended to study multi-objective optimization applications with
various constraints, including NP-hard problems [127].
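A common way to draw the thick-tailed step lengths that CS relies on is Mantegna's algorithm; the sketch below assumes that method (the source does not prescribe a particular sampler), and the function names are illustrative.

```python
import math
import random

def levy_step(beta=1.5):
    """Draw one Lévy-distributed step length via Mantegna's algorithm.

    beta plays the role of the stability index; 1.5 is a common choice.
    """
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    u = random.gauss(0.0, sigma)   # numerator: Gaussian with tuned variance
    v = random.gauss(0.0, 1.0)     # denominator: standard Gaussian
    return u / abs(v) ** (1 / beta)

def cuckoo_move(x, alpha=1.0):
    """x(t+1) = x(t) + alpha (entry-wise) Lévy step, as in the CS update."""
    return [xi + alpha * levy_step() for xi in x]

new_nest = cuckoo_move([0.5, -0.2])
```

Most steps are small (local search near good nests), but the heavy tail occasionally produces a very long jump, which is the far-field randomization the text describes.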
Figure 32. Preying habit with bait in the green heron bird [129].

When prey approaches the bait, the Heron catches the prey. Optimization and complex
problem-solving are performed using the GHOA meta-heuristic, which is built on three
foundations of the Heron’s baiting behavior [129]. The algorithm is suited for graph-based
problems with discrete representations and offers enhanced convergence speed and efficient
solution-finding behavior, generating good solution-set combinations [129]. Figure 32
illustrates the Heron’s baiting behavior.

GHOA features three operations that create a path and the solutions: baiting, attracting
prey swarms, and change of position. The baiting stage mimics the behavior of herons,
which drop bait with the hope of attracting prey. The bait is a randomly generated solution
that the bird drops upon finding an optimal position through a local search. Thus, both bait
and prey serve as solutions to the algorithm. There are three modes that the bait-prey pair
can take to improve or generate a new solution heuristically: the missed catch, the catch,
and the false catch. In the missed catch mode, bait is placed in a suitable location, but the
bird fails to catch prey, so the solution number likely increases (illustrated in Figure 33a).
In the catch mode, bait is deposited and the bird catches prey; thus, a more competitive
solution is added to the system, a suboptimal solution is removed, and the number of
solutions remains constant, which is illustrated in Figure 33a. An event where a bird
catches prey without bait is named the false catch mode, where an inappropriate solution is
removed and the number of solutions in the system decreases. If a solution is removed in a
false catch, a new mode must be introduced, or certain constraints of the problem may be
compromised (illustrated in Figure 33a) [129].

Figure 33. Main operations of GHOA: (a) baiting operations, (b) attracting prey swarm, (c) change
of position [129].

The attracting prey swarms phase is essentially a local search that ensures the algorithm
converges quickly when solving precedence-constrained discrete problems, which is
different from the change of position phase. In the attraction phase, the position of the bird
remains static while swarms of prey are attracted to the bait, an algorithm behavior useful
in solving VRP, TSP, and scheduling problems, or other problems with constrained
numbers of unknowns, and selectively useful in path planning, routing, and similar
problems (see Figure 33b). The change of position phase represents the bird’s behavior in
choosing a location where it can easily attract prey or catch prey on the chance that prey
moves near its feet, which ensures that time is not being wasted in a local solution space
that does not contain solutions (see Figure 33c) [129]. Figure 34 illustrates the flowchart of
the GHOA.

GHOA is seemingly robust and scalable and displays adequate convergence criteria.
The algorithm has produced promising solutions for high-dimensional and combinatorial
problems, as well as graph-based and discrete optimization problems [129].
and xi (t) are generated using the best global solution, x*. A random walk is used to generate
new solutions for each bat after the global best is found using the averaged loudness values
at the given time across all bats. Velocity and position are updated similarly to how values
are updated in PSO since the range and speed of the particles are controlled by fi . BA
is an equal combination of local search mediated by loudness and pulse control and the
standard PSO. Additionally, loudness is diminished once the prey has been detected and
pulse emission is increased. Yang et al. showed that BA outperforms existing algorithms
when applied to seven constrained and nonlinear design task benchmark problems. BA is
Drones 2023, 7, x FOR PEER REVIEW
potentially more powerful than GA, PSO, and HS. PSO and HS are a simplified version
of BA. BA outperforms these existing algorithms since it incorporates the strengths of the
other algorithms but includes a robust local optimal search finder [130].
Figure
Figure 35.35.
BatBat echolocation
echolocation [131]. [131].
Other Algorithms
Crow search algorithm (CrSA) is another bird-inspired algorithm developed by Askarzadeh, which is a population-based technique that mimics crow food storage and retrieval. Simulation results reveal that CrSA may produce promising results compared to the other algorithms [132].
Other algorithms are developed based on hawks, eagles, and other birds of prey. Heidari et al. have developed Harris hawk optimization (HHO), based on surprise pounce, which is a cooperative chasing behavior intrinsic to Harris hawks when multiple hawks dive at the same time to surprise a target. Harris hawks have developed several methods of surprise pouncing due to unpredictability in the environment and prey [133]. The cooperative behavior and chasing style of Harris hawks in nature is called surprise pounce. In this intelligent strategy, several hawks cooperatively pounce prey from different directions in an attempt to surprise it. Harris hawks can reveal a variety of chasing patterns based on the dynamic nature of scenarios and escaping patterns of the prey [133]. Eagle strategy (ES) iteratively combines the firefly algorithm and the Levy walk method for a random search, and studies suggest that ES is an effective stochastic optimizer [134].
Segundo et al. have also developed Falcon's hunt optimization algorithm (FHOA) based on the hunting behavior of falcons, which is a robust and powerful stochastic population-based algorithm that needs the adjustment of a few parameters for its three-stage movement decision [135]. Khan et al. have also developed the eagle perching optimizer algorithm (EPOA), which mimics the perching nature of eagles and is based on exploration and exploitation [136]. Bald eagle search (BES) is another eagle-based algorithm that is based on the hunting strategy, or intelligent social behavior, of bald eagles as they search for fish. Simulation results confirm that the BES algorithm has comparable performance to conventional methods and advanced meta-heuristic algorithms [137].
Another group of algorithms were developed based on penguins. Penguins search optimization algorithm (PeSOA) is one such algorithm and is based on the collaborative hunting strategy of penguins [138]. Emperor penguin colony (EPeC) is another algorithm developed by Harifi et al. based on body heat radiation and the spiral-like movement
of emperor penguins in their colony [139]. Dhiman et al. have also developed emperor
penguin optimizer (EPeO), which mimics the huddling behavior of emperor penguins [140].
Other algorithms are developed based on swarms or the social behavior of birds,
such as migration. Among bird-based algorithms is chicken swarm optimization (CSO),
which mimics the intelligent hierarchical swarm behaviors in hens, roosters, and chicks
to optimize problems. Studies show that CSO achieves good robustness and accuracy in
optimization problems compared to popular meta-heuristic algorithms [141]. The bird
swarm algorithm (BSA) is another algorithm based on the interactions and swarm-intelligence behaviors of birds. Three predominant social behaviors in birds are foraging, vigilance,
and flight behavior. Birds increase their survival rate through social interactions by foraging
and escaping predators, for example [142]. Duman et al. have also developed migrating
birds optimization (MBO) based on the energy-saving V flight formation common in
bird migration [143]. Swallow swarm optimization algorithm (SWOA) is another similar
algorithm that models swallow swarm movement. SWOA has been shown to be highly efficient in flat areas and to offer local-extrema stagnation avoidance, good convergence speed, and intelligent particle participation [144]. Bird mating optimizer (BMO) is a similar algorithm
inspired by the mating strategies of birds. BMO creates optimum searching techniques
by breeding agents with better genes. BMO competes with other EAs in performance on
unimodal and multimodal benchmark functions [145].
Some algorithms are based on other bird behaviors, such as feeding and egg-laying.
The laying chicken algorithm (LCA) was developed by Hosseini based on the behavior
of chickens laying eggs [146]. Lamy has also developed artificial feeding birds (AFB)
inspired by food-searching behaviors in birds, which can be applied to many optimization
problems [147].
Numerous algorithms are based on different kinds of birds, such as pigeon-inspired
optimization (PIO), which is based on pigeon swarm behaviors [148]. Dhiman et al. have
also developed the seagull optimization algorithm (SOA), which is inspired by the migration and attacking behaviors of seagulls [149]. Satin bowerbird optimizer (SBO) was developed by Moosavi et al. based on the mating behavior of satin bowerbirds [150]. Jain et al. have
also developed the owl search algorithm (OSA), which is a population-based algorithm
inspired by the hunting mechanism of owls in the dark [151]. Another example is the
Egyptian vulture optimization algorithm (EVOA), which was developed based on the food
acquisition behaviors of Egyptian vultures [152]. Kestrel optimization algorithm (KOA) has been developed from the feeding behavior of a kestrel, which is a type of falcon [153].
Dhiman et al. have also developed the sooty tern optimization algorithm (STOA) based
on the migration and attacking behaviors of the sooty tern, which is a seabird [154]. The
raven roosting optimization algorithm (RROA), which was developed by Barbazon et al.
based on the social roosting and foraging behavior of the common raven, is another
example [155]. Andean Condor algorithm (ACA) is another example that is inspired by the
movement pattern of the Andean Condor when it searches for food [156]. Omidvar et al.
have also developed a PSO-like algorithm termed the see-see partridge chicks optimization
(SSPCO) by modeling the behavior of the see-see partridge chicks [157]. Hoopoe heuristic
optimization (HHO) is another bird-inspired algorithm developed by El-Dosuky et al. [158].
Urban pigeon-inspired optimizer (UPIO) was developed based on the foraging behavior of
groups of urban pigeons [159]. Another algorithm based on bats’ echolocation capabilities
is called the bat sonar optimization algorithm (BSOA) [160].
Figure 37. Bubble-net feeding method in humpback whales [161,162].
Whale Optimization Algorithm
WOA is based on the bubble-net feeding, the encirclement of prey, and prey search
behaviors in humpback whales. The whales encircle prey after the prey has been located.
WOA takes the location of the prey as the best candidate solution since the optimal design
location is not intrinsically known. Search agents will update their positions using the
best solution positions so far. The shrinking encircling mechanism and the spiral updating
position approaches are designed to model bubble-net foraging. In essence, shrinking
encircling is achieved by creating a spiral between the whale located at (X, Y) and the prey
located at (X*, Y*). The distance is first calculated, and the value is decreased. Humpbacks
simultaneously swim in a helical path and encircle the prey in a shrinking circle such that
the model updates the whale’s position as follows:
X(t + 1) = X*(t) − A·D          if p < 0.5
X(t + 1) = D′·e^(bl)·cos(2πl) + X*(t)          if p ≥ 0.5          (10)
where p is a number randomly chosen from the span [0, 1], and D′ = |X*(t) − X(t)|, which indicates the distance between the ith whale and the best prey/solution; the constant b defines the logarithmic spiral shape, and l is a number randomly generated from the span
[−1, 1]. WOA begins with a random solution set where the position of agents is updated
based on the position of a random agent or the best solution that has been found. It can
select between a circular or spiral movement based on the variable p. The optimization
process finishes when the termination criteria are met.
WOA competes with popular meta-heuristic methods based on its performance on sev-
eral benchmarking functions to test the local optima avoidance, convergence, exploration,
and exploitation behaviors [161].
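The shrinking-encircling and spiral-update choice governed by p can be sketched directly in code. The following Python sketch implements only the two branches of Eq. (10); it omits WOA's random-agent exploration branch, and the spiral constant b = 1, the bounds, and the sphere test function are illustrative assumptions.

```python
import numpy as np

def woa_step(X, X_best, a, rng, b=1.0):
    """One WOA position update per Eq. (10): shrinking encircling for
    p < 0.5, a logarithmic spiral toward the best solution for p >= 0.5."""
    X_new = np.empty_like(X)
    dim = X.shape[1]
    for i in range(X.shape[0]):
        if rng.random() < 0.5:
            r1, r2 = rng.random(dim), rng.random(dim)
            A = 2 * a * r1 - a               # |A| shrinks as a decays from 2 to 0
            C = 2 * r2
            D = np.abs(C * X_best - X[i])
            X_new[i] = X_best - A * D        # shrinking encircling
        else:
            l = rng.uniform(-1, 1)
            D_prime = np.abs(X_best - X[i])  # distance to the best solution
            X_new[i] = D_prime * np.exp(b * l) * np.cos(2 * np.pi * l) + X_best
    return X_new

def woa(f, dim=2, n=30, iters=300, lb=-5.0, ub=5.0, seed=1):
    """Driver: random initial agents, linearly decaying a, iteration budget
    as the termination criterion."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n, dim))
    best = X[int(np.argmin([f(x) for x in X]))].copy()
    for t in range(iters):
        a = 2 - 2 * t / iters
        X = np.clip(woa_step(X, best, a, rng), lb, ub)
        i = int(np.argmin([f(x) for x in X]))
        if f(X[i]) < f(best):
            best = X[i].copy()
    return best
```

Because every agent is pulled toward the incumbent best along either a contracting circle or a contracting spiral, the swarm collapses onto the best solution found so far.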
Krill Herd
Antarctic krill are among the most widely studied marine animals. Their herds exist on spatial scales of tens to hundreds of meters and on hour-to-day time scales. The herds have no parallel orientation and are mass collections of krill, and swarming behavior is a main characteristic of
krill. Krill herd density is diminished after predatory attacks, where individual krill are
removed from the swarm. Many parameters dictate the formation of a new krill herd after
an attack. The two main goals of herding krill are to reach food and increase the swarm
density. Krill herd (KH) is a meta-heuristic algorithm based on the herding of krill to solve
global optimization problems. KH emulates the herding behaviors of krill swarms during
specific environmental and biological events. The parameters for the KH algorithm have
been determined from a review of the literature of real-world studies. The distance from
krill to food and the highest density in the swarm serve as the fitness functions for the KH
algorithm. The position of the krill, which is time-dependent, is determined via the motion
caused by other krill, foraging, and random diffusion [163]. Krill herds are close to global
minima due to the objectives of increasing herd density and finding food such that the krill
approach is the best solution when looking to fulfill the two goals. Therefore, the objective
function is minimized when krill are close to food or at high densities of other krill [163].
The position of a krill in the KH algorithm which represents a candidate solution is
updated based on the maximum induction speed, the inertia weight, the target direction
from the best krill agent, and the attractive and repulsive effects between krill neighbors.
Krill neighbors are determined using a sensing distance parameter as illustrated in Figure 38.
The maximum induction speed determines the maximum distance that a krill can move
in each iteration. The inertia weight determines the degree to which the krill agent will
maintain its current direction. The target direction from the best krill agent guides the krill
toward the global optimum. The attractive and repulsive effects between krill neighbors
are determined by their fitness and position relative to each other. Krill agents with higher
fitness are attractive to others, while those with lower fitness are repulsive. These effects
influence the direction and speed of movement of each krill agent. From this aspect, it
shares the same concept with PSO [163]. Figure 39 shows the flowchart for KH.
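A single KH position update, combining the three motion components described above, can be sketched as follows. This is a simplified illustration: the sensing-distance heuristic, the coefficient values, and the non-inertial (memoryless) update are assumptions of this sketch, not the full KH formulation.

```python
import numpy as np

def krill_herd_step(X, fit, X_food, Ct=0.5, Nmax=0.01, Vf=0.02, Dmax=0.005,
                    rng=None):
    """One simplified KH update: motion induced by neighbors (fitter
    neighbors attract, worse ones repel), foraging toward the food
    position, and random physical diffusion. fit is minimized."""
    rng = rng if rng is not None else np.random.default_rng()
    n, dim = X.shape
    worst, best = float(fit.max()), float(fit.min())
    X_new = X.copy()
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        ds = d.mean() / 5.0                     # sensing-distance heuristic
        alpha = np.zeros(dim)
        for j in range(n):
            if 0.0 < d[j] < ds:
                # normalized fitness difference: fitter neighbor j gives K > 0,
                # so the unit vector toward j becomes attractive
                K = (fit[i] - fit[j]) / (worst - best + 1e-12)
                alpha += K * (X[j] - X[i]) / (d[j] + 1e-12)
        N = Nmax * alpha                        # neighbor-induced motion
        F = Vf * (X_food - X[i])                # foraging toward food
        D = Dmax * rng.uniform(-1, 1, dim)      # random diffusion
        X_new[i] = X[i] + Ct * (N + F + D)      # Ct: the single tuned interval
    return X_new
```

Iterating this step pulls the herd toward the food position while the neighbor term raises local density, mirroring the two objectives that drive the KH fitness.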
The related parameters should be tuned for meta-heuristic algorithms. KH uses
real-world coefficients to simulate krill behavior, so only the time interval, Ct , has to be
tuned. Thus, one of the greatest advantages that KH has compared to other algorithms,
is that it only has one parameter that must be tuned. Other advantages of using KH are
as follows: (1) the fitness of agents dictates the motion of other agents, and (2) a local search is conducted by each agent [163].
Figure 38. Schematic of the sensing ambit around a krill individual [163].
Artificial Fish Swarm
The artificial fish swarm algorithm (AFSA) can solve complex, high-dimensional, nonlinear problems independently of gradient information, with faster convergence and less parameter tuning. Artificial fish, denoted AF, mimic the behaviors of real-life fish through behaviors that include AF_Swarm, AF_Prey, AF_Follow, AF_Move, AF_Evaluate, and AF_Leap [165]. AFSA has lower convergence speed, higher time complexity, no memory, and difficulty differentiating between global and local searches, despite being one of the best swarm intelligence algorithms. Some of its advantages are global search ability, robustness, insensitivity to initial values, and parameter setting tolerance [166]. A comprehensive review of the fish swarm algorithm was done by Neshat et al., including its challenges and applications [165].
Another dolphin-inspired algorithm was developed by Yong et al. based on dolphin swarms, called the dolphin swarm optimization
algorithm (DSOA) [180]. Serani et al. have also developed the dolphin pod optimization
algorithm (DPOA) based on a simplified social model of a dolphin pod in search of food
for unconstrained single-objective minimization [181].
Other major groups of aquatic-animal-based algorithms are inspired by aquatic preda-
tors such as sharks and whales. Shark smell optimization method (SSOM) mimics the
olfactory-based prey-location behavior employed by sharks. The capabilities of SSOM
have been validated by finding the solution to load frequency control problems in electrical
power systems [182]. Ebrahimi et al. have also developed the sperm whale algorithm (SWA),
a population-based optimization technique that mimics the lifestyle of sperm whales. SWA
uses the worst and best answers to reach the optimum [183]. Marine predators algorithm
(MPA) is another similar algorithm that is inspired by the Levy and Brownian foraging
movements and the maximum encounter rate of prey for marine predators [184]. Biyanto
et al. have developed the killer whale algorithm (KWA) based on the life of the killer whale.
Studies show that KWA outperformed algorithms such as GA, imperialist competitive
algorithm (ICA), and SA in black box optimization benchmarking [185]. Whale swarm
algorithm (WSA) was developed by Zeng et al. based on the whales’ ultrasonic communi-
cation and hunting behavior. WSA has been compared with several popular meta-heuristic
algorithms on comprehensive performance metrics, and the results show that WSA has
a competitive performance compared to other algorithms [186]. Masadeh et al. have also
developed the vocalization of humpback whale optimization algorithm (VHWOA), which
mimics the humpback whale behavior of vocalization and is used in cloud computing
environments to improve task scheduling. A multi-objective model is the basis for the VHWOA scheduler, which maximizes resource usage and reduces makespan, energy consumption, and cost [187].
There are numerous other algorithms inspired by aquatic animal behavior, such as
the artificial algae algorithm (AAA), which was developed by modeling the living behav-
iors of the photosynthetic species microalgae [188]. Coral reefs optimization algorithm
(CROA) was developed based on the growth, reproduction, and fighting behaviors of
coral reefs [189]. Eesa et al. have also developed the cuttlefish algorithm (CA) based on
color-changing in cuttlefishes [190]. Another aquatic-based algorithm is mussels wandering
optimization (MWO), developed by An et al. MWO simulates mussels’ leisurely locomotion
behavior in forming bed patterns in their habitat [191]. Tunicate swarm algorithm (TSA)
was developed based on jet propulsion and swarm behaviors of tunicates during their
navigation and foraging process by Kaur et al. [192]. Masadeh et al. have also developed
the sea lion optimization algorithm (SLOA) based on the hunting behavior of sea lions and
their whiskers which are used to detect prey [193]. Another example of a marine-based
algorithm is the barnacles mating optimizer algorithm (BMOA) which was developed
by Sulaiman et al. based on the behavior of the barnacle [194]. Another fish-inspired
algorithm is the anglerfish algorithm (AA) which was developed based on the mating
behavior of anglerfish [195]. Circular structures of puffer fish (CSPF) was developed by
Catalbas et al. and is inspired by the courting behaviors of male puffer fish on females [196].
Pontogammarus maeoticus swarm optimization (PMSO) is also another example inspired
by the foraging behavior of Pontogammarus [197]. The last example in this group is the
water-tank fish algorithm (WTFA) developed by Sukoon et al. [198].
Terrestrial Animals-Based
Terrestrial animals are one of the most popular species in developing nature-inspired
algorithms. They form around 20 percent of nature-inspired algorithms. The most popular
algorithms in this category based on citations are the grey wolf optimizer (GWO), cat
swarm optimization (CSO), and lion optimization algorithm (LOA). Figure 41 illustrates
the most cited terrestrial animal-based algorithms.
Figure 41. Most popular terrestrial animal-based algorithms.
Grey Wolf Optimizer
Grey wolves (Canis lupus) are top-of-the-food-chain apex predators. Grey wolves live in hierarchical packs of about 5–12 wolves led by male and female alphas. The alphas dictate where the pack hunts, sleeps, etc. The beta wolves are subordinate to the alphas and aid in pack decision-making. The lowest-hierarchy wolves are the omegas, who are often the scapegoat, eat last, and must submit to the alphas and betas. Delta or subordinate wolves represent the rest of the pack. The delta wolves submit to the alpha and beta wolves but not the omegas [199]. Figure 42 illustrates the hierarchy of grey wolves.
Figure 42. Hierarchy of gray wolves [200].
Grey wolves exhibit other social behaviors, such as group hunting. Wolves hunt in the following phases: track-chase-approach, pursue-encircle-harass, and attack, which are illustrated in Figure 43 [199].
The mathematical model describing the encircling behavior of grey wolves is introduced as follows:
D = |C · Xp(t) − X(t)|          (11)
X(t + 1) = Xp(t) − A · D          (12)
where t describes the iteration, A and C are coefficient vectors, Xp describes the prey position vector, and X describes the wolf position vector. The A and C vectors are calculated as follows:
A = 2a · r1 − a          (13)
C = 2r2          (14)
where a is decreased linearly from 2 to 0 as the iterations carry out, and r1, r2 are vectors assigned random numbers in the span [0, 1]. Grey wolves can find and encircle prey. The alpha normally leads the hunt, whereas the beta sometimes participates. However, the location of the optimum solution (prey) is not known in a search space. Therefore, it is assumed that the alphas, betas, and deltas know of a better solution than the omegas. The positions of all of the wolves are updated based on the solutions of the alphas, betas, and deltas. This behavior is mathematically simulated as follows:
Dα = |C1 · Xα − X|, Dβ = |C2 · Xβ − X|, Dδ = |C3 · Xδ − X|          (15)
X1 = Xα − A1 · Dα, X2 = Xβ − A2 · Dβ, X3 = Xδ − A3 · Dδ          (16)
X(t + 1) = (X1 + X2 + X3)/3          (17)
Grey wolf hunts end when the pack attacks the prey, which is mathematically simulated by decreasing the values of a and A. A is a value randomly chosen from the span [−2a, 2a], where a decrements from 2 to 0. The search agent's next position can be anywhere between itself and the prey when A is in [−1, 1]. Thus, GWO compels the search agents to advance toward the prey using the alpha, beta, and delta solutions as a guide, at the cost of local solution stagnation. The stagnation can be remedied by including more operators to mediate exploration [199].
GWO has been tested with 25 benchmark functions, and the results showed that it competed well with popular heuristics like GSA, DE, PSO, ES, and EP. GWO exhibited superior exploitation on the unimodal test functions and displayed exploratory ability and local optima avoidance on the multimodal test functions. Additionally, GWO's capability to converge was displayed. The GWO algorithm has displayed good performance in unknown and challenging search spaces [199].
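The leader-guided update equations described above translate directly into code. The following minimal Python sketch moves each wolf to the average of the positions suggested by the alpha, beta, and delta wolves; the search bounds, population size, and the sphere test function are illustrative assumptions.

```python
import numpy as np

def gwo(f, dim=2, n_wolves=20, iters=200, lb=-5.0, ub=5.0, seed=0):
    """Sketch of GWO: D = |C*leader - X|, candidate = leader - A*D for each
    of the alpha/beta/delta leaders, and the new position is their mean."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n_wolves, dim))
    for t in range(iters):
        fit = np.apply_along_axis(f, 1, X)
        order = np.argsort(fit)
        alpha, beta, delta = X[order[0]].copy(), X[order[1]].copy(), X[order[2]].copy()
        a = 2 - 2 * t / iters                   # a decays linearly from 2 to 0
        for i in range(n_wolves):
            candidates = []
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2 * a * r1 - a              # A drawn from [-2a, 2a]
                C = 2 * r2                      # C drawn from [0, 2]
                D = np.abs(C * leader - X[i])   # distance to the leader
                candidates.append(leader - A * D)
            X[i] = np.mean(candidates, axis=0)  # average of the three guides
        X = np.clip(X, lb, ub)
    fit = np.apply_along_axis(f, 1, X)
    return X[int(fit.argmin())], float(fit.min())
```

As a falls below 1, |A| < 1 becomes likely and the pack contracts onto the leaders, reproducing the attack phase; larger |A| early on keeps the search exploratory.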
Figure 44. Flowchart of the SFLA.
SFLA and GA performance were compared to one another in a series of tests. SFLA outperformed or performed equally compared to GA in 2 applications and 11 theoretical test functions on nearly all tests. SFLA demonstrated higher robustness in locating the global optima. The results of four engineering problems were compared with results from the literature for other optimization algorithms. SFLA was proven to be robust and have a fast convergence speed. SFLA shows encouraging results as a robust meta-heuristic process. SFLA seems to perform well in solving mixed-integer problems despite being developed for use in combinatorial problems. Similar to other GAs, SFLA may be a promising candidate for parallelization [201]. Due to SFLA's promising performance, improved versions of this algorithm have been developed [202–204].
Cat Swarm Optimization
Cats maintain a high level of alertness even when they are at rest. In cat swarm optimization (CSO), the seeking and tracing behaviors of cats are mimicked. In the algorithm, the agent cats are described by their dimensional positions, M, where every positional dimension has a velocity, a value of fitness, and a seeking/tracing flag. The best solution that is found by a cat is kept until all of the iterations are completed. Four modes are defined for the seeking behavior: seeking a range of dimensions (SRD), seeking memory pool (SMP), self-position consideration (SPC), and count of the changing dimensions (CDC) [205].
The memory size (seeking points) for the seeking behavior of the cats is defined by SMP. From the memory pool, a cat picks a point to explore. The mutative ratio for memory is defined via the SRD. SRD also dictates that when a dimension mutates, the new value will not be outside of a predetermined range. The number of dimensions that vary is dictated by CDC. A point occupied by a cat can be determined as a candidate solution by the Boolean variable defined by SPC. SMP is not influenced by SPC [205].
Targets are traced by the cats in the tracing mode, where cats move based on the velocity of their positions. Cats spend most of their time resting while remaining alert; when resting, a cat stays in the same position or moves cautiously and deliberately. Cats then chase their target based on real-world cats' running behaviors. The mixture ratio (MR) is thus set to a small value to ensure cats mostly seek [205]. Figure 45 illustrates the flowchart of the CSO.
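The seeking and tracing behaviors described above can be sketched as follows. This is a simplified illustration: the velocity inertia term in the tracing update, the multiplicative form of the SRD mutation, and the sphere test function are assumptions of this sketch rather than part of the original CSO definition.

```python
import numpy as np

def cat_swarm(f, dim=2, n_cats=20, iters=150, mr=0.2, smp=5, srd=0.2,
              cdc=0.8, lb=-5.0, ub=5.0, seed=0):
    """CSO sketch: an MR fraction of cats trace (velocity pursuit of the
    best cat); the rest seek (evaluate SMP mutated copies, keep the best).
    The best solution found by any cat is kept across all iterations."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n_cats, dim))
    V = np.zeros((n_cats, dim))
    gbest, gval = None, np.inf
    for _ in range(iters):
        fit = np.apply_along_axis(f, 1, X)
        i0 = int(fit.argmin())
        if fit[i0] < gval:
            gbest, gval = X[i0].copy(), float(fit[i0])
        tracing = rng.random(n_cats) < mr          # MR small: most cats seek
        for i in range(n_cats):
            if tracing[i]:
                # tracing mode: damped velocity pursuit of the best cat
                V[i] = 0.7 * V[i] + rng.random(dim) * 2.0 * (X[i0] - X[i])
                X[i] = np.clip(X[i] + V[i], lb, ub)
            else:
                # seeking mode: SMP candidate copies; a CDC fraction of the
                # dimensions mutates by at most +/- SRD of the current value
                copies = np.repeat(X[i][None, :], smp, axis=0)
                mask = rng.random((smp, dim)) < cdc
                copies += mask * copies * rng.uniform(-srd, srd, (smp, dim))
                # SPC: the current position stays in the candidate pool
                cand = np.vstack([X[i][None, :], np.clip(copies, lb, ub)])
                X[i] = cand[int(np.argmin(np.apply_along_axis(f, 1, cand)))]
    return gbest, gval
```

Keeping the current position in the seeking candidate pool (the SPC flag) makes seeking cats monotonically non-worsening, while the tracing cats supply the PSO-like global pull noted in the text.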
The BM algorithm was compared to other meta-heuristic strategies such as BBO, GSA, PSO, and
ABC. The BM algorithm displayed a competitive performance against the other algorithms.
BM also shows promising results in optimization problems with unidentified or restricted
search spaces [209].
Lions have inspired various optimization algorithms, such as the lion optimization
algorithm (LOA) developed by Yazdani et al. LOA is based on the lifestyle and cooperation
characteristics of lions. Benchmark simulation results show that LOA outperforms similar
algorithms [210]. The lion’s algorithm (TLA) mimics the social behaviors of lions. Optimal
solutions are found in the search space through the interpretation of lion social orders. TLA
uses binary and integer structured agents to solve single and multivariable cost functions.
TLA can perform depending on different sizes of the search space [211]. Wang et al. have
developed the lion pride optimizer (LPO) based on lion group theory and pride evolution.
The state of each lion contributes to the overall pride’s health which exists simultaneously
with competition in and between the male lions of a pride. LPO incorporates the dominant
behavior of lion breeding to solve optimization problems. The alpha lion has access to
most of the breeding resources. When an alpha lion is replaced, the new group of lions
eliminates the cubs bred by the last alpha, which aids in finding the optimum solution.
Studies show that LPO is not sensitive to parameter tuning, displaying LPO's robustness
as an optimization algorithm [212]. Kaveh et al. have also developed a similar algorithm
called the lion pride optimization algorithm (LPOA) [213].
Hunting predator animals and their behaviors are another source of inspiration for
developing nature-inspired algorithms. One such popular hunter is the wolf, which was
studied in the GWO section. Wolf search (WS) [214], wolf pack algorithm (WPA) [215], and
dominion optimization algorithm (DOA) [216] are three examples of wolf-based algorithms
which have various improved versions [217]. Another terrestrial-hunter-based algorithm
is the spotted hyena optimizer (SHO) which is based on the social relationship between
spotted hyenas as well as their collaborative behavior [218]. Another example is coyote
optimization (CO) which is inspired by the Canis latrans species [219]. Polap et al. have
developed the polar bear optimization algorithm (PBOA) based on the hunting techniques
of polar bears in harsh arctic conditions [220]. Another hunting-animal-based algorithm
is the cheetah-based optimization algorithm (CbOA) which was developed based on the
social behaviors of cheetahs [221]. Another similar algorithm is the cheetah chase algorithm
(CCA) [222]. Jaguar algorithm (JA) developed by Chen et al. can also be mentioned in this
section. JA mimics the hunting behaviors of jaguars in the exploitation and exploration
phases of the optimization process [223]. African wild dog algorithm (AWDA) is another
example inspired by the communal hunting behavior of the dogs [224]. Finally, military dog
optimizer (MDO) can be mentioned in this group. MDO models the searching capability of
trained military dogs to solve optimization problems [225].
Humans also fall into this category of algorithms. Zhang et al. have developed the
human-inspired algorithm (HIA), which mimics the use of binoculars and cell phones by
mountain climbers to find the highest mountain in the range [226].
There are some algorithms based on elephants, such as elephant herding optimization
(EHO), which was developed based on the herding behavior of elephant groups [227]. The
elephant search algorithm (ESA) is based on similar concepts [228].
Some algorithms are inspired by squirrels. One example is the squirrel search algo-
rithm (SSA) developed by Jain et al. based on the dynamic foraging behavior of flying
squirrels and their gliding motion [229]. Another example is flying squirrel optimizer (FSO)
which uses a similar approach and considers social connections between squirrels [230].
Meerkats have also been considered in designing optimization algorithms. The meerkat-
inspired algorithm (MIA) was developed based on the behaviors of meerkats [231]. Meerkat
clan algorithm (MCA) uses a similar approach as the MIA since it models the personal and
social behaviors of meerkats to solve optimization problems [232].
There are many algorithms in this group based on different species. The sheep flocks
heredity model algorithm (SFHMA) models the heredity of sheep flocks in a prairie [233].
Drones 2023, 7, 427 52 of 134
Another similar algorithm is the shuffled shepherd optimization algorithm (SSOA) which
imitates the behavior of a shepherd [234]. Another example is the camel optimization algo-
rithm (COA), which is inspired by camels’ behaviors while traveling through a harsh desert
environment [235]. Motevali et al. have developed wildebeests herd optimization (WHO)
based on the path-planning behavior of African wildebeests during migration [236]. The
side-blotched lizard algorithm (SBLA) is also another example developed by Maciel et al.
based on mating strategies and population balance in side-blotched lizards [237]. Another
terrestrial-animal-based algorithm is the raccoon optimization algorithm (ROA) which is
developed based on the rummaging behaviors of real raccoons for food [238]. Tian et al.
have developed an algorithm based on the habitual characteristics of the rhinoceros called
the rhinoceros search algorithm (RSA) [239]. Xerus optimization algorithm (XOA) was
developed based on cape ground squirrels’ lifestyle in groups [240]. Even earthworms
have been used by Wang et al. to solve optimization problems. Earthworm optimization
algorithm (EOA) models the reproduction of earthworms to generate a population during
the optimization process [241].
The red deer algorithm (RDA) is another example developed by Fathollahi et al. based
on the mating behavior of the Scottish red deer [242]. Taherdangkoo et al. have also
developed the blind naked mole-rats algorithm (BNMRA) based on the social behavior
of the blind naked mole-rats colony in finding food and protecting themselves against
enemies [243]. Another example is rhino herd (RH) which is a swarm-based meta-heuristic
algorithm based on the herding behavior of rhinos [244]. The donkey and smuggler
optimization algorithm (DSOA) is also another example based on the searching behavior of
donkeys [245]. The African buffalo optimization (ABO) algorithm can also be mentioned,
which was developed by Odili et al. based on the African buffalo’s strategy in searching
for pastures [246]. The last algorithm in this list is jumping frogs optimization (JFO),
a PSO-based algorithm whose agents make random frog-like jumps, which helps the
algorithm find an optimal solution faster [247].
2.1.9. Plant-Based
In this category, plant-based algorithms based on agriculture are reviewed. A majority
of these plant-based algorithms are based on trees. There are also algorithms based on
flowers, fruits, and other kinds of plants. The most popular plant-based algorithm based
on citations is the flower pollination algorithm. Figure 46 illustrates the most popular
plant-based algorithms based on their citations.
Figure 47. The pollinators and pollination types [249].
In the flower pollination algorithm (FPA), it is assumed that each plant only has one
flower that only produces one pollen gamete for simplicity. Therefore, a solution Xi is
equivalent to a flower and/or a pollen gamete. Global and local pollination are the two
main steps in the algorithm. In the global pollination step, pollinators carry pollen over a
long distance, which ensures the fertilization and reproduction of the best solution, which
is represented as g∗. The first rule and flower constancy can be mathematically represented
as follows:

Xi (t + 1) = Xi (t) + L (Xi (t) − g∗) (18)

where Xi (t) represents the ith pollen or Xi, the solution vector at the t-th iteration, and g∗
represents the best solution found so far among the current population. The parameter L
represents the pollination strength. A Levy flight is simulated to mimic the efficiency of a
pollinator’s long-distance flight. From the Levy distribution, L > 0 is drawn:

L ∼ (λ Γ(λ) sin(πλ/2) / π) (1 / s^(1+λ)), (s ≫ s0 > 0) (19)
where Γ(λ) represents the gamma function, which is valid for large steps s > 0. Flower
constancy and local pollination are represented as follows:

Xi (t + 1) = Xi (t) + e (Xj (t) − Xk (t)) (20)

where Xj (t) and Xk (t) represent pollen particles of the same plant from different flowers,
which emulates floral constancy in a limited population. The pollen particles Xj (t) and Xk (t)
essentially perform a local random walk when e is between [0, 1] since they come from the
same population. Pollination occurs both globally and locally. Local pollen is more likely to
pollinate neighboring flowers than those that are far away. To emulate this proximity effect
in the population, a switch probability p is used to choose between intense local pollination
and global pollination [248].
The simulation results have shown that the FPA algorithm can outperform GA and
PSO, and it features an exponential convergence rate. FPA is efficient because of flower
constancy and long-distance pollination. Pollinators can leave the local search space to
essentially explore the overall search space. FPA converges quickly because the same
flowers/solutions are frequently chosen due to flower constancy. The efficiency of the
algorithm is ensured through the relationship between the components and g∗ (the best
solution) [245]. The FPA has only a few parameters that must be tuned. Therefore, FPA is
applicable in many optimization areas. FPA needs some improvement to eliminate the
time cost and premature convergence [249].
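As a concrete illustration, the two pollination steps of Eqs. (18)–(20) can be condensed into a short minimization loop. This is a minimal sketch rather than the reference implementation of [248]: the Mantegna sampler for the Levy steps, the greedy replacement rule, and the defaults (25 flowers, p = 0.8) are illustrative assumptions.

```python
import math
import numpy as np

def levy_step(dim, lam=1.5, rng=None):
    """Mantegna's algorithm for symmetric Levy-stable steps with exponent lam."""
    rng = rng if rng is not None else np.random.default_rng()
    sigma = (math.gamma(1 + lam) * math.sin(math.pi * lam / 2) /
             (math.gamma((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2))) ** (1 / lam)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / lam)

def flower_pollination(f, lo, hi, n_flowers=25, p=0.8, iters=200, seed=0):
    """Minimize f over the box [lo, hi]; one flower = one candidate solution."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    X = rng.uniform(lo, hi, (n_flowers, lo.size))
    fit = np.apply_along_axis(f, 1, X)
    g_star = X[fit.argmin()].copy()                    # best solution found so far
    for _ in range(iters):
        for i in range(n_flowers):
            if rng.random() < p:                       # global pollination, Eq. (18)
                cand = X[i] + levy_step(lo.size, rng=rng) * (X[i] - g_star)
            else:                                      # local pollination, Eq. (20)
                j, k = rng.choice(n_flowers, size=2, replace=False)
                cand = X[i] + rng.random() * (X[j] - X[k])
            cand = np.clip(cand, lo, hi)
            fc = f(cand)
            if fc < fit[i]:                            # greedy replacement of weaker flowers
                X[i], fit[i] = cand, fc
        g_star = X[fit.argmin()].copy()
    return g_star, float(fit.min())
```

Because the Levy steps are heavy-tailed, occasional long jumps let the population escape local basins, while the local step performs the flower-constancy random walk of Eq. (20).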
Invasive Weed Colonization
A plant is classified as a weed if it mainly grows in a human-occupied territory without
being cultivated directly. Weeds cause troubles in the agricultural sector since they adapt
to their environment and change in order to increase their own fitness [250]. Figure 48
illustrates examples of weeds.
Figure 48. Some kinds of species that are usually considered as weeds: (a) dandelion [251],
(b) burdock [252], (c) amaranth or pigweed [253].
The invasive weed colonization (IWC) algorithm has four main stages: initializing a
population, reproduction, spatial dispersal, and competitive exclusion (where the best
individuals are chosen). Initial solutions are disseminated in random positions over a
problem space with d dimensions. Seed production is based on the fitness of a plant and a
colony. For example, the least fit plant in a colony produces the minimum number of seeds,
while the fittest plant produces the most seeds. A unique feature of IWC is the fact that
while the fittest plants are allowed to reproduce, less fit plants are still allowed to reproduce
in the case that their offspring is a useful solution. The seeds are randomly disseminated
over the dimensions of the search space via random and normally distributed numbers with
changing variance and a mean of zero. In every generation, the standard deviation (SD),
σ, will be reduced from σinitial to σfinal in every step (generation). A nonlinear alteration
provided adequate performance and is shown as follows:

σi = ((N − i)^n / N^n) (σinitial − σfinal) + σfinal (21)
where N describes the total iteration count, σi represents the current SD, and n represents
the nonlinear modulation index, where the best value for n is 3 based on simulation results.
The above changes ensure that the probability of seeds dropping far away decreases
linearly to cluster the fitter plants and phase out the less fit plants [250]. The population of
plants must be regulated so that one species does not invade the entire search space. Thus,
competition between plants is necessary. When pmax, the maximum plant limit in a colony,
is reached, less fit plants start to be eliminated. When a colony reaches the maximum
number of plants, the current population disseminates its seeds according to the
aforementioned mechanisms.
When the seeds find their position, all weeds (parents and children alike) are ranked, and
less fit plants are eliminated until the maximum number of plants is achieved. Thus,
even low-fitness plants have the opportunity to reproduce [250]. Figure 49 illustrates the
flowchart of the IWC algorithm.
Figure 49. Flowchart of the IWC algorithm [250].
The feasibility and efficiency of IWC for the optimization of two examples have been
compared to four recent evolutionary algorithms: GA, MA, PSO, and SFLA. It was shown
that IWC locates minima rapidly compared to other methods. IWC also escapes from local
optima and can solve non-differentiable complex objective functions. IWC performed at a
satisfactory level in the test functions and competed with other evolutionary algorithms.
When increasing the plants in a set, the mean of the solution increases, but the percentage
of success stays the same. The behavior of IWC seems to be optimal when the minimum
and maximum seed numbers are set to zero and 2, respectively [250].
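The four IWC stages can be condensed into the following sketch. Only the σ schedule of Eq. (21), the seed limits of zero and 2, and the ranking-based competitive exclusion follow the description above; the linear fitness-to-seed-count mapping, the function names, and the remaining parameter defaults are illustrative assumptions.

```python
import numpy as np

def iwc(f, lo, hi, n_init=10, p_max=15, seed_min=0, seed_max=2,
        sigma_initial=1.0, sigma_final=0.01, n_gen=100, n_mod=3, seed=0):
    """Minimize f over the box [lo, hi] with invasive weed colonization."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    pop = rng.uniform(lo, hi, (n_init, lo.size))          # initial weeds
    for i in range(n_gen):
        cost = np.apply_along_axis(f, 1, pop)
        # Eq. (21): nonlinear decay of the dispersal SD over the N = n_gen steps
        sigma = ((n_gen - i) / n_gen) ** n_mod * (sigma_initial - sigma_final) + sigma_final
        # fitness-proportional seeding: worst plant -> seed_min, best -> seed_max
        worst, best = cost.max(), cost.min()
        ratio = (worst - cost) / (worst - best) if worst > best else np.ones_like(cost)
        n_seeds = np.floor(seed_min + ratio * (seed_max - seed_min)).astype(int)
        # spatial dispersal: seeds scattered around parents with zero-mean normal noise
        seeds = [plant + rng.normal(0.0, sigma, lo.size)
                 for plant, k in zip(pop, n_seeds) for _ in range(k)]
        if seeds:
            pop = np.vstack([pop, np.clip(np.asarray(seeds), lo, hi)])
        # competitive exclusion: rank parents and children together, keep the fittest
        if len(pop) > p_max:
            pop = pop[np.argsort(np.apply_along_axis(f, 1, pop))[:p_max]]
    cost = np.apply_along_axis(f, 1, pop)
    return pop[cost.argmin()], float(cost.min())
```

The shrinking dispersal SD reproduces the clustering behavior noted above: early generations scatter seeds widely for exploration, while late generations drop seeds close to the fitter parents.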
Other Algorithms
A large group of plant-based algorithms is based on trees and forests. Tree seed
optimization algorithm (TSOA) is a popular optimizer based on tree and seed relationships.
The location of feasible solutions in the n-dimensional search space is represented by the
position of seeds and trees. Each tree creates one or more seeds, and seeds with better
fitness ratings replace trees with low fitness values. A tree location or the best solution
is considered for a new tree even though new locations are produced for seeds, which
is controlled using the search tendency (ST) parameter for a certain iteration count. The
exploitation and exploration abilities of the TSOA are balanced. When tested on numeric
function optimization, TSOA outperformed similar meta-heuristic algorithms such as ABC,
PSO, HS, FA, and BA and can be utilized for multilevel thresholding functions [254].
Ghaemi et al. have developed the forest optimization algorithm (FOA), which mimics
trees that live for decades in the forest. FOA was developed to solve continuous nonlinear
optimization functions. FOA simulates the spread of seeds by trees, whether they are
deposited directly underneath a canopy, spread across a search space, or eaten by animals.
The results of the experiments have shown the acceptable performance of FOA compared to
GA and PSO [255]. The tree growth algorithm (TGA) is another algorithm that is inspired
by trees competing for light and food. Convergence analysis and significance tests via
nonparametric techniques have confirmed the efficiency and robustness of TGA. According
to the experimental tests, TGA can be considered a successful meta-heuristic method and is
suitable for optimization problems [256]. Li et al. have also developed the artificial tree
algorithm (ATA), which is inspired by the growth law of trees. In ATA, the design variable
is the branch position. The branch thickness is a solution indicator, and the branch itself
is the solution. Updating the branches and organic material transport models is the main
computing process of ATA. Based on simulation results, ATA is effective in dealing with
various problems [257]. Natural forest regeneration (NFR) was developed by Moez et al.
based on the natural behavior of the forests against the rapidly changing environment. This
phenomenon is combined with the natural regeneration behaviors of forests [258].
Other algorithms are inspired by fruit such as strawberries. Plant propagation opti-
mization algorithm (PPOA) is an example of a fruit-inspired algorithm that mimics the
way strawberries and other plants propagate [259]. Merrikh-Bayat developed the straw-
berry algorithm (SA). Plants—such as strawberry plants—grow roots and runners to find
minerals and water and for propagation purposes. Runners and roots can be used as
tools for global and local searches. SA displays three main differences from other
nature-inspired optimization methods: information exchange isolation between agents,
duplication-elimination of agents during all iterations, and forcing all agents to move in
small or large magnitudes. SA has the advantage of only having one parameter that needs
to be fine-tuned. Simulations have shown that SA can very effectively solve complicated
optimization problems [260]. Another similar algorithm is the mushroom reproduction
optimization (MRO) algorithm, which was created by Bidar et al. to mimic the growth
and reproduction behaviors of mushrooms in nature. Spores explore the search space to
find rich areas to develop a colony. The experimental results have confirmed the ability
of MRO to deal with complex optimization problems by discovering solutions with better
quality [261].
Some algorithms are developed based on agriculture, such as farmlands and related
materials. For example, the farmland fertility algorithm (FFA) can be mentioned here. FFA
uses external and internal memory as well as partitioning farmland into different zones
to find an optimal solution. Simulations have shown that the FFA often performs better
than other meta-heuristic algorithms such as ABC, FA, HS, PSO, DE, BA, and the improved
PSO [262]. The paddy field algorithm (PFA) [263], fertile field optimization algorithm
(FFOA) [264], and targeted showering optimization (TSO) [265] are other examples of
agriculture-inspired algorithms.
Other plant-based algorithms are based on root growth, such as the runner-root algo-
rithm (RRA) [266], root tree optimization algorithm (RTOA) [267], root growth algorithm
(RGA) [268], and root mass optimization (RMO) algorithm [269]. These algorithms work
based on a model of the root’s growth in plants and trees, which strives to find rich soil
and mineral sources. RMO is inspired by the Roger Newson growth model. RGA uses the
L-system model, and other algorithms employ different approaches [268,269].
Other algorithms are developed based on the growth and reproduction of the dif-
ferent plants. Plant growth optimization (PGO) uses a model for plant’s growth which
includes leaf growth, branches, spatial occupation, and phototropism [270]. Physarum
optimization (PO) is a high-parallelism algorithm developed for minimum path finding
and uses a computation model based on the slime mold Physarum polycephalum [271].
An algorithm similar to PO is the Physarum-energy optimization algorithm (PEOA) which
uses Physarum’s energy and biological model [272]. Another example is the saplings
growing up algorithm (SGA) which is developed based on the sowing and development of
saplings [273]. Sulaiman et al. have developed the seed-based plant propagation algorithm
(SBPA) based on seed dispersion caused by birds and animals [274]. Another plant-based
algorithm is the artificial plant optimization algorithm (APOA) which was developed
for solving constrained optimization problems [275]. The waterweeds algorithm (WA) is
another algorithm that is based on the reproduction principle of waterweeds searching
for water sources [276]. Gowri et al. have developed the bladderworts suction algorithm
(BSA), which is a plant-intelligence-inspired algorithm based on the foraging and suction
mechanism of bladderworts [277]. Photosynthetic algorithm (PA) was also developed by
Okayama et al. based on photosynthesis in plants [278].
2.2. Ecosystem-Based
In this section, ecosystem and environment-inspired algorithms will be explored. The
algorithms are inspired by natural phenomena such as the water cycle, sun, wind, rain,
and general mechanics found in the ecosystem. Figure 50 illustrates the most popular
ecosystem-based algorithms based on their citations.
WCA can handle many constraints based on the efficiency of the algorithm compared
to other popular optimization methods. WCA has a lower computational cost and generally
obtains better solutions than popular optimizing methods such as DE, PSO, and GA.
The complexity of the problem determines the quality of the solution and the computa-
Figure 53. Most popular social-based algorithms.
vi (t + 1) = vi (t) + c1 (pi − xi (t)) R1 + c2 (g − xi (t)) R2 (26)
where pi represents the best position the particle has found, and g is the best position found
globally by all of the particles. The magnitude of the steps taken by the particles in search
of local or global optima is controlled by c1 and c2, called the acceleration constants, which
are in the range 0 ≤ c1, c2 ≤ 4. The acceleration coefficients are also called the cognitive
and social coefficients. The velocity update rule is influenced by the acceleration constants
stochastically via R1 and R2, which are diagonal matrices composed of random numbers
from the span [0, 1]. Since the trajectories are influenced by the stochastic weighting of the
social and cognitive terms as well as the attraction to the local and global optima, they are
considered semi-random [296]. Figure 54 illustrates the flowchart of the PSO.
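Under illustrative defaults, the update of Eq. (26) can be sketched as a minimization loop. The velocity clamp below is a common practical safeguard against velocity explosion and is not part of Eq. (26) itself; all parameter values are assumptions for the sketch.

```python
import numpy as np

def pso(f, lo, hi, n_particles=30, c1=2.0, c2=2.0, iters=200, seed=0):
    """Minimize f over the box [lo, hi] with the velocity update of Eq. (26)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    v_max = 0.2 * (hi - lo)                      # velocity clamp (practical addition)
    x = rng.uniform(lo, hi, (n_particles, lo.size))
    v = np.zeros_like(x)
    p_best = x.copy()                            # per-particle best positions p_i
    p_cost = np.apply_along_axis(f, 1, x)
    g_best = p_best[p_cost.argmin()].copy()      # global best position g
    for _ in range(iters):
        r1 = rng.random(x.shape)                 # diagonal entries of R1 in [0, 1]
        r2 = rng.random(x.shape)                 # diagonal entries of R2 in [0, 1]
        v = v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)   # Eq. (26)
        v = np.clip(v, -v_max, v_max)
        x = np.clip(x + v, lo, hi)
        cost = np.apply_along_axis(f, 1, x)
        improved = cost < p_cost                 # update cognitive memories
        p_best[improved], p_cost[improved] = x[improved], cost[improved]
        g_best = p_best[p_cost.argmin()].copy()  # update social memory
    return g_best, float(p_cost.min())
```

The two random diagonal matrices appear here as elementwise uniform weights, which is what makes each particle's trajectory semi-random rather than deterministic.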
PSO has gained attention from researchers because of the relative simplicity of the
algorithm and the fact that PSO does not assume that the optimization function is
continuous or differentiable. Initial suggestions to improve PSO were to use different
topologies, but the problem has idiosyncratic topologies. PSO struggled with converging
after researchers began widely using the algorithm. PSO was combined with other
algorithms, and other parameters were added to ameliorate the convergence issues. The
most relevant applications solved using PSO were multimodal and constrained
multi-objective optimization problems [295].
Figure 54. Flowchart of the PSO [296].
2.3.2. Teaching–Learning-Based Optimization
Teaching–learning-based optimization (TLBO) mimics the influence of a teacher on
learners in a class. TLBO considers grades as an output. Learners usually absorb
information from teachers, who are vastly knowledgeable. The outcomes of the learners
depend on the quality of the teacher. A better teacher produces learners that receive better
grades and perform well [297].
In TLBO, two teachers, T1 and T2, have different classes where the learners have the
same merit, and they each teach the same material. Figure 55 shows the grade distribution
of the learners evaluated by the teachers of the different classes. Curve-1 and Curve-2 are
the distributions of the learners’ grades in the classes taught by T1 and T2. The grades are
assumed to have a normal distribution, but in real life, the results may be skewed [297].
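A compact sketch of TLBO as a minimization loop is given below. The teacher-phase update (moving a learner by the difference between the teacher and TF times the class mean) and the pairwise learner-phase interaction with greedy acceptance follow the standard TLBO formulation of [297]; the function names and parameter defaults are illustrative assumptions.

```python
import numpy as np

def tlbo(f, lo, hi, n_learners=20, iters=100, seed=0):
    """Minimize f over the box [lo, hi] with teacher and learner phases."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    X = rng.uniform(lo, hi, (n_learners, lo.size))
    cost = np.apply_along_axis(f, 1, X)
    for _ in range(iters):
        teacher = X[cost.argmin()]                  # best learner acts as teacher T_i
        mean = X.mean(axis=0)                       # class mean knowledge M_i
        for i in range(n_learners):
            # Teacher phase: pull the learner toward the teacher, away from the mean
            TF = rng.integers(1, 3)                 # teaching factor: 1 or 2, equal odds
            r = rng.random(lo.size)                 # random weights in [0, 1]
            cand = np.clip(X[i] + r * (teacher - TF * mean), lo, hi)
            fc = f(cand)
            if fc < cost[i]:                        # greedy acceptance
                X[i], cost[i] = cand, fc
            # Learner phase: random interaction with a classmate
            j = rng.integers(n_learners)
            while j == i:
                j = rng.integers(n_learners)
            r = rng.random(lo.size)
            step = (X[j] - X[i]) if cost[j] < cost[i] else (X[i] - X[j])
            cand = np.clip(X[i] + r * step, lo, hi)
            fc = f(cand)
            if fc < cost[i]:
                X[i], cost[i] = cand, fc
    return X[cost.argmin()], float(cost.min())
```

Note that the best current solution plays the teacher role each iteration, which is how a "better teacher" steadily raises the class mean toward itself.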
Figure 55. Distribution of marks obtained by learners taught by two different teachers [297].
Since curve-2 shows better results, it is concluded that teacher T2 is of higher quality
than T1. From the results, it can be concluded that a better teacher produces a better mean
for the results of the learners. The learners’ results are also improved by interactions
between themselves [297].
The best learner emulates the teachers since they are seen as knowledgeable people.
Teachers distribute new knowledge to the class of learners, which increases the level of
knowledge in the class. Therefore, a teacher aims to move the mean knowledge level of
a class closer to their own. Although a teacher dedicates their entire will to teaching a
class, students retain information based on the quality of the instruction of the teacher and
the quality of the learner. The population determines the quality of the students. When a
teacher increases the mean of a class close to the mean level of the teacher, the class requires
a new instructor with higher mean knowledge to continue improving [297].
TLBO is a population-based algorithm that finds a global solution using its population,
which is a class of learners. Design variables are analogous to materials taught to pupils
in TLBO, and the learners’ performance is the associated fitness. The best solution for the
population is the teacher [297].
TLBO consists of two sections. The first section is termed the ‘Teacher Phase’, and the
second section is termed the ‘Learner Phase’. Learners learn from the teacher during the
‘Teacher Phase’, and the students learn from each other during the ‘Learner Phase’. A good
teacher brings the mean knowledge level of their class closer to their own. However, the
teacher can only move the mean of the class to an extent that depends on the capabilities of
the students. The two sections are random processes. Let Mi represent the mean knowledge
level, and Ti represent a teacher at an iteration, i. Ti aims to translate the mean Mi closer
to the mean of Ti such that Mnew is set to be Ti. The difference between the two means
updates the solution. The learners gain knowledge through the teacher’s lecturing and
through interactions between learners. Learners learn from one another through random
interactions if one learner knows more than another. Figure 56 illustrates the flowchart
of TLBO, where TF describes the teaching factor, which determines the magnitude of the
mean to be changed, and r is a number in the span [0, 1] chosen randomly. TF can either be
1 or 2 and is decided randomly with equal probability [297]. TLBO is a population-based
optimization technique which incorporates solutions to search for
an optimum solution.
based optimizationoftechnique
The performance an algorithm which incorporates
is affected by thesolutions
parameters to search
required for for
an optimum
the algorithm.so-
lution.
TLBO doesThe performance of an algorithm
not require parameter tuning, andis affected
thus, it bydoes thenot parameters
lose performance required for the
compared
algorithm. TLBO algorithms.
to other popular does not require parameter tuning,
The convergence rate in TLBOand thus, it does by
is increased nottaking
lose perfor-
the best
mance compared to other popular algorithms. The convergence rate in TLBO is increased
solution found in the iteration and applying it to the population. TLBO does not partition
bythetaking the bestbut
population, solution found
greediness is in the iterationtoand
implemented applying
ensure a good it solution.
to the population.
TLBO hasTLBO better
does not partition
performance the population,
compared but greediness is
to other nature-inspired implemented
algorithms suchtoasensureDE, PSO,a goodandsolu-
ABD
tion.
withTLBO
respecthastobemean er performance
solution, success comparedrate, to other nature-inspired
convergence rate, and average algorithms such as
evaluations
DE, PSO, and
required fromABD with respectfunctions
the benchmark to mean tested.
solution, success rate,
Additionally, convergence
TLBO performed rate, and with
better av-
erage evaluations required from the benchmark functions tested. Additionally, TLBO per-
high dimensional problems with less computational cost. TLBO can be used to optimize
formed be erdesign
engineering with high dimensional
problems [297]. problems with less computational cost. TLBO can be
used to optimize engineering design problems [297].
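The two phases map onto a simple loop. The sketch below is a minimal TLBO under common conventions; the sphere test function, bounds, population settings, and function names are illustrative choices, not taken from [297]:

```python
import random

def tlbo(f, low, high, dim=4, pop=20, iters=100, seed=1):
    """Minimal Teaching-Learning-Based Optimization sketch (illustrative)."""
    rng = random.Random(seed)
    clip = lambda v: min(max(v, low), high)
    X = [[rng.uniform(low, high) for _ in range(dim)] for _ in range(pop)]
    cost = [f(x) for x in X]
    for _ in range(iters):
        # Teacher phase: shift learners toward the best solution (the teacher).
        t = min(range(pop), key=cost.__getitem__)
        mean = [sum(x[d] for x in X) / pop for d in range(dim)]
        TF = rng.choice((1, 2))              # teaching factor: 1 or 2, equal odds
        for i in range(pop):
            cand = [clip(X[i][d] + rng.random() * (X[t][d] - TF * mean[d]))
                    for d in range(dim)]
            c = f(cand)
            if c < cost[i]:                  # greedy acceptance
                X[i], cost[i] = cand, c
        # Learner phase: move toward a better random peer, away from a worse one.
        for i in range(pop):
            j = rng.randrange(pop)
            if j == i:
                continue
            sign = 1.0 if cost[j] < cost[i] else -1.0
            cand = [clip(X[i][d] + rng.random() * sign * (X[j][d] - X[i][d]))
                    for d in range(dim)]
            c = f(cand)
            if c < cost[i]:
                X[i], cost[i] = cand, c
    b = min(range(pop), key=cost.__getitem__)
    return X[b], cost[b]

sphere = lambda x: sum(v * v for v in x)
best_x, best_c = tlbo(sphere, -5.0, 5.0)
```

Note that neither phase introduces an algorithm-specific parameter to tune beyond the usual population size and iteration budget, which reflects the parameter-free property discussed above.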
Drones 2023, 7, 427
Figure 56. Flowchart of the TLBO [297].
called the cohort intelligence algorithm (CIA) [312]. Borji has developed the parliamentary
optimization algorithm (POA), which simulates the intra- and inter-group competitions in
parliamentary government [313]. Artificial tribe algorithm (ATA) is another social-based
algorithm that models existing skills, propagation behaviors, and migration behaviors of
natural tribes [314]. Kashan et al. have also developed the find-fix-finish-exploit-analyze
meta-heuristic (FFFEAM) algorithm based on the targeting process for selecting objects
or installations to be destroyed in warfare [315]. Another example is the open-source
development model algorithm (OSDMA) which was introduced by Khormouji et al. based
on the open-source software development mechanism and community behaviors [316].
The last algorithm in this group is the jigsaw inspired meta-heuristic (JIM) which works
based on a jigsaw cooperative learning strategy developed by Chifu et al. [317].
2.4. Physics-Based
Physics-based algorithms are one of the most popular types of nature-inspired algo-
rithms. There are more than 50 physics-based algorithms which currently amount to 15.6%
of all nature-inspired algorithms. The most popular physics-based algorithms based on ci-
tations are the simulated annealing algorithm (SAA), gravitational search algorithm (GSA),
and the big bang–big crunch (BBBC) algorithm. Physics-based algorithms are inspired
by different physical laws and phenomena, including gravity, space, stars, galaxies, atoms, nuclear reactions, electromagnetism, gas dynamics, combustion, explosions, water, sounds, motion, vibrations, optics, and energy, to name some. Figure 57 illustrates the classification of the most popular physics-based algorithms.
Figure 57. Most popular physics-based algorithms.
Figure 58. Flowchart of the SAA [320].
SAA can handle multiple constraints, noisy data, and nonlinear models, making it a general and robust technique [320]. SAA is ideal for finding optimal solutions to combinatorial problems with various local minima [321]. SAA is more flexible and converges well toward global optima compared to other local search methods. Since SAA does not rely on restrictive problem properties, it is a versatile method. SAA can be tuned easily, which is an important feature since tuning an algorithm for a specific problem is complicated and time-consuming [320].

The quality of the solution produced by SAA is dependent on the precision of the implemented variable [320]. A drawback to using SAA is that the initial annealing schedule and temperature may require extensive tuning [321].
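As a rough illustration of the annealing idea: the geometric cooling schedule, Gaussian neighborhood moves, and all parameter values below are illustrative assumptions, not the tuning discussed in [320]:

```python
import math, random

def simulated_annealing(f, x0, t0=10.0, cooling=0.95, steps=2000, seed=1):
    """Minimal simulated annealing sketch; parameters are illustrative."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    T = t0
    for _ in range(steps):
        # Propose a random neighbour of the current solution.
        cand = [v + rng.gauss(0.0, 0.5) for v in x]
        fc = f(cand)
        # Accept better moves always; worse moves with probability exp(-delta/T),
        # which is what lets SAA escape local minima early on.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(T, 1e-12)):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        T *= cooling   # annealing schedule: geometric cooling
    return best, fbest

best, fbest = simulated_annealing(lambda x: sum(v * v for v in x), [4.0, -3.0])
```

The two knobs mentioned above (initial temperature `t0` and the schedule `cooling`) are exactly the ones that typically require extensive tuning.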
\[ M_i(t) = \frac{m_i(t)}{\sum_{j=1}^{N} m_j(t)} \]  (28)
where m_i(t) is computed from fit_i(t), the ith agent's fitness at time t, and best(t) and worst(t), the best and worst fitness values in the population. Figure 59 illustrates the flowchart of the GSA.
GSA has been tested on different nonlinear multivariable single objective bench-
mark functions (unimodal and multimodal), and the results have been compared to meta-
heuristic algorithms. GSA tended to produce superior or comparable results to CFO, RGA,
and PSO [324].
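Assuming the standard GSA mass update for a minimization problem, m_i(t) = (fit_i(t) − worst(t)) / (best(t) − worst(t)) followed by the normalization of Equation (28), the mass computation can be sketched as:

```python
def gsa_masses(fitness):
    """Normalized GSA masses; assumes minimization (lower fitness is better)."""
    best, worst = min(fitness), max(fitness)
    span = (best - worst) or -1.0      # avoid 0/0 when all fitnesses are equal
    m = [(f - worst) / span for f in fitness]   # raw masses in [0, 1]
    total = sum(m) or 1.0
    return [mi / total for mi in m]    # Equation (28): M_i = m_i / sum(m_j)

# The agent with the best (lowest) fitness receives the largest mass:
M = gsa_masses([3.0, 1.0, 2.0])
```

These masses then weight the gravitational pull each agent exerts on the others, so better solutions attract the population more strongly.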
Figure 59. Flowchart of GSA [324].
2.4.3. Big Bang–Big Crunch

During the Big Bang phase, randomness and disorder are produced by energy dissipation. During the Big Crunch, particles that were distributed randomly are drawn into a cohesive mass. The Big Bang–Big Crunch (BB–BC) method is thus inspired by this theory. In the Big Bang phase, random points are generated, and in the Big Crunch phase, the points are shrunk into a minimal-cost approach or center of mass to form a singular representative point. This optimization contains two parts: the Big Bang, which generates an initial population of points, and the Big Crunch, which calculates the center of mass (x_c) according to the following formula:

\[ X_c = \frac{\sum_{i=1}^{N} \frac{1}{f_i} x_i}{\sum_{i=1}^{N} \frac{1}{f_i}} \]  (29)

where x_i represents a point that exists in a search space of n dimensions, f_i represents the fitness of x_i, and N is the number of points that were created during the Big Bang phase. The Big Bang and Big Crunch iteratively repeat, where new agents are created to be used again [325]. The BB–BC algorithm is shown in Figure 60.

According to Eksin et al., BB–BC is more effective than C-GA where an optimization problem has many local optimum points. The performance of the BB–BC method demonstrates superiority over an improved and enhanced version of GSA and outperforms the classical genetic algorithm (GA) in many benchmark test functions [326].
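The center of mass in Equation (29) can be computed directly; this sketch assumes strictly positive cost values f_i (the function name is illustrative):

```python
def big_crunch_center(points, costs):
    """Center of mass X_c per Equation (29): weight each point by 1/f_i.

    Assumes costs are strictly positive, so cheaper points get larger weights.
    """
    w = [1.0 / f for f in costs]
    total = sum(w)
    dim = len(points[0])
    return [sum(wi * p[d] for wi, p in zip(w, points)) / total
            for d in range(dim)]

# A cheap point (cost 0.5) dominates an expensive one (cost 5.0):
xc = big_crunch_center([[1.0, 1.0], [3.0, -1.0]], [0.5, 5.0])
```

Each Big Bang phase then scatters new candidate points around this center, with a spread that typically shrinks as iterations proceed.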
Some algorithms were developed based on chemical reactions. For example, chemical-reaction-inspired optimization (CRIO) emulates molecular interaction in reactions to reach a stable low-energy state. The performance of CRIO was tested using three combinatorial nondeterministic polynomial-time hard problems. One was a real-world problem, and the other two were traditional benchmark problems [379]. Another example is the artificial chemical reaction optimization algorithm (ACROA), which is more robust and has fewer parameters than similar algorithms [380]. The chemical reaction algorithm (CRA) is another example that uses chemical reaction principles to create a meta-heuristic population-based optimization algorithm. In chemical reactions, reactants are transformed into molecules through a series of reactions [381].
2.6. Math-Based

There are algorithms that were developed based on mathematical concepts such as basic arithmetic operators, sinusoidal functions, fractals, and so on. Figure 62 illustrates the most cited math-based algorithms.
Figure 62. Most cited math-based algorithms.
The most popular math-based algorithm is the population-based sine cosine algorithm (SCA), which uses a sinusoidal-based mathematical model to create and disseminate candidate solutions that move toward the best solution. The sine cosine model is as follows:

\[ X_i^{t+1} = \begin{cases} X_i^t + r_1 \sin(r_2) \left| r_3 P_i^t - X_i^t \right|, & r_4 < 0.5 \\ X_i^t + r_1 \cos(r_2) \left| r_3 P_i^t - X_i^t \right|, & r_4 \ge 0.5 \end{cases} \]  (30)

where P_i is the position of the destination point in the ith iteration and r_i is a random number. SCA has been implemented on multiple single-objective nonlinear multivariable unimodal and multimodal benchmark functions, and the results have confirmed the fast convergence and local optimal avoidance of SCA [382].

Stochastic fractal search (SFS) is another math-based algorithm that was developed based on the fractal concept in geometry. A fractal can be simply defined as a whole that is formed from particles similar to itself. In SFS, each particle diffuses and causes some other random particles to be created in order to shape a fractal. SFS has been shown to perform well in some constrained and unconstrained unimodal or multimodal benchmark functions [383].

Another example is the simulated Kalman filter algorithm (SKFA), in which all agents behave as Kalman filters in a state estimation problem, which is considered to be the optimization problem. The agents use a best-so-far reference solution and a measuring process, as a Kalman filter does, to find the optimum in a given problem. The SKFA has been applied to 30 benchmark functions of CEC 2014 for real-parameter single-objective optimization problems, and the results show that SKFA is a promising approach that is able to outperform some well-known meta-heuristic algorithms such as GA, PSO, BHA, and GWO [384].
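The SCA update of Equation (30) can be sketched per dimension. The linearly decaying r_1 and the sampling ranges for r_2 and r_3 below follow common SCA practice; they are assumptions for illustration, not values fixed by [382]:

```python
import math, random

def sca_step(X, P, t, t_max, rng):
    """One sine cosine update per Equation (30); r1 decays to shift from
    exploration to exploitation (a common choice, assumed here)."""
    a = 2.0
    r1 = a - t * a / t_max
    X_new = []
    for x in X:
        nxt = []
        for d, xd in enumerate(x):
            r2 = rng.uniform(0.0, 2.0 * math.pi)
            r3 = rng.uniform(0.0, 2.0)
            r4 = rng.random()
            trig = math.sin(r2) if r4 < 0.5 else math.cos(r2)
            nxt.append(xd + r1 * trig * abs(r3 * P[d] - xd))
        X_new.append(nxt)
    return X_new

# Illustrative use: minimize the 2-D sphere function.
rng = random.Random(0)
f = lambda x: sum(v * v for v in x)
X = [[rng.uniform(-5, 5) for _ in range(2)] for _ in range(30)]
P = min(X, key=f)                      # destination point = best so far
for t in range(200):
    X = sca_step(X, P, t, 200, rng)
    P = min(X + [P], key=f)            # keep the best-so-far monotone
```

Because the oscillation amplitude shrinks with r_1, the candidates first roam widely around the destination point and then settle on it, which is the convergence behavior reported for SCA.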
The base optimization algorithm (BOA) is inspired by basic arithmetic operators,
which use a combination of operators to force the solutions toward the optimum point. It
has been shown that BOA displays an acceptable performance in solving some nonlinear
multivariable single objective unimodal and multimodal benchmark functions [385].
Golden sine algorithm (GSA) is another math-inspired population-based algorithm that uses the sine function to solve optimization problems. Individuals are created randomly, and their dimensions are distributed uniformly. The current solution is moved closer to the target goal with each iteration. The solutions are narrowed to the golden section to scan the best solutions instead of all of the solutions. GSA has fewer algorithm-dependent parameters and operators than other meta-heuristic methods, and it has been shown that it converges faster compared to similar algorithms [386].
Spherical search optimizer (SSO) uses a spherical search style that is inspired by the
hypercube search style and basic reduced hypercube search style. Implementation of SSO
on the CEC2013, CEC2014, CEC2015, and CEC2017 benchmark functions has proved its
simplicity and efficiency [387].
2.7. Music-Based
Some algorithms are based on music theories and concepts, such as the harmony
search (HS) algorithm. HS works by imitating the activity of musicians while improvising
a musical piece. Musical performances strive toward the optimal solution (harmony), judged by aesthetic estimation, just as a search strives toward an optimum solution. The combined sounds produced by the instruments are measured by aesthetic estimation, much as candidate solutions are evaluated by the objective function. The sounds of a symphony are improved through practice, just as the solutions of a problem are improved through iterations. HS makes a new candidate solution by considering the entire set of candidates instead of just the parent solutions, as the genetic algorithm does. HS is flexible and finds better solutions since it does not require initial variables. It has been shown that HS can solve
continuous variable problems as well as combinatorial problems [388]. HS has been an
active topic of research such that an improved version of HS, called the melody search
(MS), has been developed and implemented on different problems [389]. However, some
researchers consider HS redundant and similar to DE [390].
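The improvisation step described above, in which each new note is drawn from the whole harmony memory rather than from two parents, can be sketched as follows. The parameter names follow common HS usage (`hmcr` = harmony-memory-considering rate, `par` = pitch-adjusting rate, `bw` = bandwidth); all values are illustrative assumptions, not those of [388]:

```python
import random

def harmony_search(f, low, high, dim=2, hms=20, hmcr=0.9, par=0.3,
                   bw=0.05, iters=3000, seed=3):
    """Minimal harmony search sketch with illustrative parameter values."""
    rng = random.Random(seed)
    memory = [[rng.uniform(low, high) for _ in range(dim)] for _ in range(hms)]
    costs = [f(h) for h in memory]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:
                # Pick the note from any harmony in memory, not just two parents.
                v = memory[rng.randrange(hms)][d]
                if rng.random() < par:          # pitch adjustment: small retune
                    v += rng.uniform(-bw, bw)
            else:
                v = rng.uniform(low, high)       # purely random improvisation
            new.append(min(max(v, low), high))
        c = f(new)
        worst = max(range(hms), key=costs.__getitem__)
        if c < costs[worst]:                     # replace the worst harmony
            memory[worst], costs[worst] = new, c
    b = min(range(hms), key=costs.__getitem__)
    return memory[b], costs[b]

best, cost = harmony_search(lambda x: sum(v * v for v in x), -5.0, 5.0)
```

The per-dimension recombination over the whole memory is also what makes HS resemble DE's component-wise mixing, which is the basis of the redundancy criticism cited above.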
The method of musical composition (MMC) algorithm is another example of a music-
based algorithm. MMC is a population-based meta-heuristic method that is inspired by an
artificial society that employs a creative dynamic system to produce music. MMC has been
shown to have a higher exploration capability of the solution space throughout the entire
iteration due to the utilization of interaction among agents and is an attractive option to
solve a set of rotated multimodal problems [391]. Figure 63 illustrates the most popular
music-based algorithms.
any optimization algorithm. It should be noted that some methods have been introduced
based on the idea of separating the cost function and constraints. A classical classification
suggests four categories of CHTs: penalty functions, searching for feasible solutions, preserving the feasibility of solutions, and separating the objective function from constraints [408].
Mezura-Montes and Coello also suggest five other new categories besides novel attempts
at classic categories [409]. Figure 65 illustrates the classification of CHTs.
The penalty methods are the most common approaches for constraint handling. The penalty term is determined by the amount of constraint violation of the solution vector and is summed with the cost function. There are different penalty methods, including the static penalty, dynamic penalty, death penalty, co-evolutionary penalty, annealing penalty, adaptive penalty, and more [410]. Penalty functions can be classified based on different aspects, including variable/constant penalty, problem-dependent/problem-independent, with/without parameters, and so forth [411]. Classic penalty methods and their pros and cons have been studied before by Yeniay [408].

Some approaches are based on separating the cost function and constraints [410], and some approaches try to limit the solution space to just feasible solutions such that the problem can be solved as an unconstrained problem. These are considered some of the most competitive constraint-handling techniques [409]. In some methods, a special operator is created to preserve the feasibility of the solution or to move within part of the region of interest. GENOCOP is an example of such an operator, developed for linear constraints [409].

Research has been conducted to develop techniques based on feasibility rules. Mezura-Montes and Coello have studied some of these techniques, such as stochastic ranking (SR). SR was created to compensate for the under- or over-penalization awarded by penalty functions. SR uses a parameter to compare the feasibility of a solution instead of defining the function [409].

The ε-constrained method is a recently reported constraint-handling technique that converts a constrained numerical problem to an unconstrained numerical problem. There are also some highly competitive constraint-handling techniques based on multi-objective concepts. In these methods, a constraint violation measure is added as another objective. Some techniques have also been suggested based on a hybrid use of the above-mentioned methods [409,412].
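A static penalty, the simplest of the penalty methods, can be sketched as a wrapper around the cost function. The penalty weight `r`, the g(x) ≤ 0 convention, and the function names are illustrative choices:

```python
def penalized(f, constraints, r=1000.0):
    """Static penalty sketch: add r * (total constraint violation)^2 to the cost.

    Each constraint is a callable g with g(x) <= 0 meaning 'satisfied'; the
    constant weight r is what makes the penalty 'static'.
    """
    def wrapped(x):
        violation = sum(max(0.0, g(x)) for g in constraints)
        return f(x) + r * violation ** 2
    return wrapped

# Minimize x^2 subject to x >= 1 (written as 1 - x <= 0):
cost = penalized(lambda x: x[0] ** 2, [lambda x: 1.0 - x[0]])
feasible, infeasible = cost([1.5]), cost([0.0])
```

Any unconstrained optimizer from the preceding sections can then minimize `wrapped` directly, which is exactly why penalty methods combine so readily with nature-inspired algorithms.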
3. Comparison between Algorithms

In order to have a general view of nature-inspired algorithms, it would be a good idea to provide a comparison between different algorithms in solving different problems. There are several benchmark functions to evaluate the performance of optimization algorithms. A benchmark function is a single- or multi-objective function that has a deterministic optimum. By using benchmark functions, it is possible to evaluate the performance of the optimization algorithms in terms of accuracy and speed. In the following sections, 10 different problems are solved by 27 nature-inspired algorithms. Each problem is solved 500 times, and the mean results are calculated. In addition to this, the average solving time is also calculated to obtain the speed of the algorithms. In order to provide an equal situation, the maximum iteration and the maximum number of agents (or any equivalent concept based on the algorithm) in all algorithms are considered to be the same. Computational calculations are done in Python using a modified version of the
The equations of the benchmark functions and their plots are shown in Table 3. Note that only a two-dimensional schematic of multimodal functions is plotted to show the general view of the function.
Table 3. Equations and plots of the selected benchmark functions [417,418].
Function | Equation (d = 4 in all cases)
Chung Reynolds | f(x) = (∑_{i=1}^{d} x_i^2)^2
Cosine Mixture | f(x) = −0.1 ∑_{i=1}^{d} cos(5πx_i) + ∑_{i=1}^{d} x_i^2 + 0.1d
Dixon and Price | f(x) = (x_1 − 1)^2 + ∑_{i=2}^{d} i(2x_i^2 − x_{i−1})^2
Griewank | f(x) = ∑_{i=1}^{d} x_i^2/4000 − ∏_{i=1}^{d} cos(x_i/√i) + 1
Pintér | f(x) = ∑_{i=1}^{d} i x_i^2 + ∑_{i=1}^{d} 20 i sin^2(A) + ∑_{i=1}^{d} i log_10(1 + i B^2), with A = x_{i−1} sin(x_i) + sin(x_{i+1}) and B = x_{i−1}^2 − 2x_i + 3x_{i+1} − cos(x_i) + 1
Powell | f(x) = ∑_{i=1}^{d/4} [(x_{4i−3} + 10x_{4i−2})^2 + 5(x_{4i−1} − x_{4i})^2 + (x_{4i−2} − 2x_{4i−1})^4 + 10(x_{4i−3} − x_{4i})^4]
Qing | f(x) = ∑_{i=1}^{d} (x_i^2 − i)^2

(The surface plots in the original 'Plot' column are not reproduced here.)
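Two of the Table 3 benchmarks are easy to express directly; this sketch also checks the property used throughout the comparison, namely that they attain f(0) = 0 at the origin:

```python
import math

def chung_reynolds(x):
    """Chung Reynolds: (sum of squares) squared; unimodal, minimum at 0."""
    return sum(v * v for v in x) ** 2

def griewank(x):
    """Griewank: sum-of-squares term minus a cosine product; multimodal."""
    s = sum(v * v for v in x) / 4000.0
    p = math.prod(math.cos(v / math.sqrt(i)) for i, v in enumerate(x, start=1))
    return s - p + 1.0

# Both reach their global minimum f(0) = 0 at the origin:
zeros = [0.0] * 4
vals = (chung_reynolds(zeros), griewank(zeros))
```

The cosine product is what puts a lattice of local minima over Griewank's otherwise bowl-shaped landscape, which is why it is a standard multimodal test case.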
3.2. Results
Due to the stochastic terms in algorithms, each problem is solved 500 times, and the
average of results is used for comparison. Table 4 summarizes the results of computational
calculations. The best result in each problem regarding cost, iteration, and time is deter-
mined with a green highlight, and the worst result is determined using a red highlight.
In some problems, more than one algorithm is highlighted in red, which indicates that the algorithms were unable to find a good solution in fewer than 1000 iterations, which was one of the termination criteria. In such a case, the best solution found within 1000 iterations was considered the best solution of the algorithm. A good solution was defined as a solution with a cost function value less than 10−4. The term error stands for the mean square error between the solution and the optimum solution (which is at the origin of all problems). This value determines how far the calculated solution was from the origin. This is very important for problems such as Qing, Powell, or Chung-Reynolds, in which a wide area around the origin has a low cost value and there are not many local minima. In these problems, a low final cost does not necessarily represent a good solution, so it is important to check whether the final solution is near the origin. Based on the results, the mean error is low enough in most cases, which means most algorithms got near enough to the origin (optimum solution).
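Since the optimum of every benchmark here sits at the origin, the reported error amounts to the mean squared coordinate distance from zero; a one-line sketch (the function name is illustrative):

```python
def mean_square_error_to_origin(solution):
    """Mean squared distance of a solution from the optimum at the origin."""
    return sum(v * v for v in solution) / len(solution)

err = mean_square_error_to_origin([0.01, -0.02, 0.0, 0.01])
```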
Table 4. Results of the computational calculations.

Algorithm Problem Mean Cost Mean Iterations Mean Time (s) Mean Error
Sine Cosine Algorithm 5.34 × 10−5 339.482 5.50 × 10−1 1.07 × 10−7
Harris Hawks Optimization 6.14 × 10−5 35.086 1.02 × 10−1 1.23 × 10−7
Fireworks Algorithm 6.27 × 10−5 88.314 5.48 × 10−1 1.25 × 10−7
Artificial Bee Colony Algorithm 7.28 × 10−5 91.194 1.52 × 10−1 1.46 × 10−7
Bees Algorithm 7.75 × 10−5 140.83 7.29 × 10−1 1.55 × 10−7
Gravitational Search Algorithm 7.92 × 10−5 422.2 7.17 × 10−1 1.58 × 10−7
Firefly Algorithm 7.93 × 10−5 341.84 5.71 1.59 × 10−7
Differential Evolution 7.96 × 10−5 217.234 9.90 × 10−1 1.59 × 10−7
Bat Algorithm 7.99 × 10−5 294.708 2.70 × 10−1 1.60 × 10−7
Grey Wolf Optimizer 8.00 × 10−5 28.902 8.20 × 10−2 1.60 × 10−7
Flower Pollination Algorithm Ackley 8.13 × 10−5 841.306 1.32 1.63 × 10−7
Cuckoo Search 8.19 × 10−5 312.4 4.94 × 10−1 1.64 × 10−7
Particle Swarm Algorithm 8.27 × 10−5 227.056 5.03 × 10−1 1.65 × 10−7
Cat Swarm Optimization 9.30 × 10−5 48.064 1.91 × 10−1 1.86 × 10−7
Clonal Selection Algorithm 6.12 × 10−4 1000 1.67 1.22 × 10−6
Fish School Search 3.78 × 10−3 1000 2.46 7.56 × 10−6
Moth Flame Optimizer 4.06 × 10−3 531.462 5.47 × 10−1 8.12 × 10−6
Forest Optimization Algorithm 3.54 × 10−2 1000 1.00 7.09 × 10−5
Bacterial Foraging Optimization 6.83 × 10−2 1000 5.61 1.37 × 10−4
Genetic Algorithm 2.46 × 10−1 1000 4.77 × 10−1 2.46 × 10−2
Harmony Search 3.49 × 10−1 1000 1.36 × 10−1 6.98 × 10−4
Table 4. Cont.
Algorithm Problem Mean Cost Mean Iterations Mean Time (s) Mean Error
Fireworks Algorithm 6.64 × 10−5 133.906 5.94 × 10−1 1.33 × 10−7
Harris Hawks Optimization 7.00 × 10−5 27.042 6.11 × 10−2 1.40 × 10−7
Bees Algorithm 7.13 × 10−5 135.848 5.09 × 10−1 1.43 × 10−7
Artificial Bee Colony Algorithm 7.50 × 10−5 84.648 1.25 × 10−1 1.50 × 10−7
Gravitational Search Algorithm 8.00 × 10−5 310.83 7.94 × 10−1 1.60 × 10−7
Particle Swarm Algorithm 8.06 × 10−5 250.622 3.37 × 10−1 1.61 × 10−7
Firefly Algorithm 8.06 × 10−5 260.928 3.98 1.61 × 10−7
Grey Wolf Optimizer 8.47 × 10−5 145.878 3.23 × 10−1 1.69 × 10−7
Cat Swarm Optimization 9.32 × 10−5 91.69 2.67 × 10−1 1.86 × 10−7
Differential Evolution 1.68 × 10−4 315.536 9.93 × 10−1 3.37 × 10−7
Cuckoo Search Alpine 4.82 × 10−4 932.264 7.28 × 10−1 9.64 × 10−7
Fish School Search 1.13 × 10−3 999.994 2.21 2.26 × 10−6
Moth Flame Optimizer 2.32 × 10−3 548.076 6.47 × 10−1 4.64 × 10−6
Sine Cosine Algorithm 2.56 × 10−3 340.892 2.62 × 10−1 5.12 × 10−6
Bat Algorithm 4.91 × 10−3 768.372 7.83 × 10−1 9.81 × 10−6
Harmony Search 1.37 × 10−2 1000 1.17 × 10−1 2.74 × 10−5
Flower Pollination Algorithm 1.62 × 10−2 1000 1.31 3.24 × 10−5
Forest Optimization Algorithm 4.01 × 10−2 1000 6.14 × 10−1 8.02 × 10−5
Bacterial Foraging Optimization 4.16 × 10−2 1000 4.07 8.33 × 10−5
Genetic Algorithm 5.24 × 10−2 1000 2.21 × 10−1 5.24 × 10−3
Clonal Selection Algorithm 7.96 × 10−1 1000 1.05 1.59 × 10−3
Problem: Chung-Reynolds
Sine Cosine Algorithm 2.39 × 10−5 146.724 1.20 × 10−1 4.78 × 10−8
Fireworks Algorithm 3.05 × 10−5 43.376 1.95 × 10−1 6.09 × 10−8
Artificial Bee Colony Algorithm 4.16 × 10−5 14.254 2.88 × 10−2 8.33 × 10−8
Forest Optimization Algorithm 4.65 × 10−5 171.568 6.75 × 10−2 9.30 × 10−8
Grey Wolf Optimizer 4.68 × 10−5 9.834 1.59 × 10−2 9.36 × 10−8
Gravitational Search Algorithm 4.76 × 10−5 114.352 4.45 × 10−1 9.51 × 10−8
Bees Algorithm 4.82 × 10−5 16.916 8.23 × 10−2 9.65 × 10−8
Bat Algorithm 4.83 × 10−5 45.57 5.10 × 10−2 9.67 × 10−8
Differential Evolution 4.88 × 10−5 71.866 2.31 × 10−1 9.77 × 10−8
Firefly Algorithm 4.94 × 10−5 91.87 1.11 9.87 × 10−8
Particle Swarm Algorithm 5.07 × 10−5 47.766 5.38 × 10−2 1.01 × 10−7
Clonal Selection Algorithm 5.08 × 10−5 44.366 7.50 × 10−2 1.02 × 10−7
Fish School Search 5.14 × 10−5 314.832 5.99 × 10−1 1.03 × 10−7
Flower Pollination Algorithm 5.15 × 10−5 203.686 2.58 × 10−1 1.03 × 10−7
Bacterial Foraging Optimization 5.16 × 10−5 53.93 2.54 × 10−1 1.03 × 10−7
Cuckoo Search 5.21 × 10−5 67.72 4.55 × 10−2 1.04 × 10−7
Moth Flame Optimizer 7.05 × 10−5 228.232 2.14 × 10−1 1.41 × 10−7
Cat Swarm Optimization 7.50 × 10−5 15.922 4.91 × 10−2 1.50 × 10−7
Genetic Algorithm 1.82 × 10−4 843 2.57 × 10−1 1.82 × 10−5
Harris Hawks Optimization 2.96 × 10−4 6.14 9.07 × 10−3 5.92 × 10−7
Harmony Search 4.63 × 10−4 789.02 4.77 × 10−2 9.26 × 10−7
Problem: Cosine Mixture
Sine Cosine Algorithm 3.73 × 10−5 201.704 1.89 × 10−1 7.47 × 10−8
Fireworks Algorithm 4.87 × 10−5 45.982 2.30 × 10−1 9.75 × 10−8
Clonal Selection Algorithm 5.69 × 10−5 81.124 1.17 × 10−1 1.14 × 10−7
Artificial Bee Colony Algorithm 5.84 × 10−5 28.978 3.28 × 10−2 1.17 × 10−7
Grey Wolf Optimizer 6.37 × 10−5 13.644 3.38 × 10−2 1.27 × 10−7
Firefly Algorithm 6.62 × 10−5 135.314 2.02 1.32 × 10−7
Bees Algorithm 6.64 × 10−5 62.548 2.68 × 10−1 1.33 × 10−7
Differential Evolution 6.70 × 10−5 104.8 3.73 × 10−1 1.34 × 10−7
Particle Swarm Algorithm 6.80 × 10−5 77.204 1.10 × 10−1 1.36 × 10−7
Flower Pollination Algorithm 6.83 × 10−5 464.014 5.65 × 10−1 1.37 × 10−7
Cuckoo Search 6.83 × 10−5 130.344 1.24 × 10−1 1.37 × 10−7
Fish School Search 7.71 × 10−5 993.258 2.66 1.54 × 10−7
Forest Optimization Algorithm 8.33 × 10−5 622.266 2.97 × 10−1 1.67 × 10−7
Cat Swarm Optimization 8.68 × 10−5 41.174 1.22 × 10−1 1.74 × 10−7
Harris Hawks Optimization 1.09 × 10−4 8.736 2.20 × 10−2 2.19 × 10−7
Gravitational Search Algorithm 2.56 × 10−4 240.778 5.91 × 10−1 5.13 × 10−7
Moth Flame Optimizer 3.85 × 10−4 331.822 2.82 × 10−1 7.70 × 10−7
Genetic Algorithm 1.49 × 10−3 1000 2.65 × 10−1 1.49 × 10−4
Bat Algorithm 3.91 × 10−3 185.366 1.48 × 10−1 7.82 × 10−6
Harmony Search 9.96 × 10−3 1000 8.87 × 10−2 1.99 × 10−5
Bacterial Foraging Optimization 1.61 × 10−2 1000 4.04 3.23 × 10−5
Problem: Dixon-Price
Firefly Algorithm 6.48 × 10−5 205.242 3.17 1.30 × 10−7
Bat Algorithm 6.60 × 10−5 158.698 1.25 × 10−1 1.32 × 10−7
Gravitational Search Algorithm 6.66 × 10−5 229.648 9.72 × 10−1 1.33 × 10−7
Cuckoo Search 6.88 × 10−5 224.77 2.32 × 10−1 1.38 × 10−7
Particle Swarm Algorithm 7.12 × 10−5 171.6 2.15 × 10−1 1.42 × 10−7
Flower Pollination Algorithm 7.19 × 10−5 743.12 9.05 × 10−1 1.44 × 10−7
Artificial Bee Colony Algorithm 8.33 × 10−5 332.548 5.04 × 10−1 1.67 × 10−7
Fish School Search 8.53 × 10−5 992.308 3.04 1.71 × 10−7
Harris Hawks Optimization 1.06 × 10−4 208.448 5.15 × 10−1 2.12 × 10−7
Bees Algorithm 9.22 × 10−4 120.12 6.29 × 10−1 1.84 × 10−6
Differential Evolution 2.12 × 10−3 204.858 5.65 × 10−1 4.24 × 10−6
Cat Swarm Optimization 1.08 × 10−2 1000 4.26 2.15 × 10−5
Bacterial Foraging Optimization 1.54 × 10−2 1000 4.06 3.07 × 10−5
Forest Optimization Algorithm 9.88 × 10−2 1000 8.36 × 10−1 1.98 × 10−4
Grey Wolf Optimizer 1.25 × 10−1 1000 1.40 2.49 × 10−4
Sine Cosine Algorithm 2.49 × 10−1 1000 9.62 × 10−1 4.98 × 10−4
Fireworks Algorithm 4.95 × 10−1 1000 4.64 9.90 × 10−4
Harmony Search 8.09 × 10−1 1000 1.22 × 10−1 1.62 × 10−3
Genetic Algorithm 3.03 1000 2.80 × 10−1 3.03 × 10−1
Moth Flame Optimizer 4.36 706.77 7.07 × 10−1 8.72 × 10−3
Clonal Selection Algorithm 10.3 1000 2.25 2.05 × 10−2
Problem: Expanded Schaffer
Cat Swarm Optimization 9.04 × 10−5 130.836 5.16 × 10−1 1.81 × 10−7
Artificial Bee Colony Algorithm 1.45 × 10−4 367.618 5.78 × 10−1 2.90 × 10−7
Fireworks Algorithm 1.49 × 10−4 209.674 1.05 2.97 × 10−7
Harris Hawks Optimization 1.50 × 10−4 27.364 5.30 × 10−2 3.00 × 10−7
Grey Wolf Optimizer 2.21 × 10−4 262.478 3.97 × 10−1 4.43 × 10−7
Cuckoo Search 3.86 × 10−4 509.374 5.12 × 10−1 7.72 × 10−7
Flower Pollination Algorithm 6.49 × 10−4 894.716 1.54 1.30 × 10−6
Sine Cosine Algorithm 1.78 × 10−3 510.884 4.51 × 10−1 3.56 × 10−6
Firefly Algorithm 2.33 × 10−3 945.782 1.39 × 101 4.67 × 10−6
Particle Swarm Algorithm 3.46 × 10−3 542.182 8.18 × 10−1 6.92 × 10−6
Bacterial Foraging Optimization 4.09 × 10−3 974.946 5.85 8.19 × 10−6
Bees Algorithm 4.29 × 10−3 775.36 3.53 8.58 × 10−6
Fish School Search 5.77 × 10−3 697.276 1.59 1.15 × 10−5
Differential Evolution 6.79 × 10−3 792.188 2.45 1.36 × 10−5
Clonal Selection Algorithm 6.91 × 10−3 789.402 1.21 1.38 × 10−5
Moth Flame Optimizer 8.75 × 10−3 982.44 6.91 × 10−1 1.75 × 10−5
Genetic Algorithm 8.78 × 10−3 1000 3.88 × 10−1 8.78 × 10−4
Gravitational Search Algorithm 8.89 × 10−3 998.088 1.37 1.78 × 10−5
Forest Optimization Algorithm 9.06 × 10−3 960.018 6.72 × 10−1 1.81 × 10−5
Harmony Search 9.58 × 10−3 978.92 8.91 × 10−2 1.92 × 10−5
Bat Algorithm 2.40 × 10−2 988.604 8.10 × 10−1 4.79 × 10−5
Problem: Griewank
Artificial Bee Colony Algorithm 4.42 × 10−5 127.17 9.13 × 10−1 8.84 × 10−8
Fireworks Algorithm 1.63 × 10−4 18.25 4.96 × 10−2 3.27 × 10−7
Harris Hawks Optimization 3.23 × 10−4 773.19 9.98 × 10−1 6.45 × 10−7
Cuckoo Search 6.61 × 10−4 384.45 6.14 × 10−1 1.32 × 10−6
Fish School Search 1.22 × 10−3 887.88 1.95 2.45 × 10−6
Cat Swarm Optimization 1.95 × 10−3 180.84 7.96 × 10−1 3.90 × 10−6
Grey Wolf Optimizer 3.40 × 10−3 439.69 6.40 × 10−1 6.80 × 10−6
Differential Evolution 8.61 × 10−3 502.86 1.36 1.72 × 10−5
Particle Swarm Algorithm 9.41 × 10−3 866.57 1.36 1.88 × 10−5
Flower Pollination Algorithm 1.09 × 10−2 1000.00 1.89 2.18 × 10−5
Sine Cosine Algorithm 1.22 × 10−2 417.41 5.25 × 10−1 2.45 × 10−5
Clonal Selection Algorithm 1.38 × 10−2 951.05 1.35 2.76 × 10−5
Bees Algorithm 1.46 × 10−2 946.30 4.75 2.93 × 10−5
Harmony Search 1.69 × 10−2 1000.00 1.35 × 10−1 3.38 × 10−5
Genetic Algorithm 2.18 × 10−2 1000.00 3.24 × 10−1 2.18 × 10−3
Firefly Algorithm 2.29 × 10−2 971.10 1.67 × 101 4.58 × 10−5
Bacterial Foraging Optimization 2.95 × 10−2 998.50 5.87 5.90 × 10−5
Gravitational Search Algorithm 3.15 × 10−2 972.95 1.52 6.31 × 10−5
Forest Optimization Algorithm 3.90 × 10−2 1000.00 5.16 × 10−1 7.79 × 10−5
Bat Algorithm 8.00 × 10−2 994.59 9.75 × 10−1 1.60 × 10−4
Moth Flame Optimizer 1.38 × 10−1 998.66 9.12 × 10−1 2.76 × 10−4
Problem: Pinter
Sine Cosine Algorithm 3.69 × 10−5 269.614 5.62 × 10−1 7.38 × 10−8
Fireworks Algorithm 4.79 × 10−5 80.12 1.21 9.58 × 10−8
Differential Evolution 6.54 × 10−5 205.1 1.13 1.31 × 10−7
Firefly Algorithm 6.58 × 10−5 240.378 8.05 1.32 × 10−7
Cuckoo Search 6.72 × 10−5 366.782 1.35 1.34 × 10−7
Cat Swarm Optimization 8.66 × 10−5 42.148 2.49 × 10−1 1.73 × 10−7
Harris Hawks Optimization 9.47 × 10−5 17.048 9.11 × 10−2 1.89 × 10−7
Flower Pollination Algorithm 1.61 × 10−4 889.718 2.46 3.22 × 10−7
Artificial Bee Colony Algorithm 1.92 × 10−4 353.71 1.04 3.85 × 10−7
Grey Wolf Optimizer 3.74 × 10−2 39.032 1.83 × 10−1 7.48 × 10−5
Gravitational Search Algorithm 1.01 357.948 1.28 2.01 × 10−3
Moth Flame Optimizer 2.37 632.918 1.57 4.75 × 10−3
Particle Swarm Algorithm 2.59 320.354 7.60 × 10−1 5.19 × 10−3
Forest Optimization Algorithm 2.65 1000 1.33 5.30 × 10−3
Fish School Search 3.96 999.66 6.25 7.93 × 10−3
Harmony Search 9.54 1000 1.25 × 10−1 1.91 × 10−2
Bacterial Foraging Optimization 10.4 1000 8.44 2.09 × 10−2
Bees Algorithm 13.2 597.326 7.65 2.64 × 10−2
Genetic Algorithm 17.8 1000 1.03 1.78
Bat Algorithm 32.2 761.046 1.84 6.45 × 10−2
Clonal Selection Algorithm 56.0 985.222 3.64 1.12 × 10−1
Problem: Powell
Sine Cosine Algorithm 3.79 × 10−5 270.806 3.28 × 10−1 7.58 × 10−8
Fireworks Algorithm 4.36 × 10−5 112.154 6.03 × 10−1 8.72 × 10−8
Grey Wolf Optimizer 6.16 × 10−5 31.632 1.03 × 10−1 1.23 × 10−7
Flower Pollination Algorithm 6.25 × 10−5 339.416 7.87 × 10−1 1.25 × 10−7
Firefly Algorithm 6.38 × 10−5 178.004 4.77 1.28 × 10−7
Cuckoo Search 6.39 × 10−5 138.046 3.20 × 10−1 1.28 × 10−7
Gravitational Search Algorithm 6.54 × 10−5 194.274 6.73 × 10−1 1.31 × 10−7
Bat Algorithm 6.68 × 10−5 141.476 1.89 × 10−1 1.34 × 10−7
Fish School Search 7.18 × 10−5 952.01 2.81 1.44 × 10−7
Particle Swarm Algorithm 7.35 × 10−5 171.714 3.70 × 10−1 1.47 × 10−7
Cat Swarm Optimization 8.20 × 10−5 29.168 1.32 × 10−1 1.64 × 10−7
Harris Hawks Optimization 1.72 × 10−4 14.4 5.09 × 10−2 3.44 × 10−7
Clonal Selection Algorithm 1.80 × 10−4 308.52 8.33 × 10−1 3.59 × 10−7
Artificial Bee Colony Algorithm 2.90 × 10−4 958.778 1.95 5.80 × 10−7
Bees Algorithm 5.67 × 10−4 808.19 6.47 1.13 × 10−6
Forest Optimization Algorithm 2.51 × 10−2 997.9 8.70 × 10−1 5.03 × 10−5
Bacterial Foraging Optimization 3.43 × 10−2 997.456 5.08 6.87 × 10−5
Differential Evolution 4.52 176.416 5.83 × 10−1 9.05 × 10−3
Genetic Algorithm 7.90 1000 5.65 × 10−1 7.90 × 10−1
Harmony Search 11.7 1000 1.01 × 10−1 2.33 × 10−2
Moth Flame Optimizer 98.2 892.974 1.11 1.96 × 10−1
Problem: Qing
Artificial Bee Colony Algorithm 5.73 × 10−5 57.068 1.23 × 10−1 1.15 × 10−7
Differential Evolution 6.46 × 10−5 227.682 7.35 × 10−1 1.29 × 10−7
Bees Algorithm 6.54 × 10−5 60.376 4.47 × 10−1 1.31 × 10−7
Gravitational Search Algorithm 6.55 × 10−5 226.204 7.34 × 10−1 1.31 × 10−7
Bat Algorithm 6.58 × 10−5 157.976 1.48 × 10−1 1.32 × 10−7
Firefly Algorithm 6.63 × 10−5 204.654 3.69 1.33 × 10−7
Moth Flame Optimizer 6.92 × 10−5 431.196 6.93 × 10−1 1.38 × 10−7
Particle Swarm Algorithm 6.93 × 10−5 130.818 2.41 × 10−1 1.39 × 10−7
Cuckoo Search 6.94 × 10−5 172.876 2.09 × 10−1 1.39 × 10−7
Fish School Search 7.02 × 10−5 989.846 2.07 1.40 × 10−7
Harris Hawks Optimization 1.17 × 10−4 737.712 1.53 2.34 × 10−7
Flower Pollination Algorithm 3.91 × 10−4 978.582 1.29 7.81 × 10−7
Forest Optimization Algorithm 3.16 × 10−3 996.474 4.68 × 10−1 6.32 × 10−6
Bacterial Foraging Optimization 7.55 × 10−3 1000 3.94 1.51 × 10−5
Fireworks Algorithm 1.32 × 10−2 1000 3.85 2.64 × 10−5
Harmony Search 1.77 × 10−2 1000 6.30 × 10−2 3.54 × 10−5
Clonal Selection Algorithm 2.53 × 10−2 911.428 9.80 × 10−1 5.07 × 10−5
Cat Swarm Optimization 3.69 × 10−2 1000 3.08 7.38 × 10−5
Grey Wolf Optimizer 7.70 × 10−2 1000 1.43 1.54 × 10−4
Genetic Algorithm 1.85 × 10−1 1000 3.63 × 10−1 1.85 × 10−2
Sine Cosine Algorithm 2.81 1000 6.38 × 10−1 5.62 × 10−3
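Several of the benchmark problems in Table 4 have compact closed forms. The sketch below gives the standard textbook definitions of four of them; the exact implementations used for the experiments are the ones in the project's repository [419], and details such as domain bounds are omitted here:

```python
import math

def alpine(x):
    # Alpine N.1: f(x) = sum |x_i * sin(x_i) + 0.1 * x_i|, global minimum 0 at x = 0
    return sum(abs(xi * math.sin(xi) + 0.1 * xi) for xi in x)

def griewank(x):
    # Griewank: 1 + sum(x_i^2 / 4000) - prod(cos(x_i / sqrt(i))), minimum 0 at x = 0
    s = sum(xi * xi for xi in x) / 4000.0
    p = 1.0
    for i, xi in enumerate(x, start=1):
        p *= math.cos(xi / math.sqrt(i))
    return 1.0 + s - p

def dixon_price(x):
    # Dixon-Price: (x_1 - 1)^2 + sum_{i=2}^n i * (2 x_i^2 - x_{i-1})^2
    return (x[0] - 1.0) ** 2 + sum(
        i * (2.0 * x[i - 1] ** 2 - x[i - 2]) ** 2 for i in range(2, len(x) + 1)
    )

def qing(x):
    # Qing: sum (x_i^2 - i)^2, minimum 0 at x_i = +/- sqrt(i)
    return sum((xi * xi - i) ** 2 for i, xi in enumerate(x, start=1))
```

These unimodal and multimodal forms are what make the table's spread meaningful: Griewank's cosine product creates many local minima, while Dixon-Price is a smooth valley that punishes poor exploitation.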
It is necessary to remember that each algorithm may find better, or significantly better, solutions with suitably tuned parameters. Of course, finding appropriate parameters for every combination of algorithm and problem is not easy, and even then, human error aside, the results would not be suitable for comparison. Instead, we kept the conditions as consistent as possible across algorithms: each algorithm's parameters were adjusted to provide the best possible results on the Ackley benchmark function, and the maximum population for each algorithm was set to 25. Thus, each algorithm solved the problems with the same population size and fixed parameters, within a maximum of 1000 iterations. The Python codes and parameters of both the problems and the algorithms are available in the project's GitHub repository [419].
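The evaluation protocol above can be sketched as a small harness: every algorithm gets the same population size (25) and iteration cap (1000), and a trial stops early once the cost is acceptable. In the sketch below, plain random sampling stands in for the 21 reviewed metaheuristics, and the stopping tolerance is an assumed value, not the one used for Table 4:

```python
import math, random, time

def ackley(x):
    # Ackley benchmark (the function used here to tune algorithm parameters)
    n = len(x)
    s1 = sum(xi * xi for xi in x) / n
    s2 = sum(math.cos(2.0 * math.pi * xi) for xi in x) / n
    return -20.0 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20.0 + math.e

def run_trial(cost, dim=2, pop=25, max_iter=1000, tol=1e-4, bound=5.0):
    # Generic budgeted trial: fixed population size and iteration cap for every
    # algorithm; uniform random sampling stands in for a real metaheuristic.
    rng = random.Random(0)
    best = float("inf")
    t0 = time.perf_counter()
    for it in range(1, max_iter + 1):
        for _ in range(pop):
            x = [rng.uniform(-bound, bound) for _ in range(dim)]
            best = min(best, cost(x))
        if best < tol:  # early stop once an acceptable cost is reached
            break
    return {"cost": best, "iterations": it, "time_s": time.perf_counter() - t0}

result = run_trial(ackley)
```

Reported per-problem numbers in Table 4 are then means of such trials over repeated runs.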
Considering the results, most of the algorithms provide an acceptable result for the majority of problems. For each problem, the best and worst results in each column (cost, iterations, time, and error) are highlighted in green and red in the table. Counting the problems in which an algorithm achieved the best cost value, the sine cosine algorithm came first, with five problems, whereas clonal selection was the worst algorithm, providing the least accurate results in three problems. It should be noted that, at the same time, the sine cosine algorithm had the least accurate results on the Qing problem. Some other algorithms, such as artificial bee colony, cat swarm optimization, the firefly algorithm, and the fireworks algorithm, also had the best results in some problems.
A significant result is the low performance of the genetic algorithm compared to the other algorithms, which aligns well with the results of other research [420]. It also had the highest error and iteration counts in the majority of problems. BFO showed low performance on most problems, but it had good results in solving Chung-Reynolds and Dixon-Price; other research also confirms this and indicates its acceptable performance on other benchmark functions such as the quadratic (sphere), Rosenbrock, and Rastrigin functions. Some adaptive forms of BFO have shown significantly better performance [421]. Figure 66 compares the algorithms with the best and worst results in solving the problems.
Figure 66. Comparison of algorithms, number of problems with best and worst cost functions.
One interesting result is that, in six problems, the Harris hawks algorithm had the least number of iterations; it was also the fastest algorithm in most problems. In most of those problems, it was able to find the optimal solution in fewer than 30 iterations, with an average solve time on the order of 10−2 s. Among the algorithms with low iteration counts, the artificial bee colony, bees algorithm, fireworks algorithm, and grey wolf optimizer must also be mentioned; however, the last two algorithms had the highest iteration count in two problems. Harmony search, bacterial foraging, forest optimization, clonal selection, and other algorithms were unable to find accurate enough solutions in fewer than 1000 iterations. Figure 67 illustrates the algorithms with the least and the greatest number of iterations.
Figure 67. Comparison of algorithms, number of problems with least and most iterations.
In contrast to their poor performance regarding the iteration count, the harmony search and grey wolf algorithms had the fastest performance in three problems. The bacterial foraging and firefly algorithms were the slowest algorithms in four problems. Other information regarding the average calculation time of the algorithms is shown in Figure 68.
One other important factor, specifically for problems such as Powell and Dixon-Price, which have high cost values, is the mean square error of the final result. While sine cosine, artificial bee colony, and four other algorithms had the most accurate results, the genetic algorithm had the worst. Notably, some of the results were on the order of 10−4 to 10−5 yet still had the highest errors compared to the other algorithms. Figure 69 illustrates the comparison of the most and least accurate algorithms.
Figure 68. Comparison of algorithms; number of problems with shortest and longest calculation time.
Figure 69. Comparison of algorithms; number of problems with the lowest and highest error.
4. Nature-Inspired Algorithms in Drones and Aerospace Engineering
This section reviews the potential applications of nature-inspired algorithms in different fields of aerospace engineering and drones. A comprehensive review of this field would require extensive work, so this section only provides a brief overview of potential applications; a deep study of the different applications, their specifications, and their requirements, together with a comprehensive review of the literature, is left for future work. To provide such an overview, it is helpful to review similar research in this field, as many other researchers have studied optimization problems in aerospace and drones. For example, Liu et al. have reviewed the applications of convex optimization in aerospace engineering. Based on their research, optimization methods have applications in the optimal trajectory design, collision avoidance, and formation control of UAVs. Regarding spacecraft, there are many applications in designing optimal trajectories, optimal control policies, optimal rendezvous guidance, optimal guidance for a swarm of spacecraft, and satellite station keeping. Optimization algorithms are also useful in high-speed aircraft: there is much room for developing optimization approaches for re-entry vehicles, hypersonic aerial vehicles, rockets, and gliders [422]. Padula et al. have also studied optimization applications in aerospace accounting for uncertainty. According to them, optimization methods are key tools in aircraft impact dynamics, optimizing the weight, increasing safety, optimizing airfoil shape, aerodynamic wing design, and structural wing design [423]. Mieloszyk has also reviewed the applications of numerical optimization in aerospace. According to him, specifications such as speed and accuracy are necessary for aerospace applications. He has reviewed optimization in airfoil geometry optimization considering the maximum-lift
Figure 70. Classification of optimization algorithms in aerospace systems and drones.
optimizer and moth-flame algorithm have the best performance [428]. Optimal conceptual design is also a popular approach in satellite design, typically formulated as an MDO problem [429,430]. In the conceptual design of novel and unconventional drone configurations such as tilt-rotor, tilt-wing, and helicopter, optimization serves as a key tool. The design of such drones involves multiple technical aspects, including the need to balance performance requirements such as range, speed, and payload capacity while also ensuring safety and stability. Optimization algorithms enable the exploration of large design spaces and the identification of optimal solutions that satisfy multiple design objectives and constraints [431,432]. PSO, for instance, has been used to solve the optimal conceptual design of aircraft to find the best possible configuration [433].
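To illustrate how PSO explores such a design space, here is a minimal sketch on a hypothetical two-variable conceptual-design problem; the objective, its target values, and its weights are invented for illustration and are not taken from [433]:

```python
import random

def design_cost(x):
    # Hypothetical conceptual-design objective: penalize deviation from a
    # target value for each design variable (invented targets and weights).
    return (x[0] - 3.0) ** 2 + 0.5 * (x[1] - 7.0) ** 2

def pso(cost, dim=2, pop=25, iters=200, lo=0.0, hi=10.0, w=0.7, c1=1.5, c2=1.5):
    rng = random.Random(1)
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    vs = [[0.0] * dim for _ in range(pop)]
    pbest = [list(x) for x in xs]
    pcost = [cost(x) for x in xs]
    g = min(range(pop), key=lambda i: pcost[i])
    gbest, gcost = list(pbest[g]), pcost[g]
    for _ in range(iters):
        for i in range(pop):
            for d in range(dim):
                # classic velocity update: inertia + cognitive + social terms
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] = min(hi, max(lo, xs[i][d] + vs[i][d]))
            c = cost(xs[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = list(xs[i]), c
                if c < gcost:
                    gbest, gcost = list(xs[i]), c
    return gbest, gcost

best, best_cost = pso(design_cost)
```

The same loop scales to many more design variables; in a real conceptual design, `design_cost` would wrap performance, stability, and safety models rather than a toy quadratic.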
4.3. Structure
Structural efficiency in aerospace systems and drones is a critical topic. For many important reasons, including safety, energy consumption, and cost reduction, an efficient structural design is necessary. There are many problems in this field (including structure weight optimization, stress reduction, elastic and aeroelastic characterization, and thermal resilience) that need appropriate optimization methods. In complex, computational-method-based structural problems where deterministic algorithms are not applicable, stochastic nature-inspired algorithms can provide acceptable and relatively fast results [443]. These algorithms can also be used for optimal finite element model updates in structural analysis. For example, Boulkabeit et al. have used the fish school search algorithm for the FEM model update of a GARTEUR SM-AG19 aircraft structure, showing that this algorithm provides more accurate results than GA and PSO [444]. This section provides an overview of the optimization tools used in aircraft design, with a focus on their applications in optimizing structural features and computational analysis.
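At its core, the finite element model update mentioned above is a parameter-identification problem: tune uncertain model parameters until predicted modal frequencies match measured ones. A minimal sketch on a two-degree-of-freedom spring-mass surrogate follows; the "measured" frequencies and the random-search loop are illustrative stand-ins for a real FEM and for the fish school search used in [444]:

```python
import math, random

def modal_freqs(k1, k2):
    # Natural frequencies (rad/s) of a 2-DOF spring-mass chain with unit masses:
    # eigenvalues of the stiffness matrix K = [[k1 + k2, -k2], [-k2, k2]]
    tr, det = k1 + 2.0 * k2, k1 * k2
    disc = math.sqrt(tr * tr - 4.0 * det)
    return [math.sqrt((tr - disc) / 2.0), math.sqrt((tr + disc) / 2.0)]

measured = modal_freqs(4.0, 2.0)  # pretend these came from a modal test

def residual(k):
    # Relative frequency residual minimized during model updating
    pred = modal_freqs(k[0], k[1])
    return sum(((p - m) / m) ** 2 for p, m in zip(pred, measured))

# Plain random search stands in for the metaheuristic (e.g., fish school search)
rng = random.Random(0)
best_k, best_r = None, float("inf")
for _ in range(20000):
    k = [rng.uniform(0.5, 10.0), rng.uniform(0.5, 10.0)]
    r = residual(k)
    if r < best_r:
        best_k, best_r = k, r
```

In a real update, each residual evaluation requires a full eigenvalue solve of the FEM, which is exactly why fast-converging metaheuristics matter here.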
Structure Design
The use of optimization methods in aircraft design is crucial for ensuring safety in crash accidents and for reducing weight. The optimization goal can be the geometry of the aircraft or specific structural features, such as the ribs in wings, to optimize stress and manufacturability. Nature-inspired algorithms, including GA, have been widely used to optimize these features. Additionally, bio-inspired algorithms can optimize the process parameters of welding in aircraft wing structures, aeroelasticity characteristics, and elastic deformation in classic aluminum-based structures or composite materials. Nature-inspired algorithms can also be applied to computational structural analysis, such as optimizing the fiber orientation of a composite wing to maximize flutter speed.
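A minimal sketch of the discrete side of such problems: a GA searching over a stacking sequence of conventional ply angles to maximize a toy stiffness score. The objective and the balance rule below are simplified stand-ins, not a real laminate model:

```python
import random

ANGLES = [0, 45, -45, 90]  # conventional ply angle set
N_PLIES = 8
rng = random.Random(42)

def fitness(seq):
    # Toy objective: reward 0-degree plies (axial stiffness) but require a
    # balanced laminate (equal counts of +45 and -45); simplified stand-in.
    if seq.count(45) != seq.count(-45):
        return -1.0
    return seq.count(0) / len(seq)

def ga(pop_size=30, gens=60, p_mut=0.2):
    pop = [[rng.choice(ANGLES) for _ in range(N_PLIES)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]        # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, N_PLIES)     # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < p_mut:            # ply-angle mutation
                child[rng.randrange(N_PLIES)] = rng.choice(ANGLES)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = ga()
```

The encoding is what matters here: each chromosome is a discrete ply sequence, which is exactly the kind of variable gradient-based methods cannot handle directly.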
Drones often use lightweight structures made of carbon fiber reinforced plastics (CFRP), particularly continuous unidirectional (UD) carbon fibers, due to their superior mechanical properties. The layered nature of UD plies offers design flexibility, allowing the mechanical properties of the resulting laminate to be tailored to best suit the applied loading and stiffness requirements. The challenge in optimizing the stacking sequence of large aerospace structures lies in the mixed discrete and continuous nature of the problem: structural constraints such as strength and maximum displacements are formulated using continuous quantities, while design and manufacturing rules concern discrete plies. Nature-inspired algorithms require numerous iterations, which makes the calculations expensive; gradient-based algorithms, meanwhile, handle physical constraints better. A hybrid approach is suggested as the best solution: heuristic algorithms handle the discrete variables, while gradient-based algorithms are responsible for the physical constraints. To bridge the gap between these stages, multiple iterations of the two-stage
4.4. Aerodynamics
Aerodynamic shape design is a popular field that focuses on optimizing the airfoil, wing, fuselage, control surfaces [452], tail, and other aerodynamics-related parts of an aircraft. Especially when combined with computational fluid dynamics (CFD), aerodynamic shape optimization (ASO) becomes an important approach in modern aircraft design: it considerably reduces the aircraft development cycle time and improves performance [453]. Lian et al. have studied the use of evolutionary algorithms in aerodynamic applications. Based on their survey, EAs and hybrid algorithms can be used in the design of turbopumps, compressors, and micro air vehicles [425]. Giannakoglou has also studied the design of aerodynamic shapes using stochastic algorithms such as evolutionary algorithms, genetic algorithms, evolution strategies, and their variants [454], showing the performance of such algorithms in designing uniform and multi-element airfoils.
Figure 71. Design of optimized uniform and multi-element airfoil [454].
It has been shown that in a lift-maximization airfoil design problem, GA gives better results than SA and gradient-based optimization, while its computational cost is higher [455]. The application of evolution strategy algorithms has also been studied in wing and blade airfoil design [456]. Tian and Li have used an improved version of FFO to solve the airfoil shape design problem; according to them, this algorithm provides better results than some of the evolutionary algorithms [457]. Predictably, PSO has been used for airfoil shape design as well, and the results show better performance and convergence speed for PSO compared to GA [458]. Naumann et al. have used a modified version of CS to optimize airfoil shape in order to maximize the lift/drag ratio [459]. ABC has also been used to optimize wind turbine blade shape using CFD and BEM methods [460]. Hoseynipoor et al. have used GSA for two-dimensional airfoil shape design to achieve the maximum lift-over-drag ratio [461]. HS has also been used for this problem [462,463]. Jalili has applied the SCA and obtained a smooth shape and low drag [382].
Another feasible problem is the three-dimensional design of the wing, considering the airfoil, wing span, sweep angle, wing sections, etc. Wing shape optimization considering structural loads, deformations, and even both aerodynamic and structural loads and their interaction at the same time is another application [423]. A common optimization method in this group is CFD-based optimization, which tries to systematically compute different variants of the two- or three-dimensional model using computational fluid dynamics methods. Generally, a piece of code is responsible for getting the model, providing appropriate meshing, and running the computations, with an optimization process driving this code [436].
4.4.2. Wing and Tail Design
In addition to airfoils, the overall configuration of the wings can also be optimized using bio-inspired algorithms. Specifically, in unconventional wings, including multi-
dihedral angle) or to get a primary design of the wing or tail to base the main design process on it. A good reason behind this is that, compared to gradient-based approaches, bio-inspired algorithms such as GA have been shown to have higher computation costs. Even so, considering their easier implementation, they seem more suitable for low-detail problems and preliminary design [455]. Nature-inspired algorithms have shown promise in solving complex multidisciplinary design challenges. Researchers have used algorithms such as PSO, ACO, BFO, DE, and ABC in hybrid forms or in combination with machine learning to improve aerodynamic performance and to optimize wing size, topology, aeroelastic design, and morphing wing tip design.
In such cases, for example, in the MDO design of aircraft wings, the potential and performance of bio-inspired algorithms like PSO have been shown [467]. Wang et al. have also shown the
effectiveness of ACO in solving the optimization problem of wing size and topology [468].
BFO has also been used to solve the optimization problem of the aeroelastic design of
a rectangular wing [451]. The applicability of the DE algorithm has been shown for
multimodal problems like wing and airfoil design [469]. Li et al. have also used FSO
to design a variable camber morphing wing [470]. Another research also focuses on the
application of ABC in morphing wing tip design of aircraft considering aerodynamic
specifications [471].
Figure 73. Classification of aerospace/drone systems.
Drones 2023, 7, 427 96 of 134
Some research focuses on optimal equipment placed inside the fuselage using GA [474].
Li et al. have used bat algorithm to locate the flapping hinge in a coaxial helicopter to
reduce the vibration and achieve minimum hub [475]. Viviani et al. have also used a variant
of GA to solve the problem of the optimal body shape of a re-entry vehicle [476]. Another
similar research is done by Arora and Kumar in the aerodynamic shape optimization of a
re-entry vehicle [477]. PSO has also been used to find the optimal geometric body shape in a
flying wing glider [478]. Generally, the geometric shape design of a flying wing considering
aerodynamic specifications contains non-convex functions, which make algorithms such
as DE suitable candidates [479]. Chen et al. have also applied DE for satellite layout
optimization or equipment placement. They have considered three-dimensional layout
optimization, which is an NP-hard problem. They have shown the robustness and efficiency
of this algorithm and its hybrid variants in the optimal layout design of satellites with up
to 40 pieces of equipment to be placed [480].
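Since DE recurs throughout this section, a minimal sketch of the classic DE/rand/1/bin scheme may help make the mechanics concrete. This is a generic box-bounded minimizer with illustrative control parameters (F = 0.8, CR = 0.9), not the satellite-layout formulation of [480]:

```python
import random

def differential_evolution(f, bounds, pop_size=30, gens=200, F=0.8, CR=0.9, seed=0):
    """Classic DE/rand/1/bin for a box-bounded objective f(vector) -> float."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # pick three distinct individuals, all different from i
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # guarantee at least one mutated coordinate
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == jrand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(hi, max(lo, v))  # clip the mutant to the box
                else:
                    v = pop[i][j]
                trial.append(v)
            trial_cost = f(trial)
            if trial_cost <= cost[i]:  # greedy one-to-one replacement
                pop[i], cost[i] = trial, trial_cost
    best = min(range(pop_size), key=lambda i: cost[i])
    return pop[best], cost[best]
```

On a simple 5-dimensional sphere function, this sketch converges to near zero within the default budget; a real layout problem would replace `f` with the overlap- and balance-aware cost the cited work describes.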
Table 7 summarizes the studied publications in aerodynamic optimization and the algorithms they have used. The most popular algorithm and application in this section are GA and airfoil shape design, respectively.
priority of avoiding detected hazards or choosing the shortest path [482]. Yu et al. have
proposed using drones for disaster situational awareness by optimizing their path planning
through an adaptive selection mutation constrained DE algorithm. The algorithm selects
individuals based on their fitness values and constraint violations, improving exploitation
and maintaining exploration. Experimental results show that the proposed algorithm is
competitive with state-of-the-art algorithms, making it suitable for disaster scenarios [483].
Qu et al. have designed a novel reinforcement learning-based GWO (RLGWO) to address
the challenge of high-quality path planning for drones in complex three-dimensional flight
environments. The proposed algorithm incorporates reinforcement learning to enable
adaptive switching of operations based on accumulated performance. Four operations—
namely, exploration, exploitation, geometric adjustment, and optimal adjustment—are
introduced for each individual to serve UAV path planning. The generated flight route is
smoothed using the cubic B-spline curve, making it suitable for UAVs. Simulation results
demonstrate the RLGWO algorithm’s feasibility and effectiveness in generating a suitable
path for UAVs in complex environments [484]. Another research by Shen et al. focuses on
a multi-objective optimization approach to path planning in a three-dimensional terrain
scenario with constraints, using an evolutionary algorithm based on multi-level constraint
processing (ANSGA-III-PPS) to plan the shortest collision-free flight path of a gliding
UAV. The proposed algorithm employs an adaptive constraint processing mechanism to
improve path constraints in a three-dimensional environment and an improved adaptive
non-dominated sorting GA to enhance path planning ability in a complex environment.
Experimental results demonstrate that ANSGA-III-PPS outperforms four other algorithms
in terms of solution performance, validating the effectiveness of the proposed algorithm
and enriching research results in UAV path planning [485].
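The planners above differ in search strategy but typically share the same fitness structure: a candidate path is encoded as a waypoint vector and scored by its length plus penalties for constraint violations. A minimal sketch of such a cost function, assuming circular obstacles and a hypothetical penalty weight (not any specific paper's formulation):

```python
import math

def path_cost(waypoints, obstacles, penalty=1000.0, samples=10):
    """Fitness of a 2D waypoint path: total length, plus a penalty for every
    sampled point along a segment that falls inside a circular obstacle.
    Obstacles are (x, y, radius) tuples; `penalty` is an illustrative weight."""
    cost = 0.0
    for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:]):
        cost += math.hypot(x1 - x0, y1 - y0)  # segment length
        for k in range(samples + 1):          # sample points along the segment
            t = k / samples
            px, py = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
            for ox, oy, r in obstacles:
                if math.hypot(px - ox, py - oy) < r:
                    cost += penalty           # constraint violation
    return cost
```

A meta-heuristic then searches over the waypoint coordinates to minimize this value; a straight path through an obstacle scores worse than a detour around it.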
One of the popular algorithms in path planning is bat algorithm. Lin et al. have
used an improved version of the bat algorithm in a combination of artificial potential field
and chaos strategy for UAV path planning. Using this approach, they have achieved a
robust performance and global optimality [486]. Wang et al. have also used an improved
version of the bat algorithm for UAV’s optimal dynamic target tracking problem [487]. Bat
algorithm has also been used for optimal pitch control [488] and landing of aircraft [489].
Li et al. have used clonal selection algorithm for optimal UAV route evaluation [490].
The trajectory-tracking problem of a quadrotor has been studied and solved by cuckoo
search and PSO [491]. Trajectory planning of MAVs in urban environments using cuckoo
search has been studied by Hu et al. [492]. Zhang et al. have applied an improved version
of DE for online path planning of a quadrotor. Based on them, this algorithm produces
feasible paths and better performance compared to PSO and GA [493]. Nikolos and Brintaki
have also considered coordinated path planning of UAVs using DE [494]. Alihodzic has
used the fireworks algorithm for solving the NP-hard problem of UAV path planning.
According to him, this algorithm outperforms other nature-inspired algorithms such as
DE, PSO, and CS [495]. This is well-aligned with the results of the comparison in Section 3, where the fireworks algorithm stands at the top of the list for most of the benchmark functions. Zhangs have also studied the path planning problem in UAVs using the hybrid
method of DE and fireworks algorithm [496]. Roberge and Tarbouchi have used flower
pollination algorithm for real-time trajectory planning of a UAV [497]. Li and Duan have
studied the problem of optimal path planning for a UAV using the gravitational search
algorithm [498]. Qu et al. have also used the GWO for optimal path planning of a UAV [499].
Lou et al. have developed an improved butterfly optimization (BOA-TSAR) algorithm for
autonomous three-dimensional pathfinding of drones in complex spaces. The algorithm
improves the randomness strategy of initial population generation using the tent chaotic
mapping method, adaptive nonlinear inertia weights, a simulated annealing strategy,
and stochasticity mutation with global adaptive features. Simulation experiments verify
the superior performance of BOA-TSAR, achieving optimal path length and smoothness
measures. The algorithm is competitive among swarm intelligence algorithms of the same
type [500].
UAVs are being increasingly used for a variety of civilian applications such as delivery,
logistics, surveillance, entertainment, and more. Path and trajectory selection for UAVs
can be formalized as a TSP path optimization problem under constraints, which shares
similarities with problems that have been studied in the context of urban vehicles. A
recent study by Khoufi et al. reveals the applications of GA, PSO, ACO, and SA in solving
this problem [501]. The drone-truck problem is one of the famous optimization problems studied by numerous researchers. This problem is usually modeled as a travelling salesman problem (TSP) in which a truck and a drone are used for package delivery. The goal is to deliver packages by passing each city or node only once. The drone leaves the truck in a city and returns in another city for package loading and recharging (switching) batteries. Based on recent research, the majority of efforts in this field focus
on applying heuristic algorithms [502]. Cooperation of drones with other vehicles such as
underwater and ground vehicles is another problem which can be solved by nature-inspired
algorithms. Drones can be used to support the operation of other vehicles and drones or
may perform independent missions [503]. Weng et al. propose a cooperative truck and
drone delivery path optimization problem to minimize delivery task completion time. The
truck carries cargo along the outer boundary of a restricted traffic zone and sends/receives
the drone responsible for delivering the cargo to customers. To solve this problem, a hybrid
meta-heuristic optimization algorithm based on WWO is applied to optimize the paths
of the truck and drone. Experimental results show that the proposed algorithm performs
competitively compared to other popular optimization algorithms such as basic WWO, GA,
PSO, DE, BBO, and EBO [504].
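Since the drone-truck literature above builds on TSP-style formulations, a compact illustration of the heuristic approach is a nearest-neighbor construction followed by 2-opt local improvement. This sketch covers only the single-vehicle tour, not the truck-drone coordination itself:

```python
import math

def tour_length(tour, pts):
    """Total length of a closed tour over point indices."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def nearest_neighbor(pts, start=0):
    """Constructive heuristic: always visit the closest unvisited node."""
    unvisited = set(range(len(pts))) - {start}
    tour = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda j: math.dist(pts[tour[-1]], pts[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def two_opt(tour, pts):
    """Local improvement: reverse a segment whenever that shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(cand, pts) < tour_length(tour, pts):
                    tour, improved = cand, True
    return tour
```

The meta-heuristics surveyed here (GA, PSO, ACO, SA) replace or extend this local search with population-based or stochastic exploration of the same permutation space.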
Coverage path planning is another field in drone guidance and control which relies on optimization algorithms. In this problem, a path should be found in such a way that all points of some area are covered at least once. Some topics such as the range of the drone's camera and duplicate coverage avoidance make coverage path planning a challenging problem. It may be combined with other objectives such as energy reduction or coverage in minimum time. Other challenges such as obstacle avoidance make the final optimization problem even harder. This problem is common in agriculture, environment protection, disaster management, and search and rescue applications. Otto et al. have shown that the majority of the research in this area applies heuristic and meta-heuristic algorithms [503].
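For intuition, the simplest coverage pattern underlying many of these planners is a boustrophedon (back-and-forth) sweep. The sketch below generates waypoints for an obstacle-free rectangular field, with the swath width standing in for the drone camera footprint (all parameters illustrative):

```python
def boustrophedon(width, height, swath):
    """Waypoints for a back-and-forth sweep of a width x height rectangle.
    `swath` is the effective camera footprint width; rows alternate direction
    so the whole area is covered without duplicate passes."""
    waypoints = []
    y, left_to_right = swath / 2.0, True
    while y < height:
        row = [(0.0, y), (width, y)]
        waypoints.extend(row if left_to_right else row[::-1])
        left_to_right = not left_to_right
        y += swath
    return waypoints
```

Adding obstacles, energy limits, or a minimum-time objective on top of this baseline is what turns the problem into the hard optimization task the cited surveys describe.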
The complete automation of fixed-wing UAV operations involves autonomous execu-
tion of take-off, cruising, and landing. The landing stage is particularly crucial, requiring
the UAV to maintain a constant speed and glide slope to ensure stability and a successful
touchdown on the runway, while also estimating the landing point accurately in minimal
time. Incorporating bio-inspired algorithms into UAV control systems can improve the
accuracy and speed of landing point estimation. A study by Ilango and R. utilized the
bats optimization algorithm, moth flame optimization algorithm, and artificial bee colony
algorithm to determine the computed path coordinates and optimal landing point within
the operational limits of the UAV. The objective was to identify the optimal landing point
in minimal time based on the computed points. The error rate between the actual path
and estimated path computed points was used to measure performance. Empirical re-
sults indicate that the moth flame optimization algorithm performs the best, taking the
least amount of time to compute the optimal point with minimal error, compared to the
other two optimization algorithms examined [505]. A dual swarm optimization algorithm
that combines the dragonfly optimization method and the DE method is designed by
Liang et al. to address the obstacle avoidance trajectory planning problem in the landing
process of micro drones. An orthogonal learning mechanism is implemented to facilitate
adaptive switching between the two algorithms. In the landing route planning process,
the planning plane is obtained by making the gliding plane tangent to the obstacle. The
obstacle projection is transformed into multiple unreachable line segments in the planning
plane. An optimization model is designed to transform the three-dimensional landing
route planning problem into a two-dimensional obstacle avoidance route optimization
problem. The shortest route is chosen as the optimization objective, and a penalty factor
is introduced into the cost function to prevent the intersection of the landing route and
obstacle. During the optimization process, the hybrid algorithm adaptively selects the next
iterative algorithm through orthogonal learning of intermediate iterative results, allowing
for the full utilization of the respective advantages of the two algorithms. The optimization
results demonstrate that the proposed hybrid optimization algorithm is more effective in
solving the landing route planning problem for micro-small UAVs compared to a single
optimization algorithm [506]. The problem of landing scheduling has also been considered
in applications of nature-inspired algorithms. Some research suggests applying flower
pollination for this problem [507,508], while others focus on GWO [509] and harmony
search algorithm [510]. Abdul-Razaq and Ali have also used the bees algorithm to provide
a nature-inspired landing scheduling for aircraft [511]. Jia et al. have also solved a similar
problem using the clonal selection algorithm [512].
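A toy version of the landing-scheduling problem these papers address can be written as a permutation search. The sketch below uses simulated annealing (one of the classic meta-heuristics reviewed earlier) with hypothetical target times and a fixed separation constraint, rather than any cited paper's exact model:

```python
import math
import random

def schedule_landings(target_times, separation, iters=4000, seed=5):
    """Simulated-annealing sketch for landing scheduling: find an ordering of
    aircraft minimizing total delay, given target landing times and a minimum
    separation between consecutive landings (all values illustrative)."""
    rng = random.Random(seed)

    def total_delay(order):
        t, delay = -separation, 0.0
        for k in order:
            t = max(t + separation, target_times[k])  # land no earlier than target
            delay += t - target_times[k]
        return delay

    order = list(range(len(target_times)))
    current, best = order[:], order[:]
    cur_cost = best_cost = total_delay(order)
    temp = 10.0
    for _ in range(iters):
        i, j = rng.sample(range(len(current)), 2)      # propose a swap
        current[i], current[j] = current[j], current[i]
        c = total_delay(current)
        if c <= cur_cost or rng.random() < math.exp((cur_cost - c) / temp):
            cur_cost = c
            if c < best_cost:
                best, best_cost = current[:], c
        else:
            current[i], current[j] = current[j], current[i]  # undo the swap
        temp *= 0.999                                   # geometric cooling
    return best, best_cost
```

Real formulations add runway assignment, time windows, and aircraft-class-dependent separations, but the accept-worse-moves-with-decaying-probability pattern is the same.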
Trajectory optimization in space missions is another important optimization problem. An optimized trajectory is a key factor in vehicle stability and mission success. Chai et al. have reviewed the optimization techniques in the trajectory design of spacecraft. According to
their research, in complex trajectory design problems where gradient-based approaches are
not applicable, stochastic or evolution-based algorithms such as GA, PSO, DE, and ACO
have been used solely or in combination with other approaches such as gradient-based
algorithms to solve the optimal trajectory design of spacecraft. However, they indicate
when using NIAs the validation of solution optimality becomes difficult, and the com-
putational complexity due to the heuristic optimization process tends to be very high,
making it challenging to treat heuristic-based methods as a standard optimization algo-
rithm that can solve general spacecraft trajectory planning problems [513]. Shuang et al.
have also conducted similar research. They have studied the optimization
approaches in international and China’s national trajectory optimization competitions.
Based on their research in different missions and problems such as multi-spacecraft ex-
ploration and removing space debris, which contains trajectory design and rendezvous
problems, algorithms such as GA, PSO, ACO, and some forms of hybrid algorithms have
been used by researchers [514]. Shirazi et al. have also conducted similar research which
concluded that common objectives in spacecraft trajectory optimization are Mayer term
(state or input at the end of trajectory), time, velocity, Lagrange term (integral of input or
state along the trajectory), acceleration, fuel mass, or smoothness of the trajectory. Based
on their findings, in addition to previous algorithms, DE and SA have also been used in
spacecraft trajectory optimization [515]. Su and Wang have studied the optimal trajectory
optimization of a reusable launch vehicle using GSA [516].
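The objective terms listed by Shirazi et al. can be made concrete with a discretized evaluation function. The sketch below scores a sampled 2D trajectory with a Mayer term (distance of the final state to a target) plus a Lagrange term (integrated squared control effort); the weights and the 2D setting are illustrative assumptions, not a cited formulation:

```python
import math

def trajectory_objective(states, controls, dt, target=(0.0, 0.0),
                         w_mayer=1.0, w_lagrange=0.1):
    """Score a discretized 2D trajectory: the Mayer term is the distance of
    the final state to a target point; the Lagrange term is the integral of
    squared control effort along the trajectory. Weights are illustrative."""
    mayer = math.dist(states[-1], target)
    lagrange = sum(u * u for u in controls) * dt
    return w_mayer * mayer + w_lagrange * lagrange
```

A meta-heuristic such as GA or PSO then searches over the control sequence (and hence the resulting states) to minimize this scalar.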
GNC can be considered the most popular application for nature-inspired optimization algorithms. These algorithms have been widely used in optimal control, path planning, task allocation in a swarm, mission planning, obstacle avoidance, formation control, and autonomous flight. Zhou et al. have studied UAV swarm systems; based on their research, algorithms such as ACO, wolf swarm, ABC, and the firefly algorithm have shown their potential applications in solving UAV swarm distributed control problems, as have GA and PSO. Nature-inspired algorithms such as GA, ACO, GWO, glowworm optimization, wolf pack, simulated annealing, and the krill herd algorithm also have considerable applications in task allocation in swarm systems. Regarding path planning specifically, algorithms such as PSO, GWO, fruit fly optimization, and the pigeon-inspired algorithm have been used so far for three-dimensional path planning, dynamic path planning, area coverage path planning, and other optimal planning applications. Algorithms such as the fireworks algorithm have also been used for the optimization of satellite control law [517]. By model-
ing particulars of the aircraft and constraints of the flight scenario within an optimization
framework solvable via the fireworks algorithm, Xue et al. demonstrate the generation
of optimal trajectories [518]. The complex scheduling and routing issues inherent to air
traffic control systems have also been effectively addressed through the application of the
gravitational search algorithm [519]. Trajectory tracking along with trajectory planning is
another popular problem [520]. Aircraft engine control is also one of the probable applica-
tions. Algorithms such as GWO have been used so far for such problems [521]. Katal et al.
have also used bat algorithm for robust flight control of a UAV [522].
Control parameter tuning using bio-inspired algorithms is one common application
for these algorithms; Lin et al. have used such an approach for PID UAV flight control
tuning with ABC [523]. Bian et al. have done similar research using the bacterial foraging
algorithm [524]. Another example is research by Oyekan and Hu who developed a PID
control gain tuning for a UAV [525]. Bencharef and Boubertakh have also used bat algo-
rithm for parameter tuning of a quadrotor’s PD controller [526]. Zeri et al. have used the
bees algorithm for optimal tuning of an aircraft’s fuzzy controller that consists of a set of
linguistic rules with adjustable membership functions and scaling factors that determine
its performance [527]. Huang and Fei have used clonal selection for parameter tuning of
an active disturbance rejection controller of a UAV [528]. Zatout et al. have used PSO, bat
algorithm, and cuckoo search for optimizing the fuzzy attitude controller of a quadrotor.
According to them, bat algorithm provided a better computing time and performance com-
pared to CS and PSO [529]. Glida et al. have used cuckoo search for parameter optimization
of a quadrotor’s backstepping controller [530]. Pedro et al. also utilized the differential
evolution optimization algorithm for proportional-integral-derivative (PID) gain tuning
of a quadrotor unmanned aerial vehicle (UAV) operating in hovering flight conditions.
The authors were able to achieve significantly improved hovering performance for the
quadrotor UAV compared to untuned initial PID gains. The results demonstrate the utility
of evolutionary optimization techniques such as differential evolution for automating the
complex process of PID controller design for nonlinear dynamical systems such as quadro-
tor vehicles [531]. Wang et al. have also used a similar approach for quadrotor trajectory
tracking using PID [532]. Keskins have conducted similar research on tuning PD parame-
ters of a quadrotor position control using the firefly algorithm [533]. Kaba has also used the
firefly algorithm for PID tuning of a quadrotor's controller [534]. Nonlinear controllers
such as the sliding mode controller and backstepping controller have been tuned by firefly
algorithm for the quadrotor’s optimal flight control [535]. Prabaningtyas and Mardlijah
have used the firefly algorithm for parameter tuning of a linear quadratic Gaussian tracking
control applied in a quadrotor’s trajectory tracking problem [536]. Yin et al. have used the
fireworks algorithm for parameter tuning in a hypersonic vehicle sliding mode control [537].
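The common pattern in these tuning studies is to wrap a closed-loop simulation in a scalar cost and hand the gains to a meta-heuristic. The sketch below does this with basic global-best PSO on a toy double-integrator plant; the plant, gain bounds, and PSO coefficients are illustrative assumptions, not the setup of any cited paper:

```python
import random

def simulate(kp, ki, kd, dt=0.02, steps=300):
    """Integrated squared tracking error for a unit-step reference on a toy
    double-integrator plant (x'' = u) under PID control, explicit Euler."""
    x, v, integ, prev_err, ise = 0.0, 0.0, 0.0, 1.0, 0.0
    for _ in range(steps):
        err = 1.0 - x
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv
        prev_err = err
        v += u * dt
        x += v * dt
        ise += err * err * dt
    return ise

def pso_tune(n=20, iters=60, seed=1, bound=20.0):
    """Global-best PSO over the gain vector (kp, ki, kd) in [0, bound]^3."""
    rng = random.Random(seed)
    pos = [[rng.uniform(0.0, bound) for _ in range(3)] for _ in range(n)]
    vel = [[0.0] * 3 for _ in range(n)]
    pbest = [p[:] for p in pos]
    pcost = [simulate(*p) for p in pos]
    g = min(range(n), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(3):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]                      # inertia
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(bound, max(0.0, pos[i][d] + vel[i][d]))
            c = simulate(*pos[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = pos[i][:], c
                if c < gcost:
                    gbest, gcost = pos[i][:], c
    return gbest, gcost
```

Swapping PSO for firefly, bat, or GWO changes only the position-update rules; the simulation-in-the-loop cost function stays identical.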
Glida et al. have used flower pollination algorithm to optimize a fuzzy adaptive backstep-
ping controller for quadrotor attitude control [538]. A similar approach has been used by
Basri and Noordin using gravitational search optimization [539]. Abbas and Sami have
also used this algorithm for tuning the PID gains of a quadrotor’s controller [540]. Cai et al.
have worked on the application of grey wolf optimization on active disturbance rejection
control parameter tuning for a quadrotor’s trajectory tracking [520]. Hartawan has also ap-
plied the harmony search algorithm for PID gain tuning in a quadrotor's controller [541]. Altan has studied the performance of Harris hawk optimization in PID gain tuning of attitude and altitude controller of a UAV in a path following problem [542]. Yuan et al. have
developed a robust close-formation control system for unmanned aerial vehicle (UAV)
flights using dynamic estimation and compensation to address wake vortex effects and
advance UAV close-formation flights to an engineer-implementation level. The control
system is divided into three control subsystems for the longitudinal, altitude, and lateral
channels, using linear active-disturbance rejection control (LADRC) with two cascaded
first-order LADRC controllers. Sine-powered pigeon-inspired optimization is proposed
to optimize the control parameters for each channel. Simulation results show that the de-
signed control system achieves stable and robust dynamic performance within the expected
error range, maximizing the aerodynamic benefits for a trailing UAV [543]. Jing et al. have
proposed a disturbance-observer-based nonlinear sliding mode surface controller (SMC)
for a simulated PX4-conducted quadcopter and optimized its parameters using PSO. The
quadcopter’s tracking performance is evaluated and compared under various noise and
disturbance conditions against PID control strategies. Results show that the PSO-powered
SMC controller with disturbance observer enables accurate and rapid adaptation of the
One other application for nature-inspired algorithms is aircraft’s active vibration re-
duction. Zarchi and Attaran have developed a method for improving an aircraft’s vibration
absorber using bees algorithm [558]. A similar technique is applied by Toloei et al. [559].
Table 8 summarizes the studied publications in this section. As can be seen, although this section includes a wide range of algorithms, the majority of papers have focused on a limited number of algorithms such as DE, PSO, BA, GWO, and GA. Figure 74 provides the share of different nature-inspired algorithms from the reviewed publications in this section. It can be said that the most popular applications of bio-inspired algorithms in this group are controller parameter tuning, path and trajectory planning, as well as optimal swarm motion.
Figure 74. Share of nature-inspired algorithms and applications from guidance and control publications.
4.5.3. Navigation
Nature-inspired algorithms have shown great potential in the navigation of robots,
particularly in challenging environments such as urban and crowded areas. Optimization
algorithms such as bat algorithm, MFO, PSO, CS, and GWO have been applied to automatic
robot navigation, image processing, and localization problems in swarm systems. These
algorithms have been used in various applications such as integrated navigation, target
recognition, and source localization in UAV-based search and rescue missions. Additionally,
researchers have used CS and DE algorithms for automatic guided vehicles’ navigation and
autonomous UAV swarm coordination, respectively. Furthermore, hybrid versions of GWO
and SCA have been applied to energy-efficient localization of UAVs and visual tracking
techniques, respectively, resulting in better performance. These algorithms provide safe and
collision-free trajectories in the presence of uncertainty, error, and external disturbances.
Recent work has examined applications of algorithms such as the bat algorithm, moth
flame optimizer, PSO, CS, and GWO in automatic robot navigation [570]. Some naviga-
tion methods rely on optimization processes for calculations or process images in novel
vision-based navigation approaches. Optimization algorithms also have applications in
localization problems in a swarm system. Zhangs have applied the PSO for optimal local-
ization in a UAV swarm in order to reduce the localization error [571]. Shanshan et al. have
used ABC for improving the performance of the integrated navigation which resulted in a
reduction in velocity and position error [572]. Duan has also studied the application of ABC
in the target recognition, and PSO in image matching for a low-altitude UAV [573]. Clonal
selection has been used in an automatic guided vehicle’s navigation in a warehouse [574].
Banerjee et al. have applied the cuckoo search algorithm for source localization in UAV-
based search and rescue missions to determine the location of the victim [575]. Alfeo et al.
have used DE in order to provide autonomous UAVs in a swarm with self-coordination and
robustness [576]. Sun et al. have also used DE to increase the geosynchronous synthetic
aperture radar imaging performance of a UAV used in the navigation and path planning
of the UAV [577]. Li et al. have developed a three-dimensional localization approach for
multiple UAVs using a flipping ambiguity avoidance optimization algorithm. Beacon UAVs
collect data and utilize a semidefinite programming-based approach to estimate the global
position of GPS-denied UAVs. They have applied an improved GWO algorithm which
is used to improve positioning accuracy in noisy environments. Simulation results show
the superiority of the proposed approach over similar methods [484]. Arafat and Moh have
developed a similar energy-efficient localization method for UAVs in swarms based on
a hybrid version of GWO [578]. The navigation of drones in urban and crowded places
is a challenging issue. The first problem is safety and the second problem is uncertainty
and inaccurate measurements such as GPS. To overcome this problem, Radmanesh and
Kumar have designed an optimization method based on GWO, which by using automatic
dependent surveillance-broadcast (ADS-B), determines the accurate distance to obstacles
and provides optimal safe and collision-free trajectories [579]. Nenavath et al. have applied
the sine cosine algorithm for a visual tracking technique called trigonometric particle filter
(TPF) to achieve better performance. Based on their research, this algorithm provides better
results compared to spider monkey optimization, firefly algorithm, and PSO [580].
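Several of the localization problems above reduce to minimizing range-measurement residuals at known anchor positions. The sketch below uses a simple (1+1)-style adaptive random search as a stand-in for the swarm optimizers discussed, with noiseless synthetic ranges (all parameters illustrative):

```python
import math
import random

def locate_source(anchors, ranges, iters=2000, seed=3):
    """Estimate a 2D emitter position from range measurements at known anchor
    points by minimizing squared range residuals with a (1+1)-style adaptive
    random search (a simple stand-in for the swarm optimizers in the text)."""
    rng = random.Random(seed)

    def residual(p):
        return sum((math.dist(p, a) - r) ** 2 for a, r in zip(anchors, ranges))

    # start at the anchor centroid
    best = [sum(a[0] for a in anchors) / len(anchors),
            sum(a[1] for a in anchors) / len(anchors)]
    best_cost, step = residual(best), 5.0
    for _ in range(iters):
        cand = [best[0] + rng.gauss(0.0, step), best[1] + rng.gauss(0.0, step)]
        c = residual(cand)
        if c < best_cost:
            best, best_cost = cand, c
            step *= 1.1    # expand the step on success
        else:
            step *= 0.98   # shrink it on failure
    return best
```

With noisy measurements or GPS-denied constraints, the residual surface becomes multimodal, which is precisely where the population-based methods cited above earn their keep.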
Hao et al. have proposed a passive location and tracking algorithm for moving targets
using a UAV swarm. The algorithm is based on an improved particle swarm optimization
(PSO) algorithm. The localization method of cluster cooperative passive localization
is employed, and the problem of improving passive location accuracy is transformed
into the problem of obtaining more target information. The A criterion is used as the
optimization target, and a recursive neural network (RNN) is used to predict the probability
distribution of the target’s location in the next moment, making the localization method
suitable for moving targets. The particle swarm algorithm is improved using grouping and
time period strategies, and the algorithm flow for moving target location is constructed.
Simulation verification and algorithm comparison demonstrate the advantages of the
proposed algorithm [581].
Li et al. have developed a method to improve the navigation accuracy of inertial
navigation systems in drones by identifying errors in horizontal gyroscopes and accelerom-
eters using the improved pigeon-inspired optimization (PIO) method. This approach has
the potential to reduce the need for sending the inertial navigation system back to the
manufacturer for calibration, saving time and resources [582].
Table 10 summarizes the studied publications in optimal navigation using nature-inspired algorithms and the algorithms which have been used in them. It can be seen that a wide range of algorithms have been applied to a wide range of applications. However, unlike other categories, the list of algorithms in this section is not limited to a small number of algorithms.
4.6. Communication
A UAV-based communication system can potentially pair heuristic/meta-heuristic
solutions with UAVs to enhance the performance of wireless communication networks,
namely in the spectral efficiency and coverage of these networks. A UAV-based com-
munication system can be readily applied in an emergency or offloading scenario. For
example, researchers have proposed methods to predict, assess, and preserve a response
in emergency situations [583]. The combination of heuristics and UAV platforms can po-
tentially mitigate the inherent weaknesses of a pure UAV-based communication system,
such as channel modeling, resource management, positioning, and security [584]. UAVs are
currently utilized for data delivery and collection from dangerous or inaccessible locations.
However, trajectory planning remains a major issue for UAVs. Khoufi et al. have conducted
research on determining optimized routes for data pickup and delivery by drones within a
time window and intermittent connectivity network, while allowing for battery recharge en
route to destinations. The problem is formulated as a multi-objective optimization problem
and solved using Non-dominated Sorting GA II (NSGA-II). Various experiments validated
the proposed algorithm in different scenarios [585]. Optimizing device-to-device communica-
tion, the deployment process, and limited power supply for the devices and hardware they
carry are practical issues to be addressed in applying drones in disaster response scenarios.
In this field, the bio-inspired self-organizing network (BISON) achieved promising results
using Voronoi tessellations. However, in this approach, the wireless sensor network nodes
were using knowledge about their coverage areas' center of gravity, which a drone would
not automatically know. To address this, Eledlebi et al. have augmented BISON with a
GA to further improve network deployment time and overall coverage. Their evaluations
show an increase in energy cost [586].
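The Voronoi-based deployment idea behind BISON can be approximated with a Lloyd-style iteration: each node repeatedly moves to the centroid of the part of the area closest to it, which spreads coverage evenly. A Monte Carlo sketch (sampled area instead of exact Voronoi cells; sample count and round count are illustrative):

```python
import math
import random

def deploy_nodes(area, k, rounds=30, n_samples=1000, seed=4):
    """Lloyd-style deployment in the spirit of the Voronoi-based approach
    described above: each node repeatedly moves to the centroid of the sampled
    points closest to it, spreading coverage across a w x h area."""
    rng = random.Random(seed)
    w, h = area
    samples = [(rng.uniform(0, w), rng.uniform(0, h)) for _ in range(n_samples)]
    nodes = [(rng.uniform(0, w), rng.uniform(0, h)) for _ in range(k)]
    for _ in range(rounds):
        cells = [[] for _ in range(k)]
        for s in samples:  # assign each sample to its nearest node
            i = min(range(k), key=lambda j: math.dist(s, nodes[j]))
            cells[i].append(s)
        nodes = [(sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
                 if c else nodes[i]
                 for i, c in enumerate(cells)]
    return nodes
```

The GA augmentation described by Eledlebi et al. would sit on top of such an update rule, tuning its parameters for deployment time and coverage.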
The development of the edge computing paradigm, IoT-based devices, and 5G tech-
nology has led to increased data traffic that requires efficient processing. UAVs can replace
edge servers used in mobile edge computing (MEC). Subburaj et al. have proposed a self-
adaptive trajectory optimization algorithm (STO) for a UAV-assisted MEC system using
DE. The STO is a multi-objective optimization algorithm that aims to minimize the energy
consumed by MEC and the process emergency indicator. The proposed self-adaptive
In a review paper, nature-inspired algorithms were examined in their ability to route multiple UAVs in flying ad hoc networks (FANETs). Based on this research, several optimization techniques such as KH, GWO, BAT, red deer optimization, PSO, FSA, WOA, ACO, BCO, GSO, MFO, FFA, and BFA are used for this aim. In addition to basic algorithms, hybrid forms of NIAs in combination with each other or with other methods such as fuzzy logic
are also studied in this specific application [598]. A recent study by Otto et al. reveals
the applications of heuristic and meta-heuristic algorithms in FANET operations [503]. A
summary of the publications studied in this section and their applications is provided
in Table 11.
Optimization methods in this domain can generally be classified into three categories: exact methods, heuristic methods, and meta-heuristic
methods. Meta-heuristic methods are problem-agnostic and can treat objective functions as black
boxes, which makes them suitable for this purpose. Algorithms such as PSO have been
applied to minimize transmission power of UAVs serving as relays in IoT communications
while considering the outage probability of IoT devices. Other algorithms such as GA
were used to design an energy-efficient trajectory for UAV-BSs during backhaul connection
to terrestrial BSs in post-disaster scenarios. A UAV-BS path planning framework can also
be empowered with a GA to determine the optimal path with minimal turns and energy
consumption [599].
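The black-box character of meta-heuristics mentioned above is visible in a minimal PSO loop: the optimizer only evaluates the cost function and never needs its gradient or internal structure. The sketch below uses a toy quadratic stand-in, not the cited transmission-power objective:

```python
import random

# Minimal particle swarm optimization on a black-box cost function.
# The optimizer only ever *evaluates* cost(x), never differentiates it,
# which is why meta-heuristics suit problems like UAV relay power tuning.

def pso_minimize(cost, dim, bounds, n_particles=20, iters=150,
                 w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(p) for p in x]      # each particle's personal best
    gbest = min(pbest, key=cost)      # swarm's global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                v[i][d] = (w * v[i][d]
                           + c1 * rng.random() * (pbest[i][d] - x[i][d])
                           + c2 * rng.random() * (gbest[d] - x[i][d]))
                x[i][d] = min(max(x[i][d] + v[i][d], lo), hi)
            if cost(x[i]) < cost(pbest[i]):
                pbest[i] = list(x[i])
                if cost(x[i]) < cost(gbest):
                    gbest = list(x[i])
    return gbest

cost = lambda p: sum((v - 1.0) ** 2 for v in p)  # toy stand-in objective
best = pso_minimize(cost, dim=3, bounds=(-5.0, 5.0))
print(cost(best))
```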
5. Summary
Considering the selected list of the most popular nature-inspired algorithms, the
current state of nature-inspired algorithms used in aerospace applications and drones can
be summarized across the following fields: multidisciplinary design, engine design,
structure design, airfoil design, system identification, navigation, drone communication,
and body design.
The most popular field for nature-inspired algorithms is control systems. Nature-
inspired algorithms have been widely used in control systems, path planning, trajectory
design, trajectory tracking, and swarm and formation control. The most popular algorithms
in aerospace/drone systems are GA, PSO, ABC, and DE, which are used in many different
fields of aerospace/drone systems, from aerodynamics to control systems. Forest optimization
and cat swarm are not as popular in aerospace/drone systems as they are in other fields.
According to Section 3, forest optimization obtained middle-to-low results, whereas cat
swarm is usually above the mean and near the top of the rankings; it achieved the best
performance and accuracy on the expanded Schaffer benchmark function. Therefore, there
still seems to be room for research and development on this algorithm in aerospace applications.
Outside of control systems, the remaining applications are dedicated to only a few algorithms. It is
highly recommended to apply algorithms with high performance, such as sine cosine, Harris
hawks, the firefly algorithm, the fireworks algorithm, grey wolf optimization, and cat swarm, to a
wider range of applications, since they are expected to provide superior results.
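For reference, the expanded Schaffer benchmark mentioned above is commonly defined by chaining the classic Schaffer F6 term over consecutive coordinate pairs; the CEC-style definition sketched here may differ in detail from the exact variant used in Section 3:

```python
import math

# Expanded Schaffer F6 benchmark: the classic Schaffer F6 term applied to
# consecutive coordinate pairs, wrapping from the last back to the first.
# Global minimum 0 at the origin; highly multimodal elsewhere.

def schaffer_f6(x, y):
    s = x * x + y * y
    return 0.5 + (math.sin(math.sqrt(s)) ** 2 - 0.5) / (1.0 + 0.001 * s) ** 2

def expanded_schaffer_f6(v):
    n = len(v)
    return sum(schaffer_f6(v[i], v[(i + 1) % n]) for i in range(n))

print(expanded_schaffer_f6([0.0, 0.0, 0.0]))  # -> 0.0 at the global optimum
```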
6. Conclusions
This paper reviewed the majority of nature-inspired algorithms (about 350 algo-
rithms) based on their source of inspiration. A comprehensive classification of the nature-
inspired algorithms was provided based on the sources of inspiration, including bio-based,
ecosystem-based, social-based, physics-based, mathematics-based, chemistry-based, music-
based, sport-based, and hybrid algorithms. In each category, a group of the most popular
algorithms has been reviewed in detail, while others have been studied briefly, introducing
their source of inspiration. In order to evaluate these algorithms, in the final section of the
paper, a comparison is provided by solving 10 different benchmark functions or optimization
problems with a selected set of algorithms. The simulation results report several metrics,
such as cost value, number of iterations, average time, and error, for each pair of algorithm
and problem. Based on these results, in addition to the overall high performance of
nature-inspired algorithms, the advantages of some algorithms in terms of accuracy and
speed are shown.
This study revealed the massive body of research on nature-inspired algorithms that
mimic different aspects of nature. From oceans to space, algorithms can be found that
focus on a specific phenomenon or creature. Setting aside the open question of the novelty
of these algorithms and their similarity to the most famous ones, the number of publications
in this area is considerable. Most publications in this field do not provide sufficient
information on the performance of the algorithms, their exact differences, or their
contributions. The majority of papers provide only a simple comparison with one or a few
well-known algorithms such as GA (which is shown to have one of the lowest performances
among nature-inspired algorithms), often by solving a single problem. This cannot, of course,
provide a good understanding of the real value of these algorithms. Rigorous performance
analysis of nature-inspired algorithms is what is needed now. The main focus of research in
this field should be on the applications of these algorithms and on their performance
evaluation and improvement; there appear to be more than enough algorithms available with
different perspectives. The current literature shows a focus on developing hybrid algorithms
to increase performance, which can broaden their field of applications. The performance of
hybrid algorithms is reported to be significantly better than that of the basic algorithms,
which makes it reasonable to expect more research on this topic in the future.
This review illustrated the latest developments in the field of nature-inspired
optimization, the popularity of these algorithms, and related challenges such as
constraint handling and performance across different problems. A compact review of the
applications of nature-inspired algorithms in aerospace systems has been provided. Based
on this review, the most used algorithms in aerospace systems are GA, PSO, and ABC, and
the field where nature-inspired algorithms have been used most widely is control systems.
Based on the evaluations in this paper, it is recommended to apply high-performance
algorithms such as the sine cosine algorithm, Harris hawks optimization, the firefly
algorithm, the fireworks algorithm, grey wolf optimization, and cat swarm optimization in
a wider range of applications, from conceptual design, MDO, aerodynamics, and shape
design to navigation and identification. Considering the progress of hybrid algorithms
and their considerably improved performance, their application in aerospace systems is
also recommended. All results of the paper, including data and code in Python and MATLAB,
are published in a public GitHub repository.
Author Contributions: Conceptualization and methodology, validation S.D.; software, A.D.; writing—
original draft preparation, S.D., M.E. and A.D.; writing—review and editing, S.D.; supervision, M.H.
All authors have read and agreed to the published version of the manuscript.
Funding: This research received no external funding.
Data Availability Statement: All of the information, data, and codes of this paper are publicly avail-
able in its GitHub repository: https://github.com/shahind/Nature-Inspired-Algorithms (accessed
on 20 June 2023).
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Lange, K. Optimization, 2nd ed.; Springer Science & Business Media: New York, NY, USA, 2013.
2. Fister, I.; Yang, X.S.; Brest, J.; Fister, D. A brief review of nature-inspired algorithms for optimization. Elektroteh. Vestnik/Electrotech.
Rev. 2013, 80, 116–122.
3. Yang, X.-S. (Ed.) Nature-Inspired Algorithms and Applied Optimization; Springer: Berlin/Heidelberg, Germany, 2017.
4. Molina, D.; Poyatos, J.; Del Ser, J.; García, S.; Hussain, A.; Herrera, F. Comprehensive Taxonomies of Nature- and Bio-inspired
Optimization: Inspiration Versus Algorithmic Behavior, Critical Analysis Recommendations. Cognit. Comput. 2020, 12, 897–939.
[CrossRef]
5. Sörensen, K. Metaheuristics-the metaphor exposed. Int. Trans. Oper. Res. 2015, 22, 3–18. [CrossRef]
6. Tzanetos, A.; Fister, I.; Dounias, G. A comprehensive database of Nature-Inspired Algorithms. Data Brief 2020, 31, 105792.
[CrossRef]
7. Yang, X.-S. Nature-inspired optimization algorithms: Challenges and open problems. J. Comput. Sci. 2020, 46, 101104. [CrossRef]
8. Muller, S.D. Bio-Inspired Optimization Algorithms for Engineering Applications; Swiss Federal Institute of Technology Zurich: Zürich,
Switzerland, 2002.
9. Osman, I.H. Focused issue on applied meta-heuristics. Comput. Ind. Eng. 2003, 44, 205–207. [CrossRef]
10. Gendreau, M.; Potvin, J.Y. Metaheuristics in combinatorial optimization. Ann. Oper. Res. 2005, 140, 189–213. [CrossRef]
11. Abdel-Basset, M.; Abdel-Fatah, L.; Sangaiah, A.K. Metaheuristic Algorithms: A Comprehensive Review. In Computational
Intelligence for Multimedia Big Data on the Cloud with Engineering Applications; Elsevier: Amsterdam, The Netherlands, 2018;
pp. 185–231. [CrossRef]
12. Espinosa, H. Nature-Inspired Computing for Control Systems; Springer: Berlin/Heidelberg, Germany, 2016.
13. Holland, J.H. Adaptation in Natural and Artificial Systems; MIT Press: Cambridge, MA, USA, 2019.
14. Kumar, M.; Husain, M.; Upreti, N.; Gupta, D. Genetic Algorithm: Review and Application. Int. J. Inf. Technol. Knowl. Manag. 2010,
2, 451–454. [CrossRef]
15. Dastanpour, A.; Mahmood, R.A.R. Feature selection based on genetic algorithm and support vector machine for intrusion
detection system. In Proceedings of the Second International Conference on Informatics Engineering & Information Science,
Kuala Lumpur, Malaysia, 12–14 November 2013; pp. 169–181.
16. Umbarkar, A.J.; Sheth, P.D. Crossover Operators in Genetic Algorithms: A Review. ICTACT J. Soft Comput. 2015, 06, 1083–1092.
[CrossRef]
17. Deb, K.; Deb, A. Analysing mutation schemes for real-parameter genetic algorithms. Int. J. Artif. Intell. Soft Comput. 2014, 4, 1.
[CrossRef]
18. Katoch, S.; Chauhan, S.S.; Kumar, V. A review on genetic algorithm: Past, present, and future. Multimed. Tools Appl. 2021, 80,
8091–8126. [CrossRef] [PubMed]
19. Mukhopadhyay, D.; Balitanas, M. Genetic algorithm: A tutorial review. Int. J. Grid Distrib. Comput. 2009, 2, 25–32. Available
online: http://www.sersc.org/journals/IJGDC/vol2_no3/3.pdf (accessed on 20 June 2023).
20. Storn, R. On the usage of differential evolution for function optimization. In Proceedings of the North American Fuzzy Information
Processing, Berkeley, CA, USA, 19–22 June 1996; pp. 519–523. [CrossRef]
21. Georgioudakis, M.; Plevris, V. A Comparative Study of Differential Evolution Variants in Constrained Structural Optimization.
Front. Built Environ. 2020, 6, 102. [CrossRef]
22. Storn, R.; Price, K. Differential Evolution—A Simple and Efficient Heuristic for global Optimization over Continuous Spaces. J.
Glob. Optim. 1997, 11, 341–359. [CrossRef]
23. Ayaz, M.; Panwar, A.; Pant, M. A Brief Review on Multi-objective Differential Evolution. In Soft Computing: Theories and
Applications; Springer: Berlin/Heidelberg, Germany, 2020; pp. 1027–1040. [CrossRef]
24. Fogel, L.J.; Owens, A.J.; Walsh, M.J. Artificial Intelligence Through Simulated Evolution; Wiley-IEEE Press: Piscataway, NJ, USA, 1966.
25. Asthana, R.G.S. Evolutionary Algorithms and Neural Networks. Soft Comput. Intell. Syst. 2000, 111–136. [CrossRef]
26. Fogel, D. Evolutionary programming: An introduction and some current directions. Stat. Comput. 1994, 4, 113–129. [CrossRef]
27. Jacob, C. Evolutionary Programming. In Illustrating Evolutionary Computation with Mathematica; Elsevier: Amsterdam,
The Netherlands, 2001; pp. 297–344. [CrossRef]
28. Dagdia, Z.C.; Mirchev, M. When Evolutionary Computing Meets Astro- and Geoinformatics. In Knowledge Discovery in Big Data
from Astronomy and Earth Observation; Elsevier: Amsterdam, The Netherlands, 2020; pp. 283–306. [CrossRef]
29. Hoorfar, A. Evolutionary Programming in Electromagnetic Optimization: A Review. IEEE Trans. Antennas Propag. 2007, 55,
523–537. [CrossRef]
30. Bäck, T.; Rudolph, G.; Schwefel, H.-P. Evolutionary Programming and Evolution Strategies: Similarities and Differences. In
Proceedings of the Second Annual Conference on Evolutionary Programming, La Jolla, CA, USA, 25–26 February 1993; pp. 11–22.
31. Rechenberg, I. Evolution Strategy: Optimization of Technical Systems by Means of Biological Evolution; Frommann-Holzboog:
Stuttgart, Germany, 1973.
32. Ferreira, C. Gene Expression Programming: A New Adaptive Algorithm for Solving Problems. arXiv 2001, arXiv:cs/0102027v3.
33. Moscato, P. On Evolution, Search, Optimization, Genetic Algorithms and Martial Arts: Towards Memetic Algorithms; Caltech concurrent
computation program, C3P Report; California Institute of Technology: Pasadena, CA, USA, 1989.
34. Ryan, C.; Collins, J.; Neill, M.O. Grammatical evolution: Evolving programs for an arbitrary language. In Proceedings of the
Genetic Programming: First European Workshop, EuroGP’98, Paris, France, 14–15 April 1998; pp. 83–96. [CrossRef]
35. Farmer, J.D.; Packard, N.H.; Perelson, A.S. The immune system, adaptation, and machine learning. Phys. D Nonlinear Phenom.
1986, 22, 187–204. [CrossRef]
36. Dasgupta, D. Artificial Immune Systems and Their Applications; Springer Science & Business Media: Berlin/Heidelberg, Germany,
1999. [CrossRef]
37. Beluch, W.; Burczyński, T.; Kuś, W. Parallel Artificial Immune System in Optimization and Identification of Composite Structures.
In Parallel Problem Solving from Nature, PPSN XI; Springer: Berlin/Heidelberg, Germany, 2010; pp. 171–180. [CrossRef]
38. De Castro, L.N.; von Zuben, F.J. The Clonal Selection Algorithm with Engineering Applications. In Proceedings of the GECCO,
Cancún, Mexico, 8–12 July 2020; pp. 36–37.
39. de Castro, L.N.; Timmis, J. An artificial immune network for multimodal function optimization. In Proceedings of the 2002
Congress on Evolutionary Computation. CEC’02 (Cat. No.02TH8600), Honolulu, HI, USA, 12–17 May 2002; pp. 699–704.
[CrossRef]
40. Jaddi, N.S.; Alvankarian, J.; Abdullah, S. Kidney-inspired algorithm for optimization problems. Commun. Nonlinear Sci. Numer.
Simul. 2017, 42, 358–369. [CrossRef]
41. Hatamlou, A. Heart: A novel optimization algorithm for cluster analysis. Prog. Artif. Intell. 2014, 2, 167–173. [CrossRef]
42. Kaveh, A.; Kooshkebaghi, M. Artificial Coronary Circulation System; A new bio-inspired metaheuristic algorithm. Sci. Iran. 2019,
26, 2731–2747. [CrossRef]
43. Asil Gharebaghi, S.; Ardalan Asl, M. New Meta-Heuristic Optimization Algorithm Using Neuronal Communication. Int. J. Optim.
Civ. Eng. 2017, 7, 413–431.
44. Raouf, O.A.; Hezam, I.M. Sperm motility algorithm: A novel metaheuristic approach for global optimisation. Int. J. Oper. Res.
2017, 28, 143. [CrossRef]
45. Enciso, V.O.; Cuevas, E.; Oliva, D.; Sossa, H.; Cisneros, M.P. A bio-inspired evolutionary algorithm: Allostatic optimisation. Int. J.
Bio-Inspired Comput. 2016, 8, 154. [CrossRef]
46. Simon, D. Biogeography-Based Optimization. IEEE Trans. Evol. Comput. 2008, 12, 702–713. [CrossRef]
47. Cheng, M.-Y.; Prayogo, D. Symbiotic Organisms Search: A new metaheuristic optimization algorithm. Comput. Struct. 2014, 139,
98–112. [CrossRef]
48. He, S.; Wu, Q.H.; Saunders, J.R. Group Search Optimizer: An Optimization Algorithm Inspired by Animal Searching Behavior.
IEEE Trans. Evol. Comput. 2009, 13, 973–990. [CrossRef]
49. Civicioglu, P. Transforming geocentric cartesian coordinates to geodetic coordinates by using differential search algorithm.
Comput. Geosci. 2012, 46, 229–247. [CrossRef]
50. Li, X.; Zhang, J.; Yin, M. Animal migration optimization: An optimization algorithm inspired by animal migration behavior.
Neural Comput. Appl. 2014, 24, 1867–1877. [CrossRef]
51. Zhang, Q.; Wang, R.; Yang, J.; Lewis, A.; Chiclana, F.; Yang, S. Biology migration algorithm: A new nature-inspired heuristic
methodology for global optimization. Soft Comput. 2019, 23, 7333–7358. [CrossRef]
52. Oftadeh, R.; Mahjoob, M.J.; Shariatpanahi, M. A novel meta-heuristic optimization algorithm inspired by group hunting of
animals: Hunting search. Comput. Math. Appl. 2010, 60, 2087–2098. [CrossRef]
53. Fausto, F.; Cuevas, E.; Valdivia, A.; González, A. A global optimization algorithm inspired in the behavior of selfish herds.
Biosystems 2017, 160, 39–55. [CrossRef] [PubMed]
54. Tilahun, S.L.; Ong, H.C. Prey-Predator Algorithm: A New Metaheuristic Algorithm for Optimization Problems. Int. J. Inf. Technol.
Decis. Mak. 2015, 14, 1331–1352. [CrossRef]
55. Dai, C.; Zhu, Y.; Chen, W. Seeker Optimization Algorithm. In Proceedings of the Computational Intelligence and Security:
International Conference, CIS 2006, Guangzhou, China, 3–6 November 2006; pp. 167–176. [CrossRef]
56. Cuevas, E.; González, M.; Zaldivar, D.; Pérez-Cisneros, M.; García, G. An Algorithm for Global Optimization Inspired by
Collective Animal Behavior. Discret. Dyn. Nat. Soc. 2012, 2012, 638275. [CrossRef]
57. Farasat, A.; Menhaj, M.B.; Mansouri, T.; Moghadam, M.R.S. ARO: A new model-free optimization algorithm inspired from
asexual reproduction. Appl. Soft Comput. 2010, 10, 1284–1292. [CrossRef]
58. Kaveh, A.; Zolghadr, A. Cyclical Parthenogenesis Algorithm: A new meta-heuristic algorithm. Asian J. Civ. Eng. 2017, 18, 673–701.
59. Chen, H.; Zhu, Y.; Hu, K.; He, X. Hierarchical Swarm Model: A New Approach to Optimization. Discret. Dyn. Nat. Soc. 2010,
2010, 379649. [CrossRef]
60. Parpinelli, R.S.; Lopes, H.S. An eco-inspired evolutionary algorithm applied to numerical optimization. In Proceedings of the
2011 Third World Congress on Nature and Biologically Inspired Computing, Salamanca, Spain, 19–21 October 2011; pp. 466–471.
[CrossRef]
61. Mohseni, S.; Gholami, R.; Zarei, N.; Zadeh, A.R. Competition over Resources: A New Optimization Algorithm Based on Animals
Behavioral Ecology. In Proceedings of the 2014 International Conference on Intelligent Networking and Collaborative Systems,
Salerno, Italy, 10–12 September 2014; pp. 311–315. [CrossRef]
62. Nguyen, H.T.; Bhanu, B. Zombie Survival Optimization: A swarm intelligence algorithm inspired by zombie foraging. In
Proceedings of the 21st International Conference on Pattern Recognition (ICPR2012), Tsukuba, Japan, 11–15 November 2012;
pp. 987–990.
63. Pattnaik, S.S.; Bakwad, K.M.; Sohi, B.S.; Ratho, R.K.; Devi, S. Swine Influenza Models Based Optimization (SIMBO). Appl. Soft
Comput. 2013, 13, 628–653. [CrossRef]
64. Huang, G. Artificial infectious disease optimization: A SEIQR epidemic dynamic model-based function optimization algorithm.
Swarm Evol. Comput. 2016, 27, 31–67. [CrossRef] [PubMed]
65. Tang, D.; Dong, S.; Jiang, Y.; Li, H.; Huang, Y. ITGO: Invasive tumor growth optimization algorithm. Appl. Soft Comput. 2015, 36,
670–698. [CrossRef]
66. Salmani, M.H.; Eshghi, K. A Metaheuristic Algorithm Based on Chemotherapy Science: CSA. J. Optim. 2017, 2017, 3082024.
[CrossRef]
67. Muller, S.D.; Marchetto, J.; Airaghi, S.; Kournoutsakos, P. Optimization based on bacterial chemotaxis. IEEE Trans. Evol. Comput.
2002, 6, 16–29. [CrossRef]
68. Passino, K.M. Bacterial Foraging Optimization. In Innovations and Developments of Swarm Intelligence Applications; IGI Global:
Hershey, PA, USA, 2012; pp. 219–234. [CrossRef]
69. Niu, B.; Wang, H. Bacterial colony optimization. Discret. Dyn. Nat. Soc. 2012, 2012, 698057. [CrossRef]
70. Tang, W.J.; Wu, Q.H.; Saunders, J.R. A bacterial swarming algorithm for global optimization. In Proceedings of the 2007 IEEE
Congress on Evolutionary Computation, Singapore, 25–28 September 2007; pp. 1207–1212. [CrossRef]
71. Nawa, N.E.; Furuhashi, T. Bacterial evolutionary algorithm for fuzzy system design. In Proceedings of the SMC’98 Conference
Proceedings 1998 IEEE International Conference on Systems, Man, and Cybernetics (Cat. No.98CH36218), San Diego, CA, USA,
14 October 1998; pp. 2424–2429. [CrossRef]
72. Mo, H.; Xu, L. Magnetotactic bacteria optimization algorithm for multimodal optimization. In Proceedings of the 2013 IEEE
Symposium on Swarm Intelligence (SIS), Singapore, 16–19 April 2013; pp. 240–247. [CrossRef]
73. Anandaraman, C.; Sankar, A.V.M.; Natarajan, R. A New Evolutionary Algorithm Based on Bacterial Evolution and Its Application
for Scheduling a Flexible Manufacturing System. J. Tek. Ind. 2012, 14, 1–12.
74. Li, M.D.; Zhao, H.; Weng, X.W.; Han, T. A novel nature-inspired algorithm for optimization: Virus colony search. Adv. Eng. Softw.
2016, 92, 65–88. [CrossRef]
75. Cortés, P.; García, J.M.; Muñuzuri, J.; Onieva, L. Viral systems: A new bio-inspired optimisation approach. Comput. Oper. Res.
2008, 35, 2840–2860. [CrossRef]
76. Jaderyan, M.; Khotanlou, H. Virulence Optimization Algorithm. Appl. Soft Comput. 2016, 43, 596–618. [CrossRef]
77. Kelsey, J.; Timmis, J. Immune Inspired Somatic Contiguous Hypermutation for Function Optimisation. In Genetic and Evolution-
ary Computation Conference—GECCO 2003: Genetic and Evolutionary Computation—GECCO 2003; Springer: Berlin/Heidelberg,
Germany, 2003; pp. 207–218. [CrossRef]
78. Taherdangkoo, M.; Yazdi, M.; Bagheri, M.H. Stem Cells Optimization Algorithm. In International Conference on Intelligent
Computing—ICIC 2011: Bio-Inspired Computing and Applications; Springer: Berlin/Heidelberg, Germany, 2012; pp. 394–403.
[CrossRef]
79. Zhang, X.; Huang, S.; Hu, Y.; Zhang, Y.; Mahadevan, S.; Deng, Y. Solving 0-1 knapsack problems based on amoeboid organism
algorithm. Appl. Math. Comput. 2013, 219, 9959–9970. [CrossRef]
80. Krishnaveni, M.; Subashini, P.; Dhivyaprabha, T.T. A new optimization approach—SFO for denoising digital images. In
Proceedings of the 2016 International Conference on Computation System and Information Technology for Sustainable Solutions
(CSITSS), Bengaluru, India, 6–8 October 2016; pp. 34–39. [CrossRef]
81. Dorigo, M.; Birattari, M.; Stutzle, T. Ant colony optimization. IEEE Comput. Intell. Mag. 2006, 1, 28–39. [CrossRef]
82. Dorigo, M.; Stützle, T. Ant Colony Optimization: Overview and Recent Advances. In Handbook of Metaheuristics; Springer:
Berlin/Heidelberg, Germany, 2019; pp. 311–351. [CrossRef]
83. Karaboga, D. An Idea based on Honey Bee Swarm for Numerical Optimization; Technical report-tr06; Computer Engineering Department,
Engineering Faculty, Erciyes University: Kayseri, Turkey, 2005; Volume 200, pp. 1–10. Available online: http://mf.erciyes.edu.tr/
abc/pub/tr06_2005.pdf (accessed on 20 June 2023).
84. Karaboga, D.; Basturk, B. A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC)
algorithm. J. Glob. Optim. 2007, 39, 459–471. [CrossRef]
85. Karaboga, D.; Basturk, B. On the performance of artificial bee colony (ABC) algorithm. Appl. Soft Comput. J. 2008, 8, 687–697.
[CrossRef]
86. Karaboga, D.; Akay, B. A comparative study of Artificial Bee Colony algorithm. Appl. Math. Comput. 2009, 214, 108–132.
87. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl-Based Syst. 2015, 89, 228–249.
[CrossRef]
88. Teodorović, D. Bee Colony Optimization (BCO). In Innovations in Swarm Intelligence; Springer: Berlin/Heidelberg, Germany, 2009;
pp. 39–60. [CrossRef]
89. Haddad, O.B.; Afshar, A.; Mariño, M.A. Honey-Bees Mating Optimization (HBMO) Algorithm: A New Heuristic Approach for
Water Resources Optimization. Water Resour. Manag. 2006, 20, 661–680. [CrossRef]
90. Abbass, H.A. MBO: Marriage in honey bees optimization a haplometrosis polygynous swarming approach. In Proceedings of the
2001 Congress on Evolutionary Computation (IEEE Cat. No.01TH8546), Seoul, Republic of Korea, 27–30 May 2001; pp. 207–214.
[CrossRef]
91. Jung, S.H. Queen-bee evolution for genetic algorithms. Electron. Lett. 2003, 39, 575. [CrossRef]
92. Akbari, R.; Mohammadi, A.; Ziarati, K. A novel bee swarm optimization algorithm for numerical function optimization. Commun.
Nonlinear Sci. Numer. Simul. 2010, 15, 3142–3155. [CrossRef]
93. Lu, X.; Zhou, Y. A Novel Global Convergence Algorithm: Bee Collecting Pollen Algorithm. In Advanced Intelligent Computing
Theories and Applications. With Aspects of Artificial Intelligence: Proceedings of the 4th International Conference on Intelligent Computing,
ICIC 2008, Shanghai, China, 15–18 September 2008; Springer: Berlin/Heidelberg, Germany, 2008; pp. 518–525. [CrossRef]
94. Maia, R.D.; de Castro, L.N.; Caminhas, W.M. Bee colonies as model for multimodal continuous optimization: The OptBees
algorithm. In Proceedings of the 2012 IEEE Congress on Evolutionary Computation, Brisbane, Australia, 10–15 June 2012; pp. 1–8.
[CrossRef]
95. Abdullah, J.M.; Ahmed, T. Fitness Dependent Optimizer: Inspired by the Bee Swarming Reproductive Process. IEEE Access 2019,
7, 43473–43486. [CrossRef]
96. Comellas, F.; Martinez-Navarro, J. Bumblebees. In Proceedings of the first ACM/SIGEVO Summit on Genetic and Evolutionary
Computation—GEC ’09, Shanghai, China, 12–14 June 2009; ACM Press: New York, NY, USA, 2009; p. 811. [CrossRef]
97. Marinakis, Y.; Marinaki, M.; Matsatsinis, N. A Bumble Bees Mating Optimization Algorithm for Global Unconstrained Optimiza-
tion Problems. NICSO 2010, 284, 305–318. [CrossRef]
98. Mirjalili, S. The Ant Lion Optimizer. Adv. Eng. Softw. 2015, 83, 80–98. [CrossRef]
99. Mirjalili, S. Dragonfly algorithm: A new meta-heuristic optimization technique for solving single-objective, discrete, and
multi-objective problems. Neural Comput. Appl. 2016, 27, 1053–1073. [CrossRef]
100. Pan, W.-T. A new Fruit Fly Optimization Algorithm: Taking the financial distress model as an example. Knowl-Based Syst. 2012,
26, 69–74. [CrossRef]
101. Al-Rifaie, M.M. Dispersive Flies Optimisation. In Proceedings of the 2014 Federated Conference on Computer Science and
Information Systems, Warsaw, Poland, 7–10 September 2014; pp. 529–538. [CrossRef]
102. Saremi, S.; Mirjalili, S.; Lewis, A. Grasshopper Optimisation Algorithm: Theory and application. Adv. Eng. Softw. 2017, 105, 30–47.
[CrossRef]
103. Chen, S. Locust Swarms—A new multi-optima search technique. In Proceedings of the 2009 IEEE Congress on Evolutionary
Computation, Trondheim, Norway, 18–21 May 2009; pp. 1745–1752. [CrossRef]
104. Canayaz, M.; Karci, A. Cricket behaviour-based evolutionary computation technique in solving engineering optimization
problems. Appl. Intell. 2016, 44, 362–376. [CrossRef]
105. Yang, X.S.; He, X. Firefly algorithm: Recent advances and applications. Int. J. Swarm Intell. 2013, 1, 36. [CrossRef]
106. Krishnanand, K.N.; Ghose, D. Glowworm swarm optimisation: A new method for optimising multi-modal functions. Int. J.
Comput. Intell. Stud. 2009, 1, 93. [CrossRef]
107. Bidar, M.; Rashidy Kanan, H. Jumper firefly algorithm. In Proceedings of the ICCKE 2013, Mashhad, Iran, 31 October–1 November
2013; pp. 267–271. [CrossRef]
108. Cuevas, E.; Cienfuegos, M.; Zaldívar, D.; Pérez-Cisneros, M. A swarm optimization algorithm inspired in the behavior of the
social-spider. Expert Syst. Appl. 2013, 40, 6374–6384. [CrossRef]
109. Hayyolalam, V.; Pourhaji Kazem, A.A. Black Widow Optimization Algorithm: A novel meta-heuristic approach for solving
engineering optimization problems. Eng. Appl. Artif. Intell. 2020, 87, 103249. [CrossRef]
110. Kaveh, A.; Dadras Eslamlou, A. Water strider algorithm: A new metaheuristic and applications. Structures 2020, 25, 520–541.
[CrossRef]
111. Wang, G.-G.; Deb, S.; Cui, Z. Monarch butterfly optimization. Neural Comput. Appl. 2019, 31, 1995–2014. [CrossRef]
112. Arora, S.; Singh, S. Butterfly algorithm with Lévy Flights for global optimization. In Proceedings of the 2015 International
Conference on Signal Processing, Computing and Control (ISPCC), Waknaghat, India, 24–26 September 2015; pp. 220–224.
[CrossRef]
113. Qi, X.; Zhu, Y.; Zhang, H. A new meta-heuristic butterfly-inspired algorithm. J. Comput. Sci. 2017, 23, 226–239. [CrossRef]
114. Havens, T.C.; Spain, C.J.; Salmon, N.G.; Keller, J.M. Roach Infestation Optimization. In Proceedings of the 2008 IEEE Swarm
Intelligence Symposium, St. Louis, MO, USA, 21–23 September 2008; pp. 1–7. [CrossRef]
115. Chen, Z.; Tang, H. Notice of Retraction: Cockroach Swarm Optimization. In Proceedings of the 2010 2nd International Conference
on Computer Engineering and Technology, Chengdu, China, 16–18 April 2010; pp. V6-652–V6-655. [CrossRef]
116. Bouarara, H.A.; Hamou, R.M.; Amine, A. Novel Bio-Inspired Technique of Artificial Social Cockroaches (ASC). Int. J. Organ.
Collect. Intell. 2015, 5, 47–79. [CrossRef]
117. Cheng, L.; Han, L.; Zeng, X.; Bian, Y.; Yan, H. Adaptive Cockroach Colony Optimization for Rod-Like Robot Navigation. J. Bionic
Eng. 2015, 12, 324–337. [CrossRef]
118. Wu, S.-J.; Wu, C.-T. A bio-inspired optimization for inferring interactive networks: Cockroach swarm evolution. Expert Syst. Appl.
2015, 42, 3253–3267. [CrossRef]
119. Kallioras, N.A.; Lagaros, N.D.; Avtzis, D.N. Pity beetle algorithm—A new metaheuristic inspired by the behavior of bark beetles.
Adv. Eng. Softw. 2018, 121, 147–166. [CrossRef]
120. Wang, T.; Yang, L. Beetle Swarm Optimization Algorithm: Theory and Application. arXiv 2018. [CrossRef]
121. Jiang, X.; Li, S. BAS: Beetle Antennae Search Algorithm for Optimization Problems. arXiv 2017. [CrossRef]
122. Alauddin, M. Mosquito flying optimization (MFO). In Proceedings of the 2016 International Conference on Electrical, Electronics,
and Optimization Techniques (ICEEOT), Chennai, India, 3–5 March 2016; pp. 79–84. [CrossRef]
123. Minhas, F.U.A.A.; Arif, M. MOX: A novel global optimization algorithm inspired from Oviposition site selection and egg hatching
inhibition in mosquitoes. Appl. Soft Comput. 2011, 11, 4614–4625. [CrossRef]
124. Hedayatzadeh, R.; Akhavan Salmassi, F.; Keshtgari, M.; Akbari, R.; Ziarati, K. Termite colony optimization: A novel approach for
optimizing continuous problems. In Proceedings of the 2010 18th Iranian Conference on Electrical Engineering, Isfahan, Iran,
11–13 May 2010; pp. 553–558. [CrossRef]
125. Wang, P.; Zhu, Z.; Huang, S. Seven-Spot Ladybird Optimization: A Novel and Efficient Metaheuristic Algorithm for Numerical
Optimization. Sci. World J. 2013, 2013, 378515. [CrossRef]
126. Ahmadi, F.; Salehi, H.; Karimi, K. Eurygaster Algorithm: A New Approach to Optimization. Int. J. Comput. Appl. 2012, 57, 8887.
127. Yang, X.-S.; Deb, S. Cuckoo Search via Lévy flights. In Proceedings of the 2009 World Congress on Nature & Biologically Inspired
Computing (NaBIC), Coimbatore, India, 9–11 December 2009; pp. 210–214. [CrossRef]
128. Ladhari, T.; Khoja, I.; Msahli, F.; Sakly, A. Parameter identification of a reduced nonlinear model for an activated sludge process
based on cuckoo search algorithm. Trans. Inst. Meas. Control 2019, 41, 3352–3363. [CrossRef]
129. Sur, C.; Shukla, A. New Bio-inspired Meta-Heuristics—Green Herons Optimization Algorithm—For Optimization of Travelling
Salesman Problem and Road Network. In Swarm, Evolutionary, and Memetic Computing: Proceedings of the 4th International
Conference, SEMCCO 2013, Chennai, India, 19–21 December 2013; Springer International Publishing: Berlin/Heidelberg, Germany,
2013; pp. 168–179. [CrossRef]
130. Yang, X.; Hossein Gandomi, A. Bat algorithm: A novel approach for global engineering optimization. Eng. Comput. 2012, 29,
464–483. [CrossRef]
131. Song, S. Auditory Device Design Inspired by Nature; Brunel University: London, UK, 2014.
132. Askarzadeh, A. A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm.
Comput. Struct. 2016, 169, 1–12. [CrossRef]
133. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications.
Futur. Gener. Comput. Syst. 2019, 97, 849–872. [CrossRef]
134. Yang, X.S.; Deb, S. Eagle strategy using Lévy walk and firefly algorithms for stochastic optimization. Stud. Comput. Intell. 2010,
284, 101–111. [CrossRef]
135. De Vasconcelos Segundo, E.H.; Mariani, V.C.; Coelho, L.D.S. Design of heat exchangers using Falcon Optimization Algorithm. Appl. Therm. Eng. 2019, 156, 119–144. [CrossRef]
136. Khan, A.T.; Li, S.; Stanimirovic, P.S.; Zhang, Y. Model-free optimization using eagle perching optimizer. arXiv 2018,
arXiv:1807.02754.
137. Alsattar, H.A.; Zaidan, A.A.; Zaidan, B.B. Novel meta-heuristic bald eagle search optimisation algorithm. Artif. Intell. Rev. 2020,
53, 2237–2264. [CrossRef]
138. Gheraibia, Y.; Moussaoui, A. Penguins Search Optimization Algorithm (PeSOA). In Recent Trends in Applied Artificial Intelligence:
Proceedings of the 26th International Conference on Industrial, Engineering and Other Applications of Applied Intelligent Systems, IEA/AIE
2013, Amsterdam, The Netherlands, 17–21 June 2013; Springer: Berlin/Heidelberg, Germany, 2013; pp. 222–231. [CrossRef]
139. Harifi, S.; Khalilian, M.; Mohammadzadeh, J.; Ebrahimnejad, S. Emperor Penguins Colony: A new metaheuristic algorithm for
optimization. Evol. Intell. 2019, 12, 211–226. [CrossRef]
140. Dhiman, G.; Kumar, V. Emperor penguin optimizer: A bio-inspired algorithm for engineering problems. Knowl.-Based Syst. 2018, 159, 20–50. [CrossRef]
Drones 2023, 7, 427 118 of 134
141. Meng, X.; Liu, Y.; Gao, X.; Zhang, H. A New Bio-inspired Algorithm: Chicken Swarm Optimization. In Advances in Swarm Intelligence: Proceedings of the 5th International Conference, ICSI 2014, Hefei, China, 17–20 October 2014, Part I; Springer: Berlin/Heidelberg, Germany, 2014; pp. 86–94. [CrossRef]
142. Meng, X.-B.; Gao, X.Z.; Lu, L.; Liu, Y.; Zhang, H. A new bio-inspired optimisation algorithm: Bird Swarm Algorithm. J. Exp. Theor.
Artif. Intell. 2016, 28, 673–687. [CrossRef]
143. Duman, E.; Uysal, M.; Alkaya, A.F. Migrating Birds Optimization: A new metaheuristic approach and its performance on
quadratic assignment problem. Inf. Sci. 2012, 217, 65–77. [CrossRef]
144. Neshat, M.; Sepidnam, G.; Sargolzaei, M. Swallow swarm optimization algorithm: A new method to optimization. Neural Comput.
Appl. 2013, 23, 429–454. [CrossRef]
145. Askarzadeh, A. Bird mating optimizer: An optimization algorithm inspired by bird mating strategies. Commun. Nonlinear Sci.
Numer. Simul. 2014, 19, 1213–1228. [CrossRef]
146. Hosseini, E. Laying Chicken Algorithm: A New Meta-Heuristic Approach to Solve Continuous Programming Problems. J. Appl.
Comput. Math. 2017, 6, 1–8. [CrossRef]
147. Lamy, J.B. Artificial feeding birds (afb): A new metaheuristic inspired by the behavior of pigeons. In Advances in Nature-Inspired
Computing and Applications; Springer: Berlin/Heidelberg, Germany, 2019; pp. 43–60. [CrossRef]
148. Duan, H.; Qiao, P. Pigeon-inspired optimization: A new swarm intelligence optimizer for air robot path planning. Int. J. Intell.
Comput. Cybern. 2014, 7, 24–37. [CrossRef]
149. Dhiman, G.; Kumar, V. Seagull optimization algorithm: Theory and its applications for large-scale industrial engineering problems. Knowl.-Based Syst. 2019, 165, 169–196. [CrossRef]
150. Samareh Moosavi, S.H.; Khatibi Bardsiri, V. Satin bowerbird optimizer: A new optimization algorithm to optimize ANFIS for
software development effort estimation. Eng. Appl. Artif. Intell. 2017, 60, 1–15. [CrossRef]
151. Jain, M.; Maurya, S.; Rani, A.; Singh, V. Owl search algorithm: A novel nature-inspired heuristic paradigm for global optimization.
J. Intell. Fuzzy Syst. 2018, 34, 1573–1582. [CrossRef]
152. Sur, C.; Sharma, S.; Shukla, A. Egyptian vulture optimization algorithm—A new nature inspired meta-heuristics for knapsack
problem. Adv. Intell. Syst. Comput. 2013, 209 AISC, 227–237. [CrossRef]
153. Hajiaghaei-Keshteli, M.; Aminnayeri, M. Keshtel Algorithm (KA); A New Optimization Algorithm Inspired by Keshtels’ Feeding.
Proceeding IEEE Conf. Ind. Eng. Manag. Syst. 2012, 1, 2249–2253.
154. Dhiman, G.; Kaur, A. STOA: A bio-inspired based optimization algorithm for industrial engineering problems. Eng. Appl. Artif.
Intell. 2019, 82, 148–174. [CrossRef]
155. Brabazon, A.; Cui, W.; O’Neill, M. The raven roosting optimisation algorithm. Soft Comput. 2016, 20, 525–545. [CrossRef]
156. Almonacid, B.; Soto, R. Andean Condor Algorithm for cell formation problems. Nat. Comput. 2019, 18, 351–381. [CrossRef]
157. Omidvar, R.; Parvin, H.; Rad, F. SSPCO optimization algorithm (See-See Partridge Chicks Optimization). In Proceedings of the
2015 Fourteenth Mexican International Conference on Artificial Intelligence (MICAI), Cuernavaca, Mexico, 25–31 October 2015;
pp. 101–106. [CrossRef]
158. El-Dosuky, M.; EL-Bassiouny, A.; Hamza, T.; Rashad, M. New Hoopoe Heuristic Optimization. arXiv 2012, arXiv:1211.6410.
159. Blanco, A.L.; Chaparro, N.; Rojas-Galeano, S. An urban pigeon-inspired optimiser for unconstrained continuous domains.
In Proceedings of the 2019 8th Brazilian Conference on Intelligent Systems (BRACIS), Salvador, Brazil, 15–18 October 2019;
pp. 521–526. [CrossRef]
160. Tawfeeq, M.A. Intelligent Algorithm for Optimum Solutions Based on the Principles of Bat Sonar. arXiv 2012, arXiv:1211.0730.
161. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [CrossRef]
162. Hofman, J. Bubble-Net Feeding, Instagram. 2021. Available online: https://www.instagram.com/p/B4H160do6u (accessed on
26 June 2021).
163. Gandomi, A.H.; Alavi, A.H. Krill herd: A new bio-inspired optimization algorithm. Commun. Nonlinear Sci. Numer. Simul. 2012,
17, 4831–4845. [CrossRef]
164. Li, L.X. An optimizing method based on autonomous animals: Fish-swarm algorithm. Syst. Eng. Theory Pract. 2002, 22, 32–38.
165. Neshat, M.; Sepidnam, G.; Sargolzaei, M.; Toosi, A.N. Artificial fish swarm algorithm: A survey of the state-of-the-art, hybridization, combinatorial and indicative applications. Artif. Intell. Rev. 2014, 42, 965–997. [CrossRef]
166. Li, G.; Yang, Y.; Zhao, T.; Peng, P.; Zhou, Y.; Hu, Y.; Guo, C. An improved artificial fish swarm algorithm and its application to
packing and layout problems. In Proceedings of the 2017 36th Chinese Control Conference (CCC), Dalian, China, 26–28 July 2017;
pp. 9824–9828. [CrossRef]
167. Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp Swarm Algorithm: A bio-inspired optimizer
for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191. [CrossRef]
168. Xiao-lei, L.; Fei, L.; Tian, G.H.; Qian, J.X. Applications of artificial fish school algorithm in combinatorial optimization problems. J.
Shandong Univ. Eng. Sci. 2005, 34, 64–67.
169. Filho, C.J.A.B.; de Lima Neto, F.B.; Lins, A.J.C.C.; Nascimento, A.I.S.; Lima, M.P. Fish School Search. Nat-Inspired Algorithms
Optim. 2009, 193, 261–277. [CrossRef]
170. Shadravan, S.; Naji, H.R.; Bardsiri, V.K. The Sailfish Optimizer: A novel nature-inspired metaheuristic algorithm for solving
constrained engineering optimization problems. Eng. Appl. Artif. Intell. 2019, 80, 20–34. [CrossRef]
171. Mozaffari, A.; Fathi, A.; Behzadipour, S. The great salmon run: A novel bio-inspired algorithm for artificial system design and
optimisation. Int. J. Bio-Inspired Comput. 2012, 4, 286–301. [CrossRef]
172. Jahani, E.; Chizari, M. Tackling global optimization problems with a novel algorithm—Mouth Brooding Fish algorithm. Appl. Soft
Comput. 2018, 62, 987–1002. [CrossRef]
173. Zaldívar, D.; Morales, B.; Rodríguez, A.; Valdivia-G, A.; Cuevas, E.; Pérez-Cisneros, M. A novel bio-inspired optimization model
based on Yellow Saddle Goatfish behavior. Biosystems 2018, 174, 1–21. [CrossRef]
174. Zhao, W.; Zhang, Z.; Wang, L. Manta ray foraging optimization: An effective bio-inspired optimizer for engineering applications.
Eng. Appl. Artif. Intell. 2020, 87, 103300. [CrossRef]
175. Yilmaz, S.; Sen, S. Electric fish optimization: A new heuristic algorithm inspired by electrolocation. Neural Comput. Appl. 2020, 32,
11543–11578. [CrossRef]
176. Haldar, V.; Chakraborty, N. A novel evolutionary technique based on electrolocation principle of elephant nose fish and shark:
Fish electrolocation optimization. Soft Comput. 2017, 21, 3827–3848. [CrossRef]
177. Kaveh, A.; Farhoudi, N. A new optimization method: Dolphin echolocation. Adv. Eng. Softw. 2013, 59, 53–70. [CrossRef]
178. Shiqin, Y.; Jianjun, J.; Guangxing, Y. A Dolphin Partner Optimization. In Proceedings of the 2009 WRI Global Congress on
Intelligent Systems, Xiamen, China, 19–21 May 2009; pp. 124–128. [CrossRef]
179. Wu, T.; Yao, M.; Yang, J. Dolphin swarm algorithm. Front. Inf. Technol. Electron. Eng. 2016, 17, 717–729. [CrossRef]
180. Yong, W.; Tao, W.; Cheng-Zhi, Z.; Hua-Juan, H. A New Stochastic Optimization Approach—Dolphin Swarm Optimization
Algorithm. Int. J. Comput. Intell. Appl. 2016, 15, 1650011. [CrossRef]
181. Serani, A.; Diez, M. Dolphin Pod Optimization. In Advances in Swarm Intelligence: Proceedings of the 8th International Conference,
ICSI 2017, Fukuoka, Japan, 27 July–1 August 2017, Part I; Springer: Berlin/Heidelberg, Germany, 2017; pp. 50–62. [CrossRef]
182. Abedinia, O.; Amjady, N.; Ghasemi, A. A new metaheuristic algorithm based on shark smell optimization. Complexity 2016, 21,
97–116. [CrossRef]
183. Ebrahimi, A.; Khamehchi, E. Sperm whale algorithm: An effective metaheuristic algorithm for production optimization problems.
J. Nat. Gas Sci. Eng. 2016, 29, 211–222. [CrossRef]
184. Faramarzi, A.; Heidarinejad, M.; Mirjalili, S.; Gandomi, A.H. Marine Predators Algorithm: A nature-inspired metaheuristic.
Expert Syst. Appl. 2020, 152, 113377. [CrossRef]
185. Biyanto, T.R.; Matradji; Irawan, S.; Febrianto, H.Y.; Afdanny, N.; Rahman, A.H.; Gunawan, K.S.; Pratama, J.A.D.; Bethiana, T.N.
Killer Whale Algorithm: An Algorithm Inspired by the Life of Killer Whale. Procedia Comput. Sci. 2017, 124, 151–157. [CrossRef]
186. Zeng, B.; Gao, L.; Li, X. Whale Swarm Algorithm for Function Optimization. In Proceedings of the Intelligent Computing Theories
and Application: 13th International Conference, ICIC 2017, Liverpool, UK, 7–10 August 2017; pp. 624–639. [CrossRef]
187. Masadeh, R.; Sharieh, A.; Mahafzah, B.A. Humpback Whale Optimization Algorithm Based on Vocal Behavior for Task Scheduling in Cloud Computing. Int. J. Adv. Sci. Technol. 2019, 13, 121–140.
188. Uymaz, S.A.; Tezel, G.; Yel, E. Artificial algae algorithm (AAA) for nonlinear global optimization. Appl. Soft Comput. 2015, 31,
153–171. [CrossRef]
189. Salcedo-Sanz, S.; Del Ser, J.; Landa-Torres, I.; Gil-López, S.; Portilla-Figueras, J.A. The Coral Reefs Optimization Algorithm: A
Novel Metaheuristic for Efficiently Solving Optimization Problems. Sci. World J. 2014, 2014, 739768. [CrossRef] [PubMed]
190. Eesa, A.S.; Brifcani, A.M.A.; Orman, Z. Cuttlefish algorithm: A novel bio-inspired optimization algorithm. Int. J. Sci. Eng. Res. 2013, 4, 1978–1987.
191. An, J.; Kang, Q.; Wang, L.; Wu, Q. Mussels Wandering Optimization: An Ecologically Inspired Algorithm for Global Optimization.
Cognit. Comput. 2013, 5, 188–199. [CrossRef]
192. Kaur, S.; Awasthi, L.K.; Sangal, A.L.; Dhiman, G. Tunicate Swarm Algorithm: A new bio-inspired based metaheuristic paradigm
for global optimization. Eng. Appl. Artif. Intell. 2020, 90, 103541. [CrossRef]
193. Masadeh, R.; Mahafzah, B.A.; Sharieh, A. Sea Lion Optimization algorithm. Int. J. Adv. Comput. Sci. Appl. 2019, 10, 388–395.
[CrossRef]
194. Sulaiman, M.H.; Mustaffa, Z.; Saari, M.M.; Daniyal, H.; Musirin, I.; Daud, M.R. Barnacles mating optimizer: An evolutionary
algorithm for solving optimization. In Proceedings of the 2018 IEEE International Conference on Automatic Control and
Intelligent Systems (I2CACIS), Shah Alam, Malaysia, 20 October 2018; pp. 99–104. [CrossRef]
195. Pook, M.F.; Ramlan, E.I. The Anglerfish algorithm: A derivation of randomized incremental construction technique for solving
the traveling salesman problem. Evol. Intell. 2019, 12, 11–20. [CrossRef]
196. Catalbas, M.C.; Gulten, A. Circular structures of puffer fish: A new metaheuristic optimization algorithm. In Proceedings of the
2018 Third International Conference on Electrical and Biomedical Engineering, Clean Energy and Green Computing (EBECEGC),
Beirut, Lebanon, 25–27 April 2018; pp. 1–5. [CrossRef]
197. Ghojogh, B.; Sharifian, S. Pontogammarus maeoticus swarm optimization: A metaheuristic optimization algorithm. arXiv 2018,
arXiv:1807.01844.
198. Sukoon, M.; Banka, H. Water-Tank Fish Algorithm: A New Metaheuristic for Optimization. Int. J. Comput. Appl. 2018, 182, 1–5.
[CrossRef]
199. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [CrossRef]
200. Saber, M.; El-kenawy, E.-S.M. Design and implementation of accurate frequency estimator depend on deep learning. Int. J. Eng.
Technol. 2020, 9, 367–377. [CrossRef]
201. Eusuff, M.; Lansey, K.; Pasha, F. Shuffled frog-leaping algorithm: A memetic meta-heuristic for discrete optimization. Eng. Optim.
2006, 38, 129–154. [CrossRef]
202. Elbeltagi, E.; Hegazy, T.; Grierson, D. A modified shuffled frog-leaping optimization algorithm: Applications to project management. Struct. Infrastruct. Eng. 2007, 3, 53–60. [CrossRef]
203. Li, X.; Luo, J.; Chen, M.-R.; Wang, N. An improved shuffled frog-leaping algorithm with extremal optimisation for continuous
optimisation. Inf. Sci. 2012, 192, 143–151. [CrossRef]
204. Zhang, X.; Hu, X.; Cui, G.; Wang, Y.; Niu, Y. An improved shuffled frog leaping algorithm with cognitive behavior. In Proceedings
of the 2008 7th World Congress on Intelligent Control and Automation, Chongqing, China, 25–27 June 2008; pp. 6197–6202.
[CrossRef]
205. Chu, S.-C.; Tsai, P.; Pan, J.-S. Cat Swarm Optimization. In PRICAI 2006: Trends in Artificial Intelligence, 9th Pacific Rim International
Conference on Artificial Intelligence, Guilin, China, 7–11 August 2006, Proceedings; Springer: Berlin/Heidelberg, Germany, 2006; pp.
854–858. [CrossRef]
206. Bansal, J.C.; Sharma, H.; Jadon, S.S.; Clerc, M. Spider Monkey Optimization algorithm for numerical optimization. Memetic
Comput. 2014, 6, 31–47. [CrossRef]
207. Mucherino, A.; Seref, O.; Seref, O.; Kundakcioglu, O.E.; Pardalos, P. Monkey search: A novel metaheuristic search for global
optimization. In AIP Conference Proceedings; AIP: College Park, MD, USA, 2007; Volume 953, pp. 162–173. [CrossRef]
208. Meng, Z.; Pan, J.-S. Monkey King Evolution: A new memetic evolutionary algorithm and its application in vehicle fuel
consumption optimization. Knowl.-Based Syst. 2016, 97, 144–157. [CrossRef]
209. Mahmood, M.; Al-Khateeb, B. The blue monkey: A new nature inspired metaheuristic optimization algorithm. Period. Eng. Nat.
Sci. 2019, 7, 1054. [CrossRef]
210. Yazdani, M.; Jolai, F. Lion Optimization Algorithm (LOA): A nature-inspired metaheuristic algorithm. J. Comput. Des. Eng. 2016,
3, 24–36. [CrossRef]
211. Rajakumar, B.R. The Lion’s Algorithm: A New Nature-Inspired Search Algorithm. Procedia Technol. 2012, 6, 126–135. [CrossRef]
212. Wang, B.; Jin, X.; Cheng, B. Lion pride optimizer: An optimization algorithm inspired by lion pride behavior. Sci. China Inf. Sci.
2012, 55, 2369–2389. [CrossRef]
213. Kaveh, A.; Mahjoubi, S. Lion Pride Optimization Algorithm: A meta-heuristic method for global optimization problems. Sci. Iran.
2018, 25, 3113–3132. [CrossRef]
214. Tang, R.; Fong, S.; Yang, X.S.; Deb, S. Wolf search algorithm with ephemeral memory. In Proceedings of the Seventh International
Conference on Digital Information Management (ICDIM 2012), Macau, China, 22–24 August 2012; pp. 165–172. [CrossRef]
215. Wu, H.S.; Zhang, F.M. Wolf pack algorithm for unconstrained global optimization. Math. Probl. Eng. 2014, 2014, 465082.
[CrossRef]
216. Alhijawi, B. Dominion Algorithm: A novel metaheuristic optimization method. Int. J. Adv. Intell. Paradig. 2021, 20, 221–242.
217. Chi, M. An improved Wolf pack algorithm. In Proceedings of the International Conference on Artificial Intelligence, Information
Processing and Cloud Computing (AIIPCC’19), Sanya, China, 19–21 December 2019; ACM: Guildford, UK, 2019. [CrossRef]
218. Dhiman, G.; Kumar, V. Spotted hyena optimizer: A novel bio-inspired based metaheuristic technique for engineering applications.
Adv. Eng. Softw. 2017, 114, 48–70. [CrossRef]
219. Pierezan, J.; Dos Santos Coelho, L. Coyote Optimization Algorithm: A New Metaheuristic for Global Optimization Problems. In
Proceedings of the 2018 IEEE Congress on Evolutionary Computation (CEC), Rio de Janeiro, Brazil, 8–13 July 2018. [CrossRef]
220. Polap, D.; Woźniak, M. Polar bear optimization algorithm: Meta-heuristic with fast population movement and dynamic birth and
death mechanism. Symmetry 2017, 9, 203. [CrossRef]
221. Klein, C.E.; Mariani, V.C.; Coelho, L.D.S. Cheetah based optimization algorithm: A novel swarm intelligence paradigm. In
Proceedings of the ESANN 2018 Proceedings, European Symposium on Artificial Neural Networks, Computational Intelligence
and Machine Learning, Bruges, Belgium, 25–27 April 2018; pp. 685–690.
222. Goudhaman, M. Cheetah chase algorithm (CCA): A nature-inspired metaheuristic algorithm. Int. J. Eng. Technol. 2018, 7, 1804.
[CrossRef]
223. Chen, C.C.; Tsai, Y.C.; Liu, I.I.; Lai, C.C.; Yeh, Y.T.; Kuo, S.Y.; Chou, Y.H. A Novel Metaheuristic: Jaguar Algorithm with Learning
Behavior. In Proceedings of the 2015 IEEE International Conference on Systems, Man, and Cybernetics, Hong Kong, China, 9–12
October 2015; pp. 1595–1600. [CrossRef]
224. Subramanian, C. African Wild Dog Algorithm: A New Meta Heuristic Approach for Optimal Design of Steel Structures. Ph.D. Thesis, Anna University, Tamil Nadu, India, 2015.
225. Tripathi, A.K.; Sharma, K.; Bala, M. Military dog based optimizer and its application to fake review detection. arXiv 2019,
arXiv:1909.11890.
226. Zhang, L.M.; Dahlmann, C.; Zhang, Y. Human-Inspired Algorithms for continuous function optimization. In Proceedings of the
2009 IEEE International Conference on Intelligent Computing and Intelligent Systems, Shanghai, China, 20–22 November 2009;
pp. 318–321. [CrossRef]
227. Wang, G.G.; Deb, S.; Coelho, L.D.S. Elephant Herding Optimization. In Proceedings of the 2015 3rd International Symposium on
Computational and Business Intelligence (ISCBI), Bali, Indonesia, 7–9 December 2015; pp. 1–5. [CrossRef]
228. Deb, S.; Fong, S.; Tian, Z. Elephant Search Algorithm for optimization problems. In Proceedings of the 2015 Tenth International
Conference on Digital Information Management (ICDIM), Jeju, Republic of Korea, 21–23 October 2015; pp. 249–255. [CrossRef]
229. Jain, M.; Singh, V.; Rani, A. A novel nature-inspired algorithm for optimization: Squirrel search algorithm. Swarm Evol. Comput.
2019, 44, 148–175. [CrossRef]
230. Azizyan, G.; Miarnaeimi, F.; Rashki, M.; Shabakhty, N. Flying Squirrel Optimizer (FSO): A novel SI-based optimization algorithm
for engineering problems. Iran. J. Optim. 2019, 11, 177–205.
231. Klein, C.E.; Coelho, L.D.S. Meerkats-inspired algorithm for global optimization problems. In Proceedings of the ESANN 2018
Proceedings, European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, Bruges,
Belgium, 25–27 April 2018; pp. 679–684.
232. Al-Obaidi, A.T.S.; Abdullah, H.S.; Ahmed, Z.O. Meerkat clan algorithm: A new swarm intelligence algorithm. Indones. J. Electr.
Eng. Comput. Sci. 2018, 10, 354–360. [CrossRef]
233. Kim, H.; Ahn, B. A new evolutionary algorithm based on sheep flocks heredity model. In Proceedings of the 2001 IEEE Pacific
Rim Conference on Communications, Computers and Signal Processing (IEEE Cat. No.01CH37233), Victoria, BC, Canada, 26–28
August 2001; pp. 514–517. [CrossRef]
234. Kaveh, A.; Zaerreza, A. Shuffled shepherd optimization method: A new Meta-heuristic algorithm. Eng. Comput. 2020, 37,
2357–2389. [CrossRef]
235. Khalid Ibrahim, M.; Salim Ali, R. Novel Optimization Algorithm Inspired by Camel Traveling Behavior. Iraqi J. Electr. Electron.
Eng. 2016, 12, 167–177. [CrossRef]
236. Motevali, M.M.; Shanghooshabad, A.M.; Aram, R.Z.; Keshavarz, H. WHO: A New Evolutionary Algorithm Bio-Inspired by
Wildebeests with a Case Study on Bank Customer Segmentation. Int. J. Pattern Recognit. Artif. Intell. 2019, 33, 1959017. [CrossRef]
237. Maciel, C.O.; Cuevas, E.; Navarro, M.A.; Zaldívar, D.; Hinojosa, S. Side-Blotched Lizard Algorithm: A polymorphic population
approach. Appl. Soft Comput. J. 2020, 88, 106039. [CrossRef]
238. Zangbari Koohi, S.; Abdul Hamid, N.A.W.; Othman, M.; Ibragimov, G. Raccoon Optimization Algorithm. IEEE Access 2019, 7,
5383–5399. [CrossRef]
239. Tian, Z.; Fong, S.; Tang, R.; Deb, S.; Wong, R. Rhinoceros Search Algorithm. In Proceedings of the 2016 3rd International
Conference on Soft Computing & Machine Intelligence (ISCMI), Dubai, United Arab Emirates, 23–25 November 2016; pp. 18–22.
[CrossRef]
240. Yousefi, F.S.; Karimian, N.; Ghodousian, A. Xerus Optimization Algorithm (XOA): A novel nature-inspired metaheuristic
algorithm for solving global optimization problems. J. Algorithms Comput. 2019, 51, 111–126.
241. Wang, G.G.; Deb, S.; Dos Santos Coelho, L. Earthworm optimisation algorithm: A bio-inspired metaheuristic algorithm for global
optimisation problems. Int. J. Bio-Inspired Comput. 2018, 12, 1–22. [CrossRef]
242. Fathollahi Fard, A.M.; Hajiaghaei-Keshteli, M.; Tavakkoli-Moghaddam, R. Red Deer Algorithm (RDA); A New Optimization
Algorithm Inspired by Red Deers' Mating. In Proceedings of the 12th International Conference on Industrial Engineering (ICIE
2016), Tehran, Iran, 25–26 January 2016; pp. 1–10.
243. Mohammad, T.M.H.; Mohammad, H.B. A novel meta-heuristic algorithm for numerical function optimization: Blind, naked
mole-rats (BNMR) algorithm. Sci. Res. Essays 2012, 7, 3566–3583. [CrossRef]
244. Wang, G.-G.; Gao, X.-Z.; Zenger, K.; Coelho, L.D.S. A Novel Metaheuristic Algorithm inspired by Rhino Herd Behavior. In Proceedings of the 9th EUROSIM Congress on Modelling and Simulation, EUROSIM 2016, the 57th SIMS Conference on Simulation and Modelling SIMS 2016; Linköping University Electronic Press: Jönköping, Sweden, 2018; Volume 142, pp. 1026–1033. [CrossRef]
245. Shamsaldin, A.S.; Rashid, T.A.; Al-Rashid Agha, R.A.; Al-Salihi, N.K.; Mohammadi, M. Donkey and smuggler optimization
algorithm: A collaborative working approach to path finding. J. Comput. Des. Eng. 2019, 6, 562–583. [CrossRef]
246. Odili, J.B.; Kahar, M.N.M.; Anwar, S. African Buffalo Optimization: A Swarm-Intelligence Technique. Procedia Comput. Sci. 2015,
76, 443–448. [CrossRef]
247. Garcia, F.; Perez, J. Jumping frogs optimization: A new swarm method for discrete optimization. Doc. Trab. DEIOC 2008, 3, 10.
248. Yang, X.-S. Flower Pollination Algorithm for Global Optimization. In Unconventional Computation and Natural Computation: Proceedings of the 11th International Conference, UCNC 2012, Orléans, France, 3–7 September 2012; Springer: Berlin/Heidelberg, Germany, 2012; pp. 240–249. [CrossRef]
249. Abdel-Basset, M.; Shawky, L.A. Flower pollination algorithm: A comprehensive review. Artif. Intell. Rev. 2019, 52, 2533–2557.
[CrossRef]
250. Mehrabian, A.R.; Lucas, C. A novel numerical optimization algorithm inspired from weed colonization. Ecol. Inform. 2006, 1,
355–366. [CrossRef]
251. Hume, G. Dandelion (Taraxacum Officinale); Wikipedia. 2006. Available online: https://en.wikipedia.org/wiki/Taraxacum#
/media/File:DandelionFlower.jpg (accessed on 20 June 2023).
252. Epukas. Burdock (Arctium tomentosum). Wikipedia. 2008. Available online: https://en.wikipedia.org/wiki/Arctium#/media/File:Villtakjas_2008.jpg (accessed on 20 June 2023).
253. Stüber, K. Species: Amaranthus Tricolor Family: Amaranthaceae. Wikipedia. 2004. Available online: https://en.wikipedia.org/
wiki/Amaranth#/media/File:Amaranthus_tricolor0.jpg (accessed on 20 June 2023).
254. Kiran, M.S. TSA: Tree-seed algorithm for continuous optimization. Expert Syst. Appl. 2015, 42, 6686–6698. [CrossRef]
255. Ghaemi, M.; Feizi-Derakhshi, M.-R. Forest Optimization Algorithm. Expert Syst. Appl. 2014, 41, 6676–6687. [CrossRef]
256. Cheraghalipour, A.; Hajiaghaei-Keshteli, M.; Paydar, M.M. Tree Growth Algorithm (TGA): A novel approach for solving
optimization problems. Eng. Appl. Artif. Intell. 2018, 72, 393–414. [CrossRef]
257. Li, Q.Q.; Song, K.; He, Z.C.; Li, E.; Cheng, A.G.; Chen, T. The artificial tree (AT) algorithm. Eng. Appl. Artif. Intell. 2017, 65, 99–110.
[CrossRef]
258. Moez, H.; Kaveh, A.; Taghizadieh, N. Natural Forest Regeneration Algorithm: A New Meta-Heuristic. Iran. J. Sci. Technol. Trans.
Civ. Eng. 2016, 40, 311–326. [CrossRef]
259. Salhi, A.; Fraga, E.S. Nature-inspired optimisation approaches and the new plant propagation algorithm. Int. Conf. Numer. Anal.
Optim. 2011, K2. [CrossRef]
260. Merrikh-Bayat, F. A Numerical Optimization Algorithm Inspired by the Strawberry. arXiv 2014, arXiv:1407.7399.
261. Bidar, M.; Kanan, H.R.; Mouhoub, M.; Sadaoui, S. Mushroom Reproduction Optimization (MRO): A Novel Nature-Inspired
Evolutionary Algorithm. In Proceedings of the 2018 IEEE Congress on Evolutionary Computation (CEC), Rio de Janeiro, Brazil,
8–13 July 2018; pp. 1–10. [CrossRef]
262. Shayanfar, H.; Gharehchopogh, F.S. Farmland fertility: A new metaheuristic algorithm for solving continuous optimization
problems. Appl. Soft Comput. 2018, 71, 728–746. [CrossRef]
263. Premaratne, U.; Samarabandu, J.; Sidhu, T. A new biologically inspired optimization algorithm. In Proceedings of the 2009
International Conference on Industrial and Information Systems (ICIIS), Peradeniya, Sri Lanka, 28–31 December 2009; pp. 279–284.
[CrossRef]
264. Mohammadi, M.; Khodaygan, S. An algorithm for numerical nonlinear optimization: Fertile Field Algorithm (FFA). J. Ambient
Intell. Humaniz. Comput. 2020, 11, 865–878. [CrossRef]
265. Luqman, M.; Saeed, M.; Ali, J.; Tabassam, M.F.; Mahmood, T. Targeted showering optimization: Training irrigation tools to solve
crop planning problems. Pakistan J. Agric. Sci. 2019, 56, 225–235.
266. Merrikh-Bayat, F. The runner-root algorithm: A metaheuristic for solving unimodal and multimodal optimization problems
inspired by runners and roots of plants in nature. Appl. Soft Comput. J. 2015, 33, 292–303. [CrossRef]
267. Labbi, Y.; Ben Attous, D.; Gabbar, H.A.; Mahdad, B.; Zidan, A. A new rooted tree optimization algorithm for economic dispatch
with valve-point effect. Int. J. Electr. Power Energy Syst. 2016, 79, 298–311. [CrossRef]
268. Zhang, H.; Zhu, Y.; Chen, H. Root growth model: A novel approach to numerical function optimization and simulation of plant
root system. Soft Comput. 2014, 18, 521–537. [CrossRef]
269. Qi, X.; Zhu, Y.; Chen, H.; Zhang, D.; Niu, B. An Idea Based on Plant Root Growth for Numerical Optimization. In Intelligent
Computing Theories and Technology: Proceedings of the 9th International Conference, ICIC 2013, Nanning, China, 28–31 July 2013;
Springer: Berlin/Heidelberg, Germany, 2013; pp. 571–578. [CrossRef]
270. Cai, W.; Yang, W.; Chen, X. A global optimization algorithm based on plant growth theory: Plant growth optimization. In
Proceedings of the 2008 International Conference on Intelligent Computation Technology and Automation (ICICTA), Changsha,
China, 20–22 October 2008; pp. 1194–1199. [CrossRef]
271. Liu, L.; Song, Y.; Ma, H.; Zhang, X. Physarum optimization: A biology-inspired algorithm for minimal exposure path problem in wireless sensor networks. In Proceedings of the 2012 IEEE INFOCOM, Orlando, FL, USA, 25–30 March 2012; pp. 1296–1304. [CrossRef]
272. Feng, X.; Liu, Y.; Yu, H.; Luo, F. Physarum-energy optimization algorithm. Soft Comput. 2019, 23, 871–888. [CrossRef]
273. Karci, A.; Alatas, B. Thinking capability of saplings growing up algorithm. In International Conference on Intelligent Data
Engineering and Automated Learning: Proceedings of the 7th International Conference, Burgos, Spain, 20–23 September 2006; Springer:
Berlin/Heidelberg, Germany, 2006; pp. 386–393. [CrossRef]
274. Sulaiman, M.; Salhi, A. A seed-based plant propagation algorithm: The feeding station model. Sci. World J. 2015, 2015, 904364.
[CrossRef]
275. Zhao, Z.; Cui, Z.; Zeng, J.; Yue, X. Artificial plant optimization algorithm for constrained optimization problems. In Proceedings
of the 2011 Second International Conference on Innovations in Bio-inspired Computing and Applications, Shenzhen, China,
16–18 December 2011; pp. 120–123. [CrossRef]
276. Cheng, L.; Zhang, Q.; Tao, F.; Ni, K.; Cheng, Y. A novel search algorithm based on waterweeds reproduction principle for job
shop scheduling problem. Int. J. Adv. Manuf. Technol. 2016, 84, 405–424. [CrossRef]
277. Gowri, R.; Rathipriya, R. Non-Swarm Plant Intelligence Algorithm: BladderWorts Suction (BWS) Algorithm. In Proceedings of
the 2018 International Conference on Circuits and Systems in Digital Enterprise Technology (ICCSDET), Kottayam, India, 21–22
December 2018. [CrossRef]
278. Murase, H. Finite element inverse analysis using a photosynthetic algorithm. Comput. Electron. Agric. 2000, 29, 115–123.
[CrossRef]
279. Eskandar, H.; Sadollah, A.; Bahreininejad, A.; Hamdi, M. Water cycle algorithm—A novel metaheuristic optimization method for
solving constrained engineering optimization problems. Comput. Struct. 2012, 110–111, 151–166. [CrossRef]
280. Rabanal, P.; Rodríguez, I.; Rubio, F. Using river formation dynamics to design heuristic algorithms. In Proceedings of the 6th International Conference, UC 2007, Kingston, ON, Canada, 13–17 August 2007; pp. 163–177. [CrossRef]
281. Kaveh, A.; Bakhshpoori, T. Water Evaporation Optimization: A novel physically inspired optimization algorithm. Comput. Struct.
2016, 167, 69–85. [CrossRef]
282. Aghay Kaboli, S.H.; Selvaraj, J.; Rahim, N.A. Rain-fall optimization algorithm: A population based algorithm for solving
constrained optimization problems. J. Comput. Sci. 2017, 19, 31–42. [CrossRef]
283. Wedyan, A.; Whalley, J.; Narayanan, A. Hydrological Cycle Algorithm for Continuous Optimization Problems. J. Optim. 2017,
2017, 3828420. [CrossRef]
284. Gao-Wei, Y.; Zhanju, H. A Novel Atmosphere Clouds Model Optimization Algorithm. In Proceedings of the 2012 International Conference on Computing, Measurement, Control and Sensor Network, Taiyuan, China, 7–9 July 2012; pp. 217–220. [CrossRef]
285. Jiang, Q.; Wang, L.; Hei, X.; Fei, R.; Yang, D.; Zou, F.; Li, H.; Cao, Z.; Lin, Y. Optimal approximation of stable linear systems with
a novel and efficient optimization algorithm. In Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC),
Beijing, China, 6–11 July 2014; pp. 840–844. [CrossRef]
286. Shareef, H.; Ibrahim, A.A.; Mutlag, A.H. Lightning search algorithm. Appl. Soft Comput. 2015, 36, 315–333. [CrossRef]
287. Nematollahi, A.F.; Rahiminejad, A.; Vahidi, B. A novel physical based meta-heuristic optimization method known as Lightning
Attachment Procedure Optimization. Appl. Soft Comput. 2017, 59, 596–621. [CrossRef]
288. Bayraktar, Z.; Komurcu, M.; Werner, D.H. Wind Driven Optimization (WDO): A novel nature-inspired optimization algorithm and
its application to electromagnetics. In Proceedings of the 2010 IEEE Antennas and Propagation Society International Symposium,
Toronto, ON, Canada, 11–17 July 2010; pp. 1–4. [CrossRef]
289. Rbouh, I.; Imrani, A.A. El Hurricane-based Optimization Algorithm. AASRI Procedia 2014, 6, 26–33. [CrossRef]
290. Zhao, W.; Wang, L.; Zhang, Z. Artificial ecosystem-based optimization: A novel nature-inspired meta-heuristic algorithm. Neural
Comput. Appl. 2020, 32, 9383–9425. [CrossRef]
291. Adham, M.T.; Bentley, P.J. An Artificial Ecosystem Algorithm applied to static and Dynamic Travelling Salesman Problems.
In Proceedings of the 2014 IEEE International Conference on Evolvable Systems, Orlando, FL, USA, 9–12 December 2014;
pp. 149–156. [CrossRef]
292. Jahedbozorgan, M.; Amjadifard, R. Sunshine: A novel random search for continuous global optimization. In Proceedings of the
2016 1st Conference on Swarm Intelligence and Evolutionary Computation (CSIEC), Bam, Iran, 9–11 March 2016; pp. 12–17.
[CrossRef]
293. Hosseini, E.; Sadiq, A.S.; Ghafoor, K.Z.; Rawat, D.B.; Saif, M.; Yang, X. Volcano eruption algorithm for solving optimization
problems. Neural Comput. Appl. 2021, 33, 2321–2337. [CrossRef]
294. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95—International Conference on Neural
Networks, Perth, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [CrossRef]
295. Freitas, D.; Lopes, L.G.; Morgado-Dias, F. Particle Swarm Optimisation: A Historical Review up to the Current Developments.
Entropy 2020, 22, 362. [CrossRef]
296. Marini, F.; Walczak, B. Particle swarm optimization (PSO). A tutorial. Chemom. Intell. Lab. Syst. 2015, 149, 153–165. [CrossRef]
297. Rao, R.V.; Savsani, V.J.; Vakharia, D.P. Teaching–learning-based optimization: A novel method for constrained mechanical design
optimization problems. Comput. Des. 2011, 43, 303–315. [CrossRef]
298. Xie, X.-F.; Zhang, W.-J.; Yang, Z.-L. Social cognitive optimization for nonlinear programming problems. In Proceedings of the
International Conference on Machine Learning and Cybernetics, Beijing, China, 4–5 November 2002; Volume 2, pp. 779–783.
[CrossRef]
299. Xu, Y.; Cui, Z.; Zeng, J. Social Emotional Optimization Algorithm for Nonlinear Constrained Optimization Problems. In Swarm,
Evolutionary, and Memetic Computing: Proceedings of the First International Conference on Swarm, Evolutionary, and Memetic Computing,
SEMCCO 2010, Chennai, India, 16–18 December 2010; Springer: Berlin/Heidelberg, Germany, 2010; pp. 583–590. [CrossRef]
300. Shi, Y. Brain Storm Optimization Algorithm. In Advances in Swarm Intelligence, Part I: Proceedings of the Second International
Conference, ICSI 2011, Chongqing, China, 12–15 June 2011; Springer: Berlin/Heidelberg, Germany, 2011; pp. 303–309. [CrossRef]
301. Cheng, S.; Qin, Q.; Chen, J.; Shi, Y. Brain storm optimization algorithm: A review. Artif. Intell. Rev. 2016, 46, 445–458. [CrossRef]
302. Mousavirad, S.J.; Ebrahimpour-Komleh, H. Human mental search: A new population-based metaheuristic optimization algorithm.
Appl. Intell. 2017, 47, 850–887. [CrossRef]
303. Wang, L.; Ni, H.; Yang, R.; Fei, M.; Ye, W. A Simple Human Learning Optimization Algorithm. In Computational Intelligence,
Networked Systems and Their Applications: Proceedings of the International Conference of Life System Modeling and Simulation, LSMS
2014 and International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2014, Shanghai, China, 20–23
September 2014; Springer: Berlin/Heidelberg, Germany, 2014; pp. 56–65. [CrossRef]
304. Feng, X.; Zou, R.; Yu, H. A novel optimization algorithm inspired by the creative thinking process. Soft Comput. 2015, 19,
2955–2972. [CrossRef]
305. Atashpaz-Gargari, E.; Lucas, C. Imperialist competitive algorithm: An algorithm for optimization inspired by imperialistic
competition. In Proceedings of the 2007 IEEE Congress on Evolutionary Computation, Singapore, 25–28 September 2007;
pp. 4661–4667. [CrossRef]
306. Reynolds, R.G. An Introduction to Cultural Algorithms. In Proceedings of the 3rd Annual Conference on Evolutionary Programming; World Scientific Publishing: Singapore, 1994; pp. 131–139. Available online: https://www.researchgate.net/publication/201976967 (accessed on 20 June 2023).
307. Gandomi, A.H. Interior search algorithm (ISA): A novel approach for global optimization. ISA Trans. 2014, 53, 1168–1183.
[CrossRef] [PubMed]
308. Ghorbani, N.; Babaei, E. Exchange market algorithm. Appl. Soft Comput. 2014, 19, 177–187. [CrossRef]
Drones 2023, 7, 427 124 of 134
309. Punnathanam, V.; Kotecha, P. Yin-Yang-pair Optimization: A novel lightweight optimization algorithm. Eng. Appl. Artif. Intell.
2016, 54, 62–79. [CrossRef]
310. Shayeghi, H.; Dadashpour, J. Anarchic Society Optimization Based PID Control of an Automatic Voltage Regulator (AVR) System.
Electr. Electron. Eng. 2012, 2, 199–207. [CrossRef]
311. Yampolskiy, R.V.; Ashby, L.; Hassan, L. Wisdom of Artificial Crowds—A Metaheuristic Algorithm for Optimization. J. Intell.
Learn. Syst. Appl. 2012, 4, 98–107. [CrossRef]
312. Kulkarni, A.J.; Krishnasamy, G.; Abraham, A. Cohort Intelligence: A Socio-Inspired Optimization Method; Springer International
Publishing: Cham, Switzerland, 2017. [CrossRef]
313. Borji, A. A New Global Optimization Algorithm Inspired by Parliamentary Political Competitions. In MICAI 2007: Advances in
Artificial Intelligence; Springer: Berlin/Heidelberg, Germany, 2007; pp. 61–71. [CrossRef]
314. Chen, T. A Novel Bionic Intelligent Optimization Algorithm: Artificial Tribe Algorithm and its Performance Analysis. In
Proceedings of the 2010 International Conference on Measuring Technology and Mechatronics Automation, Changsha, China,
13–14 March 2010; pp. 222–225. [CrossRef]
315. Kashan, A.H.; Tavakkoli-Moghaddam, R.; Gen, M. A Warfare Inspired Optimization Algorithm: The Find-Fix-Finish-Exploit-Analyze (F3EA) Metaheuristic Algorithm. In Proceedings of the Tenth International Conference on Management Science and Engineering Management; Springer: Singapore, 2017; pp. 393–408. [CrossRef]
316. Khormouji, H.B.; Hajipour, H.; Rostami, H. BODMA: A novel metaheuristic algorithm for binary optimization problems based on open source Development Model Algorithm. In Proceedings of the 7th International Symposium on Telecommunications (IST'2014), Tehran, Iran, 9–11 September 2014; pp. 49–54. [CrossRef]
317. Chifu, V.R.; Salomie, I.; Chifu, E.Ş.; Pop, C.B.; Poruţiu, P.; Antal, M. Jigsaw inspired metaheuristic for selecting the optimal
solution in web service composition. Adv. Intell. Syst. Comput. 2016, 356, 573–584. [CrossRef]
318. Pincus, M. Letter to the Editor—A Monte Carlo Method for the Approximate Solution of Certain Types of Constrained Optimization Problems. Oper. Res. 1970, 18, 1225–1228. [CrossRef]
319. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by Simulated Annealing. Science 1983, 220, 671–680. [CrossRef] [PubMed]
320. Busetti, F. Simulated Annealing Overview. 2003; pp. 1–10. Available online: https://www.aiinfinance.com/saweb.pdf (accessed on 20 August 2021).
321. Varty, Z. Simulated Annealing Overview. 2017. Available online: http://lancs.ac.uk/~varty/RTOne.pdf (accessed on 20 August 2021).
322. Haddock, J.; Mittenthal, J. Simulation optimization using simulated annealing. Comput. Ind. Eng. 1992, 22, 387–395. [CrossRef]
323. Formato, R.A. Central force optimization: A new metaheuristic with applications in applied electromagnetics. Prog. Electromagn.
Res. 2007, 77, 425–491. [CrossRef]
324. Rashedi, E.; Nezamabadi-pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. 2009, 179, 2232–2248. [CrossRef]
325. Hatamlou, A. Black hole: A new heuristic optimization approach for data clustering. Inf. Sci. 2013, 222, 175–184. [CrossRef]
326. Erol, O.K.; Eksin, I. A new optimization method: Big Bang-Big Crunch. Adv. Eng. Softw. 2006, 37, 106–111. [CrossRef]
327. Hosseini, H.S. Principal components analysis by the galaxy-based search algorithm: A novel metaheuristic for continuous
optimisation. Int. J. Comput. Sci. Eng. 2011, 6, 132. [CrossRef]
328. Muthiah-Nakarajan, V.; Noel, M.M. Galactic Swarm Optimization: A new global optimization metaheuristic inspired by galactic
motion. Appl. Soft Comput. J. 2016, 38, 771–787. [CrossRef]
329. Hsiao, Y.T.; Chuang, C.L.; Jiang, J.A.; Chien, C.C. A novel optimization algorithm: Space gravitational optimization. In
Proceedings of the 2005 IEEE International Conference on Systems, Man and Cybernetics, Waikoloa, HI, USA, 12 October 2005;
pp. 2323–2328. [CrossRef]
330. Flores, J.J.; Lopez, R.; Barrera, J. Gravitational interactions optimization. In Learning and Intelligent Optimization: Proceedings of the
5th International Conference, LION 5, Rome, Italy, 17–21 January 2011; Springer: Berlin/Heidelberg, Germany, 2011; pp. 226–237.
[CrossRef]
331. Beiranvand, H.; Rokrok, E. General Relativity Search Algorithm: A Global Optimization Approach. Int. J. Comput. Intell. Appl.
2015, 14, 1550017. [CrossRef]
332. Bendato, I.; Cassettari, L.; Giribone, P.G.; Fioribello, S. Attraction Force Optimization (AFO): A deterministic nature-inspired
heuristic for solving optimization problems in stochastic simulation. Appl. Math. Sci. 2016, 10, 989–1011. [CrossRef]
333. Kilinç, N.; Mahouti, P.; Güneş, F. Space gravity optimization applied to the feasible design target space required for a wide-band
front-end amplifier. Prog. Electromagn. Res. Symp. 2013, 2013, 1495–1499.
334. Hudaib, A.A.; Fakhouri, H.N. Supernova Optimizer: A Novel Natural Inspired Meta-Heuristic. Mod. Appl. Sci. 2017, 12, 32.
[CrossRef]
335. Mirjalili, S.; Mirjalili, S.M.; Hatamlou, A. Multi-verse optimizer: A nature-inspired algorithm for global optimization. Neural
Comput. Appl. 2016, 27, 495–513. [CrossRef]
336. Zhao, W.; Wang, L.; Zhang, Z. Atom search optimization and its application to solve a hydrogeologic parameter estimation
problem. Knowledge-Based Syst. 2019, 163, 283–304. [CrossRef]
337. Rahmanzadeh, S.; Pishvaee, M.S. Electron radar search algorithm: A novel developed meta-heuristic algorithm. Soft Comput.
2020, 24, 8443–8465. [CrossRef]
338. Wei, Z.; Huang, C.; Wang, X.; Han, T.; Li, Y. Nuclear Reaction Optimization: A Novel and Powerful Physics-Based Algorithm for
Global Optimization. IEEE Access 2019, 7, 1–9. [CrossRef]
339. Yalcin, Y.; Pekcan, O. Nuclear Fission–Nuclear Fusion algorithm for global optimization: A modified Big Bang–Big Crunch
algorithm. Neural Comput. Appl. 2020, 32, 2751–2783. [CrossRef]
340. Birbil, Ş.I.; Fang, S.C. An electromagnetism-like mechanism for global optimization. J. Glob. Optim. 2003, 25, 263–282. [CrossRef]
341. Abedinpourshotorban, H.; Mariyam Shamsuddin, S.; Beheshti, Z.; Jawawi, D.N.A. Electromagnetic field optimization: A
physics-inspired metaheuristic optimization algorithm. Swarm Evol. Comput. 2016, 26, 8–22. [CrossRef]
342. Yadav, A. AEFA: Artificial electric field algorithm for global optimization. Swarm Evol. Comput. 2019, 48, 93–108. [CrossRef]
343. Bouchekara, H.R.E.H. Electrostatic discharge algorithm: A novel nature-inspired optimisation algorithm and its application to
worst-case tolerance analysis of an EMC filter. IET Sci. Meas. Technol. 2019, 13, 518–522. [CrossRef]
344. Fadafen, M.K.; Mehrshad, N.; Zahiri, S.H.; Razavi, S.M. A New Algorithm for Optimization Based on Ohm’s Law. CIVILICA
2017, 1, 16–22.
345. Kaveh, A.; Talatahari, S. A novel heuristic optimization method: Charged system search. Acta Mech. 2010, 213, 267–289. [CrossRef]
346. Ghasemi, M.; Ghavidel, S.; Aghaei, J.; Akbari, E.; Li, L. CFA optimizer: A new and powerful algorithm inspired by Franklin’s and
Coulomb’s laws theory for solving the economic load dispatch problems. Int. Trans. Electr. Energy Syst. 2018, 28, e2536. [CrossRef]
347. Zaránd, G.; Pázmándi, F.; Pál, K.F.; Zimányi, G.T. Using hysteresis for optimization. Phys. Rev. Lett. 2002, 89, 150201. [CrossRef]
348. Sadollah, A.; Bahreininejad, A.; Eskandar, H.; Hamdi, M. Mine blast algorithm: A new population based algorithm for solving
constrained engineering optimization problems. Appl. Soft Comput. J. 2013, 13, 2592–2612. [CrossRef]
349. Kaveh, A.; Dadras, A. A novel meta-heuristic optimization algorithm: Thermal exchange optimization. Adv. Eng. Softw. 2017, 110,
69–84. [CrossRef]
350. Ahrari, A.; Atai, A.A. Grenade Explosion Method—A novel tool for optimization of multimodal functions. Appl. Soft Comput. J.
2010, 10, 1132–1140. [CrossRef]
351. Patel, V.K.; Savsani, V.J. Heat transfer search (HTS): A novel optimization algorithm. Inf. Sci. 2015, 324, 217–246. [CrossRef]
352. Hashim, F.A.; Houssein, E.H.; Mabrouk, M.S.; Al-Atabany, W.; Mirjalili, S. Henry gas solubility optimization: A novel physics-based algorithm. Futur. Gener. Comput. Syst. 2019, 101, 646–667. [CrossRef]
353. Abdechiri, M.; Meybodi, M.R.; Bahrami, H. Gases brownian motion optimization: An algorithm for optimization (GBMO). Appl.
Soft Comput. J. 2013, 13, 2932–2946. [CrossRef]
354. Moein, S.; Logeswaran, R. KGMO: A swarm optimization algorithm based on the kinetic energy of gas molecules. Inf. Sci. 2014,
275, 127–144. [CrossRef]
355. Varaee, H.; Ghasemi, M.R. Engineering optimization based on ideal gas molecular movement algorithm. Eng. Comput. 2017, 33,
71–93. [CrossRef]
356. Zheng, Y.J. Water wave optimization: A new nature-inspired metaheuristic. Comput. Oper. Res. 2015, 55, 1–11. [CrossRef]
357. Doğan, B.; Ölmez, T. A new metaheuristic for numerical function optimization: Vortex Search algorithm. Inf. Sci. 2015, 293,
125–145. [CrossRef]
358. Shah-Hosseini, H. Intelligent water drops algorithm. Int. J. Intell. Comput. Cybern. 2008, 1, 193–212. [CrossRef]
359. Faramarzi, A.; Heidarinejad, M.; Stephens, B.; Mirjalili, S. Equilibrium optimizer: A novel optimization algorithm. Knowl-Based
Syst. 2020, 191, 105190. [CrossRef]
360. Ali, J.; Saeed, M.; Luqman, M.; Tabassum, M.F. Artificial Showering Algorithm: A New Meta-Heuristic for Unconstrained
Optimization. Sci. Int. 2015, 27, 4939–4942.
361. Colak, M.E.; Varol, A. A Novel Intelligent Optimization Algorithm Inspired from Circular Water Waves. Elektron. Elektrotechnika
2015, 21, 3–6. [CrossRef]
362. Cortés-Toro, E.M.; Crawford, B.; Gómez-Pulido, J.A.; Soto, R.; Lanza-Gutiérrez, J.M. A new metaheuristic inspired by the
vapour-liquid equilibrium for continuous optimization. Appl. Sci. 2018, 8, 2080. [CrossRef]
363. Tahani, M.; Babayan, N. Flow Regime Algorithm (FRA): A physics-based meta-heuristics algorithm. Knowl. Inf. Syst. 2019, 60,
1001–1038. [CrossRef]
364. Zou, Y. The whirlpool algorithm based on physical phenomenon for solving optimization problems. Eng. Comput. 2019, 36,
664–690. [CrossRef]
365. Kaveh, A.; Mahdavi, V.R. Colliding bodies optimization: A novel meta-heuristic method. Comput. Struct. 2014, 139, 18–27.
[CrossRef]
366. Javidy, B.; Hatamlou, A.; Mirjalili, S. Ions motion algorithm for solving optimization problems. Appl. Soft Comput. J. 2015, 32,
72–79. [CrossRef]
367. Kaveh, A.; Ilchi Ghazaan, M. A new meta-heuristic algorithm: Vibrating particles system. Sci. Iran. 2017, 24, 551–566. [CrossRef]
368. Sacco, W.F.; de Oliveira, C.R.E. A new stochastic optimization algorithm based on a particle collision metaheuristic. In Proceedings
of the 6th World Congresses of Structural and Multidisciplinary Optimization, Rio de Janeiro, Brazil, 30 May–3 June 2005.
369. Mejía-de-Dios, J.-A.; Mezura-Montes, E. A New Evolutionary Optimization Method Based on Center of Mass. In Decision Science
in Action: Theory and Applications of Modern Decision Analytic Optimisation; Springer: Berlin/Heidelberg, Germany, 2019; pp. 65–74.
[CrossRef]
370. Xie, L.; Zeng, J.; Cui, Z. General framework of artificial physics optimization algorithm. In Proceedings of the 2009 World Congress
on Nature & Biologically Inspired Computing (NaBIC), Coimbatore, India, 9–11 December 2009; pp. 1321–1326. [CrossRef]
371. Cuevas, E.; Echavarría, A.; Ramírez-Ortegón, M.A. An optimization algorithm inspired by the States of Matter that improves the
balance between exploration and exploitation. Appl. Intell. 2014, 40, 256–272. [CrossRef]
372. Kaveh, A.; Khayatazad, M. A new meta-heuristic method: Ray Optimization. Comput. Struct. 2012, 112–113, 283–294. [CrossRef]
373. Husseinzadeh Kashan, A. A new metaheuristic for optimization: Optics inspired optimization (OIO). Comput. Oper. Res. 2015, 55,
99–125. [CrossRef]
374. Baykasoğlu, A.; Akpinar, Ş. Weighted Superposition Attraction (WSA): A swarm intelligence algorithm for optimization
problems—Part 1: Unconstrained optimization. Appl. Soft Comput. J. 2017, 56, 520–540. [CrossRef]
375. Tzanetos, A.; Dounias, G. A new metaheuristic method for optimization: Sonar inspired optimization. Commun. Comput. Inf. Sci.
2017, 744, 417–428. [CrossRef]
376. Feng, X.; Ma, M.; Yu, H. Crystal energy optimization algorithm. Comput. Intell. 2016, 32, 284–322. [CrossRef]
377. Dehghani, M.; Montazeri, Z.; Dehghani, A.; Nouri, N.; Seifi, A. BSSA: Binary spring search algorithm. In Proceedings of the 2017 IEEE 4th International Conference on Knowledge-Based Engineering and Innovation (KBEI), Tehran, Iran, 22 December 2017; pp. 0220–0224. [CrossRef]
378. Tan, Y.; Zhu, Y. Fireworks algorithm for optimization. In International Conference in Swarm Intelligence: Proceedings of the First
International Conference, ICSI 2010, Beijing, China, 12–15 June 2010; Springer: Berlin/Heidelberg, Germany, 2010; pp. 355–364.
[CrossRef]
379. Lam, A.Y.S.; Li, V.O.K. Chemical-Reaction-Inspired Metaheuristic for Optimization. IEEE Trans. Evol. Comput. 2010, 14, 381–399.
[CrossRef]
380. Alatas, B. ACROA: Artificial Chemical Reaction Optimization Algorithm for global optimization. Expert Syst. Appl. 2011, 38,
13170–13180. [CrossRef]
381. Siddique, N.; Adeli, H. Nature-Inspired Chemical Reaction Optimisation Algorithms. Cognit. Comput. 2017, 9, 411–422. [CrossRef]
382. Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowledge-Based Syst. 2016, 96, 120–133. [CrossRef]
383. Salimi, H. Stochastic Fractal Search: A powerful metaheuristic algorithm. Knowledge-Based Syst. 2015, 75, 1–18. [CrossRef]
384. Ibrahim, Z.; Aziz, N.H.A.; Aziz, N.A.A.; Razali, S.; Mohamad, M.S. Simulated Kalman Filter: A Novel Estimation-Based
Metaheuristic Optimization Algorithm. Adv. Sci. Lett. 2016, 22, 2941–2946. [CrossRef]
385. Salem, S.A. BOA: A novel optimization algorithm. In Proceedings of the 2012 International Conference on Engineering and
Technology (ICET), Cairo, Egypt, 10–11 October 2012; pp. 1–5. [CrossRef]
386. Tanyildizi, E.; Demir, G. Golden Sine Algorithm: A Novel Math-Inspired Algorithm. Adv. Electr. Comput. Eng. 2017, 17, 71–78. [CrossRef]
387. Zhao, J.; Tang, D.; Liu, Z.; Cai, Y.; Dong, S. Spherical search optimizer: A simple yet efficient meta-heuristic approach. Neural
Comput. Appl. 2020, 32, 9777–9808. [CrossRef]
388. Geem, Z.W.; Kim, J.H.; Loganathan, G.V. A new heuristic optimization algorithm: Harmony search. Simulation 2001, 76, 60–68.
[CrossRef]
389. Ashrafi, S.M.; Dariane, A.B. A novel and effective algorithm for numerical optimization: Melody Search (MS). In Proceedings of
the 2011 11th International Conference on Hybrid Intelligent Systems (HIS), Melacca, Malaysia, 5–8 December 2011; pp. 109–114.
[CrossRef]
390. Weyland, D. A Rigorous Analysis of the Harmony Search Algorithm. Int. J. Appl. Metaheuristic Comput. 2010, 1, 50–60. [CrossRef]
391. Mora-Gutiérrez, R.A.; Ramírez-Rodríguez, J.; Rincón-García, E.A. An optimization algorithm inspired by musical composition.
Artif. Intell. Rev. 2014, 41, 301–315. [CrossRef]
392. Kashan, A.H. League Championship Algorithm: A New Algorithm for Numerical Function Optimization. In Proceedings of the
2009 International Conference of Soft Computing and Pattern Recognition, Malacca, Malaysia, 4–7 December 2009; pp. 43–48.
[CrossRef]
393. Osaba, E.; Diaz, F.; Onieva, E. Golden ball: A novel meta-heuristic to solve combinatorial optimization problems based on soccer
concepts. Appl. Intell. 2014, 41, 145–166. [CrossRef]
394. Moosavian, N.; Roodsari, B.K. Soccer League Competition Algorithm, a New Method for Solving Systems of Nonlinear Equations.
Int. J. Intell. Sci. 2014, 4, 7–16. [CrossRef]
395. Fadakar, E.; Ebrahimi, M. A new metaheuristic football game inspired algorithm. In Proceedings of the 2016 1st Conference on
Swarm Intelligence and Evolutionary Computation (CSIEC), Bam, Iran, 9–11 March 2016; pp. 6–11. [CrossRef]
396. Kaveh, A.; Zolghadr, A. A Novel Meta-Heuristic Algorithm: Tug of War Optimization. Int. J. Optim. Civ. Eng. 2016, 6, 469–492.
397. Blum, C.; Puchinger, J.; Raidl, G.R.; Roli, A. Hybrid metaheuristics in combinatorial optimization: A survey. Appl. Soft Comput.
2011, 11, 4135–4151. [CrossRef]
398. Nabil, E. A Modified Flower Pollination Algorithm for Global Optimization. Expert Syst. Appl. 2016, 57, 192–203. [CrossRef]
399. Tseng, L.-Y.; Liang, S.-C. A Hybrid Metaheuristic for the Quadratic Assignment Problem. Comput. Optim. Appl. 2006, 34, 85–113.
[CrossRef]
400. D’Andreagiovanni, F.; Krolikowski, J.; Pulaj, J. A fast hybrid primal heuristic for multiband robust capacitated network design
with multiple time periods. Appl. Soft Comput. 2015, 26, 497–507. [CrossRef]
401. Fontes, D.B.M.M.; Homayouni, S.M.; Gonçalves, J.F. A hybrid particle swarm optimization and simulated annealing algorithm
for the job shop scheduling problem with transport resources. Eur. J. Oper. Res. 2023, 306, 1140–1157. [CrossRef]
402. Binu, D.; Selvi, M.; George, A. MKF-Cuckoo: Hybridization of Cuckoo Search and Multiple Kernel-based Fuzzy C-means
Algorithm. AASRI Procedia 2013, 4, 243–249. [CrossRef]
403. Yue, Z.; Zhang, S.; Xiao, W. A Novel Hybrid Algorithm Based on Grey Wolf Optimizer and Fireworks Algorithm. Sensors 2020, 20,
2147. [CrossRef]
404. Jia, H.; Xing, Z.; Song, W. A New Hybrid Seagull Optimization Algorithm for Feature Selection. IEEE Access 2019, 7, 49614–49631.
[CrossRef]
405. Zhang, Z.; Ding, S.; Jia, W. A hybrid optimization algorithm based on cuckoo search and differential evolution for solving
constrained engineering problems. Eng. Appl. Artif. Intell. 2019, 85, 254–268. [CrossRef]
406. Dey, B.; Raj, S.; Mahapatra, S.; Márquez, F.P.G. Optimal scheduling of distributed energy resources in microgrid systems based on
electricity market pricing strategies by a novel hybrid optimization technique. Int. J. Electr. Power Energy Syst. 2022, 134, 107419.
[CrossRef]
407. Kottath, R.; Singh, P.; Bhowmick, A. Swarm-based hybrid optimization algorithms: An exhaustive analysis and its applications to
electricity load and price forecasting. Soft Comput. 2023, 1–32. [CrossRef] [PubMed]
408. Yeniay, Ö. Penalty Function Methods for Constrained Optimization with Genetic Algorithms. Math. Comput. Appl. 2005, 10,
45–56. [CrossRef]
409. Mezura-Montes, E.; Coello Coello, C.A. Constraint-handling in nature-inspired numerical optimization: Past, present and future.
Swarm Evol. Comput. 2011, 1, 173–194. [CrossRef]
410. Chehouri, A.; Younes, R.; Perron, J.; Ilinca, A. A constraint-handling technique for genetic algorithms using a violation factor. J.
Comput. Sci. 2016, 12, 350–362. [CrossRef]
411. Gen, M.; Cheng, R. A survey of penalty techniques in genetic algorithms. In Proceedings of the IEEE International Conference on
Evolutionary Computation, Nagoya, Japan, 20–22 May 1996; pp. 804–809. [CrossRef]
412. Jordehi, A.R. A review on constraint handling strategies in particle swarm optimisation. Neural Comput. Appl. 2015, 26, 1265–1275.
[CrossRef]
413. Vrbančič, G.; Brezočnik, L.; Mlakar, U.; Fister, D.; Fister, I., Jr. NiaPy: Python microframework for building nature-inspired
algorithms. J. Open Source Softw. 2018, 3, 613. [CrossRef]
414. Darvishpoor, S.; Darvishpour, A. NIA, PYPI. 2021. Available online: https://pypi.org/project/nia/ (accessed on 9 May 2022).
415. Darvishpoor, S. Nature Inspired Algorithms Review, GitHub. 2022. Available online: https://github.com/shahind/Nature-Inspired-Algorithms-Review (accessed on 4 March 2022).
416. Jamil, M.; Yang, X.S. A literature survey of benchmark functions for global optimisation problems. Int. J. Math. Model. Numer.
Optim. 2013, 4, 150–194. [CrossRef]
417. Bingham, D. Optimization Test Problems, Simon Fraser Univ. 2013. Available online: https://www.sfu.ca/~ssurjano/optimization.html (accessed on 4 March 2022).
418. Al-Roomi, A.R. Unconstrained Multi-Objective Benchmark Functions Repository. 2016. Available online: https://www.al-roomi.org/benchmarks/multi-objective/unconstrained-list (accessed on 20 June 2023).
419. Darvishpoor, S.; Darvishpour, A. Modified NiaPy, GitHub. 2022. Available online: https://github.com/salar-shdk/NiaPy
(accessed on 21 June 2022).
420. Digalakis, J.G.; Margaritis, K.G. On benchmarking functions for genetic algorithms. Int. J. Comput. Math. 2001, 77, 481–506.
[CrossRef]
421. Chen, H.; Zhu, Y.; Hu, K. Adaptive Bacterial Foraging Optimization. Abstr. Appl. Anal. 2011, 2011, 108269. [CrossRef]
422. Liu, X.; Lu, P.; Pan, B. Survey of convex optimization for aerospace applications. Astrodynamics 2017, 1, 23–40. [CrossRef]
423. Padula, S.L.; Gumbert, C.R.; Li, W. Aerospace applications of optimization under uncertainty. Optim. Eng. 2006, 7, 317–328.
[CrossRef]
424. Mieloszyk, J. Practical problems of numerical optimization in aerospace sciences. Aircr. Eng. Aerosp. Technol. 2017, 89, 570–578.
[CrossRef]
425. Lian, Y.; Oyama, A.; Liou, M.-S. Progress in design optimization using evolutionary algorithms for aerodynamic problems. Prog.
Aerosp. Sci. 2010, 46, 199–223. [CrossRef]
426. Gage, P.J. New Approaches to Optimisation in Aerospace Conceptual Design; Stanford University: Stanford, CA, USA, 1994.
427. Crossley, W.A.; Laananen, D.H. Conceptual design of helicopters via genetic algorithm. J. Aircr. 1996, 33, 1062–1070. [CrossRef]
428. Champasak, P.; Panagant, N.; Bureerat, S.; Pholdee, N. Investigation on the performance of meta-heuristics for solving single
objective conceptual design of a conventional fixed wing unmanned aerial vehicle. J. Res. Appl. Mech. Eng. 2022, 10, 1.
429. Jafarsalehi, A.; Fazeley, H.R.; Mirshams, M. Conceptual Remote Sensing Satellite Design Optimization under uncertainty. Aerosp.
Sci. Technol. 2016, 55, 377–391. [CrossRef]
430. Jilla, C.; Miller, D. A Multiobjective, Multidisciplinary Design Optimization Methodology for the Conceptual Design of Distributed
Satellite Systems. In Proceedings of the 9th AIAA/ISSMO Symposium on Multidisciplinary Analysis and Optimization, Panama
City Beach, FL, USA, 4–6 September 2002; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2002. [CrossRef]
431. Abedini, A.; Bataleblu, A.A.; Roshanian, J. Co-design Optimization of a Novel Multi-identity Drone Helicopter (MICOPTER). J.
Intell. Robot. Syst. 2022, 106, 56. [CrossRef] [PubMed]
432. Hassanalian, M.; Salazar, R.; Abdelkefi, A. Conceptual design and optimization of a tilt-rotor micro air vehicle. Chin. J. Aeronaut. 2019, 32, 369–381. [CrossRef]
433. Blasi, L.; Del Core, G. Particle Swarm Approach in Finding Optimum Aircraft Configuration. J. Aircr. 2007, 44, 679–683. [CrossRef]
434. Corrado, G.; Ntourmas, G.; Sferza, M.; Traiforos, N.; Arteiro, A.; Brown, L.; Chronopoulos, D.; Daoud, F.; Glock, F.; Ninic, J.; et al.
Recent progress, challenges and outlook for multidisciplinary structural optimization of aircraft and aerial vehicles. Prog. Aerosp.
Sci. 2022, 135, 100861. [CrossRef]
435. Sobieszczanski-Sobieski, J.; Haftka, R.T. Multidisciplinary aerospace design optimization: Survey of recent developments. Struct.
Optim. 1997, 14, 1–23. [CrossRef]
436. Keane, A.; Scanlan, J. Design search and optimization in aerospace engineering. Philos. Trans. R. Soc. A Math. Phys. Eng. Sci. 2007,
365, 2501–2529. [CrossRef]
437. Neufeld, D.; Chung, J.; Behdinan, K. Development of a flexible MDO architecture for aircraft conceptual design. In Proceedings
of the 2008 EngOpt conference (International Conference on Engineering Optimization), Rio de Janeiro, Brazil, 1–5 June 2008;
pp. 1–8.
438. Ganguli, R.; Rajagopal, S. Multidisciplinary Design Optimization of an UAV Wing Using Kriging Based Multi-Objective
Genetic Algorithm. In Proceedings of the 50th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials
Conference; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2009. [CrossRef]
439. Ampellio, E.; Bertini, F.; Ferrero, A.; Larocca, F.; Vassio, L. Turbomachinery design by a swarm-based optimization method
coupled with a CFD solver. Adv. Aircr. Spacecr. Sci. 2016, 3, 149–170. [CrossRef]
440. Jafari, S.; Nikolaidis, T. Meta-heuristic global optimization algorithms for aircraft engines modelling and controller design; A
review, research challenges, and exploring the future. Prog. Aerosp. Sci. 2019, 104, 40–53. [CrossRef]
441. Gur, O.; Rosen, A. Optimizing Electric Propulsion Systems for Unmanned Aerial Vehicles. J. Aircr. 2009, 46, 1340–1353. [CrossRef]
442. Pelz, P.F.; Leise, P.; Meck, M. Sustainable aircraft design—A review on optimization methods for electric propulsion with derived
optimal number of propulsors. Prog. Aerosp. Sci. 2021, 123, 100714. [CrossRef]
443. Wang, X.; Damodaran, M. Comparison of Deterministic and Stochastic Optimization Algorithms for Generic Wing Design
Problems. J. Aircr. 2000, 37, 929–932. [CrossRef]
444. Boulkabeit, I.; Mthembu, L.; Marwala, T.; de Lima Neto, F.B. Finite Element Model Updating Using Fish School Search Optimization
Method. In Proceedings of the 2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational
Intelligence, Ipojuca, Brazil, 8–11 September 2013; pp. 447–452. [CrossRef]
445. Toropov, V.V.; Jones, R.; Willment, T.; Funnell, M. Weight and Manufacturability Optimization of Composite Aircraft Components
Based on a Genetic Algorithm. In Proceedings of the 6th World Congresses of Structural and Multidisciplinary Optimization, Rio
de Janeiro, Brazil, 30 May–3 June 2005.
446. Viana, F.A.C.; Steffen, V.; Butkewitsch, S.; de Freitas Leal, M. Optimization of aircraft structural components by using nature-inspired algorithms and multi-fidelity approximations. J. Glob. Optim. 2009, 45, 427–449. [CrossRef]
447. Sandeep, R.; Jeevanantham, A.K.; Manikandan, M.; Arivazhagan, N.; Tofil, S. Multi-Performance Optimization in Friction Stir
Welding of AA6082/B4C Using Genetic Algorithm and Desirability Function Approach for Aircraft Wing Structures. J. Mater.
Eng. Perform. 2021, 30, 5845–5857. [CrossRef]
448. Weis, L.; Koke, H.; Huhne, C. Structural optimisation of a composite aircraft frame applying a particle swarm algorithm. In
Proceedings of the 2015 IEEE Congress on Evolutionary Computation (CEC), Sendai, Japan, 25–28 May 2015; pp. 582–588.
[CrossRef]
449. Keshtegar, B.; Hao, P.; Wang, Y.; Hu, Q. An adaptive response surface method and Gaussian global-best harmony search algorithm
for optimization of aircraft stiffened panels. Appl. Soft Comput. 2018, 66, 196–207. [CrossRef]
450. Varatharajoo, R.; Romli, F.I.; Ahmad, K.A.; Majid, D.L.; Mustapha, F. Aeroelastic Tailoring of Composite Wing Design Using Bee
Colony Optimisation. Appl. Mech. Mater. 2014, 629, 182–188. [CrossRef]
451. Georgiou, G.; Vio, G.A.; Cooper, J.E. Aeroelastic tailoring and scaling using Bacterial Foraging Optimisation. Struct. Multidiscip.
Optim. 2014, 50, 81–99. [CrossRef]
452. de Wit, A.J.; Lammen, W.F.; Vankan, W.J.; Timmermans, H.; van der Laan, T.; Ciampa, P.D. Aircraft rudder optimization—A
multi-level and knowledge-enabled approach. Prog. Aerosp. Sci. 2020, 119, 100650. [CrossRef]
453. Li, J.; Du, X.; Martins, J.R.R.A. Machine learning in aerodynamic shape optimization. Prog. Aerosp. Sci. 2022, 134, 100849.
[CrossRef]
454. Giannakoglou, K.C. Design of optimal aerodynamic shapes using stochastic optimization methods and computational intelligence.
Prog. Aerosp. Sci. 2002, 38, 43–76. [CrossRef]
455. Yu, Y.; Lyu, Z.; Xu, Z.; Martins, J.R.R.A. On the influence of optimization algorithm and initial design on wing aerodynamic shape
optimization. Aerosp. Sci. Technol. 2018, 75, 183–199. [CrossRef]
456. Olhofer, M.; Jin, Y.; Sendhoff, B. Adaptive encoding for aerodynamic shape optimization using evolution strategies. In Proceedings
of the 2001 Congress on Evolutionary Computation (IEEE Cat. No.01TH8546), Seoul, Republic of Korea, 27–30 May 2001; Volume 1,
pp. 576–583. [CrossRef]
457. Tian, X.; Li, J. A novel improved fruit fly optimization algorithm for aerodynamic shape design optimization. Knowl-Based Syst.
2019, 179, 77–91. [CrossRef]
458. Hoyos, J.; Jímenez, J.H.; Echavarría, C.; Alvarado, J.P. Airfoil Shape Optimization: Comparative Study of Meta-heuristic
Algorithms, Airfoil Parameterization Methods and Reynolds Number Impact. IOP Conf. Ser. Mater. Sci. Eng. 2021, 1154, 012016.
[CrossRef]
459. Naumann, D.S.; Evans, B.; Walton, S.; Hassan, O. A novel implementation of computational aerodynamic shape optimisation
using Modified Cuckoo Search. Appl. Math. Model. 2016, 40, 4543–4559. [CrossRef]
460. Derakhshan, S.; Tavaziani, A.; Kasaeian, N. Numerical Shape Optimization of a Wind Turbine Blades Using Artificial Bee Colony
Algorithm. J. Energy Resour. Technol. 2015, 137, 051210. [CrossRef]
461. Hoseynipoor, M.; Malek Jafarian, M.; Safavinejad, A. Two-objective optimization of aerodynamic shapes using gravitational
search algorithm. Modares Mech. Eng. 2017, 17, 211–220.
462. Jalili, F.; MalekJafarian, S.M.; Safavinejad, A.; Masoumi, H. A New Modified Harmony Search Optimization Algorithm for
Evaluating Airfoil Shape Parameterization Methods and Aerodynamic Optimization. Iran. J. Mech. Eng. Trans. ISME 2022, 23,
80–104.
463. Jalili, F.; Malek-Jafarian, M.; Safavinejad, A. Introduction of Harmony Search Algorithm for Aerodynamic Shape Optimization
Using. J. Appl. Comput. Sci. Mech. 2013, 24, 81–96.
464. Darvishpoor, S.; Roshanian, J.; Raissi, A.; Hassanalian, M. Configurations, flight mechanisms, and applications of unmanned
aerial systems: A review. Prog. Aerosp. Sci. 2020, 121, 100694. [CrossRef]
465. Keane, A.J. Wing Optimization Using Design of Experiment, Response Surface, and Data Fusion Methods. J. Aircr. 2003, 40,
741–750. [CrossRef]
466. Vicini, A.; Quagliarella, D. Airfoil and Wing Design Through Hybrid Optimization Strategies. AIAA J. 1999, 37, 634–641.
[CrossRef]
467. Venter, G.; Sobieszczanski-Sobieski, J. Multidisciplinary optimization of a transport aircraft wing using particle swarm optimiza-
tion. Struct. Multidiscip. Optim. 2004, 26, 121–131. [CrossRef]
468. Wang, W.; Guo, S.; Yang, W. Simultaneous partial topology and size optimization of a wing structure using ant colony and
gradient based methods. Eng. Optim. 2011, 43, 433–446. [CrossRef]
469. Martinez, A.D.; Osaba, E.; Oregi, I.; Fister, I.; Fister, I.; Del Ser, J. Hybridizing differential evolution and novelty search for
multimodal optimization problems. In Proceedings of the Genetic and Evolutionary Computation Conference Companion,
Prague, Czech Republic, 13–17 July 2019; ACM: New York, NY, USA, 2019; pp. 1980–1989.
470. Li, Y.; Ge, W.; Zhou, J.; Zhang, Y.; Zhao, D.; Wang, Z.; Dong, D. Design and experiment of concentrated flexibility-based variable
camber morphing wing. Chin. J. Aeronaut. 2019, 35, 455–469. [CrossRef]
471. Koreanschi, A.; Sugar Gabor, O.; Acotto, J.; Brianchon, G.; Portier, G.; Botez, R.M.; Mamou, M.; Mebarki, Y. Optimization and
design of an aircraft’s morphing wing-tip demonstrator for drag reduction at low speed, Part I—Aerodynamic optimization
using genetic, bee colony and gradient descent algorithms. Chin. J. Aeronaut. 2017, 30, 149–163. [CrossRef]
472. Darvishpoor, S.; Roshanian, J.; Tayefi, M. A novel concept of VTOL bi-rotor UAV based on moving mass control. Aerosp. Sci.
Technol. 2020, 107, 106238. [CrossRef]
473. Sudmeijer, K.; Mooij, E. Shape Optimization for a Small Experimental Re-entry Module. In Proceedings of the AIAA/AAAF 11th
International Space Planes and Hypersonic Systems and Technologies Conference, Orleans, France, 29 September–4 October 2002;
American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2002.
474. Suzdaltsev, I.V.; Chermoshencev, S.F.; Bogula, N.Y. Genetic algorithm for onboard equipment placement inside the unmanned
aerial vehicle fuselage. In Proceedings of the 2016 XIX IEEE International Conference on Soft Computing and Measurements
(SCM), St. Petersburg, Russia, 25–27 May 2016; pp. 262–264. [CrossRef]
475. Li, L.; Chen, M.; Cao, F.; Ma, Y. Coaxial helicopter optimum dynamics design based on multi-objective bat algorithm and
experimental validation. In Proceedings of the 2017 8th International Conference on Mechanical and Aerospace Engineering
(ICMAE), Prague, Czech Republic, 22–25 July 2017; pp. 411–415. [CrossRef]
476. Viviani, A.; Iuspa, L.; Aprovitola, A. An optimization-based procedure for self-generation of Re-entry Vehicles shape. Aerosp. Sci.
Technol. 2017, 68, 123–134. [CrossRef]
477. Arora, R.; Kumar, P. Aerodynamic Shape Optimization of a Re-entry Capsule. In AIAA Atmospheric Flight Mechanics Conference
and Exhibit; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2003. [CrossRef]
478. Wang, Z.; Yu, J.; Zhang, A.; Wang, Y.; Zhao, W. Parametric geometric model and hydrodynamic shape optimization of a
flying-wing structure underwater glider. China Ocean Eng. 2017, 31, 709–715. [CrossRef]
479. Rodríguez-Cortés, H.; Arias-Montaño, A. Robust geometric sizing of a small flying wing planform based on evolutionary
algorithms. Aeronaut. J. 2012, 116, 175–188. [CrossRef]
480. Chen, X.; Yao, W.; Zhao, Y.; Chen, X.; Zhang, J.; Luo, Y. The Hybrid Algorithms Based on Differential Evolution for Satellite
Layout Optimization Design. In Proceedings of the 2018 IEEE Congress on Evolutionary Computation (CEC), Rio de Janeiro,
Brazil, 8–13 July 2018; pp. 1–8. [CrossRef]
481. Israr, A.; Ali, Z.A.; Alkhammash, E.H.; Jussila, J.J. Optimization Methods Applied to Motion Planning of Unmanned Aerial
Vehicles: A Review. Drones 2022, 6, 126. [CrossRef]
482. Konatowski, S.; Pawłowski, P. Application of the ACO algorithm for UAV path planning. Prz. Elektrotechniczny 2019, 95, 115–119.
[CrossRef]
483. Yu, X.; Li, C.; Zhou, J. A constrained differential evolution algorithm to solve UAV path planning in disaster scenarios. Knowl-Based
Syst. 2020, 204, 106209. [CrossRef]
484. Li, Z.; Xia, X.; Yan, Y. A Novel Semidefinite Programming-based UAV 3D Localization Algorithm with Gray Wolf Optimization.
Drones 2023, 7, 113. [CrossRef]
485. Shen, Y.; Zhu, Y.; Kang, H.; Sun, X.; Chen, Q.; Wang, D. UAV Path Planning Based on Multi-Stage Constraint Optimization. Drones
2021, 5, 144. [CrossRef]
486. Lin, N.; Tang, J.; Li, X.; Zhao, L. A novel improved bat algorithm in UAV path planning. Comput. Mater. Contin. 2019, 61, 323–344.
[CrossRef]
487. Wang, Y.; Li, K.; Han, Y.; Ge, F.; Xu, W.; Liu, L. Tracking a dynamic invading target by UAV in oilfield inspection via an improved
bat algorithm. Appl. Soft Comput. 2020, 90, 106150. [CrossRef]
488. Kumar, P.; Narayan, S. Multi-objective bat algorithm tuned optimal FOPID controller for robust aircraft pitch control. Int. J. Syst.
Control Commun. 2017, 8, 348. [CrossRef]
489. Xie, J.; Zhou, Y.; Zheng, H. A Hybrid Metaheuristic for Multiple Runways Aircraft Landing Problem Based on Bat Algorithm. J.
Appl. Math. 2013, 2013, 742653. [CrossRef]
490. Li, X.; Zhou, D.; Yang, Z.; Huang, J.; Zhang, K.; Pan, Q. UAV route evaluation algorithm based on CSA-AHP and TOPSIS. In
Proceedings of the 2017 IEEE International Conference on Information and Automation (ICIA), Macau, China, 18–20 July 2017;
pp. 914–915. [CrossRef]
491. El Gmili, N.; Mjahed, M.; El Kari, A.; Ayad, H. Particle Swarm Optimization and Cuckoo Search-Based Approaches for Quadrotor Control and
Trajectory Tracking. Appl. Sci. 2019, 9, 1719. [CrossRef]
492. Hu, H.; Wu, Y.; Xu, J.; Sun, Q. Cuckoo search-based method for trajectory planning of quadrotor in an urban environment. Proc.
Inst. Mech. Eng. Part G J. Aerosp. Eng. 2019, 233, 4571–4582. [CrossRef]
493. Zhang, X.; Chen, J.; Xin, B.; Fang, H. Online Path Planning for UAV Using an Improved Differential Evolution Algorithm. IFAC
Proc. Vol. 2011, 44, 6349–6354. [CrossRef]
494. Nikolos, I.K.; Brintaki, A.N. Coordinated UAV Path Planning Using Differential Evolution. In Proceedings of the 2005 IEEE
International Symposium on Intelligent Control and Mediterranean Conference on Control and Automation, Limassol, Cyprus, 27–29
June 2005; pp. 549–556. [CrossRef]
495. Alihodzic, A. Fireworks Algorithm with New Feasibility-Rules in Solving UAV Path Planning. In Proceedings of the 2016 3rd
International Conference on Soft Computing & Machine Intelligence (ISCMI), Dubai, United Arab Emirates, 23–25 November
2016; pp. 53–57. [CrossRef]
496. Zhang, X.; Zhang, X. UAV Path Planning Based on Hybrid Differential Evolution with Fireworks Algorithm. In International
Conference on Sensing and Imaging: ICSI 2022: Advances in Swarm Intelligence; Springer: Berlin/Heidelberg, Germany, 2022;
pp. 354–364. [CrossRef]
497. Roberge, V.; Tarbouchi, M. Parallel Hybrid 2-Opt Flower Pollination Algorithm for Real-Time UAV Trajectory Planning on GPU.
ITM Web Conf. 2022, 48, 03007. [CrossRef]
498. Li, P.; Duan, H. Path planning of unmanned aerial vehicle based on improved gravitational search algorithm. Sci. China Technol.
Sci. 2012, 55, 2712–2719. [CrossRef]
499. Qu, C.; Gai, W.; Zhang, J.; Zhong, M. A novel hybrid grey wolf optimizer algorithm for unmanned aerial vehicle (UAV) path
planning. Knowl-Based Syst. 2020, 194, 105530. [CrossRef]
500. Luo, Y.; Lu, J.; Zhang, Y.; Zheng, K.; Qin, Q.; He, L.; Liu, Y. Near-Ground Delivery Drones Path Planning Design Based on
BOA-TSAR Algorithm. Drones 2022, 6, 393. [CrossRef]
501. Khoufi, I.; Laouiti, A.; Adjih, C. A Survey of Recent Extended Variants of the Traveling Salesman and Vehicle Routing Problems
for Unmanned Aerial Vehicles. Drones 2019, 3, 66. [CrossRef]
502. Chung, S.H.; Sah, B.; Lee, J. Optimization for drone and drone-truck combined operations: A review of the state of the art and
future directions. Comput. Oper. Res. 2020, 123, 105004. [CrossRef]
503. Otto, A.; Agatz, N.; Campbell, J.; Golden, B.; Pesch, E. Optimization approaches for civil applications of unmanned aerial vehicles
(UAVs) or aerial drones: A survey. Networks 2018, 72, 411–458. [CrossRef]
504. Weng, Y.-Y.; Wu, R.-Y.; Zheng, Y.-J. Cooperative Truck–Drone Delivery Path Optimization under Urban Traffic Restriction. Drones
2023, 7, 59. [CrossRef]
505. Ilango, H.S.; Ramanathan, R. A Performance Study of Bio-Inspired Algorithms in Autonomous Landing of Unmanned Aerial
Vehicle. Procedia Comput. Sci. 2020, 171, 1449–1458. [CrossRef]
506. Liang, S.; Song, B.; Xue, D. Landing route planning method for micro drones based on hybrid optimization algorithm. Biomim.
Intell. Robot. 2021, 1, 100003. [CrossRef]
507. Mahmud, A.A.A.; Satakshi; Jeberson, W. Aircraft Landing Scheduling Using Embedded Flower Pollination Algorithm. Int. J.
Parallel Program. 2020, 48, 771–785. [CrossRef]
508. Zhou, G.; Wang, R.; Zhou, Y. Flower pollination algorithm with runway balance strategy for the aircraft landing scheduling
problem. Cluster Comput. 2018, 21, 1543–1560. [CrossRef]
509. Teimoori, M.; Taghizadeh, H.; Pourmahmoud, J.; Honarmand Azimi, M. A multi-objective grey wolf optimization algorithm for
aircraft landing problem. J. Appl. Res. Ind. Eng. 2021, 8, 386–398. [CrossRef]
510. Abdullah, O.S.; Abdullah, S.; Sarim, H.M. Harmony search algorithm for the multiple runways aircraft landing scheduling
problem. J. Telecommun. Electron. Comput. Eng. 2017, 9, 59–65.
511. Abdul-Razaq, T.S.; Ali, F.H. Hybrid Bees Algorithm to Solve Aircraft Landing Problem. J. Zankoy Sulaimani—Part A 2014, 17,
71–90. [CrossRef]
512. Jia, X.; Cao, X.; Guo, Y.; Qiao, H.; Zhang, J. Scheduling Aircraft Landing Based on Clonal Selection Algorithm and Receding
Horizon Control. In Proceedings of the 2008 11th International IEEE Conference on Intelligent Transportation Systems, Beijing,
China, 12–15 October 2008; pp. 357–362. [CrossRef]
513. Chai, R.; Savvaris, A.; Tsourdos, A.; Chai, S.; Xia, Y. A review of optimization techniques in spacecraft flight trajectory design.
Prog. Aerosp. Sci. 2019, 109, 100543. [CrossRef]
514. Li, S.; Huang, X.; Yang, B. Review of optimization methodologies in global and China trajectory optimization competitions. Prog.
Aerosp. Sci. 2018, 102, 60–75. [CrossRef]
515. Shirazi, A.; Ceberio, J.; Lozano, J.A. Spacecraft trajectory optimization: A review of models, objectives, approaches and solutions.
Prog. Aerosp. Sci. 2018, 102, 76–98. [CrossRef]
516. Su, Z.; Wang, H. A novel robust hybrid gravitational search algorithm for reusable launch vehicle approach and landing trajectory
optimization. Neurocomputing 2015, 162, 116–127. [CrossRef]
517. Panteleev, A.V.; Kryuchkov, A.Y. Application of Modified Fireworks Algorithm for Multiobjective Optimization of Satellite Control
Law. In Advances in Theory and Practice of Computational Mechanics; Springer: Berlin/Heidelberg, Germany, 2020; pp. 333–349.
[CrossRef]
518. Xue, J.-J.; Wang, Y.; Li, H.; Xiao, J. Discrete Fireworks Algorithm for Aircraft Mission Planning. In International Conference on
Swarm Intelligence: ICSI 2016: Advances in Swarm Intelligence; Springer: Berlin/Heidelberg, Germany, 2016; pp. 544–551. [CrossRef]
519. Dastgerdi, K.; Mehrshad, N.; Farshad, M. A new intelligent approach for air traffic control using gravitational search algorithm.
Sadhana 2016, 41, 183–191. [CrossRef]
520. Cai, Z.; Lou, J.; Zhao, J.; Wu, K.; Liu, N.; Wang, Y.X. Quadrotor trajectory tracking and obstacle avoidance by chaotic grey wolf
optimization-based active disturbance rejection control. Mech. Syst. Signal Process. 2019, 128, 636–654. [CrossRef]
521. Xiao, L.; Xu, M.; Chen, Y.; Chen, Y. Hybrid Grey Wolf Optimization Nonlinear Model Predictive Control for Aircraft Engines
Based on an Elastic BP Neural Network. Appl. Sci. 2019, 9, 1254. [CrossRef]
522. Katal, N.; Kumar, P.; Narayan, S. Design of PIλDµ controller for robust flight control of a UAV using multi-objective bat algorithm.
In Proceedings of the 2015 2nd International Conference on Recent Advances in Engineering & Computational Sciences (RAECS),
Chandigarh, India, 21–22 December 2015; pp. 1–5. [CrossRef]
523. Lin, F.; Wang, X.; Qu, X. PID parameters tuning of UAV flight control system based on artificial bee colony algorithm. In 2015
2nd International Conference on Electrical, Computer Engineering and Electronics; Atlantis Press: Amsterdam, The Netherlands, 2015.
[CrossRef]
524. Bian, Q.; Nener, B.; Wang, X. A modified bacterial-foraging tuning algorithm for multimodal optimization of the flight control
system. Aerosp. Sci. Technol. 2019, 93, 105274. [CrossRef]
525. Oyekan, J.; Hu, H. A novel bacterial foraging algorithm for automated tuning of PID controllers of UAVs. In Proceedings of the
2010 IEEE International Conference on Information and Automation, Harbin, China, 20–23 June 2010; pp. 693–698. [CrossRef]
526. Bencharef, S.; Boubertakh, H. Optimal tuning of a PD control by bat algorithm to stabilize a quadrotor. In Proceedings of the 8th
International Conference on Modelling, Identification and Control (ICMIC), Algiers, Algeria, 15–17 November 2016. [CrossRef]
527. Zaeri, R.; Ghanbarzadeh, A.; Attaran, B.; Zaeri, Z. Fuzzy Logic Controller based pitch control of aircraft tuned with Bees
Algorithm. In Proceedings of the 2nd International Conference on Control, Instrumentation and Automation, Shiraz, Iran,
27–29 December 2011; pp. 705–710. [CrossRef]
528. Huang, Y.; Fei, Q. Clonal selection algorithm based optimization of the ADRC parameters designed to control UAV longitudinal
channel. In Proceedings of the 2015 IEEE International Conference on Control System, Computing and Engineering (ICCSCE),
Penang, Malaysia, 27–29 November 2015; pp. 448–452. [CrossRef]
529. Zatout, M.S.; Rezoug, A.; Rezoug, A.; Baizid, K.; Iqbal, J. Optimisation of fuzzy logic quadrotor attitude controller—Particle
swarm, cuckoo search and BAT algorithms. Int. J. Syst. Sci. 2022, 53, 883–908. [CrossRef]
530. Glida, H.E.; Abdou, L.; Chelihi, A.; Sentouh, C.; Hasseni, S.-E.-I. Optimal model-free backstepping control for a quadrotor
helicopter. Nonlinear Dyn. 2020, 100, 3449–3468. [CrossRef]
531. Pedro, J.O.; Dangor, M.; Kala, P.J. Differential evolution-based PID control of a quadrotor system for hovering application.
In Proceedings of the 2016 IEEE Congress on Evolutionary Computation (CEC), Vancouver, BC, Canada, 24–29 July 2016;
pp. 2791–2798. [CrossRef]
532. Wang, W.; Yuan, X.; Zhu, J. Automatic PID tuning via differential evolution for quadrotor UAVs trajectory tracking. In Proceedings
of the 2016 IEEE Symposium Series on Computational Intelligence (SSCI), Athens, Greece, 6–9 December 2016; pp. 1–8.
533. Keskin, B.; Keskin, K. Position Control of Quadrotor using Firefly Algorithm. El-Cezeri 2021, 9, 554–566. [CrossRef]
534. Kaba, A. Improved PID rate control of a quadrotor with a convexity-based surrogated model. Aircr. Eng. Aerosp. Technol. 2021, 93,
1287–1301. [CrossRef]
535. Ebrahimkhani, E.; Dehghani, H.; Asadollahi, M.; Ghiasi, A.R. Controlling a Micro Quadrotor Using Nonlinear Techniques Tuned
by Firefly Algorithm (FA). Int. Conf. New Res. Electr. Eng. Comput. Sci. 2015, 1–11. [CrossRef]
536. Prabaningtyas, S.; Mardlijah. LQGT Control Design Based on Firefly Algorithm Optimization for Trajectory Tracking on Quadcopter.
In Proceedings of the 2022 International Seminar on Intelligent Technology and Its Applications (ISITIA), Surabaya, Indonesia,
20–21 July 2022; pp. 261–266. [CrossRef]
537. Yin, X.; Wei, X.; Liu, L.; Wang, Y. Improved Hybrid Fireworks Algorithm-Based Parameter Optimization in High-Order Sliding
Mode Control of Hypersonic Vehicles. Complexity 2018, 2018, 9098151. [CrossRef]
538. Glida, H.-E.; Abdou, L.; Chelihi, A. Optimal Fuzzy Adaptive Backstepping Controller for Attitude Control of a Quadrotor
Helicopter. In Proceedings of the 2019 International Conference on Control, Automation and Diagnosis (ICCAD), Grenoble,
France, 2–4 July 2019; pp. 1–6. [CrossRef]
539. Basri, M.A.; Noordin, A. Optimal backstepping control of quadrotor UAV using gravitational search optimization algorithm. Bull.
Electr. Eng. Inform. 2020, 9, 1819–1826. [CrossRef]
540. Abbas, N.H.; Sami, A.R. Tuning of PID Controllers for Quadcopter System using Hybrid Memory based Gravitational Search
Algorithm-Particle Swarm Optimization. Int. J. Comput. Appl. 2017, 172, 975–8887.
541. Hartawan, W. Otomasi PID Tuning untuk Optimasi Kontrol Quadcopter Menggunakan Metode Harmony Search [PID Tuning
Automation for Quadcopter Control Optimization Using the Harmony Search Method]. J. Inov. Tek. Inform. 2021, 4, 21–28. Available
online: http://journal.universitaspahlawan.ac.id/index.php/jiti/article/view/2012 (accessed on 20 June 2023).
542. Altan, A. Performance of Metaheuristic Optimization Algorithms based on Swarm Intelligence in Attitude and Altitude Control
of Unmanned Aerial Vehicle for Path Following. In Proceedings of the 2020 4th International Symposium on Multidisciplinary
Studies and Innovative Technologies (ISMSIT), Istanbul, Turkey, 22–24 October 2020; pp. 1–6. [CrossRef]
543. Yuan, G.; Duan, H. Robust Control for UAV Close Formation Using LADRC via Sine-Powered Pigeon-Inspired Optimization.
Drones 2023, 7, 238. [CrossRef]
544. Jing, Y.; Wang, X.; Heredia-Juesas, J.; Fortner, C.; Giacomo, C.; Sipahi, R.; Martinez-Lorenzo, J. PX4 Simulation Results of a
Quadcopter with a Disturbance-Observer-Based and PSO-Optimized Sliding Mode Surface Controller. Drones 2022, 6, 261.
[CrossRef]
545. Shafieenejad, I.; Rouzi, E.D.; Sardari, J.; Araghi, M.S.; Esmaeili, A.; Zahedi, S. Fuzzy logic, neural-fuzzy network and honey bees
algorithm to develop the swarm motion of aerial robots. Evol. Syst. 2022, 13, 319–330. [CrossRef]
546. Zhang, B.; Sun, X.; Liu, S.; Deng, X. Adaptive Differential Evolution-based Receding Horizon Control Design for Multi-UAV
Formation Reconfiguration. Int. J. Control. Autom. Syst. 2019, 17, 3009–3020. [CrossRef]
547. Bian, L.; Sun, W.; Sun, T. Trajectory Following and Improved Differential Evolution Solution for Rapid Forming of UAV Formation.
IEEE Access 2019, 7, 169599–169613. [CrossRef]
548. Wang, Y.; Zhang, T.; Cai, Z.; Zhao, J.; Wu, K. Multi-UAV coordination control by chaotic grey wolf optimization based
distributed MPC with event-triggered strategy. Chin. J. Aeronaut. 2020, 33, 2877–2897. [CrossRef]
549. Ma, M.; Wu, J.; Shi, Y.; Yue, L.; Yang, C.; Chen, X. Chaotic Random Opposition-Based Learning and Cauchy Mutation Improved
Moth-Flame Optimization Algorithm for Intelligent Route Planning of Multiple UAVs. IEEE Access 2022, 10, 49385–49397.
[CrossRef]
550. Xiong, T.; Liu, F.; Liu, H.; Ge, J.; Li, H.; Ding, K.; Li, Q. Multi-Drone Optimal Mission Assignment and 3D Path Planning for
Disaster Rescue. Drones 2023, 7, 394. [CrossRef]
551. Qiu, H.; Duan, H. A multi-objective pigeon-inspired optimization approach to UAV distributed flocking among obstacles. Inf. Sci.
2020, 509, 515–529. [CrossRef]
552. Ali, Z.A.; Zhangang, H.; Zhengru, D. Path planning of multiple UAVs using MMACO and DE algorithm in dynamic environment.
Meas. Control 2023, 56, 459–469. [CrossRef]
553. Wu, J.; Yi, J.; Gao, L.; Li, X. Cooperative path planning of multiple UAVs based on PH curves and harmony search algorithm. In
Proceedings of the 2017 IEEE 21st International Conference on Computer Supported Cooperative Work in Design (CSCWD),
Wellington, New Zealand, 26–28 April 2017; pp. 540–544. [CrossRef]
554. Yu, J.; Guo, J.; Zhang, X.; Zhou, C.; Xie, T.; Han, X. A Novel Tent-Levy Fireworks Algorithm for the UAV Task Allocation Problem
Under Uncertain Environment. IEEE Access 2022, 10, 102373–102385. [CrossRef]
555. Zhang, Y.; Wang, X. Research on UAV Task Assignment Based on Fireworks Algorithm. Acad. J. Comput. Inf. Sci. 2022, 5, 103–107.
[CrossRef]
556. Cui, Y.; Dong, W.; Hu, D.; Liu, H. The Application of Improved Harmony Search Algorithm to Multi-UAV Task Assignment.
Electronics 2022, 11, 1171. [CrossRef]
557. Xiang, H.; Han, Y.; Pan, N.; Zhang, M.; Wang, Z. Study on Multi-UAV Cooperative Path Planning for Complex Patrol Tasks in
Large Cities. Drones 2023, 7, 367. [CrossRef]
558. Zarchi, M.; Attaran, B. Performance improvement of an active vibration absorber subsystem for an aircraft model using a bees
algorithm based on multi-objective intelligent optimization. Eng. Optim. 2017, 49, 1905–1921. [CrossRef]
559. Toloei, A.R.; Zarchi, M.; Attaran, B. Application of Active Suspension System to Reduce Aircraft Vibration using PID Technique
and Bees Algorithm. Int. J. Comput. Appl. 2014, 98, 17–24. [CrossRef]
560. Ding, L.; Wu, H.; Yao, Y. Chaotic Artificial Bee Colony Algorithm for System Identification of a Small-Scale Unmanned Helicopter.
Int. J. Aerosp. Eng. 2015, 2015, 801874. [CrossRef]
561. Ghosh Roy, A.; Peyada, N.K. Aircraft parameter estimation using Hybrid Neuro Fuzzy and Artificial Bee Colony optimization
(HNFABC) algorithm. Aerosp. Sci. Technol. 2017, 71, 772–782. [CrossRef]
562. Gotmare, A.; Bhattacharjee, S.S.; Patidar, R.; George, N.V. Swarm and evolutionary computing algorithms for system identification
and filter design: A comprehensive review. Swarm Evol. Comput. 2017, 32, 68–84. [CrossRef]
563. El Gmili, N.; Mjahed, M.; El Kari, A.; Ayad, H. Quadrotor Identification through the Cooperative Particle Swarm Optimization-
Cuckoo Search Approach. Comput. Intell. Neurosci. 2019, 2019, 8925165. [CrossRef] [PubMed]
564. Yang, J.; Cai, Z.; Lin, Q.; Zhang, D.; Wang, Y. System identification of quadrotor UAV based on genetic algorithm. In Proceedings
of the 2014 IEEE Chinese Guidance, Navigation and Control Conference, Yantai, China, 8–10 August 2014; pp. 2336–2340.
[CrossRef]
565. Wang, S.; Guo, H.; Li, W.; Dong, F.; Bu, L. Differential evolution parameter identification of multi-rotor unmanned aerial vehicle
(UAV) based on gradient prey acceleration strategy. Int. J. Simul. Syst. Sci. Technol. 2016, 17, 5.1–5.6. [CrossRef]
566. Tijani, I.B.; Akmeliawati, R.; Legowo, A.; Budiyono, A. Nonlinear identification of a small scale unmanned helicopter using
optimized NARX network with multiobjective differential evolution. Eng. Appl. Artif. Intell. 2014, 33, 99–115. [CrossRef]
567. Nonut, A.; Kanokmedhakul, Y.; Bureerat, S.; Kumar, S.; Tejani, G.G.; Artrit, P.; Yıldız, A.R.; Pholdee, N. A small fixed-wing UAV
system identification using metaheuristics. Cogent Eng. 2022, 9, 2114196. [CrossRef]
568. Li, J.; Duan, H. Boid-Inspired Harmony Search approach to aircraft parameter estimation. In Proceedings of the 11th World
Congress on Intelligent Control and Automation, Shenyang, China, 29 June–4 July 2014; pp. 3556–3561. [CrossRef]
569. Yang, J.; Wang, G.; Zhu, J. Frequency-domain identification of a small-scale unmanned helicopter with harmony search algorithm.
Int. J. Comput. Appl. Technol. 2014, 49, 141. [CrossRef]
570. Samarakoon, S.M.B.P.; Muthugala, M.A.V.J.; Elara, M.R. Metaheuristic based navigation of a reconfigurable robot through narrow
spaces with shape changing ability. Expert Syst. Appl. 2022, 201, 117060. [CrossRef]
571. Zhang, W.; Zhang, W. Efficient UAV Localization Based on Modified Particle Swarm Optimization. In Proceedings of the 2022
IEEE International Conference on Communications Workshops (ICC Workshops), Seoul, Republic of Korea, 16–20 May 2022;
pp. 1089–1094. [CrossRef]
572. Shanshan, G.; Zhong, Y.; Weina, C.; Yizhi, W. Artificial Bee Colony Particle Filtering Algorithm for Integrated Navigation. In
Advances in Guidance, Navigation and Control: Proceedings of the 2020 International Conference on Guidance, Navigation and Control,
ICGNC 2020, Tianjin, China, 23–25 October 2020; Springer: Berlin/Heidelberg, Germany, 2020; pp. 3797–3806. [CrossRef]
573. Duan, H. Biological Vision-Based Surveillance and Navigation. In Bio-Inspired Computation in Unmanned Aerial Vehicles; Springer:
Berlin/Heidelberg, Germany, 2014; pp. 215–246. [CrossRef]
574. Shrivastava, A. AGV Using Clonal Selection in Warehouse; Galgotias College of Engineering and Technology: Uttar Pradesh, India, 2021.
575. Banerjee, A.; Nilhani, A.; Dhabal, S.; Venkateswaran, P. A novel sound source localization method using a global-best guided
cuckoo search algorithm for drone-based search and rescue operations. In Unmanned Aerial Systems; Elsevier: Amsterdam,
The Netherlands, 2021; pp. 375–415. [CrossRef]
576. Alfeo, A.L.; Cimino, M.G.C.A.; De Francesco, N.; Lega, M.; Vaglini, G. Design and simulation of the emergent behavior of small
drones swarming for distributed target localization. J. Comput. Sci. 2018, 29, 19–33. [CrossRef]
577. Sun, Z.; Wu, J.; Yang, J.; Huang, Y.; Li, C.; Li, D. Path Planning for GEO-UAV Bistatic SAR Using Constrained Adaptive
Multiobjective Differential Evolution. IEEE Trans. Geosci. Remote Sens. 2016, 54, 6444–6457. [CrossRef]
578. Arafat, M.Y.; Moh, S. Bio-Inspired Approaches for Energy-Efficient Localization and Clustering in UAV Networks for Monitoring
Wildfires in Remote Areas. IEEE Access 2021, 9, 18649–18669. [CrossRef]
579. Radmanesh, M.; Kumar, M. Grey wolf optimization based sense and avoid algorithm for UAV path planning in uncertain
environment using a Bayesian framework. In Proceedings of the 2016 International Conference on Unmanned Aircraft Systems
(ICUAS), Arlington, VA, USA, 7–10 June 2016; pp. 68–76. [CrossRef]
580. Nenavath, H.; Ashwini, K.; Jatoth, R.K.; Mirjalili, S. Intelligent Trigonometric Particle Filter for visual tracking. ISA Trans. 2022,
128, 460–476. [CrossRef]
581. Hao, L.; Xiangyu, F.; Manhong, S. Research on the Cooperative Passive Location of Moving Targets Based on Improved Particle
Swarm Optimization. Drones 2023, 7, 264. [CrossRef]
582. Li, Z.; Deng, Y.; Liu, W. Identification of INS Sensor Errors from Navigation Data Based on Improved Pigeon-Inspired Optimiza-
tion. Drones 2022, 6, 287. [CrossRef]
583. Egi, Y.; Otero, C.E. Machine-Learning and 3D Point-Cloud Based Signal Power Path Loss Model for the Deployment of Wireless
Communication Systems. IEEE Access 2019, 7, 42507–42517. [CrossRef]
584. Bithas, P.S.; Michailidis, E.T.; Nomikos, N.; Vouyioukas, D.; Kanatas, A.G. A Survey on Machine-Learning Techniques for
UAV-Based Communications. Sensors 2019, 19, 5170. [CrossRef]
585. Khoufi, I.; Laouiti, A.; Adjih, C.; Hadded, M. UAVs Trajectory Optimization for Data Pick Up and Delivery with Time Window.
Drones 2021, 5, 27. [CrossRef]
586. Eledlebi, K.; Hildmann, H.; Ruta, D.; Isakovic, A.F. A Hybrid Voronoi Tessellation/Genetic Algorithm Approach for the
Deployment of Drone-Based Nodes of a Self-Organizing Wireless Sensor Network (WSN) in Unknown and GPS Denied
Environments. Drones 2020, 4, 33. [CrossRef]
587. Subburaj, B.; Jayachandran, U.M.; Arumugham, V.; Suthanthira Amalraj, M.J.A. A Self-Adaptive Trajectory Optimization
Algorithm Using Fuzzy Logic for Mobile Edge Computing System Assisted by Unmanned Aerial Vehicle. Drones 2023, 7, 266.
[CrossRef]
588. Anicho, O.; Charlesworth, P.B.; Baicher, G.S.; Nagar, A.; Buckley, N. Comparative study for coordinating multiple unmanned
HAPS for communications area coverage. In Proceedings of the 2019 International Conference on Unmanned Aircraft Systems,
ICUAS, Atlanta, GA, USA, 11–14 June 2019; pp. 467–474. [CrossRef]
589. Du, W.; Ying, W.; Yang, P.; Cao, X.; Yan, G.; Tang, K.; Wu, D. Network-Based Heterogeneous Particle Swarm Optimization and
Its Application in UAV Communication Coverage. IEEE Trans. Emerg. Top. Comput. Intell. 2020, 4, 312–323. [CrossRef]
590. Torky, M.; El-Dosuky, M.; Goda, E.; Snášel, V.; Hassanien, A.E. Scheduling and Securing Drone Charging System Using Particle
Swarm Optimization and Blockchain Technology. Drones 2022, 6, 237. [CrossRef]
591. Trotta, A.; Andreagiovanni, F.D.; Di Felice, M.; Natalizio, E.; Chowdhury, K.R. When UAVs Ride A Bus: Towards Energy-efficient
City-scale Video Surveillance. In Proceedings of the IEEE INFOCOM 2018—IEEE Conference on Computer Communications,
Honolulu, HI, USA, 16–19 April 2018; Volume 2018-April, pp. 1043–1051. [CrossRef]
592. Li, L.; Xu, Y.; Zhang, Z.; Yin, J.; Chen, W.; Han, Z. A prediction-based charging policy and interference mitigation approach in the
wireless powered internet of things. IEEE J. Sel. Areas Commun. 2018, 37, 439–451. [CrossRef]
593. Xie, J.; Fu, Q.; Jia, R.; Lin, F.; Li, M.; Zheng, Z. Optimal Energy and Delay Tradeoff in UAV-Enabled Wireless Sensor Networks.
Drones 2023, 7, 368. [CrossRef]
594. Zhang, X.; Xiang, X.; Lu, S.; Zhou, Y.; Sun, S. Evolutionary Optimization of Drone-Swarm Deployment for Wireless Coverage.
Drones 2022, 7, 8. [CrossRef]
595. Mukherjee, A.; Fakoorian, S.A.A.; Huang, J.; Swindlehurst, A.L. Principles of physical layer security in multiuser wireless
networks: A survey. IEEE Commun. Surv. Tutorials 2014, 16, 1550–1573. [CrossRef]
596. Li, B.; Fei, Z.; Zhang, Y.; Guizani, M. Secure UAV communication networks over 5G. IEEE Wirel. Commun. 2019, 26, 114–120.
[CrossRef]
597. Bassily, R.; Ekrem, E.; He, X.; Tekin, E.; Xie, J.; Bloch, M.R.; Ulukus, S.; Yener, A. Cooperative security at the physical layer: A
summary of recent advances. IEEE Signal Process. Mag. 2013, 30, 16–28. [CrossRef]
598. Beegum, T.R.; Idris, M.Y.I.; Bin Ayub, M.N.; Shehadeh, H.A. Optimized Routing of UAVs Using Bio-Inspired Algorithm in FANET:
A Systematic Review. IEEE Access 2023, 11, 15588–15622. [CrossRef]
599. Abubakar, A.I.; Ahmad, I.; Omeke, K.G.; Ozturk, M.; Ozturk, C.; Abdel-Salam, A.M.; Mollel, M.S.; Abbasi, Q.H.; Hussain, S.;
Imran, M.A. A Survey on Energy Optimization Techniques in UAV-Based Cellular Networks: From Conventional to Machine
Learning Approaches. Drones 2023, 7, 214. [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.