
Memetic algorithm

A memetic algorithm (MA), in computer science and operations research, is an extension of the traditional
genetic algorithm (GA) or, more generally, of an evolutionary algorithm (EA). It aims to provide a sufficiently
good solution to an optimization problem by using a suitable heuristic or local search technique to improve the
quality of solutions generated by the EA and to reduce the likelihood of premature convergence.[1]

Memetic algorithms represent one of the recent growing areas of research in evolutionary computation. The
term MA is now widely used as a synergy of evolutionary or any population-based approach with separate
individual learning or local improvement procedures for problem search. Quite often, MAs are also referred
to in the literature as Baldwinian evolutionary algorithms (EAs), Lamarckian EAs, cultural algorithms, or
genetic local search.

Introduction
Inspired by both Darwinian principles of natural evolution and Dawkins' notion of a meme, the term
memetic algorithm (MA) was introduced by Pablo Moscato in his technical report[2] in 1989 where he
viewed MA as being close to a form of population-based hybrid genetic algorithm (GA) coupled with an
individual learning procedure capable of performing local refinements. The metaphorical parallels, on the
one hand to Darwinian evolution and, on the other hand, between memes and domain-specific (local
search) heuristics, are captured within memetic algorithms, yielding a methodology that balances well
between generality and problem specificity. This two-stage nature makes them a special case of dual-phase
evolution.

In the context of complex optimization, many different instantiations of memetic algorithms have been
reported across a wide range of application domains, in general, converging to high-quality solutions more
efficiently than their conventional evolutionary counterparts.[3]

In general, using the ideas of memetics within a computational framework is called memetic computing or
memetic computation (MC).[4][5] With MC, the traits of universal Darwinism are more appropriately
captured. Viewed from this perspective, MA is a more constrained notion of MC. More specifically, MA
covers one area of MC, in particular the class of evolutionary algorithms that are married with
deterministic refinement techniques for solving optimization problems. MC extends the notion of memes to
cover conceptual entities of knowledge-enhanced procedures or representations.

Theoretical Background
The no-free-lunch theorems of optimization and search[6][7] state that all optimization strategies are equally
effective with respect to the set of all optimization problems. Conversely, this means that one can expect the
following: The more efficiently an algorithm solves a problem or class of problems, the less general it is and
the more problem-specific knowledge it builds on. This insight leads directly to the recommendation to
complement generally applicable metaheuristics with application-specific methods or heuristics,[8] which
fits well with the concept of MAs.

The development of MAs

1st generation

Pablo Moscato characterized an MA as follows: "Memetic algorithms are a marriage between a
population-based global search and the heuristic local search made by each of the individuals. ... The
mechanisms to do local search can be to reach a local optimum or to improve (regarding the objective cost
function) up to a predetermined level." And he emphasized: "I am not constraining an MA to a genetic
representation."[9] Although this original definition of an MA encompasses characteristics of cultural
evolution (in the form of local refinement) in the search cycle, it may not qualify as a true evolving system
according to universal Darwinism, since the core principles of inheritance/memetic transmission,
variation, and selection are missing. This suggests why the term MA stirred up criticism and controversy
among researchers when first introduced.[2] The following pseudo code corresponds to this general
definition of an MA:

Pseudo code

Procedure Memetic Algorithm
    Initialize: Generate an initial population, evaluate the individuals and assign a quality value to them;
    while Stopping conditions are not satisfied do
        Evolve a new population using stochastic search operators.
        Evaluate all individuals in the population and assign a quality value to them.
        Select the subset of individuals, Ω_il, that should undergo the individual improvement procedure.
        for each individual in Ω_il do
            Perform individual learning using meme(s) with frequency or probability of f_il, with an intensity of t_il.
            Proceed with Lamarckian or Baldwinian learning.
        end for
    end while

Lamarckian learning in this context means updating the chromosome according to the improved solution
found by the individual learning step, while Baldwinian learning leaves the chromosome unchanged and
uses only the improved fitness. This pseudo code leaves open which steps are based on the fitness of the
individuals and which are not. In question are the evolving of the new population and the selection of Ω_il.
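The loop above can be sketched in Python. This is a minimal illustration rather than a reference implementation: the objective (`sphere`), the hill-climbing meme (`learn`), and all parameter values are assumptions chosen for brevity, and the `lamarckian` flag switches between the two learning variants just described.

```python
import random

def sphere(x):
    """Toy objective to minimize (an assumption for illustration)."""
    return sum(v * v for v in x)

def learn(x, f, steps=20, sigma=0.05):
    """Individual learning meme: stochastic hill climbing.
    `steps` plays the role of the learning intensity."""
    best, best_f = list(x), f(x)
    for _ in range(steps):
        cand = [v + random.gauss(0.0, sigma) for v in best]
        cf = f(cand)
        if cf < best_f:
            best, best_f = cand, cf
    return best, best_f

def memetic_algorithm(f, dim=5, pop_size=20, gens=50,
                      p_learn=0.5, lamarckian=True):
    """First-generation MA skeleton: EA loop plus individual learning.
    `p_learn` is the learning frequency; `lamarckian` selects between
    Lamarckian (genotype updated) and Baldwinian (fitness only) learning."""
    pop = [[random.uniform(-1.0, 1.0) for _ in range(dim)]
           for _ in range(pop_size)]
    fit = [f(p) for p in pop]
    for _ in range(gens):
        # Evolve a new population: binary tournament + Gaussian mutation
        offspring = []
        for _ in range(pop_size):
            a, b = random.sample(range(pop_size), 2)
            parent = pop[a] if fit[a] < fit[b] else pop[b]
            offspring.append([v + random.gauss(0.0, 0.1) for v in parent])
        off_fit = [f(c) for c in offspring]
        # Individual improvement for a randomly selected subset
        for i in range(pop_size):
            if random.random() < p_learn:
                improved, imp_f = learn(offspring[i], f)
                if lamarckian:
                    offspring[i] = improved  # write improvement into genotype
                off_fit[i] = imp_f           # improved fitness used either way
        # New generation: elitist selection from parents and offspring
        ranked = sorted(zip(pop + offspring, fit + off_fit),
                        key=lambda pair: pair[1])
        pop = [p for p, _ in ranked[:pop_size]]
        fit = [v for _, v in ranked[:pop_size]]
    return min(fit)
```

Here the learning frequency corresponds to `p_learn` and the learning intensity to the `steps` budget of the meme, foreshadowing the design questions discussed below.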

Since most MA implementations are based on EAs, the pseudo code of a corresponding representative of
the first generation is also given here, following Krasnogor:[10]

Pseudo code

Procedure Memetic Algorithm Based on an EA
    Initialization: t := 0;
    Randomly generate an initial population P(t);
    while Stopping conditions are not satisfied do
        Evaluation: Compute the fitness f(p) for all p in P(t);
        Selection: According to f(p), choose a subset of P(t) and store it in P'(t);
        Offspring: Recombine and mutate the individuals of P'(t) and store them in P''(t);
        Learning: Improve P''(t) by local search or a heuristic;
        Evaluation: Compute the fitness of the improved individuals;
        if Lamarckian learning then
            Update the chromosomes in P''(t) according to the improvements;
        fi
        New generation: Generate P(t + 1) by selecting some individuals from P(t) and P''(t);
        t := t + 1;
    end while
    Return best individual as result;

There are some alternatives for this MA scheme. For example:

All or some of the initial individuals may be improved by the meme(s).
The parents may be locally improved instead of the offspring.
Instead of all offspring, only a randomly selected or fitness-dependent fraction may undergo
local improvement.

2nd generation

Multi-meme,[11] hyper-heuristic[12][13] and meta-Lamarckian MA[14][15] are referred to as second-
generation MAs, exhibiting the principles of memetic transmission and selection in their design. In multi-
meme MA, the memetic material is encoded as part of the genotype. The decoded meme of each respective
individual/chromosome is then used to perform a local refinement, and the memetic material is transmitted
through a simple inheritance mechanism from parent to offspring. In hyper-heuristic and meta-Lamarckian
MA, on the other hand, the candidate memes in a pool compete, based on their past merits in generating
local improvements, through a reward mechanism that decides which meme is selected for future local
refinements. Memes with a higher reward have a greater chance of continuing to be used. For a review of
second-generation MAs, i.e., MAs considering multiple individual learning methods within an evolutionary
system, the reader is referred to [16].
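Such a reward-based competition among memes can be sketched as follows. This is a deliberately simple epsilon-greedy credit-assignment scheme chosen for illustration; the cited hyper-heuristic and meta-Lamarckian works each define their own reward and selection mechanisms.

```python
import random

def select_meme(rewards, counts, eps=0.2):
    """Reward-based meme selection: with probability eps explore a random
    meme, otherwise exploit the meme with the best average past reward."""
    if random.random() < eps or not any(counts):
        return random.randrange(len(rewards))
    return max(range(len(rewards)),
               key=lambda m: rewards[m] / counts[m] if counts[m] else 0.0)

def update_reward(rewards, counts, meme, improvement):
    """Credit a meme with the (non-negative) improvement it produced."""
    rewards[meme] += max(improvement, 0.0)
    counts[meme] += 1
```

In an MA run, `improvement` would be the fitness gain achieved by the local refinement the selected meme performed, so memes that keep producing improvements are selected more often.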

3rd generation

Co-evolution[17] and self-generating MAs[18] may be regarded as 3rd-generation MAs, in which all three
principles that define a basic evolving system are considered. In contrast to 2nd-generation MAs, which
assume that the memes to be used are known a priori, a 3rd-generation MA utilizes a
rule-based local search to supplement candidate solutions within the evolutionary system, thus capturing
regularly repeated features or patterns in the problem space.

Some design notes


The learning method/meme used has a significant impact on the improvement results, so care must be taken
in deciding which meme or memes to use for a particular optimization problem.[12][16][19] The frequency
and intensity of individual learning directly define the degree of evolution (exploration) against individual
learning (exploitation) in the MA search, for a given fixed, limited computational budget. Clearly, more
intense individual learning provides a greater chance of convergence to a local optimum, but it limits the
amount of evolution that can take place without incurring excessive computational resources. Therefore, care
should be taken when setting these two parameters to balance the available computational budget and
achieve maximum search performance. When only a portion of the population undergoes learning, the
issue of which subset of individuals to improve needs to be considered in order to maximize the utility
of the MA search. Last but not least, it has to be decided whether the respective individual should be changed
by the learning success (Lamarckian learning) or not (Baldwinian learning). Thus, the following five design
questions[15][19][20] must be answered. The first of them is addressed by all of the above 2nd-generation
representatives during an MA run, while the extended form of meta-Lamarckian learning of [15] expands
this to the first four design decisions.

Selection of an individual learning method or meme to be used for a particular problem or individual

In the context of continuous optimization, individual learning exists in the form of local heuristics or
conventional exact enumerative methods.[21] Examples of individual learning strategies include hill
climbing, the simplex method, Newton/quasi-Newton methods, interior point methods, the conjugate
gradient method, line search, and other local heuristics. Note that most of the common individual learning
methods are deterministic.
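As a concrete example of such a deterministic meme, the following sketch implements a simple pattern (compass) search for refining a continuous solution; the function name and parameter defaults are illustrative, and any of the methods listed above could serve the same role.

```python
def compass_search(f, x, step=0.25, shrink=0.5, budget=200):
    """Deterministic local refinement by compass (pattern) search:
    probe each coordinate in both directions, move on improvement,
    halve the step on a full failed sweep. `budget` caps the number
    of objective evaluations, i.e. the learning intensity."""
    x = list(x)
    fx = f(x)
    evals = 0
    while evals < budget and step > 1e-6:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                cand = list(x)
                cand[i] += d
                fc = f(cand)
                evals += 1
                if fc < fx:
                    x, fx, improved = cand, fc, True
                    break  # accept the move, go to the next coordinate
        if not improved:
            step *= shrink  # no move found: refine the step size
    return x, fx
```

Because the probe order and step schedule are fixed, repeated calls from the same point always yield the same refined solution, which is the deterministic behavior noted above.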

In combinatorial optimization, on the other hand, individual learning methods commonly exist in the form
of heuristics (which can be deterministic or stochastic) that are tailored to a specific problem of interest.
Typical heuristic procedures and schemes include the k-gene exchange, edge exchange, first-improvement,
and many others.
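For instance, an edge-exchange heuristic with a first-improvement acceptance rule can be sketched for the travelling salesman problem as a plain 2-opt local search; `dist` is assumed to be a symmetric distance matrix, and the helper names are illustrative.

```python
import math

def tour_length(tour, dist):
    """Total length of a closed tour under distance matrix `dist`."""
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

def two_opt_first_improvement(tour, dist):
    """First-improvement edge exchange (2-opt): accept the first improving
    move found, restart the scan, and stop at a local optimum."""
    tour = list(tour)
    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            # skip j = n-1 when i = 0: that would reverse the whole tour
            for j in range(i + 2, n if i > 0 else n - 1):
                # reversing tour[i+1..j] swaps edges (a,b),(c,d) for (a,c),(b,d)
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                delta = dist[a][c] + dist[b][d] - dist[a][b] - dist[c][d]
                if delta < -1e-12:
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
                    break
            if improved:
                break
    return tour
```

A best-improvement variant would instead scan all moves and apply the one with the most negative `delta`; first-improvement trades slightly worse moves for a cheaper scan.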

Determination of the individual learning frequency

One of the first issues pertinent to memetic algorithm design is to consider how often the individual learning
should be applied; i.e., individual learning frequency. In one case,[19] the effect of individual learning
frequency on MA search performance was considered where various configurations of the individual
learning frequency at different stages of the MA search were investigated. Conversely, it was shown
elsewhere[22] that it may be worthwhile to apply individual learning on every individual if the
computational complexity of the individual learning is relatively low.

Selection of the individuals to which individual learning is applied

On the issue of selecting appropriate individuals among the EA population that should undergo individual
learning, fitness-based and distribution-based strategies were studied for adapting the probability of
applying individual learning to the population of chromosomes in continuous parametric search problems,
with Land[23] extending the work to combinatorial optimization problems. Bambha et al. introduced a
simulated heating technique for systematically integrating parameterized individual learning into
evolutionary algorithms to achieve maximum solution quality.[24]

Specification of the intensity of individual learning

Individual learning intensity, t_il, is the amount of computational budget allocated to an iteration of
individual learning; i.e., the maximum computational budget allowable for individual learning to expend on
improving a single solution.

Choice of Lamarckian or Baldwinian learning

It must be decided whether a found improvement acts only through the better fitness (Baldwinian learning)
or whether the individual itself is also adapted accordingly (Lamarckian learning). In the case of an EA, the
latter would mean an adjustment of the genotype. This question was already discussed controversially for
EAs in the 1990s, with the conclusion that the specific use case plays a major role.[25][26][27] The
background of the debate is that genome adaptation may promote premature convergence. This risk can be
effectively mitigated by other measures that better balance breadth and depth searches, such as the use of
structured populations.[28]

Applications
Memetic algorithms have been successfully applied to a multitude of real-world problems. Although many
people employ techniques closely related to memetic algorithms, alternative names such as hybrid genetic
algorithms are also employed.

Researchers have used memetic algorithms to tackle many classical NP-hard problems, for example:
graph partitioning, the multidimensional knapsack problem, the travelling salesman problem, the quadratic
assignment problem, the set cover problem, minimal graph coloring, the maximum independent set problem,
the bin packing problem, and the generalized assignment problem.

More recent applications include (but are not limited to) business analytics and data science,[3] training of
artificial neural networks,[29] pattern recognition,[30] robotic motion planning,[31] beam orientation,[32]
circuit design,[33] electric service restoration,[34] medical expert systems,[35] single machine scheduling,[36]
automatic timetabling (notably, the timetable for the NHL),[37] manpower scheduling,[38] nurse rostering
optimisation,[39] processor allocation,[40] maintenance scheduling (for example, of an electric distribution
network),[41] scheduling of multiple workflows to constrained heterogeneous resources,[42]
multidimensional knapsack problem,[43] VLSI design,[44] clustering of gene expression profiles,[45]
feature/gene selection,[46][47] parameter determination for hardware fault injection,[48] and multi-class,
multi-objective feature selection.[49][50]

Recent activities in memetic algorithms


IEEE Workshop on Memetic Algorithms (WOMA 2009). Program Chairs: Jim Smith,
University of the West of England, U.K.; Yew-Soon Ong, Nanyang Technological University,
Singapore; Steven Gustafson, University of Nottingham, U.K.; Meng Hiot Lim, Nanyang
Technological University, Singapore; Natalio Krasnogor, University of Nottingham, U.K.
Memetic Computing Journal (https://www.springer.com/journal/12293), first issue appeared
in January 2009.
2008 IEEE World Congress on Computational Intelligence (WCCI 2008) (http://www.wcci20
08.org/), Hong Kong, Special Session on Memetic Algorithms (http://users.jyu.fi/~neferran/M
A2008/MA2008.htm).
Special Issue on 'Emerging Trends in Soft Computing - Memetic Algorithm' (http://www.ntu.e
du.sg/home/asysong/SC/Special-Issue-MA.htm), Soft Computing Journal, Completed & In
Press, 2008.
IEEE Computational Intelligence Society Emergent Technologies Task Force on Memetic
Computing (http://www.ntu.edu.sg/home/asysong/ETTC/ETTC%20Task%20Force%20-%20
Memetic%20Computing.htm)
IEEE Congress on Evolutionary Computation (CEC 2007) (https://web.archive.org/web/2010
0306001555/http://cec2007.nus.edu.sg/), Singapore, Special Session on Memetic
Algorithms (https://web.archive.org/web/20080216234225/http://ntu-cg.ntu.edu.sg/ysong/MA
-SS/MA.htm).
'Memetic Computing' (http://www.esi-topics.com/erf/2007/august07-Ong_Keane.html) by
Thomson Scientific's Essential Science Indicators as an Emerging Front Research Area.
Special Issue on Memetic Algorithms (https://ieeexplore.ieee.org/document/4067075), IEEE
Transactions on Systems, Man, and Cybernetics - Part B: Cybernetics, Vol. 37, No. 1,
February 2007.
Recent Advances in Memetic Algorithms (http://www.springeronline.com/sgw/cda/frontpage/
0,11855,5-40356-72-34233226-0,00.html), Series: Studies in Fuzziness and Soft
Computing, Vol. 166, ISBN 978-3-540-22904-9, 2005.
Special Issue on Memetic Algorithms (http://www.mitpressjournals.org/doi/abs/10.1162/1063
656041775009?prevSearch=allfield%3A%28memetic+algorithm%29), Evolutionary
Computation Fall 2004, Vol. 12, No. 3: v-vi.

References
1. Poonam Garg (April 2009). "A Comparison between Memetic algorithm and Genetic
algorithm for the cryptanalysis of Simplified Data Encryption Standard algorithm".
International Journal of Network Security & Its Applications (IJNSA). 1 (1). arXiv:1004.0574
(https://arxiv.org/abs/1004.0574). Bibcode:2010arXiv1004.0574G (https://ui.adsabs.harvard.
edu/abs/2010arXiv1004.0574G).
2. Moscato, Pablo (1989), On Evolution, Search, Optimization, Genetic Algorithms and Martial
Arts: Towards Memetic Algorithms (https://www.researchgate.net/publication/2354457),
Caltech Concurrent Computation Program, Technical Report 826, Pasadena, CA: California
Institute of Technology
3. Moscato, P.; Mathieson, L. (2019). "Memetic Algorithms for Business Analytics and Data
Science: A Brief Survey". Business and Consumer Analytics: New Ideas. Springer. pp. 545–
608. doi:10.1007/978-3-030-06222-4_13 (https://doi.org/10.1007%2F978-3-030-06222-4_1
3). ISBN 978-3-030-06221-7. S2CID 173187844 (https://api.semanticscholar.org/CorpusID:1
73187844).
4. Chen, X. S.; Ong, Y. S.; Lim, M. H.; Tan, K. C. (2011). "A Multi-Facet Survey on Memetic
Computation". IEEE Transactions on Evolutionary Computation. 15 (5): 591–607.
doi:10.1109/tevc.2011.2132725 (https://doi.org/10.1109%2Ftevc.2011.2132725).
S2CID 17006589 (https://api.semanticscholar.org/CorpusID:17006589).
5. Chen, X. S.; Ong, Y. S.; Lim, M. H. (2010). "Research Frontier: Memetic Computation - Past,
Present & Future". IEEE Computational Intelligence Magazine. 5 (2): 24–36.
doi:10.1109/mci.2010.936309 (https://doi.org/10.1109%2Fmci.2010.936309).
hdl:10356/148175 (https://hdl.handle.net/10356%2F148175). S2CID 17955514 (https://api.s
emanticscholar.org/CorpusID:17955514).
6. Wolpert, D.H.; Macready, W.G. (April 1997). "No free lunch theorems for optimization" (http
s://ieeexplore.ieee.org/document/585893). IEEE Transactions on Evolutionary Computation.
1 (1): 67–82. doi:10.1109/4235.585893 (https://doi.org/10.1109%2F4235.585893).
S2CID 5553697 (https://api.semanticscholar.org/CorpusID:5553697).
7. Wolpert, D. H.; Macready, W. G. (1995). "No Free Lunch Theorems for Search" (https://pdfs.s
emanticscholar.org/8bdf/dc2c2777b395c086810c03a8cdeccc55c4db.pdf) (PDF). Technical
Report SFI-TR-95-02-010. Santa Fe Institute. S2CID 12890367 (https://api.semanticscholar.
org/CorpusID:12890367).
8. Davis, Lawrence (1991). Handbook of Genetic Algorithms. New York: Van Nostrand
Reinhold. ISBN 0-442-00173-8. OCLC 23081440 (https://www.worldcat.org/oclc/23081440).
9. Moscato, Pablo (1989), On Evolution, Search, Optimization, Genetic Algorithms and Martial
Arts: Towards Memetic Algorithms (https://www.researchgate.net/publication/2354457),
Caltech Concurrent Computation Program, Technical Report 826, Pasadena, CA: California
Institute of Technology, pp. 19–20
10. Krasnogor, Natalio (2002). Studies on the Theory and Design Space of Memetic Algorithms
(https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.249135) (PhD). Bristol, UK: University of
the West of England. p. 23.
11. Krasnogor, Natalio (1999). "Coevolution of genes and memes in memetic algorithms" (http
s://www.researchgate.net/publication/2468537). Graduate Student Workshop: 371.
12. Kendall G. and Soubeiga E. and Cowling P. Choice function and random hyperheuristics (htt
p://www.cs.nott.ac.uk/~pszgxk/aim/2008/coursework/paper4.pdf) (PDF). 4th Asia-Pacific
Conference on Simulated Evolution and Learning. SEAL 2002. pp. 667–671.
13. Burke E. K.; Gendreau M.; Hyde M.; Kendall G.; Ochoa G.; Özcan E.; Qu R. (2013).
"Hyper-heuristics: A Survey of the State of the Art". Journal of the Operational Research
Society. 64 (12): 1695–1724. CiteSeerX 10.1.1.384.9743 (https://citeseerx.ist.psu.edu/viewd
oc/summary?doi=10.1.1.384.9743). doi:10.1057/jors.2013.71 (https://doi.org/10.1057%2Fjor
s.2013.71). S2CID 3053192 (https://api.semanticscholar.org/CorpusID:3053192).
14. Y. S. Ong & A. J. Keane (2004). "Meta-Lamarckian learning in memetic algorithms" (https://e
prints.soton.ac.uk/22794/1/ong_04.pdf) (PDF). IEEE Transactions on Evolutionary
Computation. 8 (2): 99–110. doi:10.1109/TEVC.2003.819944 (https://doi.org/10.1109%2FTE
VC.2003.819944). S2CID 11003004 (https://api.semanticscholar.org/CorpusID:11003004).
15. Jakob, Wilfried (September 2010). "A general cost-benefit-based adaptation framework for
multimeme algorithms" (http://link.springer.com/10.1007/s12293-010-0040-9). Memetic
Computing. 2 (3): 201–218. doi:10.1007/s12293-010-0040-9 (https://doi.org/10.1007%2Fs12
293-010-0040-9). ISSN 1865-9284 (https://www.worldcat.org/issn/1865-9284).
S2CID 167807 (https://api.semanticscholar.org/CorpusID:167807).
16. Ong Y. S. and Lim M. H. and Zhu N. and Wong K. W. (2006). "Classification of Adaptive
Memetic Algorithms: A Comparative Study" (http://researchrepository.murdoch.edu.au/982/1/
Published_Version.pdf) (PDF). IEEE Transactions on Systems, Man, and Cybernetics - Part
B: Cybernetics. 36 (1): 141–152. doi:10.1109/TSMCB.2005.856143 (https://doi.org/10.110
9%2FTSMCB.2005.856143). hdl:10220/4653 (https://hdl.handle.net/10220%2F4653).
PMID 16468573 (https://pubmed.ncbi.nlm.nih.gov/16468573). S2CID 818688 (https://api.se
manticscholar.org/CorpusID:818688).
17. Smith J. E. (2007). "Coevolving Memetic Algorithms: A Review and Progress Report" (http://
eprints.uwe.ac.uk/18169/1/18169.pdf) (PDF). IEEE Transactions on Systems, Man, and
Cybernetics - Part B: Cybernetics. 37 (1): 6–17. doi:10.1109/TSMCB.2006.883273 (https://d
oi.org/10.1109%2FTSMCB.2006.883273). PMID 17278554 (https://pubmed.ncbi.nlm.nih.go
v/17278554). S2CID 13867280 (https://api.semanticscholar.org/CorpusID:13867280).
18. Krasnogor N. & Gustafson S. (2002). "Toward truly "memetic" memetic algorithms:
discussion and proof of concepts". Advances in Nature-Inspired Computation: The PPSN VII
Workshops. PEDAL (Parallel Emergent and Distributed Architectures Lab). University of
Reading.
19. Hart, William E. (December 1994). Adaptive Global Optimization with Local Search (https://d
l.acm.org/doi/10.5555/221524) (PhD). San Diego, CA: University of California.
CiteSeerX 10.1.1.473.1370 (https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.473.
1370).
20. Hart, William E.; Krasnogor, Natalio; Smith, Jim E. (September 2004). "Editorial Introduction
Special Issue on Memetic Algorithms" (https://direct.mit.edu/evco/article/12/3/v-vi/1185).
Evolutionary Computation. 12 (3): v–vi. doi:10.1162/1063656041775009 (https://doi.org/10.1
162%2F1063656041775009). ISSN 1063-6560 (https://www.worldcat.org/issn/1063-6560).
S2CID 9912363 (https://api.semanticscholar.org/CorpusID:9912363).
21. Schwefel, Hans-Paul (1995). Evolution and optimum seeking. New York: Wiley. ISBN 0-471-
57148-2.
22. Ku, K. W. C.; Mak, M. W.; Siu., W. C (2000). "A study of the Lamarckian evolution of recurrent
neural networks". IEEE Transactions on Evolutionary Computation. 4 (1): 31–42.
doi:10.1109/4235.843493 (https://doi.org/10.1109%2F4235.843493). hdl:10397/289 (https://
hdl.handle.net/10397%2F289).
23. Land, M. W. S. (1998). Evolutionary Algorithms with Local Search for Combinatorial
Optimization (Thesis). San Diego, CA: University of California. CiteSeerX 10.1.1.55.8986 (ht
tps://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.55.8986). ISBN 978-0-599-12661-
9.
24. Bambha N. K. and Bhattacharyya S. S. and Teich J. and Zitzler E. (2004). "Systematic
integration of parameterized local search into evolutionary algorithms". IEEE Transactions
on Evolutionary Computation. 8 (2): 137–155. doi:10.1109/TEVC.2004.823471 (https://doi.or
g/10.1109%2FTEVC.2004.823471). S2CID 8303351 (https://api.semanticscholar.org/Corpu
sID:8303351).
25. Gruau, Frédéric; Whitley, Darrell (September 1993). "Adding Learning to the Cellular
Development of Neural Networks: Evolution and the Baldwin Effect" (https://direct.mit.edu/ev
co/article/1/3/213-233/1109). Evolutionary Computation. 1 (3): 213–233.
doi:10.1162/evco.1993.1.3.213 (https://doi.org/10.1162%2Fevco.1993.1.3.213). ISSN 1063-
6560 (https://www.worldcat.org/issn/1063-6560). S2CID 15048360 (https://api.semanticschol
ar.org/CorpusID:15048360).
26. Orvosh, David; Davis, Lawrence (1993), Forrest, Stephanie (ed.), "Shall We Repair?
Genetic Algorithms, Combinatorial Optimization, and Feasibility Constraints" (https://www.se
manticscholar.org/paper/Shall-We-Repair-Genetic-AlgorithmsCombinatorial-Orvosh-Davis/c
1a930dea91a59ccec2cf20f7f81a953bd53f46a), Conf. Proc. of the 5th Int. Conf. on Genetic
Algorithms (ICGA), San Mateo, CA, USA: Morgan Kaufmann, p. 650, ISBN 978-1-55860-
299-1, S2CID 10098180 (https://api.semanticscholar.org/CorpusID:10098180)
27. Whitley, Darrell; Gordon, V. Scott; Mathias, Keith (1994), Davidor, Yuval; Schwefel, Hans-
Paul; Männer, Reinhard (eds.), "Lamarckian evolution, the Baldwin effect and function
optimization" (http://link.springer.com/10.1007/3-540-58484-6_245), Parallel Problem
Solving from Nature — PPSN III, Berlin, Heidelberg: Springer Berlin Heidelberg, vol. 866,
pp. 5–15, doi:10.1007/3-540-58484-6_245 (https://doi.org/10.1007%2F3-540-58484-6_245),
ISBN 978-3-540-58484-1, retrieved 2023-02-07
28. Jakob, Wilfried (September 2010). "A general cost-benefit-based adaptation framework for
multimeme algorithms" (http://link.springer.com/10.1007/s12293-010-0040-9). Memetic
Computing. 2 (3): 201–218, p. 207. doi:10.1007/s12293-010-0040-9 (https://doi.org/10.100
7%2Fs12293-010-0040-9). ISSN 1865-9284 (https://www.worldcat.org/issn/1865-9284).
S2CID 167807 (https://api.semanticscholar.org/CorpusID:167807).
29. Ichimura, T.; Kuriyama, Y. (1998). Learning of neural networks with parallel hybrid GA using
a royal road function. IEEE International Joint Conference on Neural Networks. Vol. 2. New
York, NY. pp. 1131–1136. doi:10.1109/IJCNN.1998.685931 (https://doi.org/10.1109%2FIJC
NN.1998.685931).
30. Aguilar, J.; Colmenares, A. (1998). "Resolution of pattern recognition problems using a
hybrid genetic/random neural network learning algorithm". Pattern Analysis and
Applications. 1 (1): 52–61. doi:10.1007/BF01238026 (https://doi.org/10.1007%2FBF012380
26). S2CID 15803359 (https://api.semanticscholar.org/CorpusID:15803359).
31. Ridao, M.; Riquelme, J.; Camacho, E.; Toro, M. (1998). An evolutionary and local search
algorithm for planning two manipulators motion. Lecture Notes in Computer Science.
Vol. 1416. Springer-Verlag. pp. 105–114. CiteSeerX 10.1.1.324.2668 (https://citeseerx.ist.ps
u.edu/viewdoc/summary?doi=10.1.1.324.2668). doi:10.1007/3-540-64574-8_396 (https://doi.
org/10.1007%2F3-540-64574-8_396). ISBN 978-3-540-64574-0.
32. Haas, O.; Burnham, K.; Mills, J. (1998). "Optimization of beam orientation in radiotherapy
using planar geometry". Physics in Medicine and Biology. 43 (8): 2179–2193.
Bibcode:1998PMB....43.2179H (https://ui.adsabs.harvard.edu/abs/1998PMB....43.2179H).
doi:10.1088/0031-9155/43/8/013 (https://doi.org/10.1088%2F0031-9155%2F43%2F8%2F01
3). PMID 9725597 (https://pubmed.ncbi.nlm.nih.gov/9725597). S2CID 250856984 (https://ap
i.semanticscholar.org/CorpusID:250856984).
33. Harris, S.; Ifeachor, E. (1998). "Automatic design of frequency sampling filters by hybrid
genetic algorithm techniques". IEEE Transactions on Signal Processing. 46 (12): 3304–
3314. Bibcode:1998ITSP...46.3304H (https://ui.adsabs.harvard.edu/abs/1998ITSP...46.3304
H). doi:10.1109/78.735305 (https://doi.org/10.1109%2F78.735305).
34. Augugliaro, A.; Dusonchet, L.; Riva-Sanseverino, E. (1998). "Service restoration in
compensated distribution networks using a hybrid genetic algorithm". Electric Power
Systems Research. 46 (1): 59–66. doi:10.1016/S0378-7796(98)00025-X (https://doi.org/10.1
016%2FS0378-7796%2898%2900025-X).
35. Wehrens, R.; Lucasius, C.; Buydens, L.; Kateman, G. (1993). "HIPS, A hybrid self-adapting
expert system for nuclear magnetic resonance spectrum interpretation using genetic
algorithms". Analytica Chimica Acta. 277 (2): 313–324. doi:10.1016/0003-2670(93)80444-P
(https://doi.org/10.1016%2F0003-2670%2893%2980444-P). hdl:2066/112321 (https://hdl.ha
ndle.net/2066%2F112321). S2CID 53954763 (https://api.semanticscholar.org/CorpusID:539
54763).
36. França, P.; Mendes, A.; Moscato, P. (1999). Memetic algorithms to minimize tardiness on a
single machine with sequence-dependent setup times (https://pdfs.semanticscholar.org/c21
3/5d68ceb0fd8e924aabe97cac1858ff6a2ce4.pdf) (PDF). Proceedings of the 5th
International Conference of the Decision Sciences Institute. Athens, Greece. pp. 1708–1710.
S2CID 10797987 (https://api.semanticscholar.org/CorpusID:10797987).
37. Costa, Daniel (1995). "An Evolutionary Tabu Search Algorithm And The NHL Scheduling
Problem". INFOR: Information Systems and Operational Research. 33 (3): 161–178.
doi:10.1080/03155986.1995.11732279 (https://doi.org/10.1080%2F03155986.1995.117322
79). S2CID 15491435 (https://api.semanticscholar.org/CorpusID:15491435).
38. Aickelin, U. (1998). Nurse rostering with genetic algorithms. Proceedings of young
operational research conference 1998. Guildford, UK. arXiv:1004.2870 (https://arxiv.org/abs/
1004.2870).
39. Ozcan, E. (2007). "Memes, Self-generation and Nurse Rostering". Practice and Theory of
Automated Timetabling VI. Lecture Notes in Computer Science. Vol. 3867. Springer-Verlag.
pp. 85–104. doi:10.1007/978-3-540-77345-0_6 (https://doi.org/10.1007%2F978-3-540-7734
5-0_6). ISBN 978-3-540-77344-3.
40. Ozcan, E.; Onbasioglu, E. (2007). "Memetic Algorithms for Parallel Code Optimization".
International Journal of Parallel Programming. 35 (1): 33–61. doi:10.1007/s10766-006-0026-
x (https://doi.org/10.1007%2Fs10766-006-0026-x). S2CID 15182941 (https://api.semanticsc
holar.org/CorpusID:15182941).
41. Burke, E.; Smith, A. (1999). "A memetic algorithm to schedule planned maintenance for the
national grid" (https://doi.org/10.1145%2F347792.347801). Journal of Experimental
Algorithmics. 4 (4): 1–13. doi:10.1145/347792.347801 (https://doi.org/10.1145%2F347792.3
47801). S2CID 17174080 (https://api.semanticscholar.org/CorpusID:17174080).
42. Jakob, Wilfried; Strack, Sylvia; Quinte, Alexander; Bengel, Günther; Stucky, Karl-Uwe; Süß,
Wolfgang (2013-04-22). "Fast Rescheduling of Multiple Workflows to Constrained
Heterogeneous Resources Using Multi-Criteria Memetic Computing" (https://doi.org/10.339
0%2Fa6020245). Algorithms. 6 (2): 245–277. doi:10.3390/a6020245 (https://doi.org/10.339
0%2Fa6020245). ISSN 1999-4893 (https://www.worldcat.org/issn/1999-4893).
43. Ozcan, E.; Basaran, C. (2009). "A Case Study of Memetic Algorithms for Constraint
Optimization". Soft Computing: A Fusion of Foundations, Methodologies and Applications.
13 (8–9): 871–882. CiteSeerX 10.1.1.368.7327 (https://citeseerx.ist.psu.edu/viewdoc/summa
ry?doi=10.1.1.368.7327). doi:10.1007/s00500-008-0354-4 (https://doi.org/10.1007%2Fs005
00-008-0354-4). S2CID 17032624 (https://api.semanticscholar.org/CorpusID:17032624).
44. Areibi, S.; Yang, Z. (2004). "Effective memetic algorithms for VLSI design automation =
genetic algorithms + local search + multi-level clustering". Evolutionary Computation. 12 (3):
327–353. doi:10.1162/1063656041774947 (https://doi.org/10.1162%2F106365604177494
7). PMID 15355604 (https://pubmed.ncbi.nlm.nih.gov/15355604). S2CID 2190268 (https://ap
i.semanticscholar.org/CorpusID:2190268).
45. Merz, P.; Zell, A. (2002). "Clustering Gene Expression Profiles with Memetic Algorithms".
Parallel Problem Solving from Nature — PPSN VII. Lecture Notes in Computer Science.
Vol. 2439. Springer. pp. 811–820. doi:10.1007/3-540-45712-7_78 (https://doi.org/10.1007%2
F3-540-45712-7_78). ISBN 978-3-540-44139-7.
46. Zexuan Zhu, Y. S. Ong and M. Dash (2007). "Markov Blanket-Embedded Genetic Algorithm
for Gene Selection". Pattern Recognition. 40 (11): 3236–3248.
Bibcode:2007PatRe..40.3236Z (https://ui.adsabs.harvard.edu/abs/2007PatRe..40.3236Z).
doi:10.1016/j.patcog.2007.02.007 (https://doi.org/10.1016%2Fj.patcog.2007.02.007).
47. Zexuan Zhu, Y. S. Ong and M. Dash (2007). "Wrapper-Filter Feature Selection Algorithm
Using A Memetic Framework". IEEE Transactions on Systems, Man, and Cybernetics - Part
B: Cybernetics. 37 (1): 70–76. doi:10.1109/TSMCB.2006.883267 (https://doi.org/10.1109%2
FTSMCB.2006.883267). hdl:10338.dmlcz/141593 (https://hdl.handle.net/10338.dmlcz%2F1
41593). PMID 17278560 (https://pubmed.ncbi.nlm.nih.gov/17278560). S2CID 18382400 (htt
ps://api.semanticscholar.org/CorpusID:18382400).
48. "Artificial Intelligence for Fault Injection Parameter Selection | Marina Krček | Hardwear.io
Webinar" (https://hardwear.io/webinar/AI-for-fault-injection-parameter-selection.php).
hardwear.io. Retrieved 2021-05-21.
49. Zhu, Zexuan; Ong, Yew-Soon; Zurada, Jacek M (April 2010). "Identification of Full and
Partial Class Relevant Genes" (https://ieeexplore.ieee.org/document/4653480). IEEE/ACM
Transactions on Computational Biology and Bioinformatics. 7 (2): 263–277.
doi:10.1109/TCBB.2008.105 (https://doi.org/10.1109%2FTCBB.2008.105). ISSN 1545-5963
(https://www.worldcat.org/issn/1545-5963). PMID 20431146 (https://pubmed.ncbi.nlm.nih.go
v/20431146). S2CID 2904028 (https://api.semanticscholar.org/CorpusID:2904028).
50. G. Karkavitsas & G. Tsihrintzis (2011). Automatic Music Genre Classification Using Hybrid
Genetic Algorithms (https://semanticscholar.org/paper/c68aeb6ac71ae9414791fc7c9377ef8
46b724349). Intelligent Interactive Multimedia Systems and Services. Smart Innovation,
Systems and Technologies. Vol. 11. Springer. pp. 323–335. doi:10.1007/978-3-642-22158-
3_32 (https://doi.org/10.1007%2F978-3-642-22158-3_32). ISBN 978-3-642-22157-6.
S2CID 15011089 (https://api.semanticscholar.org/CorpusID:15011089).
