Research Article
Genetic Algorithm for Traveling Salesman Problem with
Modified Cycle Crossover Operator
Received 1 June 2017; Revised 17 July 2017; Accepted 7 August 2017; Published 25 October 2017
Copyright © 2017 Abid Hussain et al. This is an open access article distributed under the Creative Commons Attribution License,
which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Genetic algorithms are evolutionary techniques used for optimization according to the survival-of-the-fittest idea. These methods do not ensure optimal solutions; however, they usually give good approximations in a reasonable amount of time. Genetic algorithms are therefore useful for NP-hard problems, especially the traveling salesman problem. A genetic algorithm depends on its selection criteria and its crossover and mutation operators. To tackle the traveling salesman problem using genetic algorithms, there are various representations, such as the binary, path, adjacency, ordinal, and matrix representations. In this article, we propose a new crossover operator for the traveling salesman problem that aims to minimize the total distance. The approach is tied to the path representation, which is the most natural way to represent a legal tour. Computational results are reported for the new cycle crossover operator alongside some traditional path representation methods, such as the partially mapped and order crossovers, on benchmark TSPLIB instances, and improvements are found.
crystallography [4], and automatic drilling of printed circuit boards and threading of scan cells in testable Very-Large-Scale-Integrated (VLSI) circuits [5].

Over the last three decades, the TSP has received considerable attention, and various approaches have been proposed to solve the problem, such as branch and bound [6], cutting planes [7], 2-opt [8], particle swarm [9], simulated annealing [10], ant colony [11, 12], neural networks [13], tabu search [14], and genetic algorithms [3, 15–17]. Some of these methods are exact, while others are heuristic algorithms. A comprehensive study of GA approaches successfully applied to the TSP is given in [18]. A survey of GA approaches for the TSP was presented by Potvin [17]. A new sequential constructive crossover that generates high-quality solutions to the TSP was proposed by Ahmed [19]. A new genetic algorithm for the asymmetric TSP was proposed by Nagata and Soler [20]. Three new variations of order crossover with improvements were presented by Deep and Adane [21]. Ghadle and Muley presented a modified one's algorithm with MATLAB programming to solve the TSP [22]. Piwonska associated a profit-based genetic algorithm with the TSP and obtained good results when testing it on networks of cities in some voivodeships of Poland [23]. Kumar et al. presented a comparative analysis of different crossover operators for the TSP and showed that partially mapped crossover gives the shortest path [24]. A simple and pure genetic algorithm can be defined in the following steps.

Step 1. Create an initial population of P chromosomes.

Step 2. Evaluate the fitness of each chromosome.

Step 3. Choose P/2 parents from the current population via proportional selection.

Step 4. Randomly select two parents to create offspring using the crossover operator.

Step 5. Apply mutation operators for minor changes in the results.

Step 6. Repeat Steps 4 and 5 until all parents are selected and mated.

Step 7. Replace the old population of chromosomes with the new one.

Step 8. Evaluate the fitness of each chromosome in the new population.

Step 9. Terminate if the number of generations meets some upper bound; otherwise go to Step 3.

Selection criteria, crossover, and mutation are the major operators, but crossover plays the most vital role in GAs. Many crossover operators have been proposed in the literature, and all have their own significance. In this article, we propose a new crossover operator for the TSP which, like the previous cycle crossover operator, moves within the two selected parents. In Section 2, we present crossover operators for the TSP and propose a new crossover operator for path representation in Section 3; computational experiments and discussion are in Section 4, and the conclusion is in Section 5.

2. Crossover Operators for TSP

In the literature, there are many representations with which to solve the TSP using GAs. Among these, the binary, path, adjacency, ordinal, and matrix representations are the most important; their crossover operators are summarized in Table 1. We limit ourselves to the path representation, which is the most natural and legal way to represent a tour, and skip the other representations.

Table 1: Summary of crossover operators for TSP.

Representation   Crossover operator            Proposed year
Binary           Classical + repair operator   1991
Path             Partially mapped crossover    1985
                 Order crossover               1985
                 Cycle crossover               1987
                 Heuristic crossover           1987
                 Order based crossover         1991
                 Position based crossover      1991
Adjacency        Alternative edge crossover    1985
                 Heuristic crossover 1         1985
                 Heuristic crossover 2         1989
                 Heuristic crossover 3         1987
Ordinal          Classical operators           1985
Matrix           Intersection crossover        1987
                 Union crossover               1987

2.1. Path Representation. The most natural way to represent a legal tour is probably the path representation. For example, a tour 1 → 4 → 8 → 2 → 5 → 3 → 6 → 7 can be represented simply as (1 4 8 2 5 3 6 7).

Since the TSP is combinatorial and, under the path representation, the classical crossover operators such as one-point, two-point, and uniform crossovers are not suitable, we choose only the partially mapped, order, and cycle crossover operators from the path representation, which are the ones mostly used in the literature, so that we can compare our proposed crossover operator with them.

2.1.1. Partially Mapped Crossover Operator. The partially mapped crossover (PMX) was proposed by Goldberg and Lingle [25]. After choosing two random cut points on the parents to build offspring, the portion between the cut points of one parent's string is mapped onto the other parent's string, and the remaining information is exchanged. Consider, for example, two parent tours with one cut point between the 3rd and 4th bits and the other cut point between the 6th and 7th bits (the two cut points are marked with "|"):

P1 = (3 4 8 | 2 7 1 | 6 5),
P2 = (4 2 5 | 1 6 8 | 3 7).     (1)
Computational Intelligence and Neuroscience 3
The mapping sections are between the cut points. In this example, the mappings are 2 ↔ 1, 7 ↔ 6, and 1 ↔ 8. Now the two mapping sections are copied into the opposite offspring:

O1 = (× × × | 1 6 8 | × ×),
O2 = (× × × | 2 7 1 | × ×).     (2)

Then we can fill in further bits (from the original parents) for those which have no conflict:

O1 = (3 4 × | 1 6 8 | × 5),
O2 = (4 × 5 | 2 7 1 | 3 ×).     (3)

The remaining conflicting bits are resolved through the mappings. In O1, the 3rd bit would be 8, which already appears in the copied section; the mappings 8 ↔ 1 and 1 ↔ 2 give 2, and the 7th bit becomes 7 via 6 ↔ 7. Resolving O2 in the same way completes both offspring:

O1 = (3 4 2 | 1 6 8 | 7 5),
O2 = (4 8 5 | 2 7 1 | 3 6).

2.1.2. Order Crossover Operator. The order crossover (OX) was proposed by Davis [26]. It builds offspring by choosing a subtour of one parent and preserving the relative order of the bits of the other parent. Using the same parents and cut points as above, the segments between the cut points are first copied into the offspring:

O1 = (× × × | 2 7 1 | × ×),
O2 = (× × × | 1 6 8 | × ×).

Next, starting from the second cut point of one parent, the bits of the other parent are copied in the same order, omitting bits that are already present and wrapping around at the end of the string. For O1, the sequence of P2 from its second cut point is 3, 7, 4, 2, 5, 1, 6, 8; removing 2, 7, and 1 leaves 3, 4, 5, 6, 8, which is placed into O1 from its 7th position onward:

O1 = (5 6 8 | 2 7 1 | 3 4).     (8)

Analogously, we complete the second offspring as well:

O2 = (4 2 7 | 1 6 8 | 5 3).     (9)

2.1.3. Cycle Crossover Operator. The cycle crossover (CX) operator was first proposed by Oliver et al. [27]. This technique creates offspring in such a way that each bit, together with its position, comes from one of the parents. For example, consider the tours of two parents:

P1 = (1 2 3 4 5 6 7 8),
P2 = (8 5 2 1 3 6 4 7).     (10)

Now it is up to us to choose the first bit for the offspring from either the first or the second parent. In our example, the first bit of the offspring has to be 1 or 8. Let us choose it to be 1:

O1 = (1 × × × × × × ×).

Since every bit of an offspring must keep the position it has in one of the parents, this choice forces a cycle: position 1 of P2 holds 8, so bit 8 must also be taken from P1, and so on through bits 7 and 4 until the cycle returns to bit 1. The positions outside the cycle are then filled from the other parent:

O1 = (1 5 2 4 3 6 7 8),
O2 = (8 2 3 1 5 6 4 7).

Note, however, that CX can simply reproduce its parents. Applying the same procedure to the parents P1 = (3 4 8 2 7 1 6 5) and P2 = (4 2 5 1 6 8 3 7), the cycle started from the first bit runs through every position, so the offspring are identical copies of the parents:

O1 = (3 4 8 2 7 1 6 5),
O2 = (4 2 5 1 6 8 3 7).     (18)
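The classical CX procedure is short enough to sketch in code. The following Python sketch (the function name is ours, not from the paper) follows the position cycle starting at the first bit and fills the remaining positions from the other parent; on the parents of (18) it reproduces the parents unchanged, which is the weakness the proposed operator addresses.

```python
def cycle_crossover(p1, p2):
    """Classical CX (Oliver et al.): every bit of an offspring keeps
    the position it occupies in one of the two parents."""
    n = len(p1)
    o1, o2 = [None] * n, [None] * n
    pos = 0
    # Follow the cycle that starts at position 0, copying from p1 into o1.
    while o1[pos] is None:
        o1[pos], o2[pos] = p1[pos], p2[pos]
        pos = p1.index(p2[pos])  # next position in the cycle
    # Positions outside the cycle are filled from the other parent.
    for j in range(n):
        if o1[j] is None:
            o1[j], o2[j] = p2[j], p1[j]
    return o1, o2

# The parents of (10): the cycle covers positions of bits 1, 4, 7, 8.
print(cycle_crossover([1, 2, 3, 4, 5, 6, 7, 8], [8, 5, 2, 1, 3, 6, 4, 7]))
# The parents of (18): one cycle covers every position, so CX
# merely returns copies of the parents.
print(cycle_crossover([3, 4, 8, 2, 7, 1, 6, 5], [4, 2, 5, 1, 6, 8, 3, 7]))
```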
3. Proposed Crossover Operators

We propose a new crossover operator which works similarly to CX, so we call it CX2. At the same time, it generates both offspring from the parents using cycle(s) until the last bit. We describe CX2 in the following steps.

Step 1. Choose two parents for mating.

Step 2. Select the 1st bit of the second parent as the 1st bit of the first offspring.

Step 3. The bit selected in Step 2 is located in the first parent, and the bit at the same position in the second parent is picked; that bit is located in the first parent again and, finally, the bit at the same position in the second parent is selected as the 1st bit of the second offspring.

Step 4. The bit selected in Step 3 is located in the first parent, and the bit at the same position in the second parent is picked as the next bit of the first offspring. (Note: for the first offspring's bits we make only one such move, while the second offspring's bits take two moves.)

Step 5. Repeat Steps 3 and 4 until the 1st bit of the first parent appears in the second offspring (completing a cycle), at which point the process may terminate.

Step 6. If some bits are left, then the bits already placed in the second offspring are struck out of the first parent, and vice versa, so the same bits are removed from both parents. For the remaining bits, repeat Steps 2, 3, and 4 to complete the process.

According to the previous steps, two cases arise for CX2. In the first case, CX2 terminates within Step 5; in the second case, it takes all six steps. We provide detailed examples of both cases in the next subsections.

3.1. CX2: Case 1. Consider the two selected parents as mentioned in Step 1:

P1 = (3 4 8 2 7 1 6 5),
P2 = (4 2 5 1 6 8 3 7).     (19)

Using Step 2,

O1 = (4 × × × × × × ×).     (20)

Applying Step 3 to the bit 4 selected in Step 2: 4 is found at the second position of the first parent, and the bit at this position in the second parent is 2; searching again, 2 is at the fourth position of the first parent, and 1 is at the same position in the second parent, so 1 is selected for the second offspring:

O2 = (1 × × × × × × ×).     (21)

Following Step 4, the previous bit was 1, which is located at the 6th position of the first parent, and the bit at this position in the second parent is 8, so

O1 = (4 8 × × × × × ×).     (22)

For the two moves of the second offspring, below 8 is 5 and below 5 is 7, so

O2 = (1 7 × × × × × ×).     (23)

Continuing in the same way,

O1 = (4 8 6 2 5 3 1 7),
O2 = (1 7 4 8 6 2 5 3).     (24)

We see that the last bit of the second offspring is 3, which was the 1st bit of the first parent. Hence the proposed scheme completes within one cycle.

3.2. CX2: Case 2. Consider the two selected parents as mentioned in Step 1:

P1 = (1 2 3 4 5 6 7 8),
P2 = (2 7 5 8 4 1 6 3).     (25)

Using Step 2 of the scheme,

O1 = (2 × × × × × × ×).     (26)

After this, Step 3 asks us to locate the position of bit 2 in the first parent, which is the 2nd, and 7 is at the same position in the second parent; searching for bit 7 in the first parent, it is at the 7th position, and 6 is at the same position in the second parent, so we choose bit 6 for the second offspring:

O2 = (6 × × × × × × ×).     (27)

Continuing with Steps 4 and 5,

O1 = (2 1 6 7 × × × ×),
O2 = (6 7 2 1 × × × ×).     (28)

Step 5 is finished because bit 1 has appeared in the second offspring, and it was at the 1st position of the first parent. Now, before applying Step 6, we match the first offspring's bits with the second parent (and vice versa) and strike out the bits that already exist, together with their positions, from both parents as follows:

P1 = (∙ ∙ 3 4 5 ∙ ∙ 8),
P2 = (∙ ∙ 5 8 4 ∙ ∙ 3).     (29)

The filled positions of the parents and the "×" positions of the offspring are now treated as the 1st, 2nd, 3rd positions, and so forth, so we can complete Step 6 as well:

O1 = (2 1 6 7 | 5 3 8 4),
O2 = (6 7 2 1 | 8 4 5 3).     (30)

Hence the scheme completes, this time using all six steps. To apply this crossover operator, we wrote MATLAB code for the genetic algorithm; its pseudo-code is given in Algorithm 1.
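The six steps of CX2 can also be sketched compactly in Python. This is our illustrative reading of the scheme (the names are ours), checked only against the two worked cases; the paper's MATLAB pseudo-code in Algorithm 1 remains the reference.

```python
def cx2(p1, p2):
    """CX2 sketch: one lookup move per bit for offspring 1, two moves per
    bit for offspring 2; when a cycle closes, restart on the leftover bits."""
    n = len(p1)
    o1, o2 = [], []
    r1, r2 = list(p1), list(p2)  # parents restricted to still-unused bits
    while len(o1) < n:
        o1.append(r2[0])  # Step 2: first bit comes from the second parent
        while True:
            # Step 3: two moves through r1/r2 give the next bit of o2.
            mid = r2[r1.index(o1[-1])]
            o2.append(r2[r1.index(mid)])
            if o2[-1] == r1[0]:  # Step 5: the cycle has closed
                break
            # Step 4: one move gives the next bit of o1.
            o1.append(r2[r1.index(o2[-1])])
        used = set(o1)  # Step 6: strike placed bits out of both parents
        r1 = [b for b in r1 if b not in used]
        r2 = [b for b in r2 if b not in used]
    return o1, o2

print(cx2([3, 4, 8, 2, 7, 1, 6, 5], [4, 2, 5, 1, 6, 8, 3, 7]))  # Case 1
print(cx2([1, 2, 3, 4, 5, 6, 7, 8], [2, 7, 5, 8, 4, 1, 6, 3]))  # Case 2
```

On Case 1 the inner loop runs to completion in a single cycle; on Case 2 it closes after four bits and the outer loop reruns the scheme on the reduced parents, reproducing (30).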
N ← No. of Edges
M ← Population Size
G ← No. of Generations
A ← Random Population
For each 1 ≤ i ≤ N
{
    For each 1 ≤ j ≤ N
    {
        Distance ← square-root((x1 − x2)² + (y1 − y2)²)
    }
}
For each 1 ≤ i ≤ M
{
    R(i, :) ← Rand perm(N)
}
For each 1 ≤ generation ≤ G
{
    For each 1 ≤ i ≤ M
    {
        Sum1 ← Distance(R(i, 1), R(i, N))
        For each 1 ≤ j ≤ N − 1
        {
            Sum2 ← Distance(R(i, j), R(i, j + 1)) + Sum2
        }
        R(N + 1) ← Sum1 + Sum2
    }
    R ← Sort(R)
    B ← 0.8 ∗ M
    C ← 0.1 ∗ M
    Length ← Length(B)/2
    For each 1 ≤ i ≤ Length
    {
        C1 ← zeros(N)
        C2 ← zeros(N)
        P1 ← R(B(2 ∗ i − 1), 1 : N)
        P2 ← R(B(2 ∗ i), 1 : N)
        St1 ← 0
        St2 ← 0
        While (Length(C1) ∼= Length(P1))
        {
            C1(St1) ← P2(St1)
            While (St1 < N)
            {
                Ind1 ← find(P1 == C1(St1))
                Val1 ← P2(Ind1)
                C2(St2) ← Val1
                St1 ← St1 + 1
                Ind2 ← find(P2 == C2(St2))
                Val2 ← P1(Ind2)
                C1(St1) ← Val2
                St2 ← St2 + 1
            }
        }
        R(M + 2 ∗ i − 1, 1 : N) ← C1
        R(M + 2 ∗ i, 1 : N) ← C2
    }
    For each 1 ≤ i ≤ (M + 0.8 ∗ M)
    {
        Sum1 ← Distance(R(1), R(2))
        For each 1 ≤ j ≤ N − 1
        {
            Sum2 ← Distance(R(j), R(j + 1)) + Sum1
        }
        R(N + 1) ← (Sum1 + Sum2)
    }
    R ← Sort(R)
    R ← R(1 : M)
    Z ← min(R)
    Plot(Z)
}

Algorithm 1: Pseudo-code of proposed algorithm CX2.

4. Computational Experiments and Discussion

We use a genetic algorithm in MATLAB to compare the proposed crossover operator with some traditional path representation crossover operators. Our first experiment has 7 cities, with the transition distances between cities given in Table 2. To solve this problem using GAs, the genetic parameters are set as population size M = 30; maximum generation G = 10; crossover probability Pc = 0.8; mutation probability Pm = 0.1. In this experiment, the optimal path and optimal value are 6 → 1 → 5 → 3 → 4 → 2 → 7 and 159, respectively.

Table 2: Transition distance between cities.

City   1    2    3    4    5    6    7
1      0   34   36   37   31   33   35
2      —    0   29   23   22   25   24
3      —    —    0   17   12   18   17
4      —    —    —    0   32   30   29
5      —    —    —    —    0   26   24
6      —    —    —    —    —    0   19
7      —    —    —    —    —    —    0

Table 3 summarizes the results over 30 runs and shows that the performance of CX2 is much better than that of the two existing crossover operators.

Table 3: Comparison of three crossover operators (30 runs).

Crossover   Optimum   Average value   Best value   Worst value
PMX         17/30     159.7           159          165
OX          14/30     160.3           159          163
CX2         24/30     159.2           159          162

4.1. Benchmark Problems. We run the proposed crossover operator CX2 along with the two traditional crossover operators PMX and OX on twelve benchmark instances taken from TSPLIB [28]. Of these twelve problems, ftv33, ftv38, ft53, kro124p, ftv170, rbg323, rbg358, rbg403, and rbg443 are asymmetric TSPs, while gr21, fri26, and dantzig42 are symmetric. The experiments are performed 30 times (30 runs) for each instance. Common GA parameters are selected, that is, population size, maximum generation, crossover, and mutation probabilities of 150, 500, 0.80, and 0.10, respectively.
Table 4: Comparison results among three crossover operators (Generation = 500).

Instance    N    Optimum value   Results   PMX     OX      CX2
gr21        21   2707            Best      2962    3005    2995
                                 Worst     3322    3693    3576
                                 Average   3127    3208    3145
fri26       26   937             Best      1056    1051    1099
                                 Worst     1294    1323    1278
                                 Average   1133    1158    1128
ftv33       34   1286            Best      1708    1804    1811
                                 Worst     2399    2366    2322
                                 Average   2012    2098    2083
ftv38       39   1530            Best      2345    2371    2252
                                 Worst     2726    2913    2718
                                 Average   2578    2617    2560
dantzig42   42   699             Best      1298    1222    699
                                 Worst     1606    1562    920
                                 Average   1425    1301    802
ft53        53   6905            Best      13445   13826   10987
                                 Worst     16947   16279   13055
                                 Average   14949   14724   12243

Table 5: Comparison results among three crossover operators (Generation = 1000).

Instance    N     Optimum value   Results   PMX      OX       CX2
kro124p     100   36230           Best      90231    97122    92450
                                  Worst     118386   122497   121513
                                  Average   100335   103457   101229
ftv170      171   2755            Best      13346    15202    6421
                                  Worst     19314    19708    8416
                                  Average   16775    17569    7019
rbg323      323   1326            Best      4123     3998     4212
                                  Worst     5147     5385     5342
                                  Average   4434     4602     4654
rbg358      358   1163            Best      5380     5630     5404
                                  Worst     5915     5948     6004
                                  Average   5532     5830     5622
rbg403      403   2465            Best      6231     6196     6257
                                  Worst     6653     6629     6671
                                  Average   6536     6386     6455
rbg443      443   2720            Best      6754     6932     6854
                                  Worst     7209     7351     7388
                                  Average   6905     7121     6981
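The 7-city optimum reported in Section 4 can be checked directly from Table 2. The short Python sketch below (the distance matrix is copied from Table 2, mirrored to a full symmetric matrix) evaluates the closed tour 6 → 1 → 5 → 3 → 4 → 2 → 7:

```python
# Distances from Table 2, mirrored into a full symmetric matrix.
D = [
    [0, 34, 36, 37, 31, 33, 35],
    [34, 0, 29, 23, 22, 25, 24],
    [36, 29, 0, 17, 12, 18, 17],
    [37, 23, 17, 0, 32, 30, 29],
    [31, 22, 12, 32, 0, 26, 24],
    [33, 25, 18, 30, 26, 0, 19],
    [35, 24, 17, 29, 24, 19, 0],
]

def tour_length(tour, dist):
    """Total length of the closed tour (returning to the starting city)."""
    return sum(dist[tour[i] - 1][tour[(i + 1) % len(tour)] - 1]
               for i in range(len(tour)))

print(tour_length([6, 1, 5, 3, 4, 2, 7], D))  # 159, the optimum of Table 3
```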
Acknowledgments

The authors extend their appreciation to the Deanship of Scientific Research at King Saud University for funding this work through Research Group no. RG-1437-027.

References

[1] J. H. Holland, Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence, University of Michigan Press, Oxford, UK, 1975.
[2] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley Publishing Company, 1989.
[3] P. Larrañaga, C. M. H. Kuijpers, R. H. Murga, I. Inza, and S. Dizdarevic, "Genetic algorithms for the travelling salesman problem: a review of representations and operators," Artificial Intelligence Review, vol. 13, no. 2, pp. 129–170, 1999.
[4] R. G. Bland and D. F. Shallcross, "Large traveling salesman problems arising from experiments in x-ray crystallography: a preliminary report on computation," Operations Research Letters, vol. 8, no. 3, pp. 125–128, 1989.
[5] C. Ravikumar, "Parallel techniques for solving large scale travelling salesperson problems," Microprocessors and Microsystems, vol. 16, no. 3, pp. 149–158, 1992.
[6] G. Finke, A. Claus, and E. Gunn, "A two-commodity network flow approach to the traveling salesman problem," vol. 41, pp. 167–178.
[7] P. Miliotis, "Using cutting planes to solve the symmetric travelling salesman problem," Mathematical Programming, vol. 15, no. 1, pp. 177–188, 1978.
[8] S. Lin and B. W. Kernighan, "An effective heuristic algorithm for the traveling-salesman problem," Operations Research, vol. 21, pp. 498–516, 1973.
[9] J. Kennedy, R. C. Eberhart, and Y. Shi, Swarm Intelligence, Morgan Kaufmann Publishers, Inc., San Francisco, CA, USA, 2001.
[10] S. Kirkpatrick and G. Toulouse, "Configuration space analysis of travelling salesman problems," Le Journal de Physique, vol. 46, no. 8, pp. 1277–1292, 1985.
[11] M. Dorigo and L. M. Gambardella, "Ant colony system: a cooperative learning approach to the traveling salesman problem," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 53–66, 1997.
[12] M. Dorigo, V. Maniezzo, and A. Colorni, "Ant system: optimization by a colony of cooperating agents," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 26, no. 1, pp. 29–41, 1996.
[13] S. Bhide, N. John, and M. R. Kabuka, "A Boolean neural network approach for the traveling salesman problem," IEEE Transactions on Computers, vol. 42, no. 10, pp. 1271–1278, 1993.
[14] F. Glover, "Artificial intelligence, heuristic frameworks and tabu search," Managerial and Decision Economics, vol. 11, no. 5, pp. 365–375, 1990.
[15] Z. Michalewicz, Genetic Algorithms + Data Structures = Evolution Programs, Springer, New York, NY, USA, 1996.
[16] C. Moon, J. Kim, G. Choi, and Y. Seo, "An efficient genetic algorithm for the traveling salesman problem with precedence constraints," European Journal of Operational Research, vol. 140, no. 3, pp. 606–617, 2002.
[17] J.-Y. Potvin, "Genetic algorithms for the traveling salesman problem," Annals of Operations Research, vol. 63, pp. 339–370, 1996.
[18] M. Gen and R. Cheng, Genetic Algorithms and Engineering Design, John Wiley & Sons, London, UK, 1997.
[19] H. Z. Ahmed, "Genetic algorithm for the traveling salesman problem using sequential constructive crossover operator," International Journal of Biometrics & Bioinformatics (IJBB), vol. 3, p. 96, 2010.
[20] Y. Nagata and D. Soler, "A new genetic algorithm for the asymmetric traveling salesman problem," Expert Systems with Applications, vol. 39, no. 10, pp. 8947–8953, 2012.
[21] K. Deep and H. M. Adane, "New variations of order crossover for travelling salesman problem," International Journal of Combinatorial Optimization Problems and Informatics, vol. 2, no. 1, 2011.
[22] K. P. Ghadle and Y. M. Muley, "Travelling salesman problem with MATLAB programming," International Journal of Advances in Applied Mathematics and Mechanics, vol. 2, no. 3, pp. 258–266, 2015.
[23] A. Piwonska, "Genetic algorithm finds routes in travelling salesman problem with profits," Zeszyty Naukowe Politechniki Białostockiej. Informatyka, pp. 51–65, 2010.
[24] N. Kumar, R. K. Karambir et al., "A comparative analysis of PMX, CX and OX crossover operators for solving traveling salesman problem," International Journal of Latest Research in Science and Technology, vol. 1, 2012.
[25] D. Goldberg and R. Lingle, "Alleles, loci and the traveling salesman problem," in Proceedings of the 1st International Conference on Genetic Algorithms and Their Applications, pp. 154–159, Los Angeles, USA, 1985.
[26] L. Davis, "Applying adaptive algorithms to epistatic domains," IJCAI, vol. 85, pp. 162–164, 1985.
[27] I. M. Oliver, D. J. Smith, and R. C. J. Holland, "Study of permutation crossover operators on the traveling salesman problem," in Proceedings of the Second International Conference on Genetic Algorithms and Their Applications, Massachusetts Institute of Technology, Cambridge, MA, USA, July 1987.
[28] G. Reinelt, TSPLIB, http://www.iwr.uni-heidelberg.de/group/comopt/software.