
Particle Swarm optimization

By:
Mohammed Al-Alaw
Supervised By:
Prof. Fatih V. Çelebi
Particle Swarm Optimisation

• Introduction
• The Algorithm
– Flow diagram
– Pseudo code
– Java Code
– C code
• Applications
– Simple Arithmetic
– Travelling Salesperson Problem
– Pattern Search & a P.S. Applications
• Conclusion, Advantages & limitations
• Future Scope
• References
Particle Swarm Optimisation

• Introduction … The Story


• Inspired by Nature again…
• Inspired by the flocking and schooling patterns of birds and
fish,
• Imagine a flock of birds circling over an area where they can
smell a hidden source of food.
• The one who is closest to the food chirps the loudest and the
other birds swing around in his direction.
• If any of the other circling birds comes closer to the target than
the first, it chirps louder and the others veer over toward him.
• This tightening pattern continues until one of the birds
happens upon the food.
Particle Swarm Optimisation

• Introduction …
• Inspired by the flocking and schooling patterns of birds and fish,
• Particle Swarm Optimization (PSO) was invented by Russell
Eberhart and James Kennedy in 1995.
• Originally, these two started out developing computer software
simulations of birds flocking around food sources,
• then later realized how well their algorithms worked on
optimization problems.
• Over a number of iterations, a group of variables has its
values adjusted closer to the member whose value is closest to
the target at any given moment.
• It's an algorithm that's simple and easy to implement.
Particle Swarm Optimisation

• Introduction
• The Scientific Talk
• In computer science, Particle Swarm Optimization (PSO) is a
computational method that optimizes a problem by
iteratively trying to improve a candidate solution with regard
to a given measure of quality (this measure also supplies the
stopping condition).
• PSO optimizes a problem by having a population of candidate
solutions, known as particles, and moving these particles
around in the search-space according to simple mathematical
formulae over each particle's position (the current data, e.g. x, y, z,
etc.) and velocity (indicating how much the data can be
changed). [6]
Particle Swarm Optimisation

• Introduction…
• Each particle's movement is influenced by its local best known
position but, is also guided toward the best known positions in
the search-space, which are updated as better positions when
they are found by other particles. This is expected to move the
swarm toward the best solutions. [6]
• PSO is originally attributed to Kennedy and Eberhart (1995). It
was first intended for simulating social behaviour, as a stylized
representation of the movement of particles in a bird flock or
fish school (Kennedy, 1997), and was later introduced in a
modified version by Shi & Eberhart (1998), whose modification
added a social factor to speed up the algorithm.
Particle Swarm Optimisation

• Introduction …

Figure 1. A few common population topologies (neighborhoods).


(A) Single-sighted. (B) Ring topology. (C) Fully connected topology. (D) Isolated.
Particle Swarm Optimisation

• Introduction…
• The algorithm was simplified, and it was observed to be
performing optimization (it was not originally intended to be
used in this manner).
• The book by Kennedy and Eberhart[6] describes many
philosophical aspects of PSO and swarm intelligence.
• PSO is a metaheuristic as it makes few or no assumptions
about the problem being optimized and can search very large
spaces of candidate solutions.
• However, metaheuristics such as PSO do not guarantee an
optimal solution is ever found.
Particle Swarm Optimisation

• Introduction … Scientific Definition
• More specifically, PSO does not use the gradient of the
problem being optimized, which means PSO does not require
that the optimization problem be differentiable as is required
by classic optimization methods such as gradient descent
(Gradient descent is a first-order optimization algorithm. To find a local
minimum of a function using gradient descent, one takes steps
proportional to the negative of the gradient (or of the approximate
gradient) of the function at the current point.) and quasi-Newton
methods (Quasi-Newton methods are methods used to either find
zeroes or local maxima and minima of functions, as an alternative to
Newton's method. They can be used if the Jacobian or Hessian is
unavailable or is too expensive to compute at every iteration.).
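The contrast can be made concrete with a few lines of gradient descent (illustrative only; PSO itself needs none of this, and the function f(x) = (x − 3)² and the learning rate here are assumptions chosen for the demo):

```java
public class GradientDescentDemo {
    // Minimize f(x) = (x - 3)^2, whose gradient is f'(x) = 2*(x - 3).
    // Each step moves proportionally to the NEGATIVE of the gradient.
    static double minimize(double x, double learningRate, int steps) {
        for (int i = 0; i < steps; i++) {
            x -= learningRate * 2 * (x - 3);
        }
        return x;
    }

    public static void main(String[] args) {
        // Starting far from the minimum, the iterate converges toward x = 3.
        System.out.println(minimize(10.0, 0.1, 100));
    }
}
```

Gradient descent needs the derivative of f at every step; PSO only needs to evaluate f itself, which is why it also works on noisy or non-differentiable problems.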
Particle Swarm Optimisation

• Introduction …
• PSO can therefore also be used on optimization problems that
are partially irregular, noisy, change over time, etc. [*]:
– i.e., they can be used for real-time applications and data analysis
Particle Swarm Optimisation

• The Algorithm
• The algorithm keeps track of three global variables:
• Target value or condition
• Global best (gBest) value indicating which particle's data is currently
closest to the Target
• Stopping value indicating when the algorithm should stop if the Target
isn't found
• Each particle consists of:
• Data representing a possible solution
• A Velocity value indicating how much the Data can be changed
• A personal best (pBest) value indicating the closest the particle's Data has
ever come to the Target
Particle Swarm Optimisation

• The Algorithm…
• The particles' data could be anything. In the flocking birds
example above, the data would be the X, Y, Z coordinates of
each bird.
• The individual coordinates of each bird would try to move
closer to the coordinates of the bird which is closer to the
food's coordinates (gBest).
• If the data is a pattern or sequence, then individual pieces of
the data would be manipulated until the pattern matches the
target pattern.
Particle Swarm Optimisation

• The Algorithm…
• The velocity value is calculated according to how far an
individual's data is from the target. The further it is, the larger
the velocity value.
• In the birds example, the individuals furthest from the food
would make an effort to keep up with the others by flying
faster toward the gBest bird.
• If the data is a pattern or sequence, the velocity would
describe how different the pattern is from the target, and
thus, how much it needs to be changed to match the target
(making it similar to training a neural network).
Particle Swarm Optimisation

• The Algorithm…
• Each particle's pBest value only indicates the closest the data
has ever come to the target since the algorithm started.
• The gBest value only changes when any particle's pBest value
comes closer to the target than gBest.
• Through each iteration of the algorithm, gBest gradually
moves closer and closer to the target until one of the particles
reaches the target.
• It's also common to see PSO algorithms using population
topologies, or "neighborhoods": smaller, localized subsets of
the swarm, each tracking its own best value rather than the
single global best.
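The pBest/gBest bookkeeping described above is driven by a velocity update. A minimal sketch of the original Kennedy & Eberhart (1995) rule, with the result clamped to ±V_MAX as in the Java code later in this deck (the constant and method names here are illustrative):

```java
import java.util.Random;

public class VelocityUpdate {
    static final double V_MAX = 10.0;   // illustrative clamp, as in the deck's example
    static final Random RNG = new Random();

    // Kennedy & Eberhart (1995):
    //   v' = v + 2*rand()*(pBest - x) + 2*rand()*(gBest - x)
    // then clamped to [-V_MAX, V_MAX] so particles cannot overshoot wildly.
    static double update(double v, double x, double pBest, double gBest) {
        double vNew = v + 2 * RNG.nextDouble() * (pBest - x)
                        + 2 * RNG.nextDouble() * (gBest - x);
        return Math.max(-V_MAX, Math.min(V_MAX, vNew));
    }
}
```

Note that when a particle sits exactly on both its personal best and the global best, both attraction terms vanish and the velocity is unchanged (apart from clamping).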
Particle Swarm Optimisation

• The Algorithm…
• These neighborhoods can involve two or more particles which
are predetermined to act together, or subsets of the search
space that particles happen into during testing.
• The use of neighborhoods often helps the algorithm avoid
getting stuck in local minima.
• Neighborhood definitions and how they're used have
different effects on the behavior of the algorithm.
Particle Swarm Optimisation

• Introduction …
• Remembering the neighborhoods again

Figure 1. A few common population topologies (neighborhoods).
(A) Single-sighted. (B) Ring topology. (C) Fully connected topology. (D) Isolated.
Particle Swarm Optimisation

• The Algorithm…
Particle Swarm Optimisation

• Flow diagram
Particle Swarm Optimisation

Pseudo Code

For each particle
{
    Initialize particle
}

Do until maximum iterations or minimum error criteria
{
    For each particle
    {
        Calculate Data fitness value
        If the fitness value is better than pBest
        {
            Set pBest = current fitness value
        }
        If pBest is better than gBest
        {
            Set gBest = pBest
        }
    }

    For each particle
    {
        Calculate particle Velocity
        Use gBest and Velocity to update particle Data
    }
}
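A rough, runnable sketch of the loop above, with deliberate simplifications: the pBest/gBest bookkeeping is collapsed, and velocity is driven directly by each particle's distance from the target, as the earlier slide on velocity describes. All names and constants here are illustrative, not the linked example code:

```java
import java.util.Random;

// Simplified sketch of the pseudo code: 20 particles search for
// three integers that sum to TARGET.
public class MiniPso {
    static final int TARGET = 50, PARTICLES = 20, INPUTS = 3;
    static final int MAX_EPOCHS = 10000;
    static final double V_MAX = 10.0;
    static final Random RNG = new Random();

    public static int[] solve() {
        int[][] data = new int[PARTICLES][INPUTS];
        for (int[] p : data)
            for (int j = 0; j < INPUTS; j++)
                p[j] = 140 + RNG.nextInt(51);        // random start in [140, 190]

        for (int epoch = 0; epoch < MAX_EPOCHS; epoch++) {
            for (int[] p : data) {
                int error = TARGET - sum(p);
                if (error == 0) return p;             // target reached: stop
                for (int j = 0; j < INPUTS; j++) {
                    // velocity grows with distance from the target, clamped to V_MAX
                    double v = 2 * RNG.nextDouble() * error / INPUTS;
                    v = Math.max(-V_MAX, Math.min(V_MAX, v));
                    p[j] += (int) Math.round(v);
                }
            }
        }
        return data[0];                               // fallback if unconverged
    }

    static int sum(int[] p) { int s = 0; for (int x : p) s += x; return s; }

    public static void main(String[] args) {
        int[] best = solve();
        System.out.println(best[0] + " + " + best[1] + " + " + best[2]
                           + " = " + sum(best));
    }
}
```

The full example linked on the following slides adds the pBest/gBest attraction terms; this sketch only shows the overall epoch loop and velocity clamping.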
Particle Swarm Optimisation

• Java Code
• http://www.mnemstudio.org/ai/pso/pso_java_ex1.txt
Particle Swarm Optimisation

• C code
• // Filename: swarm1.h
• http://www.mnemstudio.org/ai/pso/pso_cpp_swarm1h_ex1.txt
• // Filename: swarm1.cpp
• http://www.mnemstudio.org/ai/pso/pso_cpp_swarm1cpp_ex1.txt
• // Filename: cParticle.h
• http://www.mnemstudio.org/ai/pso/pso_cpp_cparticleh_ex1.txt
• // Filename: cParticle.cpp
• http://www.mnemstudio.org/ai/pso/pso_cpp_cparticlecpp_ex1.txt
Particle Swarm Optimisation

• Applications [6]:
1. Antennas (SW antenna design)
2. Biomedical
3. Communication Networks
4. Clustering and Classification
5. Combinatorial optimisation
6. Control
7. Design
8. Distribution networks
9. Electronics and Electromagnetics
10. Engines and Motors
11. Entertainment (Music generation and games have also a small contribution of PSO applications)
12. Faults
13. Financial
14. Fuzzy and Neuro-fuzzy
15. Graphics and Visualisation
16. Image and Video
17. Metallurgy
18. Modelling
Particle Swarm Optimisation

• Applications:
19. Neural networks
20. Prediction and forecasting
21. Power systems and plants
22. Robotics (https://www.youtube.com/watch?v=VhRXst6DeYI)
23. Scheduling
24. Security and Military (missile effectiveness optimisation 'only simulation', plus cryptography and cryptanalysis)
25. Sensor Networks
26. Signal Processing
27. etc. [6]
28. And three specific famous applications are listed in detail in the following slides [1]
Particle Swarm Optimisation

• Applications:
– Simple Arithmetic A+B+C
• Simple Function Example 1
• This is a simple example where the algorithm
finds three numbers that add up to a target
value.
• A fully connected neighborhood is used, so all
particles can be compared to each other.
• This example's simplicity makes it very easy to
experiment with.
Particle Swarm Optimisation

• Applications:
– Simple Arithmetic A+B+C
• Simple Function Example 1
• Almost all variables can be modified.
• TARGET - the answer the algorithm is looking for.
• MAX_INPUTS - the number of operands in the expression.
• MAX_PARTICLES - number of particles employed in the test.
• V_MAX - maximum velocity change allowed.
• MAX_EPOCHS - number of iterations for the algorithm.
• START_RANGE_MIN - smallest random number to start operands at.
• START_RANGE_MAX - largest random number to start operands at.
Particle Swarm Optimisation

• Simple Arithmetic

• Java Code
– http://www.mnemstudio.org/ai/pso/pso_java_ex1.txt
Particle Swarm Optimisation

[Figure: structure of the Java example — main → Initialize & Define Variables → Define Data (Particle) → PSO Algorithm (as a class) → Print Sub-Results & Results]
Particle Swarm Optimisation
private static class Particle
{
    private int mData[] = new int[MAX_INPUTS];
    private int mpBest = 0;
    private double mVelocity = 0.0;

    public Particle()
    {
        this.mpBest = 0;
        this.mVelocity = 0.0;
    }

    public int data(int index)
    {
        return this.mData[index];
    }

    public void data(int index, int value)
    {
        this.mData[index] = value;
        return;
    }

    public int pBest()
    {
        return this.mpBest;
    }

    public void pBest(int value)
    {
        this.mpBest = value;
        return;
    }

    public double velocity()
    {
        return this.mVelocity;
    }

    public void velocity(double velocityScore)
    {
        this.mVelocity = velocityScore;
        return;
    }
} // Particle
Particle Swarm Optimisation
public static void main(String[] args)
{
    PSOAlgorithm();
    printSolution();
    return;
}

private static void printSolution()
{
    // Find the solution particle.
    int i = 0;
    for(; i < particles.size(); i++)
    {
        if(testProblem(i) == TARGET){
            break;
        }
    } // for

    // Print it.
    System.out.println("Particle " + i + " has achieved the target.");
    for(int j = 0; j < MAX_INPUTS; j++)
    {
        if(j < MAX_INPUTS - 1){
            System.out.print(particles.get(i).data(j) + " + ");
        }else{
            System.out.print(particles.get(i).data(j) + " = ");
        }
    } // j
    System.out.print("\n");
    return;
}
Particle Swarm Optimisation
private static void PSOAlgorithm()
{
    int gBest = 0;
    int gBestTest = 0;
    Particle aParticle = null;
    int epoch = 0;
    boolean done = false;

    initialize();

    while(!done)
    {
        // Two conditions can end this loop:
        // if the maximum number of epochs allowed has been reached, or,
        // if the Target value has been found.
        if(epoch < MAX_EPOCHS){
            for(int i = 0; i < MAX_PARTICLES; i++)
            {
                aParticle = particles.get(i);
                for(int j = 0; j < MAX_INPUTS; j++)
                {
                    if(j < MAX_INPUTS - 1){
                        System.out.print(aParticle.data(j) + " + ");
                    }else{
                        System.out.print(aParticle.data(j) + " = ");
                    }
                } // j
                System.out.print(testProblem(i) + "\n");
                if(testProblem(i) == TARGET){
                    done = true;
                }
            } // i

            gBestTest = minimum();
            aParticle = particles.get(gBest);
            // If any particle's pBest value is better than the gBest value, make it the new gBest.
            // (Slide annotation next to this comparison: "Does Not Feel the Error Resulting Here".)
            if(Math.abs(TARGET - testProblem(gBestTest)) < Math.abs(TARGET - testProblem(gBest))){
                gBest = gBestTest;
            }
            getVelocity(gBest);
            updateparticles(gBest);
            System.out.println("epoch number: " + epoch);
            epoch += 1;
        }else{
            done = true;
        }
    } // while
    return;
}
Particle Swarm Optimisation
/**
* Sources:
* Kennedy, J. and Eberhart, R. C. Particle swarm optimization.
* Proc. IEEE int'l conf. on neural networks Vol. IV, pp. 1942-1948.
* IEEE service center, Piscataway, NJ, 1995.
* PSO Tutorial found at: http://www.swarmintelligence.org/tutorials.php */
import java.util.ArrayList;
import java.util.Random;
public class Swarm_Ex1
{
private static final int TARGET = 50;
private static final int MAX_INPUTS = 3;
private static final int MAX_PARTICLES = 20; // the i, k is the random input for the process
private static final int V_MAX = 10; // Maximum velocity change allowed.
private static final int MAX_EPOCHS = 200;
// The particles will be initialized with data randomly chosen within the range
// of these starting min and max values:
private static final int START_RANGE_MIN = 140;
private static final int START_RANGE_MAX = 190;
Particle Swarm Optimisation
private static ArrayList<Particle> particles = new ArrayList<Particle>();

private static void initialize()
{
    // Note: the slide wrapped this loop in a second, redundant "for (int k ...)"
    // loop, which would create MAX_PARTICLES * MAX_PARTICLES particles;
    // a single loop is correct.
    for(int i = 0; i < MAX_PARTICLES; i++)
    {
        Particle newParticle = new Particle();
        int total = 0;
        for(int j = 0; j < MAX_INPUTS; j++)
        {
            newParticle.data(j, getRandomNumber(START_RANGE_MIN, START_RANGE_MAX));
            total += newParticle.data(j);
        } // j
        newParticle.pBest(total);
        particles.add(newParticle);
    } // i
    return;
}

private static int getRandomNumber(int low, int high)
{
    return (int)((high - low) * new Random().nextDouble() + low);
}
Particle Swarm Optimisation
private static void getVelocity(int gBestindex)
{
    // From Kennedy & Eberhart (1995):
    // vx[][] = vx[][] + 2 * rand() * (pbestx[][] - presentx[][]) +
    //          2 * rand() * (pbestx[][gbest] - presentx[][])
    int testResults = 0;
    int bestResults = 0;
    double vValue = 0.0;
    Particle aParticle = null;

    bestResults = testProblem(gBestindex);

    for(int i = 0; i < MAX_PARTICLES; i++)
    {
        testResults = testProblem(i);
        aParticle = particles.get(i);
        vValue = aParticle.velocity()
                 + 2 * new Random().nextDouble() * (aParticle.pBest() - testResults)
                 + 2 * new Random().nextDouble() * (bestResults - testResults);

        if(vValue > V_MAX){
            aParticle.velocity(V_MAX);
        }else if(vValue < -V_MAX){
            aParticle.velocity(-V_MAX);
        }else{
            aParticle.velocity(vValue);
        }
    } // i
    return;
}
Particle Swarm Optimisation
private static void updateparticles(int gBestindex)
{
    Particle gBParticle = particles.get(gBestindex);

    for(int i = 0; i < MAX_PARTICLES; i++)
    {
        for(int j = 0; j < MAX_INPUTS; j++)
        {
            if(particles.get(i).data(j) != gBParticle.data(j)){
                particles.get(i).data(j, particles.get(i).data(j) +
                    (int)Math.round(particles.get(i).velocity()));
            }
        } // j

        // Check pBest value.
        int total = testProblem(i);
        if(Math.abs(TARGET - total) < particles.get(i).pBest()){
            particles.get(i).pBest(total);
        }
    } // i
    return;
}
Particle Swarm Optimisation

private static int testProblem(int index)
{
    int total = 0;
    Particle aParticle = particles.get(index);

    for(int i = 0; i < MAX_INPUTS; i++)
    {
        total += aParticle.data(i);
    }
    return total;
}
Particle Swarm Optimisation

• Applications:
– Travelling Salesperson
Problem (TSP)
Particle Swarm Optimisation

• Applications:
– Travelling Salesperson
Problem
• This is a rendition of the classic
Traveling Salesman Problem, where
the shortest tour needs to be found
among all cities without visiting the
same one twice. The algorithm works out
the minimum Cartesian distance through eight cities
named 0, 1, 2, 3, 4, 5, 6 and 7. There are
8^8 (or 16,777,216) possible combinations,
Particle Swarm Optimisation

• Applications:
– Travelling Salesperson Problem
• but this algorithm can find it in fewer than 8^3 iterations, and sometimes fewer than 8^2!
• The cities' locations are (30, 5), (40, 10), (40, 20), (29, 25), (19, 25), (9, 19), (9, 9),
(20, 5). This is roughly a circle; something easy to see on a graph and check the
algorithm's solutions with. All manner of combinations using eight-digit strings of 0
through 7 are explored to find the one that represents the shortest distance (e.g.,
01234567).
• By hand, I've found the solution to be about 86.6299, which is the target value of
the algorithm.
• To simplify this example, there is no starting or ending point, and it doesn't matter
which direction the tour travels in (i.e.; forward or backward).  For instance, a
solution that looks like 34567012 is valid because it travels forward from 3 and
around to 2.  The solution 76543210 is equally valid, it just goes backward from 7
to 0.
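The target value 86.6299 can be checked directly by summing the Euclidean distances along the closed tour 0 → 1 → … → 7 → 0 (a small illustrative checker, not part of the linked example code):

```java
public class TourDistance {
    // City coordinates from the slides, indexed 0..7.
    static final double[][] CITIES = {
        {30, 5}, {40, 10}, {40, 20}, {29, 25},
        {19, 25}, {9, 19}, {9, 9}, {20, 5}
    };

    // Sum of Euclidean distances along the closed tour
    // (the last city connects back to the first).
    static double tourLength(int[] tour) {
        double total = 0.0;
        for (int i = 0; i < tour.length; i++) {
            double[] a = CITIES[tour[i]];
            double[] b = CITIES[tour[(i + 1) % tour.length]];
            total += Math.hypot(a[0] - b[0], a[1] - b[1]);
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.printf("0..7 tour length: %.4f%n",
                          tourLength(new int[]{0, 1, 2, 3, 4, 5, 6, 7}));
    }
}
```

Since the tour is treated as a closed loop, rotating the string (34567012) or reversing it (76543210) leaves this length unchanged, which matches the "no starting or ending point" remark above.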
Particle Swarm Optimisation

• Applications:
– Travelling Salesperson Problem
• I decided to take a slightly different take on the PSO algorithm with this
one. I added the global worst to the global variables. The velocity score is
calculated using the global worst, defining velocity as a measure of how
badly each particle is doing (as opposed to how well).
• Modifying data is done by swapping digits within each particle's own data
set. The amount of swapping depends on how badly it's doing (i.e., velocity).
• Particles are slowly pushed towards the global best by copying pieces of
the next-best particle's data (single-sighted topology).
• Swapping and copying are done to all particles except the global best.
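The swap-based mutation described above can be sketched as follows (an illustrative helper, not the linked code; more swaps correspond to a higher velocity, and any swap keeps the tour a valid permutation of the cities):

```java
import java.util.Random;

public class TourSwap {
    static final Random RNG = new Random();

    // Swap `count` randomly chosen pairs of positions in the tour.
    // Returns a new array; the input tour is left untouched.
    static int[] mutate(int[] tour, int count) {
        int[] copy = tour.clone();
        for (int s = 0; s < count; s++) {
            int a = RNG.nextInt(copy.length);
            int b = RNG.nextInt(copy.length);
            int tmp = copy[a];
            copy[a] = copy[b];
            copy[b] = tmp;
        }
        return copy;
    }
}
```

Because swapping only permutes existing entries, every mutated tour still visits each city exactly once, so no repair step is needed.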
Particle Swarm Optimisation

• Applications:
– Travelling Salesperson Problem

• In this example, only three variables can be experimented


with:
• PARTICLE_COUNT - number of particles employed in the test.
• V_MAX - maximum velocity change allowed.
• MAX_EPOCHS - number of iterations for the algorithm.
Particle Swarm Optimisation

• Travelling Salesperson Problem

• Java Code
– http://www.mnemstudio.org/ai/pso/pso_tsp_java_e
x1.txt
Particle Swarm Optimisation

• Applications:
– Pattern Search
Particle Swarm Optimisation

• Applications:
– Pattern Search
• Pattern Search Example
• This example is another variation on PSO which searches for a
specific pattern of letters; in this case, the word "bingo". From the
start, the algorithm doesn't know what letters it's searching for,
what order they're supposed to be in, or even how many letters are
in the word. To aid in its search, the algorithm uses the
Levenshtein measure of distance. The Levenshtein distance is
useful in finding the difference between sequences, and is often
used to measure the difference between words in various search
utilities.
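The Levenshtein distance mentioned above counts the minimum number of single-character insertions, deletions, and substitutions needed to turn one string into another. It can be computed with the standard dynamic-programming recurrence (an illustrative implementation; the linked example may differ in details):

```java
public class Levenshtein {
    // Classic DP table: d[i][j] = edit distance between the first i
    // characters of a and the first j characters of b.
    static int distance(String a, String b) {
        int[][] d = new int[a.length() + 1][b.length() + 1];
        for (int i = 0; i <= a.length(); i++) d[i][0] = i;   // deletions
        for (int j = 0; j <= b.length(); j++) d[0][j] = j;   // insertions
        for (int i = 1; i <= a.length(); i++) {
            for (int j = 1; j <= b.length(); j++) {
                int cost = (a.charAt(i - 1) == b.charAt(j - 1)) ? 0 : 1;
                d[i][j] = Math.min(Math.min(d[i - 1][j] + 1,      // delete
                                            d[i][j - 1] + 1),     // insert
                                   d[i - 1][j - 1] + cost);       // substitute
            }
        }
        return d[a.length()][b.length()];
    }

    public static void main(String[] args) {
        System.out.println(distance("bingo", "bongo")); // one substitution
    }
}
```

A particle's candidate string can be scored by its Levenshtein distance to the target word, giving the fitness value the algorithm minimizes.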
Particle Swarm Optimisation

• Applications:
– Pattern Search
• MAX_LENGTH - maximum number of letters to combine.
• MIN_LENGTH - minimum number of letters to combine.
• TARGET - the word to search for.
• CHAR_SET - characters available to search with.
• PARTICLE_COUNT - number of particles to use.
• V_MAX - maximum velocity allowed.
• V_MIN - minimum velocity allowed. This might seem counter-intuitive
to PSO, but it really does help.
• MAX_EPOCHS - maximum number of epochs to run algorithm.
Particle Swarm Optimisation

• Applications:
– Pattern Search
• Normally, PSO algorithms complete when the velocity
reaches zero, but in this case, it actually helps to keep it from
zero. This example gets stuck in local minima almost every
time with velocity at zero. By keeping it just above, the target
word is more likely to be found.
• The CHAR_SET in this example ranges from "a" to "z" for the
sake of simplicity, but more or fewer characters can easily be
included.
Particle Swarm Optimisation

• Pattern Search

• Java Code
– http://www.mnemstudio.org/ai/pso/pso_java_ex3.t
xt
Particle Swarm Optimisation

• Applications:
Every problem is specific & there is no general rule.
Particle Swarm Optimisation

• Conclusion
1. The algorithm was simplified, and it was observed to be performing
optimization [6]
i.e., it was not initially intended to perform optimization.
2. PSO can therefore also be used on optimization problems that are
partially irregular, noisy, change over time, etc. [*]:
–i.e. they are used for real time & data analysis & applications
3. Particle swarm optimisation (PSO) has been enormously successful.
Within little more than a decade hundreds of papers have reported
successful applications of PSO. [8]
4. Particle swarm optimization is a heuristic global optimization method
that is used in various real-life applications. [3]
Particle Swarm Optimisation

• Conclusion …
5. One such difficulty seen already was with simple uni-modal
functions (unimodality means possessing a unique mode. More
generally, unimodality means there is only a single highest value,
somehow defined, of some mathematical object.), where regrouping
is unnecessary since particles quickly and easily approximate
the global minimizer to a high degree of accuracy, and where
there is no better minimizer to be found.
Particle Swarm Optimisation

• Conclusion … (not very important)


6. A Modified Particle Swarm Optimizer (1998) works better, but it was
tested only on small benchmark functions [3]. (limitation)
7. The swarm and the queen: Towards deterministic and
adaptive particle swarm optimization by Clerc M. (1999) did
not make clear whether the optimal value depends on φ [3] (the social
or confidence coefficient [5]) (an advanced topic).
8. All three methods mentioned above (PSO, Modified PSO & The swarm
and the queen) did not give consistent results [3]. (limitation)
Particle Swarm Optimisation

• FUTURE SCOPE

• Future work will try to understand where the algorithm


suffers in order to understand any limitations and apply it to
the proper contexts.
Particle Swarm Optimisation

• References
[1] http://www.mnemstudio.org/particle-swarm-introduction.htm
[2] Particle Swarm Optimization mini tutorial.ppt
From website:
http://dsp.szu.edu.cn/DSP2006/research/areas/T08/papers/PSO%20mini%20tutorial.ppt
[3] Shailendra S. Aote, Dr. M. M. Raghuwanshi, Dr. Latesh Malik,
"A Brief Review on Particle Swarm Optimization: Limitations & Future
Directions", International Journal of Computer Science Engineering
(IJCSE), Vol. 2, No. 05, Sep 2013.
Particle Swarm Optimisation

• References …
• [4] Yuhui Shi and Russell Eberhart, “A Modified Particle
Swarm Optimizer”, IEEE 1998.
• [5] Clerc, M. (France Telecom, Annecy, France), "The swarm
and the queen: towards a deterministic and adaptive particle
swarm optimization", Proceedings of the 1999 Congress on
Evolutionary Computation (CEC 99), Volume 3.
– http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=785513

• [6] Wikipedia, which cites:


– Kennedy, J.; Eberhart, R.C. (2001). Swarm Intelligence.
Morgan Kaufmann. ISBN 1-55860-595-9.
Particle Swarm Optimisation

• References …
– Poli, R. (2008). "Analysis of the publications on the applications of
particle swarm optimisation". Journal of Artificial Evolution and
Applications 2008: 1–10. doi:10.1155/2008/685175.
• [7] a video introduced by Russ Eberhart found at:
https://www.youtube.com/watch?v=-QpKtk6o5VU
Particle Swarm Optimisation

• Any Questions?

• Either directly, or by
• Email: mnfsoft@gmail.com
Particle Swarm Optimisation

Thank You
