

Particle Swarm Optimization: Tutorial


A Chinese version is also available.
1. Introduction

Particle swarm optimization (PSO) is a population-based stochastic optimization technique developed by Dr. Eberhart and Dr. Kennedy in 1995, inspired by the social behavior of bird flocking or fish schooling.


PSO shares many similarities with evolutionary computation techniques such as Genetic Algorithms (GA). The system is initialized with a population of random solutions and searches for optima by updating generations. However, unlike GA, PSO has no evolution operators such as crossover and mutation. In PSO, the potential solutions, called particles, fly through the problem space by following the current optimum particles. Detailed information is given in the following sections.


Compared to GA, the advantages of PSO are that PSO is easy to implement and there are few parameters to adjust. PSO has been successfully applied in many areas: function optimization, artificial neural network training, fuzzy system control, and other areas where GA can be applied.


The remainder of this report comprises six sections:

Background: artificial life
The algorithm
Comparisons between genetic algorithm and PSO
Artificial neural network and PSO
PSO parameter control
Online resources of PSO
2. Background: Artificial life

The term "Artificial Life" (ALife) is used to describe research into human-made systems that possess some of the essential properties of life. ALife includes a twofold research topic (http://www.alife.org):

1. ALife studies how computational techniques can help when studying biological phenomena
2. ALife studies how biological techniques can help out with computational problems

The focus of this report is on the second topic. There are already many computational techniques inspired by biological systems. For example, the artificial neural network is a simplified model of the human brain; the genetic algorithm is inspired by human evolution.
Here we discuss another type of biological system: the social system, more specifically, the collective behaviors of simple individuals interacting with their environment and with each other. This is often called swarm intelligence. All of the simulations utilized local processes, such as those modeled by cellular automata, and might underlie the unpredictable group dynamics of social behavior.

Some popular examples are floys and boids. Both of these simulations were created to interpret the movement of organisms in a bird flock or fish school. Such simulations are normally used in computer animation or computer-aided design.
There are two popular swarm-inspired methods in computational intelligence areas: ant colony optimization (ACO) and particle swarm optimization (PSO). ACO was inspired by the behaviors of ants and has many successful applications in discrete optimization problems (http://iridia.ulb.ac.be/~mdorigo/ACO/ACO.html).
The particle swarm concept originated as a simulation of a simplified social system. The original intent was to graphically simulate the choreography of a bird flock or fish school. However, it was found that the particle swarm model can be used as an optimizer (http://www.engr.iupui.edu/~shi/Coference/psopap4.html).
3. The algorithm

As stated before, PSO simulates the behavior of bird flocking. Suppose the following scenario: a group of birds is randomly searching for food in an area. There is only one piece of food in the area being searched. None of the birds knows where the food is, but each bird knows how far the food is in each iteration. So what is the best strategy to find the food? The effective one is to follow the bird that is nearest to the food.
PSO learned from this scenario and uses it to solve optimization problems. In PSO, each single solution is a "bird" in the search space. We call it a "particle". All particles have fitness values, which are evaluated by the fitness function to be optimized, and have velocities, which direct the flying of the particles. The particles fly through the problem space by following the current optimum particles.
PSO is initialized with a group of random particles (solutions) and then searches for optima by updating generations. In every iteration, each particle is updated by following two "best" values. The first one is the best solution (fitness) it has achieved so far. (The fitness value is also stored.) This value is called pbest. Another "best" value that is tracked by the particle swarm optimizer is the best value obtained so far by any particle in the population. This best value is a global best and is called gbest. When a particle takes part of the population as its topological neighbors, the best value is a local best and is called lbest.
After finding the two best values, the particle updates its velocity and position with the following equations (a) and (b):

v[] = v[] + c1 * rand() * (pbest[] - present[]) + c2 * rand() * (gbest[] - present[])    (a)
present[] = present[] + v[]    (b)

v[] is the particle velocity and present[] is the current particle (solution). pbest[] and gbest[] are defined as stated before. rand() is a random number between (0, 1). c1 and c2 are learning factors; usually c1 = c2 = 2.
The pseudocode of the procedure is as follows:

For each particle
    Initialize particle
End

Do
    For each particle
        Calculate fitness value
        If the fitness value is better than the best fitness value (pBest) in history
            Set current value as the new pBest
        End
    End
    Choose the particle with the best fitness value of all the particles as the gBest
    For each particle
        Calculate particle velocity according to equation (a)
        Update particle position according to equation (b)
    End
While maximum iterations or minimum error criteria is not attained
Particles' velocities on each dimension are clamped to a maximum velocity Vmax, which is a parameter specified by the user. If the sum of accelerations would cause the velocity on a dimension to exceed Vmax, then the velocity on that dimension is limited to Vmax.
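To make the procedure concrete, here is a minimal Python sketch of the algorithm above. It is an illustrative reading of the tutorial, not the original author's code: the function name pso, the use of NumPy, the minimization convention, and the default swarm size and iteration count are assumptions; equations (a) and (b) and the Vmax clamping follow the text.

import numpy as np

def pso(fitness, dim, bounds, n_particles=30, max_iter=1000, c1=2.0, c2=2.0):
    # Minimal PSO sketch (minimization assumed); illustrative only.
    lo, hi = bounds
    vmax = hi - lo  # Vmax set to the range of the particle, as suggested in section 6
    # Initialize with a group of random particles (solutions)
    x = np.random.uniform(lo, hi, (n_particles, dim))
    v = np.random.uniform(-vmax, vmax, (n_particles, dim))
    pbest = x.copy()                               # best position each particle found so far
    pbest_fit = np.array([fitness(p) for p in x])  # the pBest fitness values
    gbest = pbest[np.argmin(pbest_fit)].copy()     # global best over the whole population
    for _ in range(max_iter):
        for i in range(n_particles):
            r1, r2 = np.random.rand(dim), np.random.rand(dim)
            # equation (a): velocity update
            v[i] = v[i] + c1 * r1 * (pbest[i] - x[i]) + c2 * r2 * (gbest - x[i])
            # clamp each dimension to the maximum velocity Vmax
            v[i] = np.clip(v[i], -vmax, vmax)
            # equation (b): position update
            x[i] = x[i] + v[i]
            f = fitness(x[i])
            if f < pbest_fit[i]:  # better than the best fitness value (pBest) in history
                pbest_fit[i], pbest[i] = f, x[i].copy()
        # choose the particle with the best fitness value of all the particles as the gBest
        gbest = pbest[np.argmin(pbest_fit)].copy()
    return gbest, pbest_fit.min()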
4. Comparisons between Genetic Algorithm and PSO

Most evolutionary techniques have the following procedure:

1. Random generation of an initial population
2. Reckoning of a fitness value for each subject, which will directly depend on the distance to the optimum
3. Reproduction of the population based on fitness values
4. If requirements are met, then stop. Otherwise go back to 2.

From this procedure, we can see that PSO shares many common points with GA. Both algorithms start with a randomly generated population, and both use fitness values to evaluate the population. Both update the population and search for the optimum with random techniques. Neither system guarantees success.
However, PSO does not have genetic operators like crossover and mutation. Particles update themselves with their internal velocity. They also have memory, which is important to the algorithm. Compared with genetic algorithms (GAs), the information sharing mechanism in PSO is significantly different. In GAs, chromosomes share information with each other, so the whole population moves as one group towards an optimal area. In PSO, only gBest (or lBest) gives out information to the others. It is a one-way information sharing mechanism; the evolution only looks for the best solution. Compared with GA, all the particles tend to converge to the best solution quickly, even in the local version, in most cases.
5. Artificial neural network and PSO

An artificial neural network (ANN) is an analysis paradigm that is a simple model of the brain, and the backpropagation algorithm is one of the most popular methods to train such a network. Recently there have been significant research efforts to apply evolutionary computation (EC) techniques for the purpose of evolving one or more aspects of artificial neural networks.

Evolutionary computation methodologies have been applied to three main attributes of neural networks: network connection weights, network architecture (network topology, transfer function), and network learning algorithms.

Most of the work involving the evolution of ANNs has focused on the network weights and the topological structure. Usually the weights and/or topological structure are encoded as a chromosome in GA. The selection of the fitness function depends on the research goals. For a classification problem, the rate of misclassified patterns can be viewed as the fitness value.

The advantage of EC is that it can be used in cases with non-differentiable PE transfer functions and no gradient information available. The disadvantages are: (1) the performance is not competitive on some problems; (2) the representation of the weights is difficult, and the genetic operators have to be carefully selected or developed.
Several papers in the past several years have reported using PSO to replace the backpropagation learning algorithm in ANNs. They showed that PSO is a promising method to train ANNs: it is faster and gets better results in most cases, and it also avoids some of the problems GA met.
Here we show a simple example of evolving an ANN with PSO. The problem is a benchmark classification problem: the iris data set. Measurements of four attributes of iris flowers are provided in each data set record: sepal length, sepal width, petal length, and petal width. Fifty sets of measurements are present for each of three varieties of iris flowers, for a total of 150 records, or patterns.
A 3-layer ANN is used to do the classification. There are 4 inputs and 3 outputs, so the input layer has 4 neurons and the output layer has 3 neurons. One can evolve the number of hidden neurons; however, for demonstration only, here we suppose the hidden layer has 6 neurons. We could evolve other parameters of the feed-forward network, but here we only evolve the network weights. So the particle will be a group of weights; there are 4*6 + 6*3 = 42 weights, so the particle consists of 42 real numbers. The range of weights can be set to [-100, 100] (this is just an example; in real cases, one might try different ranges). After encoding the particles, we need to determine the fitness function. For the classification problem, we feed all the patterns to the network whose weights are determined by the particle, get the outputs, and compare them with the desired outputs. We then record the number of misclassified patterns as the fitness value of that particle. Now we can apply PSO to train the ANN to get as low a number of misclassified patterns as possible. There are not many parameters in PSO that need to be adjusted; we only need to adjust the number of hidden neurons and the range of the weights to get better results in different trials.
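A sketch of such a fitness function is given below. The tutorial does not specify the transfer function or whether biases are evolved, so the sigmoid transfer function, the absence of biases, the decoding order of the 42 weights, and the names iris_fitness, patterns, and targets are all illustrative assumptions.

import numpy as np

def iris_fitness(particle, patterns, targets):
    # Decode the 42-number particle into a 4-6-3 feed-forward network
    # (no biases, sigmoid transfer function: both are assumptions).
    w1 = particle[:24].reshape(4, 6)    # 4*6 = 24 input-to-hidden weights
    w2 = particle[24:].reshape(6, 3)    # 6*3 = 18 hidden-to-output weights
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    hidden = sigmoid(patterns @ w1)     # feed all 150 patterns through the network
    outputs = sigmoid(hidden @ w2)
    predicted = outputs.argmax(axis=1)  # take the largest output as the predicted class
    # fitness value = number of misclassified patterns (lower is better)
    return int((predicted != targets).sum())

With patterns as a 150x4 array and targets as 150 class labels, training would then be a single call to the earlier sketch, such as pso(lambda p: iris_fitness(p, patterns, targets), dim=42, bounds=(-100.0, 100.0)).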
6. PSO parameter control

From the above case, we can learn that there are two key steps when applying PSO to optimization problems: the representation of the solution and the fitness function. One of the advantages of PSO is that PSO takes real numbers as particles. It is not like GA, which needs to change to binary encoding or needs special genetic operators. For example, to find the minimum of f(x) = x1^2 + x2^2 + x3^2, the particle can be set as (x1, x2, x3), and the fitness function is f(x). Then we can use the standard procedure to find the optimum. The search is a repeated process, and the stop criteria are that the maximum iteration number is reached or the minimum error condition is satisfied.
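With the illustrative pso sketch from section 3, this example can be written directly. The search range of [-10, 10] below is an arbitrary choice for demonstration; the tutorial does not fix one.

def f(x):
    # fitness function f(x) = x1^2 + x2^2 + x3^2 (minimum 0 at the origin)
    return x[0]**2 + x[1]**2 + x[2]**2

# the particle is (x1, x2, x3); the optimum found should approach (0, 0, 0)
best, best_fit = pso(f, dim=3, bounds=(-10.0, 10.0))
print(best, best_fit)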
Not many parameters need to be tuned in PSO. Here is a list of the parameters and their typical values.

The number of particles: the typical range is 20 to 40. Actually, for most problems 10 particles is large enough to get good results. For some difficult or special problems, one can try 100 or 200 particles as well.
Dimension of particles: it is determined by the problem to be optimized.

Range of particles: it is also determined by the problem to be optimized; you can specify different ranges for different dimensions of particles.
Vmax: it determines the maximum change one particle can take during one iteration. Usually we set the range of the particle as the Vmax; for example, for the particle (x1, x2, x3), if x1 belongs to [-10, 10], then Vmax = 20.
Learning factors: c1 and c2 are usually equal to 2. However, other settings were also used in different papers; usually c1 equals c2 and ranges over [0, 4].
The stop condition: the maximum number of iterations the PSO executes and the minimum error requirement. For example, for the ANN training in the previous section, we can set the minimum error requirement to one misclassified pattern and the maximum number of iterations to 2000. This stop condition depends on the problem to be optimized.
Global version vs. local version: we introduced two versions of PSO, the global and the local version. The global version is faster but might converge to a local optimum for some problems. The local version is a little bit slower but not as easily trapped in a local optimum. One can use the global version to get a quick result and use the local version to refine the search; a minimal sketch of one possible neighborhood scheme follows.
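The tutorial does not fix a particular neighborhood structure for the local version; a ring topology is one common choice. The hypothetical helper below, which assumes the arrays from the earlier pso sketch, shows how a per-particle lBest could take the place of gbest in the velocity update of equation (a).

def lbest(pbest, pbest_fit, i, k=2):
    # Ring topology (an assumption): particle i's neighbors are the k particles
    # on each side of it in index order, wrapping around the swarm.
    n = len(pbest)
    neighborhood = [(i + j) % n for j in range(-k, k + 1)]
    best = min(neighborhood, key=lambda j: pbest_fit[j])
    return pbest[best]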
Another factor is the inertia weight, which was introduced by Shi and Eberhart. If you are interested in it, please refer to their paper from 1998 (title: A modified particle swarm optimizer).
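For reference, that paper modifies equation (a) by multiplying the previous velocity by an inertia weight w, shown here in the same notation:

v[] = w * v[] + c1 * rand() * (pbest[] - present[]) + c2 * rand() * (gbest[] - present[])

A w that decreases over the run (for example, from about 0.9 down to 0.4) is a commonly used setting; equation (b) is unchanged.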
7. Online Resources of PSO

The development of PSO is still ongoing, and there are still many unknown areas in PSO research, such as the mathematical validation of particle swarm theory. One can find much information on the internet. Following is some information you can get online:

http://www.particleswarm.net: lots of information about particle swarms and, particularly, particle swarm optimization, with lots of particle swarm links.
http://icdweb.cc.purdue.edu/~hux/PSO.shtml: lists an updated bibliography of particle swarm optimization and some online paper links.
http://www.researchindex.com: you can search for particle swarm related papers and references.
References:

http://www.engr.iupui.edu/~eberhart/
http://users.erols.com/cathyk/jimk.html
http://www.alife.org
http://www.aridolan.com
http://www.red3d.com/cwr/boids/
http://iridia.ulb.ac.be/~mdorigo/ACO/ACO.html
http://www.engr.iupui.edu/~shi/Coference/psopap4.html

Kennedy, J. and Eberhart, R. C. Particle swarm optimization. Proc. IEEE Int'l Conf. on Neural Networks, Vol. IV, pp. 1942-1948. IEEE Service Center, Piscataway, NJ, 1995.
Eberhart, R. C. and Kennedy, J. A new optimizer using particle swarm theory. Proceedings of the Sixth International Symposium on Micro Machine and Human Science, pp. 39-43. IEEE Service Center, Piscataway, NJ, Nagoya, Japan, 1995.
Eberhart, R. C. and Shi, Y. Particle swarm optimization: developments, applications and resources. Proc. Congress on Evolutionary Computation 2001. IEEE Service Center, Piscataway, NJ, Seoul, Korea, 2001.
Eberhart, R. C. and Shi, Y. Evolving artificial neural networks. Proc. 1998 Int'l Conf. on Neural Networks and Brain, pp. PL5-PL13. Beijing, P.R. China, 1998.
Eberhart, R. C. and Shi, Y. Comparison between genetic algorithms and particle swarm optimization. Evolutionary Programming VII: Proc. 7th Ann. Conf. on Evolutionary Programming. Springer-Verlag, Berlin, San Diego, CA, 1998.
Shi, Y. and Eberhart, R. C. Parameter selection in particle swarm optimization. Evolutionary Programming VII: Proc. EP98, pp. 591-600. Springer-Verlag, New York, 1998.
Shi, Y. and Eberhart, R. C. A modified particle swarm optimizer. Proceedings of the IEEE International Conference on Evolutionary Computation, pp. 69-73. IEEE Press, Piscataway, NJ, 1998.

All rights reserved. Xiaohui Hu 2006
