Christian Jacob, University of Calgary — Emergent Computing — CPSC 565 — Winter 2003
Particle Swarm Optimization (PSO)
Adaptive Swarms for Optimization
Christian Jacob
Department of Computer Science
University of Calgary
CPSC 565 — Winter 2003
PSO: An Overview
• Developed by
  – Russ Eberhart, Purdue School of Engineering and Technology, Indianapolis
  – Jim Kennedy, Bureau of Labor Statistics, Washington, DC
• A concept for optimizing non-linear functions using particle swarm methodology
• Has roots in Artificial Life and Evolutionary Computation
• Simple concept
• Easy to implement
• Computationally efficient
• Effective on a wide variety of problems
Evolution of Concept and Paradigms
• Discovered through simplified social model simulation
• Related to bird flocking, fish schooling, and swarming theory
• Related to evolutionary computation:
  – Genetic algorithms
  – Evolution strategies
• Kennedy developed the “cornfield vector” for birds seeking food
• Bird flock became swarm
• Expanded to multi-dimensional search
• Incorporated acceleration by distance
• Paradigm simplified
Flocks, Herds, and Schools
• Avoid collisions
• Match neighbours’ velocity and orientation
• Steer toward the center
Separation Alignment Cohesion
PSO Algorithm
1. Initialize population in hyperspace.
   • Stochastically assign locations and velocities.
2. Evaluate fitness of individual particles.
3. Keep track of the location where each individual had its highest fitness.
4. Modify velocities based on previous best and global (or neighbourhood) best positions.
   • Neighbourhoods don’t change.
5. Terminate if some condition is met.
6. Go to step 2.
Fly solutions through problem space …
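The six steps above can be sketched in Python. This is a minimal illustration of the scheme, not original PSO code from the authors; the sphere test function, the parameter values (w, c1, c2, V_max), and the swarm size are assumptions made for the sake of the example. It uses the global-best topology.

```python
import random

def pso(f, dim=2, n_particles=20, iters=200, vmax=1.0, w=0.7, c1=1.5, c2=1.5):
    # Step 1: stochastically assign locations and velocities in hyperspace.
    xs = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[random.uniform(-vmax, vmax) for _ in range(dim)] for _ in range(n_particles)]
    pbest = [x[:] for x in xs]            # step 3: best location seen per particle
    pbest_f = [f(x) for x in xs]          # step 2: fitness of each particle
    g = pbest_f.index(min(pbest_f))
    gbest, gbest_f = pbest[g][:], pbest_f[g]

    for _ in range(iters):                # step 6: loop back to step 2
        for i in range(n_particles):
            for d in range(dim):
                # Step 4: modify velocity toward personal best and global best.
                r1, r2 = random.random(), random.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                vs[i][d] = max(-vmax, min(vmax, vs[i][d]))  # clamp to V_max
                xs[i][d] += vs[i][d]      # fly the particle through problem space
            fx = f(xs[i])                 # step 2: evaluate fitness
            if fx < pbest_f[i]:           # step 3: track personal best
                pbest[i], pbest_f[i] = xs[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = xs[i][:], fx
    return gbest, gbest_f                 # step 5: terminate after a fixed budget

# Minimizing the sphere function f(x) = sum of x_d^2; the optimum is the origin.
random.seed(42)
best, best_f = pso(lambda x: sum(xd * xd for xd in x))
```

With these (assumed) settings the swarm contracts onto the origin well within the iteration budget.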
Particle Swarm Optimization
DEMO
Particle Swarms
• Sociocognitive space
  – High-dimensional
  – Abstract: attitudes, behaviours, cognition
  – Heterogeneous with respect to evaluation (dissonance)
  – Multiple individuals
• An individual is characterized by
  – Position = “mental state”: x_i
  – Changes = velocity: v_i
Particle Swarms: “Code”
• Individuals (particles) learn from their own experience:
  – v_i := v_i + j() · (p_i − x_i)
  – x_i := x_i + v_i
• x_i: current position of individual i
• v_i: current velocity of individual i
• p_i: so-far best position of individual i
• (p_i − x_i): acceleration towards the previous best
• j(): generates a random positive number
• This formula, iterated over time, causes the individual’s trajectory to oscillate around its previous best point p_i in sociocognitive space.
• The velocity of individual i is stochastically adjusted depending on previous successes.
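A one-dimensional sketch of this “cognition-only” update, assuming j() draws uniformly from [0, 1) (the slides only say it returns a random positive number):

```python
import random

random.seed(0)

def cognition_only_step(x, v, p):
    # v_i := v_i + j() * (p_i - x_i), then x_i := x_i + v_i
    v = v + random.random() * (p - x)
    x = x + v
    return x, v

# Start below the previous best p = 3.0: the particle accelerates toward p,
# overshoots it, and then oscillates around it rather than settling there.
x, v, p = 0.0, 0.0, 3.0
traj = []
for _ in range(200):
    x, v = cognition_only_step(x, v, p)
    traj.append(x)
```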
Particle Swarms with Neighbourhood Best
• Sociocognitive space can contain many individuals that influence one another:
  – v_i := v_i + j_1() · (p_i − x_i) + j_2() · (p_g − x_i)
  – x_i := x_i + v_i
• p_g: previous best position in the population
• (p_i − x_i): acceleration towards the previous personal best
• (p_g − x_i): acceleration towards the global best
• j_1(), j_2(): generate random positive numbers
• Evaluate your present position.
• Compare it to your previous best and the neighbourhood best.
• Imitate self and others.
General PSO Update Algorithm
• Global version:
  – v_id := w v_id + c_1 j_1() · (p_id − x_id) + c_2 j_2() · (p_gd − x_id)
  – x_id := x_id + v_id
• d: dimension index
• c_1, c_2: positive constants that set exploration vs. exploitation
• w: inertia weight
• j_1(), j_2(): generate random positive numbers
• For the neighbourhood version:
  – Change p_gd to p_ld.
The “Drunkard’s Walk”
• The particle will explode out of control if it is not limited in some way. Three methods are widely used:
• V_max:
  v_i := v_i + j_1() · (p_i − x_i) + j_2() · (p_g − x_i)
  if v_i > V_max then v_i := V_max
  else if v_i < −V_max then v_i := −V_max
• Inertia weight a:
  v_i := a v_i + j_1() · (p_i − x_i) + j_2() · (p_g − x_i)
• Constriction coefficient c:
  v_i := c (v_i + j_1() · (p_i − x_i) + j_2() · (p_g − x_i))
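The three limiting methods can be written side by side as one-dimensional update steps. This is a sketch: j_1() and j_2() are assumed uniform on [0, 1), and the default values a = 0.7 and c = 0.729 are common choices from the PSO literature, not values given on the slides.

```python
import random

def step_vmax(x, v, p, g, vmax):
    # Plain update, then clamp the velocity into [-Vmax, Vmax].
    v = v + random.random() * (p - x) + random.random() * (g - x)
    v = max(-vmax, min(vmax, v))
    return x + v, v

def step_inertia(x, v, p, g, a=0.7):
    # Inertia weight a scales down the carried-over velocity.
    v = a * v + random.random() * (p - x) + random.random() * (g - x)
    return x + v, v

def step_constriction(x, v, p, g, c=0.729):
    # Constriction coefficient c scales the entire velocity update.
    v = c * (v + random.random() * (p - x) + random.random() * (g - x))
    return x + v, v

# With a large incoming velocity, the Vmax method clips it immediately:
x1, v1 = step_vmax(0.0, 5.0, 1.0, 2.0, vmax=1.0)
```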
Important Parameters: V_max
• An important parameter in PSO; typically the only one adjusted
• Clamps particles’ velocities in each dimension
• Determines the fineness with which regions are searched:
  – If too high, particles can fly past optimal solutions
  – If too low, particles can get stuck in local minima
• Set V_max to the dynamic range of the variables.
Important Parameters: Inertia Weight a
• Inertia weight a:
  v_i := a v_i + j_1() · (p_i − x_i) + j_2() · (p_g − x_i)
• It seems possible to get rid of V_max by setting a equal to the dynamic range of each variable.
• Then a must be selected carefully and/or decreased over the run.
• Hence, the inertia weight a seems to have attributes of the temperature in simulated annealing.
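A common way to realize this annealing-like behaviour is to decrease a linearly over the run. The start and end values 0.9 and 0.4 below are typical choices from the PSO literature, not values from the slides:

```python
def inertia(t, t_max, a_start=0.9, a_end=0.4):
    # Linear decrease of the inertia weight a from a_start to a_end,
    # playing a role similar to the temperature in simulated annealing:
    # early iterations explore (high a), later iterations exploit (low a).
    return a_start - (a_start - a_end) * t / t_max

# Inertia weight at each step of a 100-iteration run.
schedule = [inertia(t, 100) for t in range(101)]
```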
Evolutionary Computation & Particle Swarms
• Culture as evolution (anthropology)
• Adaptation / learning
• Memetics
• Evolutionary epistemology
• Change vs. selection
• Fitness and dissonance
• Cooperation vs. competition
• Evolution = competitive struggle
• PS = cooperation inherent
• PS = 5th EC paradigm?
Basic Principles of Swarm Intelligence
• Proximity principle:
  – The population should be able to carry out simple space and time computations.
• Quality principle:
  – The population should be able to respond to quality factors in the environment.
• Diverse response principle:
  – The population should not commit its activities along excessively narrow channels.
• Stability principle:
  – The population should not change its mode of behaviour every time the environment changes.
• Adaptability principle:
  – The population must be able to change its behaviour mode when it’s worth the computational price.
Adherence to Swarm Intelligence Principles
• Proximity:
  – N-dimensional space calculations carried out over a series of time steps
• Quality:
  – The population responds to the quality factors pbest and gbest (or lbest)
• Diverse response:
  – Responses are allocated between pbest and gbest (or lbest)
• Stability:
  – The population changes state only when gbest (or lbest) changes
• Adaptability:
  – The population does change state when gbest (or lbest) changes
Enhancements of PSO
• The elitist concept from GAs might be helpful in PSO.
  – Carry the global best particle into the next generation?
• A Gaussian distribution could be incorporated into the stochastic velocity changes.
  – The variance might then act like the inertia weight.
  – Put noise on the decrease of the inertia weights (better convergence).
• V_max could be assigned on a parameter-by-parameter basis.
  – Analogous to controlling the severity of mutation in GAs & EP.
GAs vs. PSO: Crossover
• PSO does not have crossover.
• Acceleration toward personal and global bests is a similar concept.
• Particles midway between swarms also exhibit crossover-like features.
• The recombination operator in evolution strategies may be more analogous.
GAs vs. PSO: Mutation
• GAs are not actually ergodic:
  – A number of mutations are probably required.
  – Low-fitness individuals will not survive selection.
  – The probability of survival decreases geometrically with generations.
• EP (for parameter optimization) is ergodic: it can reach any point in one jump.
• PSO seems to fall between GA and EP:
  – Any particle can eventually go anywhere.
• PSO’s mutation-like behaviour is directional:
  – GA and EP mutation are omni-directional.
GAs vs. PSO: Selection
• GA selection supports survival of the fittest (when using an elitist strategy).
• There is no selection in PSO:
  – All particles survive for the length of the run.
  – The number of particles does not change.
• PSO is the only “evolutionary algorithm” that does not remove candidate population members.
PSO as an Evolutionary Algorithm
• Distinctions among EC paradigms continue to blur.
• New hybrid PSO approaches will be emphasized.
• Practical PSO applications will be emphasized, in addition to benchmarking.
• Focus on how the PSO paradigms work.
• Many more (PSO) hybrids to come …
PSO Example
References
• Kennedy, J. and R. C. Eberhart (2001). Swarm Intelligence. San Francisco: Morgan Kaufmann Publishers.
• Kennedy, J. and R. C. Eberhart (2002). Tutorial on Particle Swarm Optimization. 2002 World Congress on Computational Intelligence, Hawaii, USA.