TRANSCRIPT

Particle Swarm Optimisation: A mini tutorial
2004-12-15, [email protected]
Part 1: United we stand
Cooperation example
Memory and informers
"It is thanks to these eccentrics, whose behaviour does not conform to that of the other bees, that all fruit sources around the colony are found so quickly." - Karl von Frisch, 1927
DPNP
Initialisation. Positions and velocities
Psychosocial compromise
[Figure: a particle at position x with velocity v ("Here I am!") makes a compromise between its own best performance p ("My best perf.") and the best performance of its neighbours g ("The best perf. of my neighbours"); the i-proximity is around p, the g-proximity around g.]
The historical algorithm
At each time step t, for each particle, update the velocity and then move. For each component d (the random numbers are drawn inside the loop):

v_d(t+1) = α v_d(t) + rand(0, β ϕ1) (p_d − x_d(t)) + rand(0, β ϕ2) (g_d − x_d(t))
x(t+1) = x(t) + v(t+1)
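The update above can be sketched in C. The function and helper names (pso_step, rand0) and the parameter layout are illustrative, not taken from the slides:

```c
#include <stdlib.h>

#define D 2  /* number of components */

/* Uniform random number in [0, max): a stand-in for the slide's rand(0, .) */
static double rand0(double max) {
    return max * ((double)rand() / ((double)RAND_MAX + 1.0));
}

/* One time step of the historical update for a single particle.
   p: the particle's best position, g: the best among its neighbours. */
void pso_step(double x[D], double v[D], const double p[D], const double g[D],
              double alpha, double beta, double phi1, double phi2) {
    for (int d = 0; d < D; d++) {
        /* randomness inside the loop: fresh draws for every component */
        v[d] = alpha * v[d]
             + rand0(beta * phi1) * (p[d] - x[d])
             + rand0(beta * phi2) * (g[d] - x[d]);
        x[d] += v[d];  /* then move */
    }
}
```

Note that setting ϕ1 = ϕ2 = 0 makes the step deterministic (pure inertia), which is handy for sanity checks.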
The circular neighbourhood
[Figure: eight particles (1 to 8) arranged on a virtual circle. Particle 1's 3-neighbourhood is itself plus its two neighbours on the circle, particles 2 and 8.]
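A minimal sketch of the circular neighbourhood in C; the function name and the 0-based indexing are my choices, not from the slides:

```c
/* Fill out[] with the K indices of particle i's neighbourhood on the
   virtual circle: the particle itself plus (K-1)/2 neighbours on each
   side, wrapping around. 0-based indices; K is assumed odd. */
void ring_neighbourhood(int i, int K, int swarm_size, int out[]) {
    int half = (K - 1) / 2;
    for (int k = 0; k < K; k++)
        out[k] = (i - half + k + swarm_size) % swarm_size;
}
```

With swarm_size = 8 and K = 3, particle 0 (particle 1 in the slide's 1-based numbering) gets {7, 0, 1}, i.e. particles 8, 1 and 2.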
Random proximity
[Figure: around the best positions p and g of a particle at x with velocity v, the i-proximity and g-proximity are sampled uniformly in hyperparallelepipeds. The resulting DPNP is biased: it is shaped like a Mayan pyramid.]
Maths and parameters

[Figure: three arrows labelled "The right way", "This way", "Or this way".]
Neighbourhoods (topologies)
[Figure: a "small world" graph of informers, or another topology?]
Type 1'' form

ϕ = rand(0, ϕ1) + rand(0, ϕ2) = ϕ'1 + ϕ'2

q = (ϕ'1 p + ϕ'2 g) / (ϕ'1 + ϕ'2)

v(t+1) = χ (v(t) + ϕ (q − x(t)))
x(t+1) = x(t) + v(t+1)

with the global constriction coefficient χ given by the non-divergence criterion:

χ = 2κ / (ϕ − 2 + √(ϕ² − 4ϕ))  for ϕ > 4
χ = κ                           else

Usual values: κ = 1, ϕ = 4.1 ⇒ χ ≈ 0.73; swarm size = 20, neighbourhood size = 3.
5D complex space
[Figure: a 3D section (ϕ, Re(v), Re(y)) of the 5D complex space, showing the convergence region and the non-divergence region.]
Move in a 2D section (attractor)
[Figure: a particle's move in a 2D section, plotted in the (Re(v), Im(v)) plane: the trajectory spirals towards an attractor.]
Beyond real numbers
[Figure: small integer grids illustrating a search beyond real numbers; the swarm still finds the solution ("Bingo!").]
Minimum requirements

Comparing positions in the search space H:

∀(x, x') ∈ H × H, f(x) < f(x') ∨ f(x) ≥ f(x')

Algebraic operators:

coefficient ⊗ velocity → velocity
velocity ⊕ velocity → velocity
position Θ position → velocity
position ⊕ velocity → position
Pseudo-code form, using the algebraic operators:

velocity = pos_minus_pos(position1, position2)
velocity = linear_combin(α, velocity1, β, velocity2)
position = pos_plus_vel(position, velocity)
(position, velocity) = confinement(position(t+1), position(t))
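For real-valued vectors the algebraic operators reduce to component-wise arithmetic. A minimal sketch; the dimension and the exact signatures are my choices, only the names follow the pseudo-code:

```c
#define DIM 2

/* velocity = position1 - position2, component by component */
void pos_minus_pos(const double p1[DIM], const double p2[DIM], double vel[DIM]) {
    for (int d = 0; d < DIM; d++) vel[d] = p1[d] - p2[d];
}

/* velocity = a*velocity1 + b*velocity2 */
void linear_combin(double a, const double v1[DIM],
                   double b, const double v2[DIM], double out[DIM]) {
    for (int d = 0; d < DIM; d++) out[d] = a * v1[d] + b * v2[d];
}

/* position = position + velocity (the move) */
void pos_plus_vel(double pos[DIM], const double vel[DIM]) {
    for (int d = 0; d < DIM; d++) pos[d] += vel[d];
}
```

The point of keeping these four operators abstract is that the same algorithm then works on non-real search spaces (permutations, graphs, ...) once the operators are redefined.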
Confinements
Frontiers (e.g. an interval) and granularity: a particle that leaves the search space is brought back to the frontier, and positions may be snapped to a grid.
End of Part 1
Part 2: When the algorithm mutates
The PSO Family
[Figure: timeline 1995-1998-2000-2002-2004 of the PSO family: Empirical, Canonical (OEP 0, 1, ...), Constraints, Multi-objective, Weighted, Discrete, Combinatorial, Hybrid, Adaptive, Multi-swarm, Spherical and Gaussian proximities, Pivot, TRIBES.]
Unbiased random proximity

[Figure: volume as a function of dimension (1 to 29) for the hypersphere vs the hypercube: the hypersphere's relative volume vanishes as the dimension grows. Sampling the i-proximity and g-proximity of p and g in hyperparallelepipeds gives a biased DPNP; sampling in hyperspheres removes the bias.]
Adaptive coefficients
The better I am, the more I follow my own way.
The better my best neighbour is, the more I tend to go towards it.

α v + rand(0...b)(p − x), with α and b adapted by crisp or fuzzy rules.
Adaptive swarm size
I'm the best, but there has not been enough improvement: I try to generate a new particle.
There has been enough improvement, although I'm the worst: I try to remove myself ("kill myself").

Crisp or fuzzy rules.
TRIBES and strategies
TRIBES = adaptive information links + adaptive proximity distributions (DPNP).
Energies: classical process
Rosenbrock 2D, swarm size = 20, constant coefficients.

[Figure: potential energy, kinetic energy, and swarm size as functions of the number of evaluations (0 to 1.0E+04); ε = 0.00001 is reached after 2240 evaluations.]
Energies: adaptive process
Rosenbrock 2D, adaptive swarm size, adaptive coefficients.

[Figure: potential energy, kinetic energy, and swarm size as functions of the number of evaluations (0 to 1.0E+04); ε = 0.00001 is reached after 1414 evaluations.]
The simplest PSO: random informers, pivot method

[Figure: eight particles on a virtual circle; each particle is informed by K = 3 particles chosen at random, and the next position is drawn in the g-proximity of the pivot g.]
End of Part 2
Part 3: Story of Optimisation
Classical results
30D function     | PSO Type 1'' | Evolutionary algo. (Angeline 98)
Griewank [±300]  | 0.003944     | 0.4033
Rastrigin [±5]   | 82.95618     | 46.4689
Rosenbrock [±10] | 50.193877    | 1610.359

Optimum = 0, dimension = 30. Best result after 40 000 evaluations.
Some small problems
Fifty-fifty

x_i ∈ {1...N}, i ≠ j ⇒ x_i ≠ x_j
Σ_{i=1}^{D/2} x_i = Σ_{i=D/2+1}^{D} x_i

N = 100, D = 20. Search space: [1,N]^D, granularity = 1.

105 evaluations:
63+90+16+54+71+20+23+60+38+15 = 12+48+13+51+36+42+86+26+57+79 (= 450)
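A natural fitness for this problem is the gap between the two half-sums (the all-different constraint would be handled separately); the solution quoted above does sum to 450 on each side. A sketch, with names of my choosing:

```c
#include <stdlib.h>

/* |sum of first D/2 components - sum of last D/2 components|;
   zero for a fifty-fifty solution. */
long fifty_fifty(const int x[], int D) {
    long s1 = 0, s2 = 0;
    for (int i = 0; i < D / 2; i++) s1 += x[i];
    for (int i = D / 2; i < D; i++) s2 += x[i];
    return labs(s1 - s2);
}
```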
Knapsack
x_i ∈ {1...N}, i ≠ j ⇒ x_i ≠ x_j
Σ_{i=1}^{D} x_i = S

N = 100, D = 10, S = 100, granularity = 1. 870 evaluations:
run 1 ⇒ (9, 14, 18, 1, 16, 5, 6, 2, 12, 17)
run 2 ⇒ (29, 3, 16, 4, 1, 2, 6, 8, 26, 5)
Apple trees

[Figure: three apple trees bearing n1, n2, n3 apples.]

f(n1, n2, n3) = (n1 − n2)² + (n2 − n3)²

Swarm size = 3. Best position:

Evaluation | n1 | n2 | n3
0          | 1  | 2  | 3
7          | 7  | 6  | 3
11         | 6  | 6  | 4
103        | 0  | 1  | 7
Graph Coloring Problem
[Figure: a five-node graph to be coloured. A position assigns a colour (an integer) to each node, a velocity is a vector of colour changes, and pos-plus-vel adds them component by component.]
The Tireless Traveller
Example of position: X = (5,3,4,1,2,6). Example of velocity: v = ((5,3),(2,5),(3,1)), a list of transpositions.
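Assuming each pair in the velocity denotes a transposition of (1-based) positions in the tour, which is one common reading of this combinatorial representation, applying a velocity to a position can be sketched as follows (names are mine):

```c
#define NCITY 6

/* Apply a velocity, i.e. a list of transpositions (i, j) with 1-based
   indices, to a position (a permutation of the cities). */
void apply_velocity(int x[NCITY], const int vel[][2], int nb) {
    for (int s = 0; s < nb; s++) {
        int i = vel[s][0] - 1, j = vel[s][1] - 1;
        int tmp = x[i]; x[i] = x[j]; x[j] = tmp;
    }
}
```

A velocity is thus "added" to a position by performing its swaps in order; the difference of two permutations can likewise be expressed as such a list.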
Ecological niche
Non-linear, multiobjective, fuzzy data, heterogeneous, dynamic, real time.

[Figure: the niche moves over time steps t1 to t7.]
End of Part 3
Part 4: Real applications
Medical diagnosis Industrial mixer
Electrical generator Electrical vehicle
Applications (1)
Salerno, J., "Using the particle swarm optimization technique to train a recurrent neural model", IEEE International Conference on Tools with Artificial Intelligence, p. 45-49, 1997.
He Z., Wei C., Yang L., Gao X., Yao S., Eberhart R. C., Shi Y., "Extracting Rules from Fuzzy Neural Network by PSO", IEEE IEC, Anchorage, Alaska, USA, 1998.
Secrest B. R., Traveling Salesman Problem for Surveillance Mission using PSO, AFIT/GCE/ENG/01M-03, Air Force Institute of Technology, 2001.
Yoshida H., Kawata K., Fukuyama Y., "A PSO for Reactive Power and Voltage Control considering Voltage Security Assessment", IEEE TPS, vol. 15, 2001, p. 1232-1239.
Krohling, R. A., Knidel, H., and Shi, Y. Solving numerical equations of hydraulic problems using PSO. Proceedings of the IEEE CEC, Honolulu, Hawaii USA. 2002.
Applications (2)

Kadrovach, B. A., and Lamont, G., "A particle swarm model for swarm-based networked sensor systems", ACM Symposium on Applied Computing, Madrid, Spain, p. 918-924, 2002.
Omran, M., Salman, A., and Engelbrecht, A. P. Image classification using PSO. Proceedings of the 4th Asia-Pacific Conference on Simulated Evolution and Learning 2002 (SEAL 2002), Singapore. p. 370-374, 2002.
Coello Coello, C. A., Luna, E. H., and Aguirre, A. H., "Use of PSO to design combinational logic circuits", LNCS No. 2606, p. 398-409, 2003.
Onwubolu, G. C. and Clerc, M., "Optimal path for automated drilling operations by a new heuristic approach using particle swarm optimization", International Journal of Production Research, vol. 4, p. 473-491, 2004.
Onwubolu, G. C., "TRIBES application to the flowshop scheduling problem", New Optimization Techniques in Engineering, Heidelberg, Germany, Springer, p. 517-536, 2004.
Neural network

[Figure: a network mapping test inputs E_i = (e_{i,1}, ..., e_{i,N}) through transfer functions to real outputs S'_i(t) = (s'_{i,1}(t), ..., s'_{i,P}(t)); S_i is the wanted output.]

E = (E_1, ..., E_{nb_tests})
S = (S_1, ..., S_{nb_tests})
X(t) = (x_{k,m}(t))
S'(t) = (S'_1(t), ..., S'_{nb_tests}(t))

Function to minimise: f(X) = ||S − S'||

Transfer functions: τ(entry_{k,m}, x_{k,m}) = 1 / (1 + e^{x_{k,m} entry_{k,m}})
To know more
THE site: Particle Swarm Central, http://www.particleswarm.info

Clerc, M., Kennedy, J., "The Particle Swarm - Explosion, Stability, and Convergence in a Multidimensional Complex Space", IEEE Transactions on Evolutionary Computation, vol. 6, p. 58-73, 2002. (2005 IEEE TEC award)

Self advert: Kennedy, J., Eberhart, R., et al. (2001). Swarm Intelligence, Morgan Kaufmann Academic Press.
More self-advertising

Clerc, M., "L'optimisation par essaim particulaire. Principes et pratique", Hermès, Techniques et Science de l'Informatique, 2002. (A 25-page article.)

If you read French: Clerc, M., "L'optimisation par essaims particulaires", http://www.editions-hermes.fr/fr/. Published February 2005.

My PSO site: http://clerc.maurice.free.fr/pso/index.htm
PSO in the world
eXtended Particle Swarms (XPS) project
Some open questions
New mathematical ideas to model particle interactions
Metamodels
Better combinatorial approaches
Adaptive weighted relationships
Beat the swarm!
Your current position
Your best perf.
Best perf. of the swarm
APPENDIX
Canonical form
v(t+1) = v(t) + ϕ (q − x(t))
x(t+1) = x(t) + v(t+1)

With y(t) = q − x(t), the generalised system is

v(t+1) = α v(t) + β ϕ y(t)
y(t+1) = −γ v(t) + (δ − η ϕ) y(t)

i.e. a matrix M with rows (α, βϕ) and (−γ, δ − ηϕ). In the basic case:

[v(t+1)]   [ 1     ϕ  ] [v(t)]
[y(t+1)] = [−1   1 − ϕ] [y(t)]

Eigenvalues e1 and e2.
Constriction
Constriction coefficients: χ1 and χ2 are explicit functions of α, β, γ, δ, η and ϕ, both involving the discriminant √(ϕ² − 4ϕ); the full expressions are derived in Clerc & Kennedy 2002.
Convergence criterion
[Figure: criterion curves as functions of ϕ ∈ [1, 6].]

{ |χ1 e1| < 1 and |χ2 e2| < 1 } ⇐ { |χ1| < 1 and |χ2 e2| < 1 }, with κ = χ2 e2.
Robustness
f(x1, x2) = 100 (x1² − x2)² + (1 − x1)²

Performance map: number of needed iterations as a function of (κ, ϕ).

[Figure: surface Iter(κ, ϕ).]
Clusters and queens
Each particle is weighted by its perf.
Dynamic clustering
Centroids = queens = temporary new “particles”
Magic Square (1)
The position is a D×D matrix

m = ( m_{1,1} ... m_{1,D} ; ... ; m_{D,1} ... m_{D,D} )

with

x = (m_{1,1}, ..., m_{D,D})
m_{i,j} ∈ {1...N}
m_{i,j} ≠ m_{k,l} for (i,j) ≠ (k,l)

and the function to minimise is

f = Σ_{i=1}^{D−1} ( Σ_{j=1}^{D} m_{i,j} − Σ_{j=1}^{D} m_{i+1,j} )² + Σ_{j=1}^{D−1} ( Σ_{i=1}^{D} m_{i,j} − Σ_{i=1}^{D} m_{i,j+1} )² = 0
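This fitness is easy to write down in C for a row-major matrix (a sketch; the function name is mine). The solutions listed on the next slide, e.g. (55, 30, 68, 42, 49, 62, 56, 74, 23), evaluate to zero:

```c
/* Sum of squared differences between consecutive row sums and
   consecutive column sums of a D x D matrix stored row-major;
   zero iff all row sums are equal and all column sums are equal. */
long square_fitness(const int *m, int D) {
    long f = 0;
    for (int i = 0; i + 1 < D; i++) {
        long r1 = 0, r2 = 0;
        for (int j = 0; j < D; j++) {
            r1 += m[i * D + j];
            r2 += m[(i + 1) * D + j];
        }
        f += (r1 - r2) * (r1 - r2);
    }
    for (int j = 0; j + 1 < D; j++) {
        long c1 = 0, c2 = 0;
        for (int i = 0; i < D; i++) {
            c1 += m[i * D + j];
            c2 += m[i * D + j + 1];
        }
        f += (c1 - c2) * (c1 - c2);
    }
    return f;
}
```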
Magic Square (2)
D = 3×3, N = 100. 10 runs, 13430 evaluations, 10 solutions (one 3×3 square per line, row by row):
55 30 68 42 49 62 56 74 23
30 61 53 89 32 23 25 51 68
27 96 39 73 40 49 62 26 74
22 70 58 40 75 35 88 5 57
50 43 67 58 55 47 52 62 4
43 51 78 75 33 64 54 88 30
50 65 68 69 42 72 64 76 43
18 25 59 32 53 17 52 24 26
80 3 30 22 72 19 11 38 64
65 28 64 63 55 39 29 74 54
Non linear system
( )⎪⎩
⎪⎨⎧
=−=−+
010sin01
21
22
21
xxxx
Search space[0,1]2
1 run143 evaluations
1 solution
10 runs1430 evaluations
3 solutions
Model fitting (ARMA + AIC)

Autoregressive Moving Average + Akaike's Information Criterion:

y_t = Σ_{i=0}^{n} φ_i y_{t−i} + Σ_{j=0}^{m} θ_j a_{t−j}

σ² = (1/N) Σ_{t=1}^{N} (y_t^{data} − y_t^{ARMA})²

f = N log(σ²) + 2 (n + m)

The search space is heterogeneous: integers n and m, real coefficients φ and θ.
A binary PSO C code
// Pivot method -----------------------------------------
// Works pretty well on some problems... and pretty badly on some others
P[s] = P_m[g];          // Initialise the new position of particle s
                        // at the position of the best known around
dist = log(D);          // We suppose here D >= 2
r = alea(1, dist);      // Radius for the DPNP
for (k = 0; k < r; k++) // Switch some bits at random
{
    d = alea_integer(0, D - 1);
    P[s][d] = 1 - P[s][d];  // Flip bit d, staying around g
}

Information links are modified at random if there has been no improvement.
End of APPENDIX