
Republic of Iraq
Ministry of Higher Education and Scientific Research
University of Baghdad, College of Science
Department of Computer Science

PARTICLE SWARM OPTIMIZATION METHOD

A project submitted to the Department of Computer Science, College of Science, University of Baghdad, in partial fulfillment of the requirements for the degree of B.Sc. in Computer Science.

By

Riyam Muhanad Al.Anie Hadeel Ayad Al.Quraishy

Supervised by

Assistant Lecturer Khulood E. Dagher

Date 2010-2011


 

 

Dedication

Praise and thanks be to God Almighty, and prayers and peace be upon the Chosen One, Muhammad, and upon his good and pure family and his distinguished companions.

To the one who taught me the language of tenderness ............ my dear mother.
To the one who eased life's hardships for me ............ my kind father.
To the source of my happiness ............ my generous family.
To all the professors and teaching staff of the Department of Computer Science at the University of Baghdad.
To my teacher, the respected Miss Khulood Eskander Dagher, who extended a helping hand to me and gave me all the important advice with complete sincerity and devotion.

I dedicate these humble, simple and few words, that they may raise me in honor and wisdom, so that I may be able to offer even the simplest thing to my great country, Iraq.

Page 4: PARTICAL SWARM OPTIMIZATIOM METHODPARTICAL SWARM OPTIMIZATIOM METHOD A project Submitted to Department of Computer Science, College of Science, and University of Baghdad in partial

 

In the name of God, the Most Gracious, the Most Merciful

"God: there is no god but He, the Living, the Self-Subsisting. Neither slumber nor sleep overtakes Him. To Him belongs whatever is in the heavens and whatever is on the earth. Who is there that can intercede with Him except by His permission? He knows what lies before them and what is behind them, and they encompass nothing of His knowledge except what He wills. His Throne extends over the heavens and the earth, and preserving them does not weary Him. He is the Most High, the Most Great."

God Almighty has spoken the truth

Page 5: PARTICAL SWARM OPTIMIZATIOM METHODPARTICAL SWARM OPTIMIZATIOM METHOD A project Submitted to Department of Computer Science, College of Science, and University of Baghdad in partial

Contents

Abstract

Chapter One "Introduction"
1.1 Introduction
1.2 Aim of project
1.3 Project layout

Chapter Two "Theoretical Background"
2.1 Introduction
2.2 Optimization
2.3 Pseudo code
2.4 Advantages and Disadvantages
2.5 PSO applications

Chapter Three "PSO"
3.1 Introduction
3.2 Algorithm
3.3 Flow chart

Chapter Four "Results"
4.1 Introduction
4.2 Mathematical example
4.3 Banana function

Chapter Five "Conclusions and Future Work"
5.1 Introduction
5.2 Conclusions
5.3 Future work

References


Abstract

This project studies the Particle Swarm Optimization method and gives the MATLAB code for it. Finally, it gives the advantages and disadvantages of PSO together with its practical applications.


 

 

Chapter One
Introduction

1.1 Introduction

Particle Swarm Optimization (PSO) is a population based stochastic optimization technique developed by Dr. Eberhart and Dr. Kennedy (J. Kennedy and R. Eberhart) in 1995. PSO shares many similarities with evolutionary computation techniques such as Genetic Algorithms (GA). The system is initialized with a population of random solutions and searches for optima by updating generations. In PSO, the potential solutions are called particles (the same as the GA chromosome); they contain the variable values and are not binary encoded. Each particle moves about the cost surface with a velocity.

Page 8: PARTICAL SWARM OPTIMIZATIOM METHODPARTICAL SWARM OPTIMIZATIOM METHOD A project Submitted to Department of Computer Science, College of Science, and University of Baghdad in partial

 

The th

as bird

PSO h

netwo

1.2 T

This p

F(x) =

hought proce

d flocking or

has been succ

rk training, f

The aim o

project study

=100*(x (2)-x

ess behind th

r fish school

cessfully app

fuzzy system

of the pro

y PSO and th

x (1) ^2) ^2+

he algorithm

ing.

plied in man

m control, an

oject

hen use it for

+ (1-x (1)) ^

 

‐ 2 ‐ 

was inspired

ny areas: fun

nd other area

r minimizing

2

d by the soc

nction optimi

as where GA

g the banana

cial behavior

ization, artif

A can be appl

function.

r of animals,

ficial neural

lied.

such
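As a quick check, the banana function can be written directly in MATLAB as an anonymous function (a minimal sketch; the handle name f is ours):

f = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;   % banana (Rosenbrock) function
f([1 1])                                          % returns 0: the global minimum is at x = (1, 1)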


1.3 Project layout

This project consists of:

Chapter One: Introduction.

Chapter Two: Theoretical background

Chapter Three: Particle Swarm Optimization.

Chapter Four: Results.

Chapter Five: Conclusion and future work.


Chapter Two

Theoretical Background

2.1 Introduction

This chapter introduces optimization and then studies its advantages, disadvantages, and applications.

2.2 Optimization

Is the mechanism by which one finds the maximum or minimum value of a function or process

economics, and engineering where the goal is to maximize efficiency, production, or some other

measure. Optimization can refer to either minimization or maximization; this mechanism is used

in fields such as physics, chemistry, maximization of a function f is equivalent to minimization

of the opposite of this function, −f [1].

Mathematically, a minimization task is defined as:

Given f: R^n → R, find x̂ ∈ R^n such that f(x̂) ≤ f(x), ∀x ∈ R^n.

Similarly, a maximization task is defined as:

Given f: R^n → R, find x̂ ∈ R^n such that f(x̂) ≥ f(x), ∀x ∈ R^n.

The domain R^n of f is referred to as the search space (or parameter space [2]). Each element of R^n is called a candidate solution in the search space, with x̂ being the optimal solution. The value n denotes the number of dimensions of the search space, and thus the number of parameters involved in the optimization problem. The function f is called the objective function, which maps the search space to the function space. Since a function has only one output, this function space is usually one-dimensional.
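To make the definition concrete, a one-dimensional minimization task can be approximated by brute-force sampling of the search space (an illustrative MATLAB sketch with our own names, not part of the original project):

f = @(x) (x - 3).^2 + 1;        % objective function, minimum at x̂ = 3
xs = linspace(-10, 10, 2001);   % sampled search space R^1
[fbest, idx] = min(f(xs));      % evaluate f and take the smallest fitness
xbest = xs(idx)                 % = 3, the optimal solution x̂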


The function space is then mapped to the one-dimensional fitness space, providing a single

fitness value for each set of parameters. This single fitness value determines the optimality of the

set of parameters for the desired task. In most cases, including all the cases discussed in this

paper, the function space can be directly mapped to the fitness space. However, the distinction

between function space and fitness space is important in cases such as multi-objective

optimization tasks, which include several objective functions drawing input from the same

parameter space [2, 3].

For a known (differentiable) function f, calculus can fairly easily provide us with the minima and

maxima of f. However, in real-life optimization tasks, this objective function f is often not

directly known. Instead, the objective function is a “black box” to which we apply parameters

(the candidate solution) and receive an output value. The result of this evaluation of a candidate

solution becomes the solution’s fitness.

The final goal of an optimization task is to find the parameters in the search space that maximize

or minimize this fitness [2]. In some optimization tasks, called constrained optimization tasks, the

elements in a candidate solution can be subject to certain constraints (such as being greater than

or less than zero) [1]. For the purposes of this paper, we will focus on unconstrained optimization

tasks. A simple example of function optimization can be seen in Figure 1.

This figure shows a selected region of the function f, demonstrated as the curve seen in the diagram.

This function maps from a one-dimensional parameter space—the set of real numbers R on the

horizontal x-axis—to a one-dimensional function space—the set of real numbers R on the

vertical y-axis. The x-axis represents the candidate solutions, and the y-axis represents the results

of the objective function when applied to these candidate solutions.

This type of diagram demonstrates what is called the fitness landscape of an optimization problem [2]. The fitness landscape plots the n-dimensional parameter space against the one-dimensional fitness for each of these parameters.


Figure 1: Function Maximum

Figure 1 also shows the presence of a local maximum in addition to the marked global maximum. A local maximum is a candidate solution that has a higher value from the objective function than any candidate solution in a particular region of the search space. For example, if we choose the interval [0, 2.5] in Figure 1, the objective function has a local maximum located at the approximate value x = 1.05. Many optimization algorithms are only designed to find the local maximum, ignoring other local maxima and the global maximum. However, the PSO algorithm as described is intended to find the global maximum.
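Since the original Figure 1 does not reproduce here, a similar one-dimensional fitness landscape with both a local and a global maximum can be generated in MATLAB (an illustrative stand-in, not the thesis's own figure):

x = linspace(0, 3, 500);
f = sin(3*x) + 0.2*x;   % local maximum near x = 0.55, global maximum near x = 2.64
plot(x, f); xlabel('candidate solution x'); ylabel('objective f(x)');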


2.3 Pseudo Code

For each particle
    Initialize particle
End

Do
    For each particle
        Calculate fitness value
        If the fitness value is better than the best fitness value (pBest) in history
            Set current value as the new pBest
        End
    End

    Choose the particle with the best fitness value of all the particles as the gBest

    For each particle
        Calculate particle velocity according to equation (a)
        Update particle position according to equation (b)
    End
While maximum iterations or minimum error criteria is not attained

Particles' velocities on each dimension are clamped to a maximum velocity Vmax, which is a parameter specified by the user. If the sum of accelerations would cause the velocity on a dimension to exceed Vmax, the velocity on that dimension is limited to Vmax.
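In MATLAB this clamping can be done in one vectorized line (a sketch; the values of Vmax and vel below are example assumptions of ours):

Vmax = 4;                            % user-specified velocity limit (example value)
vel = 10*randn(5, 2);                % some velocities, possibly out of range
vel = max(min(vel, Vmax), -Vmax);    % clamp each component to [-Vmax, Vmax]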


2.4 The Advantages and Disadvantages of PSO

The Advantages of PSO

Insensitive to scaling of design variables.

Simple implementation.

Easily parallelized for concurrent processing.

Derivative free.

Very few algorithm parameters.

Very efficient global search algorithm.

The Disadvantages of PSO

Slow convergence in refined search stage (weak local search ability).

2.5 PSO applications

Training of neural networks.

Optimization of electric power distribution network.

Structural optimization.

Process biochemistry.

System identification in biomechanics.

Image processing.


Chapter Three

PSO

3.1 Introduction

This chapter introduces the PSO algorithm and its flow chart.

3.2 Algorithm

The PSO algorithm works by simultaneously maintaining several candidate solutions in the search

space. During each iteration of the algorithm, each candidate solution is evaluated by the objective

function being optimized, determining the fitness of that solution. Each candidate solution can be thought

of as a particle “flying” through the fitness landscape finding the maximum or minimum of the objective

function.

Initially, the PSO algorithm chooses candidate solutions randomly within the search space. Figure 2

shows the initial state of a four-particle PSO algorithm seeking the global maximum in a one-dimensional

search space. The search space is composed of all the possible solutions along the x-axis; the curve

denotes the objective function. It should be noted that the PSO algorithm has no knowledge of the

underlying objective function, and thus has no way of knowing if any of the candidate solutions are near

to or far away from a local or global maximum.

The PSO algorithm simply uses the objective function to evaluate its candidate solutions, and operates

upon the resultant fitness values.
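For instance, the random initialization of a swarm within box bounds [lo, hi] might look like this in MATLAB (a sketch; the names popsize, npar, lo, and hi are ours):

popsize = 20;  npar = 2;                      % swarm size and problem dimension
lo = -2;  hi = 2;                             % box bounds (our choice)
par = lo + (hi - lo).*rand(popsize, npar);    % particles uniform in [lo, hi]^npar
vel = zeros(popsize, npar);                   % initial velocities (zero, or small random values)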


Figure 2: Initial PSO State

Each particle maintains its position, composed of the candidate solution and its evaluated fitness, and its velocity. Additionally, it remembers the best fitness value it has achieved thus far during the operation of the algorithm, referred to as the individual best fitness, and the candidate solution that achieved this fitness, referred to as the individual best position or individual best candidate solution. Finally, the PSO algorithm maintains the best fitness value achieved among all particles in the swarm, called the global best fitness, and the candidate solution that achieved this fitness, called the global best position or global best candidate solution [1].

The algorithm proceeds as follows:

1. Create a 'population' of agents (called particles) uniformly distributed over X.


2. Evaluate each particle's position according to the objective function.

3. If a particle's current position is better than its previous best position, update it.

4. Determine the best particle (according to the particle's previous best positions).


5. Update particles' velocities according to vi(t+1) = vi(t) + c1 r1 (pi(t) - xi(t)) + c2 r2 (pg(t) - xi(t)).

6. Move particles to their new positions according to xi(t+1) = xi(t) + vi(t+1).

7. Go to step 2 until stopping criteria are satisfied.
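Steps 5 and 6 translate directly into vectorized MATLAB for a swarm stored row-wise (a sketch with our own variable names; pbest holds the individual best positions and gbest the global best):

popsize = 4;  npar = 2;  c1 = 2;  c2 = 2;     % example sizes and parameters
x = rand(popsize, npar);  vel = zeros(popsize, npar);
pbest = x;  gbest = x(1, :);                  % individual and global best positions
r1 = rand(popsize, npar);  r2 = rand(popsize, npar);
vel = vel + c1*r1.*(pbest - x) + c2*r2.*(repmat(gbest, popsize, 1) - x);   % step 5
x = x + vel;                                                               % step 6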


3.3 Flow chart

[Figure 3: flow chart of the PSO algorithm; the original figure is not reproduced here.]


Chapter Four

Results

4.1 Introduction

This chapter introduces a mathematical example (hand work) and the banana function.

4.2 Mathematical Example

Maximize the function f(X) = 2X - X^2/16, which is defined within the interval [0, 31]:

Let pop size=4 (population size)

Maxit=2 (maximum iteration)

C1=C2=2

C=1

Particles    Velocity    Cost
18.66        12.69       15.556
28.574       0.0106      9.01
30.639       16.7689     2.605
28.816       6.444       5.732

Hint: the particles and velocities are created randomly within (0, 31).

Global cost 15.556

Global particle 18.66
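The Cost column is simply f evaluated at each particle; for example (MATLAB):

f = @(x) 2*x - x.^2/16;   % the objective to maximize
f(18.66)                  % = 15.5578, the initial global cost (15.556 in the table, up to rounding)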


NEW SWARM IT=1

R1 R2

0.2193 0.7485

0.325 0.5433

0.095 0.3381

0.745 0.8323

Hint: R1 and R2 are created randomly.

W = (maxit - it)/maxit
Vel = W*vel + C1*R1*(local par - par) + C2*R2*(global par - par)
Par = par + vel

Check that 0 < par < 31, compare the new cost with the local cost, and then choose the better cost.
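Before reading off the new swarm below, the first row can be verified by hand: particle 1 sits at its own local best and at the global best, so both attraction terms vanish and only the inertia term survives:

w = (2 - 1)/2;    % inertia weight at it = 1, so w = 0.5
vel = w*12.69;    % = 6.345; both (local par - par) and (global par - par) are zero
par = 18.66 + vel % = 25.005, matching the 25.011 below up to rounding of R1 and R2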

particle    New cost    Local cost
25.011      10.924      15.55
17.983      15.7541     9.0107
30.924      2.07        2.6050
15.1346     15.953      5.7329


Better cost Better particle

15.5567 18.66

15.7541 17.983

2.6050 30.63

15.9532 15.134

NOW

Global cost 15.953

Global particle 15.1346

Local cost=better cost

Local particle=better particle

It x f(x)

1 15.1346 15.953

NEW SWARM IT=2

R1 R2

0.5529 0.5464

0.9575 0.3967

0.892 0.6228

0.356 0.7960


Hint: R1 and R2 are created randomly.

W = (maxit - it)/maxit
Vel = W*vel + C1*R1*(local par - par) + C2*R2*(global par - par)
Par = par + vel

Check that 0 < par < 31, compare the new cost with the local cost, and then choose the better cost.

vel       particle    New cost    Local cost
17.8      7.2024      11.16       15.556
1.9       16.00       16.00       15.7541
20.177    10.7471     14.27       2.6
0         15.1346     15.9        15.953


Better cost Better particle

15.556 18.663

16.00 16.00

14.27 10.7471

15.953 15.1346

NOW

Global cost 16

Global particle 16

Local cost=better cost

Local particle=better particle

It x f(x)

2 16 16
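This agrees with the analytic optimum of the objective: setting f′(X) = 2 - X/8 = 0 gives X = 16 and f(16) = 16, so the swarm reaches the exact maximum after two iterations.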


4.3 Banana function

% particle swarm optimization (PSO) for the banana function
clear
clc
close all

popsize = 20;    % size of swarm
npar = 2;        % dimension of the problem
maxit = 15;      % maximum number of iterations
c1 = 1;          % cognitive parameter
c2 = 1;          % social parameter

% initialize the swarm (requires the cost function fev, given below)
par = rand(popsize, npar).*(2/.9999);   % positions; use randn to get +ve and -ve values
vel = rand(popsize, npar).*(2/.9999);   % velocities
cost = fev(par);
minc(1) = min(cost);                    % best cost of the initial swarm
meanc(1) = mean(cost);                  % mean cost of the initial swarm
globalmin(1) = minc(1);
localpar = par;                         % individual best positions
localcost = cost;                       % individual best costs
[globalcost, indx] = min(cost);
globalpar = par(indx, :);               % global best position

% plot the banana function surface
[x, y] = meshgrid(-2:0.05:2);
zz = 100*(y - x.^2).^2 + (1 - x).^2;
mesh(x, y, zz)
view(10, 55);
hold on

% main loop
iter = 0;
while (iter < maxit)
    iter = iter + 1;
    w = (maxit - iter)/maxit;   % inertia weight, decreasing with the iterations
    r1 = rand(popsize, npar);
    r2 = rand(popsize, npar);
    % new velocities and new swarm positions
    vel = w*vel + c1*r1.*(localpar - par) + c2*r2.*(ones(popsize,1)*globalpar - par);
    par = par + vel;
    % range check: components outside [-2, 2] are reset to zero
    overlimit = par <= 2;
    underlimit = par >= -2;
    par = par.*overlimit;
    par = par.*underlimit;
    % evaluate the new swarm and update the individual bests
    cost = fev(par);
    bettercost = cost < localcost;
    localcost = localcost.*~bettercost + cost.*bettercost;
    localpar(bettercost, :) = par(bettercost, :);
    % update the global best
    [temp, t] = min(localcost);
    if temp < globalcost
        globalpar = par(t, :);
        indx = t;
        globalcost = temp;
    end
    datt = [iter globalpar globalcost]   % report for the current iteration
    minc(iter+1) = min(cost);
    globalmin(iter+1) = globalcost;
    meanc(iter+1) = mean(cost);
    % mark the best particle of this iteration on the surface
    [parfit, parind] = min(cost);
    parsol = par(parind, :);
    plot3(parsol(1), parsol(2), parfit, 'r+');
    hold on
    axis([-2 2 -2 2])
    xlabel(num2str(parsol(1)))
    ylabel(num2str(parsol(2)))
    title(num2str([iter parfit]))
    pause(.001)
end

% convergence history
figure(2)
iters = 0:maxit;
plot(iters, minc, iters, meanc, '-', iters, globalmin, ':');
xlabel('generation'); ylabel('cost');
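The listing above calls a cost function fev that is not reproduced in the project text. A minimal definition consistent with the banana function of Section 1.2 (our reconstruction; save it as fev.m) would be:

function cost = fev(par)
% banana (Rosenbrock) cost, one value per particle (row of par):
% f(x) = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2
cost = 100*(par(:,2) - par(:,1).^2).^2 + (1 - par(:,1)).^2;
end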

Figure 5: The banana function surface with the best particle marked; at iteration 15 the best particle is approximately x = (0.984907, 0.969137) with cost 0.000309624, close to the true minimum at (1, 1).


Chapter Five

Conclusions and Future work

5.1 Introduction

This chapter introduces the conclusions and presents the future work.

5.2 Conclusions

1) Particle Swarm Optimization is an extremely simple algorithm that seems to be effective for

optimizing a wide range of functions.

2) Social optimization occurs in the time frame of ordinary experience - in fact, it is ordinary

experience.

3) Particle swarm optimization seems to lie somewhere between genetic algorithms and

evolutionary programming.

4) It is highly dependent on stochastic processes, like evolutionary programming.

5) The adjustment toward pbest and gbest by the particle swarm optimizer is conceptually

similar to the crossover operation utilized by genetic algorithms.

6) PSO uses the concept of fitness, as do all evolutionary computation paradigms.

7) Much of the success of particle swarms seems to lie in the agents' tendency to hurtle past

their target.

8) The Particle Swarm Optimizer serves both of these fields equally well.

5.3 Future work

Study the Genetic Algorithm and compare it with Particle Swarm Optimization (PSO).


References

[1] Frans van den Bergh, "An Analysis of Particle Swarm Optimizers", PhD thesis, University of Pretoria, 2001.
[2] James Kennedy, Russell Eberhart, and Yuhui Shi, "Swarm Intelligence", Morgan Kaufmann, 2001.
[3] E. Zitzler, M. Laumanns, and S. Bleuler, "A Tutorial on Evolutionary Multiobjective Optimization", 2002.
[4] P. N. Suganthan, "Particle Swarm Optimizer with Neighborhood Operator", in Proceedings of the IEEE Congress on Evolutionary Computation (CEC), pages 1958-1962, 1999.
[5] J. Kennedy and R. Mendes, "Population Structure and Particle Swarm Performance", in Proceedings of the IEEE Congress on Evolutionary Computation (CEC), 2002.
[6] D. J. Watts, "Small Worlds: The Dynamics of Networks between Order and Randomness", Princeton University Press, 1999.
[7] D. J. Watts and S. H. Strogatz, "Collective Dynamics of Small-World Networks", Nature, (393):440-442, 1998.
[8] X. Li, "Adaptively Choosing Neighborhood Bests Using Species in a Particle Swarm Optimizer for Multimodal Function Optimization", Lecture Notes in Computer Science, 3102:105-116, 2004.
[9] J. H. Holland, "Adaptation in Natural and Artificial Systems", MIT Press, Cambridge, MA, 1992.