
Research Article
A New Chaotic Starling Particle Swarm Optimization Algorithm for Clustering Problems

Lin Wang,1 Xiyu Liu,1 Minghe Sun,2 Jianhua Qu,1 and Yanmeng Wei1

1 College of Management Science and Engineering, Shandong Normal University, Jinan 250014, China
2 College of Business, The University of Texas at San Antonio, San Antonio, TX, USA

Correspondence should be addressed to Xiyu Liu; sdxyliu@163.com

Received 3 May 2018; Accepted 2 August 2018; Published 19 August 2018

Academic Editor: AMA Neves

Copyright © 2018 Lin Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

A new method using collective responses of starling birds is developed to enhance the global search performance of standard particle swarm optimization (PSO). The method is named chaotic starling particle swarm optimization (CSPSO). In CSPSO, the inertia weight is adjusted using a nonlinear decreasing approach, and the acceleration coefficients are adjusted using a chaotic logistic mapping strategy to avoid prematurity of the search process. A dynamic disturbance term (DDT) is used in velocity updating to enhance convergence of the algorithm. A local search method inspired by the behavior of starling birds, utilizing the information of the nearest neighbors, is used to determine a new collective position and a new collective velocity for selected particles. Two particle selection methods, Euclidean distance and fitness function, are adopted to ensure the overall convergence of the search process. Experimental results on benchmark function optimization and classic clustering problems verified the effectiveness of the proposed CSPSO algorithm.

1. Introduction

The particle swarm optimization (PSO) algorithm is a global optimization method based on an intelligent population-based search strategy inspired by the flocking behavior of birds. As a swarm intelligence algorithm, each particle flies in the search space, called the solution space, with a certain velocity and updates its velocity and position through a linear combination of the individual and global best positions in history. Compared with other evolutionary algorithms, PSO uses the individual and global experiences of the particles and has a well-balanced mechanism between its exploitation and exploration abilities [1]. Therefore, it has been successfully applied to many difficult optimization problems [2].

PSO provides good exploitation and exploration performance for solving optimization problems. The global experience guides the direction of the particle population, and the individual experience gives a more precise direction in the search space. In this way, the particle population moves gradually closer to the global optimum. Hence, PSO has short computing time and is easy to implement. However, like most swarm intelligence algorithms, it can be easily trapped into local optima in later generations or iterations, and the search process may become premature. Much work has been done to improve the standard or traditional PSO algorithm [3–6].

Data clustering has been an important technology in data analysis. The purpose of data clustering is to discover valuable informative patterns and implicit information [7]. As an unsupervised classification technique, data clustering classifies similar data into the same groups or clusters using the characteristics of the data, without any prior knowledge about the groups or clusters. Therefore, data clustering can extract potential patterns or knowledge from a dataset and can help in obtaining and understanding the valuable information in the data [8]. Because data clustering problems are NP-hard, traditional methods are sometimes ineffective in solving these problems [9]. Therefore, a lot of work has been done to solve this problem by adopting PSO, and some recent research is summarized below.

Netjinda et al. [10] presented a new PSO procedure, called starling PSO, inspired by the collective responses of starlings. Dor et al. [11] proposed a dynamic topology DCluster algorithm, based on two topologies using a four-cluster approach and a fitness function, to solve the underlying problem.



for i = 1 to M
    v_ij(t+1) = w * v_ij(t) + c_1 * r_1 * (p_ij(t) − x_ij(t)) + c_2 * r_2 * (p_gj(t) − x_ij(t))
    x_ij(t+1) = x_ij(t) + v_ij(t+1)
    if Particle(i).Fitness < Particle(i).BestFitness
        Particle(i).BestPosition = Particle(i).Position
        Particle(i).BestFitness = Particle(i).Fitness
        if Particle(i).BestFitness < Best_Particle.Fitness
            Best_Particle.Position = Particle(i).BestPosition
            Best_Particle.Fitness = Particle(i).BestFitness
        End if
    End if
End for

Algorithm 1: The Standard PSO Algorithm

Ali [12] presented a new variant of the position updating rule of PSO in which each particle is treated as a bi-dimensional vector. Armano et al. [13] proposed a multiobjective clustering PSO algorithm with two objective functions defined to integrate data connectivity and cohesion. Niu et al. [14] proposed a population-based clustering algorithm, which combines different PSO and k-means algorithms to help particles escape from local optima.

Exploring the topological neighborhood of the k-nearest neighbors and employing pattern search are considered useful tools to improve the performance of PSO [15]. Comak [16] proposed a PSO procedure using Renyi entropy clustering, which contains two steps, initialization and particle removal. Bharti and Singh [17] proposed a binary PSO procedure with an opposition-based learning mechanism using chaotic mapping, dynamic inertia weight, and a mutation operator. Song et al. [18] added an environment factor to the velocity adjustment in PSO to enhance the robust behavior of the particles. Liu et al. [19] proposed a modified coevolutionary multiswarm PSO procedure based on new velocity updating and similarity detection to solve multiobjective clustering problems.

Although PSO has been widely used in many fields and has shown great potential in solving optimization problems, it still has some limitations. For example, it is easily trapped into local optima and has a low convergence speed. So far, no effective methods have been developed to balance the local and global search abilities [20]. Therefore, more work is needed to enhance the performance of PSO [21]. On the other hand, data clustering, as one of the most popular data mining techniques for discovering potential information and knowledge from data, needs effective methods to obtain better clustering results. In this study, a chaotic starling particle swarm optimization (CSPSO) algorithm is proposed to obtain better clustering results by improving the PSO performance.

CSPSO has three major parts. A chaotic mapping, rather than a random generation method, is introduced to generate the acceleration parameters. A dynamic disturbance term (DDT) is added in velocity updating to avoid trapping into local optima. In order to improve the search ability, a local search strategy based on the behavior of starling birds is used, and the information of the neighbors is collected to guide the direction of the particle.

The rest of this paper is organized as follows. The traditional PSO and the clustering problem are described in Section 2. CSPSO is developed and described in detail in Section 3 and applied to clustering problems in Section 4. In Section 5, simulation experiments are conducted, and comparisons with existing methods are performed to analyze the effectiveness of CSPSO. Section 6 gives conclusions and future research directions.

2. Preliminaries

In this section, the basic concepts of PSO and of the data clustering problems related to the development of the proposed algorithm are described in some detail.

2.1. The Standard PSO. PSO is one of the swarm intelligence algorithms, inspired by the social behavior of bird flocking [26]. In PSO, the population size of the particles is denoted by M, and the dimension of the search space is denoted by D. Each particle i has a position vector x_i = (x_{i1}, x_{i2}, ..., x_{iD}), a velocity vector v_i = (v_{i1}, v_{i2}, ..., v_{iD}), and an individual best position in history p_i = (p_{i1}, p_{i2}, ..., p_{iD}). The best position found by all particles in the swarm, called the global best, is represented by p_g = (p_{g1}, p_{g2}, ..., p_{gD}). At iteration t, the new velocity and position of particle i are determined by (1) and (2), respectively:

$$ v_{ij}(t+1) = w \, v_{ij}(t) + c_1 r_1 \big(p_{ij}(t) - x_{ij}(t)\big) + c_2 r_2 \big(p_{gj}(t) - x_{ij}(t)\big) \quad (1) $$

$$ x_{ij}(t+1) = x_{ij}(t) + v_{ij}(t+1) \quad (2) $$

where w is the inertia weight, c_1 and c_2 are the acceleration coefficients controlling the step size, and r_1 and r_2 are two independently generated random numbers uniformly distributed between 0 and 1. Two termination criteria are used: one is when a preset maximal number of iterations is reached, and the other is when a tolerance level has been achieved. The standard PSO algorithm is described in Algorithm 1.


[Figure 1: Bifurcation diagram for the logistic map. The horizontal axis is the chaotic parameter u (2.5 to 4), and the vertical axis is the chaotic value Ch(t) (0 to 1).]

2.2. Chaotic Mapping. Chaotic maps are mapping methods used to generate random numbers with features like ergodicity, nonlinearity, and random similarity [27]. There are many kinds of chaotic maps, such as the tent map, the Tchebychev map, and the logistic map. Specifically, the logistic map developed by May [28] can explore the vicinity of a solution by oscillating in the region. One chaotic variant based on logistic mapping is given by (3) [29]:

$$ Ch(t+1) = u \cdot Ch(t) \cdot \big(1 - Ch(t)\big), \quad Ch(t) \in (0, 1), \; u \in (0, +\infty) \quad (3) $$

where Ch(t) represents the value of the chaotic number at time t, u is the control parameter, and Ch(0) \notin \{0, 0.25, 0.5, 0.75, 1.0\}. In (3), the parameter u controls the behavior of the chaotic variant Ch. The values of Ch(t) for varying values of u are shown in Figure 1.
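As a quick illustration, the logistic map in (3) can be iterated with a few lines of Python; the starting value and number of steps below are arbitrary choices.

    def logistic_map(ch0, u=4.0, steps=10):
        # u = 4 gives the fully chaotic regime that CSPSO later uses
        # to drive the acceleration coefficients
        values, ch = [], ch0
        for _ in range(steps):
            ch = u * ch * (1.0 - ch)   # Ch(t+1) = u * Ch(t) * (1 - Ch(t)), eq. (3)
            values.append(ch)
        return values

    # Ch(0) must lie in (0, 1) but avoid {0, 0.25, 0.5, 0.75, 1.0}
    print(logistic_map(0.37))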

2.3. Data Clustering Problems. Let X = {X_q} for q = 1, 2, ..., N be a dataset containing N data points or instances. Data point q is represented by X_q = (X_{q1}, X_{q2}, ..., X_{qd}), with d representing the dimension of the data. The purpose of a data clustering problem is to find a partition of the dataset by optimizing a fitness function. A partition is represented by C = {C_k} for k = 1, 2, ..., K, where K is the number of clusters [30]. A partition must satisfy the following conditions:

(i) $C_{k_1} \cap C_{k_2} = \emptyset$ for all $k_1 \neq k_2$;
(ii) $\bigcup_{k=1}^{K} C_k = X$;
(iii) $C_k \neq \emptyset$ for all $k = 1, 2, \ldots, K$.

Usually, the Euclidean distance is used to measure the difference or similarity between two data points X_{q1} and X_{q2}. Let z = {z_k} for k = 1, 2, ..., K represent the centers of the K clusters C_k. The intracluster distance of cluster C_k is the sum of the distances of all data points in the cluster to z_k, given by (4):

$$ D(X, C_k) = \sum_{X_q \in C_k} \left\| X_q - z_k \right\| \quad (4) $$

The quality of the clustering results for the dataset can be measured by the sum of the intracluster distances over all clusters, given by (5) [31]:

$$ F = F(C_1, C_2, \ldots, C_K) = \sum_{k=1}^{K} D(X, C_k) = \sum_{k=1}^{K} \sum_{X_q \in C_k} \left\| X_q - z_k \right\| \quad (5) $$

F defined in (5) is used as the fitness function in clustering.
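For illustration, the fitness function (5) can be computed directly from a set of candidate cluster centers by assigning each point to its nearest center; this nearest-center assignment is a standard convention and an assumption here, since the paper describes a partition through its centers.

    import numpy as np

    def clustering_fitness(centers, data):
        # centers: (K, d) array of cluster centers; data: (N, d) array of points
        dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        nearest = dists.min(axis=1)   # distance of each point to its own center
        return nearest.sum()          # F = sum_k sum_{X_q in C_k} ||X_q - z_k||, eq. (5)

    # small usage example with two well-separated groups
    data = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.1], [5.2, 4.9]])
    centers = np.array([[0.0, 0.1], [5.1, 5.0]])
    print(clustering_fitness(centers, data))   # small value: the centers fit both groups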

3. The Chaotic Starling Particle Swarm Optimization Algorithm

Some efforts have been made to enhance the search performance and convergence speed of PSO. In this section, a chaotic mapping method and a DDT are introduced into the PSO algorithm to improve the global search ability, and a local search technique based on starling birds is added to improve the convergence speed. This improved PSO method is the CSPSO algorithm. The major components and details of the CSPSO algorithm are discussed in this section.

3.1. Dynamic Update of CSPSO Parameters. Three components, the velocity of the previous iteration, the individual factor, and the social factor, are used to update the velocity of a particle. Three parameters, w, r_1, and r_2, control the relative contributions of the three components. The inertia weight w controls the influence of the velocity of the previous iteration. The cognitive coefficients c_1 and c_2 balance the contributions of the individual and social factors. Furthermore, the acceleration coefficients r_1 and r_2 control the proportion retained in the individual and social factors. Each parameter plays an important role in the velocity update and consequently also affects the position update. Instead of using fixed values, logistic mapping is used to generate the acceleration coefficients r_1 and r_2 in each generation. In addition, the inertia weight is adjusted using an exponential function. The modifications of these parameters are shown in (6) and (7):

$$ r_p(t+1) = 4 \cdot r_p(t) \cdot \big(1 - r_p(t)\big), \quad r_p(t) \in (0, 1), \; p = 1, 2 \quad (6) $$

$$ w(t) = w_{\min} + (w_{\max} - w_{\min}) \cdot e^{-t/(t_{\max} - t)} \quad (7) $$

where w_min and w_max are the minimum and maximum of the inertia weight and t_max is the maximum number of iterations.
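A small sketch of the two parameter updates follows; the guard at t = t_max is an implementation detail added here, since the exponent in (7) is undefined at that point, and the default limits are the values tuned later in Section 5.1.

    import math

    def chaotic_coefficient(r):
        # eq. (6): logistic update of the acceleration coefficient r1 or r2
        return 4.0 * r * (1.0 - r)

    def inertia_weight(t, t_max, w_min=0.4, w_max=1.2):
        # eq. (7): exponentially decreasing inertia weight
        if t >= t_max:              # guard added here: (7) is undefined at t = t_max
            return w_min
        return w_min + (w_max - w_min) * math.exp(-t / (t_max - t))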

3.2. The Dynamic Disturbance Term. As an optimization algorithm based on swarm intelligence, PSO uses the trajectories of the particles in the population and compares fitness function values to select the local and global best positions. Because of local optimal solutions, the velocity of a particle gradually decreases during the later iterations. Therefore, the majority of the particles fall into, and cannot easily jump out of, local optimal points, a phenomenon called premature convergence and evolutionary stagnation. To overcome these problems, a method based on DDT [32] is used to change the velocity update of each particle. DDT is embedded into velocity updating when the velocity of a particle becomes too low or when its position stays unchanged in the middle and final generations. The modified velocity is updated using (8), and the DDT, represented by T, is updated using (9):

$$ v_{ij}(t+1) = w \, v_{ij}(t) + c_1 r_1 \big(p_{ij}(t) - x_{ij}(t)\big) + c_2 r_2 \big(p_{gj}(t) - x_{ij}(t)\big) + T \quad (8) $$

$$ T = m \cdot \left( \frac{t}{t_{\max}} - 0.5 \right) \quad (9) $$

where m is an accommodation coefficient with a value between 0 and 1. This improvement helps in preventing a particle from flying out of the boundary of the search space when its velocity is too high in early generations, and in escaping from a local optimal point by increasing its velocity, so as to improve the overall search capacity. Besides, T increases linearly to avoid oscillation and to keep a relatively stable search direction in the optimization process.
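A sketch of (8) and (9) follows; the accommodation coefficient m = 0.5 is an arbitrary choice within the stated (0, 1) range.

    def ddt(t, t_max, m=0.5):
        # eq. (9): negative in early iterations, positive in later ones,
        # so it damps early velocities and boosts late ones
        return m * (t / t_max - 0.5)

    # eq. (8) then adds this scalar to the usual velocity update:
    # v_ij = w * v_ij + c1*r1*(p_ij - x_ij) + c2*r2*(p_gj - x_ij) + ddt(t, t_max)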

3.3. Local Search Based on Starling Birds. In nature, a starling bird spreads information to its nearby neighbors in order to defend its position, and the information coming from one starling bird can spread everywhere in the swarm. Inspired by this phenomenon, a local search method based on starling bird behavior is used in the developed PSO procedure. Each particle seeks a new position and velocity in its neighborhood by a weighted method that can be seen as a kind of information communication between an individual and its nearby neighbors. This mechanism tries to seek a better solution in the local exploration around the particle and reduces the time needed in local search [25].

Dynamic Neighborhood Selection. For each particle i, the neighborhood H is a subset of the particle population. The particles in the population are listed in decreasing order of their fitness-Euclidean distance ratio (FER) from particle i [23], and the Ne particles ranked on the top are taken as the ones in the neighborhood of the particle. FER is determined by (10):

$$ FER(x_i, x_{i_1}) = \alpha \cdot \frac{F(x_i) - F(x_{i_1})}{ED(x_i, x_{i_1})}, \quad i_1 = 1, 2, \ldots, M \quad (10) $$

where ED(x_i, x_{i_1}) is the Euclidean distance between x_i and x_{i_1}, α = s/(F(x_w) − F(x_b)), x_b and x_w are the best and worst positions of the particles in the current population, s is the size of the search space defined as $s = \sqrt{\sum_{j=1}^{D} (x_j^u - x_j^l)^2}$, and x_j^u and x_j^l are the maximum and minimum values of the j-th dimension of the data points in the dataset. The number of neighbors Ne is the size of the neighborhood. In order to explore different search scopes, a dynamic strategy is used to adjust the size of the neighborhood, as specified in (11):

$$ Ne = Ne_{\min} + INT\left( t \cdot \frac{Ne_{\max} - Ne_{\min}}{t_{\max}} \right) \quad (11) $$

where Ne_max and Ne_min are the maximum and minimum sizes of the neighborhood, and INT(a) is the integral function taking only the integer part of a.

Position Update. The position of particle i is adjusted using the information of the particles in the neighborhood. Using the weighted positions of the neighbors, the new position of particle i is determined by (12):

$$ x_i' = x_i + r_3 \cdot \left( \frac{1}{Ne} \sum_{i_1 \in H} x_{i_1} \right) \quad (12) $$

where x_i is the current position of particle i, x_i' is the new position after the adjustment, and r_3 is a randomly generated real number uniformly distributed between −1 and 1.

Velocity Update. Similar to the update of the position, the velocity of particle i is updated using the velocity information of the particles in the neighborhood. The new velocity of particle i is determined by (13):

$$ v_i' = v_i + r_4 \cdot \left( \frac{1}{Ne} \sum_{i_1 \in H} v_{i_1} \right) \quad (13) $$

where v_i and v_i' are the current and updated velocities of particle i, and r_4 is a randomly generated real number uniformly distributed between 0 and 1. The collective responses of starling birds are described by the pseudocode shown in Algorithm 2.
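The following sketch puts (10) through (13) together in Python: FER ranks the candidates, the top Ne form the neighborhood H, and the particle moves toward the mean neighbor position and velocity. The array conventions follow the earlier PSO sketch; alpha is the scaling factor s/(F(x_w) − F(x_b)) computed by the caller, and the acceptance test (keep the new state only if it improves the fitness) mirrors Algorithm 2.

    import numpy as np

    def fer_scores(i, x, fit, alpha):
        # eq. (10): FER of every particle relative to particle i
        ed = np.linalg.norm(x - x[i], axis=1)
        ed[i] = np.inf                      # exclude particle i itself
        return alpha * (fit[i] - fit) / ed

    def neighborhood_size(t, t_max, ne_min=2, ne_max=7):
        # eq. (11): the neighborhood grows linearly with the iteration count
        return ne_min + int(t * (ne_max - ne_min) / t_max)

    def starling_step(i, x, v, fit, ne, alpha, rng):
        H = np.argsort(fer_scores(i, x, fit, alpha))[::-1][:ne]  # top Ne by FER
        r3 = rng.uniform(-1.0, 1.0)         # eq. (12) uses r3 in (-1, 1)
        r4 = rng.uniform(0.0, 1.0)          # eq. (13) uses r4 in (0, 1)
        new_x = x[i] + r3 * x[H].mean(axis=0)   # eq. (12)
        new_v = v[i] + r4 * v[H].mean(axis=0)   # eq. (13)
        return new_x, new_v                 # accepted only if the fitness improves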

4. The CSPSO for Clustering Problems

4.1. Particle Selection. In the previous section, a local search method based on starling bird behavior is used in the neighborhood of a particle to search for better solutions, but the time needed by this method will increase. In order to accelerate the search and reduce the time needed, the local search method is not applied to all particles. Two methods are adopted to select particles from the population, and the local search method is then applied to all the particles selected with these two methods.

An Euclidean Distance Method. In each generation, all the particles are sorted in ascending order of their Euclidean distances from the global best. The particles are divided into Se groups of approximately equal size based on their Euclidean distances to the global best.


Input: Ne
for Particle i
    Find the Ne neighbors in the neighborhood H of particle i using FER
    The new position of particle i: x_i' = x_i + r_3 * ((1/Ne) * Σ_{i_1∈H} x_{i_1})
    The new velocity of particle i: v_i' = v_i + r_4 * ((1/Ne) * Σ_{i_1∈H} v_{i_1})
    if Particle(i').Fitness < Particle(i).Fitness
        Particle(i).Fitness = Particle(i').Fitness
        Particle(i).Position = Particle(i').Position
        Particle(i).Velocity = Particle(i').Velocity
    End if
    Return Particle i
End for
Output: Particle i

Algorithm 2: Starling Birds Collective Responses

[Figure 2: A local search method in the CSPSO algorithm. The flowchart proceeds from particle swarm initialization through velocity and position updates and the update of the individual and global best positions; particles are then selected by Euclidean distance and by fitness function, the starling-bird local search identifies neighbors and adjusts positions and velocities, and the global best position is returned once the maximum iteration is reached.]

The particle ranked on the top of each group is selected. As a result, a total of Se particles are selected. The particles selected in this method diversify the search and enhance exploration of the CSPSO procedure.

A Fitness Function Method. In each generation, all the particles are sorted in ascending order of their fitness function values. A total of Sf particles ranked on the top are selected. The particles selected in this method focus on the promising regions in the search space so as to enhance exploitation of the CSPSO procedure.

The two particle selection methods described above are used in CSPSO to improve its convergence and increase diversity. As a result, a total of S = Se + Sf particles are selected. In the implementation, Se = Sf is used. Particles selected through the Euclidean distance method are diversified in the search space to effectively avoid premature convergence, while the particles selected through the fitness function method lead the search direction. Figure 2 gives more details about the process of particle selection, and a small sketch of the two methods follows.
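This is a minimal sketch of the two selection methods; the deduplication of particles picked by both methods is a convenience added here, not something the paper specifies.

    import numpy as np

    def select_particles(x, fit, g_pos, se=5, sf=5):
        # Euclidean distance method: sort by distance to the global best,
        # split into se groups, and take the top particle of each group
        order = np.argsort(np.linalg.norm(x - g_pos, axis=1))
        by_distance = [grp[0] for grp in np.array_split(order, se)]
        # Fitness function method: take the sf particles with the best fitness
        by_fitness = list(np.argsort(fit)[:sf])
        # merged set of S = se + sf selected particles (duplicates dropped)
        return list(dict.fromkeys(int(i) for i in by_distance + by_fitness))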

4.2. Population Initialization. In the implementation of CSPSO, each cluster center is part of the position of a particle: each cluster center represents a cluster, and different cluster centers represent different clusters. The set of cluster centers z = {z_k} for k = 1, 2, ..., K represents a partitioning result and also represents the position of a particle, x_i = (z_{i1}, z_{i2}, ..., z_{iK}). In the initialization, the initial position of each particle is generated randomly in the entire search space: dimension j of the position of particle i, represented by x_{ij}, is generated randomly between x_j^l and x_j^u, where x_j^l and x_j^u are the minimum and maximum values of dimension j of the search space and also the limits of the positions of the particles. A minimal sketch of this encoding and initialization is given below.
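This sketch assumes the K centers are simply concatenated into one vector of length D = K * d, which matches the particle encoding described above.

    import numpy as np

    def init_swarm(data, K, M, rng):
        # eq. (14): per-feature lower and upper limits taken from the dataset
        x_l, x_u = data.min(axis=0), data.max(axis=0)
        d = data.shape[1]
        # each particle is K concatenated centers, so D = K * d
        return rng.uniform(np.tile(x_l, K), np.tile(x_u, K), size=(M, K * d))

    def decode(particle, K):
        # recover the (K, d) cluster centers encoded in one particle
        return particle.reshape(K, -1)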

6 Mathematical Problems in Engineering

Input: M, t_max, Ne_min, Ne_max, w_min, w_max
Particle Swarm Population Initialization
for i = 1 to M (Population Size)
    Particle(i).BestPosition = Particle(i).Position
    Particle(i).BestFitness = Particle(i).Fitness
    if Particle(i).BestFitness < Best_Particle.Fitness
        Best_Particle.Position = Particle(i).BestPosition
        Best_Particle.Fitness = Particle(i).BestFitness
    End if
End for
for t = 1 to t_max (Max Iteration)
    The Standard PSO Algorithm (see Algorithm 1)
    Particle Selection:
        Se = Euclidean Distance Function(Best_Particle.Position)
        Sf = Fitness Function(Particle(i).Fitness)
        S = Se + Sf
    Starling Birds Collective Responses (see Algorithm 2):
    for i = 1 to S
        Find the Ne neighbors in the neighborhood H of particle i using FER
        The new position of particle i: x_i' = x_i + r_3 * ((1/Ne) * Σ_{i_1∈H} x_{i_1})
        The new velocity of particle i: v_i' = v_i + r_4 * ((1/Ne) * Σ_{i_1∈H} v_{i_1})
        if Particle(i').Fitness < Particle(i).Fitness
            Particle(i).Fitness = Particle(i').Fitness
            Particle(i).Position = Particle(i').Position
            Particle(i).Velocity = Particle(i').Velocity
            if Particle(i').Fitness < Best_Particle.Fitness
                Best_Particle.Position = Particle(i').BestPosition
                Best_Particle.Fitness = Particle(i').BestFitness
            End if
        End if
    End for
    Ne = Ne_min + INT(t * (Ne_max − Ne_min) / t_max)
End for
Output: Best_Particle.Fitness

Algorithm 3: The Chaotic Starling PSO Algorithm

Therefore, $x_{ij} \in [x_j^l, x_j^u]$ for i = 1, 2, ..., M and j = 1, 2, ..., D. The values of x_j^l and x_j^u are determined as in (14):

$$ x_j^l = \min\{X_{ij} \mid i = 1, 2, \ldots, N\}, \quad x_j^u = \max\{X_{ij} \mid i = 1, 2, \ldots, N\}, \quad j = 1, 2, \ldots, d \quad (14) $$

where X_{ij} is dimension j of data point X_i. In particular, the dimension of a particle is D = K ∗ d.

4.3. Boundary Handling. If a particle flies outside or across the boundary of the search space, this particle no longer represents a solution, and its position and velocity are adjusted. If the position of a particle in dimension j is below the lower (above the upper) limit of that dimension, it is set to the lower (upper) limit of that dimension. Accordingly, the velocity of this particle along dimension j is reversed. These adjustments limit the scope of the positions and change the velocities as well as the directions of the flying paths of the particles. The position and velocity of such a particle are adjusted as follows:

$$ x_{ij}' = x_j^l \text{ if } x_{ij} < x_j^l; \quad x_{ij}' = x_j^u \text{ if } x_{ij} > x_j^u; \quad v_{ij}' = -v_{ij} \quad (15) $$

where x_{ij}' is the new value of dimension j of the position and v_{ij}' is the new value of dimension j of the velocity of particle i after the adjustments.
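The clamp-and-reverse rule in (15) translates directly into a few lines of NumPy; this sketch operates on one particle's position and velocity vectors.

    import numpy as np

    def handle_boundary(x, v, x_l, x_u):
        out = (x < x_l) | (x > x_u)     # dimensions that violate the limits
        x = np.clip(x, x_l, x_u)        # set to the violated limit, eq. (15)
        v = np.where(out, -v, v)        # reverse the velocity on those dimensions
        return x, v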

4.4. The Pseudocode of the Proposed CSPSO. The pseudocode of the proposed CSPSO is presented in Algorithm 3.


Table 1: Benchmark functions.

Function   | Expression                                                                  | Domain            | X_min | F_min
Sphere     | $f = \sum_{i=1}^{D} x_i^2$                                                  | [−100, 100]^D     | 0^D   | 0
Rastrigin  | $f = \sum_{i=1}^{D} [x_i^2 - 10\cos(2\pi x_i) + 10]$                        | [−5.12, 5.12]^D   | 0^D   | 0
Griewank   | $f = \sum_{i=1}^{D} \frac{x_i^2}{4000} - \prod_{i=1}^{D} \cos\frac{x_i}{\sqrt{i}} + 1$ | [−600, 600]^D     | 0^D   | 0
Rosenbrock | $f = \sum_{i=1}^{D-1} [(x_{i+1} - x_i^2)^2 + (1 - x_i)^2]$                  | [−2.048, 2.048]^D | 1^D   | 0

[Figure 3: The Sphere and Rastrigin functions for D = 2.]

5. Simulation Experiments

Three experiments are conducted to evaluate the performance of CSPSO. The first experiment validates the optimal parameter values in CSPSO using the Sphere and Rastrigin [33] benchmark functions. The second experiment validates the performance of CSPSO using four classical numerical benchmark functions [33]. These functions are all to be minimized, and their domains are presented in Table 1. The third experiment checks the effectiveness of CSPSO in data clustering using some datasets from the Artificial Datasets [34] and the UCI Machine Learning Repository [35]. CSPSO is implemented in MATLAB, and all the experiments are conducted on a LENOVO desktop computer with an Intel 3.20 GHz i5-4460 processor and 8 GB of RAM in a Windows 8 environment.

5.1. Experiment on Parameter Settings. As described above, the values of the parameters have important influences on the performance of CSPSO. This section focuses on checking the influences of the three critical parameters in CSPSO, the inertia weight w, the size of the neighborhood Ne, and the number of selected particles S, so as to find appropriate values for them. The Sphere and Rastrigin functions are used to study the influences of different values of these parameters in CSPSO. The shapes and ranges of the Sphere and Rastrigin functions for D = 2 are depicted in Figure 3. The dimension of the benchmark functions is D = 10 in this experiment. CSPSO was run 30 times for each of the two benchmark functions.

For the validity of the experiments and fair comparisons, parameters not tested in the experiments are kept at the same values: the population size M = 50, the maximum number of iterations t_max = 100, and the cognitive coefficients c_1 = c_2 = 2.

The influence of the inertia weight w is tested first. Its value is adjusted according to (7). The values of w_min and w_max directly determine the decline speed of w and consequently affect the convergence speed of CSPSO, so their values are varied in the experiments. When the value of w varies, the minimum and maximum sizes of the neighborhood are kept at Ne_min = 2 and Ne_max = 5, and the number of selected particles is kept at S = 10. The best and mean values of the Sphere and Rastrigin functions are reported in Table 2.

The differences in these results are obvious. The optimal values of the Sphere and Rastrigin functions were obtained at different values of w. However, the mean values differ for the Rastrigin function because it is multimodal with many local optima; generally, it is difficult to find global optimal solutions for multimodal functions with traditional optimization algorithms. Based on these results, the optimal values of the lower and upper limits of the inertia weight are set to w_min = 0.4 and w_max = 1.2, where the minimum mean values are achieved.


Table 2: Results when varying the inertia weight w.

Inertia weight (w_min − w_max) | Sphere: Best (Mean) | Rastrigin: Best (Mean) | Mean time, Sphere | Mean time, Rastrigin
0.2−0.6 | 0 (0.0022) | 0 (0.9684) | 3.3773 | 3.1321
0.2−0.9 | 0 (0)      | 0 (1.6269) | 3.1235 | 3.1358
0.4−0.9 | 0 (0)      | 0 (1.7282) | 3.1258 | 3.1309
0.4−1.2 | 0 (0)      | 0 (0.7567) | 3.1229 | 3.1359

Table 3: Results when varying the size of the neighborhood Ne.

Neighborhood size (Ne_min − Ne_max) | Sphere: Best (Mean) | Rastrigin: Best (Mean) | Mean time, Sphere | Mean time, Rastrigin
2−5 | 0 (0.1382) | 0 (1.0071) | 3.1503 | 3.1420
2−7 | 0 (0)      | 0 (0.7359) | 3.1389 | 3.1416
2−9 | 0 (0)      | 0 (1.3716) | 3.1403 | 3.1446

Table 4: Results when varying the number of selected particles S.

S_e = S_f | S  | Sphere: Best (Mean) | Rastrigin: Best (Mean) | Mean time, Sphere | Mean time, Rastrigin
5         | 10 | 0 (0)      | 0 (0.6586) | 3.1808 | 3.1966
7         | 14 | 0 (0.7567) | 0 (1.5407) | 4.2845 | 4.2645
9         | 18 | 0 (2.5609) | 0 (2.4015) | 5.3850 | 5.3840


The influence of the size of the neighborhood Ne is examined next. In CSPSO, the size of the neighborhood increases gradually to enhance the local search ability. Large values lead to slow convergence, and small values restrain the local search ability; an appropriate value balances exploration and exploitation. The lower and upper limits on the size of the neighborhood are set to different values, and the best and mean values of the Sphere and Rastrigin functions are reported in Table 3.

Table 3 shows that the best results are obtained when the lower and upper limits on the size of the neighborhood are Ne_min = 2 and Ne_max = 7. The influence of the number of selected particles S is then tested; it affects both convergence and search ability. Different values of Se and Sf are used, and the results are reported in Table 4.

The results in Table 4 show that the best number of selected particles is 10, where the mean values of the test functions and the computation time are both low. It is easy to see that more particles do not always lead to better results; because of the randomness built into the algorithm, sometimes most of the particles may just gather around local optima.

Based on these results, the lower and upper limits on the inertia weight, w_min and w_max, are set to 0.4 and 1.2, respectively; the lower and upper limits on the size of the neighborhood, Ne_min and Ne_max, are set to 2 and 7, respectively; and the number of selected particles S is set to 10. These parameter values in CSPSO are kept the same and are used in the following experiments.

5.2. Benchmark Functions. In this section, the four classical numerical benchmark functions presented in Table 1 are used to validate the effectiveness of CSPSO. The dimension of the benchmark functions, which determines the dimension of the search space, is set to D = 10, 20, and 30. The performance of CSPSO is compared with those of PSO, fitness-Euclidean distance ratio particle swarm optimization (FER-PSO) [23], and starling particle swarm optimization (SPSO) [25]. PSO is the basic swarm intelligence algorithm developed by Kennedy and Eberhart [26]. FER-PSO utilizes the individual best of a neighbor, selected based on the Euclidean distance and the fitness function value, to replace a particle's own individual best. SPSO uses the collective responses of the particles to adjust their positions and velocities when the algorithm stagnates.

The values of the adjustable parameters in these competing algorithms are the best ones reported in the respective references, as listed in Table 5. These competing algorithms were also run 50 independent times each so as to obtain meaningful results. The best fitness function values obtained by these algorithms for these functions are reported in Table 6.

Figure 4 shows the convergence of the algorithms on some of the functions. Compared with PSO and FER-PSO in Figures 4(b) and 4(c), the curves of the function values obtained by CSPSO decline slowly at the beginning of the evolutionary process because of the time taken by local search; hence it has a slow initial convergence speed. Relying on DDT and chaotic mapping, CSPSO finds lower fitness values than the other algorithms after about half of the evolutionary process, rather than falling into local optima. The local search strategy in the neighborhood helps in finding better solutions near the current position of a particle. In particular, the curve of the Sphere function obtained by CSPSO disappears after the best value 0 has been found, because this value cannot be marked on the logarithmic chart. As shown in Figure 4(d), for the Rosenbrock function, a multimodal function, traditional algorithms like PSO, FER-PSO, and SPSO are easily trapped into local optima, but CSPSO finds the smallest function values among those found by these algorithms.


Table 5: Parameter settings in the experiments.

Parameters       | PSO    | ABC [22] | FER-PSO [23] | GABC [24] | SPSO [25]  | CSPSO
Population (M)   | 50     | 50       | 50           | 50        | 50         | 50
t_max            | 200    | 200      | 200          | 200       | 200        | 200
c_1, c_2         | 2, 2   | −        | 2, 2         | −         | 2, 2       | 2, 2
r_1, r_2         | (0, 1) | (−1, 1)  | (0.1, 0.4)   | (−1, 1)   | (0, 1)     | (0, 1)
(w_min, w_max)   | 1      | −        | 1            | 1         | (0.2, 0.4) | (0.4, 1.2)
Neighbors (Ne)   | −      | −        | 7            | −         | 7          | 2−7
Selected (S)     | −      | −        | −            | −         | −          | 10
Sub-population   | −      | −        | −            | −         | 14         | −
Stagnant limit   | −      | −        | −            | −         | 2          | −
Trial limit      | −      | 10       | −            | 10        | −          | −

Table 6: Best fitness function values obtained by the different algorithms for the test functions.

Function   | D  | PSO     | FER-PSO | SPSO       | CSPSO
Sphere     | 10 | 0       | 0       | 0          | 0
           | 20 | 0.5285  | 0.1613  | 8.2232e−12 | 0
           | 30 | 2.4235  | 0.8891  | 5.1645e−12 | 0
Rastrigin  | 10 | 0.0727  | 0.03647 | 0          | 0
           | 20 | 0.3595  | 0.1895  | 0          | 0
           | 30 | 1.0374  | 0.4847  | 0          | 0
Griewank   | 10 | 0.5965  | 0.1135  | 0          | 0
           | 20 | 1.4575  | 1.0994  | 0          | 0
           | 30 | 3.2947  | 1.6874  | 0          | 0
Rosenbrock | 10 | 6.3220  | 6.8162  | 5.2486     | 0.7037
           | 20 | 19.8643 | 19.2219 | 18.7899    | 9.8431
           | 30 | 38.1130 | 33.7894 | 28.7863    | 18.6928

These results show that CSPSO has strong global search ability through the use of DDT and chaotic mapping, and that it can quickly escape local optima.

As the results for the four classic functions in Table 6 show, CSPSO and SPSO perform better than PSO and FER-PSO on the Rastrigin and Griewank functions, and CSPSO performs better than the other three algorithms on the Sphere and Rosenbrock functions. In general, the results show that CSPSO has more powerful global search ability than traditional algorithms like PSO, FER-PSO, and SPSO on both unimodal and multimodal functions. Furthermore, as indicated by the results in Table 6, the better performance of CSPSO provides evidence that the improvements in CSPSO are effective in enhancing the performance of PSO and that CSPSO finds the global optimum more often than the other algorithms in these experiments.

5.3. Clustering Problems. In this section, experiments on clustering problems with different datasets are performed, and the results of the different algorithms, including CSPSO, are reported and discussed. Eight benchmark datasets are used: Data_3_2, Data_5_2, Data_10_2, and Data_4_3 from the work reported in [34], and Iris, Glass, Contraceptive Method Choice (CMC), and Wine from the UCI Machine Learning Repository [35]. More details about these datasets are given in Table 7. Data_5_2 and Data_4_3 are also plotted in Figure 5.

The optimized values of the parameters in the other algorithms, shown in Table 5, are adopted in the experiments. In order to perform fair comparisons, all the algorithms were run on all the clustering datasets 50 times to eliminate the effects of random factors. As simple statistics, best values (Best), worst values (Worst), mean values (Mean), and standard deviations (SD) are used as the evaluation criteria to measure the effectiveness of these clustering algorithms. The experimental environment is the same for all clustering algorithms.

Figure 6 shows the convergence of these algorithms on four datasets for typical runs. The fitness function values of CSPSO decline slowly at the beginning of the evolutionary process and continue to decline, rather than dropping into a local optimum in the middle of the evolutionary process, possibly because DDT gives the particles high velocities. Local search based on the Euclidean and fitness neighborhood structure also helps the particles find better positions. This phenomenon is more evident on high-dimensional datasets, such as the Glass dataset.

To be more clear, Figure 7 gives more details about the convergence of these algorithms near the end of the search process on three of the four datasets.


Table 7: Description of the datasets used in the experiments.

Dataset   | Clusters (K) | Features (d) | Total instances (N) | Instances in each cluster
Data_3_2  | 3  | 2  | 76   | (25, 25, 26)
Data_5_2  | 5  | 2  | 250  | (50, 50, 50, 50, 50)
Data_10_2 | 10 | 2  | 500  | (50, 50, 50, 50, 50, 50, 50, 50, 50, 50)
Data_4_3  | 4  | 3  | 400  | (100, 100, 100, 100)
Iris      | 3  | 4  | 150  | (50, 50, 50)
Glass     | 6  | 9  | 214  | (70, 17, 76, 13, 9, 29)
CMC       | 3  | 9  | 1473 | (629, 334, 510)
Wine      | 3  | 13 | 178  | (59, 71, 48)

[Figure 4: Convergence of the algorithms on some functions (D = 10): (a) Sphere function, (b) Rastrigin function, (c) Griewank function, (d) Rosenbrock function. Each panel plots the best values (log scale) against the iteration number for CSPSO, PSO, SPSO, and FER-PSO.]

CSPSO has the best performance among all these clustering algorithms. ABC is the first to converge to a local optimal point, possibly because of the influence of randomly selected neighbors. GABC has a poor performance for large data volumes, such as the Data_5_2 and Glass datasets, possibly due to the local search behavior of the onlooker bees. FER-PSO has apparently better convergence performance than the other algorithms, possibly due to the strategy that only one of the neighbors is selected to guide the search of a particle, but it is easily trapped in local optimal points in the later generations of the search process.


[Figure 5: Distributions of data points in the Data_5_2 and Data_4_3 datasets.]

SPSO shows a better performance in searching for a global optimum, but its local search strategy does not work well on high-dimensional datasets. The above results and discussions reveal the different performances among PSO, ABC, FER-PSO, GABC, SPSO, and CSPSO and verify the good performance of CSPSO as a clustering algorithm.

It can be seen from Table 8 that CSPSO has a better performance on these classical clustering problems. For the Data_3_2, Data_5_2, and Data_10_2 datasets, CSPSO obtained lower mean fitness values than the others, although some other algorithms perform better than CSPSO on the best fitness values; this is possibly due to the randomness of the swarm intelligence algorithms. For the Data_4_3 dataset, because of its large number of data points, the ability of local search in CSPSO is affected by the selected neighbors. ABC has better stability on this dataset but a poorer ability in searching for a global optimum than the others, due to its random strategy. CSPSO outperforms the other algorithms and obtains lower mean fitness values on the Iris, Glass, CMC, and Wine datasets; therefore, it has better performance on high-dimensional data. The proposed algorithm, with embedded DDT and chaotic mapping, has better global search ability and is more stable than traditional PSO through the use of a local search method based on starling birds. Furthermore, as the sizes of the datasets increase in both the dimension and the number of data points, CSPSO performs better than PSO, ABC, GABC, FER-PSO, and SPSO and is more capable of dealing with big data in the current information age.

6. Conclusions

Because the performance of standard PSO depends on its parameter values, parameter adjustment is one of the most useful ways for PSO to obtain good exploration and exploitation abilities. CSPSO, proposed in this study, enhances the overall convergence of the search process. Nonlinear adjustment of the inertia weight and the chaotic search method based on logistic mapping enhance the global search performance of the traditional PSO and help particles escape from local optima. The DDT added to the velocity update increases the velocities of the particles in later iterations and also helps particles escape from local optima. The local search strategy based on the behavior of starling birds utilizes the information of the nearest neighbors in the neighborhood and determines a new collective position and a new collective velocity. The two particle selection methods, Euclidean distance and fitness function value, maintain the population diversity of the particles. These improvements help particles avoid stagnation and find better solutions in the search space.

The results of the simulation experiments on both benchmark function optimization and classical clustering problems show the effectiveness of CSPSO. Compared with the traditional PSO and other evolutionary algorithms, CSPSO has better convergence properties and stronger global search ability. However, as with other swarm intelligence algorithms, CSPSO may be trapped into, and may stagnate around, local optimal points and therefore may be unstable when applied to problems with multiple local optima. In future work, CSPSO will be applied to other combinatorial optimization problems, such as scheduling problems. Other metaheuristic methods and optimization techniques may also be used to improve the performance of PSO.

Data Availability

All the datasets used in the simulation experiments came from the Artificial Datasets, available at http://www.isical.ac.in/~sanghami/data.html (accessed June 2017), and from the UCI Machine Learning Repository, available at http://archive.ics.uci.edu/ml/datasets.html (accessed June 2017).

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.


[Figure 6: Convergence of the algorithms on some datasets: (a) Data_3_2, (b) Data_5_2, (c) Iris, (d) Glass. Each panel plots the best values against the iteration number for CSPSO, PSO, SPSO, FER-PSO, GABC, and ABC.]

[Figure 7: Convergence of the algorithms near the end of the search on the Data_3_2, Data_5_2, and Iris datasets.]


Table 8: Comparison of the performance of CSPSO with the other clustering algorithms.

Dataset   | Criterion | PSO        | ABC        | FER-PSO    | GABC       | SPSO       | CSPSO
Data_3_2  | Best      | 47.5287    | 47.7630    | 47.5290    | 47.5282    | 47.5733    | 47.5282
          | Worst     | 59.1935    | 60.1039    | 49.7856    | 59.1895    | 53.1721    | 47.5382
          | Mean      | 48.4644    | 50.9702    | 47.8771    | 48.2279    | 47.8934    | 47.5300
          | SD        | 3.1956     | 3.1345     | 0.4967     | 2.7975     | 0.8551     | 0.0022
Data_5_2  | Best      | 326.4386   | 337.5538   | 326.9987   | 329.7155   | 326.4666   | 326.4432
          | Worst     | 392.2899   | 411.0786   | 332.6651   | 364.8400   | 327.0877   | 326.9676
          | Mean      | 329.6249   | 373.6977   | 328.7897   | 343.5928   | 326.7493   | 326.5300
          | SD        | 12.9978    | 20.1046    | 1.1594     | 9.6214     | 0.1536     | 0.0942
Data_10_2 | Best      | 1.0793e+03 | 916.4320   | 875.6264   | 1.3134e+03 | 843.2274   | 849.8712
          | Worst     | 1.4459e+03 | 1.2718e+03 | 1.2068e+03 | 1.1657e+03 | 1.1827e+03 | 1.0141e+03
          | Mean      | 1.2875e+03 | 1.0608e+03 | 1.0066e+03 | 638.046    | 941.8921   | 918.2542
          | SD        | 70.4484    | 76.1788    | 67.1964    | 1.3134e+03 | 85.5520    | 50.5344
Data_4_3  | Best      | 751.8897   | 1.0937e+03 | 830.5456   | 750.4056   | 749.9506   | 749.6738
          | Worst     | 1.2866e+03 | 1.6347e+03 | 1.3753e+03 | 1.8485e+03 | 1.2726e+03 | 1.2725e+03
          | Mean      | 838.8670   | 1.3838e+03 | 1.0312e+03 | 1.1489e+03 | 793.8641   | 784.5028
          | SD        | 181.8515   | 122.3579   | 150.0554   | 271.3938   | 142.5136   | 125.1508
Iris      | Best      | 100.5216   | 113.1201   | 98.0975    | 96.6829    | 96.6605    | 96.6605
          | Worst     | 130.6931   | 156.8960   | 102.9780   | 127.6869   | 98.6906    | 98.0888
          | Mean      | 113.7655   | 133.1161   | 100.3761   | 105.0487   | 96.9465    | 96.9194
          | SD        | 7.1771     | 11.0706    | 1.2211     | 11.2307    | 0.3921     | 0.2931
Glass     | Best      | 329.1487   | 333.7214   | 278.6565   | 288.5152   | 235.3554   | 229.6544
          | Worst     | 436.5615   | 503.1607   | 326.5351   | 402.5881   | 309.0997   | 278.9187
          | Mean      | 376.4035   | 422.3809   | 297.3875   | 346.4078   | 280.5624   | 257.0244
          | SD        | 21.5371    | 32.0309    | 11.1730    | 26.9515    | 14.5045    | 10.3633
CMC       | Best      | 5.8449e+03 | 5.9895e+03 | 5.6153e+03 | 5.6704e+03 | 5.5615e+03 | 5.5404e+03
          | Worst     | 6.9260e+03 | 7.2137e+03 | 5.9728e+03 | 6.4143e+03 | 5.7676e+03 | 5.5861e+03
          | Mean      | 6.1772e+03 | 6.4821e+03 | 5.7297e+03 | 5.8864e+03 | 5.6426e+03 | 5.5562e+03
          | SD        | 186.7542   | 243.2599   | 73.0180    | 140.8947   | 44.6957    | 8.7806
Wine      | Best      | 1.6308e+04 | 1.6399e+04 | 1.6312e+04 | 1.6309e+04 | 1.6305e+04 | 1.6298e+04
          | Worst     | 1.6798e+04 | 1.7317e+04 | 1.6376e+04 | 1.6482e+04 | 1.6356e+04 | 1.6313e+04
          | Mean      | 1.6438e+04 | 1.6705e+04 | 1.6336e+04 | 1.6355e+04 | 1.6322e+04 | 1.6304e+04
          | SD        | 95.0097    | 219.7057   | 15.9233    | 35.4218    | 12.0030    | 3.3799

Acknowledgments

This research project was partially supported by the National Natural Science Foundation of China (61472231, 61502283, 61640201), the Humanities and Social Science Research Fund of the Ministry of Education of China (12YJA630152), and the Shandong Social Science Foundation of China (16BGLJ06, 11CGLJ22).

References

[1] X. Liu, H. Liu, and H. Duan, "Particle swarm optimization based on dynamic niche technology with applications to conceptual design," Advances in Engineering Software, vol. 38, no. 10, pp. 668–676, 2007.
[2] Z. Bao and T. Watanabe, "Mixed constrained image filter design using particle swarm optimization," Artificial Life and Robotics, vol. 15, no. 3, pp. 363–368, 2010.
[3] S. Sahin and M. A. Cavuslu, "FPGA implementation of wavelet neural network training with PSO/iPSO," Journal of Circuits, Systems and Computers, vol. 27, no. 6, 2018.
[4] D. Wu and H. Gao, "A BP and switching PSO based optimization approach for engine optimization," National Academy of Science Letters, vol. 40, no. 1, pp. 33–37, 2017.
[5] Q. Jia and Y. Guo, "Hybridization of ABC and PSO algorithms for improved solutions of RCPSP," Journal of the Chinese Institute of Engineers, vol. 39, no. 6, pp. 727–734, 2016.
[6] C. Yue, B. Qu, and J. Liang, "A multi-objective particle swarm optimizer using ring topology for solving multimodal multi-objective problems," IEEE Transactions on Evolutionary Computation, 2017.
[7] X. Liu and J. Xue, "A cluster splitting technique by Hopfield networks and P systems on simplices," Neural Processing Letters, pp. 1–24, 2017.
[8] Y. Zhao, X. Liu, and W. Wang, "Spiking neural P systems with neuron division and dissolution," PLoS ONE, vol. 11, no. 9, Article ID e0162882, 2016.
[9] X. Liu, Y. Zhao, and M. Sun, "An improved Apriori algorithm based on an evolution-communication tissue-like P system with promoters and inhibitors," Discrete Dynamics in Nature and Society, vol. 2017, 2017.
[10] N. Netjinda, T. Achalakul, and B. Sirinaovakul, "Particle swarm optimization inspired by starling flock behavior," Applied Soft Computing, vol. 35, pp. 411–422, 2015.
[11] A. El Dor, D. Lemoine, M. Clerc, P. Siarry, L. Deroussi, and M. Gourgand, "Dynamic cluster in particle swarm optimization algorithm," Natural Computing, vol. 14, no. 4, pp. 655–672, 2015.
[12] Y. M. B. Ali, "Unsupervised clustering based an adaptive particle swarm optimization algorithm," Neural Processing Letters, vol. 44, no. 1, pp. 221–244, 2016.
[13] G. Armano and M. R. Farmani, "Multiobjective clustering analysis using particle swarm optimization," Expert Systems with Applications, vol. 55, pp. 184–193, 2016.
[14] B. Niu, Q. Duan, L. Tan, C. Liu, and P. Liang, "A population-based clustering technique using particle swarm optimization and k-means," in Advances in Swarm and Computational Intelligence, vol. 9140 of Lecture Notes in Computer Science, pp. 145–152, Springer International Publishing, Cham, 2015.
[15] X. Zhao, W. Lin, J. Hao, X. Zuo, and J. Yuan, "Clustering and pattern search for enhancing particle swarm optimization with Euclidean spatial neighborhood search," Neurocomputing, vol. 171, pp. 966–981, 2016.
[16] E. Comak, "A modified particle swarm optimization algorithm using Renyi entropy-based clustering," Neural Computing and Applications, vol. 27, no. 5, pp. 1381–1390, 2016.
[17] K. K. Bharti and P. K. Singh, "Opposition chaotic fitness mutation based adaptive inertia weight BPSO for feature selection in text clustering," Applied Soft Computing, vol. 43, pp. 20–34, 2016.
[18] W. Song, W. Ma, and Y. Qiao, "Particle swarm optimization algorithm with environmental factors for clustering analysis," Soft Computing, vol. 21, no. 2, pp. 283–293, 2017.
[19] R. Liu, J. Li, J. Fan, C. Mu, and L. Jiao, "A coevolutionary technique based on multi-swarm particle swarm optimization for dynamic multi-objective optimization," European Journal of Operational Research, vol. 261, no. 3, pp. 1028–1051, 2017.
[20] X. Chu, B. Niu, J. J. Liang, and Q. Lu, "An orthogonal-design hybrid particle swarm optimiser with application to capacitated facility location problem," International Journal of Bio-Inspired Computation, vol. 8, no. 5, pp. 268–285, 2016.
[21] D. Wang, D. Tan, and L. Liu, "Particle swarm optimization: an overview," Soft Computing, vol. 22, no. 2, 2018.
[22] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, 2005.
[23] X. Li, "A multimodal particle swarm optimizer based on fitness Euclidean-distance ratio," in Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation, pp. 78–85, London, England, July 2007.
[24] G. Zhu and S. Kwong, "Gbest-guided artificial bee colony algorithm for numerical function optimization," Applied Mathematics and Computation, vol. 217, no. 7, pp. 3166–3173, 2010.
[25] A. Laudani, F. R. Fulginei, G. M. Lozito, and A. Salvini, "Swarm/flock optimization algorithms as continuous dynamic systems," Applied Mathematics and Computation, vol. 243, pp. 670–683, 2014.
[26] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.
[27] K. K. Bharti and P. K. Singh, "Chaotic gradient artificial bee colony for text clustering," Soft Computing, vol. 20, no. 3, pp. 1113–1126, 2016.
[28] R. M. May, "Simple mathematical models with very complicated dynamics," Nature, vol. 261, no. 5560, pp. 459–467, 1976.
[29] F. S. Hover and M. S. Triantafyllou, "Application of polynomial chaos in stability and control," Automatica, vol. 42, no. 5, pp. 789–795, 2006.
[30] X. Liu, L. Xiang, and X. Wang, "Spatial cluster analysis by the Adleman-Lipton DNA computing model and flexible grids," Discrete Dynamics in Nature and Society, vol. 2012, Article ID 894207, 32 pages, 2012.
[31] H. Peng, X. Luo, Z. Gao, J. Wang, and Z. Pei, "A novel clustering algorithm inspired by membrane computing," The Scientific World Journal, Article ID 929471, 8 pages, 2015.
[32] B. Ufnalski and L. M. Grzesiak, "Plug-in direct particle swarm repetitive controller with a reduced dimensionality of a fitness landscape: a multi-swarm approach," Bulletin of the Polish Academy of Sciences: Technical Sciences, vol. 63, no. 4, pp. 857–866, 2015.
[33] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.
[34] "Datasets," http://www.isical.ac.in/~sanghami/data.html.
[35] H. C. Blake, UCI Repository of Machine Learning Databases, 1998.


for i = 1 to M
    v_ij(t+1) = w * v_ij(t) + c_1 * r_1 * (p_ij(t) - x_ij(t)) + c_2 * r_2 * (p_gj(t) - x_ij(t))
    x_ij(t+1) = x_ij(t) + v_ij(t+1)
    if Particle(i).Fitness < Particle(i).BestFitness
        Particle(i).BestPosition = Particle(i).Position
        Particle(i).BestFitness = Particle(i).Fitness
        if Particle(i).BestFitness < BestParticle.Fitness
            BestParticle.Position = Particle(i).BestPosition
            BestParticle.Fitness = Particle(i).BestFitness
        end if
    end if
end for

Algorithm 1: The Standard PSO Algorithm.

problem. Ali [12] presented a new variant of the position updating rule of PSO in which each particle is treated as a bidimensional vector. Armano and Farmani [13] proposed a multiobjective clustering PSO algorithm with two objective functions defined to integrate data connectivity and cohesion. Niu et al. [14] proposed a population-based clustering algorithm which combines different PSO and k-means algorithms to help particles escape from local optima.

Exploring the topological neighborhood of the k-nearest neighbors and employing pattern search are considered useful tools to improve the performance of PSO [15]. Comak [16] proposed a PSO procedure using Renyi entropy clustering, which contains two steps: initialization and particle removal. Bharti and Singh [17] proposed a binary PSO procedure with an opposition-based learning mechanism using chaotic mapping, a dynamic inertia weight, and a mutation operator. Song et al. [18] added an environment factor to the velocity adjustment in PSO to enhance the robust behavior of the particles. Liu et al. [19] proposed a modified coevolutionary multiswarm PSO procedure based on new velocity updating and similarity detection to solve multiobjective clustering problems.

Although PSO has been widely used in many fields and has shown great potential in solving optimization problems, it still has some limitations. For example, it is easily trapped into local optima and has a low convergence speed. So far, no effective methods have been developed to balance its local and global searching abilities [20]. Therefore, more work is needed to enhance the performance of PSO [21]. On the other hand, data clustering, as one of the most popular data mining techniques for discovering potential information and knowledge from data, needs effective methods to obtain better clustering results. In this study, a chaotic starling particle swarm optimization (CSPSO) algorithm is proposed to obtain better clustering results by improving the PSO performance.

CSPSO has three major parts. A chaotic mapping, rather than a random generation method, is introduced to generate the acceleration parameters. A dynamic disturbance term (DDT) is added in velocity updating to avoid trapping into local optima. In order to improve the search ability, a local search strategy based on the behavior of starling birds is used, and the information of the neighbors is collected to guide the direction of a particle.

The rest of this paper is organized as follows. The traditional PSO and the clustering problem are described in Section 2. The CSPSO algorithm is developed and described in detail in Section 3, and its application to clustering problems is presented in Section 4. In Section 5, simulation experiments are conducted and comparisons with existing methods are performed to analyze the effectiveness of CSPSO. Section 6 gives conclusions and future research directions.

2. Preliminaries

In this section, the basic concepts of PSO and of data clustering problems related to the development of the proposed algorithm are described in some detail.

2.1. The Standard PSO. PSO is one of the swarm intelligence algorithms inspired by the social behavior of bird flocking [26]. In PSO, the population size of the particles is denoted by M and the dimension of the search space is denoted by D. Each particle i has a position vector x_i = (x_i1, x_i2, ..., x_iD), a velocity vector v_i = (v_i1, v_i2, ..., v_iD), and an individual best position in history p_i = (p_i1, p_i2, ..., p_iD). The best position found by all particles in the swarm, called the global best, is represented by p_g = (p_g1, p_g2, ..., p_gD). At iteration t, the new position and velocity of particle i are determined by (1) and (2), respectively:

v_ij(t+1) = w * v_ij(t) + c_1 * r_1 * (p_ij(t) - x_ij(t)) + c_2 * r_2 * (p_gj(t) - x_ij(t))   (1)

x_ij(t+1) = x_ij(t) + v_ij(t+1)   (2)

where w is the inertia weight, c_1 and c_2 are the acceleration coefficients controlling the step size, and r_1 and r_2 are two independently generated random numbers uniformly distributed between 0 and 1. Two termination criteria are used: one is when a preset maximal number of iterations is reached, and the other is when a tolerance level has been achieved. The standard PSO algorithm is described in Algorithm 1.
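To make the update rule concrete, the following is a minimal Python sketch of one iteration of (1) and (2); the function name, array shapes, and default parameter values are illustrative assumptions rather than part of the original specification.

import numpy as np

def pso_step(x, v, p_best, g_best, w=0.9, c1=2.0, c2=2.0, rng=None):
    """One iteration of the standard PSO update, Eqs. (1)-(2).

    x, v   : (M, D) arrays of particle positions and velocities
    p_best : (M, D) array of individual best positions
    g_best : (D,) array holding the global best position
    """
    rng = rng or np.random.default_rng()
    r1 = rng.random(x.shape)   # r1, r2 ~ U(0, 1), drawn per particle and dimension
    r2 = rng.random(x.shape)
    v_new = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)   # Eq. (1)
    x_new = x + v_new                                                 # Eq. (2)
    return x_new, v_new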


Figure 1: Bifurcation diagram for the logistic map (chaotic value Ch(t) versus the control parameter u over [2.5, 4]).

2.2. Chaotic Mapping. Chaotic maps are mapping methods used to generate random numbers with features like ergodicity and nonlinear and random similarity [27]. There are many kinds of chaotic maps, such as the tent map, the Tchebychev map, and the logistic map. Specifically, the logistic map developed by May [28] can explore the vicinity of a solution by oscillating in the region. One chaotic variant based on logistic mapping is given by (3) [29]:

Ch(t+1) = u * Ch(t) * (1 - Ch(t)),   Ch(t) ∈ (0, 1), u ∈ (0, +∞)   (3)

where Ch(t) represents the value of the chaotic number at time t, u is the control parameter, and Ch(0) ∉ {0, 0.25, 0.5, 0.75, 1.0}. In (3), the parameter u controls the behavior of the chaotic variant Ch. The values of Ch(t) for varying values of u are shown in Figure 1.
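A minimal Python sketch of this map follows; the seed value 0.37 and the sequence length are arbitrary illustrative choices.

def logistic_sequence(ch0=0.37, u=4.0, n=10):
    """Iterate the logistic map of Eq. (3): Ch(t+1) = u * Ch(t) * (1 - Ch(t)).

    ch0 must lie in (0, 1) and avoid {0, 0.25, 0.5, 0.75, 1.0}; with u = 4
    the iterates wander chaotically over the whole interval (0, 1).
    """
    ch, seq = ch0, []
    for _ in range(n):
        ch = u * ch * (1.0 - ch)
        seq.append(ch)
    return seq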

2.3. Data Clustering Problems. Let X = {X_q} for q = 1, 2, ..., N be a dataset containing N data points or instances. Data point q is represented by X_q = (X_q1, X_q2, ..., X_qd), with d representing the dimension of the data. The purpose of a data clustering problem is to find a partition of the dataset by optimizing a fitness function. A partition is represented by C = {C_k} for k = 1, 2, ..., K, where K is the number of clusters [30]. A partition must satisfy the following conditions:

(i) C_k1 ∩ C_k2 = ∅ for all k1 ≠ k2;
(ii) ∪_{k=1}^{K} C_k = X;
(iii) C_k ≠ ∅ for all k = 1, 2, ..., K.

Usually, the Euclidean distance is used to measure the difference or similarity between two data points X_q1 and X_q2. Let z = {z_k} for k = 1, 2, ..., K represent the centers of the K clusters C_k. The intracluster distance of cluster C_k is the sum of the distances of all data points in the cluster to z_k, given by (4):

D(X, C_k) = Σ_{X_q ∈ C_k} ||X_q - z_k||   (4)

The quality of the clustering results for the dataset can be measured by the sum of the intracluster distances over all clusters, given by (5) [31]:

F = F(C_1, C_2, ..., C_K) = Σ_{k=1}^{K} D(X, C_k) = Σ_{k=1}^{K} Σ_{X_q ∈ C_k} ||X_q - z_k||   (5)

F defined in (5) is used as the fitness function in the clustering experiments that follow.
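The following Python sketch evaluates this fitness function, assuming the usual convention in PSO-based clustering that each data point belongs to the cluster of its nearest center; the function name and array layout are assumptions.

import numpy as np

def clustering_fitness(X, centers):
    """Sum of intracluster distances, Eqs. (4)-(5).

    X       : (N, d) array of data points
    centers : (K, d) array of cluster centers z_k
    Each point is assigned to its nearest center; the Euclidean distances
    from the points to their assigned centers are then summed.
    """
    dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)  # (N, K)
    return dists.min(axis=1).sum()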

3. The Chaotic Starling Particle Swarm Optimization Algorithm

Some efforts have been made to enhance the search performance and convergence speed of PSO. In this section, a chaotic mapping method and a DDT are introduced into the PSO algorithm to improve the global search ability, and a local search technique based on starling birds is added to improve the convergence speed. This improved PSO method is the CSPSO algorithm. The major components and the details of the CSPSO algorithm are discussed in this section.

3.1. Dynamic Update of CSPSO Parameters. Three components, the velocity of the previous iteration, the individual factor, and the social factor, are used to update the velocity of a particle. Three parameters, w, r_1, and r_2, control the relative contributions of the three components. The inertia weight w controls the influence of the velocity of the previous iteration. The cognitive coefficients c_1 and c_2 balance the contributions of the individual and social factors. Furthermore, the acceleration coefficients r_1 and r_2 control the proportion retained in the individual and social factors. Each parameter plays an important role in the velocity update and consequently also affects the particle position update. Instead of using fixed values, logistic mapping is used to generate the acceleration coefficients r_1 and r_2 in each generation. In addition, the inertia weight is adjusted using an exponential function. The modifications of these parameters are shown in (6) and (7):

r_p(t+1) = 4 * r_p(t) * (1 - r_p(t)),   r_p(t) ∈ (0, 1), p = 1, 2   (6)

w(t) = w_min + (w_max - w_min) * e^(-t / (t_max - t))   (7)

where w_min and w_max are the minimum and maximum of the inertia weight and t_max is the maximum number of iterations.
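A sketch of these two updates in Python follows; the fraction -t/(t_max - t) in the exponent is a reconstruction of the garbled source formula, and the default limits follow the values selected later in the experiments.

import math

def chaotic_r(r_prev):
    """Logistic update of an acceleration coefficient, Eq. (6)."""
    return 4.0 * r_prev * (1.0 - r_prev)

def inertia_weight(t, t_max, w_min=0.4, w_max=1.2):
    """Nonlinearly decreasing inertia weight, Eq. (7): decays from
    w_max toward w_min as t approaches t_max."""
    if t >= t_max:
        return w_min
    return w_min + (w_max - w_min) * math.exp(-t / (t_max - t))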

3.2. The Dynamic Disturbance Term. As an optimization algorithm based on swarm intelligence, PSO steers the trajectories of the particles in the population by comparing fitness function values to select the individual and global best positions. Because of local optimal solutions, the velocity of a particle gradually decreases during the later iterations. Therefore, the majority of the particles fall into, and cannot easily jump out of, local optimal points, a phenomenon called premature convergence and evolutionary stagnation. To overcome these problems, a method based on a DDT [32] is used to change the velocity update of each particle. The DDT is embedded into velocity updating when the velocity of a particle becomes too low or when its position stays unchanged in the middle and final generations. The modified velocity is updated using (8), and the DDT, represented by T, is updated using (9):

v_ij(t+1) = w * v_ij(t) + c_1 * r_1 * (p_ij(t) - x_ij(t)) + c_2 * r_2 * (p_gj(t) - x_ij(t)) + T   (8)

T = m * (t / t_max - 0.5)   (9)

where m is an accommodation coefficient with a value between 0 and 1. This improvement helps prevent a particle from flying out of the boundary of the search space when its velocity is too high in early generations, and helps it escape from a local optimal point by increasing its velocity in later generations, so as to improve the overall searching capacity. Besides, T increases linearly, which avoids oscillation and keeps a relatively stable search direction in the optimization process.
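The following sketch shows how the DDT enters the velocity update; the function names and the choice m = 0.5 are illustrative assumptions.

def disturbance_term(t, t_max, m=0.5):
    """DDT of Eq. (9); m in (0, 1) is the accommodation coefficient.
    T is negative in early iterations (damping overly fast particles)
    and grows linearly, becoming positive in later iterations."""
    return m * (t / t_max - 0.5)

def velocity_with_ddt(v, x, p, g, w, c1, c2, r1, r2, T):
    """Velocity update with the disturbance term added, Eq. (8).
    Works elementwise on scalars or NumPy arrays."""
    return w * v + c1 * r1 * (p - x) + c2 * r2 * (g - x) + T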

3.3. Local Search Based on Starling Birds. In nature, a starling bird spreads information to its nearby neighbors in order to defend its position, and the information coming from one starling bird can spread everywhere in the swarm. Inspired by this phenomenon, a local search method based on starling bird behavior is used in the developed PSO procedure. Each particle seeks a new position and velocity in its neighborhood by a weighted method that can be seen as a kind of information communication between an individual and its nearby neighbors. This mechanism tries to seek a better solution in the local exploration around the particle and reduces the time needed in local search [25].

Dynamic Neighborhood Selection. For each particle i, the neighborhood H is a subset of the particle population. The particles in the population are listed in decreasing order of their fitness-Euclidean distance ratio (FER) from particle i [23], and the Ne particles ranked at the top are taken as the neighborhood of the particle. FER is determined by (10):

FER(x_i, x_i1) = α * (F(x_i) - F(x_i1)) / ED(x_i, x_i1),   i1 = 1, 2, ..., M   (10)

where ED(x_i, x_i1) is the Euclidean distance between x_i and x_i1, α = s / (F(x_w) - F(x_b)), x_b and x_w are the best and worst positions of the particles in the current population, s is the size of the search space, defined as s = sqrt(Σ_{j=1}^{D} (x_j^u - x_j^l)^2), and x_j^u and x_j^l are the maximum and minimum values of the jth dimension of the data points in the dataset. The number of neighbors Ne is the size of the neighborhood. In order to explore different search scopes, a dynamic strategy is used to adjust the size of the neighborhood, as specified in (11):

Ne = Ne_min + INT(t * (Ne_max - Ne_min) / t_max)   (11)

where Ne_max and Ne_min are the maximum and minimum sizes of the neighborhood and INT(a) is the integral function taking only the integer part of a.

Position Update. The position of particle i is adjusted using the information of the particles in the neighborhood. Using the weighted positions of the neighbors, the new position of particle i is determined by (12):

x_i' = x_i + r_3 * ((1/Ne) * Σ_{i1 ∈ H} x_i1)   (12)

where x_i is the current position of particle i, x_i' is the new position after the adjustment, and r_3 is a randomly generated real number uniformly distributed between -1 and 1.

Velocity Update. Similar to the update of the position, the velocity of particle i is updated using the velocity information of the particles in the neighborhood. The new velocity of particle i is determined by (13):

v_i' = v_i + r_4 * ((1/Ne) * Σ_{i1 ∈ H} v_i1)   (13)

where v_i and v_i' are the current and updated velocities of particle i, and r_4 is a randomly generated real number uniformly distributed between 0 and 1. The collective responses of starling birds are described by the pseudocode shown in Algorithm 2.
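A Python sketch of the FER-based neighbor selection and the collective adjustment of (10)-(13) is given below; the small constants guarding against division by zero and the function names are implementation assumptions.

import numpy as np

def fer_neighbors(i, X, fit, ne, s):
    """Indices of the ne neighbors of particle i with the largest FER, Eq. (10).

    X : (M, D) positions; fit : (M,) fitness values (to be minimized);
    s : diagonal size of the search space."""
    alpha = s / (fit.max() - fit.min() + 1e-12)     # guard against a flat population
    fer = np.full(len(X), -np.inf)
    for j in range(len(X)):
        if j != i:
            ed = np.linalg.norm(X[i] - X[j]) + 1e-12
            fer[j] = alpha * (fit[i] - fit[j]) / ed
    return np.argsort(fer)[::-1][:ne]               # ranked by decreasing FER

def neighborhood_size(t, t_max, ne_min=2, ne_max=7):
    """Dynamic neighborhood size, Eq. (11)."""
    return ne_min + int(t * (ne_max - ne_min) / t_max)

def starling_adjust(i, X, V, H, rng=None):
    """Collective position and velocity adjustment, Eqs. (12)-(13)."""
    rng = rng or np.random.default_rng()
    x_new = X[i] + rng.uniform(-1, 1) * X[H].mean(axis=0)   # Eq. (12), r3 in (-1, 1)
    v_new = V[i] + rng.uniform(0, 1) * V[H].mean(axis=0)    # Eq. (13), r4 in (0, 1)
    return x_new, v_new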

4. The CSPSO for Clustering Problems

4.1. Particle Selection. In the previous section, a local search method based on starling bird behavior was used in the neighborhood of a particle to search for better solutions. The time needed by this method will increase. In order to accelerate the search and reduce the time needed, the local search method is not applied to all particles. Two methods are adopted to select particles from the population, and the local search method is then applied to all the particles selected with these two methods.

The Euclidean Distance Method. In each generation, all the particles are sorted in ascending order of their Euclidean distances from the global best and are divided into S_e groups of approximately equal size.


Input: Ne
for Particle i
    Find the Ne neighbors in the neighborhood H of particle i using FER
    The new position of particle i: x_i' = x_i + r_3 * ((1/Ne) * Σ_{i1 ∈ H} x_i1)
    The new velocity of particle i: v_i' = v_i + r_4 * ((1/Ne) * Σ_{i1 ∈ H} v_i1)
    if Particle(i').Fitness < Particle(i).Fitness
        Particle(i).Fitness = Particle(i').Fitness
        Particle(i).Position = Particle(i').Position
        Particle(i).Velocity = Particle(i').Velocity
    end if
end for
Output: Particle i

Algorithm 2: Starling Birds Collective Responses.

Figure 2: The local search framework in the CSPSO algorithm: the particle swarm is initialized; velocities and positions are updated; individual and global best positions are updated; particles are selected using Euclidean distances and fitness function values; the starling-bird local search identifies neighbors and adjusts positions and velocities; and the global best position is returned when the maximum iteration is reached.

The particle ranked at the top of each group is selected. As a result, a total of S_e particles are selected. The particles selected with this method diversify the search and enhance the exploration of the CSPSO procedure.

The Fitness Function Method. In each generation, all the particles are sorted in ascending order of their fitness function values. A total of S_f particles ranked at the top are selected. The particles selected with this method focus on the promising regions in the search space so as to enhance the exploitation of the CSPSO procedure.

The two particle selection methods described above are used in CSPSO to improve its convergence and increase diversity. As a result, a total of S = S_e + S_f particles are selected. In the implementation, S_e = S_f is used. Particles selected through the Euclidean distance method are diversified in the search space to effectively avoid premature convergence, while the particles selected through the fitness function method lead the search direction. Figure 2 gives more details about the process of particle selection, and a sketch of the two methods is given below.
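This Python sketch is one possible reading of the two selection methods; merging the two index sets (so a particle picked by both methods is counted once) is an implementation assumption.

import numpy as np

def select_particles(X, fit, g_best, s_e, s_f):
    """Particle selection of Section 4.1.

    Euclidean method: sort particles by distance to the global best, split
    them into s_e groups of roughly equal size, and take the top-ranked
    particle of each group. Fitness method: take the s_f particles with
    the smallest fitness values. Returns the union of the two index sets."""
    by_dist = np.argsort(np.linalg.norm(X - g_best, axis=1))
    group = max(1, len(X) // s_e)
    euclid_picks = {int(by_dist[k * group]) for k in range(s_e)}
    fitness_picks = {int(j) for j in np.argsort(fit)[:s_f]}
    return sorted(euclid_picks | fitness_picks)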

4.2. Population Initialization. In the implementation of CSPSO, the position of a particle encodes a set of cluster centers: each cluster center represents a cluster, and different cluster centers represent different clusters. The set of cluster centers z = {z_k} for k = 1, 2, ..., K represents a partitioning result and also represents the position of a particle, x_i = (z_i1, z_i2, ..., z_iK). In the initialization, the initial position of each particle is generated randomly in the entire search space. Dimension j of the position of particle i, represented by x_ij, is generated


Input: M, t_max, Ne_min, Ne_max, w_min, w_max
Particle Swarm Population Initialization
for i = 1 to M (Population Size)
    Particle(i).BestPosition = Particle(i).Position
    Particle(i).BestFitness = Particle(i).Fitness
    if Particle(i).BestFitness < BestParticle.Fitness
        BestParticle.Position = Particle(i).BestPosition
        BestParticle.Fitness = Particle(i).BestFitness
    end if
end for
for t = 1 to t_max (Max Iteration)
    The Standard PSO Algorithm (see Algorithm 1)
    Particle Selection
        S_e = Euclidean Distance Function(BestParticle.Position)
        S_f = Fitness Function(Particle(i).Fitness)
        S = S_e + S_f
    Starling Birds Collective Responses (see Algorithm 2)
    for i = 1 to S
        Find the Ne neighbors in the neighborhood H of particle i using FER
        The new position of particle i: x_i' = x_i + r_3 * ((1/Ne) * Σ_{i1 ∈ H} x_i1)
        The new velocity of particle i: v_i' = v_i + r_4 * ((1/Ne) * Σ_{i1 ∈ H} v_i1)
        if Particle(i').Fitness < Particle(i).Fitness
            Particle(i).Fitness = Particle(i').Fitness
            Particle(i).Position = Particle(i').Position
            Particle(i).Velocity = Particle(i').Velocity
            if Particle(i').Fitness < BestParticle.Fitness
                BestParticle.Position = Particle(i').BestPosition
                BestParticle.Fitness = Particle(i').BestFitness
            end if
        end if
    end for
    Ne = Ne_min + INT(t * (Ne_max - Ne_min) / t_max)
end for
Output: BestParticle.Fitness

Algorithm 3: A Chaotic Starling PSO Algorithm.

randomly between x_j^l and x_j^u, where x_j^l and x_j^u are the minimum and maximum values of dimension j of the search space and also are the limits of the positions of the particles. Therefore, x_ij ∈ [x_j^l, x_j^u] for i = 1, 2, ..., M and j = 1, 2, ..., D. The values of x_j^l and x_j^u are determined as in (14):

x_j^l = min{X_ij | i = 1, 2, ..., N},  x_j^u = max{X_ij | i = 1, 2, ..., N},  for j = 1, 2, ..., d   (14)

where X_ij is dimension j of data point X_i. In particular, the dimension of a particle is D = K * d.

4.3. Boundary Handling. If a particle flies outside or across the boundary of the search space, this particle no longer represents a solution, so its position and velocity are adjusted. If the position of a particle in dimension j is below the lower (above the upper) limit of that dimension, it is set to the lower (upper) limit of that dimension. Accordingly, the velocity of this particle along dimension j is reversed. These adjustments limit the scope of the positions and change the velocities, as well as the directions of the flying paths of the particles. The position and velocity of such a particle are adjusted as follows:

x_ij = x_j^l if x_ij < x_j^l;  x_ij = x_j^u if x_ij > x_j^u;  v_ij = -v_ij   (15)

where x_ij is the new value of dimension j of the position and v_ij is the new value of dimension j of the velocity of particle i after the adjustments.
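The following sketch implements the initialization of (14) and the boundary handling of (15); the function names and the choice of zero initial velocities are assumptions.

import numpy as np

def init_particles(data, K, M, rng=None):
    """Random initialization of Eq. (14); a particle is a flattened vector
    of K cluster centers, so D = K * d."""
    rng = rng or np.random.default_rng()
    lo = np.tile(data.min(axis=0), K)   # x^l, repeated once per center
    hi = np.tile(data.max(axis=0), K)   # x^u
    X = rng.uniform(lo, hi, size=(M, lo.size))
    V = np.zeros_like(X)
    return X, V, lo, hi

def handle_boundary(X, V, lo, hi):
    """Boundary handling of Eq. (15): clamp out-of-range positions and
    reverse the corresponding velocity components."""
    out = (X < lo) | (X > hi)
    V[out] = -V[out]
    np.clip(X, lo, hi, out=X)
    return X, V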

4.4. The Pseudocode of the Proposed CSPSO. The pseudocode of the proposed CSPSO is presented in Algorithm 3.
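To show how the pieces fit together, here is a compact, self-contained Python sketch of the main loop on a toy two-dimensional clustering problem. It combines the chaotic coefficients, the nonlinear inertia weight, the DDT, and the boundary handling sketched above, and omits the starling-bird local search and particle selection steps for brevity; all names, data, and seed values are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: three Gaussian blobs in 2-D; K = 3 clusters, M = 20 particles.
data = np.vstack([rng.normal(c, 0.3, size=(30, 2)) for c in ((0, 0), (3, 3), (0, 3))])
K, M = 3, 20
t_max, w_min, w_max, m_coef = 100, 0.4, 1.2, 0.5

def fitness(pos):
    centers = pos.reshape(K, 2)
    d = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    return d.min(axis=1).sum()                      # Eq. (5)

lo = np.tile(data.min(axis=0), K); hi = np.tile(data.max(axis=0), K)
X = rng.uniform(lo, hi, (M, 2 * K)); V = np.zeros_like(X)   # Eq. (14)
fit = np.array([fitness(x) for x in X])
P, p_fit = X.copy(), fit.copy()                     # individual bests
g = P[p_fit.argmin()].copy(); g_fit = p_fit.min()   # global best
r1, r2 = 0.37, 0.66                                 # chaotic seeds in (0, 1)

for t in range(1, t_max + 1):
    w = w_min + (w_max - w_min) * np.exp(-t / (t_max - t + 1e-9))  # Eq. (7)
    r1, r2 = 4 * r1 * (1 - r1), 4 * r2 * (1 - r2)                  # Eq. (6)
    T = m_coef * (t / t_max - 0.5)                                 # Eq. (9)
    V = w * V + 2 * r1 * (P - X) + 2 * r2 * (g - X) + T            # Eq. (8)
    X = X + V
    out = (X < lo) | (X > hi); V[out] = -V[out]; X = np.clip(X, lo, hi)  # Eq. (15)
    fit = np.array([fitness(x) for x in X])
    better = fit < p_fit
    P[better], p_fit[better] = X[better], fit[better]
    if p_fit.min() < g_fit:
        g, g_fit = P[p_fit.argmin()].copy(), p_fit.min()

print("best clustering fitness:", g_fit)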


Table 1: Benchmark functions.

Function    | Expression                                                  | Domain            | X_min | F_min
Sphere      | f = Σ_{i=1}^{D} x_i^2                                       | [-100, 100]^D     | 0^D   | 0
Rastrigin   | f = Σ_{i=1}^{D} [x_i^2 - 10 cos(2πx_i) + 10]                | [-5.12, 5.12]^D   | 0^D   | 0
Griewank    | f = Σ_{i=1}^{D} x_i^2 / 4000 - Π_{i=1}^{D} cos(x_i/√i) + 1  | [-600, 600]^D     | 0^D   | 0
Rosenbrock  | f = Σ_{i=1}^{D-1} [(x_{i+1} - x_i^2)^2 + (1 - x_i)^2]       | [-2.048, 2.048]^D | 1^D   | 0
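For reference, the four benchmark functions of Table 1 can be transcribed directly into Python as follows.

import numpy as np

def sphere(x):
    return np.sum(x ** 2)

def rastrigin(x):
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)

def griewank(x):
    i = np.arange(1, x.size + 1)
    return np.sum(x ** 2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i))) + 1.0

def rosenbrock(x):
    return np.sum((x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)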

Figure 3: The Sphere and Rastrigin functions for D = 2.

5. Simulation Experiments

Three experiments are conducted to evaluate the performance of CSPSO. The first experiment validates the optimal parameter values in CSPSO using the Sphere and Rastrigin benchmark functions [33]. The second experiment validates the performance of CSPSO using four classical numerical benchmark functions [33]; these functions are all to be minimized, and their expressions and domains are presented in Table 1. The third experiment checks the effectiveness of CSPSO in data clustering using some datasets from the Artificial Datasets [34] and the UCI Machine Learning Repository [35]. CSPSO is implemented in MATLAB, and all the experiments are conducted on a LENOVO desktop computer with an Intel 3.20 GHz i5-4460 processor and 8 GB of RAM in a Windows 8 environment.

5.1. Experiment on Parameter Settings. As described above, the values of the parameters have important influences on the performance of CSPSO. This section focuses on checking the influences of the three critical parameters in CSPSO, the inertia weight w, the size of the neighborhood Ne, and the number of selected particles S, so as to find appropriate values for them. The Sphere and Rastrigin functions are used to study the influences of different values of these parameters. The shapes and ranges of the Sphere and Rastrigin functions for D = 2 are depicted in Figure 3. The dimension of the benchmark functions is D = 10 in this experiment. CSPSO was run 30 times for each of the two benchmark functions.

For the validity of the experiments and fair comparisons, parameters not being tested are kept at the same values: the population size M = 50, the maximum number of iterations t_max = 100, and the cognitive coefficients c_1 = c_2 = 2.

The influence of the inertia weight w is tested first. Its value is adjusted according to (7). The values of w_min and w_max directly determine the decline speed of w and consequently affect the convergence speed of CSPSO, so their values are varied in the experiments. When the value of w varies, the minimum and maximum sizes of the neighborhood are kept at Ne_min = 2 and Ne_max = 5, and the number of selected particles is kept at S = 10. The best and mean values of the Sphere and Rastrigin functions are reported in Table 2.

The differences in these results are obvious. The optimal values of the Sphere and Rastrigin functions were obtained at different values of w. However, the mean values differ for the Rastrigin function because it is multimodal with many local optima; generally, it is difficult to find global optimal solutions for multimodal functions with traditional optimization algorithms. Based on these results, the optimal


Table 2: Results when varying the inertia weight w (entries for Sphere and Rastrigin are best values with means in parentheses).

(w_min - w_max) | Sphere     | Rastrigin  | Mean time taken by Sphere | Mean time taken by Rastrigin
0.2-0.6         | 0 (0.0022) | 0 (0.9684) | 3.3773                    | 3.1321
0.2-0.9         | 0 (0)      | 0 (1.6269) | 3.1235                    | 3.1358
0.4-0.9         | 0 (0)      | 0 (1.7282) | 3.1258                    | 3.1309
0.4-1.2         | 0 (0)      | 0 (0.7567) | 3.1229                    | 3.1359

Table 3: Results when varying the size of the neighborhood Ne (entries for Sphere and Rastrigin are best values with means in parentheses).

(Ne_min - Ne_max) | Sphere     | Rastrigin  | Mean time taken by Sphere | Mean time taken by Rastrigin
2-5               | 0 (0.1382) | 0 (1.0071) | 3.1503                    | 3.1420
2-7               | 0 (0)      | 0 (0.7359) | 3.1389                    | 3.1416
2-9               | 0 (0)      | 0 (1.3716) | 3.1403                    | 3.1446

Table 4: Results when varying the number of selected particles S (entries for Sphere and Rastrigin are best values with means in parentheses).

S_e (= S_f) | S  | Sphere     | Rastrigin  | Mean time taken by Sphere | Mean time taken by Rastrigin
5           | 10 | 0 (0)      | 0 (0.6586) | 3.1808                    | 3.1966
7           | 14 | 0 (0.7567) | 0 (1.5407) | 4.2845                    | 4.2645
9           | 18 | 0 (2.5609) | 0 (2.4015) | 5.3850                    | 5.3840

values of the lower and upper limits of the inertia weight are set to w_min = 0.4 and w_max = 1.2, where the minimum mean values are achieved.

The influence of the size of the neighborhood Ne is examined next. In CSPSO, the size of the neighborhood increases gradually to enhance the local search ability. Large values lead to slow convergence, and small values restrain the local search ability, so an appropriate value balances exploration and exploitation. The lower and upper limits on the size of the neighborhood are set to different values, and the best and mean values of the Sphere and Rastrigin functions are reported in Table 3.

Table 3 shows that the best results are obtained when the lower and upper limits on the size of the neighborhood are Ne_min = 2 and Ne_max = 7. The influence of the number of selected particles S is tested next; it affects both convergence and search ability. Different values for S_e and S_f are used, and the results are reported in Table 4.

The results in Table 4 show that the best number of selected particles is 10, for which the mean values of the test functions and the computation times are both low. It is easy to see that more particles do not always lead to better results: because of the randomness built into the algorithm, sometimes most of the particles may just gather around local optima.

Based on these results, the lower and upper limits on the inertia weight, w_min and w_max, are set to 0.4 and 1.2, respectively; the lower and upper limits on the size of the neighborhood, Ne_min and Ne_max, are set to 2 and 7, respectively; and the number of selected particles S is set to 10. These parameter values are kept the same and are used in the following experiments.

5.2. Benchmark Functions. In this section, the four classical numerical benchmark functions presented in Table 1 are used to validate the effectiveness of CSPSO. The dimension of the benchmark functions, which determines the dimension of the search space, is set to D = 10, 20, and 30. The performance of CSPSO is compared with those of PSO, fitness-Euclidean distance ratio particle swarm optimization (FER-PSO) [23], and starling particle swarm optimization (SPSO) [25]. PSO is the basic swarm intelligence algorithm, developed by Kennedy and Eberhart [26]. FER-PSO replaces the individual best of a particle with the individual best of a neighbor selected based on the Euclidean distance and the fitness function value. SPSO uses the collective responses of the particles to adjust their positions and velocities when the algorithm stagnates.

The values of the adjustable parameters in these competing algorithms are the best ones reported in the respective references, as listed in Table 5. These competing algorithms were also run 50 independent times each so as to obtain meaningful results. The best fitness function values obtained by these algorithms for these functions are reported in Table 6.

Figure 4 shows the convergence of the algorithms on some of the functions. Compared with PSO and FER-PSO in Figures 4(b) and 4(c), the curves of the function values obtained by CSPSO decline slowly at the beginning of the evolutionary process because of the time taken by local search, so its initial convergence is slow. Relying on the DDT and chaotic mapping, CSPSO finds lower fitness values than the other algorithms after about half of the evolutionary process rather than falling into local optima. The local search strategy helps in finding better solutions in the nearby neighborhood of the current position of a particle. In particular, the curve of the Sphere function obtained by CSPSO disappears after the best value 0 has been found, because this value cannot be marked on the logarithmic chart. As shown in Figure 4(d), for the Rosenbrock function, a multimodal function, traditional algorithms like PSO,


Table 5: Parameter settings in the experiment.

Parameter       | PSO    | ABC [22] | FER-PSO [23] | GABC [24] | SPSO [25]  | CSPSO
Population (M)  | 50     | 50       | 50           | 50        | 50         | 50
t_max           | 200    | 200      | 200          | 200       | 200        | 200
c_1, c_2        | 2, 2   | -        | 2, 2         | -         | 2, 2       | 2, 2
r_1, r_2        | (0, 1) | (-1, 1)  | (0.1, 0.4)   | (-1, 1)   | (0, 1)     | (0, 1)
(w_min, w_max)  | 1      | -        | 1            | 1         | (0.2, 0.4) | (0.4, 1.2)
Neighbors (Ne)  | -      | -        | 7            | -         | 7          | 2-7
Selected (S)    | -      | -        | -            | -         | -          | 10
Sub-population  | -      | -        | -            | -         | 14         | -
Stagnant limit  | -      | -        | -            | -         | 2          | -
Trial limit     | -      | 10       | -            | 10        | -          | -

Table 6: Best fitness function values obtained by different algorithms for the test functions.

Function   | D  | PSO     | FER-PSO | SPSO       | CSPSO
Sphere     | 10 | 0       | 0       | 0          | 0
           | 20 | 0.5285  | 0.1613  | 8.2232e-12 | 0
           | 30 | 2.4235  | 0.8891  | 5.1645e-12 | 0
Rastrigin  | 10 | 0.0727  | 0.03647 | 0          | 0
           | 20 | 0.3595  | 0.1895  | 0          | 0
           | 30 | 1.0374  | 0.4847  | 0          | 0
Griewank   | 10 | 0.5965  | 0.1135  | 0          | 0
           | 20 | 1.4575  | 1.0994  | 0          | 0
           | 30 | 3.2947  | 1.6874  | 0          | 0
Rosenbrock | 10 | 6.3220  | 6.8162  | 5.2486     | 0.7037
           | 20 | 19.8643 | 19.2219 | 18.7899    | 9.8431
           | 30 | 38.1130 | 33.7894 | 28.7863    | 18.6928

FER-PSO, and SPSO are easily trapped into local optima, but CSPSO finds the smallest function values among these algorithms. These results show that CSPSO has strong global search ability through the use of the DDT and chaotic mapping and can quickly escape local optima.

As the results for the four classic functions in Table 6 show, CSPSO and SPSO perform better than PSO and FER-PSO on the Rastrigin and Griewank functions, and CSPSO performs better than the other three algorithms on the Sphere and Rosenbrock functions. In general, the results show that CSPSO has more powerful global search ability than traditional algorithms like PSO, FER-PSO, and SPSO on both unimodal and multimodal functions. Furthermore, as indicated by the results in Table 6, the better performance of CSPSO provides evidence that the improvements in CSPSO are effective in enhancing the performance of PSO and that CSPSO finds the global optimum more often than the other algorithms in these experiments.

5.3. Clustering Problems. In this section, experiments on clustering problems with different datasets are performed, and the results of different algorithms, including CSPSO, are reported and discussed. Eight benchmark datasets are used: Data_3_2, Data_5_2, Data_10_2, and Data_4_3 from the work reported in [34], and Iris, Glass, Contraceptive Method Choice (CMC), and Wine from the UCI Machine Learning Repository [35]. More details about these datasets are given in Table 7. Data_5_2 and Data_4_3 are also plotted in Figure 5.

The optimized values of the parameters in the other algorithms, shown in Table 5, are adopted in the experiments. In order to perform fair comparisons, all the algorithms were run on all the clustering datasets 50 times to eliminate the effects of random factors. As simple statistics, best values (Best), worst values (Worst), mean values (Mean), and standard deviations (SD) are used as the evaluation criteria to measure the effectiveness of these clustering algorithms. The experimental environment is the same for all clustering algorithms.

Figure 6 shows the convergence of these algorithms on four datasets for typical runs. The fitness function values of CSPSO decline slowly at the beginning of the evolutionary process and continue to decline, rather than dropping into a local optimum in the middle of the evolutionary process, possibly because the DDT gives the particles high velocities. Local search based on the Euclidean and fitness neighborhood structure also helps the particles find better positions. This phenomenon is more evident on high-dimensional datasets such as the Glass dataset.

To be more clear, Figure 7 gives more details about the convergence of these algorithms near the end of the search process on three of the four datasets.


Table 7: Description of the datasets used in the experiments.

Dataset   | Clusters (K) | Features (d) | Total instances (N) | Instances in each cluster
Data_3_2  | 3            | 2            | 76                  | (25, 25, 26)
Data_5_2  | 5            | 2            | 250                 | (50, 50, 50, 50, 50)
Data_10_2 | 10           | 2            | 500                 | (50, 50, 50, 50, 50, 50, 50, 50, 50, 50)
Data_4_3  | 4            | 3            | 400                 | (100, 100, 100, 100)
Iris      | 3            | 4            | 150                 | (50, 50, 50)
Glass     | 6            | 9            | 214                 | (70, 17, 76, 13, 9, 29)
CMC       | 3            | 9            | 1473                | (629, 334, 510)
Wine      | 3            | 13           | 178                 | (59, 71, 48)

Figure 4: Convergence of the algorithms on some functions (D = 10): (a) Sphere, (b) Rastrigin, (c) Griewank, (d) Rosenbrock. Best values are plotted against iterations for CSPSO, PSO, SPSO, and FER-PSO.

CSPSO has the best performance among all these clustering algorithms. ABC is the first to converge to a local optimal point, possibly because of the influence of randomly selected neighbors. GABC performs poorly on large data volumes, such as the Data_5_2 and Glass datasets, possibly due to the local search behavior of the onlooker bees. FER-PSO has apparently better convergence performance than the other algorithms, possibly due to the strategy that only one of the neighbors is selected to guide the search of a particle, but it is easily trapped in local optimal points in the later generations


Figure 5: Distributions of the data points in the Data_5_2 and Data_4_3 datasets.

of the search process. SPSO shows a better performance in searching for a global optimum, but its local search strategy does not work well on high-dimensional datasets. The above results and discussions reveal the different performances among PSO, ABC, FER-PSO, GABC, SPSO, and CSPSO and verify the good performance of CSPSO as a clustering algorithm.

It can be seen from Table 8 that CSPSO has better performance on these classical clustering problems. For the Data_3_2, Data_5_2, and Data_10_2 datasets, CSPSO obtained lower mean fitness values than the others, but some other algorithms performed better than CSPSO on the best fitness values; this is possibly due to the randomness of swarm intelligence algorithms. For the Data_4_3 dataset, because of its large number of data points, the ability of local search in CSPSO is affected by the selected neighbors. ABC has better stability on this dataset but a poorer ability in searching for a global optimum than the others, due to its random strategy. CSPSO outperforms the other algorithms and obtains lower mean fitness values on the Iris, Glass, CMC, and Wine datasets; therefore, it has better performance on high-dimensional data. The proposed algorithm, with the embedded DDT and chaotic mapping, has better global search ability and is more stable than traditional PSO through the use of the local search method based on starling birds. Furthermore, as the sizes of the datasets increase, in both dimension and number of data points, CSPSO performs better than PSO, ABC, GABC, FER-PSO, and SPSO and is more capable of dealing with big data in the current information age.

6. Conclusions

Because the performance of standard PSO depends on its parameter values, parameter adjustment is one of the most useful ways for PSO to obtain good exploration and exploitation abilities. CSPSO, proposed in this study, enhances the overall convergence of the search process. Nonlinear adjustment of the inertia weight and the chaotic search method based on logistic mapping enhance the global search performance of the traditional PSO and help particles escape from local optima. The DDT added to the velocity update increases the velocities of the particles in later iterations and also helps particles escape from local optima. The local search strategy based on the behavior of starling birds utilizes the information of the nearest neighbors in the neighborhood and determines a new collective position and a new collective velocity. The two particle selection methods, Euclidean distance and fitness function value, maintain the population diversity of the particles. These improvements help particles avoid stagnation and find better solutions in the search space.

The results of the simulation experiments on both benchmark function optimization and classical clustering problems show the effectiveness of CSPSO. Compared with the traditional PSO and other evolutionary algorithms, CSPSO has better convergence properties and stronger global search ability. However, as with other swarm intelligence algorithms, CSPSO may be trapped into, and may stagnate around, local optimal points and therefore may be unstable when applied to problems with multiple local optima. In future work, CSPSO will be applied to other combinatorial optimization problems, such as scheduling problems. Other metaheuristic methods and optimization techniques may also be used to improve the performance of PSO.

Data Availability

All the datasets used in the simulation experiments came from the Artificial Datasets, available at http://www.isical.ac.in/~sanghami/data.html (accessed June 2017), and from the UCI Machine Learning Repository, available at http://archive.ics.uci.edu/ml/datasets.html (accessed June 2017).

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.


Figure 6: Convergence of the algorithms on some datasets: (a) Data_3_2, (b) Data_5_2, (c) Iris, (d) Glass. Best values are plotted against iterations for CSPSO, PSO, SPSO, FER-PSO, GABC, and ABC.

Figure 7: Convergence of the algorithms near the end of the search on the Data_3_2, Data_5_2, and Iris datasets.


Table 8: Comparison of performance of CSPSO with other clustering algorithms.

Dataset   | Statistic | PSO        | ABC        | FER-PSO    | GABC       | SPSO       | CSPSO
Data_3_2  | Best      | 47.5287    | 47.7630    | 47.5290    | 47.5282    | 47.5733    | 47.5282
          | Worst     | 59.1935    | 60.1039    | 49.7856    | 59.1895    | 53.1721    | 47.5382
          | Mean      | 48.4644    | 50.9702    | 47.8771    | 48.2279    | 47.8934    | 47.5300
          | SD        | 3.1956     | 3.1345     | 0.4967     | 2.7975     | 0.8551     | 0.0022
Data_5_2  | Best      | 326.4386   | 337.5538   | 326.9987   | 329.7155   | 326.4666   | 326.4432
          | Worst     | 392.2899   | 411.0786   | 332.6651   | 364.8400   | 327.0877   | 326.9676
          | Mean      | 329.6249   | 373.6977   | 328.7897   | 343.5928   | 326.7493   | 326.5300
          | SD        | 12.9978    | 20.1046    | 1.1594     | 9.6214     | 0.1536     | 0.0942
Data_10_2 | Best      | 1.0793e+03 | 916.4320   | 875.6264   | 1.3134e+03 | 843.2274   | 849.8712
          | Worst     | 1.4459e+03 | 1.2718e+03 | 1.2068e+03 | 1.1657e+03 | 1.1827e+03 | 1.0141e+03
          | Mean      | 1.2875e+03 | 1.0608e+03 | 1.0066e+03 | 638.046    | 941.8921   | 918.2542
          | SD        | 70.4484    | 76.1788    | 67.1964    | 1.3134e+03 | 85.5520    | 50.5344
Data_4_3  | Best      | 751.8897   | 1.0937e+03 | 830.5456   | 750.4056   | 749.9506   | 749.6738
          | Worst     | 1.2866e+03 | 1.6347e+03 | 1.3753e+03 | 1.8485e+03 | 1.2726e+03 | 1.2725e+03
          | Mean      | 838.8670   | 1.3838e+03 | 1.0312e+03 | 1.1489e+03 | 793.8641   | 784.5028
          | SD        | 181.8515   | 122.3579   | 150.0554   | 271.3938   | 142.5136   | 125.1508
Iris      | Best      | 100.5216   | 113.1201   | 98.0975    | 96.6829    | 96.6605    | 96.6605
          | Worst     | 130.6931   | 156.8960   | 102.9780   | 127.6869   | 98.6906    | 98.0888
          | Mean      | 113.7655   | 133.1161   | 100.3761   | 105.0487   | 96.9465    | 96.9194
          | SD        | 7.1771     | 11.0706    | 1.2211     | 11.2307    | 0.3921     | 0.2931
Glass     | Best      | 329.1487   | 333.7214   | 278.6565   | 288.5152   | 235.3554   | 229.6544
          | Worst     | 436.5615   | 503.1607   | 326.5351   | 402.5881   | 309.0997   | 278.9187
          | Mean      | 376.4035   | 422.3809   | 297.3875   | 346.4078   | 280.5624   | 257.0244
          | SD        | 21.5371    | 32.0309    | 11.1730    | 26.9515    | 14.5045    | 10.3633
CMC       | Best      | 5.8449e+03 | 5.9895e+03 | 5.6153e+03 | 5.6704e+03 | 5.5615e+03 | 5.5404e+03
          | Worst     | 6.9260e+03 | 7.2137e+03 | 5.9728e+03 | 6.4143e+03 | 5.7676e+03 | 5.5861e+03
          | Mean      | 6.1772e+03 | 6.4821e+03 | 5.7297e+03 | 5.8864e+03 | 5.6426e+03 | 5.5562e+03
          | SD        | 186.7542   | 243.2599   | 73.0180    | 140.8947   | 44.6957    | 8.7806
Wine      | Best      | 1.6308e+04 | 1.6399e+04 | 1.6312e+04 | 1.6309e+04 | 1.6305e+04 | 1.6298e+04
          | Worst     | 1.6798e+04 | 1.7317e+04 | 1.6376e+04 | 1.6482e+04 | 1.6356e+04 | 1.6313e+04
          | Mean      | 1.6438e+04 | 1.6705e+04 | 1.6336e+04 | 1.6355e+04 | 1.6322e+04 | 1.6304e+04
          | SD        | 95.0097    | 219.7057   | 15.9233    | 35.4218    | 12.0030    | 3.3799

Acknowledgments

This research project was partially supported by the National Natural Science Foundation of China (61472231, 61502283, 61640201), the Humanities and Social Science Research Fund of the Ministry of Education of China (12YJA630152), and the Shandong Social Science Foundation of China (16BGLJ06, 11CGLJ22).

References

[1] X. Liu, H. Liu, and H. Duan, "Particle swarm optimization based on dynamic niche technology with applications to conceptual design," Advances in Engineering Software, vol. 38, no. 10, pp. 668–676, 2007.
[2] Z. Bao and T. Watanabe, "Mixed constrained image filter design using particle swarm optimization," Artificial Life and Robotics, vol. 15, no. 3, pp. 363–368, 2010.
[3] S. Sahin and M. A. Cavuslu, "FPGA implementation of wavelet neural network training with PSO/iPSO," Journal of Circuits, Systems and Computers, vol. 27, no. 6, 2018.
[4] D. Wu and H. Gao, "A BP and switching PSO based optimization approach for engine optimization," National Academy of Science Letters, vol. 40, no. 1, pp. 33–37, 2017.
[5] Q. Jia and Y. Guo, "Hybridization of ABC and PSO algorithms for improved solutions of RCPSP," Journal of the Chinese Institute of Engineers, vol. 39, no. 6, pp. 727–734, 2016.
[6] C. Yue, B. Qu, and J. Liang, "A multi-objective particle swarm optimizer using ring topology for solving multimodal multi-objective problems," IEEE Transactions on Evolutionary Computation, 2017.
[7] X. Liu and J. Xue, "A cluster splitting technique by Hopfield networks and P systems on simplices," Neural Processing Letters, pp. 1–24, 2017.
[8] Y. Zhao, X. Liu, and W. Wang, "Spiking neural P systems with neuron division and dissolution," PLoS ONE, vol. 11, no. 9, Article ID e0162882, 2016.
[9] X. Liu, Y. Zhao, and M. Sun, "An improved Apriori algorithm based on an evolution-communication tissue-like P system with promoters and inhibitors," Discrete Dynamics in Nature and Society, vol. 2017, 2017.
[10] N. Netjinda, T. Achalakul, and B. Sirinaovakul, "Particle swarm optimization inspired by starling flock behavior," Applied Soft Computing, vol. 35, pp. 411–422, 2015.
[11] A. El Dor, D. Lemoine, M. Clerc, P. Siarry, L. Deroussi, and M. Gourgand, "Dynamic cluster in particle swarm optimization algorithm," Natural Computing, vol. 14, no. 4, pp. 655–672, 2015.
[12] Y. M. B. Ali, "Unsupervised clustering based an adaptive particle swarm optimization algorithm," Neural Processing Letters, vol. 44, no. 1, pp. 221–244, 2016.
[13] G. Armano and M. R. Farmani, "Multiobjective clustering analysis using particle swarm optimization," Expert Systems with Applications, vol. 55, pp. 184–193, 2016.
[14] B. Niu, Q. Duan, L. Tan, C. Liu, and P. Liang, "A population-based clustering technique using particle swarm optimization and K-means," in Advances in Swarm and Computational Intelligence, vol. 9140 of Lecture Notes in Computer Science, pp. 145–152, Springer International Publishing, Cham, 2015.
[15] X. Zhao, W. Lin, J. Hao, X. Zuo, and J. Yuan, "Clustering and pattern search for enhancing particle swarm optimization with Euclidean spatial neighborhood search," Neurocomputing, vol. 171, pp. 966–981, 2016.
[16] E. Comak, "A modified particle swarm optimization algorithm using Renyi entropy-based clustering," Neural Computing and Applications, vol. 27, no. 5, pp. 1381–1390, 2016.
[17] K. K. Bharti and P. K. Singh, "Opposition chaotic fitness mutation based adaptive inertia weight BPSO for feature selection in text clustering," Applied Soft Computing, vol. 43, pp. 20–34, 2016.
[18] W. Song, W. Ma, and Y. Qiao, "Particle swarm optimization algorithm with environmental factors for clustering analysis," Soft Computing, vol. 21, no. 2, pp. 283–293, 2017.
[19] R. Liu, J. Li, J. Fan, C. Mu, and L. Jiao, "A coevolutionary technique based on multi-swarm particle swarm optimization for dynamic multi-objective optimization," European Journal of Operational Research, vol. 261, no. 3, pp. 1028–1051, 2017.
[20] X. Chu, B. Niu, J. J. Liang, and Q. Lu, "An orthogonal-design hybrid particle swarm optimiser with application to capacitated facility location problem," International Journal of Bio-Inspired Computation, vol. 8, no. 5, pp. 268–285, 2016.
[21] D. Wang, D. Tan, and L. Liu, "Particle swarm optimization: an overview," Soft Computing, vol. 22, no. 2, p. 22, 2018.
[22] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, 2005.
[23] X. Li, "A multimodal particle swarm optimizer based on fitness Euclidean-distance ratio," in Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation, pp. 78–85, London, England, July 2007.
[24] G. Zhu and S. Kwong, "Gbest-guided artificial bee colony algorithm for numerical function optimization," Applied Mathematics and Computation, vol. 217, no. 7, pp. 3166–3173, 2010.
[25] A. Laudani, F. R. Fulginei, G. M. Lozito, and A. Salvini, "Swarm/flock optimization algorithms as continuous dynamic systems," Applied Mathematics and Computation, vol. 243, pp. 670–683, 2014.
[26] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.
[27] K. K. Bharti and P. K. Singh, "Chaotic gradient artificial bee colony for text clustering," Soft Computing, vol. 20, no. 3, pp. 1113–1126, 2016.
[28] R. M. May, "Simple mathematical models with very complicated dynamics," Nature, vol. 261, no. 5560, pp. 459–467, 1976.
[29] F. S. Hover and M. S. Triantafyllou, "Application of polynomial chaos in stability and control," Automatica, vol. 42, no. 5, pp. 789–795, 2006.
[30] X. Liu, L. Xiang, and X. Wang, "Spatial cluster analysis by the Adleman-Lipton DNA computing model and flexible grids," Discrete Dynamics in Nature and Society, vol. 2012, Article ID 894207, 32 pages, 2012.
[31] H. Peng, X. Luo, Z. Gao, J. Wang, and Z. Pei, "A novel clustering algorithm inspired by membrane computing," The Scientific World Journal, vol. 3, no. 22, Article ID 929471, 8 pages, 2015.
[32] B. Ufnalski and L. M. Grzesiak, "Plug-in direct particle swarm repetitive controller with a reduced dimensionality of a fitness landscape - a multi-swarm approach," Bulletin of the Polish Academy of Sciences: Technical Sciences, vol. 63, no. 4, pp. 857–866, 2015.
[33] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.
[34] "Datasets," http://www.isical.ac.in/~sanghami/data.html.
[35] C. L. Blake, UCI Repository of Machine Learning Databases, 1998.


Page 3: A New Chaotic Starling Particle Swarm Optimization ...downloads.hindawi.com/journals/mpe/2018/8250480.pdfMathematicalProblemsinEngineering Chaotic Parameter u 2.5 3 3.5 4 Chaotic Value

Mathematical Problems in Engineering 3

Chaotic Parameter u 25 3 35 4

Cha

otic

Val

ue C

h(t)

0

01

02

03

04

05

06

07

08

09

1

3 3 444

Figure 1 Bifurcation diagram for the logistic map

22 Chaotic Mapping Chaotic maps are mapping methodsused to generate random numbers with features like ergodic-ity and nonlinear and random similarity [27]There aremanykinds of chaoticmaps such as tentmap Tchebychevmap andlogistic map Specifically the logistic map developed by May[28] can explore in the vicinity of a solution by oscillating inthe region One chaotic variant based on logistic mapping isgiven by (3) in the following [29]

119862ℎ (119905 + 1) = 119906 lowast 119862ℎ (119905) lowast (1 minus 119862ℎ (119905)) 119862ℎ (119905) isin (0 1) 119906 isin (0 +infin) (3)

where 119862ℎ(119905) represents the value of the chaotic num-ber at time 119905 119906 is the control parameter and 119862ℎ(0) notin0 025 05 075 10 In (3) the parameter 119906 controls thebehavior of the chaotic variant 119862ℎ The values of 119862ℎ(119905) forvarying values of 119906 are shown in Figure 1

23 Data Clustering Problems Let 119883 = 119883119902 for 119902 =1 2 119873 be a dataset containing119873 data points or instancesData point 119902 is represented by119883119902 = 1198831199021 1198831199022 119883119902119889 with119889 representing the dimension of the data The purpose of adata clustering problem is to find a partition of the datasetby optimizing a fitness function A partition is representedby 119862 = 119862119896 for 119896 = 1 2 119870 where 119870 is the numberof clusters [30] A partition must satisfy the following con-ditions

(i) C_k1 ∩ C_k2 = ∅ for all k1 ≠ k2;
(ii) ⋃_{k=1}^{K} C_k = X;
(iii) C_k ≠ ∅ for all k = 1, 2, ..., K.

Usually, the Euclidean distance is used to measure the difference or similarity between two data points X_q1 and X_q2. Let z = {z_k} for k = 1, 2, ..., K represent the centers of the K clusters C_k. The intracluster distance of cluster C_k is the sum of the distances of all data points in the cluster to z_k, given by (4):

$$D(X, C_k) = \sum_{X_q \in C_k} \left\| X_q - z_k \right\| \tag{4}$$

The quality of the clustering result for the dataset can be measured by the sum of the intracluster distances over all clusters, given by (5) [31]:

$$F = F(C_1, C_2, \ldots, C_K) = \sum_{k=1}^{K} D(X, C_k) = \sum_{k=1}^{K} \sum_{X_q \in C_k} \left\| X_q - z_k \right\| \tag{5}$$

F defined in (5) is used as the fitness function for clustering in the following.
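To make (4) and (5) concrete, the following Python sketch (ours, not the authors' MATLAB implementation) assigns each point to its nearest center and sums the intracluster distances; all names are illustrative.

import numpy as np

def clustering_fitness(X, centers):
    """Fitness F of (5): each point X_q joins the cluster of its nearest
    center z_k, and the intracluster distances (4) are summed over clusters."""
    # Pairwise Euclidean distances, shape (N, K).
    dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return dists.min(axis=1).sum()

X = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.1], [5.2, 4.9]])
centers = np.array([[0.0, 0.1], [5.1, 5.0]])
print(clustering_fitness(X, centers))  # small: the centers match the two groups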

3. The Chaotic Starling Particle Swarm Optimization Algorithm

Some efforts have been made to enhance the search performance and convergence speed of PSO. In this section, a chaotic mapping method and a DDT are introduced into the PSO algorithm to improve its global search ability, and a local search technique based on starling birds is added to improve its convergence speed. This improved PSO method is the CSPSO algorithm. The major components and details of CSPSO are discussed in this section.

3.1. Dynamic Update of CSPSO Parameters. Three components, the velocity of the previous iteration, the individual factor, and the social factor, are used to update the velocity of a particle. Three parameters, w, r1, and r2, control the relative contributions of these components. The inertia weight w controls the influence of the velocity of the previous iteration. The cognitive coefficients c1 and c2 balance the contributions of the individual and social factors, and the acceleration coefficients r1 and r2 control the proportions retained in the individual and social factors. Each parameter plays an important role in the velocity update and consequently affects the particle position update. Instead of using fixed values, logistic mapping is used to generate the acceleration coefficients r1 and r2 in each generation, and the inertia weight is adjusted using an exponential function. These modifications are shown in (6) and (7):

$$r_p(t+1) = 4 \cdot r_p(t) \cdot (1 - r_p(t)), \quad r_p(t) \in (0, 1),\ p = 1, 2 \tag{6}$$

$$w(t) = w_{\min} + (w_{\max} - w_{\min}) \cdot e^{-t/(t_{\max} - t)} \tag{7}$$

where w_min and w_max are the minimum and maximum of the inertia weight and t_max is the maximum number of iterations, so that w(t) decreases nonlinearly from w_max toward w_min over the course of the run.
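A small Python sketch of the updates (6) and (7) follows; it assumes, as reconstructed above, that the exponent in (7) is −t/(t_max − t), so w decreases from w_max toward w_min. Names and seed values are ours.

import math

def chaotic_coefficient(r):
    """Acceleration coefficient update (6): r <- 4 * r * (1 - r)."""
    return 4.0 * r * (1.0 - r)

def inertia_weight(t, t_max, w_min=0.4, w_max=1.2):
    """Nonlinear decreasing inertia weight (7); requires t < t_max."""
    return w_min + (w_max - w_min) * math.exp(-t / (t_max - t))

r1, r2, t_max = 0.37, 0.81, 100     # seeds away from the fixed points of (6)
for t in range(t_max):              # t = 0..t_max-1 keeps (7) well defined
    r1, r2 = chaotic_coefficient(r1), chaotic_coefficient(r2)
    w = inertia_weight(t, t_max)
print(round(w, 6))                  # close to w_min at the end of the run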

3.2. The Dynamic Disturbance Term. As an optimization algorithm based on swarm intelligence, PSO tracks the trajectories of the particles in the population and compares fitness function values to select the individual and global best positions. Because of local optimal solutions, the velocity of a particle gradually decreases during the later iterations. As a result, most particles fall into, and cannot easily jump out of, local optimal points, a phenomenon called premature convergence and evolutionary stagnation. To overcome these problems, a method based on DDT [32] is used to change the velocity update of each particle. DDT is embedded into velocity updating when the velocity of a particle becomes too low, or when its position stays unchanged, in the middle and final generations. The modified velocity is updated using (8), and the DDT, represented by T, is updated using (9):

$$v_{ij}(t+1) = w \cdot v_{ij}(t) + c_1 r_1 \left( p_{ij}(t) - x_{ij}(t) \right) + c_2 r_2 \left( p_{gj}(t) - x_{ij}(t) \right) + T \tag{8}$$

$$T = m \cdot \left( \frac{t}{t_{\max}} - 0.5 \right) \tag{9}$$

where m is an accommodation coefficient with a value between 0 and 1. This improvement helps prevent a particle from flying out of the boundary of the search space when its velocity is too high in early generations (where T < 0) and helps it escape from a local optimal point by increasing its velocity in later generations (where T > 0), thereby improving the overall search capacity. Moreover, T increases linearly, which avoids oscillation and keeps a relatively stable search direction during the optimization process.
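A sketch of the modified velocity update (8)-(9) in NumPy (an illustration under our own naming; p_i is the individual best and p_g the global best seen by particle i):

import numpy as np

def velocity_with_ddt(v, x, p_i, p_g, w, c1, c2, r1, r2, t, t_max, m=0.5):
    """Velocity update (8) with the DDT of (9): T runs linearly from
    -0.5*m (damping early iterations) to +0.5*m (boosting late ones)."""
    T = m * (t / t_max - 0.5)
    return w * v + c1 * r1 * (p_i - x) + c2 * r2 * (p_g - x) + T

v = np.zeros(3)
x, p_i, p_g = np.array([1.0, 2.0, 3.0]), np.array([0.5, 1.5, 2.5]), np.zeros(3)
print(velocity_with_ddt(v, x, p_i, p_g, 0.9, 2.0, 2.0, 0.3, 0.7, t=80, t_max=100))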

3.3. Local Search Based on Starling Birds. In nature, a starling bird spreads information to its nearby neighbors in order to defend its position, and the information coming from one starling bird can spread everywhere in the swarm. Inspired by this phenomenon, a local search method based on starling bird behavior is used in the developed PSO procedure. Each particle seeks a new position and velocity in its neighborhood by a weighted method that can be seen as a kind of information communication between an individual and its nearby neighbors. This mechanism tries to find a better solution in the local region around the particle and reduces the time needed for local search [25].

Dynamic Neighborhood Selection. For each particle i, the neighborhood H is a subset of the particle population. The particles in the population are listed in decreasing order of their fitness-Euclidean distance ratio (FER) from particle i [23], and the Ne particles ranked on top are taken as the neighborhood of the particle. FER is determined by (10):

$$FER(x_i, x_{i1}) = \alpha \cdot \frac{F(x_i) - F(x_{i1})}{ED(x_i, x_{i1})}, \quad i1 = 1, 2, \ldots, M \tag{10}$$

where ED(x_i, x_{i1}) is the Euclidean distance between x_i and x_{i1}, α = s / (F(x_w) − F(x_b)), x_b and x_w are the best and worst positions of the particles in the current population, s is the size of the search space defined as s = \sqrt{\sum_{j=1}^{D} (x_j^u - x_j^l)^2}, and x_j^u and x_j^l are the maximum and minimum values of the jth dimension of the data points in the dataset. The number of neighbors Ne is the size of the neighborhood. In order to explore different search scopes, a dynamic strategy is used to adjust the size of the neighborhood, as specified in (11):

$$N_e = N_{e\min} + \operatorname{INT}\left( t \cdot \frac{N_{e\max} - N_{e\min}}{t_{\max}} \right) \tag{11}$$

where Ne_max and Ne_min are the maximum and minimum sizes of the neighborhood and INT(a) is the integral function taking only the integer part of a.

Position Update. The position of particle i is adjusted using the information of the particles in the neighborhood. Using the weighted positions of the neighbors, the new position of particle i is determined by (12):

$$x_i' = x_i + r_3 \cdot \left( \frac{1}{N_e} \sum_{i1 \in H} x_{i1} \right) \tag{12}$$

where x_i is the current position of particle i, x_i' is the new position after the adjustment, and r_3 is a randomly generated real number uniformly distributed between −1 and 1.

Velocity Update. Similar to the position update, the velocity of particle i is updated using the velocity information of the particles in the neighborhood. The new velocity of particle i is determined by (13):

$$v_i' = v_i + r_4 \cdot \left( \frac{1}{N_e} \sum_{i1 \in H} v_{i1} \right) \tag{13}$$

where v_i and v_i' are the current and updated velocities of particle i and r_4 is a randomly generated real number uniformly distributed between 0 and 1. The collective responses of starling birds are described by the pseudocode shown in Algorithm 2.
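Putting (10)-(13) together, one starling-style local search step can be sketched in Python as below. This is our own illustration: fitness is minimized, the neighbors are the Ne particles with the largest FER, and, for simplicity, the search-space size s is taken from the swarm's own position ranges rather than from the dataset bounds.

import numpy as np

def starling_step(i, pos, vel, fit, Ne, rng):
    """Collective response of particle i to its Ne top-FER neighbors,
    following (10) for ranking and (12)-(13) for the candidate move."""
    s = np.linalg.norm(pos.max(axis=0) - pos.min(axis=0))   # proxy for s in (10)
    alpha = s / (fit.max() - fit.min() + 1e-12)             # alpha = s / (F(x_w) - F(x_b))
    d = np.linalg.norm(pos - pos[i], axis=1)
    fer = np.where(d > 0, alpha * (fit[i] - fit) / np.maximum(d, 1e-12), -np.inf)
    H = np.argsort(fer)[::-1][:Ne]                          # Ne neighbors with largest FER
    new_pos = pos[i] + rng.uniform(-1.0, 1.0) * pos[H].mean(axis=0)   # (12)
    new_vel = vel[i] + rng.uniform(0.0, 1.0) * vel[H].mean(axis=0)    # (13)
    return new_pos, new_vel

rng = np.random.default_rng(1)
pos, vel = rng.uniform(-1, 1, (20, 4)), rng.uniform(-1, 1, (20, 4))
fit = (pos ** 2).sum(axis=1)
print(starling_step(0, pos, vel, fit, Ne=5, rng=rng))

As in Algorithm 2, the candidate position and velocity would replace those of particle i only if they improve its fitness.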

4. The CSPSO for Clustering Problems

4.1. Particle Selection. In the previous section, a local search method based on starling bird behavior was introduced to search the neighborhood of a particle for better solutions, which increases the time needed per iteration. In order to accelerate the search and reduce the time needed, the local search method is not applied to all particles. Instead, two methods are adopted to select particles from the population, and the local search method is applied only to the particles selected by these two methods.

A Euclidean Distance Method. In each generation, all the particles are sorted in ascending order of their Euclidean distances from the global best. The particles are divided into S_e groups of approximately equal size based on their Euclidean distances to the global best, and the particle ranked on top of each group is selected. As a result, a total of S_e particles are selected. The particles selected by this method diversify the search and enhance the exploration of the CSPSO procedure.


Input: Ne
for each particle i
    Find the Ne neighbors in the neighborhood H of particle i using FER
    New position of particle i: x_i' = x_i + r3 * ((1/Ne) * Σ_{i1∈H} x_{i1})
    New velocity of particle i: v_i' = v_i + r4 * ((1/Ne) * Σ_{i1∈H} v_{i1})
    if Fitness(x_i') < Particle(i).Fitness
        Particle(i).Fitness = Fitness(x_i')
        Particle(i).Position = x_i'
        Particle(i).Velocity = v_i'
    end if
end for
Output: Particle i

Algorithm 2: Starling birds' collective responses.

Figure 2: A local search method in the CSPSO algorithm. The flowchart proceeds from particle swarm initialization through the velocity and position updates and the updates of the individual and global best positions; the population is then sorted by fitness values, particles are selected using the Euclidean distance and the fitness function, and the selected particles undergo the starling-bird local search (identify neighbors, adjust position and velocity, update the global best position) until the maximum iteration is reached, after which the global best position is returned.

A Fitness Function Method. In each generation, all the particles are sorted in ascending order of their fitness function values, and a total of S_f particles ranked on top are selected. The particles selected by this method focus on the promising regions of the search space and enhance the exploitation of the CSPSO procedure.

The two particle selection methods described above are used together in CSPSO to improve convergence and increase diversity, so a total of S = S_e + S_f particles are selected; in the implementation, S_e = S_f is used. The particles selected through the Euclidean distance method are diversified in the search space to effectively avoid premature convergence, while the particles selected through the fitness function method lead the search direction. Figure 2 gives more details about the process of particle selection.
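The two selection methods can be sketched as follows (Python, with illustrative names); the returned indices are the particles to which Algorithm 2 is applied. Deduplicating the union of the two sets is our own choice for handling overlap, which the paper does not discuss.

import numpy as np

def select_particles(pos, fit, gbest, Se, Sf):
    """Euclidean-distance selection (best-ranked particle of each of the Se
    distance groups) plus fitness selection (the Sf lowest-fitness particles)."""
    order = np.argsort(np.linalg.norm(pos - gbest, axis=1))  # ascending distance
    groups = np.array_split(order, Se)                       # Se near-equal groups
    by_distance = [int(g[0]) for g in groups]                # top of each group
    by_fitness = [int(j) for j in np.argsort(fit)[:Sf]]      # best fitness values
    return list(dict.fromkeys(by_distance + by_fitness))     # union, order kept

rng = np.random.default_rng(0)
pos = rng.uniform(-5, 5, (50, 6))
fit = (pos ** 2).sum(axis=1)
print(select_particles(pos, fit, gbest=pos[fit.argmin()], Se=5, Sf=5))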

4.2. Population Initialization. In the implementation of CSPSO, each cluster center is part of the position of a particle: each cluster center represents a cluster, and different cluster centers represent different clusters. The set of cluster centers z = {z_k} for k = 1, 2, ..., K represents a partitioning result and also represents the position of a particle, x_i = {z_i1, z_i2, ..., z_iK}. In the initialization, the initial position of each particle is generated randomly in the entire search space.


Input: M, t_max, Ne_min, Ne_max, w_min, w_max
Particle Swarm Population Initialization
for i = 1 to M (Population Size)
    Particle(i).BestPosition = Particle(i).Position
    Particle(i).BestFitness = Particle(i).Fitness
    if Particle(i).BestFitness < Best_Particle.Fitness
        Best_Particle.Position = Particle(i).BestPosition
        Best_Particle.Fitness = Particle(i).BestFitness
    end if
end for
for t = 1 to t_max (Max Iteration)
    The Standard PSO Algorithm (see Algorithm 1)
    Particle Selection
        S_e = Euclidean Distance Function (Best_Particle.Position)
        S_f = Fitness Function (Particle(i).Fitness)
        S = S_e + S_f
    Starling Birds' Collective Responses (Algorithm 2)
    for i = 1 to S
        Find the Ne neighbors in the neighborhood H of particle i using FER
        New position of particle i: x_i' = x_i + r3 * ((1/Ne) * Σ_{i1∈H} x_{i1})
        New velocity of particle i: v_i' = v_i + r4 * ((1/Ne) * Σ_{i1∈H} v_{i1})
        if Fitness(x_i') < Particle(i).Fitness
            Particle(i).Fitness = Fitness(x_i')
            Particle(i).Position = x_i'
            Particle(i).Velocity = v_i'
            if Particle(i).Fitness < Best_Particle.Fitness
                Best_Particle.Position = Particle(i).Position
                Best_Particle.Fitness = Particle(i).Fitness
            end if
        end if
    end for
    Ne = Ne_min + INT(t * ((Ne_max − Ne_min)/t_max))
end for
Output: Best_Particle.Fitness

Algorithm 3: A chaotic starling PSO algorithm.

Dimension j of the position of particle i, represented by x_ij, is generated randomly between x_j^l and x_j^u, where x_j^l and x_j^u are the minimum and maximum values of dimension j of the search space and also are the limits on the positions of the particles. Therefore, x_ij ∈ [x_j^l, x_j^u] for i = 1, 2, ..., M and j = 1, 2, ..., D. The values of x_j^l and x_j^u are determined by (14):

$$x_j^l = \min \{ X_{ij} \mid i = 1, 2, \ldots, N \}, \qquad x_j^u = \max \{ X_{ij} \mid i = 1, 2, \ldots, N \}, \quad j = 1, 2, \ldots, d \tag{14}$$

where X_ij is dimension j of data point X_i. In particular, the dimension of a particle is D = K ∗ d.
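The initialization with the bounds of (14) can be sketched as follows (Python, our own naming); each particle is a flat vector of K concatenated centers, so D = K ∗ d.

import numpy as np

def init_population(X, K, M, rng):
    """Draw M particles uniformly within the per-dimension dataset bounds (14);
    each particle encodes K cluster centers flattened to D = K * d values."""
    xl = np.tile(X.min(axis=0), K)        # x_j^l repeated for every center
    xu = np.tile(X.max(axis=0), K)        # x_j^u repeated for every center
    return rng.uniform(xl, xu, size=(M, xl.size)), xl, xu

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))             # toy dataset: N = 100, d = 2
pos, xl, xu = init_population(X, K=3, M=50, rng=rng)
print(pos.shape)                          # (50, 6), since D = K * d = 6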

4.3. Boundary Handling. If a particle flies outside or across the boundary of the search space, it no longer represents a solution, so its position and velocity are adjusted. If the position of a particle in dimension j is below the lower (above the upper) limit of that dimension, it is set to the lower (upper) limit of that dimension, and, accordingly, the velocity of the particle along dimension j is reversed. These adjustments limit the scope of the positions and change the velocities, as well as the flying directions, of the particles. The position and velocity of such a particle are adjusted by (15):

$$x_{ij} = x_j^l \quad \text{if } x_{ij} < x_j^l, \qquad x_{ij} = x_j^u \quad \text{if } x_{ij} > x_j^u, \qquad v_{ij} = -v_{ij} \tag{15}$$

where x_ij is the new value of dimension j of the position and v_ij is the new value of dimension j of the velocity of particle i after the adjustments.
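In code, the boundary handling of (15) amounts to clamping the violated position components and reversing the corresponding velocity components, for example (Python, illustrative):

import numpy as np

def handle_boundary(pos, vel, xl, xu):
    """Boundary handling (15): clamp x_ij to the violated limit and set
    v_ij = -v_ij along every dimension that left the search space."""
    out = (pos < xl) | (pos > xu)
    vel[out] = -vel[out]
    np.clip(pos, xl, xu, out=pos)
    return pos, vel

pos = np.array([[-1.5, 0.3], [0.2, 2.0]]); vel = np.ones((2, 2))
print(handle_boundary(pos, vel, xl=-1.0, xu=1.0))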

4.4. The Pseudocode of the Proposed CSPSO. The pseudocode of the proposed CSPSO is presented in Algorithm 3.
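To make the control flow of Algorithm 3 concrete, here is a compact, self-contained Python sketch of the main loop on a toy objective. It combines the chaotic coefficients (6), the inertia weight (7) as reconstructed above, the DDT of (8)-(9), and the boundary handling (15); the starling local search and the particle selection steps are omitted for brevity, so this is a simplified illustration rather than the full CSPSO.

import numpy as np

def sphere(x):
    return np.sum(x ** 2)

def cspso_core(f, D=10, M=50, t_max=200, w_min=0.4, w_max=1.2, m=0.5, seed=0):
    rng = np.random.default_rng(seed)
    xl, xu = -100.0, 100.0
    pos = rng.uniform(xl, xu, (M, D))
    vel = np.zeros((M, D))
    fit = np.apply_along_axis(f, 1, pos)
    pbest, pbest_fit = pos.copy(), fit.copy()
    g = fit.argmin()
    gbest, gbest_fit = pos[g].copy(), fit[g]
    r1, r2, c1, c2 = 0.37, 0.81, 2.0, 2.0
    for t in range(t_max):
        r1, r2 = 4 * r1 * (1 - r1), 4 * r2 * (1 - r2)           # (6)
        w = w_min + (w_max - w_min) * np.exp(-t / (t_max - t))  # (7)
        T = m * (t / t_max - 0.5)                               # (9)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos) + T  # (8)
        pos = pos + vel
        out = (pos < xl) | (pos > xu)                           # (15)
        vel[out] = -vel[out]
        pos = np.clip(pos, xl, xu)
        fit = np.apply_along_axis(f, 1, pos)
        better = fit < pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]
        g = pbest_fit.argmin()
        if pbest_fit[g] < gbest_fit:
            gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]
    return gbest_fit

print(cspso_core(sphere))   # best Sphere value found by the simplified loop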


Table 1: Benchmark functions.

Function | Expression | Domain | X_min | F_min
Sphere | f(x) = Σ_{i=1}^{D} x_i² | [−100, 100]^D | 0^D | 0
Rastrigin | f(x) = Σ_{i=1}^{D} [x_i² − 10 cos(2πx_i) + 10] | [−5.12, 5.12]^D | 0^D | 0
Griewank | f(x) = Σ_{i=1}^{D} x_i²/4000 − Π_{i=1}^{D} cos(x_i/√i) + 1 | [−600, 600]^D | 0^D | 0
Rosenbrock | f(x) = Σ_{i=1}^{D−1} [(x_{i+1} − x_i²)² + (1 − x_i)²] | [−2.048, 2.048]^D | 1^D | 0
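For reference, the four benchmark functions of Table 1 in Python (our own transcription; the Rosenbrock form follows the table as printed, without the usual factor of 100 on its first term):

import numpy as np

def sphere(x):
    return np.sum(x ** 2)

def rastrigin(x):
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)

def griewank(x):
    i = np.arange(1, x.size + 1)
    return np.sum(x ** 2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i))) + 1.0

def rosenbrock(x):
    return np.sum((x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

x0 = np.zeros(10)
print(sphere(x0), rastrigin(x0), griewank(x0), rosenbrock(np.ones(10)))  # all 0.0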

Figure 3: The Sphere and Rastrigin functions for D = 2.

5. Simulation Experiments

Three experiments are conducted to evaluate the performance of CSPSO. The first experiment validates the optimal parameter values in CSPSO using the Sphere and Rastrigin [33] benchmark functions. The second experiment validates the performance of CSPSO using four classical numerical benchmark functions [33]; these functions are all to be minimized, and their expressions and domains are presented in Table 1. The third experiment checks the effectiveness of CSPSO in data clustering using some datasets from the Artificial Datasets [34] and the UCI Machine Learning Repository [35]. CSPSO is implemented in MATLAB, and all the experiments are conducted on a LENOVO desktop computer with a 3.20 GHz Intel i5-4460 processor and 8 GB of RAM in a Windows 8 environment.

5.1. Experiment on Parameter Settings. As described above, the values of the parameters have important influences on the performance of CSPSO. This section focuses on checking the influences of the three critical parameters in CSPSO, the inertia weight w, the size of the neighborhood Ne, and the number of selected particles S, so as to find appropriate values for them. The Sphere and Rastrigin functions are used to study the influences of different parameter values in CSPSO. The shapes and ranges of the Sphere and Rastrigin functions for D = 2 are depicted in Figure 3. The dimension of the benchmark functions is D = 10 in this experiment, and CSPSO was run 30 times for each of the two benchmark functions.

For the validity of the experiments and fair comparisons, parameters not being tested are kept at the same values: the population size M = 50, the maximum number of iterations t_max = 100, and the cognitive coefficients c1 = c2 = 2.

The influence of the inertia weight w is tested first. Its value is adjusted according to (7). The values of w_min and w_max directly determine the decline speed of w and consequently affect the convergence speed of CSPSO, so their values are varied in the experiments. When the value of w varies, the minimum and maximum sizes of the neighborhood are kept at Ne_min = 2 and Ne_max = 5, and the number of selected particles is kept at S = 10. The best and mean values of the Sphere and Rastrigin functions are reported in Table 2.

The differences in these results are obvious. The optimal values of the Sphere and Rastrigin functions were obtained at different values of w. The mean values differ for the Rastrigin function because it is multimodal with many local optima, and it is generally difficult for traditional optimization algorithms to find global optimal solutions of multimodal functions.


Table 2: Results when varying the inertia weight w. Sphere and Rastrigin entries are in the form best (mean).

(w_min − w_max) | Sphere | Rastrigin | Mean time taken by Sphere | Mean time taken by Rastrigin
0.2−0.6 | 0 (0.0022) | 0 (0.9684) | 3.3773 | 3.1321
0.2−0.9 | 0 (0) | 0 (1.6269) | 3.1235 | 3.1358
0.4−0.9 | 0 (0) | 0 (1.7282) | 3.1258 | 3.1309
0.4−1.2 | 0 (0) | 0 (0.7567) | 3.1229 | 3.1359

Table 3: Results when varying the size of the neighborhood Ne. Sphere and Rastrigin entries are in the form best (mean).

(Ne_min − Ne_max) | Sphere | Rastrigin | Mean time taken by Sphere | Mean time taken by Rastrigin
2−5 | 0 (0.1382) | 0 (1.0071) | 3.1503 | 3.1420
2−7 | 0 (0) | 0 (0.7359) | 3.1389 | 3.1416
2−9 | 0 (0) | 0 (1.3716) | 3.1403 | 3.1446

Table 4: Results when varying the number of selected particles S. Sphere and Rastrigin entries are in the form best (mean).

S_e = S_f | S | Sphere | Rastrigin | Mean time taken by Sphere | Mean time taken by Rastrigin
5 | 10 | 0 (0) | 0 (0.6586) | 3.1808 | 3.1966
7 | 14 | 0 (0.7567) | 0 (1.5407) | 4.2845 | 4.2645
9 | 18 | 0 (2.5609) | 0 (2.4015) | 5.3850 | 5.3840

Based on these results, the optimal values of the lower and upper limits of the inertia weight are set to w_min = 0.4 and w_max = 1.2, where the minimum mean values are achieved.

The influence of the size of the neighborhood Ne is examined next. In CSPSO, the size of the neighborhood increases gradually to enhance the local search ability. Large values lead to slow convergence, and small values restrain the local search ability; an appropriate value balances exploration and exploitation. The lower and upper limits on the size of the neighborhood are set to different values, and the best and mean values of the Sphere and Rastrigin functions are reported in Table 3.

Table 3 shows that the best results are obtained when the lower and upper limits on the size of the neighborhood are Ne_min = 2 and Ne_max = 7. The influence of the number of selected particles S is tested next; it affects both convergence and search ability. Different values of S_e and S_f are used, and the results are reported in Table 4.

The results in Table 4 show that the best number of selected particles is 10, where the mean values of the test functions and the computation times are both low. Clearly, more particles do not always lead to better results: because of the randomness built into the algorithm, most of the particles may sometimes just gather around local optima.

Based on these results, the lower and upper limits on the inertia weight, w_min and w_max, are set to 0.4 and 1.2, respectively; the lower and upper limits on the size of the neighborhood, Ne_min and Ne_max, are set to 2 and 7, respectively; and the number of selected particles S is set to 10. These parameter values are kept the same in CSPSO and are used in the following experiments.

5.2. Benchmark Functions. In this section, the four classical numerical benchmark functions presented in Table 1 are used to validate the effectiveness of CSPSO. The dimension of the benchmark functions, which determines the dimension of the search space, is set to D = 10, 20, and 30. The performance of CSPSO is compared with those of PSO, fitness-Euclidean distance ratio particle swarm optimization (FER-PSO) [23], and starling particle swarm optimization (SPSO) [25]. PSO is the basic swarm intelligence algorithm developed by Kennedy and Eberhart [26]. FER-PSO replaces the individual best of a particle with the individual best of a neighbor selected based on Euclidean distance and fitness function value. SPSO uses the collective responses of the particles to adjust their positions and velocities when the algorithm stagnates.

The values of the adjustable parameters in these competing algorithms are the best ones reported in the respective references, as listed in Table 5. Each competing algorithm was also run 50 independent times so as to obtain meaningful results. The best fitness function values obtained by these algorithms for the test functions are reported in Table 6.

Figure 4 shows the convergence of the algorithms on some of the functions. Compared with PSO and FER-PSO in Figures 4(b) and 4(c), the curves of the function values obtained by CSPSO decline slowly at the beginning of the evolutionary process because of the time taken by local search, so its initial convergence is slow. Relying on DDT and chaotic mapping, CSPSO finds lower fitness values than the other algorithms after about half of the evolutionary process, rather than falling into local optima. The local search strategy in the neighborhood helps in finding better solutions in the vicinity of the current position of a particle. In particular, the curve of the Sphere function obtained by CSPSO disappears after the best value 0 has been found, because this value cannot be marked on the logarithmic chart.


Table 5: Parameter settings in the experiment.

Parameters | PSO | ABC [22] | FER-PSO [23] | GABC [24] | SPSO [25] | CSPSO
Population (M) | 50 | 50 | 50 | 50 | 50 | 50
t_max | 200 | 200 | 200 | 200 | 200 | 200
c1, c2 | 2, 2 | − | 2, 2 | − | 2, 2 | 2, 2
r1, r2 | (0, 1) | (−1, 1) | (0.1, 0.4) | (−1, 1) | (0, 1) | (0, 1)
(w_min, w_max) | 1 | − | 1 | 1 | (0.2, 0.4) | (0.4, 1.2)
Neighbors (Ne) | − | − | 7 | − | 7 | 2−7
Selected (S) | − | − | − | − | − | 10
Sub-population | − | − | − | − | 14 | −
Stagnant limit | − | − | − | − | 2 | −
Trial limit | − | 10 | − | 10 | − | −

Table 6: Best fitness function values obtained by the different algorithms for the test functions.

Function | D | PSO | FER-PSO | SPSO | CSPSO
Sphere | 10 | 0 | 0 | 0 | 0
Sphere | 20 | 0.5285 | 0.1613 | 8.2232e-12 | 0
Sphere | 30 | 2.4235 | 0.8891 | 5.1645e-12 | 0
Rastrigin | 10 | 0.0727 | 0.03647 | 0 | 0
Rastrigin | 20 | 0.3595 | 0.1895 | 0 | 0
Rastrigin | 30 | 1.0374 | 0.4847 | 0 | 0
Griewank | 10 | 0.5965 | 0.1135 | 0 | 0
Griewank | 20 | 1.4575 | 1.0994 | 0 | 0
Griewank | 30 | 3.2947 | 1.6874 | 0 | 0
Rosenbrock | 10 | 6.3220 | 6.8162 | 5.2486 | 0.7037
Rosenbrock | 20 | 19.8643 | 19.2219 | 18.7899 | 9.8431
Rosenbrock | 30 | 38.1130 | 33.7894 | 28.7863 | 18.6928

As shown in Figure 4(d), for the Rosenbrock function, a multimodal function, traditional algorithms like PSO, FER-PSO, and SPSO are easily trapped in local optima, whereas CSPSO finds the smallest function values among these algorithms. These results show that CSPSO, through the use of DDT and chaotic mapping, has strong global search ability and can quickly escape local optima.

As the results for the four classic functions in Table 6 show, CSPSO and SPSO perform better than PSO and FER-PSO on the Rastrigin and Griewank functions, and CSPSO performs better than the other three algorithms on the Sphere and Rosenbrock functions. In general, the results show that CSPSO has more powerful global search ability than traditional algorithms like PSO, FER-PSO, and SPSO on both unimodal and multimodal functions. Furthermore, the better performance of CSPSO provides evidence that the improvements in CSPSO are effective in enhancing the performance of PSO and that CSPSO finds the global optimum more often than the other algorithms in these experiments.

5.3. Clustering Problems. In this section, experiments on clustering problems with different datasets are performed, and the results of the different algorithms, including CSPSO, are reported and discussed. Eight benchmark datasets are used: Data_3_2, Data_5_2, Data_10_2, and Data_4_3 from the work reported in [34], and Iris, Glass, Contraceptive Method Choice (CMC), and Wine from the UCI Machine Learning Repository [35]. More details about these datasets are given in Table 7. Data_5_2 and Data_4_3 are also plotted in Figure 5.

The optimized values of the parameters in the other algorithms, shown in Table 5, are adopted in the experiments. In order to perform fair comparisons, all the algorithms are run on all the clustering datasets 50 times to eliminate the effects of random factors. As simple statistics, the best values (Best), worst values (Worst), mean values (Mean), and standard deviations (SD) are used as the evaluation criteria to measure the effectiveness of these clustering algorithms. The experimental environment is the same for all the clustering algorithms.

Figure 6 shows the convergence of these algorithms on four datasets for typical runs. The fitness function values obtained by CSPSO decline slowly at the beginning of the evolutionary process and continue to decline, rather than dropping into a local optimum in the middle of the evolutionary process, possibly because DDT gives the particles high velocities. Local search based on the Euclidean and fitness neighborhood structure also helps the particles find better positions. This phenomenon is more evident on high-dimensional datasets such as the Glass dataset.

To be clearer, Figure 7 gives more details about the convergence of these algorithms near the end of the search process on three of the four datasets.


Table 7: Description of the datasets used in the experiments.

Dataset | Clusters (K) | Features (d) | Total instances (N) | Instances in each cluster
Data_3_2 | 3 | 2 | 76 | (25, 25, 26)
Data_5_2 | 5 | 2 | 250 | (50, 50, 50, 50, 50)
Data_10_2 | 10 | 2 | 500 | (50 per cluster)
Data_4_3 | 4 | 3 | 400 | (100, 100, 100, 100)
Iris | 3 | 4 | 150 | (50, 50, 50)
Glass | 6 | 9 | 214 | (70, 17, 76, 13, 9, 29)
CMC | 3 | 9 | 1473 | (629, 334, 510)
Wine | 3 | 13 | 178 | (59, 71, 48)

Figure 4: Convergence of CSPSO, PSO, SPSO, and FER-PSO on some functions (D = 10): (a) Sphere function; (b) Rastrigin function; (c) Griewank function; (d) Rosenbrock function. Each panel plots the best values (log scale) against the iteration number (0–200).

CSPSO has the best performance among all these clustering algorithms. ABC is the first to converge to a local optimal point, possibly because of the influence of randomly selected neighbors. GABC performs poorly for large data volumes, such as the Data_5_2 and Glass datasets, possibly due to the local search behavior of the onlooker bees. FER-PSO has apparently better convergence performance than the other algorithms, possibly due to the strategy that only one of the neighbors is selected to guide the search of a particle, but it is easily trapped in local optimal points in the later generations of the search process.


Figure 5: Distributions of data points in the Data_5_2 and Data_4_3 datasets.

SPSO shows a better performance in searching for a global optimum, but its local search strategy does not work well on high-dimensional datasets. The above results and discussions reveal the different performances of PSO, ABC, FER-PSO, GABC, SPSO, and CSPSO and verify the good performance of CSPSO as a clustering algorithm.

It can be seen from Table 8 that CSPSO performs better on these classical clustering problems. For the Data_3_2, Data_5_2, and Data_10_2 datasets, CSPSO obtained lower mean fitness values than the others, although some of the other algorithms achieved better best fitness values than CSPSO, possibly due to the randomness of the swarm intelligence algorithms. For the Data_4_3 dataset, because of its large number of data points, the local search ability of CSPSO is affected by the selected neighbors; ABC has better stability on this dataset but a poorer ability to search for a global optimum, due to its random strategy. CSPSO outperforms the other algorithms and obtains lower mean fitness values on the Iris, Glass, CMC, and Wine datasets, so it performs better on high-dimensional data. The proposed algorithm, with embedded DDT and chaotic mapping, has better global search ability and is more stable than traditional PSO through the use of a local search method based on starling birds. Furthermore, as the sizes of the datasets increase in both dimension and number of data points, CSPSO performs better than PSO, ABC, GABC, FER-PSO, and SPSO, making it more capable of dealing with big data in the current information age.

6. Conclusions

Because the performance of standard PSO depends on its parameter values, parameter adjustment is one of the most effective ways for PSO to achieve good exploration and exploitation abilities. CSPSO, proposed in this study, enhances the overall convergence of the search process. Nonlinear adjustment of the inertia weight and the chaotic search method based on logistic mapping enhance the global search performance of the traditional PSO and help particles escape from local optima. The DDT added to the velocity update increases the velocities of the particles in later iterations and also helps particles escape from local optima. The local search strategy based on the behavior of starling birds utilizes the information of the nearest neighbors in the neighborhood and determines a new collective position and a new collective velocity. The two particle selection methods, Euclidean distance and fitness function value, maintain the population diversity of the particles. These improvements help particles avoid stagnation and find better solutions in the search space.

The results of the simulation experiments on both benchmark function optimization and classical clustering problems show the effectiveness of CSPSO. Compared with the traditional PSO and other evolutionary algorithms, CSPSO has better convergence properties and stronger global search ability. However, as with other swarm intelligence algorithms, CSPSO may be trapped into, and stagnate around, local optimal points and may therefore be unstable when applied to problems with multiple local optima. In future work, CSPSO will be applied to other combinatorial optimization problems, such as scheduling problems. Other metaheuristic methods and optimization techniques may also be used to improve the performance of PSO.

Data Availability

All the datasets used in the simulation experiments came from the Artificial Datasets, available at http://www.isical.ac.in/~sanghami/data.html (accessed June 2017), and from the UCI Machine Learning Repository, available at http://archive.ics.uci.edu/ml/datasets.html (accessed June 2017).

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.


Figure 6: Convergence of CSPSO, PSO, SPSO, FER-PSO, GABC, and ABC on some datasets: (a) Data_3_2; (b) Data_5_2; (c) Iris; (d) Glass. Each panel plots the best values against the iteration number (0–200).

Figure 7: Convergence of the algorithms on the Data_3_2, Data_5_2, and Iris datasets during the final iterations of the search process.


Table 8: Comparison of the performance of CSPSO with the other clustering algorithms.

Dataset | Criterion | PSO | ABC | FER-PSO | GABC | SPSO | CSPSO
Data_3_2 | Best | 47.5287 | 47.7630 | 47.5290 | 47.5282 | 47.5733 | 47.5282
Data_3_2 | Worst | 59.1935 | 60.1039 | 49.7856 | 59.1895 | 53.1721 | 47.5382
Data_3_2 | Mean | 48.4644 | 50.9702 | 47.8771 | 48.2279 | 47.8934 | 47.5300
Data_3_2 | SD | 3.1956 | 3.1345 | 0.4967 | 2.7975 | 0.8551 | 0.0022
Data_5_2 | Best | 326.4386 | 337.5538 | 326.9987 | 329.7155 | 326.4666 | 326.4432
Data_5_2 | Worst | 392.2899 | 411.0786 | 332.6651 | 364.8400 | 327.0877 | 326.9676
Data_5_2 | Mean | 329.6249 | 373.6977 | 328.7897 | 343.5928 | 326.7493 | 326.5300
Data_5_2 | SD | 12.9978 | 20.1046 | 1.1594 | 9.6214 | 0.1536 | 0.0942
Data_10_2 | Best | 1.0793e+03 | 916.4320 | 875.6264 | 1.3134e+03 | 843.2274 | 849.8712
Data_10_2 | Worst | 1.4459e+03 | 1.2718e+03 | 1.2068e+03 | 1.1657e+03 | 1.1827e+03 | 1.0141e+03
Data_10_2 | Mean | 1.2875e+03 | 1.0608e+03 | 1.0066e+03 | 638.046 | 941.8921 | 918.2542
Data_10_2 | SD | 70.4484 | 76.1788 | 67.1964 | 1.3134e+03 | 85.5520 | 50.5344
Data_4_3 | Best | 751.8897 | 1.0937e+03 | 830.5456 | 750.4056 | 749.9506 | 749.6738
Data_4_3 | Worst | 1.2866e+03 | 1.6347e+03 | 1.3753e+03 | 1.8485e+03 | 1.2726e+03 | 1.2725e+03
Data_4_3 | Mean | 838.8670 | 1.3838e+03 | 1.0312e+03 | 1.1489e+03 | 793.8641 | 784.5028
Data_4_3 | SD | 181.8515 | 122.3579 | 150.0554 | 271.3938 | 142.5136 | 125.1508
Iris | Best | 100.5216 | 113.1201 | 98.0975 | 96.6829 | 96.6605 | 96.6605
Iris | Worst | 130.6931 | 156.8960 | 102.9780 | 127.6869 | 98.6906 | 98.0888
Iris | Mean | 113.7655 | 133.1161 | 100.3761 | 105.0487 | 96.9465 | 96.9194
Iris | SD | 7.1771 | 11.0706 | 1.2211 | 11.2307 | 0.3921 | 0.2931
Glass | Best | 329.1487 | 333.7214 | 278.6565 | 288.5152 | 235.3554 | 229.6544
Glass | Worst | 436.5615 | 503.1607 | 326.5351 | 402.5881 | 309.0997 | 278.9187
Glass | Mean | 376.4035 | 422.3809 | 297.3875 | 346.4078 | 280.5624 | 257.0244
Glass | SD | 21.5371 | 32.0309 | 11.1730 | 26.9515 | 14.5045 | 10.3633
CMC | Best | 5.8449e+03 | 5.9895e+03 | 5.6153e+03 | 5.6704e+03 | 5.5615e+03 | 5.5404e+03
CMC | Worst | 6.9260e+03 | 7.2137e+03 | 5.9728e+03 | 6.4143e+03 | 5.7676e+03 | 5.5861e+03
CMC | Mean | 6.1772e+03 | 6.4821e+03 | 5.7297e+03 | 5.8864e+03 | 5.6426e+03 | 5.5562e+03
CMC | SD | 186.7542 | 243.2599 | 73.0180 | 140.8947 | 44.6957 | 8.7806
Wine | Best | 1.6308e+04 | 1.6399e+04 | 1.6312e+04 | 1.6309e+04 | 1.6305e+04 | 1.6298e+04
Wine | Worst | 1.6798e+04 | 1.7317e+04 | 1.6376e+04 | 1.6482e+04 | 1.6356e+04 | 1.6313e+04
Wine | Mean | 1.6438e+04 | 1.6705e+04 | 1.6336e+04 | 1.6355e+04 | 1.6322e+04 | 1.6304e+04
Wine | SD | 95.0097 | 219.7057 | 15.9233 | 35.4218 | 12.0030 | 3.3799

Acknowledgments

This research project was partially supported by the National Natural Science Foundation of China (61472231, 61502283, 61640201), the Humanities and Social Science Research Fund of the Ministry of Education of China (12YJA630152), and the Shandong Social Science Foundation of China (16BGLJ06, 11CGLJ22).

References

[1] X. Liu, H. Liu, and H. Duan, "Particle swarm optimization based on dynamic niche technology with applications to conceptual design," Advances in Engineering Software, vol. 38, no. 10, pp. 668–676, 2007.
[2] Z. Bao and T. Watanabe, "Mixed constrained image filter design using particle swarm optimization," Artificial Life and Robotics, vol. 15, no. 3, pp. 363–368, 2010.
[3] S. Sahin and M. A. Cavuslu, "FPGA implementation of wavelet neural network training with PSO/iPSO," Journal of Circuits, Systems and Computers, vol. 27, no. 6, 2018.
[4] D. Wu and H. Gao, "A BP and switching PSO based optimization approach for engine optimization," National Academy of Science Letters, vol. 40, no. 1, pp. 33–37, 2017.
[5] Q. Jia and Y. Guo, "Hybridization of ABC and PSO algorithms for improved solutions of RCPSP," Journal of the Chinese Institute of Engineers, vol. 39, no. 6, pp. 727–734, 2016.
[6] C. Yue, B. Qu, and J. Liang, "A multi-objective particle swarm optimizer using ring topology for solving multimodal multi-objective problems," IEEE Transactions on Evolutionary Computation, 2017.
[7] X. Liu and J. Xue, "A cluster splitting technique by Hopfield networks and P systems on simplices," Neural Processing Letters, pp. 1–24, 2017.


[8] Y. Zhao, X. Liu, and W. Wang, "Spiking neural P systems with neuron division and dissolution," PLoS ONE, vol. 11, no. 9, Article ID e0162882, 2016.
[9] X. Liu, Y. Zhao, and M. Sun, "An improved Apriori algorithm based on an evolution-communication tissue-like P system with promoters and inhibitors," Discrete Dynamics in Nature and Society, vol. 2017, 2017.
[10] N. Netjinda, T. Achalakul, and B. Sirinaovakul, "Particle swarm optimization inspired by starling flock behavior," Applied Soft Computing, vol. 35, pp. 411–422, 2015.
[11] A. El Dor, D. Lemoine, M. Clerc, P. Siarry, L. Deroussi, and M. Gourgand, "Dynamic cluster in particle swarm optimization algorithm," Natural Computing, vol. 14, no. 4, pp. 655–672, 2015.
[12] Y. M. B. Ali, "Unsupervised clustering based an adaptive particle swarm optimization algorithm," Neural Processing Letters, vol. 44, no. 1, pp. 221–244, 2016.
[13] G. Armano and M. R. Farmani, "Multiobjective clustering analysis using particle swarm optimization," Expert Systems with Applications, vol. 55, pp. 184–193, 2016.
[14] B. Niu, Q. Duan, L. Tan, C. Liu, and P. Liang, "A population-based clustering technique using particle swarm optimization and k-means," in Advances in Swarm and Computational Intelligence, vol. 9140 of Lecture Notes in Computer Science, pp. 145–152, Springer, Cham, 2015.
[15] X. Zhao, W. Lin, J. Hao, X. Zuo, and J. Yuan, "Clustering and pattern search for enhancing particle swarm optimization with Euclidean spatial neighborhood search," Neurocomputing, vol. 171, pp. 966–981, 2016.
[16] E. Comak, "A modified particle swarm optimization algorithm using Renyi entropy-based clustering," Neural Computing and Applications, vol. 27, no. 5, pp. 1381–1390, 2016.
[17] K. K. Bharti and P. K. Singh, "Opposition chaotic fitness mutation based adaptive inertia weight BPSO for feature selection in text clustering," Applied Soft Computing, vol. 43, pp. 20–34, 2016.
[18] W. Song, W. Ma, and Y. Qiao, "Particle swarm optimization algorithm with environmental factors for clustering analysis," Soft Computing, vol. 21, no. 2, pp. 283–293, 2017.
[19] R. Liu, J. Li, J. Fan, C. Mu, and L. Jiao, "A coevolutionary technique based on multi-swarm particle swarm optimization for dynamic multi-objective optimization," European Journal of Operational Research, vol. 261, no. 3, pp. 1028–1051, 2017.
[20] X. Chu, B. Niu, J. J. Liang, and Q. Lu, "An orthogonal-design hybrid particle swarm optimiser with application to capacitated facility location problem," International Journal of Bio-Inspired Computation, vol. 8, no. 5, pp. 268–285, 2016.
[21] D. Wang, D. Tan, and L. Liu, "Particle swarm optimization: an overview," Soft Computing, vol. 22, no. 2, p. 22, 2018.
[22] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, Erciyes University, 2005.
[23] X. Li, "A multimodal particle swarm optimizer based on fitness Euclidean-distance ratio," in Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation (GECCO '07), pp. 78–85, London, England, July 2007.
[24] G. Zhu and S. Kwong, "Gbest-guided artificial bee colony algorithm for numerical function optimization," Applied Mathematics and Computation, vol. 217, no. 7, pp. 3166–3173, 2010.
[25] A. Laudani, F. R. Fulginei, G. M. Lozito, and A. Salvini, "Swarm/flock optimization algorithms as continuous dynamic systems," Applied Mathematics and Computation, vol. 243, pp. 670–683, 2014.
[26] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.
[27] K. K. Bharti and P. K. Singh, "Chaotic gradient artificial bee colony for text clustering," Soft Computing, vol. 20, no. 3, pp. 1113–1126, 2016.
[28] R. M. May, "Simple mathematical models with very complicated dynamics," Nature, vol. 261, no. 5560, pp. 459–467, 1976.
[29] F. S. Hover and M. S. Triantafyllou, "Application of polynomial chaos in stability and control," Automatica, vol. 42, no. 5, pp. 789–795, 2006.
[30] X. Liu, L. Xiang, and X. Wang, "Spatial cluster analysis by the Adleman-Lipton DNA computing model and flexible grids," Discrete Dynamics in Nature and Society, vol. 2012, Article ID 894207, 32 pages, 2012.
[31] H. Peng, X. Luo, Z. Gao, J. Wang, and Z. Pei, "A novel clustering algorithm inspired by membrane computing," The Scientific World Journal, vol. 3, no. 22, Article ID 929471, 8 pages, 2015.
[32] B. Ufnalski and L. M. Grzesiak, "Plug-in direct particle swarm repetitive controller with a reduced dimensionality of a fitness landscape - a multi-swarm approach," Bulletin of the Polish Academy of Sciences: Technical Sciences, vol. 63, no. 4, pp. 857–866, 2015.
[33] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.
[34] "Datasets," http://www.isical.ac.in/~sanghami/data.html.
[35] H. C. Blake, UCI Repository of Machine Learning Databases, 1998.


Page 4: A New Chaotic Starling Particle Swarm Optimization ...downloads.hindawi.com/journals/mpe/2018/8250480.pdfMathematicalProblemsinEngineering Chaotic Parameter u 2.5 3 3.5 4 Chaotic Value

4 Mathematical Problems in Engineering

trajectory of particles in the population by comparing thefitness function values to select the local and global bestpositions Because of local optimal solutions the velocity ofa particle is gradually decreasing during the later iterationsTherefore the majority of the particles fall into and cannoteasily jump out of local optimal points called prematureconvergence and evolutionary stagnation To overcome theseproblems a method based on DDT [32] is used to changethe velocity update of each particle DDT is embedded intovelocity updating when the velocity of a particle becomes toolow or when its position stays unchanged in the middle andfinal generations The modified velocity is updated using (8)and the DDT represented by 119879 is updated using (9) in thefollowing

V119894119895 (119905 + 1) = 119908 lowast V119894119895 (119905) + 1198881 lowast 1199031 lowast (119901119894119895 (119905) minus 119909119894119895 (119905))+ 1198882 lowast 1199032 lowast (119901119892119895 (119905) minus 119909119894119895 (119905)) + 119879 (8)

119879 = 119898 lowast ( ttmax

minus 05) (9)

where 119898 is an accommodation coefficient with a valuebetween 0 and 1 This improvement helps in preventing aparticle from flying out of the boundary of the search spacewhen its velocity is too high in early generations and inescaping from a local optimal point by increasing its velocityso as to improve the overall searching capacity Besides 119879increases linearly to avoid oscillation and to keep a relativelystable search direction in the optimization process

33 Local Search Based on Starling Birds In nature a star-ling bird spreads some kinds of information to its nearbyneighbors in order to defend its position and the informationcoming from one starling bird can spread to everywherein the swarm Inspired by this phenomenon a local searchmethod based on starling bird behavior is used in thedeveloped PSO procedure Each particle seeks a new positionand velocity in the neighborhood by a weighted method thatcan be seen as a kind of information communication betweenan individual and its nearby neighborsThis mechanism triesto seek a better solution in the local exploration around theparticle and reduces the time needed in local search [25]

Dynamic Neighborhood Selection For each particle 119894 theneighborhood 119867 is a subset of the particle population Theparticles in the population are listed in decreasing order ofthe fitness-Euclidean distance ratio (FER) fromparticle 119894 [23]and the119873119890 particles ranked on the top are taken as the ones inthe neighborhood of the particle FER is determined by (10)in the following

119865119864119877(119909119894 1199091198941) = 119886 lowast ((119865 (119909119894) minus 119865 (1199091198941))119864119863(119909119894 1199091198941) ) 1198941 = 1 2 119872

(10)

where ED(119909119894 1199091198941) is the Euclidean distance between 119909119894 and1199091198941 120572 = 119904(F(119909119908) minus F(119909119887)) 119909119887 and 119909119908 are the best and

worst positions of the particles in the current population 119904is the size of search space defined as 119904 = radicsum119863119895 (119909119906119895 minus 119909119897119895)2and 119909119906119895 and 119909119897119895 are the maximum and minimum values of the119895th dimension of the data points in the dataset The numberof neighbors 119873119890 is the size of the neighborhood In order toexplore different search scope a dynamic strategy is used toadjust the size of the neighborhood specified in (11) in thefollowing

119873119890 = 119873119890min + INT(119905 lowast ((119873119890max minus 119873119890min)119905max)) (11)

where 119873119890max and 119873119890min are the maximum and minimumsizes of the neighborhood and INT(119886) is the integral functiontaking only the integer part of 119886Position Update The position of particle 119894 is adjusted usingthe information of the particles in the neighborhood Usingthe weighted positions of the neighbors the new position ofparticle 119894 is determined by (12)

119909119894 = 119909119894 + 1199033 lowast (( 1119873119890) lowast sum1198941isin119867

1199091198941) (12)

where 119909119894 is the current position of particle 119894 119909119894 is the newposition after the adjustment and 1199033 is a randomly generatedreal number uniformly distributed between minus1 and 1

Velocity Update Similar to the update of the position thevelocity of particle 119894 is updated using the velocity informationof the particles in the neighborhood The new velocity ofparticle 119894 is determined by (13) in the following

V119894 = V119894 + 1199034 lowast (( 1119873119890) lowast sum1198941isin119867

V1198941) (13)

where V119894 and V119894 are the current and updated velocities ofparticle 119894 and 1199034 is a randomly generated real number uni-formly distributed between 0 and 1 The collective responsesof starling birds are described by the pseudocode shown inAlgorithm 2

4 The CSPSO for Clustering Problems

41 Particle Selection In the previous section a local searchmethod based on starling bird behavior is used in theneighborhood of a particle to search for better solutionsTime needed by this method will increase In order toaccelerate search speed and reduce time needed the localsearch method is not applied to all particles Two methodsare adopted to select particles from the population The localsearchmethod is then applied to all the particles selected withthese two methods

An Euclidean Distance Method In each generation all theparticles are sorted in ascending order of their Euclideandistances from the global best The particles are dividedinto 119878119890 groups of approximately equal size based on their

Mathematical Problems in Engineering 5

Input 119873119890for Particle 119894

Find the119873119890 neighbors in the neighborhood119867 of particle 119894 using FERThe new position of particle 119894 119909119894 = 119909119894 + 1199033 lowast ((1119873119890) lowast sum1198941isin119867 1199091198941)The new position of particle 119894 V119894 = V119894 + 1199034 lowast ((1119873119890) lowast sum1198941isin119867 V1198941 )if Particle()FitnessltParticle(119894)Fitness

Particle(119894)Fitness=Particle()FitnessParticle(119894)Position=Particle()PositionParticle(119894)Velocity=Particle()Velocity

End ifReturn Particle 119894

End forOutput Particle 119894

Algorithm 2 Starling Birds Collective Responses

Particle swarm initialization

Update velocity and position ofeach particle

Update individual best

and globalbest positions

Local search method in starling birds

Sort the particle population using fitness

values

Selected particles using

Euclidean distance

particles using Fitness function

Max iteration

Return global best position

Particles

Identify neighbors

Position and velocity

adjustment

Select particleusing fitness values

update positionand velocity

Update globalbest position

NO

YES

=+

Selected

Figure 2 A local search method in the CSPSO algorithm

Euclidean distances to the global best The particle rankedon the top of each group is selected As a result a total of 119878119890particles are selected The particles selected in this methoddiversify the search and enhance exploration of the CSPSOprocedure

A Fitness Function Method In each generation all theparticles are sorted in the ascending order of their fitnessfunction values A total of 119878119891 particles ranked on the topare selected The particles selected in this method focus onthe promising regions in the search space so as to enhanceexploitation of the CSPSO procedure

The two particle selection methods described above areused in CSPSO to improve its convergence and increasediversity As a result a total of 119878 = 119878119890+119878119891 particles are selectedIn the implementation 119878119890 = 119878119891 is used Particles selected

through the Euclidean distance method are diversified in thesearch space to effectively avoid premature convergence Theparticles selected through the fitness function method willlead the search direction Figure 2 gives more details aboutthe process of particle selection

42 Population Initialization In the implementation ofCSPSO each cluster center is the position of a particle eachcluster center represents a cluster and different cluster centersrepresent different clustersThe set of cluster centers 119911 = 119911119896for 119896 = 1 2 119870 represents a partitioning result and alsorepresents the position of a particle 119909119894 = 1199111198941 1199111198942 119911119894119870In the initialization the initial position of each particle isgenerated randomly in the entire search space Dimension 119895of the position of particle 119894 represented by 119909119894119895 is generated

6 Mathematical Problems in Engineering

Input 119872 119905max 119873119890min 119873119890max 119908min 119908maxParticle Swarm Population Initializationfor 119894 = 1 to119872 (Population Size)

Particle(119894)BestPosition=Particle(119894)PositionParticle(119894)BestFitness=Particle(119894)Fitnessif Particle(119894)BestFitnessltBest Particle Fitness

Best ParticlePosition=Particle(119894)BestPositionBest ParticleFitness=Particle(119894)BestFitness

End ifEnd forfor 119905 = 1 to 119905max (Max Iteration)Algorithm 1 The Standard PSO Algorithm(See Algorithm 1)

Particle Selection119878119890 = Euclidean Distance Function (Best ParticlePosition)119878119891 = Fitness Function (Particle(119894)Fitness)119878 = 119878119890 + 119878119891Algorithm 2 Starling Birds Collective Responses

for 119894 = 1 to 119878Find the119873119890 neighbors in the neighborhood119867 of particle 119894 using FERThe new position of particle 119894 119909119894 = 119909119894 + 1199033 lowast ((1119873119890) lowast sum1198941isin119867 1199091198941)The new position of particle 119894 V119894 = V119894 + 1199034 lowast ((1119873119890) lowast sum1198941isin119867 V1198941 )if Particle()FitnessltParticle(119894)FitnessParticle(119894)Fitness=Particle()FitnessParticle(119894)Position=Particle()PositionParticle(119894)Velocity=Particle()Velocityif Particle()FitnessltBest ParticleFitness

Best ParticlePosition=Particle()BestPositionBest ParticleFitness=Particle()BestFitness

End ifEnd if

End for

119873119890 = 119873119890min + INT(119905 lowast ((119873119890max minus 119873119890min)119905max))

End forOutput Best ParticleFitness

Algorithm 3 A Chaotic Starling PSO Algorithm

randomly between 119909119897119895 and 119909119906119895 where 119909119897119895 and 119909119906119895 are theminimum and maximum values of dimension 119895 of the searchspace and also are the limits of the positions of the particlesTherefore 119909119894119895 isin [119909119897119895 119909119906119895 ] for 119894 = 1 2 119872 and 119895 =1 2 119863 The values of 119909119897119895 and 119909119906119895 are determined as (14)

119909119897119895 = min (119883119894119895 | 119894 = 1 2 119873) 119909119906119895 = max (119883119894119895 | 119894 = 1 2 119873)

for 119895 = 1 2 119889(14)

where 119883119894119895 is dimension 119895 of data point 119883119894 In particular thedimension 119863 of a particle is D = 119870 lowast 11988943 Boundary Handling If a particle flies outside or acrossthe boundary of the search space this particle no longerrepresents a solution The position and velocity of such aparticle are adjusted If the position of a particle in dimension

119895 is below the lower (above the upper) limit of that dimensionit is set to the lower (upper) limit of that dimensionAccordingly the velocity of this particle along dimension119895 is reversed These adjustments can limit the scope of thepositions and change the velocities as well as changing thedirections of the flying paths of the particles The positionand velocity of such a particle are adjusted as follows

119909119894119895 = 119909119897119895 if 119909119894119895 lt 119909119897119895119909119894119895 = 119909119906119895 if 119909119894119895 gt 119909119906119895 V119894119895 = minusV119894119895

(15)

where 119909119894119895 is the new value of dimension 119895 of the position andV119894119895 is the new value of dimension 119895 of the velocity of particle 119894after the adjustments

4.4. The Pseudocode of the Proposed CSPSO. The pseudocode of the proposed CSPSO is presented in Algorithm 3; the standard PSO update (Algorithm 1) and the starling birds collective responses (Algorithm 2) appear as steps inside its main loop.

Algorithm 3: A Chaotic Starling PSO Algorithm

Input: M, t_max, Ne_min, Ne_max, w_min, w_max
// particle swarm population initialization
for i = 1 to M (population size)
    Particle(i).BestPosition = Particle(i).Position
    Particle(i).BestFitness = Particle(i).Fitness
    if Particle(i).BestFitness < BestParticle.Fitness
        BestParticle.Position = Particle(i).BestPosition
        BestParticle.Fitness = Particle(i).BestFitness
    End if
End for
for t = 1 to t_max (max iteration)
    Update all particles with the standard PSO updates (see Algorithm 1)
    // particle selection
    S_e = particles selected by the Euclidean distance method (using BestParticle.Position)
    S_f = particles selected by the fitness function method (using Particle(i).Fitness)
    S = S_e + S_f
    // starling birds collective responses (see Algorithm 2)
    for i = 1 to S
        Find the Ne neighbors in the neighborhood H of particle i using FER
        Candidate position of particle i: x_i' = x_i + r_3 * ((1/Ne) * Σ_{i1∈H} x_{i1})
        Candidate velocity of particle i: v_i' = v_i + r_4 * ((1/Ne) * Σ_{i1∈H} v_{i1})
        if Fitness(x_i') < Particle(i).Fitness
            Particle(i).Fitness = Fitness(x_i')
            Particle(i).Position = x_i'
            Particle(i).Velocity = v_i'
            if Particle(i).Fitness < BestParticle.Fitness
                BestParticle.Position = Particle(i).Position
                BestParticle.Fitness = Particle(i).Fitness
            End if
        End if
    End for
    Ne = Ne_min + INT(t * (Ne_max - Ne_min) / t_max)
End for
Output: BestParticle.Fitness


Table 1: Benchmark Functions

Function     Expression                                                   Domain              X_min   F_min
Sphere       f(x) = Σ_{i=1}^{D} x_i^2                                     [-100, 100]^D       0^D     0
Rastrigin    f(x) = Σ_{i=1}^{D} [x_i^2 - 10 cos(2πx_i) + 10]              [-5.12, 5.12]^D     0^D     0
Griewank     f(x) = Σ_{i=1}^{D} x_i^2/4000 - Π_{i=1}^{D} cos(x_i/√i) + 1  [-600, 600]^D       0^D     0
Rosenbrock   f(x) = Σ_{i=1}^{D-1} [(x_{i+1} - x_i^2)^2 + (1 - x_i)^2]     [-2.048, 2.048]^D   1^D     0

[Figure 3: The Sphere and Rastrigin functions for D = 2 (surface plots over their domains).]

5. Simulation Experiments

Three experiments are conducted to evaluate the performance of CSPSO. The first experiment validates the optimal parameter values in CSPSO using the Sphere and Rastrigin [33] benchmark functions. The second experiment validates the performance of CSPSO using four classical numerical benchmark functions [33]. These functions are all to be minimized, and their domains are presented in Table 1. The third experiment checks the effectiveness of CSPSO in data clustering using some datasets from the Artificial Datasets [34] and the UCI Machine Learning Repository [35]. CSPSO is implemented in MATLAB, and all the experiments are conducted on a LENOVO desktop computer with an Intel i5-4460 3.20 GHz processor and 8 GB of RAM in a Windows 8 environment.

5.1. Experiment on Parameter Settings. As described above, the values of the parameters have important influences on the performance of CSPSO. This section focuses on checking the influences of the three critical parameters in CSPSO, the inertia weight w, the size of the neighborhood Ne, and the number of selected particles S, so as to find appropriate values for them. The Sphere and Rastrigin functions are used to study the influences of different values of the parameters in CSPSO. The shape and ranges of the Sphere and Rastrigin functions for D = 2 are depicted in Figure 3. The dimension of the benchmark functions is D = 10 in the experiment. CSPSO ran 30 times for each of the two benchmark functions.

For the validity of the experiments and fair comparisons, parameters not being tested are kept at the same values. These parameters are the population size M = 50, the maximum number of iterations t_max = 100, and the cognitive coefficients c1 = c2 = 2.

The influence of the inertia weight w is tested first. Its value is adjusted according to (8). The values of w_min and w_max directly determine the decline speed of w and consequently affect the convergence speed of CSPSO, so their values are varied in the experiments. When the value of w varies, the minimum and maximum sizes of the neighborhood are kept at Ne_min = 2 and Ne_max = 5, and the number of selected particles is kept at S = 10. The best and mean values of the Sphere and Rastrigin functions are reported in Table 2.

The differences in these results are obvious. The optimal values of the Sphere and Rastrigin functions were obtained at different values of w. However, the mean values differ for the Rastrigin function because it is multimodal with many local optima; generally, it is difficult to find global optimal solutions for multimodal functions with traditional optimization algorithms. Based on these results, the optimal


Table 2: Results when varying the inertia weight w.

w_min-w_max   Sphere best (mean)   Rastrigin best (mean)   Mean time, Sphere   Mean time, Rastrigin
0.2-0.6       0 (0.0022)           0 (0.9684)              3.3773              3.1321
0.2-0.9       0 (0)                0 (1.6269)              3.1235              3.1358
0.4-0.9       0 (0)                0 (1.7282)              3.1258              3.1309
0.4-1.2       0 (0)                0 (0.7567)              3.1229              3.1359

Table 3: Results when varying the size of the neighborhood Ne.

Ne_min-Ne_max   Sphere best (mean)   Rastrigin best (mean)   Mean time, Sphere   Mean time, Rastrigin
2-5             0 (0.1382)           0 (1.0071)              3.1503              3.1420
2-7             0 (0)                0 (0.7359)              3.1389              3.1416
2-9             0 (0)                0 (1.3716)              3.1403              3.1446

Table 4: Results when varying the number of selected particles S.

S_e = S_f   S    Sphere best (mean)   Rastrigin best (mean)   Mean time, Sphere   Mean time, Rastrigin
5           10   0 (0)                0 (0.6586)              3.1808              3.1966
7           14   0 (0.7567)           0 (1.5407)              4.2845              4.2645
9           18   0 (2.5609)           0 (2.4015)              5.3850              5.3840

values of the lower and upper limits of the inertia weight are set to w_min = 0.4 and w_max = 1.2, where the minimum mean values are achieved.

The influence of the size of the neighborhood Ne is examined next. In CSPSO, the size of the neighborhood Ne increases gradually to enhance the local search ability. Large values lead to slow convergence, and small values restrain the local search ability; an appropriate value balances exploration and exploitation. The lower and upper limits on the size of the neighborhood are set to different values, and the best and mean values of the Sphere and Rastrigin functions are reported in Table 3.

Table 3 shows that the best results are obtained when the size of the neighborhood is between the lower and upper limits Ne_min = 2 and Ne_max = 7. The influence of the number of selected particles S is tested next; it affects both convergence and search ability. Different values for S_e and S_f are used, and the results are reported in Table 4.

The results in Table 4 show that the best number of selected particles is 10, when the mean values of the test functions and the computation time are both low. It is easy to see that more particles do not always lead to better results: because of the randomness built into the algorithm, sometimes most of the particles may just gather around local optima.

Based on these results, the lower and upper limits on the inertia weight, w_min and w_max, are set to 0.4 and 1.2, respectively; the lower and upper limits on the size of the neighborhood, Ne_min and Ne_max, are set to 2 and 7, respectively; and the number of selected particles S is set to 10. These parameter values in CSPSO are kept the same and are used in the following experiments.

5.2. Benchmark Functions. In this section, the four classical numerical benchmark functions presented in Table 1 are used to validate the effectiveness of CSPSO. The dimension of the benchmark functions, which determines the dimension of the search space, is set to D = 10, 20, and 30. The performance of CSPSO is compared with those of PSO, fitness-Euclidean distance ratio particle swarm optimization (FER-PSO) [23], and starling particle swarm optimization (SPSO) [25]. PSO is the basic swarm intelligence algorithm, developed by Kennedy and Eberhart [26]. FER-PSO replaces the individual best of a particle with the individual best of a neighbor selected based on the Euclidean distance and the fitness function value. SPSO uses the collective responses of the particles to adjust their positions and velocities when the algorithm stagnates.

The values of the adjustable parameters in these competing algorithms are the best ones reported in the respective references, as listed in Table 5. These competing algorithms were also run for 50 independent times each so as to get meaningful results. The best fitness function values obtained by these algorithms for these functions are reported in Table 6.

Figure 4 shows the convergence of the algorithms on some of the functions. Compared with PSO and FER-PSO, in Figures 4(b) and 4(c), the curves of the function values obtained by CSPSO decline slowly at the beginning of the evolutionary process because of the time taken by local search; hence it has a slow convergence speed. Relying on DDT and chaotic mapping, CSPSO finds lower fitness values than the other algorithms after about half of the evolutionary process, rather than falling into local optima. The local search strategy in the neighborhood helps in finding better solutions near the current position of a particle. In particular, the curve of the Sphere function obtained by CSPSO disappears after the best value 0 has been found, because this value cannot be marked on the logarithmic chart. As shown in Figure 4(d), for the Rosenbrock function, a multimodal function, traditional algorithms like PSO,


Table 5: Parameter settings in the experiment.

Parameters         PSO      ABC [22]   FER-PSO [23]   GABC [24]   SPSO [25]    CSPSO
Population (M)     50       50         50             50          50           50
t_max              200      200        200            200         200          200
c1, c2             2, 2     -          2, 2           -           2, 2         2, 2
r1, r2             (0, 1)   (-1, 1)    (0.1, 0.4)     (-1, 1)     (0, 1)       (0, 1)
(w_min, w_max)     1        -          1              1           (0.2, 0.4)   (0.4, 1.2)
Neighbors (Ne)     -        -          7              -           7            2-7
Selected (S)       -        -          -              -           -            10
Sub-population     -        -          -              -           14           -
Stagnant limit     -        -          -              -           2            -
Trial limit        -        10         -              10          -            -

Table 6: Best fitness function values obtained by different algorithms for the test functions.

Function     D    PSO       FER-PSO   SPSO         CSPSO
Sphere       10   0         0         0            0
             20   0.5285    0.1613    8.2232e-12   0
             30   2.4235    0.8891    5.1645e-12   0
Rastrigin    10   0.0727    0.03647   0            0
             20   0.3595    0.1895    0            0
             30   1.0374    0.4847    0            0
Griewank     10   0.5965    0.1135    0            0
             20   1.4575    1.0994    0            0
             30   3.2947    1.6874    0            0
Rosenbrock   10   6.3220    6.8162    5.2486       0.7037
             20   19.8643   19.2219   18.7899      9.8431
             30   38.1130   33.7894   28.7863      18.6928

FER-PSO, and SPSO are easily trapped into local optima, but CSPSO finds the smallest function values among these algorithms. These results show that CSPSO has strong global search ability through the use of DDT and chaotic mapping and can quickly escape local optima.

As the results for the four classic functions in Table 6 show, CSPSO and SPSO perform better than PSO and FER-PSO on the Rastrigin and Griewank functions, and CSPSO performs better than the other three algorithms on the Sphere and Rosenbrock functions. In general, the results show that CSPSO has more powerful global search ability than traditional algorithms like PSO, FER-PSO, and SPSO on both unimodal and multimodal functions. Furthermore, as indicated by the results in Table 6, the better performance of CSPSO provides evidence that the improvements in CSPSO are effective in enhancing the performance of PSO and that CSPSO finds the global optimum more often than the other algorithms in the experiments.

5.3. Clustering Problems. In this section, experiments on clustering problems with different datasets are performed, and the results of different algorithms, including CSPSO, are reported and discussed. Eight benchmark datasets are used: Data_3_2, Data_5_2, Data_10_2, and Data_4_3 from the work reported in [34], and Iris, Glass, Contraceptive Method Choice (CMC), and Wine from the UCI Machine Learning Repository [35]. More details about these datasets are given in Table 7. Data_5_2 and Data_4_3 are also plotted in Figure 5.

The optimized values of the parameters in the other algorithms, shown in Table 5, are adopted in the experiments. In order to perform fair comparisons, all the algorithms were run on all the clustering datasets 50 times to eliminate the effects of random factors. As simple statistics, best values (Best), worst values (Worst), mean values (Mean), and standard deviations (SD) are used as the evaluation criteria to measure the effectiveness of these clustering algorithms. The experimental environment is the same for all clustering algorithms.

Figure 6 shows the convergence of these algorithms on four datasets for typical runs. The fitness values obtained by CSPSO decline slowly at the beginning of the evolutionary process and continue to decline, rather than dropping into a local optimum in the middle of the evolutionary process, possibly because DDT gives the particles high velocities. Local search based on the Euclidean and fitness neighborhood structure also helps the particles find better positions. This phenomenon is more evident on high-dimensional datasets, such as the Glass dataset.

To be clearer, Figure 7 gives more details about the convergence of these algorithms near the end of the search process on three of the four datasets.


Table 7: Description of the datasets used in the experiments.

Dataset     Clusters (K)   Features (d)   Total instances (N)   Instances in each cluster
Data_3_2    3              2              76                    (25, 25, 26)
Data_5_2    5              2              250                   (50, 50, 50, 50, 50)
Data_10_2   10             2              500                   (50, 50, 50, 50, 50, 50, 50, 50, 50, 50)
Data_4_3    4              3              400                   (100, 100, 100, 100)
Iris        3              4              150                   (50, 50, 50)
Glass       6              9              214                   (70, 17, 76, 13, 9, 29)
CMC         3              9              1473                  (629, 334, 510)
Wine        3              13             178                   (59, 71, 48)

[Figure 4: Convergence of the algorithms (CSPSO, PSO, SPSO, FER-PSO) on some functions (D = 10): (a) Sphere, (b) Rastrigin, (c) Griewank, (d) Rosenbrock; best values (log scale) versus iteration, 0 to 200.]

CSPSO has the best performance among all these clustering algorithms. ABC is the first to converge to a local optimal point, possibly because of the influence of randomly selected neighbors. GABC performs poorly on large data volumes, such as the Data_5_2 and Glass datasets, possibly due to the local search behavior of the onlooker bees. FER-PSO has apparently better convergence performance than the other algorithms, possibly due to the strategy that only one of the neighbors is selected to guide the search of a particle, but it is easily trapped in local optimal points in the later generations


[Figure 5: Distributions of data points in some datasets: Data_5_2 (left) and Data_4_3 (right).]

of the search process. SPSO shows a better performance in searching for a global optimum, but its local search strategy does not work well on high-dimensional datasets. The above results and discussions reveal the different performances of PSO, ABC, FER-PSO, GABC, SPSO, and CSPSO and verify the good performance of CSPSO as a clustering algorithm.

It can be seen from Table 8 that CSPSO has a better performance on these classical clustering problems. For the Data_3_2, Data_5_2, and Data_10_2 datasets, CSPSO obtained lower mean fitness values than the others, but other algorithms performed better than CSPSO on the best fitness values; this is possibly due to the randomness of swarm intelligence algorithms. For the Data_4_3 dataset, because of its large number of data points, the local search ability of CSPSO is affected by the selected neighbors. ABC has better stability on this dataset but a poorer ability in searching for a global optimum than the others, due to its random strategy. CSPSO outperforms the other algorithms and obtains lower mean fitness values than the others on the Iris, Glass, CMC, and Wine datasets; therefore, it has better performance on high-dimensional data. The proposed algorithm, with embedded DDT and chaotic mapping, has better global search ability and is more stable than traditional PSO through the use of a local search method based on starling birds. Furthermore, as the sizes of the datasets increase, in both the dimension and the number of data points, CSPSO performs better than PSO, ABC, GABC, FER-PSO, and SPSO; it is more capable of dealing with big data in the current information age.

6. Conclusions

Because the performance of standard PSO depends on its parameter values, parameter adjustment is one of the most useful ways for PSO to obtain good exploration and exploitation abilities. CSPSO, proposed in this study, enhances the overall convergence of the search process. Nonlinear adjustment of the inertia weight and the chaotic search method based on logistic mapping enhance the global search performance of the traditional PSO and help particles escape from local optima. DDT, added to the velocity update, increases the velocities of the particles in later iterations and also helps particles escape from local optima. The local search strategy based on the behavior of starling birds utilizes the information of the nearest neighbors in the neighborhood and determines a new collective position and a new collective velocity. The two particle selection methods, Euclidean distance and fitness function value, maintain the population diversity of the particles. These improvements help particles avoid stagnation and find better solutions in the search space.

The results of simulation experiments on both benchmark function optimization and classical clustering problems show the effectiveness of CSPSO. Compared with the traditional PSO and other evolutionary algorithms, CSPSO has better convergence properties and stronger global search ability. However, as with other swarm intelligence algorithms, CSPSO may be trapped into, and may stagnate around, local optimal points and therefore may be unstable when applied to problems with multiple local optima. In future work, CSPSO will be applied to other combinatorial optimization problems, such as scheduling problems. Other metaheuristic methods and optimization techniques may also be used to improve the performance of PSO.

Data Availability

All the datasets used in the simulation experiments came from the Artificial Datasets, available at http://www.isical.ac.in/~sanghami/data.html (accessed June 2017), and from the UCI Machine Learning Repository, available at http://archive.ics.uci.edu/ml/datasets.html (accessed June 2017).

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.


[Figure 6: Convergence of the algorithms (CSPSO, PSO, SPSO, FER-PSO, GABC, ABC) on some datasets: (a) Data_3_2, (b) Data_5_2, (c) Iris, (d) Glass; best values versus iteration, 0 to 200.]

[Figure 7: Convergence of the algorithms on the Data_3_2, Data_5_2, and Iris datasets near the end of the search (final iterations).]


Table 8: Comparison of performance of CSPSO with other clustering algorithms.

Dataset     Statistic   PSO          ABC          FER-PSO      GABC         SPSO         CSPSO
Data_3_2    Best        47.5287      47.7630      47.5290      47.5282      47.5733      47.5282
            Worst       59.1935      60.1039      49.7856      59.1895      53.1721      47.5382
            Mean        48.4644      50.9702      47.8771      48.2279      47.8934      47.5300
            SD          3.1956       3.1345       0.4967       2.7975       0.8551       0.0022
Data_5_2    Best        326.4386     337.5538     326.9987     329.7155     326.4666     326.4432
            Worst       392.2899     411.0786     332.6651     364.8400     327.0877     326.9676
            Mean        329.6249     373.6977     328.7897     343.5928     326.7493     326.5300
            SD          12.9978      20.1046      1.1594       9.6214       0.1536       0.0942
Data_10_2   Best        1.0793e+03   916.4320     875.6264     1.3134e+03   843.2274     849.8712
            Worst       1.4459e+03   1.2718e+03   1.2068e+03   1.1657e+03   1.1827e+03   1.0141e+03
            Mean        1.2875e+03   1.0608e+03   1.0066e+03   638.046      941.8921     918.2542
            SD          70.4484      76.1788      67.1964      1.3134e+03   85.5520      50.5344
Data_4_3    Best        751.8897     1.0937e+03   830.5456     750.4056     749.9506     749.6738
            Worst       1.2866e+03   1.6347e+03   1.3753e+03   1.8485e+03   1.2726e+03   1.2725e+03
            Mean        838.8670     1.3838e+03   1.0312e+03   1.1489e+03   793.8641     784.5028
            SD          181.8515     122.3579     150.0554     271.3938     142.5136     125.1508
Iris        Best        100.5216     113.1201     98.0975      96.6829      96.6605      96.6605
            Worst       130.6931     156.8960     102.9780     127.6869     98.6906      98.0888
            Mean        113.7655     133.1161     100.3761     105.0487     96.9465      96.9194
            SD          7.1771       11.0706      1.2211       11.2307      0.3921       0.2931
Glass       Best        329.1487     333.7214     278.6565     288.5152     235.3554     229.6544
            Worst       436.5615     503.1607     326.5351     402.5881     309.0997     278.9187
            Mean        376.4035     422.3809     297.3875     346.4078     280.5624     257.0244
            SD          21.5371      32.0309      11.1730      26.9515      14.5045      10.3633
CMC         Best        5.8449e+03   5.9895e+03   5.6153e+03   5.6704e+03   5.5615e+03   5.5404e+03
            Worst       6.9260e+03   7.2137e+03   5.9728e+03   6.4143e+03   5.7676e+03   5.5861e+03
            Mean        6.1772e+03   6.4821e+03   5.7297e+03   5.8864e+03   5.6426e+03   5.5562e+03
            SD          186.7542     243.2599     73.0180      140.8947     44.6957      8.7806
Wine        Best        1.6308e+04   1.6399e+04   1.6312e+04   1.6309e+04   1.6305e+04   1.6298e+04
            Worst       1.6798e+04   1.7317e+04   1.6376e+04   1.6482e+04   1.6356e+04   1.6313e+04
            Mean        1.6438e+04   1.6705e+04   1.6336e+04   1.6355e+04   1.6322e+04   1.6304e+04
            SD          95.0097      219.7057     15.9233      35.4218      12.0030      3.3799

Acknowledgments

This research project was partially supported by the National Natural Science Foundation of China (61472231, 61502283, 61640201), the Humanities and Social Science Research Fund of the Ministry of Education of China (12YJA630152), and the Shandong Social Science Foundation of China (16BGLJ06, 11CGLJ22).

References

[1] X. Liu, H. Liu, and H. Duan, "Particle swarm optimization based on dynamic niche technology with applications to conceptual design," Advances in Engineering Software, vol. 38, no. 10, pp. 668–676, 2007.

[2] Z. Bao and T. Watanabe, "Mixed constrained image filter design using particle swarm optimization," Artificial Life and Robotics, vol. 15, no. 3, pp. 363–368, 2010.

[3] S. Sahin and M. A. Cavuslu, "FPGA implementation of wavelet neural network training with PSO/iPSO," Journal of Circuits, Systems and Computers, vol. 27, no. 6, 2018.

[4] D. Wu and H. Gao, "A BP and switching PSO based optimization approach for engine optimization," National Academy of Science Letters, vol. 40, no. 1, pp. 33–37, 2017.

[5] Q. Jia and Y. Guo, "Hybridization of ABC and PSO algorithms for improved solutions of RCPSP," Journal of the Chinese Institute of Engineers, vol. 39, no. 6, pp. 727–734, 2016.

[6] C. Yue, B. Qu, and J. Liang, "A multi-objective particle swarm optimizer using ring topology for solving multimodal multi-objective problems," IEEE Transactions on Evolutionary Computation, 2017.

[7] X. Liu and J. Xue, "A cluster splitting technique by Hopfield networks and P systems on simplices," Neural Processing Letters, pp. 1–24, 2017.

[8] Y. Zhao, X. Liu, and W. Wang, "Spiking neural P systems with neuron division and dissolution," PLoS ONE, vol. 11, no. 9, Article ID e0162882, 2016.

[9] X. Liu, Y. Zhao, and M. Sun, "An improved Apriori algorithm based on an evolution-communication tissue-like P system with promoters and inhibitors," Discrete Dynamics in Nature and Society, vol. 2017, 2017.

[10] N. Netjinda, T. Achalakul, and B. Sirinaovakul, "Particle swarm optimization inspired by starling flock behavior," Applied Soft Computing, vol. 35, pp. 411–422, 2015.

[11] A. El Dor, D. Lemoine, M. Clerc, P. Siarry, L. Deroussi, and M. Gourgand, "Dynamic cluster in particle swarm optimization algorithm," Natural Computing, vol. 14, no. 4, pp. 655–672, 2015.

[12] Y. M. B. Ali, "Unsupervised clustering based an adaptive particle swarm optimization algorithm," Neural Processing Letters, vol. 44, no. 1, pp. 221–244, 2016.

[13] G. Armano and M. R. Farmani, "Multiobjective clustering analysis using particle swarm optimization," Expert Systems with Applications, vol. 55, pp. 184–193, 2016.

[14] B. Niu, Q. Duan, L. Tan, C. Liu, and P. Liang, "A population-based clustering technique using particle swarm optimization and K-means," in Advances in Swarm and Computational Intelligence, vol. 9140 of Lecture Notes in Computer Science, pp. 145–152, Springer International Publishing, Cham, 2015.

[15] X. Zhao, W. Lin, J. Hao, X. Zuo, and J. Yuan, "Clustering and pattern search for enhancing particle swarm optimization with Euclidean spatial neighborhood search," Neurocomputing, vol. 171, pp. 966–981, 2016.

[16] E. Comak, "A modified particle swarm optimization algorithm using Renyi entropy-based clustering," Neural Computing and Applications, vol. 27, no. 5, pp. 1381–1390, 2016.

[17] K. K. Bharti and P. K. Singh, "Opposition chaotic fitness mutation based adaptive inertia weight BPSO for feature selection in text clustering," Applied Soft Computing, vol. 43, pp. 20–34, 2016.

[18] W. Song, W. Ma, and Y. Qiao, "Particle swarm optimization algorithm with environmental factors for clustering analysis," Soft Computing, vol. 21, no. 2, pp. 283–293, 2017.

[19] R. Liu, J. Li, J. Fan, C. Mu, and L. Jiao, "A coevolutionary technique based on multi-swarm particle swarm optimization for dynamic multi-objective optimization," European Journal of Operational Research, vol. 261, no. 3, pp. 1028–1051, 2017.

[20] X. Chu, B. Niu, J. J. Liang, and Q. Lu, "An orthogonal-design hybrid particle swarm optimiser with application to capacitated facility location problem," International Journal of Bio-Inspired Computation, vol. 8, no. 5, pp. 268–285, 2016.

[21] D. Wang, D. Tan, and L. Liu, "Particle swarm optimization: an overview," Soft Computing, vol. 22, no. 2, p. 22, 2018.

[22] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, 2005.

[23] X. Li, "A multimodal particle swarm optimizer based on fitness Euclidean-distance ratio," in Proceedings of the 9th Annual Conference, pp. 78–85, London, England, July 2007.

[24] G. Zhu and S. Kwong, "Gbest-guided artificial bee colony algorithm for numerical function optimization," Applied Mathematics and Computation, vol. 217, no. 7, pp. 3166–3173, 2010.

[25] A. Laudani, F. R. Fulginei, G. M. Lozito, and A. Salvini, "Swarm/flock optimization algorithms as continuous dynamic systems," Applied Mathematics and Computation, vol. 243, pp. 670–683, 2014.

[26] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.

[27] K. K. Bharti and P. K. Singh, "Chaotic gradient artificial bee colony for text clustering," Soft Computing, vol. 20, no. 3, pp. 1113–1126, 2016.

[28] R. M. May, "Simple mathematical models with very complicated dynamics," Nature, vol. 261, no. 5560, pp. 459–467, 1976.

[29] F. S. Hover and M. S. Triantafyllou, "Application of polynomial chaos in stability and control," Automatica, vol. 42, no. 5, pp. 789–795, 2006.

[30] X. Liu, L. Xiang, and X. Wang, "Spatial cluster analysis by the Adleman-Lipton DNA computing model and flexible grids," Discrete Dynamics in Nature and Society, vol. 2012, Article ID 894207, 32 pages, 2012.

[31] H. Peng, X. Luo, Z. Gao, J. Wang, and Z. Pei, "A novel clustering algorithm inspired by membrane computing," The Scientific World Journal, vol. 3, no. 22, Article ID 929471, 8 pages, 2015.

[32] B. Ufnalski and L. M. Grzesiak, "Plug-in direct particle swarm repetitive controller with a reduced dimensionality of a fitness landscape: a multi-swarm approach," Bulletin of the Polish Academy of Sciences: Technical Sciences, vol. 63, no. 4, pp. 857–866, 2015.

[33] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.

[34] "Datasets," http://www.isical.ac.in/~sanghami/data.html.

[35] H. C. Blake, UCI Repository of Machine Learning Databases, 1998.




[20] X Chu B Niu J J Liang and Q Lu ldquoAn orthogonal-designhybrid particle swarm optimiser with application to capacitatedfacility location problemrdquo International Journal of Bio-InspiredComputation vol 8 no 5 pp 268ndash285 2016

[21] DWang D Tan and L Liu ldquoParticle Swarm Optimization Anoverviewrdquo Soft Computing vol 22 no 2 p 22 2018

[22] D Karaboga ldquoAn Idea Based on Honey Bee Swarm for Numer-ical Optimizationrdquo Technical Report TR0 2005

[23] X Li ldquoA multimodal particle swarm optimizer based on fitnessEuclidean-distance ratiordquo in Proceedings of the the 9th annualconference pp 78ndash85 London England July 2007

[24] G Zhu and S Kwong ldquoGbest-guided artificial bee colonyalgorithm for numerical function optimizationrdquo Applied Math-ematics and Computation vol 217 no 7 pp 3166ndash3173 2010

[25] A Laudani F R Fulginei G M Lozito and A SalvinildquoSwarmflock optimization algorithms as continuous dynamicsystemsrdquo Applied Mathematics and Computation vol 243 pp670ndash683 2014

[26] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 Perth Australia December 1995

[27] K K Bharti and P K Singh ldquoChaotic gradient artificial beecolony for text clusteringrdquo Soft Computing vol 20 no 3 pp1113ndash1126 2016

[28] RMMay ldquoSimplemathematicalmodelswith very complicateddynamicsrdquo Nature vol 261 no 5560 pp 459ndash467 1976

[29] F S Hover and M S Triantafyllou ldquoApplication of polynomialchaos in stability and controlrdquo Automatica vol 42 no 5 pp789ndash795 2006

[30] X Liu L Xiang and X Wang ldquoSpatial cluster analysis bythe adleman-lipton DNA computing model and flexible gridsrdquoDiscrete Dynamics in Nature and Society vol 2012 Article ID894207 32 pages 2012

[31] H Peng X Luo ZGao JWang andZ Pei ldquoANovel ClusteringAlgorithm Inspired by Membrane Computingrdquo The ScientificWorld Journal vol 3 no 22 Article ID 929471 8 pages 2015

[32] B Ufnalski and L M Grzesiak ldquoPlug-in direct particle swarmrepetitive controller with a reduced dimensionality of a fitnesslandscape - a multi-swarm approachrdquo Bulletin of the PolishAcademy of SciencesmdashTechnical Sciences vol 63 no 4 pp 857ndash866 2015

[33] A O Griewank ldquoGeneralized descent for global optimizationrdquoJournal of Optimization Theory and Applications vol 34 no 1pp 11ndash39 1981

[34] ldquoDatasetsrdquo httpwwwisicalacinsimsanghamidatahtml[35] H C Blake UCI Repository of Machine Learning Databases

1998

Hindawiwwwhindawicom Volume 2018

MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Mathematical Problems in Engineering

Applied MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Probability and StatisticsHindawiwwwhindawicom Volume 2018

Journal of

Hindawiwwwhindawicom Volume 2018

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawiwwwhindawicom Volume 2018

OptimizationJournal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Engineering Mathematics

International Journal of

Hindawiwwwhindawicom Volume 2018

Operations ResearchAdvances in

Journal of

Hindawiwwwhindawicom Volume 2018

Function SpacesAbstract and Applied AnalysisHindawiwwwhindawicom Volume 2018

International Journal of Mathematics and Mathematical Sciences

Hindawiwwwhindawicom Volume 2018

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom

The Scientific World Journal

Volume 2018

Hindawiwwwhindawicom Volume 2018Volume 2018

Numerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisAdvances inAdvances in Discrete Dynamics in

Nature and SocietyHindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom

Dierential EquationsInternational Journal of

Volume 2018

Hindawiwwwhindawicom Volume 2018

Decision SciencesAdvances in

Hindawiwwwhindawicom Volume 2018

AnalysisInternational Journal of

Hindawiwwwhindawicom Volume 2018

Stochastic AnalysisInternational Journal of

Submit your manuscripts atwwwhindawicom

Page 6: A New Chaotic Starling Particle Swarm Optimization ...downloads.hindawi.com/journals/mpe/2018/8250480.pdfMathematicalProblemsinEngineering Chaotic Parameter u 2.5 3 3.5 4 Chaotic Value

6 Mathematical Problems in Engineering

Input: M, t_max, Ne_min, Ne_max, w_min, w_max
(Particle swarm population initialization)
for i = 1 to M (population size)
    Particle(i).BestPosition = Particle(i).Position
    Particle(i).BestFitness = Particle(i).Fitness
    if Particle(i).BestFitness < Best_Particle.Fitness
        Best_Particle.Position = Particle(i).BestPosition
        Best_Particle.Fitness = Particle(i).BestFitness
    end if
end for
for t = 1 to t_max (max iteration)

Algorithm 1: The standard PSO algorithm.

(Particle selection)
Se = Euclidean_Distance_Selection(Best_Particle.Position)
Sf = Fitness_Selection(Particle(i).Fitness)
S = Se + Sf

Algorithm 2: Starling birds' collective responses.

for i = 1 to S
    Find the Ne neighbors in the neighborhood H of particle i using FER
    New position of particle i: x_i' = x_i + r3 * ((1/Ne) * Σ_{i1∈H} x_{i1})
    New velocity of particle i: v_i' = v_i + r4 * ((1/Ne) * Σ_{i1∈H} v_{i1})
    if Particle(i').Fitness < Particle(i).Fitness
        Particle(i).Fitness = Particle(i').Fitness
        Particle(i).Position = Particle(i').Position
        Particle(i).Velocity = Particle(i').Velocity
        if Particle(i').Fitness < Best_Particle.Fitness
            Best_Particle.Position = Particle(i').Position
            Best_Particle.Fitness = Particle(i').Fitness
        end if
    end if
end for
Ne = Ne_min + INT(t * (Ne_max - Ne_min) / t_max)
end for
Output: Best_Particle.Fitness

Algorithm 3: A chaotic starling PSO algorithm.
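For concreteness, a minimal Python sketch of the starling-birds collective response step is given below. It assumes the swarm is stored as NumPy arrays and that a scalar fitness function `fitness_fn` is available; these names, and the use of plain Euclidean distance in place of the FER criterion for neighbor selection, are simplifications of this sketch rather than the paper's exact (MATLAB) implementation.

```python
import numpy as np

def starling_local_search(pos, vel, fitness_fn, selected, Ne, rng):
    """One pass of the starling-birds collective response over the selected
    particles: each selected particle moves toward the mean position and
    velocity of its Ne nearest neighbors, keeping the move only if it
    improves the fitness (greedy acceptance)."""
    for i in selected:
        # Ne nearest neighbors of particle i (plain Euclidean distance here;
        # the paper selects neighbors with the FER criterion).
        d = np.linalg.norm(pos - pos[i], axis=1)
        H = np.argsort(d)[1:Ne + 1]  # skip index 0, which is particle i itself
        r3, r4 = rng.random(), rng.random()
        cand_pos = pos[i] + r3 * pos[H].mean(axis=0)  # new collective position
        cand_vel = vel[i] + r4 * vel[H].mean(axis=0)  # new collective velocity
        if fitness_fn(cand_pos) < fitness_fn(pos[i]):
            pos[i], vel[i] = cand_pos, cand_vel
```

A caller would invoke this once per iteration with `rng = np.random.default_rng()` and the index set produced by the two particle selection methods.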

randomly between $x_j^l$ and $x_j^u$, where $x_j^l$ and $x_j^u$ are the minimum and maximum values of dimension $j$ of the search space and are also the limits of the positions of the particles. Therefore, $x_{ij} \in [x_j^l, x_j^u]$ for $i = 1, 2, \ldots, M$ and $j = 1, 2, \ldots, D$. The values of $x_j^l$ and $x_j^u$ are determined as in (14):

$$x_j^l = \min\{X_{ij} \mid i = 1, 2, \ldots, N\}, \quad x_j^u = \max\{X_{ij} \mid i = 1, 2, \ldots, N\}, \quad j = 1, 2, \ldots, d, \tag{14}$$

where $X_{ij}$ is dimension $j$ of data point $X_i$. In particular, the dimension $D$ of a particle is $D = K \times d$.

4.3. Boundary Handling. If a particle flies outside or across the boundary of the search space, this particle no longer represents a solution, and its position and velocity are adjusted. If the position of a particle in dimension $j$ is below the lower (above the upper) limit of that dimension, it is set to the lower (upper) limit of that dimension. Accordingly, the velocity of this particle along dimension $j$ is reversed. These adjustments limit the scope of the positions and change the velocities as well as the directions of the flying paths of the particles. The position and velocity of such a particle are adjusted as follows:

$$x_{ij} = x_j^l \ \text{if } x_{ij} < x_j^l, \qquad x_{ij} = x_j^u \ \text{if } x_{ij} > x_j^u, \qquad v_{ij} = -v_{ij}, \tag{15}$$

where $x_{ij}$ is the new value of dimension $j$ of the position and $v_{ij}$ is the new value of dimension $j$ of the velocity of particle $i$ after the adjustments.
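As a brief sketch, the adjustment in (15) can be vectorized as follows; `xl` and `xu` are the per-dimension limit vectors from (14), and the function names are illustrative.

```python
import numpy as np

def enforce_bounds(pos, vel, xl, xu):
    """Clamp out-of-range position components to the nearest limit and
    reverse the corresponding velocity components, as in (15)."""
    below, above = pos < xl, pos > xu
    out = below | above
    pos = np.where(below, xl, np.where(above, xu, pos))
    vel = np.where(out, -vel, vel)
    return pos, vel
```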

4.4. The Pseudocode of the Proposed CSPSO. The pseudocode of the proposed CSPSO is presented in Algorithm 3.

Table 1: Benchmark functions.

Benchmark function | Function expression | Domain | $X_{\min}$ | $F_{\min}$
Sphere | $f(x) = \sum_{i=1}^{D} x_i^2$ | $[-100, 100]^D$ | $0^D$ | 0
Rastrigin | $f(x) = \sum_{i=1}^{D} [x_i^2 - 10\cos(2\pi x_i) + 10]$ | $[-5.12, 5.12]^D$ | $0^D$ | 0
Griewank | $f(x) = \sum_{i=1}^{D} (x_i^2/4000) - \prod_{i=1}^{D} \cos(x_i/\sqrt{i}) + 1$ | $[-600, 600]^D$ | $0^D$ | 0
Rosenbrock | $f(x) = \sum_{i=1}^{D-1} [(x_{i+1} - x_i^2)^2 + (1 - x_i)^2]$ | $[-2.048, 2.048]^D$ | $1^D$ | 0
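The four functions in Table 1 translate directly into code; a NumPy version is sketched below. Note that this table's Rosenbrock omits the scale factor 100 that appears in some other formulations, and the sketch follows the table.

```python
import numpy as np

def sphere(x):
    return np.sum(x ** 2)

def rastrigin(x):
    return np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x) + 10)

def griewank(x):
    i = np.arange(1, x.size + 1)  # 1-based index used under the square root
    return np.sum(x ** 2 / 4000) - np.prod(np.cos(x / np.sqrt(i))) + 1

def rosenbrock(x):
    return np.sum((x[1:] - x[:-1] ** 2) ** 2 + (1 - x[:-1]) ** 2)
```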

Figure 3: The Sphere and Rastrigin functions for $D = 2$.

5. Simulation Experiments

Three experiments are conducted to evaluate the performance of CSPSO. The first experiment validates the optimal parameter values in CSPSO using the Sphere and Rastrigin [33] benchmark functions. The second experiment validates the performance of CSPSO using four classical numerical benchmark functions [33]. These functions are all to be minimized, and their domains are presented in Table 1. The third experiment checks the effectiveness of CSPSO in data clustering using some datasets from the Artificial Datasets [34] and the UCI Machine Learning Repository [35]. CSPSO is implemented in MATLAB, and all the experiments are conducted on a LENOVO desktop computer with a 3.20 GHz Intel i5-4460 processor and 8 GB of RAM in a Windows 8 environment.

5.1. Experiment on Parameter Settings. As described above, the values of the parameters have important influences on the performance of CSPSO. This section focuses on checking the influences of the three critical parameters in CSPSO, namely the inertia weight $w$, the size of the neighborhood $Ne$, and the number of selected particles $S$, so as to find appropriate values for them. The Sphere and Rastrigin functions are used to study the influences of different values of these parameters in CSPSO. The shapes and ranges of the Sphere and Rastrigin functions for $D = 2$ are depicted in Figure 3. The dimension of the benchmark functions is $D = 10$ in this experiment. CSPSO ran 30 times for each of the two benchmark functions.

For the validity of the experiments and fair comparisons, parameters not being tested are kept at the same values. These parameters are the population size $M = 50$, the maximum number of iterations $t_{\max} = 100$, and the cognitive coefficients $c_1 = c_2 = 2$.

The influence of the inertia weight $w$ is tested first. Its value is adjusted according to (8). The values of $w_{\min}$ and $w_{\max}$ directly determine the decline speed of $w$ and consequently affect the convergence speed of CSPSO. Their values are varied in the experiments. When the value of $w$ varies, the minimum and maximum sizes of the neighborhood are kept at $Ne_{\min} = 2$ and $Ne_{\max} = 5$, and the number of selected particles is kept at $S = 10$. The best and mean values of the Sphere and Rastrigin functions are reported in Table 2.
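Equation (8) is defined earlier in the paper; purely as an illustration of a nonlinear decreasing schedule between $w_{\max}$ and $w_{\min}$, one possible form is sketched below. The exact curve used by CSPSO may differ.

```python
def inertia_weight(t, t_max, w_min=0.4, w_max=1.2):
    # Decreases nonlinearly from w_max at t = 0 to w_min at t = t_max;
    # illustrative only, not necessarily the paper's equation (8).
    return w_min + (w_max - w_min) * (1.0 - (t / t_max) ** 2)
```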

The differences in these results are obvious. The optimal values of the Sphere and Rastrigin functions were obtained at different values of $w$. However, the mean values differ for the Rastrigin function because it is multimodal with many local optima. Generally, it is difficult to find global optimal solutions for multimodal functions with traditional optimization algorithms. Based on these results, the optimal

Table 2: Results when varying the inertia weight $w$.

Inertia weight ($w_{\min}$–$w_{\max}$) | Sphere | Rastrigin | Mean time taken by Sphere | Mean time taken by Rastrigin
0.2–0.6 | 0 (0.0022) | 0 (0.9684) | 3.3773 | 3.1321
0.2–0.9 | 0 (0) | 0 (1.6269) | 3.1235 | 3.1358
0.4–0.9 | 0 (0) | 0 (1.7282) | 3.1258 | 3.1309
0.4–1.2 | 0 (0) | 0 (0.7567) | 3.1229 | 3.1359

Table 3: Results when varying the size of the neighborhood $Ne$.

Neighbor size ($Ne_{\min}$–$Ne_{\max}$) | Sphere | Rastrigin | Mean time taken by Sphere | Mean time taken by Rastrigin
2–5 | 0 (0.1382) | 0 (1.0071) | 3.1503 | 3.1420
2–7 | 0 (0) | 0 (0.7359) | 3.1389 | 3.1416
2–9 | 0 (0) | 0 (1.3716) | 3.1403 | 3.1446

Table 4: Results when varying the number of selected particles $S$.

$S_e$, $S_f$ | $S$ | Sphere | Rastrigin | Mean time taken by Sphere | Mean time taken by Rastrigin
5 | 10 | 0 (0) | 0 (0.6586) | 3.1808 | 3.1966
7 | 14 | 0 (0.7567) | 0 (1.5407) | 4.2845 | 4.2645
9 | 18 | 0 (2.5609) | 0 (2.4015) | 5.3850 | 5.3840

values of the lower and upper limits of the inertia weight are set to $w_{\min} = 0.4$ and $w_{\max} = 1.2$, where the minimum mean values are achieved.

The influence of the size of the neighborhood $Ne$ is examined next. In CSPSO, the size of the neighborhood $Ne$ increases gradually to enhance the local search ability. Large values lead to slow convergence, and small values restrain the local search ability; an appropriate value balances exploration and exploitation. The lower and upper limits on the size of the neighborhood are set to different values, and the best and mean values of the Sphere and Rastrigin functions are reported in Table 3.

Table 3 shows that the best results are obtained when the size of the neighborhood lies between the lower and upper limits $Ne_{\min} = 2$ and $Ne_{\max} = 7$. The influence of the number of selected particles $S$ is tested next. It affects both convergence and search ability. Different values for $S_e$ and $S_f$ are used, and the results are reported in Table 4.

The results in Table 4 show that the best number of selected particles is 10, where the mean values of the test functions and the computation time are both low. It is easy to see that more particles do not always lead to better results. Because of the randomness built into the algorithm, sometimes most of the particles may simply gather around local optima.

Based on these results, the lower and upper limits on the inertia weight, $w_{\min}$ and $w_{\max}$, are set to 0.4 and 1.2, respectively; the lower and upper limits on the size of the neighborhood, $Ne_{\min}$ and $Ne_{\max}$, are set to 2 and 7, respectively; and the number of selected particles $S$ is set to 10. These parameter values in CSPSO are kept the same and are used in the following experiments.

5.2. Benchmark Functions. In this section, the four classical numerical benchmark functions presented in Table 1 are used to validate the effectiveness of CSPSO. The dimension of the benchmark functions, which determines the dimension of the search space, is set to $D = 10$, 20, and 30. The performance of CSPSO is compared with those of PSO, fitness-Euclidean distance ratio particle swarm optimization (FER-PSO) [23], and starling particle swarm optimization (SPSO) [25]. PSO is the basic swarm intelligence algorithm developed by Kennedy and Eberhart [26]. FER-PSO replaces the individual best of a particle with the individual best of a neighbor selected based on the Euclidean distance and the fitness function value. SPSO uses the collective responses of the particles to adjust their positions and velocities when the algorithm stagnates.

The values of the adjustable parameters in these competing algorithms are the best ones reported in the respective references, as listed in Table 5. These competing algorithms were also run 50 independent times each so as to obtain meaningful results. The best fitness function values obtained by these algorithms for these functions are reported in Table 6.

Figure 4 shows the convergence of the algorithms on some of the functions. Compared with PSO and FER-PSO in Figures 4(b) and 4(c), the curves of the function values obtained by CSPSO decline slowly at the beginning of the evolutionary process because of the time taken by local search; hence, it has a slow convergence speed. Relying on DDT and chaotic mapping, CSPSO finds lower fitness values than the other algorithms after about half of the evolutionary process, rather than falling into local optima. The local search strategy in the neighborhood helps in finding better solutions near the current position of a particle. In particular, the curve of the Sphere function obtained by CSPSO disappears after the best value 0 has been found, because this value cannot be marked on the logarithmic chart. As shown in Figure 4(d), on the Rosenbrock function, a multimodal function, traditional algorithms like PSO,

Table 5: Parameter settings in the experiments.

Parameter | PSO | ABC [22] | FER-PSO [23] | GABC [24] | SPSO [25] | CSPSO
Population ($M$) | 50 | 50 | 50 | 50 | 50 | 50
$t_{\max}$ | 200 | 200 | 200 | 200 | 200 | 200
$c_1$, $c_2$ | 2, 2 | − | 2, 2 | − | 2, 2 | 2, 2
$r_1$, $r_2$ | (0, 1) | (−1, 1) | (0.1, 0.4) | (−1, 1) | (0, 1) | (0, 1)
($w_{\min}$, $w_{\max}$) | 1 | − | 1 | 1 | (0.2, 0.4) | (0.4, 1.2)
Neighbors ($Ne$) | − | − | 7 | − | 7 | 2–7
Selected ($S$) | − | − | − | − | − | 10
Subpopulations | − | − | − | − | 14 | −
Stagnant limit | − | − | − | − | 2 | −
Trial limit | − | 10 | − | 10 | − | −

Table 6: Best fitness function values obtained by the different algorithms for the test functions.

Function | $D$ | PSO | FER-PSO | SPSO | CSPSO
Sphere | 10 | 0 | 0 | 0 | 0
Sphere | 20 | 0.5285 | 0.1613 | 8.2232e-12 | 0
Sphere | 30 | 2.4235 | 0.8891 | 5.1645e-12 | 0
Rastrigin | 10 | 0.0727 | 0.03647 | 0 | 0
Rastrigin | 20 | 0.3595 | 0.1895 | 0 | 0
Rastrigin | 30 | 1.0374 | 0.4847 | 0 | 0
Griewank | 10 | 0.5965 | 0.1135 | 0 | 0
Griewank | 20 | 1.4575 | 1.0994 | 0 | 0
Griewank | 30 | 3.2947 | 1.6874 | 0 | 0
Rosenbrock | 10 | 6.3220 | 6.8162 | 5.2486 | 0.7037
Rosenbrock | 20 | 19.8643 | 19.2219 | 18.7899 | 9.8431
Rosenbrock | 30 | 38.1130 | 33.7894 | 28.7863 | 18.6928

FER-PSO, and SPSO are easily trapped in local optima, but CSPSO finds the smallest function values among these algorithms. These results show that CSPSO has strong global search ability through the use of DDT and chaotic mapping, and it can quickly escape from local optima.

As the results for the four classical functions in Table 6 show, CSPSO and SPSO perform better than PSO and FER-PSO on the Rastrigin and Griewank functions, and CSPSO performs better than the other three algorithms on the Sphere and Rosenbrock functions. In general, the results show that CSPSO has more powerful global search ability than traditional algorithms like PSO, FER-PSO, and SPSO on both unimodal and multimodal functions. Furthermore, as indicated by the results in Table 6, the better performance of CSPSO provides evidence that the improvements in CSPSO are effective in enhancing the performance of PSO and that CSPSO found the global optimum more often than the other algorithms in these experiments.

5.3. Clustering Problems. In this section, experiments on clustering problems with different datasets are performed, and the results of the different algorithms, including CSPSO, are reported and discussed. Eight benchmark datasets are used. They are Data_3_2, Data_5_2, Data_10_2, and Data_4_3 from the work reported in [34] and Iris, Glass, Contraceptive Method Choice (CMC), and Wine from the UCI Machine Learning Repository [35]. More details about these datasets are given in Table 7. Data_5_2 and Data_4_3 are also plotted in Figure 5.

The optimized values of the parameters in the other algorithms, shown in Table 5, are adopted in the experiments. In order to perform fair comparisons, all the algorithms were run on all the clustering datasets 50 times to eliminate the effects of random factors. As simple statistics, best values (Best), worst values (Worst), mean values (Mean), and standard deviations (SD) are used as the evaluation criteria to measure the effectiveness of these clustering algorithms. The experimental environment is the same for all the clustering algorithms.
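In these clustering experiments a particle encodes $K$ cluster centers ($D = K \times d$, as noted earlier). A minimal sketch of a clustering fitness under that encoding is given below; scoring a particle by the total distance from each point to its nearest center is a common choice, though the paper's exact objective is defined earlier, so this form is an assumption.

```python
import numpy as np

def clustering_fitness(particle, data, K):
    """Decode a (K*d)-dimensional particle into K cluster centers and score
    the clustering as the sum of distances from each data point to its
    nearest center (assumed objective; lower is better)."""
    centers = particle.reshape(K, -1)  # (K, d) array of cluster centers
    # Pairwise point-to-center Euclidean distances, shape (N, K).
    dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    return dists.min(axis=1).sum()
```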

Figure 6 shows the convergence of these algorithms on four datasets for typical runs. The fitness values obtained by CSPSO decline slowly at the beginning of the evolutionary process and continue to decline, rather than dropping into a local optimum in the middle of the evolutionary process, possibly because DDT gives the particles high velocities. Local search based on the Euclidean-distance and fitness neighborhood structures also helps the particles find better positions. This phenomenon is more evident on high-dimensional datasets such as the Glass dataset.

To be clearer, Figure 7 gives more details about the convergence of these algorithms near the end of the search process on three of the four datasets.

Table 7: Description of the datasets used in the experiments.

Dataset | Clusters ($K$) | Features ($d$) | Total instances ($N$) | Instances in each cluster
Data_3_2 | 3 | 2 | 76 | (25, 25, 26)
Data_5_2 | 5 | 2 | 250 | (50, 50, 50, 50, 50)
Data_10_2 | 10 | 2 | 500 | (50 per cluster)
Data_4_3 | 4 | 3 | 400 | (100, 100, 100, 100)
Iris | 3 | 4 | 150 | (50, 50, 50)
Glass | 6 | 9 | 214 | (70, 17, 76, 13, 9, 29)
CMC | 3 | 9 | 1473 | (629, 334, 510)
Wine | 3 | 13 | 178 | (59, 71, 48)

Figure 4: Convergence of the algorithms on some functions ($D = 10$): (a) Sphere function, (b) Rastrigin function, (c) Griewank function, (d) Rosenbrock function.

CSPSO has the best performance among all these clustering algorithms. ABC is the first to converge to a local optimal point, possibly because of the influence of randomly selected neighbors. GABC performs poorly on large data volumes, such as the Data_5_2 and Glass datasets, possibly due to the local search behavior of the onlooker bees. FER-PSO has apparently better convergence performance than the other algorithms, possibly due to the strategy that only one of the neighbors is selected to guide the search of a particle, but it is easily trapped in local optimal points in the later generations

Figure 5: Distributions of data points in the Data_5_2 and Data_4_3 datasets.

of the search process. SPSO shows better performance in searching for a global optimum, but its local search strategy does not work well on high-dimensional datasets. The above results and discussions reveal the different performances of PSO, ABC, FER-PSO, GABC, SPSO, and CSPSO and verify the good performance of CSPSO as a clustering algorithm.

It can be seen from Table 8 that CSPSO performs better on these classical clustering problems. For the Data_3_2, Data_5_2, and Data_10_2 datasets, CSPSO obtained lower mean fitness values than the others, but some other algorithms perform better than CSPSO on the best fitness values. This is possibly due to the randomness of swarm intelligence algorithms. For the Data_4_3 dataset, because of its large number of data points, the local search ability of CSPSO is affected by the selected neighbors. ABC has better stability on this dataset but a poorer ability in searching for a global optimum than the others due to its random strategy. CSPSO outperforms the other algorithms and obtains lower mean fitness values than the others on the Iris, Glass, CMC, and Wine datasets; therefore, it performs better on high-dimensional data. The proposed algorithm, with embedded DDT and chaotic mapping, has better global search ability and is more stable than traditional PSO through the use of a local search method based on starling birds. Furthermore, as the sizes of the datasets increase in both the dimension and the number of data points, CSPSO performs better than PSO, ABC, GABC, FER-PSO, and SPSO. It is more capable of dealing with big data in the current information age.

6. Conclusions

Because the performance of standard PSO depends on its parameter values, parameter adjustment is one of the most useful ways for PSO to obtain good exploration and exploitation abilities. The CSPSO proposed in this study enhances the overall convergence of the search process. Nonlinear adjustment of the inertia weight and the chaotic search method based on logistic mapping enhance the global search performance of traditional PSO and help particles escape from local optima. The DDT added to the velocity update increases the velocities of the particles in later iterations and also helps particles escape from local optima. The local search strategy based on the behavior of starling birds utilizes the information of the nearest neighbors in the neighborhood and determines a new collective position and a new collective velocity. The two particle selection methods, based on Euclidean distance and on fitness function value, maintain the population diversity of the particles. These improvements help particles avoid stagnation and find better solutions in the search space.
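As a reminder of the chaotic component, the logistic map of May [28] is the one-line recurrence sketched below; how its output is scaled into the acceleration coefficients is defined earlier in the paper, so the scaling shown here is illustrative only.

```python
def logistic_map(z, mu=4.0):
    # z_{t+1} = mu * z_t * (1 - z_t); fully chaotic on (0, 1) for mu = 4.
    return mu * z * (1.0 - z)

# Illustrative use: drive the acceleration coefficients with the chaotic
# sequence (the actual mapping in CSPSO is defined earlier in the paper).
z = 0.37  # any seed in (0, 1) other than the fixed points
for _ in range(5):
    z = logistic_map(z)
    c1 = c2 = 1.5 + z  # hypothetical scaling into a typical coefficient range
```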

The results of the simulation experiments on both benchmark function optimization and classical clustering problems show the effectiveness of CSPSO. Compared with traditional PSO and other evolutionary algorithms, CSPSO has better convergence properties and stronger global search ability. However, as with other swarm intelligence algorithms, CSPSO may be trapped in, and may stagnate around, local optimal points and therefore may be unstable when applied to problems with multiple local optima. In future work, CSPSO will be applied to other combinatorial optimization problems, such as scheduling problems. Other metaheuristic methods and optimization techniques may also be used to improve the performance of PSO.

Data Availability

All the datasets used in the simulation experiments came from the Artificial Datasets, available at http://www.isical.ac.in/~sanghami/data.html (accessed June 2017), and from the UCI Machine Learning Repository, available at http://archive.ics.uci.edu/ml/datasets.html (accessed June 2017).

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Figure 6: Convergence of the algorithms on some datasets: (a) Data_3_2, (b) Data_5_2, (c) Iris, (d) Glass.

Figure 7: Convergence of the algorithms on the Data_3_2, Data_5_2, and Iris datasets near the end of the search process.

Table 8: Comparison of the performance of CSPSO with the other clustering algorithms.

Dataset | Criterion | PSO | ABC | FER-PSO | GABC | SPSO | CSPSO
Data_3_2 | Best | 47.5287 | 47.7630 | 47.5290 | 47.5282 | 47.5733 | 47.5282
Data_3_2 | Worst | 59.1935 | 60.1039 | 49.7856 | 59.1895 | 53.1721 | 47.5382
Data_3_2 | Mean | 48.4644 | 50.9702 | 47.8771 | 48.2279 | 47.8934 | 47.5300
Data_3_2 | SD | 3.1956 | 3.1345 | 0.4967 | 2.7975 | 0.8551 | 0.0022
Data_5_2 | Best | 326.4386 | 337.5538 | 326.9987 | 329.7155 | 326.4666 | 326.4432
Data_5_2 | Worst | 392.2899 | 411.0786 | 332.6651 | 364.8400 | 327.0877 | 326.9676
Data_5_2 | Mean | 329.6249 | 373.6977 | 328.7897 | 343.5928 | 326.7493 | 326.5300
Data_5_2 | SD | 12.9978 | 20.1046 | 1.1594 | 9.6214 | 0.1536 | 0.0942
Data_10_2 | Best | 1.0793e+03 | 916.4320 | 875.6264 | 1.3134e+03 | 843.2274 | 849.8712
Data_10_2 | Worst | 1.4459e+03 | 1.2718e+03 | 1.2068e+03 | 1.1657e+03 | 1.1827e+03 | 1.0141e+03
Data_10_2 | Mean | 1.2875e+03 | 1.0608e+03 | 1.0066e+03 | 638.046 | 941.8921 | 918.2542
Data_10_2 | SD | 70.4484 | 76.1788 | 67.1964 | 1.3134e+03 | 85.5520 | 50.5344
Data_4_3 | Best | 751.8897 | 1.0937e+03 | 830.5456 | 750.4056 | 749.9506 | 749.6738
Data_4_3 | Worst | 1.2866e+03 | 1.6347e+03 | 1.3753e+03 | 1.8485e+03 | 1.2726e+03 | 1.2725e+03
Data_4_3 | Mean | 838.8670 | 1.3838e+03 | 1.0312e+03 | 1.1489e+03 | 793.8641 | 784.5028
Data_4_3 | SD | 181.8515 | 122.3579 | 150.0554 | 271.3938 | 142.5136 | 125.1508
Iris | Best | 100.5216 | 113.1201 | 98.0975 | 96.6829 | 96.6605 | 96.6605
Iris | Worst | 130.6931 | 156.8960 | 102.9780 | 127.6869 | 98.6906 | 98.0888
Iris | Mean | 113.7655 | 133.1161 | 100.3761 | 105.0487 | 96.9465 | 96.9194
Iris | SD | 7.1771 | 11.0706 | 1.2211 | 11.2307 | 0.3921 | 0.2931
Glass | Best | 329.1487 | 333.7214 | 278.6565 | 288.5152 | 235.3554 | 229.6544
Glass | Worst | 436.5615 | 503.1607 | 326.5351 | 402.5881 | 309.0997 | 278.9187
Glass | Mean | 376.4035 | 422.3809 | 297.3875 | 346.4078 | 280.5624 | 257.0244
Glass | SD | 21.5371 | 32.0309 | 11.1730 | 26.9515 | 14.5045 | 10.3633
CMC | Best | 5.8449e+03 | 5.9895e+03 | 5.6153e+03 | 5.6704e+03 | 5.5615e+03 | 5.5404e+03
CMC | Worst | 6.9260e+03 | 7.2137e+03 | 5.9728e+03 | 6.4143e+03 | 5.7676e+03 | 5.5861e+03
CMC | Mean | 6.1772e+03 | 6.4821e+03 | 5.7297e+03 | 5.8864e+03 | 5.6426e+03 | 5.5562e+03
CMC | SD | 186.7542 | 243.2599 | 73.0180 | 140.8947 | 44.6957 | 8.7806
Wine | Best | 1.6308e+04 | 1.6399e+04 | 1.6312e+04 | 1.6309e+04 | 1.6305e+04 | 1.6298e+04
Wine | Worst | 1.6798e+04 | 1.7317e+04 | 1.6376e+04 | 1.6482e+04 | 1.6356e+04 | 1.6313e+04
Wine | Mean | 1.6438e+04 | 1.6705e+04 | 1.6336e+04 | 1.6355e+04 | 1.6322e+04 | 1.6304e+04
Wine | SD | 95.0097 | 219.7057 | 15.9233 | 35.4218 | 12.0030 | 3.3799

Acknowledgments

This research project was partially supported by the National Natural Science Foundation of China (61472231, 61502283, 61640201), the Humanities and Social Science Research Fund of the Ministry of Education of China (12YJA630152), and the Shandong Social Science Foundation of China (16BGLJ06, 11CGLJ22).

References

[1] X. Liu, H. Liu, and H. Duan, "Particle swarm optimization based on dynamic niche technology with applications to conceptual design," Advances in Engineering Software, vol. 38, no. 10, pp. 668–676, 2007.

[2] Z. Bao and T. Watanabe, "Mixed constrained image filter design using particle swarm optimization," Artificial Life and Robotics, vol. 15, no. 3, pp. 363–368, 2010.

[3] S. Sahin and M. A. Cavuslu, "FPGA implementation of wavelet neural network training with PSO/iPSO," Journal of Circuits, Systems and Computers, vol. 27, no. 6, 2018.

[4] D. Wu and H. Gao, "A BP and switching PSO based optimization approach for engine optimization," National Academy of Science Letters, vol. 40, no. 1, pp. 33–37, 2017.

[5] Q. Jia and Y. Guo, "Hybridization of ABC and PSO algorithms for improved solutions of RCPSP," Journal of the Chinese Institute of Engineers, vol. 39, no. 6, pp. 727–734, 2016.

[6] C. Yue, B. Qu, and J. Liang, "A multi-objective particle swarm optimizer using ring topology for solving multimodal multi-objective problems," IEEE Transactions on Evolutionary Computation, 2017.

[7] X. Liu and J. Xue, "A cluster splitting technique by Hopfield networks and P systems on simplices," Neural Processing Letters, pp. 1–24, 2017.

[8] Y. Zhao, X. Liu, and W. Wang, "Spiking neural P systems with neuron division and dissolution," PLoS ONE, vol. 11, no. 9, Article ID e0162882, 2016.

[9] X. Liu, Y. Zhao, and M. Sun, "An improved Apriori algorithm based on an evolution-communication tissue-like P system with promoters and inhibitors," Discrete Dynamics in Nature and Society, vol. 2017, 2017.

[10] N. Netjinda, T. Achalakul, and B. Sirinaovakul, "Particle swarm optimization inspired by starling flock behavior," Applied Soft Computing, vol. 35, pp. 411–422, 2015.

[11] A. El Dor, D. Lemoine, M. Clerc, P. Siarry, L. Deroussi, and M. Gourgand, "Dynamic cluster in particle swarm optimization algorithm," Natural Computing, vol. 14, no. 4, pp. 655–672, 2015.

[12] Y. M. B. Ali, "Unsupervised clustering based an adaptive particle swarm optimization algorithm," Neural Processing Letters, vol. 44, no. 1, pp. 221–244, 2016.

[13] G. Armano and M. R. Farmani, "Multiobjective clustering analysis using particle swarm optimization," Expert Systems with Applications, vol. 55, pp. 184–193, 2016.

[14] B. Niu, Q. Duan, L. Tan, C. Liu, and P. Liang, "A population-based clustering technique using particle swarm optimization and K-means," in Advances in Swarm and Computational Intelligence, vol. 9140 of Lecture Notes in Computer Science, pp. 145–152, Springer International Publishing, Cham, 2015.

[15] X. Zhao, W. Lin, J. Hao, X. Zuo, and J. Yuan, "Clustering and pattern search for enhancing particle swarm optimization with Euclidean spatial neighborhood search," Neurocomputing, vol. 171, pp. 966–981, 2016.

[16] E. Comak, "A modified particle swarm optimization algorithm using Renyi entropy-based clustering," Neural Computing and Applications, vol. 27, no. 5, pp. 1381–1390, 2016.

[17] K. K. Bharti and P. K. Singh, "Opposition chaotic fitness mutation based adaptive inertia weight BPSO for feature selection in text clustering," Applied Soft Computing, vol. 43, pp. 20–34, 2016.

[18] W. Song, W. Ma, and Y. Qiao, "Particle swarm optimization algorithm with environmental factors for clustering analysis," Soft Computing, vol. 21, no. 2, pp. 283–293, 2017.

[19] R. Liu, J. Li, J. Fan, C. Mu, and L. Jiao, "A coevolutionary technique based on multi-swarm particle swarm optimization for dynamic multi-objective optimization," European Journal of Operational Research, vol. 261, no. 3, pp. 1028–1051, 2017.

[20] X. Chu, B. Niu, J. J. Liang, and Q. Lu, "An orthogonal-design hybrid particle swarm optimiser with application to capacitated facility location problem," International Journal of Bio-Inspired Computation, vol. 8, no. 5, pp. 268–285, 2016.

[21] D. Wang, D. Tan, and L. Liu, "Particle swarm optimization: an overview," Soft Computing, vol. 22, no. 2, p. 22, 2018.

[22] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, 2005.

[23] X. Li, "A multimodal particle swarm optimizer based on fitness Euclidean-distance ratio," in Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation, pp. 78–85, London, England, July 2007.

[24] G. Zhu and S. Kwong, "Gbest-guided artificial bee colony algorithm for numerical function optimization," Applied Mathematics and Computation, vol. 217, no. 7, pp. 3166–3173, 2010.

[25] A. Laudani, F. R. Fulginei, G. M. Lozito, and A. Salvini, "Swarm/flock optimization algorithms as continuous dynamic systems," Applied Mathematics and Computation, vol. 243, pp. 670–683, 2014.

[26] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.

[27] K. K. Bharti and P. K. Singh, "Chaotic gradient artificial bee colony for text clustering," Soft Computing, vol. 20, no. 3, pp. 1113–1126, 2016.

[28] R. M. May, "Simple mathematical models with very complicated dynamics," Nature, vol. 261, no. 5560, pp. 459–467, 1976.

[29] F. S. Hover and M. S. Triantafyllou, "Application of polynomial chaos in stability and control," Automatica, vol. 42, no. 5, pp. 789–795, 2006.

[30] X. Liu, L. Xiang, and X. Wang, "Spatial cluster analysis by the Adleman-Lipton DNA computing model and flexible grids," Discrete Dynamics in Nature and Society, vol. 2012, Article ID 894207, 32 pages, 2012.

[31] H. Peng, X. Luo, Z. Gao, J. Wang, and Z. Pei, "A novel clustering algorithm inspired by membrane computing," The Scientific World Journal, vol. 3, no. 22, Article ID 929471, 8 pages, 2015.

[32] B. Ufnalski and L. M. Grzesiak, "Plug-in direct particle swarm repetitive controller with a reduced dimensionality of a fitness landscape: a multi-swarm approach," Bulletin of the Polish Academy of Sciences: Technical Sciences, vol. 63, no. 4, pp. 857–866, 2015.

[33] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.

[34] "Datasets," http://www.isical.ac.in/~sanghami/data.html.

[35] H. C. Blake, UCI Repository of Machine Learning Databases, 1998.


Page 7: A New Chaotic Starling Particle Swarm Optimization ...downloads.hindawi.com/journals/mpe/2018/8250480.pdfMathematicalProblemsinEngineering Chaotic Parameter u 2.5 3 3.5 4 Chaotic Value

Mathematical Problems in Engineering 7

Table 1 Benchmark Functions

Benchmark Functions Function Expression Domain 119883min 119865min

Sphere 119891min = 119863sum119894=1

1199092119894 [ndash100 100]D 0119863 0

Rastrigin 119891min = 119863sum119894=1

[1199092119894 minus 10 cos(2120587119909119894) + 10] [ndash512 512]D 0119863 0

Griewank 119891min = 119863sum119894=1

( 11990921198944000) minus 119863prod119894=1

cos( 119909119894radic119894) + 1 [ndash600 600]D 0119863 0

Rosenbrock 119891min = 119863sum119894=1

[(1199092119894+1 minus 1199092119894 )2 + (1 minus 119909119894)2] [ndash2048 2048]D 1119863 0

SphereRastrigin

times104

2

1

0100

0 0

minus100 minus100

100

60

40

20

0

minus20

minus40

5

50

0

minus5 minus5

Figure 3 The Sphere and Rastrigin Function for 119863 = 2

5 Simulation Experiments

Three experiments are conducted to evaluate the perfor-mance of CSPSO The first experiment validates the optimalparameter values in CSPSO using the Sphere and Rastrigin[33] benchmark functions The second experiment validatesthe performance of CSPSO using four classical numericalbenchmark functions [33] These functions all to be min-imized and their domains are presented in Table 1 Thethird experiment checks the effectiveness of CSPSO in dataclustering using some datasets from the Artificial Datasets[34] and the UCI Machine Learning Repository [35] CSPSOis implemented in MATLAB and all the experiments areconducted on a LENOVO desktop computer with an Intel320 GHz i5-4460P4 processor and 8GB of RAM in aWindows 8 environment

51 Experiment on Parameter Settings As described abovethe values of the parameters have important influences onthe performance of CSPSOThis section focuses on checkingthe influences of the three critical parameters in CSPSO theinertia weight 119908 the size of the neighborhood 119873119890 and thenumber of selected particles 119878 so as to find appropriate valuesfor them The Sphere and Rastrigin functions are used tostudy the influences of different values of the parameters in

CSPSO The shape and ranges of the Sphere and Rastriginfunctions for 119863 = 2 are depicted in Figure 3 The dimensionof benchmark functions is119863 = 10 in the experiment CSPSOran 30 times for each of the two benchmark functions

For the validity of the experiments and fair comparisonsparameters not to be tested in the experiments are kept atthe same values These parameters are the population size119872 = 50 the max number of iterations 119905max = 100 and thecognitive coefficients 1198881 = 1198882 = 2

The influences of the inertia weight 119908 is tested first Itsvalue is adjusted according to (8)Thevalues of119908min and119908maxdirectly determine the decline speed of 119908 and consequentlyaffect the convergence speed of CSPSO Their values arevaried in the experiments When the value of 119908 varies theminimum and maximum sizes of the neighborhood are keptat 119873119890min = 2 and 119873119890max = 5 and the number of selectedparticles is kept at 119878 = 10 The best and mean values of theSphere and Rastrigin functions are reported in Table 2

The differences in these results are obvious The optimalvalues of the Sphere and Rastrigin functions were obtained atdifferent values of 119908 However the mean values are differentfor the Rastrigin function because it is multimodal withmany local optima Generally it is difficult to find globaloptimal solutions for multimodal functions with traditionaloptimization algorithms Based on these results the optimal

8 Mathematical Problems in Engineering

Table 2 Results when varying the inertia weight 119908Inertia weight (119908min minus 119908max) Sphere Rastrigin Mean Time taken by Sphere Mean Time taken by Rastrigin02minus06 0(00022) 0(09684) 33773 3132102minus09 0(0) 0(16269) 31235 3135804minus09 0(0) 0(17282) 31258 3130904minus12 0(0) 0(07567) 31229 31359

Table 3 Results when varying the size of the neighbourhood119873119890Neighbor size (119873119890min minus 119873119890max) Sphere Rastrigin Mean Time taken by Sphere Mean Time taken by Rastrigin2-5 0(01382) 0(10071) 31503 314202-7 0(0) 0(07359) 31389 314162-9 0(0) 0(13716) 31403 31446

Table 4 Results when varying the number of selected particles 119878119878119890 119878119891 119878 Sphere Rastrigin Mean Time taken by Sphere Mean Time taken by Rastrigin5 10 0(0) 0(06586) 31808 319667 14 0(07567) 0(15407) 42845 426459 18 0(25609) 0(24015) 53850 53840

values of the lower and upper limits of the inertia weight areset to 119908min = 04 and 119908max = 12 where the minimum meanvalues are achieved

The influences of the size of the neighborhood 119873119890 isexamined next In CSPSO the size of the neighborhood 119873119890increases gradually to enhance the local search ability Largevalues will lead to slow convergence and small values willrestrain the local search ability An appropriate value balancesexploration and exploitation The lower and upper limits onthe size of the neighborhood are set to different values and thebest and mean values of the Sphere and Rastrigin functionsare reported in Table 3

Table 3 shows that the best results are obtained when thesize of the neighborhood is between the lower and upperlimits 119873119890min = 2 and 119873119890max = 7 The influences of thenumber of selected particles 119878 are then tested It has influenceson both convergence and search ability Different values for 119878119890and 119878119891 are used and the results are reported in Table 4

The results in Table 4 show that the best number ofselected particles is 10 when the mean values of the testfunctions and computation time are both low It is easy tosee that more particles do not always lead to better resultsBecause of randomness built into the algorithm sometimesmost of the particles may just gather around local optima

Based on these results the lower and upper limits onthe inertia weight 119908min and 119908max are set to 04 and 12respectively the lower and upper limits on the size ofthe neighborhood 119873119890min and 119873119890max are set to 2 and 7respectively and the number of selected particles 119878 is set to10 These parameter values in CSPSO are kept the same andare used in the following experiments

52 Benchmark Functions In this section the 4 classicalnumerical benchmark functions presented in Table 1 are usedto validate the effectiveness of CSPSO The dimension of the

benchmark functions which determines the dimension ofthe search space is set to119863 = 10 20 and 30Theperformanceof CSPSO is compared with those of PSO fitness-Euclideandistance ratio particle swarm optimization (FER-PSO) [23]and starling particle swarm optimization (SPSO) [25] PSO isthe basic swarm intelligent algorithm that was developed byKennedy and Eberhart [26] FER-PSO utilizes the individualbest of a neighbor selected based on the Euclidean distanceand fitness function value to replace the individual best of itsown The SPSO uses the collective responses of the particlesto adjust the positions and velocities of the particles when thealgorithm stagnates

The values of the adjustable parameters in these compet-itive algorithms are the best ones reported in the respectivereferences as listed in Table 5 These competitive algorithmswere also run for 50 independent times each so as to getsome meaningful results The best fitness function values forthese functions obtained by these algorithms are reported inTable 6

Figure 4 shows the convergence of some functions ofthe three algorithms Compared with PSO and FER-PSOin Figures 4(b) and 4(c) the curves of the function valuesobtained by CSPSO decline slowly at the beginning of theevolutionary process because of the time taken by localsearch Hence it has a slow convergence speed Relying onDDT and chaotic mapping CSPSO can find lower fitnessvalues than other algorithms after about half of evolutionaryprocess rather than falling into local optima The local searchstrategy in the neighborhood helps in finding better solutionsin the nearby neighborhood of the current position of aparticle In particular the curve of the Sphere functionobtained by CSPSO disappeared after the best value 0 hasbeen found because this value cannot be marked on thechart As shown in Figure 4(d) for the Rosenbrock functiona multimodal function and traditional algorithms like PSO

Mathematical Problems in Engineering 9

Table 5 Parameter settings in the experiment

Parameters PSO ABC [22] FER-PSO [23] GABC [24] SPSO [25] CSPSOPopulation (119872) 50 50 50 50 50 50119905max 200 200 200 200 200 2001198881 1198882 22 minus 22 minus 22 2 21199031 1199032 (0 1) (minus1 1) (01 04) (minus1 1) (0 1) (0 1)(119908min 119908max) 1 minus 1 1 (02 04) (04 12)Neighbors (119873119890) minus minus 7 minus 7 2minus7Selected (119878) minus minus minus minus minus 10Sub-population minus minus minus minus 14 minusStagnant limit minus minus minus minus 2 minusTrial limit minus 10 minus 10 minus minus

Table 6 Best fitness function values obtained by different algorithms for the test functions

Function 119863 PSO FER-PSO SPSO CSPSO

Sphere10 0 0 0 020 05285 01613 82232e-12 030 24235 08891 51645e-12 0

Rastrigin10 00727 003647 0 020 03595 01895 0 030 10374 04847 0 0

Griewank10 05965 01135 0 020 14575 10994 0 030 32947 16874 0 0

Rosenbrock10 63220 68162 52486 0703720 198643 192219 187899 9843130 381130 337894 287863 186928

FER-PSO and SPSO are easily trapped into local optima butCSPSO can find the smallest function values among thosefound by these algorithms These results show that CSPSOhas strong global search ability through the use of DDT andchaotic mapping It can quickly escape local optima

As the results of the 4 classic functions show in Table 6CSPSO and SPSO perform better than PSO and FER-PSOon the Rastrigin and Griewank functions and CSPSO per-forms better than the other three algorithms on the Sphereand Rosenbrock functions In general the results showthat CSPSO has more powerful global search ability thantraditional algorithms like PSO FER-PSO and SPSO bothin single modal and multimodal functions Furthermore asindicated by the results in Table 6 the better performance ofCSPSO than others provides evidence that the improvementin CSPSO is effective in enhancing the performance of PSOand CSPSO can find the global optimum more often thanother algorithms in the experiments

53 Clustering Problems In this section experiments onclustering problemswith different datasets are performed andresults of different algorithms including CSPSO are reportedand discussed Eight benchmark datasets are used Theyare Data 3 2 Data 5 2 Data 10 2 and Data 4 3 from thework reported in [34] and Iris Glass Contraceptive Method

Choice (CMC) and Wine from the UCI Machine LearningRepository [35] More details about these datasets are givenin Table 7 Data 5 2 and Data 4 3 are also plotted in Figure 5

The optimized values of the parameters in the other algo-rithms shown in Table 5 are adopted in the experiments Inorder to perform fair comparisons all the algorithms run onall the clustering datasets for 50 times to eliminate the effectsrandom factors As simple statistics best values (Best) worstvalues (Worst) mean values (Mean) and standard deviations(SD) are used as the evaluation criteria to measure theeffectiveness of these clustering algorithms The environmentof experiment is the same for all clustering algorithms

Figure 6 shows the convergence of these algorithms onfour datasets for typical runs of the algorithms The fitnessfunctions of CSPSO decline slowly at the beginning ofthe evolutionary process and continue to decline ratherthan dropping into local optimum at the middle of theevolutionary process possibly because DDT gives the par-ticles high velocities Local search based on Euclidean andfitness neighborhood structure also helps the particles findbetter positions This phenomenon is more evident on high-dimensional datasets such as the Glass dataset

To be more clear Figure 7 gives more details aboutthe convergence of these algorithms near the end of thesearch process on three out of the four same datasets

10 Mathematical Problems in Engineering

Table 7 Description of the datasets used in the experiments

Datasets Clusters (119870) Features (119889) Total instances (119873) Instances in each clustersData 3 2 3 2 76 (252526)Data 5 2 5 2 250 (5050505050)Data 10 2 10 2 500 (50505050505050505050)Data 4 3 4 3 400 (100100100100)Iris 3 4 150 (505050)Glass 6 9 214 (70177613929)CMC 3 9 1473 (629334510)Wine 3 13 178 (597148)

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

CSPSOPSO

SPSOFER-PSO

104

103

102

101

100

10minus1

10minus2

(a) Sphere function

CSPSOPSO

SPSOFER-PSO

Iteration 0 20 40 60 80 100 120 140 160 180 200

Bes

t Val

ues

103

102

101

100

(b) Rastrigin function

CSPSOPSO

SPSOFER-PSO

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

105

100

10minus10

10minus5

10minus15

10minus20

(c) Griewank function

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

103

102

101

100

CSPSOPSO

SPSOFER-PSO

(d) Rosenbrock function

Figure 4 Convergence of the algorithms on some functions (D = 10)

CSPSO has the best performance among all these clusteringalgorithms ABC is the first converging to a local optimalpoint possibly because of the influence of randomly selectedneighbors GABC has a poor performance for large datavolumes such as the Data 5 2 and Glass datasets possibly

due to the local search behavior of onlooker bees FER-PSOhas apparently better convergence performance than otheralgorithms possibly due to the strategy that only one of theneighbors is selected to guide the search of a particle but it iseasily trapped in local optimal points in the later generations

Mathematical Problems in Engineering 11

Data_5_2 Data_4_34 6 8 10 12 14 16

4

6

8

10

12

14

16

2015

105

0minus5minus5

05

1015

0

5

10

15

20

minus520

Figure 5 Distributions of data points in some datasets

of the search process SPSO shows a better performance insearching for a global optimum but its local search strategy isnot working on high-dimensional datasets The above resultsand discussions reveal the different performances amongPSO ABC FER-PSO GABC SPSO and CSPSO and verifythe good performance of CSPSO as a clustering algorithm

It can be seen from Table 8 that CSPSO has a betterperformance on these classical clustering problems For theData 3 2 Data 5 2 and Data 10 2 datasets CSPSO obtainedlower mean fitness values than others but other algorithmsperform better than CSPSO on the best fitness values Thisis possibly due to the randomness of the swarm intelligencealgorithms For the Data 4 3 datasets because of their largenumbers of data points the ability of local search in CSPSOis affected by the selected neighbors ABC has better stabilityon this dataset but has a poor ability in searching for aglobal optimum than the others due to the random strategyCSPSO outperforms the other algorithms and obtains lowermean fitness values than others on the Iris Glass CMC andWine datasets Therefore it has better performance on high-dimensional data This proposed algorithm with embeddedDDT and chaotic mapping has better global search abilityand is more stable than traditional PSO through the use ofa local search method based on starling birds Furthermoreas the sizes of the datasets increase in both the dimensionand number of data points CSPSO has better performancethan PSO ABC GABC FER-PSO and SPSO It is morecapable of dealing with big data in the current informationage

6 Conclusions

Because the performance of standard PSO depends onits parameter values parameter adjustment is one of themost useful ways for PSO to obtain good exploration andexploitation abilities CSPSO proposed in this study enhancesthe overall convergence of the searching process Nonlinearadjustment of the inertia weight and the chaotic search

method based on logistic mapping enhance the global searchperformance of the traditional PSO and help particles escapefrom local optima DDT added to velocity update increasesthe velocities of the particles in later iterations and also helpsparticles escape from local optima The local search strategybased on behavior of starling birds utilizes the information ofthe nearest neighbors in the neighborhood and determinesa new collective position and a new collective velocity Thetwoparticle selectionmethods Euclideandistance and fitnessfunction value maintain population diversity of the particlesThese improvements help particles avoid stagnation and findbetter solutions in the search space

The results of simulation experiments on both benchmark function optimization and classical clustering problems show the effectiveness of CSPSO. Compared with traditional PSO and other evolutionary algorithms, CSPSO has better convergence properties and stronger global search ability. However, as with other swarm intelligence algorithms, CSPSO may be trapped in, and stagnate around, local optimal points, and may therefore be unstable when applied to problems with multiple local optima. In future work, CSPSO will be applied to other combinatorial optimization problems, such as scheduling problems. Other metaheuristic methods and optimization techniques may also be used to improve the performance of PSO.

Data Availability

All the datasets used in the simulation experiments came from the Artificial Datasets, available at http://www.isical.ac.in/~sanghami/data.html (accessed June 2017), and from the UCI Machine Learning Repository, available at http://archive.ics.uci.edu/ml/datasets.html (accessed June 2017).

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.


[Figure 6: Convergence of the algorithms on some datasets. Panels: (a) Data_3_2, (b) Data_5_2, (c) Iris, (d) Glass; curves for CSPSO, PSO, SPSO, FER-PSO, GABC, and ABC.]

[Figure 7: Convergence of the algorithms on the Data_3_2, Data_5_2, and Iris datasets over the final iterations.]


Table 8: Comparison of performance of CSPSO with other clustering algorithms.

Dataset     Parameter   PSO         ABC         FER-PSO     GABC        SPSO        CSPSO
Data_3_2    Best        47.5287     47.7630     47.5290     47.5282     47.5733     47.5282
            Worst       59.1935     60.1039     49.7856     59.1895     53.1721     47.5382
            Mean        48.4644     50.9702     47.8771     48.2279     47.8934     47.5300
            SD          3.1956      3.1345      0.4967      2.7975      0.8551      0.0022
Data_5_2    Best        326.4386    337.5538    326.9987    329.7155    326.4666    326.4432
            Worst       392.2899    411.0786    332.6651    364.8400    327.0877    326.9676
            Mean        329.6249    373.6977    328.7897    343.5928    326.7493    326.5300
            SD          12.9978     20.1046     1.1594      9.6214      0.1536      0.0942
Data_10_2   Best        1.0793e+03  916.4320    875.6264    1.3134e+03  843.2274    849.8712
            Worst       1.4459e+03  1.2718e+03  1.2068e+03  1.1657e+03  1.1827e+03  1.0141e+03
            Mean        1.2875e+03  1.0608e+03  1.0066e+03  638.046     941.8921    918.2542
            SD          70.4484     76.1788     67.1964     1.3134e+03  85.5520     50.5344
Data_4_3    Best        751.8897    1.0937e+03  830.5456    750.4056    749.9506    749.6738
            Worst       1.2866e+03  1.6347e+03  1.3753e+03  1.8485e+03  1.2726e+03  1.2725e+03
            Mean        838.8670    1.3838e+03  1.0312e+03  1.1489e+03  793.8641    784.5028
            SD          181.8515    122.3579    150.0554    271.3938    142.5136    125.1508
Iris        Best        100.5216    113.1201    98.0975     96.6829     96.6605     96.6605
            Worst       130.6931    156.8960    102.9780    127.6869    98.6906     98.0888
            Mean        113.7655    133.1161    100.3761    105.0487    96.9465     96.9194
            SD          7.1771      11.0706     1.2211      11.2307     0.3921      0.2931
Glass       Best        329.1487    333.7214    278.6565    288.5152    235.3554    229.6544
            Worst       436.5615    503.1607    326.5351    402.5881    309.0997    278.9187
            Mean        376.4035    422.3809    297.3875    346.4078    280.5624    257.0244
            SD          21.5371     32.0309     11.1730     26.9515     14.5045     10.3633
CMC         Best        5.8449e+03  5.9895e+03  5.6153e+03  5.6704e+03  5.5615e+03  5.5404e+03
            Worst       6.9260e+03  7.2137e+03  5.9728e+03  6.4143e+03  5.7676e+03  5.5861e+03
            Mean        6.1772e+03  6.4821e+03  5.7297e+03  5.8864e+03  5.6426e+03  5.5562e+03
            SD          186.7542    243.2599    73.0180     140.8947    44.6957     8.7806
Wine        Best        1.6308e+04  1.6399e+04  1.6312e+04  1.6309e+04  1.6305e+04  1.6298e+04
            Worst       1.6798e+04  1.7317e+04  1.6376e+04  1.6482e+04  1.6356e+04  1.6313e+04
            Mean        1.6438e+04  1.6705e+04  1.6336e+04  1.6355e+04  1.6322e+04  1.6304e+04
            SD          95.0097     219.7057    15.9233     35.4218     12.0030     3.3799

Acknowledgments

This research project was partially supported by the National Natural Science Foundation of China (61472231, 61502283, 61640201), the Humanities and Social Science Research Fund of the Ministry of Education of China (12YJA630152), and the Shandong Social Science Foundation of China (16BGLJ06, 11CGLJ22).

References

[1] X. Liu, H. Liu, and H. Duan, "Particle swarm optimization based on dynamic niche technology with applications to conceptual design," Advances in Engineering Software, vol. 38, no. 10, pp. 668–676, 2007.
[2] Z. Bao and T. Watanabe, "Mixed constrained image filter design using particle swarm optimization," Artificial Life and Robotics, vol. 15, no. 3, pp. 363–368, 2010.
[3] S. Sahin and M. A. Cavuslu, "FPGA implementation of wavelet neural network training with PSO/iPSO," Journal of Circuits, Systems and Computers, vol. 27, no. 6, 2018.
[4] D. Wu and H. Gao, "A BP and switching PSO based optimization approach for engine optimization," National Academy of Science Letters, vol. 40, no. 1, pp. 33–37, 2017.
[5] Q. Jia and Y. Guo, "Hybridization of ABC and PSO algorithms for improved solutions of RCPSP," Journal of the Chinese Institute of Engineers, vol. 39, no. 6, pp. 727–734, 2016.
[6] C. Yue, B. Qu, and J. Liang, "A multi-objective particle swarm optimizer using ring topology for solving multimodal multi-objective problems," IEEE Transactions on Evolutionary Computation, 2017.
[7] X. Liu and J. Xue, "A cluster splitting technique by Hopfield networks and P systems on simplices," Neural Processing Letters, pp. 1–24, 2017.
[8] Y. Zhao, X. Liu, and W. Wang, "Spiking neural P systems with neuron division and dissolution," PLoS ONE, vol. 11, no. 9, Article ID e0162882, 2016.
[9] X. Liu, Y. Zhao, and M. Sun, "An improved Apriori algorithm based on an evolution-communication tissue-like P system with promoters and inhibitors," Discrete Dynamics in Nature and Society, vol. 2017, 2017.
[10] N. Netjinda, T. Achalakul, and B. Sirinaovakul, "Particle swarm optimization inspired by starling flock behavior," Applied Soft Computing, vol. 35, pp. 411–422, 2015.
[11] A. El Dor, D. Lemoine, M. Clerc, P. Siarry, L. Deroussi, and M. Gourgand, "Dynamic cluster in particle swarm optimization algorithm," Natural Computing, vol. 14, no. 4, pp. 655–672, 2015.
[12] Y. M. B. Ali, "Unsupervised clustering based an adaptive particle swarm optimization algorithm," Neural Processing Letters, vol. 44, no. 1, pp. 221–244, 2016.
[13] G. Armano and M. R. Farmani, "Multiobjective clustering analysis using particle swarm optimization," Expert Systems with Applications, vol. 55, pp. 184–193, 2016.
[14] B. Niu, Q. Duan, L. Tan, C. Liu, and P. Liang, "A population-based clustering technique using particle swarm optimization and k-means," in Advances in Swarm and Computational Intelligence, vol. 9140 of Lecture Notes in Computer Science, pp. 145–152, Springer International Publishing, Cham, 2015.
[15] X. Zhao, W. Lin, J. Hao, X. Zuo, and J. Yuan, "Clustering and pattern search for enhancing particle swarm optimization with Euclidean spatial neighborhood search," Neurocomputing, vol. 171, pp. 966–981, 2016.
[16] E. Comak, "A modified particle swarm optimization algorithm using Renyi entropy-based clustering," Neural Computing and Applications, vol. 27, no. 5, pp. 1381–1390, 2016.
[17] K. K. Bharti and P. K. Singh, "Opposition chaotic fitness mutation based adaptive inertia weight BPSO for feature selection in text clustering," Applied Soft Computing, vol. 43, pp. 20–34, 2016.
[18] W. Song, W. Ma, and Y. Qiao, "Particle swarm optimization algorithm with environmental factors for clustering analysis," Soft Computing, vol. 21, no. 2, pp. 283–293, 2017.
[19] R. Liu, J. Li, J. Fan, C. Mu, and L. Jiao, "A coevolutionary technique based on multi-swarm particle swarm optimization for dynamic multi-objective optimization," European Journal of Operational Research, vol. 261, no. 3, pp. 1028–1051, 2017.
[20] X. Chu, B. Niu, J. J. Liang, and Q. Lu, "An orthogonal-design hybrid particle swarm optimiser with application to capacitated facility location problem," International Journal of Bio-Inspired Computation, vol. 8, no. 5, pp. 268–285, 2016.
[21] D. Wang, D. Tan, and L. Liu, "Particle swarm optimization: an overview," Soft Computing, vol. 22, no. 2, p. 22, 2018.
[22] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, 2005.
[23] X. Li, "A multimodal particle swarm optimizer based on fitness Euclidean-distance ratio," in Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation (GECCO '07), pp. 78–85, London, England, July 2007.
[24] G. Zhu and S. Kwong, "Gbest-guided artificial bee colony algorithm for numerical function optimization," Applied Mathematics and Computation, vol. 217, no. 7, pp. 3166–3173, 2010.
[25] A. Laudani, F. R. Fulginei, G. M. Lozito, and A. Salvini, "Swarm/flock optimization algorithms as continuous dynamic systems," Applied Mathematics and Computation, vol. 243, pp. 670–683, 2014.
[26] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.
[27] K. K. Bharti and P. K. Singh, "Chaotic gradient artificial bee colony for text clustering," Soft Computing, vol. 20, no. 3, pp. 1113–1126, 2016.
[28] R. M. May, "Simple mathematical models with very complicated dynamics," Nature, vol. 261, no. 5560, pp. 459–467, 1976.
[29] F. S. Hover and M. S. Triantafyllou, "Application of polynomial chaos in stability and control," Automatica, vol. 42, no. 5, pp. 789–795, 2006.
[30] X. Liu, L. Xiang, and X. Wang, "Spatial cluster analysis by the Adleman-Lipton DNA computing model and flexible grids," Discrete Dynamics in Nature and Society, vol. 2012, Article ID 894207, 32 pages, 2012.
[31] H. Peng, X. Luo, Z. Gao, J. Wang, and Z. Pei, "A novel clustering algorithm inspired by membrane computing," The Scientific World Journal, Article ID 929471, 8 pages, 2015.
[32] B. Ufnalski and L. M. Grzesiak, "Plug-in direct particle swarm repetitive controller with a reduced dimensionality of a fitness landscape - a multi-swarm approach," Bulletin of the Polish Academy of Sciences: Technical Sciences, vol. 63, no. 4, pp. 857–866, 2015.
[33] A. O. Griewank, "Generalized descent for global optimization," Journal of Optimization Theory and Applications, vol. 34, no. 1, pp. 11–39, 1981.
[34] "Datasets," http://www.isical.ac.in/~sanghami/data.html.
[35] C. L. Blake, UCI Repository of Machine Learning Databases, 1998.


Page 8: A New Chaotic Starling Particle Swarm Optimization ...downloads.hindawi.com/journals/mpe/2018/8250480.pdfMathematicalProblemsinEngineering Chaotic Parameter u 2.5 3 3.5 4 Chaotic Value

8 Mathematical Problems in Engineering

Table 2 Results when varying the inertia weight 119908Inertia weight (119908min minus 119908max) Sphere Rastrigin Mean Time taken by Sphere Mean Time taken by Rastrigin02minus06 0(00022) 0(09684) 33773 3132102minus09 0(0) 0(16269) 31235 3135804minus09 0(0) 0(17282) 31258 3130904minus12 0(0) 0(07567) 31229 31359

Table 3 Results when varying the size of the neighbourhood119873119890Neighbor size (119873119890min minus 119873119890max) Sphere Rastrigin Mean Time taken by Sphere Mean Time taken by Rastrigin2-5 0(01382) 0(10071) 31503 314202-7 0(0) 0(07359) 31389 314162-9 0(0) 0(13716) 31403 31446

Table 4 Results when varying the number of selected particles 119878119878119890 119878119891 119878 Sphere Rastrigin Mean Time taken by Sphere Mean Time taken by Rastrigin5 10 0(0) 0(06586) 31808 319667 14 0(07567) 0(15407) 42845 426459 18 0(25609) 0(24015) 53850 53840

values of the lower and upper limits of the inertia weight areset to 119908min = 04 and 119908max = 12 where the minimum meanvalues are achieved

The influences of the size of the neighborhood 119873119890 isexamined next In CSPSO the size of the neighborhood 119873119890increases gradually to enhance the local search ability Largevalues will lead to slow convergence and small values willrestrain the local search ability An appropriate value balancesexploration and exploitation The lower and upper limits onthe size of the neighborhood are set to different values and thebest and mean values of the Sphere and Rastrigin functionsare reported in Table 3

Table 3 shows that the best results are obtained when thesize of the neighborhood is between the lower and upperlimits 119873119890min = 2 and 119873119890max = 7 The influences of thenumber of selected particles 119878 are then tested It has influenceson both convergence and search ability Different values for 119878119890and 119878119891 are used and the results are reported in Table 4

The results in Table 4 show that the best number ofselected particles is 10 when the mean values of the testfunctions and computation time are both low It is easy tosee that more particles do not always lead to better resultsBecause of randomness built into the algorithm sometimesmost of the particles may just gather around local optima

Based on these results the lower and upper limits onthe inertia weight 119908min and 119908max are set to 04 and 12respectively the lower and upper limits on the size ofthe neighborhood 119873119890min and 119873119890max are set to 2 and 7respectively and the number of selected particles 119878 is set to10 These parameter values in CSPSO are kept the same andare used in the following experiments

52 Benchmark Functions In this section the 4 classicalnumerical benchmark functions presented in Table 1 are usedto validate the effectiveness of CSPSO The dimension of the

benchmark functions which determines the dimension ofthe search space is set to119863 = 10 20 and 30Theperformanceof CSPSO is compared with those of PSO fitness-Euclideandistance ratio particle swarm optimization (FER-PSO) [23]and starling particle swarm optimization (SPSO) [25] PSO isthe basic swarm intelligent algorithm that was developed byKennedy and Eberhart [26] FER-PSO utilizes the individualbest of a neighbor selected based on the Euclidean distanceand fitness function value to replace the individual best of itsown The SPSO uses the collective responses of the particlesto adjust the positions and velocities of the particles when thealgorithm stagnates

The values of the adjustable parameters in these compet-itive algorithms are the best ones reported in the respectivereferences as listed in Table 5 These competitive algorithmswere also run for 50 independent times each so as to getsome meaningful results The best fitness function values forthese functions obtained by these algorithms are reported inTable 6

Figure 4 shows the convergence of some functions ofthe three algorithms Compared with PSO and FER-PSOin Figures 4(b) and 4(c) the curves of the function valuesobtained by CSPSO decline slowly at the beginning of theevolutionary process because of the time taken by localsearch Hence it has a slow convergence speed Relying onDDT and chaotic mapping CSPSO can find lower fitnessvalues than other algorithms after about half of evolutionaryprocess rather than falling into local optima The local searchstrategy in the neighborhood helps in finding better solutionsin the nearby neighborhood of the current position of aparticle In particular the curve of the Sphere functionobtained by CSPSO disappeared after the best value 0 hasbeen found because this value cannot be marked on thechart As shown in Figure 4(d) for the Rosenbrock functiona multimodal function and traditional algorithms like PSO

Mathematical Problems in Engineering 9

Table 5 Parameter settings in the experiment

Parameters PSO ABC [22] FER-PSO [23] GABC [24] SPSO [25] CSPSOPopulation (119872) 50 50 50 50 50 50119905max 200 200 200 200 200 2001198881 1198882 22 minus 22 minus 22 2 21199031 1199032 (0 1) (minus1 1) (01 04) (minus1 1) (0 1) (0 1)(119908min 119908max) 1 minus 1 1 (02 04) (04 12)Neighbors (119873119890) minus minus 7 minus 7 2minus7Selected (119878) minus minus minus minus minus 10Sub-population minus minus minus minus 14 minusStagnant limit minus minus minus minus 2 minusTrial limit minus 10 minus 10 minus minus

Table 6 Best fitness function values obtained by different algorithms for the test functions

Function 119863 PSO FER-PSO SPSO CSPSO

Sphere10 0 0 0 020 05285 01613 82232e-12 030 24235 08891 51645e-12 0

Rastrigin10 00727 003647 0 020 03595 01895 0 030 10374 04847 0 0

Griewank10 05965 01135 0 020 14575 10994 0 030 32947 16874 0 0

Rosenbrock10 63220 68162 52486 0703720 198643 192219 187899 9843130 381130 337894 287863 186928

FER-PSO and SPSO are easily trapped into local optima butCSPSO can find the smallest function values among thosefound by these algorithms These results show that CSPSOhas strong global search ability through the use of DDT andchaotic mapping It can quickly escape local optima

As the results of the 4 classic functions show in Table 6CSPSO and SPSO perform better than PSO and FER-PSOon the Rastrigin and Griewank functions and CSPSO per-forms better than the other three algorithms on the Sphereand Rosenbrock functions In general the results showthat CSPSO has more powerful global search ability thantraditional algorithms like PSO FER-PSO and SPSO bothin single modal and multimodal functions Furthermore asindicated by the results in Table 6 the better performance ofCSPSO than others provides evidence that the improvementin CSPSO is effective in enhancing the performance of PSOand CSPSO can find the global optimum more often thanother algorithms in the experiments

53 Clustering Problems In this section experiments onclustering problemswith different datasets are performed andresults of different algorithms including CSPSO are reportedand discussed Eight benchmark datasets are used Theyare Data 3 2 Data 5 2 Data 10 2 and Data 4 3 from thework reported in [34] and Iris Glass Contraceptive Method

Choice (CMC) and Wine from the UCI Machine LearningRepository [35] More details about these datasets are givenin Table 7 Data 5 2 and Data 4 3 are also plotted in Figure 5

The optimized values of the parameters in the other algo-rithms shown in Table 5 are adopted in the experiments Inorder to perform fair comparisons all the algorithms run onall the clustering datasets for 50 times to eliminate the effectsrandom factors As simple statistics best values (Best) worstvalues (Worst) mean values (Mean) and standard deviations(SD) are used as the evaluation criteria to measure theeffectiveness of these clustering algorithms The environmentof experiment is the same for all clustering algorithms

Figure 6 shows the convergence of these algorithms onfour datasets for typical runs of the algorithms The fitnessfunctions of CSPSO decline slowly at the beginning ofthe evolutionary process and continue to decline ratherthan dropping into local optimum at the middle of theevolutionary process possibly because DDT gives the par-ticles high velocities Local search based on Euclidean andfitness neighborhood structure also helps the particles findbetter positions This phenomenon is more evident on high-dimensional datasets such as the Glass dataset

To be more clear Figure 7 gives more details aboutthe convergence of these algorithms near the end of thesearch process on three out of the four same datasets

10 Mathematical Problems in Engineering

Table 7 Description of the datasets used in the experiments

Datasets Clusters (119870) Features (119889) Total instances (119873) Instances in each clustersData 3 2 3 2 76 (252526)Data 5 2 5 2 250 (5050505050)Data 10 2 10 2 500 (50505050505050505050)Data 4 3 4 3 400 (100100100100)Iris 3 4 150 (505050)Glass 6 9 214 (70177613929)CMC 3 9 1473 (629334510)Wine 3 13 178 (597148)

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

CSPSOPSO

SPSOFER-PSO

104

103

102

101

100

10minus1

10minus2

(a) Sphere function

CSPSOPSO

SPSOFER-PSO

Iteration 0 20 40 60 80 100 120 140 160 180 200

Bes

t Val

ues

103

102

101

100

(b) Rastrigin function

CSPSOPSO

SPSOFER-PSO

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

105

100

10minus10

10minus5

10minus15

10minus20

(c) Griewank function

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

103

102

101

100

CSPSOPSO

SPSOFER-PSO

(d) Rosenbrock function

Figure 4 Convergence of the algorithms on some functions (D = 10)

CSPSO has the best performance among all these clusteringalgorithms ABC is the first converging to a local optimalpoint possibly because of the influence of randomly selectedneighbors GABC has a poor performance for large datavolumes such as the Data 5 2 and Glass datasets possibly

due to the local search behavior of onlooker bees FER-PSOhas apparently better convergence performance than otheralgorithms possibly due to the strategy that only one of theneighbors is selected to guide the search of a particle but it iseasily trapped in local optimal points in the later generations

Mathematical Problems in Engineering 11

Data_5_2 Data_4_34 6 8 10 12 14 16

4

6

8

10

12

14

16

2015

105

0minus5minus5

05

1015

0

5

10

15

20

minus520

Figure 5 Distributions of data points in some datasets

of the search process SPSO shows a better performance insearching for a global optimum but its local search strategy isnot working on high-dimensional datasets The above resultsand discussions reveal the different performances amongPSO ABC FER-PSO GABC SPSO and CSPSO and verifythe good performance of CSPSO as a clustering algorithm

It can be seen from Table 8 that CSPSO has a betterperformance on these classical clustering problems For theData 3 2 Data 5 2 and Data 10 2 datasets CSPSO obtainedlower mean fitness values than others but other algorithmsperform better than CSPSO on the best fitness values Thisis possibly due to the randomness of the swarm intelligencealgorithms For the Data 4 3 datasets because of their largenumbers of data points the ability of local search in CSPSOis affected by the selected neighbors ABC has better stabilityon this dataset but has a poor ability in searching for aglobal optimum than the others due to the random strategyCSPSO outperforms the other algorithms and obtains lowermean fitness values than others on the Iris Glass CMC andWine datasets Therefore it has better performance on high-dimensional data This proposed algorithm with embeddedDDT and chaotic mapping has better global search abilityand is more stable than traditional PSO through the use ofa local search method based on starling birds Furthermoreas the sizes of the datasets increase in both the dimensionand number of data points CSPSO has better performancethan PSO ABC GABC FER-PSO and SPSO It is morecapable of dealing with big data in the current informationage

6 Conclusions

Because the performance of standard PSO depends onits parameter values parameter adjustment is one of themost useful ways for PSO to obtain good exploration andexploitation abilities CSPSO proposed in this study enhancesthe overall convergence of the searching process Nonlinearadjustment of the inertia weight and the chaotic search

method based on logistic mapping enhance the global searchperformance of the traditional PSO and help particles escapefrom local optima DDT added to velocity update increasesthe velocities of the particles in later iterations and also helpsparticles escape from local optima The local search strategybased on behavior of starling birds utilizes the information ofthe nearest neighbors in the neighborhood and determinesa new collective position and a new collective velocity Thetwoparticle selectionmethods Euclideandistance and fitnessfunction value maintain population diversity of the particlesThese improvements help particles avoid stagnation and findbetter solutions in the search space

The results of simulation experiments in both benchmarkfunction optimization and classical clustering problems showthe effectiveness of CSPSO Compared with the traditionalPSO and other evolutionary algorithms CSPSO has betterconvergence properties and stronger global search abilityHowever as for other swarm intelligence algorithms CSPSOmay be trapped into and may stagnate around local optimalpoints and therefore may be unstable when applied toproblemswith multiple local optima In future works CSPSOwill be applied to other combinatorial optimization problemssuch as scheduling problems Other metaheuristic methodsand optimization techniques may also be used to improve theperformance of PSO

Data Availability

All the datasets used in the simulation experiments camefrom the Artificial Datasets available at httpwwwisicalacinsimsanghamidatahtml (accessed June 2017) and from theUCIMachine Learning Repository available at httparchiveicsuciedumldatasetshtml (accessed June 2017)

Conflicts of Interest

The authors declare that there are no conflicts of interestregarding the publication of this paper

12 Mathematical Problems in Engineering

CSPSOPSOSPSO

FER-PSOGABCABC

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

46

48

50

52

54

56

58

60

(a) Data 3 2

CSPSOPSOSPSO

FER-PSOGABCABC

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

320

340

360

380

400

420

440

460

480

(b) Data 5 2

CSPSOPSOSPSO

FER-PSOGABCABC

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

90

100

110

120

130

140

150

160

170

180

190

(c) Iris

CSPSOPSOSPSO

FER-PSOGABCABC

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

250

300

350

400

450

500

550

600

650

(d) Glass

Figure 6 Convergence of the algorithms on some datasets

Iteration 19965 1997 19975 1998 19985 1999 19995 200

Best

Valu

es

47515

4752

47525

4753

47535

4754

Iteration 188 190 192 194 196 198 200

Best

Valu

es

322323324325326327328329330331

Iteration 188 190 192 194 196 198 200

Best

Valu

es

94

95

96

97

98

99

Figure 7 Convergence of the algorithms on the Data 3 2 Data 5 2 and Iris datasets

Mathematical Problems in Engineering 13

Table 8 Comparison of performance of CSPSO with other clustering algorithms

Datasets Parameters AlgorithmsPSO ABC FERminusPSO GABC SPSO CSPSO

Data 3 2

Best 475287 477630 475290 475282 475733 475282Worst 591935 601039 497856 591895 531721 475382Mean 484644 509702 478771 482279 478934 475300SD 31956 31345 04967 27975 08551 00022

Data 5 2

Best 3264386 3375538 3269987 3297155 3264666 3264432Worst 3922899 4110786 3326651 3648400 3270877 3269676Mean 3296249 3736977 3287897 3435928 3267493 3265300SD 129978 201046 11594 96214 01536 00942

Data 10 2

Best 10793e+03 9164320 8756264 13134e+03 8432274 8498712Worst 14459e+03 12718e+03 12068e+03 11657e+03 11827e+03 10141e+03Mean 12875e+03 10608e+03 10066e+03 638046 9418921 9182542SD 704484 761788 671964 13134e+03 855520 505344

Data 4 3

Best 7518897 10937e+03 8305456 7504056 7499506 7496738Worst 12866e+03 16347e+03 13753e+03 18485e+03 12726e+03 12725e+03Mean 8388670 13838e+03 10312e+03 11489e+03 7938641 7845028SD 1818515 1223579 1500554 2713938 1425136 1251508

Iris

Best 1005216 1131201 980975 966829 966605 966605Worst 1306931 1568960 1029780 1276869 986906 980888Mean 1137655 1331161 1003761 1050487 969465 969194SD 71771 110706 12211 112307 03921 02931

Glass

Best 3291487 3337214 2786565 2885152 2353554 2296544Worst 4365615 5031607 3265351 4025881 3090997 2789187Mean 3764035 4223809 2973875 3464078 2805624 2570244SD 215371 320309 111730 269515 145045 103633

CMC

Best 58449e+03 59895e+03 56153e+03 56704e+03 55615e+03 55404e+03Worst 69260e+03 72137e+03 59728e+03 64143e+03 57676e+03 55861e+03Mean 61772e+03 64821e+03 57297e+03 58864e+03 56426e+03 55562e+03SD 1867542 2432599 730180 1408947 446957 87806

Wine

Best 16308e+04 16399e+04 16312e+04 16309e+04 16305e+04 16298e+04Worst 16798e+04 17317e+04 16376e+04 16482e+04 16356e+04 16313e+04Mean 16438e+04 16705e+04 16336e+04 16355e+04 16322e+04 16304e+04SD 950097 2197057 159233 354218 120030 33799

Acknowledgments

This research project was partially supported by the NationalNatural Science Foundation of China (61472231 6150228361640201) Humanities and Social Science Research Fund ofthe Ministry of Education of China (12YJA630152) and theShandong Social Science Foundation of China (16BGLJ0611CGLJ22)

References

[1] X LiuH Liu andHDuan ldquoParticle swarmoptimizationbasedon dynamic niche technology with applications to conceptualdesignrdquo Advances in Engineering Software vol 38 no 10 pp668ndash676 2007

[2] Z Bao and TWatanabe ldquoMixed constrained image filter designusing particle swarm optimizationrdquo Artificial Life and Roboticsvol 15 no 3 pp 363ndash368 2010

[3] S Sahin andM A Cavuslu ldquoFPGA Implementation ofWaveletNeural Network Training with PSOiPSOrdquo Journal of CircuitsSystems and Computers vol 27 no 6 2018

[4] D Wu and H Gao ldquoA BP and Switching PSO Based Optimiza-tion Approach for Engine Optimizationrdquo National Academy ofScience Letters vol 40 no 1 pp 33ndash37 2017

[5] Q Jia and Y Guo ldquoHybridization of ABC and PSO algorithmsfor improved solutions of RCPSPrdquo Journal of the ChineseInstitute of Engineers Transactions of the Chinese Institute ofEngineersSeries AChung-kuo Kung Chrsquoeng Hsuch Krsquoan vol 39no 6 pp 727ndash734 2016

[6] C Yue B Qu and J Liang ldquoA Multi-objective Particle SwarmOptimizer Using Ring Topology for Solving Multimodal Multi-objective Problemsrdquo IEEE Transactions on Evolutionary Com-putation 2017

[7] X Liu and J Xue ldquoA Cluster Splitting Technique by HopfieldNetworks and P Systems on Simplicesrdquo Neural ProcessingLetters pp 1ndash24 2017

14 Mathematical Problems in Engineering

[8] Y Zhao X Liu and W Wang ldquoSpiking neural P systems withneuron division and dissolutionrdquo PLoS ONE vol 11 no 9Article ID e0162882 2016

[9] X Liu Y Zhao and M Sun ldquoAn Improved Apriori AlgorithmBased on an Evolution-Communication Tissue-Like P Systemwith Promoters and Inhibitorsrdquo Discrete Dynamics in Natureand Society vol 2017 2017

[10] N Netjinda T Achalakul and B Sirinaovakul ldquoParticle SwarmOptimization inspired by starling flock behaviorrdquo Applied SoftComputing vol 35 pp 411ndash422 2015

[11] A El Dor D Lemoine M Clerc P Siarry L Deroussi andMGourgand ldquoDynamic cluster in particle swarm optimizationalgorithmrdquo Natural Computing vol 14 no 4 pp 655ndash672 2015

[12] Y M B Ali ldquoUnsupervised Clustering Based an AdaptiveParticle Swarm Optimization Algorithmrdquo Neural ProcessingLetters vol 44 no 1 pp 221ndash244 2016

[13] G Armano and M R Farmani ldquoMultiobjective clusteringanalysis using particle swarm optimizationrdquo Expert Systemswith Applications vol 55 pp 184ndash193 2016

[14] B Niu Q Duan L Tan C Liu and P Liang ldquoA Population-Based Clustering Technique Using Particle Swarm Optimiza-tion and K-Meansrdquo in Advances in Swarm and ComputationalIntelligence vol 9140 of Lecture Notes in Computer Science pp145ndash152 Springer International Publishing Cham 2015

[15] X Zhao W Lin J Hao X Zuo and J Yuan ldquoClustering andpattern search for enhancing particle swarm optimization withEuclidean spatial neighborhood searchrdquo Neurocomputing vol171 pp 966ndash981 2016

[16] E Comak ldquoA modified particle swarm optimization algorithmusing Renyi entropy-based clusteringrdquo Neural Computing andApplications vol 27 no 5 pp 1381ndash1390 2016

[17] K K Bharti and P K Singh ldquoOpposition chaotic fitness muta-tion based adaptive inertia weight BPSO for feature selection intext clusteringrdquoApplied SoftComputing vol 43 pp 20ndash34 2016

[18] W Song W Ma and Y Qiao ldquoParticle swarm optimizationalgorithm with environmental factors for clustering analysisrdquoSoft Computing vol 21 no 2 pp 283ndash293 2017

[19] R Liu J Li J Fan C Mu and L Jiao ldquoA coevolutionarytechnique based on multi-swarm particle swarm optimizationfor dynamic multi-objective optimizationrdquo European Journal ofOperational Research vol 261 no 3 pp 1028ndash1051 2017

[20] X Chu B Niu J J Liang and Q Lu ldquoAn orthogonal-designhybrid particle swarm optimiser with application to capacitatedfacility location problemrdquo International Journal of Bio-InspiredComputation vol 8 no 5 pp 268ndash285 2016

[21] DWang D Tan and L Liu ldquoParticle Swarm Optimization Anoverviewrdquo Soft Computing vol 22 no 2 p 22 2018

[22] D Karaboga ldquoAn Idea Based on Honey Bee Swarm for Numer-ical Optimizationrdquo Technical Report TR0 2005

[23] X Li ldquoA multimodal particle swarm optimizer based on fitnessEuclidean-distance ratiordquo in Proceedings of the the 9th annualconference pp 78ndash85 London England July 2007

[24] G Zhu and S Kwong ldquoGbest-guided artificial bee colonyalgorithm for numerical function optimizationrdquo Applied Math-ematics and Computation vol 217 no 7 pp 3166ndash3173 2010

[25] A Laudani F R Fulginei G M Lozito and A SalvinildquoSwarmflock optimization algorithms as continuous dynamicsystemsrdquo Applied Mathematics and Computation vol 243 pp670ndash683 2014

[26] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 Perth Australia December 1995

[27] K K Bharti and P K Singh ldquoChaotic gradient artificial beecolony for text clusteringrdquo Soft Computing vol 20 no 3 pp1113ndash1126 2016

[28] RMMay ldquoSimplemathematicalmodelswith very complicateddynamicsrdquo Nature vol 261 no 5560 pp 459ndash467 1976

[29] F S Hover and M S Triantafyllou ldquoApplication of polynomialchaos in stability and controlrdquo Automatica vol 42 no 5 pp789ndash795 2006

[30] X Liu L Xiang and X Wang ldquoSpatial cluster analysis bythe adleman-lipton DNA computing model and flexible gridsrdquoDiscrete Dynamics in Nature and Society vol 2012 Article ID894207 32 pages 2012

[31] H Peng X Luo ZGao JWang andZ Pei ldquoANovel ClusteringAlgorithm Inspired by Membrane Computingrdquo The ScientificWorld Journal vol 3 no 22 Article ID 929471 8 pages 2015

[32] B Ufnalski and L M Grzesiak ldquoPlug-in direct particle swarmrepetitive controller with a reduced dimensionality of a fitnesslandscape - a multi-swarm approachrdquo Bulletin of the PolishAcademy of SciencesmdashTechnical Sciences vol 63 no 4 pp 857ndash866 2015

[33] A O Griewank ldquoGeneralized descent for global optimizationrdquoJournal of Optimization Theory and Applications vol 34 no 1pp 11ndash39 1981

[34] ldquoDatasetsrdquo httpwwwisicalacinsimsanghamidatahtml[35] H C Blake UCI Repository of Machine Learning Databases

1998

Hindawiwwwhindawicom Volume 2018

MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Mathematical Problems in Engineering

Applied MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Probability and StatisticsHindawiwwwhindawicom Volume 2018

Journal of

Hindawiwwwhindawicom Volume 2018

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawiwwwhindawicom Volume 2018

OptimizationJournal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Engineering Mathematics

International Journal of

Hindawiwwwhindawicom Volume 2018

Operations ResearchAdvances in

Journal of

Hindawiwwwhindawicom Volume 2018

Function SpacesAbstract and Applied AnalysisHindawiwwwhindawicom Volume 2018

International Journal of Mathematics and Mathematical Sciences

Hindawiwwwhindawicom Volume 2018

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom

The Scientific World Journal

Volume 2018

Hindawiwwwhindawicom Volume 2018Volume 2018

Numerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisAdvances inAdvances in Discrete Dynamics in

Nature and SocietyHindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom

Dierential EquationsInternational Journal of

Volume 2018

Hindawiwwwhindawicom Volume 2018

Decision SciencesAdvances in

Hindawiwwwhindawicom Volume 2018

AnalysisInternational Journal of

Hindawiwwwhindawicom Volume 2018

Stochastic AnalysisInternational Journal of

Submit your manuscripts atwwwhindawicom

Page 9: A New Chaotic Starling Particle Swarm Optimization ...downloads.hindawi.com/journals/mpe/2018/8250480.pdfMathematicalProblemsinEngineering Chaotic Parameter u 2.5 3 3.5 4 Chaotic Value

Mathematical Problems in Engineering 9

Table 5 Parameter settings in the experiment

Parameters PSO ABC [22] FER-PSO [23] GABC [24] SPSO [25] CSPSOPopulation (119872) 50 50 50 50 50 50119905max 200 200 200 200 200 2001198881 1198882 22 minus 22 minus 22 2 21199031 1199032 (0 1) (minus1 1) (01 04) (minus1 1) (0 1) (0 1)(119908min 119908max) 1 minus 1 1 (02 04) (04 12)Neighbors (119873119890) minus minus 7 minus 7 2minus7Selected (119878) minus minus minus minus minus 10Sub-population minus minus minus minus 14 minusStagnant limit minus minus minus minus 2 minusTrial limit minus 10 minus 10 minus minus

Table 6 Best fitness function values obtained by different algorithms for the test functions

Function 119863 PSO FER-PSO SPSO CSPSO

Sphere10 0 0 0 020 05285 01613 82232e-12 030 24235 08891 51645e-12 0

Rastrigin10 00727 003647 0 020 03595 01895 0 030 10374 04847 0 0

Griewank10 05965 01135 0 020 14575 10994 0 030 32947 16874 0 0

Rosenbrock10 63220 68162 52486 0703720 198643 192219 187899 9843130 381130 337894 287863 186928

FER-PSO and SPSO are easily trapped into local optima butCSPSO can find the smallest function values among thosefound by these algorithms These results show that CSPSOhas strong global search ability through the use of DDT andchaotic mapping It can quickly escape local optima

As the results of the 4 classic functions show in Table 6CSPSO and SPSO perform better than PSO and FER-PSOon the Rastrigin and Griewank functions and CSPSO per-forms better than the other three algorithms on the Sphereand Rosenbrock functions In general the results showthat CSPSO has more powerful global search ability thantraditional algorithms like PSO FER-PSO and SPSO bothin single modal and multimodal functions Furthermore asindicated by the results in Table 6 the better performance ofCSPSO than others provides evidence that the improvementin CSPSO is effective in enhancing the performance of PSOand CSPSO can find the global optimum more often thanother algorithms in the experiments

53 Clustering Problems In this section experiments onclustering problemswith different datasets are performed andresults of different algorithms including CSPSO are reportedand discussed Eight benchmark datasets are used Theyare Data 3 2 Data 5 2 Data 10 2 and Data 4 3 from thework reported in [34] and Iris Glass Contraceptive Method

Choice (CMC) and Wine from the UCI Machine LearningRepository [35] More details about these datasets are givenin Table 7 Data 5 2 and Data 4 3 are also plotted in Figure 5

The optimized values of the parameters in the other algo-rithms shown in Table 5 are adopted in the experiments Inorder to perform fair comparisons all the algorithms run onall the clustering datasets for 50 times to eliminate the effectsrandom factors As simple statistics best values (Best) worstvalues (Worst) mean values (Mean) and standard deviations(SD) are used as the evaluation criteria to measure theeffectiveness of these clustering algorithms The environmentof experiment is the same for all clustering algorithms

Figure 6 shows the convergence of these algorithms onfour datasets for typical runs of the algorithms The fitnessfunctions of CSPSO decline slowly at the beginning ofthe evolutionary process and continue to decline ratherthan dropping into local optimum at the middle of theevolutionary process possibly because DDT gives the par-ticles high velocities Local search based on Euclidean andfitness neighborhood structure also helps the particles findbetter positions This phenomenon is more evident on high-dimensional datasets such as the Glass dataset

To be more clear Figure 7 gives more details aboutthe convergence of these algorithms near the end of thesearch process on three out of the four same datasets

10 Mathematical Problems in Engineering

Table 7 Description of the datasets used in the experiments

Datasets Clusters (119870) Features (119889) Total instances (119873) Instances in each clustersData 3 2 3 2 76 (252526)Data 5 2 5 2 250 (5050505050)Data 10 2 10 2 500 (50505050505050505050)Data 4 3 4 3 400 (100100100100)Iris 3 4 150 (505050)Glass 6 9 214 (70177613929)CMC 3 9 1473 (629334510)Wine 3 13 178 (597148)

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

CSPSOPSO

SPSOFER-PSO

104

103

102

101

100

10minus1

10minus2

(a) Sphere function

CSPSOPSO

SPSOFER-PSO

Iteration 0 20 40 60 80 100 120 140 160 180 200

Bes

t Val

ues

103

102

101

100

(b) Rastrigin function

CSPSOPSO

SPSOFER-PSO

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

105

100

10minus10

10minus5

10minus15

10minus20

(c) Griewank function

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

103

102

101

100

CSPSOPSO

SPSOFER-PSO

(d) Rosenbrock function

Figure 4 Convergence of the algorithms on some functions (D = 10)

CSPSO has the best performance among all these clusteringalgorithms ABC is the first converging to a local optimalpoint possibly because of the influence of randomly selectedneighbors GABC has a poor performance for large datavolumes such as the Data 5 2 and Glass datasets possibly

due to the local search behavior of onlooker bees FER-PSOhas apparently better convergence performance than otheralgorithms possibly due to the strategy that only one of theneighbors is selected to guide the search of a particle but it iseasily trapped in local optimal points in the later generations

Mathematical Problems in Engineering 11

Data_5_2 Data_4_34 6 8 10 12 14 16

4

6

8

10

12

14

16

2015

105

0minus5minus5

05

1015

0

5

10

15

20

minus520

Figure 5 Distributions of data points in some datasets

of the search process SPSO shows a better performance insearching for a global optimum but its local search strategy isnot working on high-dimensional datasets The above resultsand discussions reveal the different performances amongPSO ABC FER-PSO GABC SPSO and CSPSO and verifythe good performance of CSPSO as a clustering algorithm

It can be seen from Table 8 that CSPSO has a betterperformance on these classical clustering problems For theData 3 2 Data 5 2 and Data 10 2 datasets CSPSO obtainedlower mean fitness values than others but other algorithmsperform better than CSPSO on the best fitness values Thisis possibly due to the randomness of the swarm intelligencealgorithms For the Data 4 3 datasets because of their largenumbers of data points the ability of local search in CSPSOis affected by the selected neighbors ABC has better stabilityon this dataset but has a poor ability in searching for aglobal optimum than the others due to the random strategyCSPSO outperforms the other algorithms and obtains lowermean fitness values than others on the Iris Glass CMC andWine datasets Therefore it has better performance on high-dimensional data This proposed algorithm with embeddedDDT and chaotic mapping has better global search abilityand is more stable than traditional PSO through the use ofa local search method based on starling birds Furthermoreas the sizes of the datasets increase in both the dimensionand number of data points CSPSO has better performancethan PSO ABC GABC FER-PSO and SPSO It is morecapable of dealing with big data in the current informationage

6 Conclusions

Because the performance of standard PSO depends onits parameter values parameter adjustment is one of themost useful ways for PSO to obtain good exploration andexploitation abilities CSPSO proposed in this study enhancesthe overall convergence of the searching process Nonlinearadjustment of the inertia weight and the chaotic search

method based on logistic mapping enhance the global searchperformance of the traditional PSO and help particles escapefrom local optima DDT added to velocity update increasesthe velocities of the particles in later iterations and also helpsparticles escape from local optima The local search strategybased on behavior of starling birds utilizes the information ofthe nearest neighbors in the neighborhood and determinesa new collective position and a new collective velocity Thetwoparticle selectionmethods Euclideandistance and fitnessfunction value maintain population diversity of the particlesThese improvements help particles avoid stagnation and findbetter solutions in the search space

The results of simulation experiments in both benchmarkfunction optimization and classical clustering problems showthe effectiveness of CSPSO Compared with the traditionalPSO and other evolutionary algorithms CSPSO has betterconvergence properties and stronger global search abilityHowever as for other swarm intelligence algorithms CSPSOmay be trapped into and may stagnate around local optimalpoints and therefore may be unstable when applied toproblemswith multiple local optima In future works CSPSOwill be applied to other combinatorial optimization problemssuch as scheduling problems Other metaheuristic methodsand optimization techniques may also be used to improve theperformance of PSO

Data Availability

All the datasets used in the simulation experiments camefrom the Artificial Datasets available at httpwwwisicalacinsimsanghamidatahtml (accessed June 2017) and from theUCIMachine Learning Repository available at httparchiveicsuciedumldatasetshtml (accessed June 2017)

Conflicts of Interest

The authors declare that there are no conflicts of interestregarding the publication of this paper

12 Mathematical Problems in Engineering

CSPSOPSOSPSO

FER-PSOGABCABC

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

46

48

50

52

54

56

58

60

(a) Data 3 2

CSPSOPSOSPSO

FER-PSOGABCABC

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

320

340

360

380

400

420

440

460

480

(b) Data 5 2

CSPSOPSOSPSO

FER-PSOGABCABC

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

90

100

110

120

130

140

150

160

170

180

190

(c) Iris

CSPSOPSOSPSO

FER-PSOGABCABC

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

250

300

350

400

450

500

550

600

650

(d) Glass

Figure 6 Convergence of the algorithms on some datasets

Iteration 19965 1997 19975 1998 19985 1999 19995 200

Best

Valu

es

47515

4752

47525

4753

47535

4754

Iteration 188 190 192 194 196 198 200

Best

Valu

es

322323324325326327328329330331

Iteration 188 190 192 194 196 198 200

Best

Valu

es

94

95

96

97

98

99

Figure 7 Convergence of the algorithms on the Data 3 2 Data 5 2 and Iris datasets

Mathematical Problems in Engineering 13

Table 8 Comparison of performance of CSPSO with other clustering algorithms

Datasets Parameters AlgorithmsPSO ABC FERminusPSO GABC SPSO CSPSO

Data 3 2

Best 475287 477630 475290 475282 475733 475282Worst 591935 601039 497856 591895 531721 475382Mean 484644 509702 478771 482279 478934 475300SD 31956 31345 04967 27975 08551 00022

Data 5 2

Best 3264386 3375538 3269987 3297155 3264666 3264432Worst 3922899 4110786 3326651 3648400 3270877 3269676Mean 3296249 3736977 3287897 3435928 3267493 3265300SD 129978 201046 11594 96214 01536 00942

Data 10 2

Best 10793e+03 9164320 8756264 13134e+03 8432274 8498712Worst 14459e+03 12718e+03 12068e+03 11657e+03 11827e+03 10141e+03Mean 12875e+03 10608e+03 10066e+03 638046 9418921 9182542SD 704484 761788 671964 13134e+03 855520 505344

Data 4 3

Best 7518897 10937e+03 8305456 7504056 7499506 7496738Worst 12866e+03 16347e+03 13753e+03 18485e+03 12726e+03 12725e+03Mean 8388670 13838e+03 10312e+03 11489e+03 7938641 7845028SD 1818515 1223579 1500554 2713938 1425136 1251508

Iris

Best 1005216 1131201 980975 966829 966605 966605Worst 1306931 1568960 1029780 1276869 986906 980888Mean 1137655 1331161 1003761 1050487 969465 969194SD 71771 110706 12211 112307 03921 02931

Glass

Best 3291487 3337214 2786565 2885152 2353554 2296544Worst 4365615 5031607 3265351 4025881 3090997 2789187Mean 3764035 4223809 2973875 3464078 2805624 2570244SD 215371 320309 111730 269515 145045 103633

CMC

Best 58449e+03 59895e+03 56153e+03 56704e+03 55615e+03 55404e+03Worst 69260e+03 72137e+03 59728e+03 64143e+03 57676e+03 55861e+03Mean 61772e+03 64821e+03 57297e+03 58864e+03 56426e+03 55562e+03SD 1867542 2432599 730180 1408947 446957 87806

Wine

Best 16308e+04 16399e+04 16312e+04 16309e+04 16305e+04 16298e+04Worst 16798e+04 17317e+04 16376e+04 16482e+04 16356e+04 16313e+04Mean 16438e+04 16705e+04 16336e+04 16355e+04 16322e+04 16304e+04SD 950097 2197057 159233 354218 120030 33799

Acknowledgments

This research project was partially supported by the NationalNatural Science Foundation of China (61472231 6150228361640201) Humanities and Social Science Research Fund ofthe Ministry of Education of China (12YJA630152) and theShandong Social Science Foundation of China (16BGLJ0611CGLJ22)

References

[1] X LiuH Liu andHDuan ldquoParticle swarmoptimizationbasedon dynamic niche technology with applications to conceptualdesignrdquo Advances in Engineering Software vol 38 no 10 pp668ndash676 2007

[2] Z Bao and TWatanabe ldquoMixed constrained image filter designusing particle swarm optimizationrdquo Artificial Life and Roboticsvol 15 no 3 pp 363ndash368 2010

[3] S Sahin andM A Cavuslu ldquoFPGA Implementation ofWaveletNeural Network Training with PSOiPSOrdquo Journal of CircuitsSystems and Computers vol 27 no 6 2018

[4] D Wu and H Gao ldquoA BP and Switching PSO Based Optimiza-tion Approach for Engine Optimizationrdquo National Academy ofScience Letters vol 40 no 1 pp 33ndash37 2017

[5] Q Jia and Y Guo ldquoHybridization of ABC and PSO algorithmsfor improved solutions of RCPSPrdquo Journal of the ChineseInstitute of Engineers Transactions of the Chinese Institute ofEngineersSeries AChung-kuo Kung Chrsquoeng Hsuch Krsquoan vol 39no 6 pp 727ndash734 2016

[6] C Yue B Qu and J Liang ldquoA Multi-objective Particle SwarmOptimizer Using Ring Topology for Solving Multimodal Multi-objective Problemsrdquo IEEE Transactions on Evolutionary Com-putation 2017

[7] X Liu and J Xue ldquoA Cluster Splitting Technique by HopfieldNetworks and P Systems on Simplicesrdquo Neural ProcessingLetters pp 1ndash24 2017

14 Mathematical Problems in Engineering

[8] Y Zhao X Liu and W Wang ldquoSpiking neural P systems withneuron division and dissolutionrdquo PLoS ONE vol 11 no 9Article ID e0162882 2016

[9] X Liu Y Zhao and M Sun ldquoAn Improved Apriori AlgorithmBased on an Evolution-Communication Tissue-Like P Systemwith Promoters and Inhibitorsrdquo Discrete Dynamics in Natureand Society vol 2017 2017

[10] N Netjinda T Achalakul and B Sirinaovakul ldquoParticle SwarmOptimization inspired by starling flock behaviorrdquo Applied SoftComputing vol 35 pp 411ndash422 2015

[11] A El Dor D Lemoine M Clerc P Siarry L Deroussi andMGourgand ldquoDynamic cluster in particle swarm optimizationalgorithmrdquo Natural Computing vol 14 no 4 pp 655ndash672 2015

[12] Y M B Ali ldquoUnsupervised Clustering Based an AdaptiveParticle Swarm Optimization Algorithmrdquo Neural ProcessingLetters vol 44 no 1 pp 221ndash244 2016

[13] G Armano and M R Farmani ldquoMultiobjective clusteringanalysis using particle swarm optimizationrdquo Expert Systemswith Applications vol 55 pp 184ndash193 2016

[14] B Niu Q Duan L Tan C Liu and P Liang ldquoA Population-Based Clustering Technique Using Particle Swarm Optimiza-tion and K-Meansrdquo in Advances in Swarm and ComputationalIntelligence vol 9140 of Lecture Notes in Computer Science pp145ndash152 Springer International Publishing Cham 2015

[15] X Zhao W Lin J Hao X Zuo and J Yuan ldquoClustering andpattern search for enhancing particle swarm optimization withEuclidean spatial neighborhood searchrdquo Neurocomputing vol171 pp 966ndash981 2016

[16] E Comak ldquoA modified particle swarm optimization algorithmusing Renyi entropy-based clusteringrdquo Neural Computing andApplications vol 27 no 5 pp 1381ndash1390 2016

[17] K K Bharti and P K Singh ldquoOpposition chaotic fitness muta-tion based adaptive inertia weight BPSO for feature selection intext clusteringrdquoApplied SoftComputing vol 43 pp 20ndash34 2016

[18] W Song W Ma and Y Qiao ldquoParticle swarm optimizationalgorithm with environmental factors for clustering analysisrdquoSoft Computing vol 21 no 2 pp 283ndash293 2017

[19] R Liu J Li J Fan C Mu and L Jiao ldquoA coevolutionarytechnique based on multi-swarm particle swarm optimizationfor dynamic multi-objective optimizationrdquo European Journal ofOperational Research vol 261 no 3 pp 1028ndash1051 2017

[20] X Chu B Niu J J Liang and Q Lu ldquoAn orthogonal-designhybrid particle swarm optimiser with application to capacitatedfacility location problemrdquo International Journal of Bio-InspiredComputation vol 8 no 5 pp 268ndash285 2016

[21] DWang D Tan and L Liu ldquoParticle Swarm Optimization Anoverviewrdquo Soft Computing vol 22 no 2 p 22 2018

[22] D Karaboga ldquoAn Idea Based on Honey Bee Swarm for Numer-ical Optimizationrdquo Technical Report TR0 2005

[23] X Li ldquoA multimodal particle swarm optimizer based on fitnessEuclidean-distance ratiordquo in Proceedings of the the 9th annualconference pp 78ndash85 London England July 2007

[24] G Zhu and S Kwong ldquoGbest-guided artificial bee colonyalgorithm for numerical function optimizationrdquo Applied Math-ematics and Computation vol 217 no 7 pp 3166ndash3173 2010

[25] A Laudani F R Fulginei G M Lozito and A SalvinildquoSwarmflock optimization algorithms as continuous dynamicsystemsrdquo Applied Mathematics and Computation vol 243 pp670ndash683 2014

[26] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 Perth Australia December 1995

[27] K K Bharti and P K Singh ldquoChaotic gradient artificial beecolony for text clusteringrdquo Soft Computing vol 20 no 3 pp1113ndash1126 2016

[28] RMMay ldquoSimplemathematicalmodelswith very complicateddynamicsrdquo Nature vol 261 no 5560 pp 459ndash467 1976

[29] F S Hover and M S Triantafyllou ldquoApplication of polynomialchaos in stability and controlrdquo Automatica vol 42 no 5 pp789ndash795 2006

[30] X Liu L Xiang and X Wang ldquoSpatial cluster analysis bythe adleman-lipton DNA computing model and flexible gridsrdquoDiscrete Dynamics in Nature and Society vol 2012 Article ID894207 32 pages 2012

[31] H Peng X Luo ZGao JWang andZ Pei ldquoANovel ClusteringAlgorithm Inspired by Membrane Computingrdquo The ScientificWorld Journal vol 3 no 22 Article ID 929471 8 pages 2015

[32] B Ufnalski and L M Grzesiak ldquoPlug-in direct particle swarmrepetitive controller with a reduced dimensionality of a fitnesslandscape - a multi-swarm approachrdquo Bulletin of the PolishAcademy of SciencesmdashTechnical Sciences vol 63 no 4 pp 857ndash866 2015

[33] A O Griewank ldquoGeneralized descent for global optimizationrdquoJournal of Optimization Theory and Applications vol 34 no 1pp 11ndash39 1981

[34] "Datasets," http://www.isical.ac.in/~sanghami/data.html.

[35] H. C. Blake, UCI Repository of Machine Learning Databases, 1998.




Hindawiwwwhindawicom Volume 2018

MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Mathematical Problems in Engineering

Applied MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Probability and StatisticsHindawiwwwhindawicom Volume 2018

Journal of

Hindawiwwwhindawicom Volume 2018

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawiwwwhindawicom Volume 2018

OptimizationJournal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Engineering Mathematics

International Journal of

Hindawiwwwhindawicom Volume 2018

Operations ResearchAdvances in

Journal of

Hindawiwwwhindawicom Volume 2018

Function SpacesAbstract and Applied AnalysisHindawiwwwhindawicom Volume 2018

International Journal of Mathematics and Mathematical Sciences

Hindawiwwwhindawicom Volume 2018

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom

The Scientific World Journal

Volume 2018

Hindawiwwwhindawicom Volume 2018Volume 2018

Numerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisAdvances inAdvances in Discrete Dynamics in

Nature and SocietyHindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom

Dierential EquationsInternational Journal of

Volume 2018

Hindawiwwwhindawicom Volume 2018

Decision SciencesAdvances in

Hindawiwwwhindawicom Volume 2018

AnalysisInternational Journal of

Hindawiwwwhindawicom Volume 2018

Stochastic AnalysisInternational Journal of

Submit your manuscripts atwwwhindawicom

Page 11: A New Chaotic Starling Particle Swarm Optimization ...downloads.hindawi.com/journals/mpe/2018/8250480.pdfMathematicalProblemsinEngineering Chaotic Parameter u 2.5 3 3.5 4 Chaotic Value

Mathematical Problems in Engineering 11

Data_5_2 Data_4_34 6 8 10 12 14 16

4

6

8

10

12

14

16

2015

105

0minus5minus5

05

1015

0

5

10

15

20

minus520

Figure 5 Distributions of data points in some datasets

of the search process SPSO shows a better performance insearching for a global optimum but its local search strategy isnot working on high-dimensional datasets The above resultsand discussions reveal the different performances amongPSO ABC FER-PSO GABC SPSO and CSPSO and verifythe good performance of CSPSO as a clustering algorithm

It can be seen from Table 8 that CSPSO has a betterperformance on these classical clustering problems For theData 3 2 Data 5 2 and Data 10 2 datasets CSPSO obtainedlower mean fitness values than others but other algorithmsperform better than CSPSO on the best fitness values Thisis possibly due to the randomness of the swarm intelligencealgorithms For the Data 4 3 datasets because of their largenumbers of data points the ability of local search in CSPSOis affected by the selected neighbors ABC has better stabilityon this dataset but has a poor ability in searching for aglobal optimum than the others due to the random strategyCSPSO outperforms the other algorithms and obtains lowermean fitness values than others on the Iris Glass CMC andWine datasets Therefore it has better performance on high-dimensional data This proposed algorithm with embeddedDDT and chaotic mapping has better global search abilityand is more stable than traditional PSO through the use ofa local search method based on starling birds Furthermoreas the sizes of the datasets increase in both the dimensionand number of data points CSPSO has better performancethan PSO ABC GABC FER-PSO and SPSO It is morecapable of dealing with big data in the current informationage

6 Conclusions

Because the performance of standard PSO depends onits parameter values parameter adjustment is one of themost useful ways for PSO to obtain good exploration andexploitation abilities CSPSO proposed in this study enhancesthe overall convergence of the searching process Nonlinearadjustment of the inertia weight and the chaotic search

method based on logistic mapping enhance the global searchperformance of the traditional PSO and help particles escapefrom local optima DDT added to velocity update increasesthe velocities of the particles in later iterations and also helpsparticles escape from local optima The local search strategybased on behavior of starling birds utilizes the information ofthe nearest neighbors in the neighborhood and determinesa new collective position and a new collective velocity Thetwoparticle selectionmethods Euclideandistance and fitnessfunction value maintain population diversity of the particlesThese improvements help particles avoid stagnation and findbetter solutions in the search space

The results of simulation experiments in both benchmarkfunction optimization and classical clustering problems showthe effectiveness of CSPSO Compared with the traditionalPSO and other evolutionary algorithms CSPSO has betterconvergence properties and stronger global search abilityHowever as for other swarm intelligence algorithms CSPSOmay be trapped into and may stagnate around local optimalpoints and therefore may be unstable when applied toproblemswith multiple local optima In future works CSPSOwill be applied to other combinatorial optimization problemssuch as scheduling problems Other metaheuristic methodsand optimization techniques may also be used to improve theperformance of PSO

Data Availability

All the datasets used in the simulation experiments camefrom the Artificial Datasets available at httpwwwisicalacinsimsanghamidatahtml (accessed June 2017) and from theUCIMachine Learning Repository available at httparchiveicsuciedumldatasetshtml (accessed June 2017)

Conflicts of Interest

The authors declare that there are no conflicts of interestregarding the publication of this paper

12 Mathematical Problems in Engineering

CSPSOPSOSPSO

FER-PSOGABCABC

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

46

48

50

52

54

56

58

60

(a) Data 3 2

CSPSOPSOSPSO

FER-PSOGABCABC

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

320

340

360

380

400

420

440

460

480

(b) Data 5 2

CSPSOPSOSPSO

FER-PSOGABCABC

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

90

100

110

120

130

140

150

160

170

180

190

(c) Iris

CSPSOPSOSPSO

FER-PSOGABCABC

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

250

300

350

400

450

500

550

600

650

(d) Glass

Figure 6 Convergence of the algorithms on some datasets

Iteration 19965 1997 19975 1998 19985 1999 19995 200

Best

Valu

es

47515

4752

47525

4753

47535

4754

Iteration 188 190 192 194 196 198 200

Best

Valu

es

322323324325326327328329330331

Iteration 188 190 192 194 196 198 200

Best

Valu

es

94

95

96

97

98

99

Figure 7 Convergence of the algorithms on the Data 3 2 Data 5 2 and Iris datasets

Mathematical Problems in Engineering 13

Table 8 Comparison of performance of CSPSO with other clustering algorithms

Datasets Parameters AlgorithmsPSO ABC FERminusPSO GABC SPSO CSPSO

Data 3 2

Best 475287 477630 475290 475282 475733 475282Worst 591935 601039 497856 591895 531721 475382Mean 484644 509702 478771 482279 478934 475300SD 31956 31345 04967 27975 08551 00022

Data 5 2

Best 3264386 3375538 3269987 3297155 3264666 3264432Worst 3922899 4110786 3326651 3648400 3270877 3269676Mean 3296249 3736977 3287897 3435928 3267493 3265300SD 129978 201046 11594 96214 01536 00942

Data 10 2

Best 10793e+03 9164320 8756264 13134e+03 8432274 8498712Worst 14459e+03 12718e+03 12068e+03 11657e+03 11827e+03 10141e+03Mean 12875e+03 10608e+03 10066e+03 638046 9418921 9182542SD 704484 761788 671964 13134e+03 855520 505344

Data 4 3

Best 7518897 10937e+03 8305456 7504056 7499506 7496738Worst 12866e+03 16347e+03 13753e+03 18485e+03 12726e+03 12725e+03Mean 8388670 13838e+03 10312e+03 11489e+03 7938641 7845028SD 1818515 1223579 1500554 2713938 1425136 1251508

Iris

Best 1005216 1131201 980975 966829 966605 966605Worst 1306931 1568960 1029780 1276869 986906 980888Mean 1137655 1331161 1003761 1050487 969465 969194SD 71771 110706 12211 112307 03921 02931

Glass

Best 3291487 3337214 2786565 2885152 2353554 2296544Worst 4365615 5031607 3265351 4025881 3090997 2789187Mean 3764035 4223809 2973875 3464078 2805624 2570244SD 215371 320309 111730 269515 145045 103633

CMC

Best 58449e+03 59895e+03 56153e+03 56704e+03 55615e+03 55404e+03Worst 69260e+03 72137e+03 59728e+03 64143e+03 57676e+03 55861e+03Mean 61772e+03 64821e+03 57297e+03 58864e+03 56426e+03 55562e+03SD 1867542 2432599 730180 1408947 446957 87806

Wine

Best 16308e+04 16399e+04 16312e+04 16309e+04 16305e+04 16298e+04Worst 16798e+04 17317e+04 16376e+04 16482e+04 16356e+04 16313e+04Mean 16438e+04 16705e+04 16336e+04 16355e+04 16322e+04 16304e+04SD 950097 2197057 159233 354218 120030 33799

Acknowledgments

This research project was partially supported by the NationalNatural Science Foundation of China (61472231 6150228361640201) Humanities and Social Science Research Fund ofthe Ministry of Education of China (12YJA630152) and theShandong Social Science Foundation of China (16BGLJ0611CGLJ22)

References

[1] X LiuH Liu andHDuan ldquoParticle swarmoptimizationbasedon dynamic niche technology with applications to conceptualdesignrdquo Advances in Engineering Software vol 38 no 10 pp668ndash676 2007

[2] Z Bao and TWatanabe ldquoMixed constrained image filter designusing particle swarm optimizationrdquo Artificial Life and Roboticsvol 15 no 3 pp 363ndash368 2010

[3] S Sahin andM A Cavuslu ldquoFPGA Implementation ofWaveletNeural Network Training with PSOiPSOrdquo Journal of CircuitsSystems and Computers vol 27 no 6 2018

[4] D Wu and H Gao ldquoA BP and Switching PSO Based Optimiza-tion Approach for Engine Optimizationrdquo National Academy ofScience Letters vol 40 no 1 pp 33ndash37 2017

[5] Q Jia and Y Guo ldquoHybridization of ABC and PSO algorithmsfor improved solutions of RCPSPrdquo Journal of the ChineseInstitute of Engineers Transactions of the Chinese Institute ofEngineersSeries AChung-kuo Kung Chrsquoeng Hsuch Krsquoan vol 39no 6 pp 727ndash734 2016

[6] C Yue B Qu and J Liang ldquoA Multi-objective Particle SwarmOptimizer Using Ring Topology for Solving Multimodal Multi-objective Problemsrdquo IEEE Transactions on Evolutionary Com-putation 2017

[7] X Liu and J Xue ldquoA Cluster Splitting Technique by HopfieldNetworks and P Systems on Simplicesrdquo Neural ProcessingLetters pp 1ndash24 2017

14 Mathematical Problems in Engineering

[8] Y Zhao X Liu and W Wang ldquoSpiking neural P systems withneuron division and dissolutionrdquo PLoS ONE vol 11 no 9Article ID e0162882 2016

[9] X Liu Y Zhao and M Sun ldquoAn Improved Apriori AlgorithmBased on an Evolution-Communication Tissue-Like P Systemwith Promoters and Inhibitorsrdquo Discrete Dynamics in Natureand Society vol 2017 2017

[10] N Netjinda T Achalakul and B Sirinaovakul ldquoParticle SwarmOptimization inspired by starling flock behaviorrdquo Applied SoftComputing vol 35 pp 411ndash422 2015

[11] A El Dor D Lemoine M Clerc P Siarry L Deroussi andMGourgand ldquoDynamic cluster in particle swarm optimizationalgorithmrdquo Natural Computing vol 14 no 4 pp 655ndash672 2015

[12] Y M B Ali ldquoUnsupervised Clustering Based an AdaptiveParticle Swarm Optimization Algorithmrdquo Neural ProcessingLetters vol 44 no 1 pp 221ndash244 2016

[13] G Armano and M R Farmani ldquoMultiobjective clusteringanalysis using particle swarm optimizationrdquo Expert Systemswith Applications vol 55 pp 184ndash193 2016

[14] B Niu Q Duan L Tan C Liu and P Liang ldquoA Population-Based Clustering Technique Using Particle Swarm Optimiza-tion and K-Meansrdquo in Advances in Swarm and ComputationalIntelligence vol 9140 of Lecture Notes in Computer Science pp145ndash152 Springer International Publishing Cham 2015

[15] X Zhao W Lin J Hao X Zuo and J Yuan ldquoClustering andpattern search for enhancing particle swarm optimization withEuclidean spatial neighborhood searchrdquo Neurocomputing vol171 pp 966ndash981 2016

[16] E Comak ldquoA modified particle swarm optimization algorithmusing Renyi entropy-based clusteringrdquo Neural Computing andApplications vol 27 no 5 pp 1381ndash1390 2016

[17] K K Bharti and P K Singh ldquoOpposition chaotic fitness muta-tion based adaptive inertia weight BPSO for feature selection intext clusteringrdquoApplied SoftComputing vol 43 pp 20ndash34 2016

[18] W Song W Ma and Y Qiao ldquoParticle swarm optimizationalgorithm with environmental factors for clustering analysisrdquoSoft Computing vol 21 no 2 pp 283ndash293 2017

[19] R Liu J Li J Fan C Mu and L Jiao ldquoA coevolutionarytechnique based on multi-swarm particle swarm optimizationfor dynamic multi-objective optimizationrdquo European Journal ofOperational Research vol 261 no 3 pp 1028ndash1051 2017

[20] X Chu B Niu J J Liang and Q Lu ldquoAn orthogonal-designhybrid particle swarm optimiser with application to capacitatedfacility location problemrdquo International Journal of Bio-InspiredComputation vol 8 no 5 pp 268ndash285 2016

[21] DWang D Tan and L Liu ldquoParticle Swarm Optimization Anoverviewrdquo Soft Computing vol 22 no 2 p 22 2018

[22] D Karaboga ldquoAn Idea Based on Honey Bee Swarm for Numer-ical Optimizationrdquo Technical Report TR0 2005

[23] X Li ldquoA multimodal particle swarm optimizer based on fitnessEuclidean-distance ratiordquo in Proceedings of the the 9th annualconference pp 78ndash85 London England July 2007

[24] G Zhu and S Kwong ldquoGbest-guided artificial bee colonyalgorithm for numerical function optimizationrdquo Applied Math-ematics and Computation vol 217 no 7 pp 3166ndash3173 2010

[25] A Laudani F R Fulginei G M Lozito and A SalvinildquoSwarmflock optimization algorithms as continuous dynamicsystemsrdquo Applied Mathematics and Computation vol 243 pp670ndash683 2014

[26] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 Perth Australia December 1995

[27] K K Bharti and P K Singh ldquoChaotic gradient artificial beecolony for text clusteringrdquo Soft Computing vol 20 no 3 pp1113ndash1126 2016

[28] RMMay ldquoSimplemathematicalmodelswith very complicateddynamicsrdquo Nature vol 261 no 5560 pp 459ndash467 1976

[29] F S Hover and M S Triantafyllou ldquoApplication of polynomialchaos in stability and controlrdquo Automatica vol 42 no 5 pp789ndash795 2006

[30] X Liu L Xiang and X Wang ldquoSpatial cluster analysis bythe adleman-lipton DNA computing model and flexible gridsrdquoDiscrete Dynamics in Nature and Society vol 2012 Article ID894207 32 pages 2012

[31] H Peng X Luo ZGao JWang andZ Pei ldquoANovel ClusteringAlgorithm Inspired by Membrane Computingrdquo The ScientificWorld Journal vol 3 no 22 Article ID 929471 8 pages 2015

[32] B Ufnalski and L M Grzesiak ldquoPlug-in direct particle swarmrepetitive controller with a reduced dimensionality of a fitnesslandscape - a multi-swarm approachrdquo Bulletin of the PolishAcademy of SciencesmdashTechnical Sciences vol 63 no 4 pp 857ndash866 2015

[33] A O Griewank ldquoGeneralized descent for global optimizationrdquoJournal of Optimization Theory and Applications vol 34 no 1pp 11ndash39 1981

[34] ldquoDatasetsrdquo httpwwwisicalacinsimsanghamidatahtml[35] H C Blake UCI Repository of Machine Learning Databases

1998

Hindawiwwwhindawicom Volume 2018

MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Mathematical Problems in Engineering

Applied MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Probability and StatisticsHindawiwwwhindawicom Volume 2018

Journal of

Hindawiwwwhindawicom Volume 2018

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawiwwwhindawicom Volume 2018

OptimizationJournal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Engineering Mathematics

International Journal of

Hindawiwwwhindawicom Volume 2018

Operations ResearchAdvances in

Journal of

Hindawiwwwhindawicom Volume 2018

Function SpacesAbstract and Applied AnalysisHindawiwwwhindawicom Volume 2018

International Journal of Mathematics and Mathematical Sciences

Hindawiwwwhindawicom Volume 2018

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom

The Scientific World Journal

Volume 2018

Hindawiwwwhindawicom Volume 2018Volume 2018

Numerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisAdvances inAdvances in Discrete Dynamics in

Nature and SocietyHindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom

Dierential EquationsInternational Journal of

Volume 2018

Hindawiwwwhindawicom Volume 2018

Decision SciencesAdvances in

Hindawiwwwhindawicom Volume 2018

AnalysisInternational Journal of

Hindawiwwwhindawicom Volume 2018

Stochastic AnalysisInternational Journal of

Submit your manuscripts atwwwhindawicom

Page 12: A New Chaotic Starling Particle Swarm Optimization ...downloads.hindawi.com/journals/mpe/2018/8250480.pdfMathematicalProblemsinEngineering Chaotic Parameter u 2.5 3 3.5 4 Chaotic Value

12 Mathematical Problems in Engineering

CSPSOPSOSPSO

FER-PSOGABCABC

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

46

48

50

52

54

56

58

60

(a) Data 3 2

CSPSOPSOSPSO

FER-PSOGABCABC

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

320

340

360

380

400

420

440

460

480

(b) Data 5 2

CSPSOPSOSPSO

FER-PSOGABCABC

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

90

100

110

120

130

140

150

160

170

180

190

(c) Iris

CSPSOPSOSPSO

FER-PSOGABCABC

Iteration 0 20 40 60 80 100 120 140 160 180 200

Best

Valu

es

250

300

350

400

450

500

550

600

650

(d) Glass

Figure 6 Convergence of the algorithms on some datasets

Iteration 19965 1997 19975 1998 19985 1999 19995 200

Best

Valu

es

47515

4752

47525

4753

47535

4754

Iteration 188 190 192 194 196 198 200

Best

Valu

es

322323324325326327328329330331

Iteration 188 190 192 194 196 198 200

Best

Valu

es

94

95

96

97

98

99

Figure 7 Convergence of the algorithms on the Data 3 2 Data 5 2 and Iris datasets

Mathematical Problems in Engineering 13

Table 8 Comparison of performance of CSPSO with other clustering algorithms

Datasets Parameters AlgorithmsPSO ABC FERminusPSO GABC SPSO CSPSO

Data 3 2

Best 475287 477630 475290 475282 475733 475282Worst 591935 601039 497856 591895 531721 475382Mean 484644 509702 478771 482279 478934 475300SD 31956 31345 04967 27975 08551 00022

Data 5 2

Best 3264386 3375538 3269987 3297155 3264666 3264432Worst 3922899 4110786 3326651 3648400 3270877 3269676Mean 3296249 3736977 3287897 3435928 3267493 3265300SD 129978 201046 11594 96214 01536 00942

Data 10 2

Best 10793e+03 9164320 8756264 13134e+03 8432274 8498712Worst 14459e+03 12718e+03 12068e+03 11657e+03 11827e+03 10141e+03Mean 12875e+03 10608e+03 10066e+03 638046 9418921 9182542SD 704484 761788 671964 13134e+03 855520 505344

Data 4 3

Best 7518897 10937e+03 8305456 7504056 7499506 7496738Worst 12866e+03 16347e+03 13753e+03 18485e+03 12726e+03 12725e+03Mean 8388670 13838e+03 10312e+03 11489e+03 7938641 7845028SD 1818515 1223579 1500554 2713938 1425136 1251508

Iris

Best 1005216 1131201 980975 966829 966605 966605Worst 1306931 1568960 1029780 1276869 986906 980888Mean 1137655 1331161 1003761 1050487 969465 969194SD 71771 110706 12211 112307 03921 02931

Glass

Best 3291487 3337214 2786565 2885152 2353554 2296544Worst 4365615 5031607 3265351 4025881 3090997 2789187Mean 3764035 4223809 2973875 3464078 2805624 2570244SD 215371 320309 111730 269515 145045 103633

CMC

Best 58449e+03 59895e+03 56153e+03 56704e+03 55615e+03 55404e+03Worst 69260e+03 72137e+03 59728e+03 64143e+03 57676e+03 55861e+03Mean 61772e+03 64821e+03 57297e+03 58864e+03 56426e+03 55562e+03SD 1867542 2432599 730180 1408947 446957 87806

Wine

Best 16308e+04 16399e+04 16312e+04 16309e+04 16305e+04 16298e+04Worst 16798e+04 17317e+04 16376e+04 16482e+04 16356e+04 16313e+04Mean 16438e+04 16705e+04 16336e+04 16355e+04 16322e+04 16304e+04SD 950097 2197057 159233 354218 120030 33799

Acknowledgments

This research project was partially supported by the NationalNatural Science Foundation of China (61472231 6150228361640201) Humanities and Social Science Research Fund ofthe Ministry of Education of China (12YJA630152) and theShandong Social Science Foundation of China (16BGLJ0611CGLJ22)

References

[1] X LiuH Liu andHDuan ldquoParticle swarmoptimizationbasedon dynamic niche technology with applications to conceptualdesignrdquo Advances in Engineering Software vol 38 no 10 pp668ndash676 2007

[2] Z Bao and TWatanabe ldquoMixed constrained image filter designusing particle swarm optimizationrdquo Artificial Life and Roboticsvol 15 no 3 pp 363ndash368 2010

[3] S Sahin andM A Cavuslu ldquoFPGA Implementation ofWaveletNeural Network Training with PSOiPSOrdquo Journal of CircuitsSystems and Computers vol 27 no 6 2018

[4] D Wu and H Gao ldquoA BP and Switching PSO Based Optimiza-tion Approach for Engine Optimizationrdquo National Academy ofScience Letters vol 40 no 1 pp 33ndash37 2017

[5] Q Jia and Y Guo ldquoHybridization of ABC and PSO algorithmsfor improved solutions of RCPSPrdquo Journal of the ChineseInstitute of Engineers Transactions of the Chinese Institute ofEngineersSeries AChung-kuo Kung Chrsquoeng Hsuch Krsquoan vol 39no 6 pp 727ndash734 2016

[6] C Yue B Qu and J Liang ldquoA Multi-objective Particle SwarmOptimizer Using Ring Topology for Solving Multimodal Multi-objective Problemsrdquo IEEE Transactions on Evolutionary Com-putation 2017

[7] X Liu and J Xue ldquoA Cluster Splitting Technique by HopfieldNetworks and P Systems on Simplicesrdquo Neural ProcessingLetters pp 1ndash24 2017

14 Mathematical Problems in Engineering

[8] Y Zhao X Liu and W Wang ldquoSpiking neural P systems withneuron division and dissolutionrdquo PLoS ONE vol 11 no 9Article ID e0162882 2016

[9] X Liu Y Zhao and M Sun ldquoAn Improved Apriori AlgorithmBased on an Evolution-Communication Tissue-Like P Systemwith Promoters and Inhibitorsrdquo Discrete Dynamics in Natureand Society vol 2017 2017

[10] N Netjinda T Achalakul and B Sirinaovakul ldquoParticle SwarmOptimization inspired by starling flock behaviorrdquo Applied SoftComputing vol 35 pp 411ndash422 2015

[11] A El Dor D Lemoine M Clerc P Siarry L Deroussi andMGourgand ldquoDynamic cluster in particle swarm optimizationalgorithmrdquo Natural Computing vol 14 no 4 pp 655ndash672 2015

[12] Y M B Ali ldquoUnsupervised Clustering Based an AdaptiveParticle Swarm Optimization Algorithmrdquo Neural ProcessingLetters vol 44 no 1 pp 221ndash244 2016

[13] G Armano and M R Farmani ldquoMultiobjective clusteringanalysis using particle swarm optimizationrdquo Expert Systemswith Applications vol 55 pp 184ndash193 2016

[14] B Niu Q Duan L Tan C Liu and P Liang ldquoA Population-Based Clustering Technique Using Particle Swarm Optimiza-tion and K-Meansrdquo in Advances in Swarm and ComputationalIntelligence vol 9140 of Lecture Notes in Computer Science pp145ndash152 Springer International Publishing Cham 2015

[15] X Zhao W Lin J Hao X Zuo and J Yuan ldquoClustering andpattern search for enhancing particle swarm optimization withEuclidean spatial neighborhood searchrdquo Neurocomputing vol171 pp 966ndash981 2016

[16] E Comak ldquoA modified particle swarm optimization algorithmusing Renyi entropy-based clusteringrdquo Neural Computing andApplications vol 27 no 5 pp 1381ndash1390 2016

[17] K K Bharti and P K Singh ldquoOpposition chaotic fitness muta-tion based adaptive inertia weight BPSO for feature selection intext clusteringrdquoApplied SoftComputing vol 43 pp 20ndash34 2016

[18] W Song W Ma and Y Qiao ldquoParticle swarm optimizationalgorithm with environmental factors for clustering analysisrdquoSoft Computing vol 21 no 2 pp 283ndash293 2017

[19] R Liu J Li J Fan C Mu and L Jiao ldquoA coevolutionarytechnique based on multi-swarm particle swarm optimizationfor dynamic multi-objective optimizationrdquo European Journal ofOperational Research vol 261 no 3 pp 1028ndash1051 2017

[20] X Chu B Niu J J Liang and Q Lu ldquoAn orthogonal-designhybrid particle swarm optimiser with application to capacitatedfacility location problemrdquo International Journal of Bio-InspiredComputation vol 8 no 5 pp 268ndash285 2016

[21] DWang D Tan and L Liu ldquoParticle Swarm Optimization Anoverviewrdquo Soft Computing vol 22 no 2 p 22 2018

[22] D Karaboga ldquoAn Idea Based on Honey Bee Swarm for Numer-ical Optimizationrdquo Technical Report TR0 2005

[23] X Li ldquoA multimodal particle swarm optimizer based on fitnessEuclidean-distance ratiordquo in Proceedings of the the 9th annualconference pp 78ndash85 London England July 2007

[24] G Zhu and S Kwong ldquoGbest-guided artificial bee colonyalgorithm for numerical function optimizationrdquo Applied Math-ematics and Computation vol 217 no 7 pp 3166ndash3173 2010

[25] A Laudani F R Fulginei G M Lozito and A SalvinildquoSwarmflock optimization algorithms as continuous dynamicsystemsrdquo Applied Mathematics and Computation vol 243 pp670ndash683 2014

[26] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 Perth Australia December 1995

[27] K K Bharti and P K Singh ldquoChaotic gradient artificial beecolony for text clusteringrdquo Soft Computing vol 20 no 3 pp1113ndash1126 2016

[28] RMMay ldquoSimplemathematicalmodelswith very complicateddynamicsrdquo Nature vol 261 no 5560 pp 459ndash467 1976

[29] F S Hover and M S Triantafyllou ldquoApplication of polynomialchaos in stability and controlrdquo Automatica vol 42 no 5 pp789ndash795 2006

[30] X Liu L Xiang and X Wang ldquoSpatial cluster analysis bythe adleman-lipton DNA computing model and flexible gridsrdquoDiscrete Dynamics in Nature and Society vol 2012 Article ID894207 32 pages 2012

[31] H Peng X Luo ZGao JWang andZ Pei ldquoANovel ClusteringAlgorithm Inspired by Membrane Computingrdquo The ScientificWorld Journal vol 3 no 22 Article ID 929471 8 pages 2015

[32] B Ufnalski and L M Grzesiak ldquoPlug-in direct particle swarmrepetitive controller with a reduced dimensionality of a fitnesslandscape - a multi-swarm approachrdquo Bulletin of the PolishAcademy of SciencesmdashTechnical Sciences vol 63 no 4 pp 857ndash866 2015

[33] A O Griewank ldquoGeneralized descent for global optimizationrdquoJournal of Optimization Theory and Applications vol 34 no 1pp 11ndash39 1981

[34] ldquoDatasetsrdquo httpwwwisicalacinsimsanghamidatahtml[35] H C Blake UCI Repository of Machine Learning Databases

1998

Hindawiwwwhindawicom Volume 2018

MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Mathematical Problems in Engineering

Applied MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Probability and StatisticsHindawiwwwhindawicom Volume 2018

Journal of

Hindawiwwwhindawicom Volume 2018

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawiwwwhindawicom Volume 2018

OptimizationJournal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Engineering Mathematics

International Journal of

Hindawiwwwhindawicom Volume 2018

Operations ResearchAdvances in

Journal of

Hindawiwwwhindawicom Volume 2018

Function SpacesAbstract and Applied AnalysisHindawiwwwhindawicom Volume 2018

International Journal of Mathematics and Mathematical Sciences

Hindawiwwwhindawicom Volume 2018

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom

The Scientific World Journal

Volume 2018

Hindawiwwwhindawicom Volume 2018Volume 2018

Numerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisAdvances inAdvances in Discrete Dynamics in

Nature and SocietyHindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom

Dierential EquationsInternational Journal of

Volume 2018

Hindawiwwwhindawicom Volume 2018

Decision SciencesAdvances in

Hindawiwwwhindawicom Volume 2018

AnalysisInternational Journal of

Hindawiwwwhindawicom Volume 2018

Stochastic AnalysisInternational Journal of

Submit your manuscripts atwwwhindawicom

Page 13: A New Chaotic Starling Particle Swarm Optimization ...downloads.hindawi.com/journals/mpe/2018/8250480.pdfMathematicalProblemsinEngineering Chaotic Parameter u 2.5 3 3.5 4 Chaotic Value

Mathematical Problems in Engineering 13

Table 8 Comparison of performance of CSPSO with other clustering algorithms

Datasets Parameters AlgorithmsPSO ABC FERminusPSO GABC SPSO CSPSO

Data 3 2

Best 475287 477630 475290 475282 475733 475282Worst 591935 601039 497856 591895 531721 475382Mean 484644 509702 478771 482279 478934 475300SD 31956 31345 04967 27975 08551 00022

Data 5 2

Best 3264386 3375538 3269987 3297155 3264666 3264432Worst 3922899 4110786 3326651 3648400 3270877 3269676Mean 3296249 3736977 3287897 3435928 3267493 3265300SD 129978 201046 11594 96214 01536 00942

Data 10 2

Best 10793e+03 9164320 8756264 13134e+03 8432274 8498712Worst 14459e+03 12718e+03 12068e+03 11657e+03 11827e+03 10141e+03Mean 12875e+03 10608e+03 10066e+03 638046 9418921 9182542SD 704484 761788 671964 13134e+03 855520 505344

Data 4 3

Best 7518897 10937e+03 8305456 7504056 7499506 7496738Worst 12866e+03 16347e+03 13753e+03 18485e+03 12726e+03 12725e+03Mean 8388670 13838e+03 10312e+03 11489e+03 7938641 7845028SD 1818515 1223579 1500554 2713938 1425136 1251508

Iris

Best 1005216 1131201 980975 966829 966605 966605Worst 1306931 1568960 1029780 1276869 986906 980888Mean 1137655 1331161 1003761 1050487 969465 969194SD 71771 110706 12211 112307 03921 02931

Glass

Best 3291487 3337214 2786565 2885152 2353554 2296544Worst 4365615 5031607 3265351 4025881 3090997 2789187Mean 3764035 4223809 2973875 3464078 2805624 2570244SD 215371 320309 111730 269515 145045 103633

CMC

Best 58449e+03 59895e+03 56153e+03 56704e+03 55615e+03 55404e+03Worst 69260e+03 72137e+03 59728e+03 64143e+03 57676e+03 55861e+03Mean 61772e+03 64821e+03 57297e+03 58864e+03 56426e+03 55562e+03SD 1867542 2432599 730180 1408947 446957 87806

Wine

Best 16308e+04 16399e+04 16312e+04 16309e+04 16305e+04 16298e+04Worst 16798e+04 17317e+04 16376e+04 16482e+04 16356e+04 16313e+04Mean 16438e+04 16705e+04 16336e+04 16355e+04 16322e+04 16304e+04SD 950097 2197057 159233 354218 120030 33799

Acknowledgments

This research project was partially supported by the NationalNatural Science Foundation of China (61472231 6150228361640201) Humanities and Social Science Research Fund ofthe Ministry of Education of China (12YJA630152) and theShandong Social Science Foundation of China (16BGLJ0611CGLJ22)

References

[1] X LiuH Liu andHDuan ldquoParticle swarmoptimizationbasedon dynamic niche technology with applications to conceptualdesignrdquo Advances in Engineering Software vol 38 no 10 pp668ndash676 2007

[2] Z Bao and TWatanabe ldquoMixed constrained image filter designusing particle swarm optimizationrdquo Artificial Life and Roboticsvol 15 no 3 pp 363ndash368 2010

[3] S Sahin andM A Cavuslu ldquoFPGA Implementation ofWaveletNeural Network Training with PSOiPSOrdquo Journal of CircuitsSystems and Computers vol 27 no 6 2018

[4] D Wu and H Gao ldquoA BP and Switching PSO Based Optimiza-tion Approach for Engine Optimizationrdquo National Academy ofScience Letters vol 40 no 1 pp 33ndash37 2017

[5] Q Jia and Y Guo ldquoHybridization of ABC and PSO algorithmsfor improved solutions of RCPSPrdquo Journal of the ChineseInstitute of Engineers Transactions of the Chinese Institute ofEngineersSeries AChung-kuo Kung Chrsquoeng Hsuch Krsquoan vol 39no 6 pp 727ndash734 2016

[6] C Yue B Qu and J Liang ldquoA Multi-objective Particle SwarmOptimizer Using Ring Topology for Solving Multimodal Multi-objective Problemsrdquo IEEE Transactions on Evolutionary Com-putation 2017

[7] X Liu and J Xue ldquoA Cluster Splitting Technique by HopfieldNetworks and P Systems on Simplicesrdquo Neural ProcessingLetters pp 1ndash24 2017

14 Mathematical Problems in Engineering

[8] Y Zhao X Liu and W Wang ldquoSpiking neural P systems withneuron division and dissolutionrdquo PLoS ONE vol 11 no 9Article ID e0162882 2016

[9] X Liu Y Zhao and M Sun ldquoAn Improved Apriori AlgorithmBased on an Evolution-Communication Tissue-Like P Systemwith Promoters and Inhibitorsrdquo Discrete Dynamics in Natureand Society vol 2017 2017

[10] N Netjinda T Achalakul and B Sirinaovakul ldquoParticle SwarmOptimization inspired by starling flock behaviorrdquo Applied SoftComputing vol 35 pp 411ndash422 2015

[11] A El Dor D Lemoine M Clerc P Siarry L Deroussi andMGourgand ldquoDynamic cluster in particle swarm optimizationalgorithmrdquo Natural Computing vol 14 no 4 pp 655ndash672 2015

[12] Y M B Ali ldquoUnsupervised Clustering Based an AdaptiveParticle Swarm Optimization Algorithmrdquo Neural ProcessingLetters vol 44 no 1 pp 221ndash244 2016

[13] G Armano and M R Farmani ldquoMultiobjective clusteringanalysis using particle swarm optimizationrdquo Expert Systemswith Applications vol 55 pp 184ndash193 2016

[14] B Niu Q Duan L Tan C Liu and P Liang ldquoA Population-Based Clustering Technique Using Particle Swarm Optimiza-tion and K-Meansrdquo in Advances in Swarm and ComputationalIntelligence vol 9140 of Lecture Notes in Computer Science pp145ndash152 Springer International Publishing Cham 2015

[15] X Zhao W Lin J Hao X Zuo and J Yuan ldquoClustering andpattern search for enhancing particle swarm optimization withEuclidean spatial neighborhood searchrdquo Neurocomputing vol171 pp 966ndash981 2016

[16] E Comak ldquoA modified particle swarm optimization algorithmusing Renyi entropy-based clusteringrdquo Neural Computing andApplications vol 27 no 5 pp 1381ndash1390 2016

[17] K K Bharti and P K Singh ldquoOpposition chaotic fitness muta-tion based adaptive inertia weight BPSO for feature selection intext clusteringrdquoApplied SoftComputing vol 43 pp 20ndash34 2016

[18] W Song W Ma and Y Qiao ldquoParticle swarm optimizationalgorithm with environmental factors for clustering analysisrdquoSoft Computing vol 21 no 2 pp 283ndash293 2017

[19] R Liu J Li J Fan C Mu and L Jiao ldquoA coevolutionarytechnique based on multi-swarm particle swarm optimizationfor dynamic multi-objective optimizationrdquo European Journal ofOperational Research vol 261 no 3 pp 1028ndash1051 2017

[20] X Chu B Niu J J Liang and Q Lu ldquoAn orthogonal-designhybrid particle swarm optimiser with application to capacitatedfacility location problemrdquo International Journal of Bio-InspiredComputation vol 8 no 5 pp 268ndash285 2016

[21] DWang D Tan and L Liu ldquoParticle Swarm Optimization Anoverviewrdquo Soft Computing vol 22 no 2 p 22 2018

[22] D Karaboga ldquoAn Idea Based on Honey Bee Swarm for Numer-ical Optimizationrdquo Technical Report TR0 2005

[23] X Li ldquoA multimodal particle swarm optimizer based on fitnessEuclidean-distance ratiordquo in Proceedings of the the 9th annualconference pp 78ndash85 London England July 2007

[24] G Zhu and S Kwong ldquoGbest-guided artificial bee colonyalgorithm for numerical function optimizationrdquo Applied Math-ematics and Computation vol 217 no 7 pp 3166ndash3173 2010

[25] A Laudani F R Fulginei G M Lozito and A SalvinildquoSwarmflock optimization algorithms as continuous dynamicsystemsrdquo Applied Mathematics and Computation vol 243 pp670ndash683 2014

[26] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 Perth Australia December 1995

[27] K K Bharti and P K Singh ldquoChaotic gradient artificial beecolony for text clusteringrdquo Soft Computing vol 20 no 3 pp1113ndash1126 2016

[28] RMMay ldquoSimplemathematicalmodelswith very complicateddynamicsrdquo Nature vol 261 no 5560 pp 459ndash467 1976

[29] F S Hover and M S Triantafyllou ldquoApplication of polynomialchaos in stability and controlrdquo Automatica vol 42 no 5 pp789ndash795 2006

[30] X Liu L Xiang and X Wang ldquoSpatial cluster analysis bythe adleman-lipton DNA computing model and flexible gridsrdquoDiscrete Dynamics in Nature and Society vol 2012 Article ID894207 32 pages 2012

[31] H Peng X Luo ZGao JWang andZ Pei ldquoANovel ClusteringAlgorithm Inspired by Membrane Computingrdquo The ScientificWorld Journal vol 3 no 22 Article ID 929471 8 pages 2015

[32] B Ufnalski and L M Grzesiak ldquoPlug-in direct particle swarmrepetitive controller with a reduced dimensionality of a fitnesslandscape - a multi-swarm approachrdquo Bulletin of the PolishAcademy of SciencesmdashTechnical Sciences vol 63 no 4 pp 857ndash866 2015

[33] A O Griewank ldquoGeneralized descent for global optimizationrdquoJournal of Optimization Theory and Applications vol 34 no 1pp 11ndash39 1981

[34] ldquoDatasetsrdquo httpwwwisicalacinsimsanghamidatahtml[35] H C Blake UCI Repository of Machine Learning Databases

1998

Hindawiwwwhindawicom Volume 2018

MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Mathematical Problems in Engineering

Applied MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Probability and StatisticsHindawiwwwhindawicom Volume 2018

Journal of

Hindawiwwwhindawicom Volume 2018

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawiwwwhindawicom Volume 2018

OptimizationJournal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Engineering Mathematics

International Journal of

Hindawiwwwhindawicom Volume 2018

Operations ResearchAdvances in

Journal of

Hindawiwwwhindawicom Volume 2018

Function SpacesAbstract and Applied AnalysisHindawiwwwhindawicom Volume 2018

International Journal of Mathematics and Mathematical Sciences

Hindawiwwwhindawicom Volume 2018

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom

The Scientific World Journal

Volume 2018

Hindawiwwwhindawicom Volume 2018Volume 2018

Numerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisAdvances inAdvances in Discrete Dynamics in

Nature and SocietyHindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom

Dierential EquationsInternational Journal of

Volume 2018

Hindawiwwwhindawicom Volume 2018

Decision SciencesAdvances in

Hindawiwwwhindawicom Volume 2018

AnalysisInternational Journal of

Hindawiwwwhindawicom Volume 2018

Stochastic AnalysisInternational Journal of

Submit your manuscripts atwwwhindawicom

Page 14: A New Chaotic Starling Particle Swarm Optimization ...downloads.hindawi.com/journals/mpe/2018/8250480.pdfMathematicalProblemsinEngineering Chaotic Parameter u 2.5 3 3.5 4 Chaotic Value

14 Mathematical Problems in Engineering
