
Proceedings of International Joint Conference on Neural Networks, Atlanta, Georgia, USA, June 14-19, 2009

Network-Structured Particle Swarm Optimizer Considering Neighborhood Relationships

Haruna Matsushita and Yoshifumi Nishio

Abstract: This study proposes a new Network-Structured Particle Swarm Optimizer considering neighborhood relationships (NS-PSO). All particles of NS-PSO are connected to adjacent particles by a neighborhood relation on a two-dimensional network. The directly connected particles share the information of their own best positions. Each particle is updated depending on the network distance between it and a winner, whose function value is the best among all particles. Simulation results show the searching efficiency of NS-PSO.

I. INTRODUCTION

PARTICLE SWARM OPTIMIZATION (PSO) [1] is an evolutionary algorithm that simulates the movement of flocks of birds. Owing to its simple concept, easy implementation, and quick convergence, PSO has attracted much attention in recent years and is applied widely in different fields. However, PSO depends strongly on its parameters and converges prematurely when solving complex problems with local optima. Furthermore, in the standard PSO algorithm, there are no special relationships between particles. Each particle's position is updated according to its personal best position and the best position among all particles, and which of the two positions is more influential for each particle is determined at random in every generation. In the real world, by contrast, various personal relationships exist, such as hierarchical relationships, trust relationships, parent-child relationships, and so on. Various topological neighborhoods have been considered by researchers [2]-[6], who have applied the ring neighborhood, the von Neumann neighborhood, and other topologies. However, the number of parameters increases when a fourth term for the neighboring best position is added to the velocity update.
Moreover, if there are no connections between the respective particles, the computation cost increases, because we have to search each particle's neighborhood, using the particle positions or their costs together with a neighborhood radius, in every generation. In this study, we propose a new PSO algorithm with topological neighborhoods: the Network-Structured Particle Swarm Optimizer considering neighborhood relationships (NS-PSO). All particles of NS-PSO are connected to adjacent particles by a neighborhood relation, which dictates the topology of the two-dimensional network. The connected particles, namely the neighboring particles on the network, share the information of their own best positions. In every generation, we find a winner particle, whose function value is the best among all particles, and each particle is updated depending on the network distance between it and the winner. Simulation results and comparisons with the standard PSO show that the proposed NS-PSO effectively enhances the searching efficiency, measured in terms of accuracy, robustness, and parameter dependence.

Haruna Matsushita and Yoshifumi Nishio are with the Department of Electrical and Electronic Engineering, Tokushima University, Tokushima, Japan (email: {haruna, nishio}@ee.tokushima-u.ac.jp).

II. STANDARD PARTICLE SWARM OPTIMIZATION (PSO)

PSO is an evolutionary algorithm that simulates the movement of flocks of birds. In the PSO algorithm, multiple solutions called particles coexist. At each time step, a particle flies toward its own past best position and the best position among all particles. Each particle carries two pieces of information: its position and its velocity. The position vector of each particle i and its velocity vector are represented by X_i = (x_i1, ..., x_id, ..., x_iD) and V_i = (v_i1, ..., v_id, ..., v_iD), respectively, where d = 1, 2, ..., D, i = 1, 2, ..., M, and x_id ∈ [x_min, x_max].

(1) (Initialization) Let the generation step t = 0. Randomly initialize the particle position X_i and its velocity V_i for each particle i, and initialize P_i = (p_i1, p_i2, ..., p_iD) with a copy of X_i. Evaluate the objective function f(X_i) for each particle i and find P_g with the best function value among all particles.

(2) Evaluate the fitness f(X_i). For each particle i, if f(X_i) < f(P_i), update the personal best position (called pbest): P_i = X_i. Let P_g represent the position with the best fitness among all particles so far (called gbest). Update P_g if needed.

(3) Update V_i and X_i of each particle i depending on its pbest and gbest according to

  v_id(t+1) = w v_id(t) + c1 rand()(p_id - x_id(t)) + c2 Rand()(p_gd - x_id(t)),
  x_id(t+1) = x_id(t) + v_id(t+1),   (1)

where w is the inertia weight determining how much of the previous velocity of the particle is preserved, c1 and c2 are two positive acceleration coefficients (generally c1 = c2), and rand() and Rand() are two uniform random numbers sampled from U(0, 1).

(4) Let t = t + 1 and go back to (2).
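The standard PSO update above can be sketched as a minimal gbest PSO on the Sphere function. The concrete values here (swarm size, iteration count, c1 = c2 = 1.6, the search range) are illustrative assumptions, not the paper's settings:

```python
import random

def pso_sphere(dim=2, m=20, iters=200, w=0.7, c1=1.6, c2=1.6, seed=1):
    """Minimal standard (gbest) PSO on the Sphere function f(x) = sum(x_d^2)."""
    rng = random.Random(seed)
    f = lambda x: sum(v * v for v in x)
    X = [[rng.uniform(-5.12, 5.12) for _ in range(dim)] for _ in range(m)]
    V = [[0.0] * dim for _ in range(m)]
    P = [x[:] for x in X]                    # personal bests (pbest)
    pf = [f(x) for x in X]
    g = min(range(m), key=lambda i: pf[i])   # index of the best particle so far
    G, gf = P[g][:], pf[g]                   # global best (gbest) and its value
    for _ in range(iters):
        for i in range(m):
            for d in range(dim):
                # velocity update: inertia + pull to pbest + pull to gbest
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (G[d] - X[i][d]))
                X[i][d] += V[i][d]
            fx = f(X[i])
            if fx < pf[i]:
                P[i], pf[i] = X[i][:], fx
                if fx < gf:
                    G, gf = X[i][:], fx
    return G, gf

best, best_val = pso_sphere()
```

With these convergent settings the swarm collapses onto the global minimum at the origin, so `best_val` ends up many orders of magnitude below the initial best.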

Fig. 1. Network structure and neighborhood relationships. The 1-neighborhood of a particle (Neighborhood A, and likewise Neighborhood B) consists of the particles directly connected to it on the grid. If f(P_j) is the smallest value among Neighborhood A, then the lbest of the corresponding particle is L = P_j.

III. NETWORK-STRUCTURED PARTICLE SWARM OPTIMIZER CONSIDERING NEIGHBORHOOD RELATIONSHIPS (NS-PSO)

The algorithm of NS-PSO is almost the same as the standard PSO. The most important feature of NS-PSO is that all particles are organized on a rectangular two-dimensional grid, as in Fig. 1. In other words, the particles are connected to adjacent particles by the neighborhood relation, which dictates the topology of the network. Furthermore, the directly connected neighboring particles share their local best position. We should note the meaning of "neighborhood" here. In some research, neighboring particles are defined as particles with similar cost values, or as particles whose positions are close to each other. In this study, however, particles that are directly connected on the network are defined as neighbors, even if their positions or their costs are far apart.

(NS-1) (Initialization) Let the generation step t = 0 and initialize the particle information according to (1). Define g as the winner c. Find L_i = (l_i1, l_i2, ..., l_iD) with the best function value among the directly connected particles, namely its own neighbors.

(NS-2) Evaluate the fitness f(X_i) and find the winner particle c with the best fitness among all particles at the current time t:

  c = argmin_i { f(X_i(t)) }.   (2)

For each particle i, if f(X_i) < f(P_i), update the personal best position: P_i = X_i. If f(X_c) < f(P_g), update the gbest P_g = X_c, where X_c = (x_c1, x_c2, ..., x_cD) is the position of the winner c.

(NS-3) Find each local best position (called lbest) L_i among particle i and its neighbors directly connected with i on the network, as in Fig. 1. For each particle i, if f(P_i) < f(L_i), update the lbest: L_i = P_i.

(NS-4) Update V_i and X_i of each particle i depending on its lbest, the position of the winner X_c, and the network distance between i and the winner c, according to

  v_id(t+1) = w v_id(t) + c1 rand()(l_id - x_id(t)) + c2 h_c,i (x_cd - x_id(t)),
  x_id(t+1) = x_id(t) + v_id(t+1),   (3)

where h_c,i is the fixed neighborhood function defined by

  h_c,i = exp( -||r_i - r_c||^2 / σ^2 ),   (4)

where ||r_i - r_c|| is the distance between network nodes c and i on the network, and the fixed parameter σ corresponds to the width of the neighborhood function. A large σ therefore strengthens the particles' spreading force over the whole space, and a small σ strengthens their convergent force toward the winner.

(NS-5) Let t = t + 1 and go back to (NS-2).

IV. EXPERIMENTAL RESULTS

In order to evaluate the performance of NS-PSO, we use eight benchmark optimization problems, summarized in Table I. The optimum solution x* of Rosenbrock's function f2 is [1, 1, ..., 1], and the x* of the other functions are all [0, 0, ..., 0]. The optimum function values f(x*) of all functions are 0. f1, f2, f3 and f4 are unimodal functions, and f5, f6, f7 and f8 are multimodal functions with numerous local minima, as shown in Fig. 2. Griewank's function over its full definition range looks similar to the unimodal Sphere function; in its inner area, however, there are a lot of small peaks and valleys. All the functions have D variables, and two settings of D are used to investigate the performance in different dimensions. The population size M is fixed for PSO, and correspondingly the network size is fixed for NS-PSO. For PSO and NS-PSO, the inertia weight is set to w = 0.7 and the acceleration coefficients are set equal, c1 = c2. The neighborhood radius σ of NS-PSO is set to 3 for Ackley's function f6 and to a smaller common value for all the other functions.
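The two ingredients that distinguish NS-PSO from the standard PSO, the grid neighborhood used for lbest in (NS-3) and the Gaussian neighborhood function h_c,i used in (NS-4), can be sketched as follows. The 4-neighbor (von Neumann style) connectivity is an assumption for illustration; the paper's exact connectivity follows Fig. 1:

```python
import math

def grid_neighbors(i, rows, cols):
    """Directly connected neighbours of node i on a rows x cols grid
    (up/down/left/right), i.e. the particles that share lbest with i."""
    r, c = divmod(i, cols)
    out = []
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        rr, cc = r + dr, c + dc
        if 0 <= rr < rows and 0 <= cc < cols:
            out.append(rr * cols + cc)
    return out

def neighborhood_weight(i, winner, cols, sigma):
    """Gaussian neighbourhood function h_{c,i} = exp(-||r_i - r_c||^2 / sigma^2),
    where r_i and r_c are the grid coordinates of particle i and the winner c."""
    ri, ci = divmod(i, cols)
    rc, cc = divmod(winner, cols)
    d2 = (ri - rc) ** 2 + (ci - cc) ** 2
    return math.exp(-d2 / sigma ** 2)

# On a 3x3 grid with the centre node 4 as winner: the pull toward the winner
# decays with grid distance, and a larger sigma widens the pull.
w_self = neighborhood_weight(4, 4, 3, 1.0)
w_near = neighborhood_weight(1, 4, 3, 1.0)   # adjacent to the winner
w_far = neighborhood_weight(0, 4, 3, 1.0)    # diagonal from the winner
```

A larger σ strengthens the spreading force (many particles are pulled toward the winner), while a smaller σ confines the pull to the winner's immediate neighbours, matching the discussion of σ above.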
Because Ackley's function f6 is difficult to optimize, the particles of NS-PSO need more information from their neighbors. We carry out simulations repeated 30 times for all the optimization functions. The performances, in terms of the minimum and mean function values and the standard deviation over the 30 independent runs on the eight functions in the two dimension settings, are listed in Tables II and III, respectively. We can see that NS-PSO obtains the better mean values for all the test functions. In particular, NS-PSO greatly improves the performance over PSO on four of the functions in each dimension setting. Furthermore, NS-PSO achieves the better minimum values 7 times in both dimension settings. From these results, we can say that the NS-PSO algorithm obtains better results for the test functions in both dimension settings. Figure 3 shows the mean gbest value in every generation, averaged over the 30 runs, for the eight test functions.
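Three of the benchmark functions of Table I, written out directly, illustrate the unimodal/multimodal contrast discussed above (a sketch of the standard definitions: Sphere is unimodal, Rastrigin is highly multimodal, and Griewank carries the small inner peaks and valleys):

```python
import math

def sphere(x):
    """f1: Sphere function, sum x_d^2; minimum 0 at the origin."""
    return sum(v * v for v in x)

def rastrigin(x):
    """f5: Rastrigin's function, sum (x_d^2 - 10 cos(2 pi x_d) + 10);
    unimodal envelope with a dense grid of local minima."""
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

def griewank(x):
    """f8: Griewank's function, sum x_d^2 / 4000 - prod cos(x_d / sqrt(d)) + 1;
    looks like Sphere at large scale, rippled in the inner area."""
    s = sum(v * v for v in x) / 4000.0
    p = 1.0
    for d, v in enumerate(x, start=1):
        p *= math.cos(v / math.sqrt(d))
    return s - p + 1.0
```

All three attain their global minimum value 0 at the origin, consistent with the optima listed for Table I.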

TABLE I
EIGHT TEST FUNCTIONS.

  Sphere function:              f1(x) = sum_{d=1}^{D} x_d^2
  Rosenbrock's function:        f2(x) = sum_{d=1}^{D-1} ( 100 (x_d^2 - x_{d+1})^2 + (x_d - 1)^2 )
  3rd De Jong's function:       f3(x) = sum_{d=1}^{D} |x_d|
  4th De Jong's function:       f4(x) = sum_{d=1}^{D} d x_d^4
  Rastrigin's function:         f5(x) = sum_{d=1}^{D} ( x_d^2 - 10 cos(2π x_d) + 10 )
  Ackley's function:            f6(x) = sum_{d=1}^{D-1} ( 20 + e - 20 exp(-0.2 sqrt((x_d^2 + x_{d+1}^2)/2)) - exp((cos(2π x_d) + cos(2π x_{d+1}))/2) )
  Stretched V sine wave fn:     f7(x) = sum_{d=1}^{D-1} (x_d^2 + x_{d+1}^2)^0.25 ( sin^2( 50 (x_d^2 + x_{d+1}^2)^0.1 ) + 1 )
  Griewank's function:          f8(x) = (1/4000) sum_{d=1}^{D} x_d^2 - prod_{d=1}^{D} cos( x_d / sqrt(d) ) + 1

TABLE II
COMPARISON RESULTS OF PSO AND NS-PSO ON THE TEST FUNCTIONS (MEAN, MINIMUM AND STANDARD DEVIATION OVER THE 30 RUNS; FIRST DIMENSION SETTING).

TABLE III
COMPARISON RESULTS OF PSO AND NS-PSO ON THE TEST FUNCTIONS (MEAN, MINIMUM AND STANDARD DEVIATION OVER THE 30 RUNS; SECOND DIMENSION SETTING).

The convergence rate of NS-PSO is almost the same as, or slower than, that of the standard PSO. In the standard PSO, the particles move toward gbest or toward pbest, but which direction more particles move toward is decided at random in every generation. In NS-PSO, on the other hand, the Gaussian neighborhood function is used, and the particles

move according to the neighborhood distance between themselves and the winner. The winner's neighboring particles move toward the winner, so that they spread over the whole space. For the particles which are not direct neighbors of the winner but are connected near it, the gravitation toward the winner is strong. The other particles fly toward their lbest. In other words, the roles of the NS-PSO particles are determined by the connection relationships, and these roles produce diversity among the particles. These effects avert premature convergence, and the particles of NS-PSO can easily escape from local optima.

Fig. 2. Eight test functions with two variables. The first and second variables are on the x-axis and y-axis, respectively, and the z-axis shows the function value: (a) Sphere function, (b) Rosenbrock's function, (c) 3rd De Jong's function, (d) 4th De Jong's function, (e) Rastrigin's function, (f) Ackley's function, (g) Stretched V sine wave function, (h) Griewank's function over its full definition range, (i) inner area of Griewank's function.

Next, we consider the robustness of both algorithms, measured by the standard deviation over all simulations. From the standard deviations in Tables II and III, NS-PSO obtains the smaller value more often than the standard PSO in both dimension settings. Therefore, NS-PSO appears to be more robust than the standard PSO. The particle velocities and positions of NS-PSO are updated according to the connection relationships, whereas those of the standard PSO are updated completely at random. Therefore, NS-PSO achieves stable, better performance unaffected by the initial states of the particles.

V. PARAMETER DEPENDENCE

Furthermore, we investigate the effect of the parameters, namely the inertia weight w and the acceleration coefficients c1 and c2, on performance quality, and their sensitivity. Figure 4 shows the mean fitness value of gbest with different w over the 30 runs for all the test functions. The fixed parameters are the same as in the simulations above.
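An inertia-weight sensitivity experiment of this kind can be sketched as a sweep over w for a small gbest PSO on the Sphere function. All values here, including c1 = c2 = 1.6, the w grid, and the number of seeds, are illustrative assumptions rather than the paper's exact settings:

```python
import random

def run_pso(w, c=1.6, dim=2, m=20, iters=150, seed=0):
    """One gbest-PSO run on the Sphere function; returns the final best value."""
    rng = random.Random(seed)
    f = lambda x: sum(v * v for v in x)
    X = [[rng.uniform(-5.12, 5.12) for _ in range(dim)] for _ in range(m)]
    V = [[0.0] * dim for _ in range(m)]
    P, pf = [x[:] for x in X], [f(x) for x in X]
    G = min(P, key=f)[:]
    gf = f(G)
    for _ in range(iters):
        for i in range(m):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c * rng.random() * (P[i][d] - X[i][d])
                           + c * rng.random() * (G[d] - X[i][d]))
                X[i][d] += V[i][d]
            fx = f(X[i])
            if fx < pf[i]:
                P[i], pf[i] = X[i][:], fx
                if fx < gf:
                    G, gf = X[i][:], fx
    return gf

# Sweep the inertia weight and average over a few seeds, in the spirit of Fig. 4.
sweep = {w: sum(run_pso(w, seed=s) for s in range(5)) / 5
         for w in (0.4, 0.7, 1.0)}
```

A moderate inertia weight (around 0.7) keeps the swarm in the convergent regime, while w = 1.0 leaves velocities undamped and the mean best value degrades sharply, which is the sensitivity that Fig. 4 illustrates.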
We can see that NS-PSO achieves better performance than the standard PSO even when the inertia weight w is varied. If we decrease or increase w only slightly, the performance of the standard PSO becomes drastically worse.

This is seen especially in several panels of Fig. 4. On the other hand, NS-PSO is more effective and robust than the standard PSO when the inertia weight is set to a small value, as seen especially for the multimodal functions in Fig. 4.

Fig. 3. Mean gbest value in every generation for the eight test functions: (a) Sphere function, (b) Rosenbrock's function, (c) 3rd De Jong's function, (d) 4th De Jong's function, (e) Rastrigin's function, (f) Ackley's function, (g) Stretched V sine wave function, (h) Griewank's function.

Figure 5 shows the mean fitness value of gbest with different c1 (= c2) over the 30 runs for all the test functions. We should note that for f3 and f4, the minimum gbest value of the standard PSO over all the c1 (= c2) settings is smaller than that of NS-PSO, as seen in Fig. 5. However, the standard PSO is extremely sensitive to small changes in c1 (= c2), whereas the performance of NS-PSO is stable. For the other functions, NS-PSO obtains better results than PSO even when the acceleration coefficients c1 (= c2) are varied. From these results, we can say that NS-PSO is more effective, and its dependence on the parameters is weaker than that of the standard PSO.

VI. CONCLUSIONS

This study has proposed a new Particle Swarm Optimization (PSO) with topological neighborhoods: the Network-Structured Particle Swarm Optimizer considering neighborhood relationships (NS-PSO). All particles of NS-PSO are connected

Fig. 4. Mean gbest value with different inertia weights w for the eight test functions. Fixed parameters: c1 = c2; σ = 3 for Ackley's function f6 and a smaller common value for all the other functions. Panels: (a) Sphere, (b) Rosenbrock, (c) 3rd De Jong, (d) 4th De Jong, (e) Rastrigin, (f) Ackley, (g) Stretched V sine wave, (h) Griewank.

to adjacent particles by a neighborhood relation on the two-dimensional network. The directly connected particles share the information of their own best positions. Each particle is updated depending on the network distance between it and a winner, whose function value is the best among all particles. In the simulation results, we have confirmed the accuracy, robustness, and parameter dependence of PSO and NS-PSO, and it is clear that the overall performance of NS-PSO is more efficient and robust. In future work, we should investigate the behavior of NS-PSO with various network topologies and the relationship between network structure and performance in detail.

REFERENCES

[1] J. Kennedy and R. C. Eberhart, "Particle swarm optimization," in Proc. IEEE Int. Conf. on Neural Networks, pp. 1942-1948, 1995.
[2] J. Kennedy, "Small worlds and mega-minds: effects of neighborhood topology on particle swarm performance," in Proc. Congress on Evolutionary Computation, 1999.
[3] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proc. Congress on Evolutionary Computation, 2002.

Fig. 5. Mean gbest value with different c1 = c2 for the eight test functions. Fixed parameters: w = 0.7; σ = 3 for Ackley's function f6 and a smaller common value for all the other functions. Panels: (a) Sphere, (b) Rosenbrock, (c) 3rd De Jong, (d) 4th De Jong, (e) Rastrigin, (f) Ackley, (g) Stretched V sine wave, (h) Griewank.

[4] R. Mendes, J. Kennedy and J. Neves, "The fully informed particle swarm: simpler, maybe better," IEEE Trans. on Evolutionary Computation, vol. 8, no. 3, pp. 204-210, June 2004.
[5] S. B. Akat and V. Gazi, "Particle swarm optimization with dynamic neighborhood topology: three neighborhood strategies and preliminary results," in Proc. of IEEE Swarm Intelligence Symposium, 2008.
[6] X. Xie, W. Zhang and Z. Yang, "A dissipative particle swarm optimization," in Proc. Congress on Evolutionary Computation, 2002.