IMPROVING THE PARTICLE SWARM OPTIMIZATION ALGORITHM USING THE SIMPLEX METHOD AT LATE STAGE

Fang Wang and Yuhui Qiu
Intelligent Software and Software Engineering Laboratory, Southwest China Normal University, Chongqing 400715, China

Abstract: This article proposes a hybrid Particle Swarm Optimization (PSO) algorithm based on the Nonlinear Simplex Method (NSM). At the late stage of a PSO run, when the promising regions of the solution space have been located, the algorithm isolates particles that are very close to the extrema and applies the NSM to them to enhance local exploitation. Experimental results on several benchmark functions demonstrate that this approach is effective and efficient, especially for multimodal function optimization: it yields better solution quality and higher success rates than other methods taken from the literature.

Key words: particle swarm optimization, simplex method, multimodal function

1. INTRODUCTION

Since the beginning of the 1990s, a new research field called Swarm Intelligence (SI) has arisen [1, 2]. These optimization techniques are built on analogies with the swarm behaviour of natural creatures, which suggest that the main ideas of socio-cognition among intelligent individuals can be applied to develop efficient algorithms for optimization tasks. Ant Colony Optimization (ACO) is the best-known SI algorithm and is mainly used for combinatorial optimization tasks [3]. The Particle Swarm Optimization algorithm is another SI technique, mainly used for continuous optimization tasks; it was originally proposed by R. C. Eberhart and J. Kennedy [4] by analogy with bird flocks and fish schools.

PSO exhibits good performance on hard optimization problems and engineering applications, and compares favorably to other optimization algorithms [5, 6]. Since its introduction, numerous variations of the basic PSO algorithm have been developed [7] to improve its overall performance. Hybrid PSO algorithms that incorporate deterministic methods, such as the Nonlinear Simplex Method, have proved superior to the original techniques and have advantages over other approaches, because they can perform exploration with PSO and exploitation with the deterministic method [8]. Generating the initial swarm with the NSM can help, but remains unsatisfactory for multimodal function optimization [9], while applying the NSM as an operator to the whole swarm throughout the optimization may increase the computational cost considerably. In this paper, the Nonlinear Simplex Method is applied at the late stage of the PSO run, when the particles already fly quite near to the extrema. Experimental results on several well-known test functions demonstrate that this is a very promising way to increase the convergence speed and the success rate significantly.

We briefly introduce the background of the PSO algorithm in Section 2. Section 3 describes the proposed algorithm and presents the experimental design and the corresponding results. The paper closes with conclusions and ideas for further work in Section 4.

2. THE PARTICLE SWARM OPTIMIZATION ALGORITHM

In the original PSO formulation, particle i is denoted as X_i = (x_i1, x_i2, ..., x_iD), which represents a potential solution in the D-dimensional search space. Each particle maintains a memory of its previous best position, P_i = (p_i1, p_i2, ..., p_iD), and a velocity along each dimension, V_i = (v_i1, v_i2, ..., v_iD). At each iteration, the P vector of the particle with the best fitness in the local neighborhood, designated g, and the P vector of the current particle are combined to adjust the velocity along each dimension, and that velocity is then used to compute a new position for the particle. The update equations of the swarm are [6]:

    v_id = w * v_id + c1 * rand() * (p_id - x_id) + c2 * rand() * (p_gd - x_id)    (1)

    x_id = x_id + v_id    (2)
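As a minimal illustration of update rules (1) and (2), the following Python sketch performs one inertia-weight PSO iteration; the parameter values and the Sphere objective are placeholders, not the settings used in the paper.

```python
import numpy as np

def sphere(x):
    """Placeholder objective: f(x) = sum(x^2), minimum 0 at the origin."""
    return np.sum(x ** 2)

def pso_step(x, v, pbest, gbest, w=0.729, c1=1.49445, c2=1.49445, vmax=1.0):
    """One inertia-weight PSO update, Eqs. (1)-(2).

    x, v, pbest : (N, D) arrays of positions, velocities, personal bests.
    gbest       : (D,) array, best position found by the neighborhood/swarm.
    """
    n, d = x.shape
    r1, r2 = np.random.rand(n, d), np.random.rand(n, d)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # Eq. (1)
    v = np.clip(v, -vmax, vmax)                                 # Vmax clamp
    x = x + v                                                   # Eq. (2)
    return x, v

# Tiny usage example (values are illustrative only).
rng = np.random.default_rng(0)
x = rng.uniform(-5, 5, size=(30, 2))
v = np.zeros_like(x)
pbest = x.copy()
gbest = x[np.argmin([sphere(p) for p in x])]
x, v = pso_step(x, v, pbest, gbest)
```

The clipping line corresponds to the Vmax velocity limit discussed below.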

The constants c1 and c2 determine the relative influence of the social and cognitive components (learning rates); they are often set to the same value so that each component receives equal weight. A constant Vmax is used to limit the velocities of the particles. The parameter w, introduced as an inertia factor, dynamically adjusts the velocity over time, gradually focusing the PSO on local search. Maurice Clerc derived a constriction coefficient K, a modification of PSO that runs without Vmax and reduces some undesirable explosive feedback effects [10]. The constriction factor is computed as:

    K = 2 / | 2 - phi - sqrt(phi^2 - 4*phi) |,   phi = c1 + c2,   phi > 4    (3)

With the constriction factor K, the PSO formula for computing the new velocity becomes:

    v_id = K * ( v_id + c1 * rand() * (p_id - x_id) + c2 * rand() * (p_gd - x_id) )    (4)

Carlisle and Dozier investigated the influence of the different parameters in PSO, selected c1 = 2.8, c2 = 1.3 and a population size of 30, and proposed the Canonical PSO [11, 12].
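As an illustration of Eq. (3), a short Python sketch computing Clerc's constriction coefficient for the Canonical PSO settings c1 = 2.8, c2 = 1.3 might look as follows; the helper name is ours, not the paper's.

```python
import math

def constriction_coefficient(c1: float, c2: float) -> float:
    """Clerc's constriction factor K from Eq. (3); requires phi = c1 + c2 > 4."""
    phi = c1 + c2
    if phi <= 4:
        raise ValueError("The constriction model assumes c1 + c2 > 4")
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

# Canonical PSO settings of Carlisle and Dozier: c1 = 2.8, c2 = 1.3.
K = constriction_coefficient(2.8, 1.3)   # approximately 0.729
```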

3. THE PROPOSED ALGORITHM AND EXPERIMENTAL RESULTS

The basic simplex method was presented by Spendley et al. in 1962 as an efficient sequential optimization method for function minimization, and was later improved by Nelder and Mead into what is now called the Nonlinear Simplex Method (NSM) [13]. The NSM needs only values, not derivatives, of the objective function, and it is generally considered the method of choice when the figure of merit is "get something to work quickly".

At the late stage of a PSO run, the promising regions of the solution space have already been located. Applying the NSM operator for a number of steps at this stage to intensify the exploitation search can improve both the solution quality and the convergence rate. We therefore propose a hybrid Nonlinear Simplex Method PSO (NSMPSO), which isolates a particle and applies the NSM to it once the particle comes quite close to the extrema (within the diversion radius). If the particle "lands" within a specified precision of the goal solution (the error goal) during the NSM run, the PSO process is considered successful; otherwise the particle is returned to the swarm and the next PSO iteration starts. A sketch of this late-stage refinement step is given after Table 1. The diversion radius DRadius is computed as a piecewise function of the error goal, Equations (5) and (6): ErrorGoal is multiplied by one scaling factor when ErrorGoal <= 10^-4, and by another scaling factor otherwise.

Table 1. The rate of success and mean function evaluations for each test function

Test functions (row order): Rastrigin_2, Levy No.3_2, Schaffer_2, Rosenbrock_2, Griewank_2, Levy No.8_3, Freudenstein_2, Goldstein_2, Sphere_10, Rosenbrock_10, Rastrigin_10, Griewank_10, Sphere_30, Rosenbrock_30, Rastrigin_30, Griewank_30

Rate of success (NSMPSO, NS-PSO, CPSO): 0.74 0.805 0.98 0.855 0.965 0.885 0.805 0.99 0.84 0.83 0.57 0.845 0.685 0.995 0.52 0.94 0.945 0.83 0.8 0.94 0.645 0.97 0.735 0.975 0.84 0.96 0.845 0.795 0.995

Mean function evaluations, NSMPSO: 285. 2532 7872.6 7372.5 8353.7 2644 5342.5 274. 838.3 5480.9 5379.8 8969.5 7867 6783 900.4 9388
Mean function evaluations, NS-PSO: 5573.3 4827 9906.8 942.8 9397.6 2045 932. 360.2 5723.4 3552 5929.5 7084. 3789 0697 3475.5 8999.
Mean function evaluations, CPSO: 38.8 2853.3 99.7 8450.3 8985.9 2327.6 4349.6 2955.4 6306.8 546.6 5094.3 7332.3 5448 6576 85.3 080

Note: The subscript of each test function denotes its dimension. NSMPSO: the proposed algorithm. NS-PSO: another NSM-PSO hybrid proposed by Parsopoulos and Vrahatis [9]. CPSO: the Canonical PSO of Carlisle and Dozier [11].
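To make the late-stage refinement step concrete, the following Python sketch shows one possible reading of it, using SciPy's Nelder-Mead implementation in place of the paper's own NSM routine. The diversion-radius test, the 50-step budget, and all identifiers here are illustrative assumptions rather than the authors' exact procedure.

```python
import numpy as np
from scipy.optimize import minimize

def late_stage_refine(particle, objective, gbest_value, error_goal,
                      diversion_radius, max_nsm_steps=50):
    """Apply a Nelder-Mead local search to a particle that is already close
    to the optimum (assumed NSMPSO-style late-stage step).

    Returns (success, position). If the refined value does not reach the
    error goal, the particle is handed back to the swarm unchanged.
    """
    # Only particles whose fitness is within the diversion radius of the
    # current best value are diverted to the simplex search (assumption).
    if abs(objective(particle) - gbest_value) > diversion_radius:
        return False, particle

    result = minimize(objective, particle, method="Nelder-Mead",
                      options={"maxiter": max_nsm_steps,
                               "xatol": error_goal, "fatol": error_goal})

    if result.fun <= error_goal:          # "landed" within the error goal
        return True, result.x
    return False, particle                # lay the particle back in the swarm

# Illustrative call on a 2-D Rastrigin function.
def rastrigin(x):
    return 10 * len(x) + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

ok, x_new = late_stage_refine(np.array([0.03, -0.02]), rastrigin,
                              gbest_value=0.0, error_goal=1e-6,
                              diversion_radius=0.5)
```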

In an NSM process, the initial simplex consists of the isolated particle i and D further vertices generated randomly with mean X_i and a standard deviation of 0.02 times Xmax. The benchmark functions on which the proposed algorithm has been tested and compared with other methods from the literature, together with their defining equations and parameters, are taken from [14]. For every test function 200 experiments were run; the maximum number of PSO iterations is 500, and the swarm size is 60 for the 30-dimensional functions and 30 for the others. The parameters used in the NSM are alpha = 1.0, gamma = 2.0, beta+ = beta- = 0.5, and each time the NSM operator is applied it searches for 50 steps. A small sketch of the initial-simplex construction follows Table 2. The programming environment is Matlab 7.0 on a Pentium IV 2.8 GHz CPU with 512 MB RAM under Windows 2000 Professional.

Table 2. The average optima and total CPU time for each test function

                   Average optima                          Total CPU time
Test Functions     NSMPSO      NS-PSO      CPSO            NSMPSO    NS-PSO    CPSO
Rastrigin_2        4.854e-9    0.42783     4.9856e-9       4.266     27.56     4.906
Levy No.3_2        -76.54      -63.7       -76.54          2.578     22.688    2.83
Schaffer_2         -0.99755    -0.99334    -0.99664        29.88     35.484    3.047
Rosenbrock_2       4.995e-9    0.7662      0.002373        30.578    36.422    3.78
Griewank_2         1.002       .003        .006            44.06     49.4      44.83
Levy No.8_3        6.4267e-9   5.8974e-9   6.0348e-9       4.656     0.89      .88
Freudenstein_2     0.97969     80.237      .2246           6.078     3.875     4.56
Goldstein_2        3           2240.       3               9.8906    2.859     9.825
Sphere_10          8.8273e-9   8.246e-9    8.2073e-9       38.828    23.25     24.656
Rosenbrock_10      31.46       46.95       20.688          22.922    5.266     22.547
Rastrigin_10       9.632       0.4         9.758           29.547    32.625    27.64
Griewank_10        9.0987      9.038       9.003           55.56     42.25     43.56
Sphere_30          9.2039e-5   9.3474e-5   9.2407e-5       494.      67.594    68.609
Rosenbrock_30      2009.3      00.63       2856.7          8.484     56.28     80.047
Rastrigin_30       97.48       95.88       97.752          65.09     27.574    60.75
Griewank_30        29.098      29.094      29.094          56.08     7.609     8.547
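A minimal Python sketch of how such an initial simplex could be generated, assuming the description above; the function name and the use of NumPy are ours.

```python
import numpy as np

def build_initial_simplex(particle, xmax, rng=None):
    """Build an NSM initial simplex around an isolated particle.

    The simplex has D + 1 vertices: the particle itself plus D vertices drawn
    from a normal distribution whose mean is the particle's position and whose
    standard deviation is 0.02 * Xmax, as described in the text.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = particle.size
    extra = rng.normal(loc=particle, scale=0.02 * xmax, size=(d, d))
    return np.vstack([particle, extra])          # shape (D + 1, D)

# Example: a 2-D particle with Xmax = 5.12 (Rastrigin-style bounds).
simplex = build_initial_simplex(np.array([0.1, -0.2]), xmax=5.12)
```

Such a simplex can, for instance, be handed to SciPy's Nelder-Mead solver through its initial_simplex option.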

Tables 1 and 2 report the rate of success, the mean number of function evaluations, the average optima and the total CPU time for each test function. They show that the overall performance of the NSMPSO algorithm is superior to the other published algorithms in terms of success rate, solution quality and convergence speed, especially on multimodal functions such as Levy No.3, Schaffer, Rosenbrock and Griewank. For high-dimensional functions, NSMPSO performs somewhat worse than NS-PSO because of its computational expense, but it remains on a par with the Canonical PSO algorithm.

4. CONCLUSIONS AND FUTURE WORK

A new hybrid Particle Swarm Optimization algorithm is proposed in this paper, which applies the Nonlinear Simplex Method at the late stage of the PSO run, when the most promising regions of the solution space have been fixed. A wide variety of experiments on well-known benchmark functions were carried out to test the proposed algorithm. The results, compared with three other published methods, indicate that this method is reliable and efficient, especially for continuous multimodal function optimization. Future work may focus on accelerating convergence for high-dimensional problems, developing practical applications of this hybrid approach in neuro-fuzzy network optimization, and extending the approach to constrained multi-objective optimization.

REFERENCES

[1] Bonabeau E, Dorigo M, and Theraulaz G. Swarm Intelligence: From Natural to Artificial Systems. Oxford University Press, 1999
[2] Kennedy J, and Eberhart R C. Swarm Intelligence. Morgan Kaufmann Publishers, 2001
[3] Deneubourg J L, Goss S, Franks N, Sendova-Franks A, Detrain C, and Chretien L. The dynamics of collective sorting: Robot-like ants and ant-like robots. Proceedings of the First International Conference on Simulation of Adaptive Behaviour: From Animals to Animats, 1991, MIT Press, Cambridge, MA: 356-365
[4] Kennedy J, and Eberhart R C. Particle swarm optimization. Proceedings of the IEEE International Conference on Neural Networks, 1995, Piscataway, NJ: 1942-1948
[5] Shi Y, and Eberhart R C. Parameter selection in particle swarm optimization. Evolutionary Programming VII: Proceedings of the 7th Annual Conference on Evolutionary Programming, 1998, New York: 591-600
[6] Shi Y, and Eberhart R C. Empirical study of particle swarm optimization. Proceedings of the IEEE Congress on Evolutionary Computation, 1999, Piscataway, NJ: 1945-1950
[7] Parsopoulos K E, and Vrahatis M N. Recent approaches to global optimization problems through particle swarm optimization. Natural Computing, 2002, Vol. 1: 235-306

[8] Fan S-K S, Liang Y-C, and Zahara E. Hybrid simplex search and particle swarm optimization for the global optimization of multimodal functions. Engineering Optimization, 2004, 36(4): 401-418
[9] Parsopoulos K E, and Vrahatis M N. Initializing the particle swarm optimizer using the nonlinear simplex method. Advances in Intelligent Systems, Fuzzy Systems, Evolutionary Computation, 2002, WSEAS Press: 216-221
[10] Clerc M. The swarm and the queen: towards a deterministic and adaptive particle swarm optimization. Proceedings of the IEEE Congress on Evolutionary Computation, 1999: 1951-1957
[11] Carlisle A, and Dozier G. An off-the-shelf PSO. Proceedings of the Workshop on Particle Swarm Optimization, 2001, Indianapolis
[12] Clerc M, and Kennedy J. The particle swarm - explosion, stability, and convergence in a multidimensional complex space. IEEE Transactions on Evolutionary Computation, 2002, 6(1): 58-73
[13] Nelder J A, and Mead R. A simplex method for function minimization. Computer Journal, 1965, Vol. 7: 308-313
[14] Levy A, Montalvo A, Gomez S, et al. Topics in Global Optimization. Springer-Verlag, New York, 1981