A Comparative Study of Genetic Algorithm and Particle Swarm Optimization

IOSR Journal of Computer Engineering (IOSR-JCE)
e-ISSN: 2278-0661, p-ISSN: 2278-8727, PP 18-22
www.iosrjournals.org

A Comparative Study of Genetic Algorithm and Particle Swarm Optimization

Mrs. D. Shona 1, Dr. M. Senthilkumar 2, S. Valarmathi 3 & G. Malathy 4
1 Research Scholar, Bharathiar University; Assistant Professor, Department of Computer Science, Sri Krishna Arts and Science College, Coimbatore.
2 Assistant Professor, PG & Research Department of Computer Science, Government Arts College, Udumalpet.
3,4 M.Sc CS, Department of Computer Science, Sri Krishna Arts and Science College, Coimbatore.

Abstract: Among the many search and optimization techniques, Evolutionary Algorithms (EA) have attracted considerable research attention as effective methods for solving a wide range of optimization problems. The Genetic Algorithm (GA) is an adaptive metaheuristic optimization algorithm based on the theory of natural selection and genetics. GA is used in areas such as neural networks, image processing, vehicle routing problems and so on; its main drawback is its high computational cost. Particle Swarm Optimization (PSO) is another metaheuristic optimization algorithm, based on the social behavior of organisms that exhibit swarming characteristics. PSO and GA are closely related population-based search methods: both advance from one set of points to another in each iteration, improving on the preceding values through a mix of probabilistic and deterministic rules. The main objective of this paper is to study and compare the efficiency and effectiveness of these two evolutionary algorithms.

Keywords: Evolutionary Algorithm (EA), Genetic Algorithm (GA), Metaheuristics, Particle Swarm Optimization (PSO).

I. Introduction

The Genetic Algorithm (GA) was introduced in the mid-1970s by John Holland and his colleagues and students at the University of Michigan [1].
GA generally provides good approximate solutions to a variety of problems. It uses biologically inspired operators such as inheritance, selection, crossover (recombination), mutation and reproduction, and applies the principle of survival of the fittest in its search process to select and generate individuals that are well adapted to their environment. GA can be applied to complex design optimization problems because it handles both discrete and continuous variables as well as non-linear objective and constraint functions without requiring gradient information.

Many optimization problems can also be solved through the concept of Swarm Intelligence, introduced by Gerardo Beni and Jing Wang in 1989 and inspired by bird flocking, ant colonies, fish schooling and animal herding. Particle Swarm Optimization (PSO) was developed by Kennedy and Eberhart in the mid-1990s [2]. The basic idea in PSO is that each particle represents a potential solution which it updates through a simple decision process. PSO has no evolution operators such as crossover and mutation; instead, its particles fly through the problem space by following the current optimum particles. Owing to its efficiency, robustness, effectiveness and simplicity, PSO has been widely adopted, and compared with other stochastic algorithms it has been found to require less computational effort [3][4]. Although PSO has shown its potential on many optimization problems, it can still require considerable execution time on large-scale engineering problems [5][6].

Section II surveys related work; sections III and IV present PSO versus GA and the comparison measure, respectively. Concluding remarks are given in section V.

II. Literature Survey

Kennedy and Eberhart [2] introduced the concept of optimizing non-linear functions using the particle swarm methodology. The evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed. Benchmark testing of the paradigm is described, and applications, including non-linear function optimization and neural network training, are proposed. The relationships between particle swarm optimization and both artificial life and genetic algorithms are described. Engelbrecht [3] provided a comprehensive introduction to the computational paradigm of Swarm Intelligence (SI), a field that emerged from biological research; he introduces the various mathematical models of the collective behavior of social insects and shows how they can be used in solving optimization problems.

III. PSO versus GA

3.1 Particle Swarm Optimization (PSO)

In this study, the basic PSO algorithm described in reference [7] is implemented. The basic algorithm is described first, followed by functional constraint handling, and finally a discrete version of the algorithm.

Figure 1: Flow chart of Particle Swarm Optimization

The GA encodes the design variables into bit strings of 0s and 1s, so it is inherently discrete and easily handles discrete design variables. PSO, on the other hand, is inherently continuous and must be modified to handle discrete design variables. The simple PSO algorithm consists of only three steps: obtaining the particles' positions and velocities, calculating the velocity vectors, and finally updating the positions. A particle refers to a point in the design space that changes its position in each iteration according to the velocity updates. Initially, the positions x_i^0 and the velocities v_i^0 of the swarm of particles are randomly generated within the upper and lower bounds on the design variable values, x_min and x_max, as given in equations (1) and (2). Positions and velocities are expressed in vector form, with the subscript and superscript denoting the i-th particle at time k. In equations (1) and (2), rn1 and rn2 are uniformly distributed random numbers between 0 and 1.

x_i^0 = x_min + rn1 (x_max - x_min)    (1)
v_i^0 = x_min + rn2 (x_max - x_min)    (2)

The next step is to calculate the velocities of all particles at time k+1 using the objective (fitness) values of the particles in the design space at time k.
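The initialization in equations (1) and (2) can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' code; the swarm size is an arbitrary choice, and the [-6, 6] bounds simply match the Rosenbrock limits used later in the paper.

```python
import numpy as np

# Illustrative swarm size and bounds (assumptions, not values from the paper).
n_particles, n_dims = 20, 2
x_min, x_max = -6.0, 6.0

rng = np.random.default_rng(0)

# Equation (1): random initial positions within the bounds.
rn1 = rng.random((n_particles, n_dims))
x = x_min + rn1 * (x_max - x_min)

# Equation (2): initial velocities generated the same way, as the paper does.
rn2 = rng.random((n_particles, n_dims))
v = x_min + rn2 * (x_max - x_min)
```

Because rn1 and rn2 lie in [0, 1), every initial position and velocity component lands inside [x_min, x_max].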
The fitness function values identify the particle with the best global value in the current swarm, p_g^k, and each particle's own best position, p_i. The velocity update formula uses this information, together with the effect of the current motion, v_i^k, to provide a search direction for the next iteration, v_i^{k+1}. The uniformly distributed random variables rn1 and rn2 prevent entrapment in local optima and give good coverage of the design space. Three parameters are used: the inertia of the particle, w, and two trust parameters, c1 and c2. The parameter c1 indicates how much confidence the particle has in itself, and c2 how much confidence it has in the swarm.

v_i^{k+1} = w v_i^k + c1 rn1 (p_i - x_i^k) + c2 rn2 (p_g^k - x_i^k)    (3)
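A minimal sketch of the velocity update in equation (3), with the three terms (inertia, cognitive pull toward p_i, social pull toward p_g) kept explicit. The defaults w=1, c1=2, c2=2 are the usual PSO values discussed in the paper; nothing else here comes from the authors' implementation.

```python
import numpy as np

def update_velocity(v, x, p_i, p_g, w=1.0, c1=2.0, c2=2.0, rng=None):
    """Equation (3): new velocity = inertia + cognitive term + social term.

    v, x   : current velocities and positions, shape (n_particles, n_dims)
    p_i    : each particle's own best position
    p_g    : the best position found by the swarm, p_g^k
    """
    rng = rng or np.random.default_rng()
    rn1 = rng.random(x.shape)  # fresh uniform random numbers each iteration
    rn2 = rng.random(x.shape)
    return w * v + c1 * rn1 * (p_i - x) + c2 * rn2 * (p_g - x)
```

A quick sanity check: a particle at rest that already sits on both its own best and the swarm best receives zero velocity, since every term of equation (3) vanishes.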

In the usual PSO algorithm, the values used for w, c1 and c2 are 1, 2 and 2 respectively. In this paper, however, it is found that values of 0.3, 1.3 and 1.2 for w, c1 and c2 respectively give the best convergence rate across all kinds of problems. Other combinations of values may produce slower convergence or no convergence at all.

Actual PSO values        Modified PSO values
Weight factor   Value    Weight factor   Value
w               1        w               0.3
c1              2        c1              1.3
c2              2        c2              1.2

Table 1: Comparison of actual and modified PSO values

The last step is the position update. The position is updated in each iteration according to the velocity vector:

x_i^{k+1} = x_i^k + v_i^{k+1}    (4)

The velocity update, position update and fitness calculation are repeated until the desired convergence is obtained. In the PSO implemented in this study, the stopping criterion is that the maximum change in the best fitness must be smaller than a defined tolerance ε over a defined number of moves S, as in equation (5). Here S is set to ten moves and ε to 10^-5 for all test problems.

|f(p_g^k) - f(p_g^(k-q))| ≤ ε,  q = 1, 2, ..., S    (5)

In PSO, a design variable can take any value depending on the particle's current position in the design space and the calculated velocity vector. When the velocity vector grows rapidly, a design variable can move outside its upper and lower limits, x_min or x_max, which leads to divergence.

Figure 2: Illustration of velocity and position updates in Particle Swarm Optimization

To avoid this problem, whenever a design variable violates its upper or lower design bound, it is artificially brought back to its nearest side constraint. The functional constraints are handled in the same way as in the GA, but in this comparison a linear exterior penalty function is used to handle the functional constraints, as shown in Equation 6.
f'(x) = f(x) + Σ_{i=1}^{N_con} r_i max[0, g_i(x)]    (6)

3.2 Genetic Algorithm (GA)

In this study, the benchmark problems are solved with a basic binary-encoded GA using tournament selection, uniform crossover and a low-probability mutation rate. The binary strings of 0s and 1s, referred to as chromosomes, represent the design variables of each individual design. The GA works with a coding of the design parameters that permits both discrete and continuous parameters in one problem statement. To carry out the optimization-like process, the GA applies three operators to propagate from one population to the next. The first operator is selection, which mimics the principle of survival of the fittest. The second operator is crossover, which mimics mating in biological populations; it propagates the features of good surviving designs from the current population to the future population, which will have a better fitness value on average. The last operator is mutation, which promotes diversity in the population characteristics.
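The three operators just described (tournament selection, uniform crossover and low-probability bit-flip mutation) can be sketched as below. This is a hedged illustration of the standard operators, not the authors' implementation; the tournament size of 2 and the 0.01 mutation rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def tournament_select(pop, fitness, k=2):
    """Selection: draw k individuals at random, keep the fittest (minimization)."""
    idx = rng.integers(0, len(pop), size=k)
    return pop[idx[np.argmin(fitness[idx])]].copy()

def uniform_crossover(parent_a, parent_b):
    """Crossover: each bit of the child comes from either parent with probability 0.5."""
    mask = rng.random(parent_a.shape) < 0.5
    return np.where(mask, parent_a, parent_b)

def mutate(chromosome, rate=0.01):
    """Mutation: low-probability bit flips that maintain population diversity."""
    flips = rng.random(chromosome.shape) < rate
    return np.where(flips, 1 - chromosome, chromosome)
```

One generation then amounts to repeatedly selecting two parents, crossing them over, and mutating the child until a new population of the same size is filled.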

The mutation operator permits a global search of the design space and protects the algorithm from getting trapped in local minima.

3.2.1 Basic implementation of algorithm

Figure 3: Flow chart of Genetic Algorithm

IV. Comparison Measure

The objective of this paper is to compare the performance of two empirical search methods, GA and PSO, on a set of test problems. A t-test (hypothesis test) is used to compare the effectiveness and efficiency of the two search algorithms. Two hypotheses are tested: the first concerns effectiveness (finding the true global optimum) and the second efficiency (computational cost). Effectiveness is the ability of the algorithm to find the global solution when the algorithm is started from different random points in the design space. The quality of a solution is measured by how close it is to the known global solution, as shown in equation (7).

Q_sol = |(solution - known solution) / known solution| × 100%    (7)

The solution quality measure in equation (7) is used to construct a meaningful hypothesis for the effectiveness test of the two search algorithms, PSO and GA.

4.1 Benchmark Test Problems

In this section a set of optimization benchmark test problems is presented. These problems are solved by both GA and PSO. The set includes two unconstrained, continuous functions, the Banana (Rosenbrock) function and the Eggcrate function, each with two design variables.

4.1.1 The Banana (Rosenbrock) Function

This function is known as the Banana function because of its shape. It is described mathematically in equation (8). It has two design variables with upper and lower limits of [-6, 6], and a known global minimum at [1, 1] with an optimal function value of zero.

Minimize f(x) = 100 (x_2 - x_1^2)^2 + (1 - x_1)^2    (8)

4.1.2 The Eggcrate Function

This function is described mathematically in equation (9). It has two design variables with upper and lower limits of [-3π, 3π].
The Eggcrate function has a known global minimum at [0, 0] with an optimal function value of zero.

Minimize f(x) = x_1^2 + x_2^2 + 25 (sin^2 x_1 + sin^2 x_2)    (9)
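The two benchmark functions in equations (8) and (9) are easy to state directly in code; a minimal sketch follows. Both definitions match the equations above, with the known minima f = 0 at [1, 1] and [0, 0] respectively.

```python
import numpy as np

def rosenbrock(x):
    """Equation (8): the Banana function; global minimum f = 0 at [1, 1]."""
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def eggcrate(x):
    """Equation (9): global minimum f = 0 at [0, 0]."""
    return x[0] ** 2 + x[1] ** 2 + 25.0 * (np.sin(x[0]) ** 2 + np.sin(x[1]) ** 2)
```

Either function can serve as the fitness evaluation inside the PSO or GA loops described earlier.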

V. Conclusion

Particle Swarm Optimization (PSO) is a relatively recent heuristic search method based on the swarming and collaborative behavior of biological populations. The main objective of this paper is to test the hypothesis that, although PSO and GA yield the same effectiveness on average, PSO is more computationally efficient than GA. To examine this claim, two statistical tests were set up, one for each element of the claim: equal effectiveness, and superior efficiency of PSO over GA. The benchmark test problems included here are the Banana (Rosenbrock) function and the Eggcrate function. The effectiveness test uses a quality-of-solution metric that measures the difference between the solutions obtained by the heuristic approaches and the known solutions of the test problems. The efficiency test uses the number of function evaluations required by each search method to reach convergence, with the same convergence criterion enforced on both PSO and GA. The results show that while both PSO and GA obtain high-quality solutions, with quality indices of 99% for most test problems, the computational effort required by PSO to reach them is less than that required by GA. The computational efficiency superiority of PSO over GA is thus statistically supported.

References
[1]. J. H. Holland, "Genetic algorithms and the optimal allocation of trials", SIAM Journal of Computing, Vol. 2, No. 2, pp. 88-105, 1973.
[2]. J. Kennedy and R. Eberhart, "Particle Swarm Optimization", Proceedings of the IEEE International Conference on Neural Networks, Perth, WA, Australia, Vol. 4, pp. 1942-1948, 1995.
[3]. A. Engelbrecht, Fundamentals of Computational Swarm Intelligence, Hoboken: John Wiley & Sons, Ltd., 2006.
[4]. N. Nedjah, L. dos Santos Coelho and L. de Macedo Mourelle, Multiobjective Swarm Intelligent Systems: Theory & Experiences, Springer Science & Business Media, Vol. 261, 2009.
[5]. A. Kumar, A. Khosla, J. S. Saini and S. Singh, "Meta-heuristic range based node localization algorithm for wireless sensor networks", in Localization and GNSS (ICL-GNSS), 2012 International Conference on, IEEE, 2012, pp. 1-7.
[6]. S. Singh, Shivangna and E. Mittal, "Range based wireless sensor nodes localization using PSO and BBO and its variants", in Communication Systems and Network Technologies (CSNT), 2013 International Conference on, IEEE, 2013, pp. 309-315.
[7]. G. Venter and J. Sobieski, "Particle Swarm Optimization", AIAA 2002-1235, 43rd AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference, Denver, CO, April 2002.
[8]. D. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, MA, 1989, pp. 1-25.
[9]. E. A. Williams and W. A. Crossley, "Empirically-derived population size and mutation rate guidelines for a genetic algorithm with uniform crossover", in Soft Computing in Engineering Design and Manufacturing, P. K. Chawdhry, R. Roy and R. K. Pant (Eds.), Springer-Verlag, 1998, pp. 163-172.
[10]. D. Harnett, Statistical Methods, 3rd ed., Addison-Wesley, 1982, pp. 247-297.