A Comparative Analysis on the Performance of Particle Swarm Optimization and Artificial Immune Systems for Mathematical Test Functions.


Australian Journal of Basic and Applied Sciences, 3(4): 4344-4350, 2009. ISSN 1991-8178

A Comparative Analysis on the Performance of Particle Swarm Optimization and Artificial Immune Systems for Mathematical Test Functions

1 David F.W. Yap, 2 S.P. Koh, 2 S.K. Tiong
1 Faculty of Electronics and Computer Engineering, Universiti Teknikal Malaysia Melaka (UTeM), Karung Berkunci No. 1752, Pejabat Pos Durian Tunggal, 76109 Durian Tunggal, Melaka, Malaysia.
2 College of Engineering, Universiti Tenaga Nasional (UNITEN), km 7 Kajang-Puchong Road, 43009 Kajang, Selangor Darul Ehsan, Malaysia.

Abstract: Over the years, the area of Artificial Immune Systems (AIS) has drawn wide attention among researchers, as the algorithm is able to enhance local searching ability and efficiency. Alternatively, Particle Swarm Optimization (PSO) has been used effectively in solving optimization problems. This paper compares the optimization results of mathematical test functions using PSO and AIS. The numerical results show that both PSO and AIS give comparable fitness solutions, with PSO performing about 56 percent faster than AIS. Conversely, for simpler mathematical functions, AIS performs marginally faster than PSO, at about 14 percent, while maintaining good accuracy of the objective value.

Key words: Artificial Immune Systems, Particle Swarm Optimization, Clonal Selection, Affinity Maturation

INTRODUCTION

Particle Swarm Optimization (PSO) makes use of the social behaviour (Pant M., 2008; Kennedy J. and R.C. Eberhart, 2001) of individuals living together in groups. Each individual tries to emulate other, better group members in order to improve itself, so that the optimization procedure is performed by the group members, as described in (Kennedy J. and R.C. Eberhart, 2001). The performance of PSO depends on the way the particles move in the search space, with a velocity that is repeatedly updated.
Conversely, an Artificial Immune System (AIS) is an intelligent problem-solving technique which finds applications in optimization, computer security, data clustering, pattern recognition and even fault tolerance (Dasgupta D., 2006). The natural immune system uses a diversity of evolutionary and adaptive techniques to protect organisms from foreign pathogens and misbehaving cells in the body (De Castro L.N. and J.V. Zuben, 2005). AIS seeks to capture some aspects of the natural immune system in a computational framework for solving engineering problems (Glickman M., 2005). The clonal selection algorithm used in our simulation is a special kind of AIS that uses the principle of clonal expansion and affinity maturation as the main forces of the evolutionary process (De Castro L.N. and J. Timmis, 2002).

The application of PSO and AIS to the optimization of mathematical functions is described here. The functions used to compare the optimization results are Rastrigin's function, Griewangk's function, Rosenbrock's function, Ackley's function and De Jong's function. While De Jong's function is unimodal, the rest of the mathematical functions are multimodal in nature. In this paper, PSO and AIS are explained briefly in Sections 2 and 3 respectively. Section 4 describes the multimodal and unimodal test functions, while Section 5 summarizes the experimental results. Conclusions and further research aspects are given in Section 6.

2. Particle Swarm Optimization:

Particle Swarm Optimization (PSO) was originally proposed by Eberhart and Kennedy as a simulation of the social behaviour of organisms that live and interact within large groups (Kennedy J. and R.C. Eberhart, 1995).

Corresponding Author: David F.W. Yap, Faculty of Electronics and Computer Engineering, Universiti Teknikal Malaysia Melaka (UTeM), Karung Berkunci No. 1752, Pejabat Pos Durian Tunggal, 76109 Durian Tunggal, Melaka, Malaysia. E-mail: david.yap@utem.edu.my

The essence of PSO is that particles move through the search space with velocities that are dynamically adjusted according to their historical behaviours. This mimics the social behaviour of animals such as bird flocking and fish schooling. With several alterations to the original model, the algorithm can optimize complex functions based on the concept of Swarm Intelligence methods (Kennedy J., 2001). The PSO algorithm starts with a group of random particles that searches for optima by updating each generation. Each particle is represented by a volume-less particle in the n-dimensional search space. The i-th particle is denoted as X_i = (x_i1, x_i2, ..., x_in). At each generation, each particle is updated by following two best values: the best solution that the particle itself has achieved so far (mbest) and the global best value (gbest) obtained so far by any particle in the population. With the inclusion of the inertia factor ω, the equations are (Shi Y. and R.C. Eberhart, 1998)

v_i = ω·v_i + α_1·rnd()·(mbest_i − x_i) + α_2·rnd()·(gbest − x_i)    (1)

x_i = x_i + v_i    (2)

where rnd() is a random number independently generated within the range [0,1], and α_1 and α_2 are two learning factors which control the influence of the individual's knowledge and that of the neighbourhood, respectively. The pseudo-code of a standard PSO algorithm is given in Figure 1. The PSO optimization algorithm can be written as follows:

1. Generate a random initial swarm of particles, assigning each one a random position and velocity.
2. Compute the fitness values of the N particles.
3. Update the values of the best position of each particle and the best position found by the swarm.
4. Update the position and the velocity of every particle according to Eq. 1 and 2.
5. Steps 2 to 4 are repeated until a pre-defined stopping condition is reached.
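The steps above can be sketched as a short, runnable Python implementation. This is a minimal sketch, not the authors' code: the inertia weight 0.7, learning factors 1.5, search bounds of ±5 and the sphere objective are illustrative assumptions; only the swarm size and iteration count of 50 follow the paper's experimental settings.

```python
import random

def pso(f, dim, n_particles=50, n_iter=50, w=0.7, a1=1.5, a2=1.5, bound=5.0):
    """Minimise f over [-bound, bound]^dim with a basic inertia-weight PSO."""
    # Step 1: random initial positions and velocities
    pos = [[random.uniform(-bound, bound) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                   # personal bests (mbest)
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best (gbest)
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                # Eq. 1: inertia + cognitive + social terms
                vel[i][d] = (w * vel[i][d]
                             + a1 * random.random() * (pbest[i][d] - pos[i][d])
                             + a2 * random.random() * (gbest[d] - pos[i][d]))
                # Eq. 2: move the particle
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:                # step 3: update mbest
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:               # ... and gbest
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Minimise De Jong's (sphere) function, whose global minimum is 0 at the origin
best, best_val = pso(lambda x: sum(xi * xi for xi in x), dim=2)
```

With these settings the swarm reliably drives the sphere objective close to zero within the 50 iterations.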
begin
  c := 0 { counter }
  Initialize particles
  Do: { for each particle }
    Calculate fitness value
    If the fitness value is better than the best fitness value in history (mbest), set the current value as the new mbest
    Choose the particle with the best fitness value of all the particles as the gbest
    Calculate particle velocity using Eq. 1
    Update particle position based on Eq. 2
    c := c + 1
  end
end

Fig. 1: Pseudo-code of the Particle Swarm Optimization

3. Artificial Immune System Clonal Selection for Optimization:

Artificial Immune Systems (AIS) are inspired by theoretical immunology and observed immune functions, principles and models, which are applied to solving engineering problems (De Castro L.N. and J. Timmis, 2002). The clonal selection algorithm is a part of AIS based on clonal expansion and affinity maturation (De Castro L.N. and J.V. Zuben, 2002). The clonal selection theory describes that when an antigen (Ag) is detected, the antibodies (Ab) that best recognize this Ag proliferate by cloning. This immune response is specific to each Ag. The immune cells reproduce in response to a replicating Ag until they succeed in recognizing and fighting this Ag. Some of the newly cloned cells differentiate into plasma cells and memory cells. The plasma cells produce Ab and promote genetic variation when they undergo mutation. The memory cells are responsible for the immunologic response to future Ag invasions. Subsequently, the selection mechanism keeps the cells with the best affinity to the Ag in the next population. The process of a standard clonal selection algorithm is characterized in pseudo-code format in Figure 2. The clonal selection optimization is summarized as follows (De Castro L.N. and J. Timmis, 2002):

1. Generate a random initial population of Ab.

2. Compute the fitness of each Ab.
3. Generate clones by cloning all cells in the Ab population.
4. Mutate the clone population to produce a mature clone population.
5. Evaluate the affinity values of the clone population.
6. Select the best Ab to compose the new Ab population.
7. Steps 3 to 6 are repeated until a pre-defined stopping condition is reached.

begin
  c := 0 { counter }
  Initialize population
  Do:
    Compute affinity
    Generate clones
    Mutate clones
    Replace the lowest-affinity Ab with a new randomly generated Ab
    c := c + 1
  end
end

Fig. 2: Pseudo-code of the Clonal Selection Algorithm

4. Mathematical Test Functions:

In order to compare the performance of PSO and AIS, five mathematical test functions are used (Suganthan P.N., 2005). These test functions provide a firm benchmarking method for testing the efficiency of global optimization algorithms. The next subsections give the details of each of the test functions used in this study.

4.1 Rastrigin's Function:

Rastrigin's function is given as

f(x) = Σ_{i=1..n} [x_i^2 − 10 cos(2πx_i) + 10]

The global minimum is located at the origin and its function value is zero. The two-dimensional Rastrigin's function is shown in Figure 3.

Fig. 3: A two-dimensional plot of Rastrigin's function
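The clonal selection steps above can be sketched in Python, here minimising a one-dimensional Rastrigin function. This is a minimal sketch under stated assumptions: mutating every clone with Gaussian noise of step 0.2 is an illustrative choice (the paper reports a mutation probability of 0.01), while the population size of 50, 50 iterations and clone size factor of 2 match the settings reported in the results section.

```python
import math
import random

def rastrigin(x):
    # Rastrigin's function (Section 4.1): global minimum 0 at the origin
    return sum(xi * xi - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)

def clonal_selection(f, dim=1, pop_size=50, n_iter=50, clone_factor=2,
                     bound=5.12):
    """Minimise f with a basic clonal selection loop (lower f = higher affinity)."""
    rand_ab = lambda: [random.uniform(-bound, bound) for _ in range(dim)]
    pop = [rand_ab() for _ in range(pop_size)]          # step 1: random Ab
    for _ in range(n_iter):
        # Steps 3-4: clone every antibody clone_factor times, mutate the clones
        clones = [[x + random.gauss(0.0, 0.2) for x in ab]
                  for ab in pop for _ in range(clone_factor)]
        # Steps 5-6: keep the pop_size best cells from parents + mature clones
        pop = sorted(pop + clones, key=f)[:pop_size]
        # Fig. 2: replace the lowest-affinity Ab with a new random Ab
        pop[-1] = rand_ab()
    return min(pop, key=f)

best_ab = clonal_selection(rastrigin)
```

Because selection is elitist (the best antibody is always retained), the best affinity never worsens across generations, and the random replacement keeps injecting diversity.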

4.2 Griewangk's Function:

Griewangk's function is defined as

f(x) = Σ_{i=1..n} x_i^2/4000 − Π_{i=1..n} cos(x_i/√i) + 1

The global minimum is located at the origin and its function value is zero. The two-dimensional Griewangk's function is shown in Figure 4.

Fig. 4: A two-dimensional plot of Griewangk's function

4.3 Rosenbrock's Function:

Rosenbrock's function is given as

f(x) = Σ_{i=1..n−1} [100(x_{i+1} − x_i^2)^2 + (1 − x_i)^2]

The global minimum is located at (1, 1) and its function value is zero. The two-dimensional Rosenbrock's function is shown in Figure 5.

Fig. 5: A two-dimensional plot of Rosenbrock's function

4.4 Ackley's Function:

Ackley's function is defined as

f(x) = −20 exp(−0.2 √((1/n) Σ_{i=1..n} x_i^2)) − exp((1/n) Σ_{i=1..n} cos(2πx_i)) + 20 + e

The global minimum is located at the origin and its function value is zero. The two-dimensional Ackley's function is shown in Figure 6.

Fig. 6: A two-dimensional plot of Ackley's function

4.5 De Jong's Function:

De Jong's function is given as

f(x) = Σ_{i=1..n} x_i^2

The global minimum is located at the origin and its function value is zero. The two-dimensional De Jong's function is shown in Figure 7.

Fig. 7: A two-dimensional plot of De Jong's function
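The five benchmark functions above, in their standard forms, translate into short Python definitions; each evaluates to zero at the global minimum stated in the text:

```python
import math

def rastrigin(x):
    return sum(xi * xi - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)

def griewangk(x):
    s = sum(xi * xi for xi in x) / 4000.0
    p = math.prod(math.cos(xi / math.sqrt(i)) for i, xi in enumerate(x, start=1))
    return s - p + 1

def rosenbrock(x):
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (1 - x[i]) ** 2
               for i in range(len(x) - 1))

def ackley(x):
    n = len(x)
    sq = sum(xi * xi for xi in x) / n
    cs = sum(math.cos(2 * math.pi * xi) for xi in x) / n
    return -20 * math.exp(-0.2 * math.sqrt(sq)) - math.exp(cs) + 20 + math.e

def de_jong(x):
    return sum(xi * xi for xi in x)

# Values at the stated global minima: the origin for all but Rosenbrock's (1, 1)
minima = [rastrigin([0.0, 0.0]), griewangk([0.0, 0.0]),
          rosenbrock([1.0, 1.0]), ackley([0.0, 0.0]), de_jong([0.0, 0.0])]
```

All five entries of `minima` are (numerically) zero, confirming the definitions against the global minima quoted in each subsection.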

RESULTS AND DISCUSSION

The simulation comparison between the performance of PSO and AIS was carried out on a computer with an AMD Phenom 9600B Quad-Core CPU running at 2.30 GHz, 2 GB of RAM and the Windows Vista Enterprise operating system. For the AIS, which was based on the clonal selection algorithm (De Castro L.N. and J.V. Zuben, 2002), the mutation probability is 0.01 and the clone size factor is 2. Both PSO and AIS have the initial population size and the number of iterations fixed at 50. The simulation results and the comparison between PSO and AIS are shown in Tables 1 through 5.

Table 1: Results obtained from Rastrigin's function.
                          PSO          AIS
Fitness function value    0.000302     0.001597
Best particle/Ab          -0.00123     0.002837
Time taken (s)            0.097136     0.22774
Speed: PSO faster than AIS by 57.35%

Table 2: Results obtained from Griewangk's function.
                          PSO          AIS
Fitness function value    0.000441     0.0017484
Best particle/Ab          0.029682     0.022177
Time taken (s)            0.093984     0.072699
Speed: AIS faster than PSO by 22.65%

Table 3: Results obtained from Rosenbrock's function.
                          PSO                  AIS
Fitness function value    0.000212             0.007289
Best particle/Ab          [0.98578; 0.97206]   [0.97852; 0.96576]
Time taken (s)            0.076162             0.274619
Speed: PSO faster than AIS by 72.27%

Table 4: Results obtained from Ackley's function.
                          PSO          AIS
Fitness function value    2.9805       2.999
Best particle/Ab          1.43E-12     0.027708
Time taken (s)            0.197996     0.321026
Speed: PSO faster than AIS by 38.32%

Table 5: Results obtained from De Jong's function.
                          PSO          AIS
Fitness function value    0.000126     2.64E-07
Best particle/Ab          0.011207     0.00051345
Time taken (s)            0.092289     0.08755
Speed: AIS faster than PSO by 5.13%

It is evident that PSO and AIS give comparable results, with the fitness value of PSO generally closest to the global minimum. Tables 1, 3 and 4 show that PSO is significantly faster than AIS for the more complex mathematical functions. Conversely, AIS is marginally faster than PSO for Griewangk's function and De Jong's function, as can be seen in Tables 2 and 5. Both of these test functions are less complicated and require minimal computational effort.
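The "% Speed" figures in Tables 1 through 5 follow from the two timing rows as the relative speed-up (slower − faster) / slower. A small Python check, with the timing values copied from the tables:

```python
# Timings in seconds from Tables 1-5, as (PSO, AIS) pairs
timings = {
    "Rastrigin":  (0.097136, 0.22774),
    "Griewangk":  (0.093984, 0.072699),
    "Rosenbrock": (0.076162, 0.274619),
    "Ackley":     (0.197996, 0.321026),
    "De Jong":    (0.092289, 0.08755),
}

def pct_faster(t_pso, t_ais):
    """Percentage by which the faster algorithm beats the slower one."""
    fast, slow = sorted((t_pso, t_ais))
    return 100 * (slow - fast) / slow

# Reproduces 57.35, 22.65, 72.27, 38.32 and 5.13 percent respectively
speedups = {name: round(pct_faster(*t), 2) for name, t in timings.items()}
```

Averaging the three functions where PSO wins gives (57.35 + 72.27 + 38.32)/3 ≈ 56 percent, and the two where AIS wins give (22.65 + 5.13)/2 ≈ 14 percent, matching the figures quoted in the abstract and conclusion.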
At a glance, the speed difference may not seem significant, as the time taken for both the PSO and AIS simulations is in the region of milliseconds on a fast PC. PSO probably performs faster due to the smaller number of parameters used in the calculation, which in turn uses less memory and thus reduces computation time.

6. Conclusion:

The simulation results of PSO and AIS were presented and compared. The use of multimodal and unimodal test functions shows that both algorithms attain comparable fitness values, with differences in the order of tens or less from the global minimum. However, both PSO and AIS were trapped in local minima during the simulation of Ackley's function. An increase in the number of iterations and a better mutation factor would most probably overcome this problem, although a longer simulation time would be required. From the results, on average, PSO performs about 56 percent faster than AIS in calculating the best fitness value of the mathematical test functions. On the contrary, AIS performs faster than PSO (about 14 percent) when less taxing

mathematical functions are used. Future work may include an improved or modified AIS, such as a hybrid algorithm, to be used to optimize other complicated test functions.

REFERENCES

Dasgupta, D., 2006. Advances in Artificial Immune Systems. IEEE Computational Intelligence Magazine, 1(4): 40-49.

De Castro, L.N. and J.V. Zuben, 2005. Recent Developments in Biologically Inspired Computing. Idea Group Inc. (IGI) Publishing.

De Castro, L.N. and J. Timmis, 2002. Artificial Immune Systems: A New Computational Approach. Springer-Verlag, New York.

De Castro, L.N. and J. Timmis, 2002. Artificial Immune Systems: A Novel Paradigm to Pattern Recognition. In: J.M. Corchado, L. Alonso and C. Fyfe (eds), Artificial Neural Networks in Pattern Recognition, SOCO-2002, University of Paisley, UK, pp: 67-84.

De Castro, L.N. and J.V. Zuben, 2002. Learning and Optimization using the Clonal Selection Principle. IEEE Transactions on Evolutionary Computation, Special Issue on Artificial Immune Systems, 6(3): 239-251.

Glickman, M., J. Balthrop and S. Forrest, 2005. A Machine Learning Evaluation of an Artificial Immune System. Evolutionary Computation Journal, 13(2): 179-212.

Kennedy, J. and R.C. Eberhart, 2001. Swarm Intelligence. Morgan Kaufmann Academic Press.

Kennedy, J. and R.C. Eberhart, 1995. Particle Swarm Optimization. In: Proceedings of the IEEE International Conference on Neural Networks, pp: 1942-1948.

Kennedy, J., R.C. Eberhart and Y. Shi, 2001. Swarm Intelligence. The Morgan Kaufmann Series in Evolutionary Computation, Academic Press, San Francisco.

Pant, M., R. Thangaraj and A. Abraham, 2008. Particle Swarm Based Meta-Heuristics for Function Optimization and Engineering Applications. 7th Computer Information Systems and Industrial Management Applications, 7: 84-90.

Shi, Y. and R.C. Eberhart, 1998. A Modified Particle Swarm Optimizer. Proceedings of the IEEE Congress on Evolutionary Computation (CEC 1998), Piscataway, N.J., pp: 69-73.

Suganthan, P.N., N. Hansen, J.J. Liang, K. Deb, Y.P. Chen, A. Auger and S. Tiwari, 2005. Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization. Technical Report, Nanyang Technological University, Singapore and IIT Kanpur, India.