Binary Differential Evolution Strategies


A.P. Engelbrecht, Member, IEEE, and G. Pampará

Abstract: Differential evolution has been shown to be a very powerful, yet simple, population-based optimization approach. The nature of its reproduction operator limits its application to continuous-valued search spaces. However, a simple discretization procedure can be used to convert floating-point solution vectors into discrete-valued vectors. This paper considers three approaches in which differential evolution can be used to solve problems with binary-valued parameters. The first approach is based on a homomorphous mapping [1], while the second approach interprets the floating-point solution vector as a vector of probabilities, used to decide on the appropriate binary value. The third approach normalizes solution vectors and then discretizes these normalized vectors to form a bitstring. Empirical results are provided to illustrate the efficiency of these methods in comparison with particle swarm optimizers.

I. INTRODUCTION

Differential evolution (DE) is a stochastic, population-based search strategy developed by Storn and Price [2], [3]. Its reproduction operator consists of a mutation step to create a trial vector, which is then used by the crossover operator to produce one offspring. Mutation step sizes are calculated as weighted differences between randomly selected individuals. It is this reliance on difference vectors that makes DE applicable to optimization problems with continuous-valued parameters. The standard DE algorithms cannot be applied as is to solve problems with binary-valued parameters. Although DE was developed for optimizing continuous-valued parameters, discretization methods have been applied to the floating-point solution vectors to transform these vectors into discrete-valued vectors. Such a procedure has been used for solving integer and mixed-integer programming problems [4]-[9].
The discretization process is quite simple: each floating-point value of a solution vector is simply rounded to the nearest integer. For problems where an ordering exists among the values of a parameter, the index number in the ordered sequence is used as the discretized value [8]. This paper presents and evaluates three approaches to using DE to optimize binary-valued parameters: The angle modulated DE (AMDE) [1] uses the standard DE to evolve a bitstring generating function. The binary DE (binDE) treats each floating-point component of a solution vector as a probability of producing either bit 0 or bit 1. Lastly, the normalization DE (normDE) first normalizes each solution vector such that all components are in the range [0, 1], and then produces a bitstring by using bit 0 if the normalized component is less than 0.5; otherwise, bit 1 is used.

A.P. Engelbrecht and G. Pampará are both with the Department of Computer Science, University of Pretoria, South Africa (email: {engel,gpampara}@cs.up.ac.za).

The rest of this paper is organized as follows: A short overview of DE is given in Section II. The AMDE is described in Section III, while the binDE is presented in Section IV. The normalization approach is described briefly in Section V. Results are presented and discussed in Section VI.

II. DIFFERENTIAL EVOLUTION

Contrary to more common EAs, the reproduction operator used in DE does not depend on some probability density function with which elements of the individual are perturbed. DE uses an arithmetic operator which alters the internal representation of individuals to generate deviations. The generated deviated vector, also known as a trial vector, is evaluated, and if the resulting fitness is better than that of the main parent, then the newly generated individual replaces its main parent. For each individual, x_i(t), of the population at generation t, a trial vector, x'_i(t), is generated as follows: Let x_i(t) be the main parent.
Then, select randomly from the population three other individuals, x_{i1}(t), x_{i2}(t) and x_{i3}(t), with i ≠ i1 ≠ i2 ≠ i3 and i1, i2, i3 ~ U(1, ..., s), where s is the population size. Select a random index, r ~ U(1, ..., n_x), where n_x is the number of genes (or parameters to be optimized) of a single chromosome. Then, for all parameters j = 1, ..., n_x, if U(0, 1) < P_r, or if j = r, let

x'_ij(t) = x_{i3,j}(t) + F (x_{i1,j}(t) - x_{i2,j}(t))    (1)

Otherwise, let

x'_ij(t) = x_ij(t)    (2)

If f(x'_i(t)) is better than f(x_i(t)), then the latter is replaced with the offspring. In the above, P_r is the probability of reproduction (with P_r ∈ [0, 1]), F is a scaling factor (with F ∈ (0, 1]), and x'_ij(t) and x_ij(t) respectively indicate the j-th parameter of the offspring and the main parent. It is important to note that the three individuals used in equation (1) are randomly selected with no bias towards more fit individuals: each individual has an equal chance of being selected. Price and Storn proposed a number of different DE strategies [10], [11], based on the individual being perturbed and the number of weighted difference vectors used in equation (1). The strategy described above is denoted DE/rand/1, meaning that the vector to be perturbed is randomly selected, and that only one difference vector is included. Other strategies include: DE/best/1, where the individual to be perturbed is selected as the best performing individual, x̂, of the

current population, in which case

x'_ij(t) = x̂_j(t) + F (x_{i1,j}(t) - x_{i2,j}(t))    (3)

DE/best/2, which is the same as the DE/best/1 strategy, except that two difference vectors are used. That is,

x'_ij(t) = x̂_j(t) + F ((x_{i1,j}(t) - x_{i2,j}(t)) + (x_{i3,j}(t) - x_{i4,j}(t)))    (4)

DE/rand/2, which is the same as the DE/rand/1 strategy, except that two difference vectors are used, as in equation (4). DE/rand-to-best/1, which adds to the DE/rand/1 strategy a difference vector using the best individual and a randomly selected individual. That is,

x'_ij(t) = x_ij(t) + F (x_{i1,j}(t) - x_{i2,j}(t)) + λ(x_{best,j}(t) - x_ij(t))    (5)

where λ controls the greediness of the strategy. Usually, λ = F. This paper follows the DE/rand/1/exp strategy.

III. ANGLE MODULATED DIFFERENTIAL EVOLUTION

Pampará et al. [1] proposed a DE algorithm that evolves solutions to binary-valued optimization problems without having to change the operation of the original DE. This is achieved by using a homomorphous mapping [12] to abstract a problem (defined in binary-valued space) into a simpler problem (defined in continuous-valued space), and then solving the problem in the abstracted space. The solution obtained in the abstracted space is then transformed back into the original space in order to solve the problem. The angle modulated DE (AMDE) makes use of angle modulation (AM), a technique derived from the telecommunications industry [13], to implement such a homomorphous mapping between binary-valued and continuous-valued space. The objective is to evolve, in the abstracted space, a bitstring generating function, which is then used in the original space to produce bit-vector solutions. The generating function used in AM is

g(x) = sin(2π(x - a) · b · cos(2π(x - a) · c)) + d    (6)

where x is a single element from a set of evenly separated intervals determined by the required number of bits that need to be generated (i.e. the dimension of the original, binary-valued space).

Fig. 1. Angle Modulation Illustrated
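As a concrete illustration, the generating function of equation (6), sampled at evenly spaced points and thresholded at zero as described below, can be sketched as follows (a minimal sketch; the function name and the default sampling interval [-2, 2], mirroring Figure 1, are assumptions):

```python
import math

def am_bitstring(a, b, c, d, n_bits, lo=-2.0, hi=2.0):
    """Sample g(x) = sin(2*pi*(x-a) * b * cos(2*pi*(x-a) * c)) + d at
    n_bits evenly spaced points and record bit 1 wherever g(x) > 0."""
    bits = []
    for k in range(n_bits):
        # Evenly separated sample points over the assumed interval [lo, hi].
        x = lo + (hi - lo) * k / max(n_bits - 1, 1)
        g = math.sin(2 * math.pi * (x - a) * b * math.cos(2 * math.pi * (x - a) * c)) + d
        bits.append(1 if g > 0 else 0)
    return bits
```

Note the role of the vertical shift: a large positive d forces every sample positive, so the generated bitstring is all ones.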
The coefficients in equation (6) determine the shape of the generating function: a represents the horizontal shift of the generating function, b represents the maximum frequency of the sin function, c represents the frequency of the cos function, and d represents the vertical shift of the generating function. Figure 1 illustrates the function for a = 0, b = 1, c = 1, and d = 0, with x ∈ [-2, 2]. The AMDE evolves values for the four coefficients a, b, c, and d. Solving a binary-valued problem thus reverts to solving a 4-dimensional problem in a continuous-valued space. After each iteration of the AMDE, the fitness of each individual in the population is determined by substituting the evolved values of the coefficients (as represented by the individual) into equation (6). The resulting function is sampled at evenly spaced intervals and a bit value is recorded for each interval: if the output of the function in equation (6) is positive, a bit-value of 1 is recorded; otherwise, a bit-value of 0 is recorded. The resulting bitstring is then evaluated by the fitness function defined in the original binary-valued space in order to determine the quality of the solution. The AMDE is summarized in Algorithm 1.

Algorithm 1 Angle Modulated Differential Evolution
  Generate a population of 4-dimensional individuals and set control parameter values;
  repeat
    Apply any DE strategy for one iteration;
    for each individual do
      Substitute the evolved values for coefficients a, b, c and d into equation (6);
      Produce n_x bit-values to form a bit-vector solution;
      Calculate the fitness of the bit-vector solution in the original binary-valued space;
    end
  until a convergence criterion is satisfied;

To emphasize, the AMDE does not directly evolve a binary string solution. Instead, it evolves a function which is used to generate the bits that form the solution. The task of solving a binary-valued problem is thereby reduced to a 4-dimensional problem, where 4 floating-point parameters need to be optimized.
IV. BINARY DIFFERENTIAL EVOLUTION

This section proposes a new approach to solving binary-valued problems using DE. The resulting algorithm is referred to as the binary DE (binDE) in this text. The binDE borrows concepts from the binary particle swarm optimizer (binPSO), developed by Kennedy and

2007 IEEE Congress on Evolutionary Computation (CEC 2007)

Eberhart [14]. As with DE, particle swarm optimization (PSO) [15], [16] uses vector algebra to calculate new search positions, and was therefore developed for continuous-valued problems. In PSO, a velocity vector represents the mutation step sizes as stochastically weighted difference vectors (i.e. the social and cognitive components). The binPSO does not interpret the velocity as a step-size vector. Rather, each component of the velocity vector is used to compute the probability that the corresponding component of the solution vector is bit 0 or bit 1. In a similar way, the binDE uses the floating-point DE individuals to determine a probability for each component. These probabilities are then used to generate a bitstring solution from the floating-point vector. This bitstring is used by the fitness function to determine its quality. The resulting fitness is then associated with the floating-point representation of the individual. Let x_i(t) represent a DE individual, with each x_ij(t) (j = 1, ..., n_x, where n_x is the dimension of the binary-valued problem) a floating-point number. Then, the corresponding bitstring solution, y_i(t), is calculated using

y_ij(t) = 1 if U(0, 1) < f(x_ij(t)), and y_ij(t) = 0 otherwise

where f is the sigmoid function,

f(x) = 1 / (1 + e^(-x))

The fitness of the individual x_i(t) is then simply the fitness obtained using the binary representation, y_i(t). The binDE algorithm is summarized in Algorithm 2.

Algorithm 2 Binary Differential Evolution Algorithm
  Initialize a population and set control parameter values;
  repeat
    Select parent x_i(t);
    Select individuals for reproduction;
    Produce one offspring, x'_i(t);
    y_i(t) = bitstring generated from x_i(t);
    y'_i(t) = bitstring generated from x'_i(t);
    if f(y'_i(t)) is better than f(y_i(t)) then
      Replace parent, x_i(t), with offspring, x'_i(t);
    else
      Retain parent, x_i(t);
    end
  until a convergence criterion is satisfied;
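The binDE decoding step above, mapping a floating-point individual to a bitstring through the sigmoid, can be sketched as follows (an illustrative sketch; the function and parameter names are not from the paper):

```python
import math
import random

def binde_bits(x, rng=random):
    """Decode a floating-point DE individual x into a bitstring: each
    component is squashed through the sigmoid f(v) = 1/(1 + e^(-v)),
    which is then used as the probability of generating bit 1."""
    return [1 if rng.random() < 1.0 / (1.0 + math.exp(-xj)) else 0 for xj in x]
```

Strongly positive components are thus decoded to bit 1 with high probability, and strongly negative components to bit 0; the fitness of the resulting bitstring is assigned back to the floating-point individual.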
V. NORMALIZATION DIFFERENTIAL EVOLUTION

The third approach to evolving bitstrings using DE first normalizes the solution represented by each individual. That is, each component of each individual is linearly scaled to the range [0, 1]. This is done as follows:

x'_ij(t) = (x_ij(t) + |x_i^min(t)|) / (|x_i^min(t)| + |x_i^max(t)|)

where x_i^min and x_i^max are the smallest and largest component values of the i-th individual, respectively. The bitstring solution is then generated using

y_ij(t) = 0 if x'_ij(t) < 0.5, and y_ij(t) = 1 otherwise

VI. EXPERIMENTAL RESULTS

This section summarizes the results of the three DE strategies, and compares these results with the binary PSO (binPSO) and an angle modulated PSO (AMPSO). Section VI-A provides the experimental procedure, while Section VI-B discusses the results.

A. Experimental Procedure

Each algorithm was run 30 times on each of the functions summarized in Tables I and II. Each run was limited to a fixed number of iterations. Results reported are averages and standard deviations over the 30 runs.

TABLE II
CHARACTERISTICS OF BENCHMARK FUNCTIONS

Function  Dimensions  Common name    Function domain
f1        3           Spherical      (-5.12, 5.12)
f2        2           2D Rosenbrock  (-2.048, 2.048)
f3        5           Step Function  (-5.12, 5.12)
f4        30          Quadric        (-1.28, 1.28)
f5        2           Foxholes       (-65.536, 65.536)
f6        2           Schaffer's F6  (-100.0, 100.0)
f7        30          Griewank       (-30.0, 30.0)
f8        30          Ackley         (-30.0, 30.0)
f9        30          Rosenbrock     (-2.048, 2.048)
f10       30          Rastrigin      (-5.12, 5.12)

For the DE implementations, the scaling factor was set to 1.0, and the reproduction probability to 0.25. A value of 0.25 was selected to ensure that at least one of the 4 coefficients of the bitstring generating function of the AMDE is modified by the reproduction operator. The value of 1.0 for the scaling factor was selected to ensure that all parents taking part in the computation of differentials have an equal share in providing direction and mutation step size information. For the PSO implementations, the inertia weight and both acceleration coefficients were set to values that have been shown to provide very good results [17].
Velocity clamping was used, with V_max set to 4.0. A Von Neumann neighborhood topology [18] was used. For all the DE and PSO implementations, a population (swarm) size of 40 individuals (particles) was used. For every individual and particle within the AMDE and the AMPSO, the chromosome and position vectors are represented by a 4-dimensional continuous-valued tuple. The values of these tuples are randomly assigned within the range [0.0, 1.0]. Each benchmark function is defined to operate in n dimensions, with m bits for each dimension. The process to obtain the required input for the benchmark functions, using the angle modulation approach, is defined as:
- Evolve the 4-dimensional vector using DE or PSO
- Use the 4 coefficients to create a trigonometric function

TABLE I
BENCHMARK FUNCTIONS

f1(x) = Σ_{i=1}^{n} x_i^2
f2(x) = 100(x_2 - x_1^2)^2 + (1 - x_1)^2
f3(x) = 6n + Σ_{i=1}^{n} ⌊x_i⌋
f4(x) = Σ_{i=1}^{n} i·x_i^4 + U(0, 1)
f5(x) = [1/500 + Σ_{j=1}^{25} 1/(j + Σ_{i=1}^{2} (x_i - a_ij)^6)]^(-1), where the 25 columns of a_ij run over all combinations of (-32, -16, 0, 16, 32)
f6(x) = 0.5 + (sin^2(√(x^2 + y^2)) - 0.5) / (1 + 0.001(x^2 + y^2))^2
f7(x) = (1/4000) Σ_{i=1}^{n} (x_i - 100)^2 - Π_{i=1}^{n} cos((x_i - 100)/√i) + 1
f8(x) = -20 exp(-0.2 √((1/n) Σ_{i=1}^{n} x_i^2)) - exp((1/n) Σ_{i=1}^{n} cos(2π x_i)) + 20 + e
f9(x) = Σ_{i=1}^{n-1} (100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2)
f10(x) = Σ_{i=1}^{n} (x_i^2 - 10 cos(2π x_i) + 10)

- Use the generated function to generate an n × m bit vector

The n × m bit vector is then used to calculate the fitness of the benchmark function. The bit vectors are used as input to the benchmark functions. The conversion process is done in accordance with the implementation of Spears. All code was implemented using CIlib.

B. Discussion of the results

The algorithms were tested on the minimization problems given in Table I. Instead of using floating-point parameters, each parameter was represented as a bitstring. The objective is then to evolve a bitstring value for each parameter of the function being minimized. In order to evaluate fitness, these bitstrings are converted back to a floating-point representation and then used by the fitness function. Table III summarizes the average fitness values for each function, while the average number of iterations to converge is given in Table IV. With reference to Table III, an average rank is assigned to each algorithm. In order to compute this rank, each algorithm is assigned a score according to its performance on each of the functions; a score of 1 is given if an algorithm provided the best result. The average rank is computed over the ranks assigned for the functions. These ranks clearly illustrate that the DE strategies performed significantly better than the PSO algorithms. It is only for f5 that the two PSO algorithms outperformed the DE strategies, while the AMPSO reached the same performance as the AMDE and the binDE for f3.
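The conversion of each parameter's bitstring back to a floating-point value in the function domain, as described above, can be sketched as follows (a minimal sketch assuming standard unsigned binary decoding; the paper follows Spears' implementation, which may differ in detail, for example by using Gray coding):

```python
def bits_to_float(bits, lo, hi):
    """Map an m-bit string to a floating-point value in [lo, hi] by
    interpreting the bits as an unsigned integer and scaling linearly."""
    m = len(bits)
    value = int("".join(str(b) for b in bits), 2)
    return lo + (hi - lo) * value / (2 ** m - 1)

def decode_solution(bitvector, n, m, lo, hi):
    """Split an n*m bit vector into n parameters and decode each one."""
    return [bits_to_float(bitvector[i * m:(i + 1) * m], lo, hi) for i in range(n)]
```

With this scheme the all-zeros string decodes to the lower domain bound and the all-ones string to the upper bound, so every representable parameter value lies inside the benchmark domain.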
Considering only the DE strategies, the AMDE provided the best accuracy, being the best performer for half of the functions and second best performer for the other half. The normalized DE performed better than the binDE. Table IV shows that the PSO algorithms required on average significantly fewer iterations to converge. However, as indicated in Table III, this comes at the cost of obtaining worse solutions than the DE strategies. This seems to indicate that the PSO algorithms exploit too quickly. Even though the AMDE provided the best solutions, this came at a computational cost: the AMDE required the most iterations to converge, significantly more than the other algorithms. This is an indication that the AMDE has better exploration abilities. (Reference C source code for the benchmark functions was provided by William Spears at wspears/functs.dejong.html.) The performance profiles for functions f1, f3, f4, f6, f9 and f10 are given in Figure 2. These profiles show, for each algorithm, the rate at which the fitness is reduced. The general trend is that the initial reduction in fitness is larger for the binDE and normDE strategies. After this initial, large improvement in fitness, further improvements are slow. This is an indication that these two algorithms exploit too much, and then later struggle to refine solutions. The AMDE, on the other hand, showed a much slower reduction in fitness but, as indicated in Table III, converges on much better solutions. These graphs support the results in Table IV and the statement that the AMDE has better exploration abilities than the other algorithms.

VII. CONCLUSIONS

This paper presented a new approach to using differential evolution (DE) to solve binary-valued optimization problems. This binary DE (binDE) is compared with two other DE approaches and two particle swarm optimization (PSO) approaches to solving binary-valued problems: the angle modulated DE (AMDE), a DE that simply uses a discretization process (the normDE), the binPSO, and an angle modulated PSO (AMPSO).
Results in this paper show, for the benchmark functions used, that the AMDE outperformed the other approaches with respect to accuracy of solutions. However, the AMDE is also much slower than the other algorithms. In general, the binary DE and the DE using discretization are faster in reducing the error, but then fail to refine solutions, which is an indication that these approaches do not sufficiently explore the search space.

TABLE III
COMPARISON OF PERFORMANCE WITH RESPECT TO AVERAGE FITNESS (DEVIATIONS ARE GIVEN IN PARENTHESES)
[Columns: Function, AMDE, binDE, normDE, AMPSO, binPSO; rows f1 to f10, followed by the average rank of each algorithm.]

TABLE IV
COMPARISON OF PERFORMANCE WITH RESPECT TO NUMBER OF ITERATIONS TO CONVERGE
[Columns: Function, AMDE, binDE, normDE, AMPSO, binPSO; rows f1 to f10, followed by the average for each algorithm.]

Further studies will expand this comparative empirical analysis to larger dimensional problems.

REFERENCES

[1] G. Pampara, A. Engelbrecht, and N. Franken, "Binary Differential Evolution," in Proceedings of the IEEE Congress on Evolutionary Computation, 2006.
[2] R. Storn and K. Price, "Differential Evolution - A Simple and Efficient Adaptive Scheme for Global Optimization over Continuous Spaces," International Computer Science Institute, Tech. Rep. TR-95-012, 1995.
[3] R. Storn and K. Price, "Differential Evolution - A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341-359, 1997.
[4] J. Lampinen and I. Zelinka, "Mixed Integer-Discrete-Continuous Optimization by Differential Evolution, Part I," in Proceedings of the Fifth International Conference on Soft Computing, 1999.
[5] Y.-C. Lin, F.-S. Wang, and K.-S. Hwang, "A Hybrid Method of Evolutionary Algorithms for Mixed-Integer Nonlinear Optimization Problems," in Proceedings of the IEEE Congress on Evolutionary Computation, vol. 3, 1999.
[6] H.-J. Huang and F.-S. Wang, "Fuzzy Decision-Making Design of Chemical Plant Using Mixed-Integer Hybrid Differential Evolution," Computers & Chemical Engineering, vol. 26, no. 12, 2002.
[7] C.-T. Su and C.-S. Lee, "Network Reconfiguration of Distribution Systems Using Improved Mixed-Integer Hybrid Differential Evolution," IEEE Transactions on Power Delivery, vol. 18, no. 3, pp. 1022-1027, 2003.
[8] V. Feoktistov and S. Janaqi, "Generalization of the Strategies in Differential Evolution," in Proceedings of the Eighteenth Parallel and Distributed Processing Symposium, 2004, p. 165.
[9] H. Schmidt and G. Thierauf, "A Combined Heuristic Optimization Technique," Advances in Engineering Software, vol. 36, no. 1, 2005.
[10] R. Storn, "On the Usage of Differential Evolution for Function Optimization," in Proceedings of the North American Fuzzy Information Processing Society, 1996, pp. 519-523.
[11] K. Price, R. Storn, and J. Lampinen, Differential Evolution: A Practical Approach to Global Optimization. Springer, 2005.
[12] S. Koziel and Z. Michalewicz, "Evolutionary Algorithms, Homomorphous Mappings, and Constrained Parameter Optimization," Evolutionary Computation, vol. 7, no. 1, pp. 19-44, 1999.
[13] J. Proakis and M. Salehi, Communication Systems Engineering, 2nd ed. Prentice Hall, 2002.
[14] J. Kennedy and R. Eberhart, "A Discrete Binary Version of the Particle Swarm Algorithm," in Proceedings of the World Multiconference on Systemics, Cybernetics and Informatics, 1997.
[15] J. Kennedy and R. Eberhart, "Particle Swarm Optimization," in Proceedings of the IEEE International Joint Conference on Neural Networks. IEEE Press, 1995, pp. 1942-1948.
[16] R. Eberhart and J. Kennedy, "A New Optimizer Using Particle Swarm Theory," in Proceedings of the Sixth International Symposium on Micromachine and Human Science, 1995, pp. 39-43.
[17] F. van den Bergh, "An Analysis of Particle Swarm Optimizers," Ph.D. dissertation, Department of Computer Science, University of Pretoria, Pretoria, South Africa, 2002.

Fig. 2. Performance Profiles for Selected Functions: (a) f1, (b) f3, (c) f4, (d) f6, (e) f9, (f) f10

[18] J. Kennedy and R. Mendes, "Population Structure and Particle Performance," in Proceedings of the IEEE Congress on Evolutionary Computation. IEEE Press, 2002.


More information

An improved PID neural network controller for long time delay systems using particle swarm optimization algorithm

An improved PID neural network controller for long time delay systems using particle swarm optimization algorithm An improved PID neural network controller for long time delay systems using particle swarm optimization algorithm A. Lari, A. Khosravi and A. Alfi Faculty of Electrical and Computer Engineering, Noushirvani

More information

QUANTUM BASED PSO TECHNIQUE FOR IMAGE SEGMENTATION

QUANTUM BASED PSO TECHNIQUE FOR IMAGE SEGMENTATION International Journal of Computer Engineering and Applications, Volume VIII, Issue I, Part I, October 14 QUANTUM BASED PSO TECHNIQUE FOR IMAGE SEGMENTATION Shradha Chawla 1, Vivek Panwar 2 1 Department

More information

Witold Pedrycz. University of Alberta Edmonton, Alberta, Canada

Witold Pedrycz. University of Alberta Edmonton, Alberta, Canada 2017 IEEE International Conference on Systems, Man, and Cybernetics (SMC) Banff Center, Banff, Canada, October 5-8, 2017 Analysis of Optimization Algorithms in Automated Test Pattern Generation for Sequential

More information

Genetic Algorithms Variations and Implementation Issues

Genetic Algorithms Variations and Implementation Issues Genetic Algorithms Variations and Implementation Issues CS 431 Advanced Topics in AI Classic Genetic Algorithms GAs as proposed by Holland had the following properties: Randomly generated population Binary

More information

Constrained Single-Objective Optimization Using Particle Swarm Optimization

Constrained Single-Objective Optimization Using Particle Swarm Optimization 2006 IEEE Congress on Evolutionary Computation Sheraton Vancouver Wall Centre Hotel, Vancouver, BC, Canada July 16-21, 2006 Constrained Single-Objective Optimization Using Particle Swarm Optimization Karin

More information

The movement of the dimmer firefly i towards the brighter firefly j in terms of the dimmer one s updated location is determined by the following equat

The movement of the dimmer firefly i towards the brighter firefly j in terms of the dimmer one s updated location is determined by the following equat An Improved Firefly Algorithm for Optimization Problems Amarita Ritthipakdee 1, Arit Thammano, Nol Premasathian 3, and Bunyarit Uyyanonvara 4 Abstract Optimization problem is one of the most difficult

More information

GENETIC ALGORITHM VERSUS PARTICLE SWARM OPTIMIZATION IN N-QUEEN PROBLEM

GENETIC ALGORITHM VERSUS PARTICLE SWARM OPTIMIZATION IN N-QUEEN PROBLEM Journal of Al-Nahrain University Vol.10(2), December, 2007, pp.172-177 Science GENETIC ALGORITHM VERSUS PARTICLE SWARM OPTIMIZATION IN N-QUEEN PROBLEM * Azhar W. Hammad, ** Dr. Ban N. Thannoon Al-Nahrain

More information

A Novel Hybrid Self Organizing Migrating Algorithm with Mutation for Global Optimization

A Novel Hybrid Self Organizing Migrating Algorithm with Mutation for Global Optimization International Journal of Soft Computing and Engineering (IJSCE) ISSN: 2231-2307, Volume-3, Issue-6, January 2014 A Novel Hybrid Self Organizing Migrating Algorithm with Mutation for Global Optimization

More information

GA is the most popular population based heuristic algorithm since it was developed by Holland in 1975 [1]. This algorithm runs faster and requires les

GA is the most popular population based heuristic algorithm since it was developed by Holland in 1975 [1]. This algorithm runs faster and requires les Chaotic Crossover Operator on Genetic Algorithm Hüseyin Demirci Computer Engineering, Sakarya University, Sakarya, 54187, Turkey Ahmet Turan Özcerit Computer Engineering, Sakarya University, Sakarya, 54187,

More information

Optimization of Benchmark Functions Using Artificial Bee Colony (ABC) Algorithm

Optimization of Benchmark Functions Using Artificial Bee Colony (ABC) Algorithm IOSR Journal of Engineering (IOSRJEN) e-issn: 2250-3021, p-issn: 2278-8719 Vol. 3, Issue 10 (October. 2013), V4 PP 09-14 Optimization of Benchmark Functions Using Artificial Bee Colony (ABC) Algorithm

More information

Comparison of Some Evolutionary Algorithms for Approximate Solutions of Optimal Control Problems

Comparison of Some Evolutionary Algorithms for Approximate Solutions of Optimal Control Problems Australian Journal of Basic and Applied Sciences, 4(8): 3366-3382, 21 ISSN 1991-8178 Comparison of Some Evolutionary Algorithms for Approximate Solutions of Optimal Control Problems Akbar H. Borzabadi,

More information

An Improved Tree Seed Algorithm for Optimization Problems

An Improved Tree Seed Algorithm for Optimization Problems International Journal of Machine Learning and Computing, Vol. 8, o. 1, February 2018 An Improved Tree Seed Algorithm for Optimization Problems Murat Aslan, Mehmet Beskirli, Halife Kodaz, and Mustafa Servet

More information

Index Terms PSO, parallel computing, clustering, multiprocessor.

Index Terms PSO, parallel computing, clustering, multiprocessor. Parallel Particle Swarm Optimization in Data Clustering Yasin ORTAKCI Karabuk University, Computer Engineering Department, Karabuk, Turkey yasinortakci@karabuk.edu.tr Abstract Particle Swarm Optimization

More information

Constraints in Particle Swarm Optimization of Hidden Markov Models

Constraints in Particle Swarm Optimization of Hidden Markov Models Constraints in Particle Swarm Optimization of Hidden Markov Models Martin Macaš, Daniel Novák, and Lenka Lhotská Czech Technical University, Faculty of Electrical Engineering, Dep. of Cybernetics, Prague,

More information

SIMULTANEOUS COMPUTATION OF MODEL ORDER AND PARAMETER ESTIMATION FOR ARX MODEL BASED ON MULTI- SWARM PARTICLE SWARM OPTIMIZATION

SIMULTANEOUS COMPUTATION OF MODEL ORDER AND PARAMETER ESTIMATION FOR ARX MODEL BASED ON MULTI- SWARM PARTICLE SWARM OPTIMIZATION SIMULTANEOUS COMPUTATION OF MODEL ORDER AND PARAMETER ESTIMATION FOR ARX MODEL BASED ON MULTI- SWARM PARTICLE SWARM OPTIMIZATION Kamil Zakwan Mohd Azmi, Zuwairie Ibrahim and Dwi Pebrianti Faculty of Electrical

More information

A Parameter Study for Differential Evolution

A Parameter Study for Differential Evolution A Parameter Study for Differential Evolution ROGER GÄMPERLE SIBYLLE D MÜLLER PETROS KOUMOUTSAKOS Institute of Computational Sciences Department of Computer Science Swiss Federal Institute of Technology

More information

Discrete Particle Swarm Optimization for TSP based on Neighborhood

Discrete Particle Swarm Optimization for TSP based on Neighborhood Journal of Computational Information Systems 6:0 (200) 3407-344 Available at http://www.jofcis.com Discrete Particle Swarm Optimization for TSP based on Neighborhood Huilian FAN School of Mathematics and

More information

DERIVATIVE-FREE OPTIMIZATION

DERIVATIVE-FREE OPTIMIZATION DERIVATIVE-FREE OPTIMIZATION Main bibliography J.-S. Jang, C.-T. Sun and E. Mizutani. Neuro-Fuzzy and Soft Computing: A Computational Approach to Learning and Machine Intelligence. Prentice Hall, New Jersey,

More information

IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 5, NO. 1, FEBRUARY

IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 5, NO. 1, FEBRUARY IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 5, NO. 1, FEBRUARY 2001 41 Brief Papers An Orthogonal Genetic Algorithm with Quantization for Global Numerical Optimization Yiu-Wing Leung, Senior Member,

More information

MIC 2009: The VIII Metaheuristics International Conference. A Comparative Study of Adaptive Mutation Operators for Genetic Algorithms

MIC 2009: The VIII Metaheuristics International Conference. A Comparative Study of Adaptive Mutation Operators for Genetic Algorithms : The VIII Metaheuristics International Conference id-1 A Comparative Study of Adaptive Mutation Operators for Genetic Algorithms Imtiaz Korejo, Shengxiang Yang, and ChangheLi Department of Computer Science,

More information

A Steady-State Genetic Algorithm for Traveling Salesman Problem with Pickup and Delivery

A Steady-State Genetic Algorithm for Traveling Salesman Problem with Pickup and Delivery A Steady-State Genetic Algorithm for Traveling Salesman Problem with Pickup and Delivery Monika Sharma 1, Deepak Sharma 2 1 Research Scholar Department of Computer Science and Engineering, NNSS SGI Samalkha,

More information

An Optimization of Association Rule Mining Algorithm using Weighted Quantum behaved PSO

An Optimization of Association Rule Mining Algorithm using Weighted Quantum behaved PSO An Optimization of Association Rule Mining Algorithm using Weighted Quantum behaved PSO S.Deepa 1, M. Kalimuthu 2 1 PG Student, Department of Information Technology 2 Associate Professor, Department of

More information

Feeding the Fish Weight Update Strategies for the Fish School Search Algorithm

Feeding the Fish Weight Update Strategies for the Fish School Search Algorithm Feeding the Fish Weight Update Strategies for the Fish School Search Algorithm Andreas Janecek and Ying Tan Key Laboratory of Machine Perception (MOE), Peking University Department of Machine Intelligence,

More information

International Journal of Digital Application & Contemporary research Website: (Volume 1, Issue 7, February 2013)

International Journal of Digital Application & Contemporary research Website:   (Volume 1, Issue 7, February 2013) Performance Analysis of GA and PSO over Economic Load Dispatch Problem Sakshi Rajpoot sakshirajpoot1988@gmail.com Dr. Sandeep Bhongade sandeepbhongade@rediffmail.com Abstract Economic Load dispatch problem

More information

Luo, W., and Li, Y. (2016) Benchmarking Heuristic Search and Optimisation Algorithms in Matlab. In: 22nd International Conference on Automation and Computing (ICAC), 2016, University of Essex, Colchester,

More information

PARALLEL PARTICLE SWARM OPTIMIZATION IN DATA CLUSTERING

PARALLEL PARTICLE SWARM OPTIMIZATION IN DATA CLUSTERING PARALLEL PARTICLE SWARM OPTIMIZATION IN DATA CLUSTERING YASIN ORTAKCI Karabuk University, Computer Engineering Department, Karabuk, Turkey E-mail: yasinortakci@karabuk.edu.tr Abstract Particle Swarm Optimization

More information

MANY important problems require optimization,

MANY important problems require optimization, IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 21, NO. 2, APRIL 2017 281 Factored Evolutionary Algorithms Shane Strasser, Member, IEEE, John Sheppard, Fellow, IEEE, Nathan Fortier, Member, IEEE, and

More information

Opportunistic Self Organizing Migrating Algorithm for Real-Time Dynamic Traveling Salesman Problem

Opportunistic Self Organizing Migrating Algorithm for Real-Time Dynamic Traveling Salesman Problem Opportunistic Self Organizing Migrating Algorithm for Real-Time Dynamic Traveling Salesman Problem arxiv:1709.03793v1 [cs.ne] 12 Sep 2017 Shubham Dokania, Sunyam Bagga, and Rohit Sharma shubham.k.dokania@gmail.com,

More information

ARMA MODEL SELECTION USING PARTICLE SWARM OPTIMIZATION AND AIC CRITERIA. Mark S. Voss a b. and Xin Feng.

ARMA MODEL SELECTION USING PARTICLE SWARM OPTIMIZATION AND AIC CRITERIA. Mark S. Voss a b. and Xin Feng. Copyright 2002 IFAC 5th Triennial World Congress, Barcelona, Spain ARMA MODEL SELECTION USING PARTICLE SWARM OPTIMIZATION AND AIC CRITERIA Mark S. Voss a b and Xin Feng a Department of Civil and Environmental

More information

Exploration vs. Exploitation in Differential Evolution

Exploration vs. Exploitation in Differential Evolution Exploration vs. Exploitation in Differential Evolution Ângela A. R. Sá 1, Adriano O. Andrade 1, Alcimar B. Soares 1 and Slawomir J. Nasuto 2 Abstract. Differential Evolution (DE) is a tool for efficient

More information

Chapter 14 Global Search Algorithms

Chapter 14 Global Search Algorithms Chapter 14 Global Search Algorithms An Introduction to Optimization Spring, 2015 Wei-Ta Chu 1 Introduction We discuss various search methods that attempts to search throughout the entire feasible set.

More information

Revision of a Floating-Point Genetic Algorithm GENOCOP V for Nonlinear Programming Problems

Revision of a Floating-Point Genetic Algorithm GENOCOP V for Nonlinear Programming Problems 4 The Open Cybernetics and Systemics Journal, 008,, 4-9 Revision of a Floating-Point Genetic Algorithm GENOCOP V for Nonlinear Programming Problems K. Kato *, M. Sakawa and H. Katagiri Department of Artificial

More information

A MULTI-SWARM PARTICLE SWARM OPTIMIZATION WITH LOCAL SEARCH ON MULTI-ROBOT SEARCH SYSTEM

A MULTI-SWARM PARTICLE SWARM OPTIMIZATION WITH LOCAL SEARCH ON MULTI-ROBOT SEARCH SYSTEM A MULTI-SWARM PARTICLE SWARM OPTIMIZATION WITH LOCAL SEARCH ON MULTI-ROBOT SEARCH SYSTEM BAHAREH NAKISA, MOHAMMAD NAIM RASTGOO, MOHAMMAD FAIDZUL NASRUDIN, MOHD ZAKREE AHMAD NAZRI Department of Computer

More information

The Genetic Algorithm for finding the maxima of single-variable functions

The Genetic Algorithm for finding the maxima of single-variable functions Research Inventy: International Journal Of Engineering And Science Vol.4, Issue 3(March 2014), PP 46-54 Issn (e): 2278-4721, Issn (p):2319-6483, www.researchinventy.com The Genetic Algorithm for finding

More information

A HYBRID ALGORITHM BASED ON PARTICLE SWARM OPTIMIZATION

A HYBRID ALGORITHM BASED ON PARTICLE SWARM OPTIMIZATION INTERNATIONAL JOURNAL OF INFORMATION AND SYSTEMS SCIENCES Volume 1, Number 3-4, Pages 275-282 2005 Institute for Scientific Computing and Information A HYBRID ALGORITHM BASED ON PARTICLE SWARM OPTIMIZATION

More information

arxiv: v1 [cs.ne] 22 Mar 2016

arxiv: v1 [cs.ne] 22 Mar 2016 Adaptive Parameter Selection in Evolutionary Algorithms by Reinforcement Learning with Dynamic Discretization of Parameter Range arxiv:1603.06788v1 [cs.ne] 22 Mar 2016 ABSTRACT Arkady Rost ITMO University

More information

Argha Roy* Dept. of CSE Netaji Subhash Engg. College West Bengal, India.

Argha Roy* Dept. of CSE Netaji Subhash Engg. College West Bengal, India. Volume 3, Issue 3, March 2013 ISSN: 2277 128X International Journal of Advanced Research in Computer Science and Software Engineering Research Paper Available online at: www.ijarcsse.com Training Artificial

More information

AN NOVEL NEURAL NETWORK TRAINING BASED ON HYBRID DE AND BP

AN NOVEL NEURAL NETWORK TRAINING BASED ON HYBRID DE AND BP AN NOVEL NEURAL NETWORK TRAINING BASED ON HYBRID DE AND BP Xiaohui Yuan ', Yanbin Yuan 2, Cheng Wang ^ / Huazhong University of Science & Technology, 430074 Wuhan, China 2 Wuhan University of Technology,

More information

Hybrid PSO-SA algorithm for training a Neural Network for Classification

Hybrid PSO-SA algorithm for training a Neural Network for Classification Hybrid PSO-SA algorithm for training a Neural Network for Classification Sriram G. Sanjeevi 1, A. Naga Nikhila 2,Thaseem Khan 3 and G. Sumathi 4 1 Associate Professor, Dept. of CSE, National Institute

More information

Genetic-PSO Fuzzy Data Mining With Divide and Conquer Strategy

Genetic-PSO Fuzzy Data Mining With Divide and Conquer Strategy Genetic-PSO Fuzzy Data Mining With Divide and Conquer Strategy Amin Jourabloo Department of Computer Engineering, Sharif University of Technology, Tehran, Iran E-mail: jourabloo@ce.sharif.edu Abstract

More information

THREE PHASE FAULT DIAGNOSIS BASED ON RBF NEURAL NETWORK OPTIMIZED BY PSO ALGORITHM

THREE PHASE FAULT DIAGNOSIS BASED ON RBF NEURAL NETWORK OPTIMIZED BY PSO ALGORITHM THREE PHASE FAULT DIAGNOSIS BASED ON RBF NEURAL NETWORK OPTIMIZED BY PSO ALGORITHM M. Sivakumar 1 and R. M. S. Parvathi 2 1 Anna University, Tamilnadu, India 2 Sengunthar College of Engineering, Tamilnadu,

More information

An Approach to Polygonal Approximation of Digital CurvesBasedonDiscreteParticleSwarmAlgorithm

An Approach to Polygonal Approximation of Digital CurvesBasedonDiscreteParticleSwarmAlgorithm Journal of Universal Computer Science, vol. 13, no. 10 (2007), 1449-1461 submitted: 12/6/06, accepted: 24/10/06, appeared: 28/10/07 J.UCS An Approach to Polygonal Approximation of Digital CurvesBasedonDiscreteParticleSwarmAlgorithm

More information

A Cultivated Differential Evolution Algorithm using modified Mutation and Selection Strategy

A Cultivated Differential Evolution Algorithm using modified Mutation and Selection Strategy A Cultivated Differential Evolution Algorithm using modified Mutation and Selection Strategy Pooja *1 Praveena Chaturvedi 1 Pravesh Kumar 2 1. Department of Computer Science, Gurukula Kangri Vishwavidyalaya,

More information

Experimental Study on Bound Handling Techniques for Multi-Objective Particle Swarm Optimization

Experimental Study on Bound Handling Techniques for Multi-Objective Particle Swarm Optimization Experimental Study on Bound Handling Techniques for Multi-Objective Particle Swarm Optimization adfa, p. 1, 2011. Springer-Verlag Berlin Heidelberg 2011 Devang Agarwal and Deepak Sharma Department of Mechanical

More information

Automatic differentiation based for particle swarm optimization steepest descent direction

Automatic differentiation based for particle swarm optimization steepest descent direction International Journal of Advances in Intelligent Informatics ISSN: 2442-6571 Vol 1, No 2, July 2015, pp. 90-97 90 Automatic differentiation based for particle swarm optimization steepest descent direction

More information

Lamarckian Repair and Darwinian Repair in EMO Algorithms for Multiobjective 0/1 Knapsack Problems

Lamarckian Repair and Darwinian Repair in EMO Algorithms for Multiobjective 0/1 Knapsack Problems Repair and Repair in EMO Algorithms for Multiobjective 0/ Knapsack Problems Shiori Kaige, Kaname Narukawa, and Hisao Ishibuchi Department of Industrial Engineering, Osaka Prefecture University, - Gakuen-cho,

More information

A Hybrid Metaheuristic Based on Differential Evolution and Local Search with Quadratic Interpolation

A Hybrid Metaheuristic Based on Differential Evolution and Local Search with Quadratic Interpolation A Hybrid Metaheuristic Based on Differential Evolution and Local Search with Quadratic Interpolation María Laura Tardivo 1, Leticia Cagnina 2, Guillermo Leguizamón 2 1 Department of Computer Science, Universidad

More information

Particle Swarm Optimization Artificial Bee Colony Chain (PSOABCC): A Hybrid Meteahuristic Algorithm

Particle Swarm Optimization Artificial Bee Colony Chain (PSOABCC): A Hybrid Meteahuristic Algorithm Particle Swarm Optimization Artificial Bee Colony Chain (PSOABCC): A Hybrid Meteahuristic Algorithm Oğuz Altun Department of Computer Engineering Yildiz Technical University Istanbul, Turkey oaltun@yildiz.edu.tr

More information

We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists. International authors and editors

We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists. International authors and editors We are IntechOpen, the world s leading publisher of Open Access books Built by scientists, for scientists 4,100 116,000 120M Open access books available International authors and editors Downloads Our

More information

Unidimensional Search for solving continuous high-dimensional optimization problems

Unidimensional Search for solving continuous high-dimensional optimization problems 2009 Ninth International Conference on Intelligent Systems Design and Applications Unidimensional Search for solving continuous high-dimensional optimization problems Vincent Gardeux, Rachid Chelouah,

More information

Optimization Using Particle Swarms with Near Neighbor Interactions

Optimization Using Particle Swarms with Near Neighbor Interactions Optimization Using Particle Swarms with Near Neighbor Interactions Kalyan Veeramachaneni, Thanmaya Peram, Chilukuri Mohan, and Lisa Ann Osadciw Department of Electrical Engineering and Computer Science

More information

Innovative Strategy of SOMA Control Parameter Setting

Innovative Strategy of SOMA Control Parameter Setting Innovative Strategy of SOMA Control Parameter Setting PAVEL VAŘACHA Tomas Bata University in Zlin Faculty of Applied Informatics nam. T.G. Masaryka 5555, 76 1 Zlin CZECH REPUBLIC varacha@fai.utb.cz http://www.fai.utb.cz

More information

Optimal Reactive Power Dispatch Using Hybrid Loop-Genetic Based Algorithm

Optimal Reactive Power Dispatch Using Hybrid Loop-Genetic Based Algorithm Optimal Reactive Power Dispatch Using Hybrid Loop-Genetic Based Algorithm Md Sajjad Alam Student Department of Electrical Engineering National Institute of Technology, Patna Patna-800005, Bihar, India

More information

What Makes A Successful Society?

What Makes A Successful Society? What Makes A Successful Society? Experiments With Population Topologies in Particle Swarms Rui Mendes and José Neves Departamento de Informática Universidade do Minho Portugal Abstract. Previous studies

More information

Hybrid Optimization Coupling Electromagnetism and Descent Search for Engineering Problems

Hybrid Optimization Coupling Electromagnetism and Descent Search for Engineering Problems Proceedings of the International Conference on Computational and Mathematical Methods in Science and Engineering, CMMSE 2008 13 17 June 2008. Hybrid Optimization Coupling Electromagnetism and Descent Search

More information

A Comparative Study of Genetic Algorithm and Particle Swarm Optimization

A Comparative Study of Genetic Algorithm and Particle Swarm Optimization IOSR Journal of Computer Engineering (IOSR-JCE) e-issn: 2278-0661,p-ISSN: 2278-8727 PP 18-22 www.iosrjournals.org A Comparative Study of Genetic Algorithm and Particle Swarm Optimization Mrs.D.Shona 1,

More information

Novel Initialisation and Updating Mechanisms in PSO for Feature Selection in Classification

Novel Initialisation and Updating Mechanisms in PSO for Feature Selection in Classification Novel Initialisation and Updating Mechanisms in PSO for Feature Selection in Classification Bing Xue, Mengjie Zhang, and Will N. Browne School of Engineering and Computer Science Victoria University of

More information

Feeder Reconfiguration Using Binary Coding Particle Swarm Optimization

Feeder Reconfiguration Using Binary Coding Particle Swarm Optimization 488 International Journal Wu-Chang of Control, Wu Automation, and Men-Shen and Systems, Tsai vol. 6, no. 4, pp. 488-494, August 2008 Feeder Reconfiguration Using Binary Coding Particle Swarm Optimization

More information

Topological Machining Fixture Layout Synthesis Using Genetic Algorithms

Topological Machining Fixture Layout Synthesis Using Genetic Algorithms Topological Machining Fixture Layout Synthesis Using Genetic Algorithms Necmettin Kaya Uludag University, Mechanical Eng. Department, Bursa, Turkey Ferruh Öztürk Uludag University, Mechanical Eng. Department,

More information

SINCE PARTICLE swarm optimization (PSO) was introduced

SINCE PARTICLE swarm optimization (PSO) was introduced IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL., NO. 5, OCTOBER 9 Frankenstein s PSO: A Composite Particle Swarm Optimization Algorithm Marco A. Montes de Oca, Thomas Stützle, Mauro Birattari, Member,

More information

PARTICLE SWARM OPTIMIZATION (PSO)

PARTICLE SWARM OPTIMIZATION (PSO) PARTICLE SWARM OPTIMIZATION (PSO) J. Kennedy and R. Eberhart, Particle Swarm Optimization. Proceedings of the Fourth IEEE Int. Conference on Neural Networks, 1995. A population based optimization technique

More information

Offspring Generation Method using Delaunay Triangulation for Real-Coded Genetic Algorithms

Offspring Generation Method using Delaunay Triangulation for Real-Coded Genetic Algorithms Offspring Generation Method using Delaunay Triangulation for Real-Coded Genetic Algorithms Hisashi Shimosaka 1, Tomoyuki Hiroyasu 2, and Mitsunori Miki 2 1 Graduate School of Engineering, Doshisha University,

More information

QCA & CQCA: Quad Countries Algorithm and Chaotic Quad Countries Algorithm

QCA & CQCA: Quad Countries Algorithm and Chaotic Quad Countries Algorithm Journal of Theoretical and Applied Computer Science Vol. 6, No. 3, 2012, pp. 3-20 ISSN 2299-2634 http://www.jtacs.org QCA & CQCA: Quad Countries Algorithm and Chaotic Quad Countries Algorithm M. A. Soltani-Sarvestani

More information

Variable Neighborhood Particle Swarm Optimization for Multi-objective Flexible Job-Shop Scheduling Problems

Variable Neighborhood Particle Swarm Optimization for Multi-objective Flexible Job-Shop Scheduling Problems Variable Neighborhood Particle Swarm Optimization for Multi-objective Flexible Job-Shop Scheduling Problems Hongbo Liu 1,2,AjithAbraham 3,1, Okkyung Choi 3,4, and Seong Hwan Moon 4 1 School of Computer

More information

COMPARISON OF ALGORITHMS FOR NONLINEAR REGRESSION ESTIMATES

COMPARISON OF ALGORITHMS FOR NONLINEAR REGRESSION ESTIMATES COMPSTAT 2004 Symposium c Physica-Verlag/Springer 2004 COMPARISON OF ALGORITHMS FOR NONLINEAR REGRESSION ESTIMATES Tvrdík J. and Křivý I. Key words: Global optimization, evolutionary algorithms, heuristics,

More information

A *69>H>N6 #DJGC6A DG C<>C::G>C<,8>:C8:H /DA 'D 2:6G, ()-"&"3 -"(' ( +-" " " % '.+ % ' -0(+$,

A *69>H>N6 #DJGC6A DG C<>C::G>C<,8>:C8:H /DA 'D 2:6G, ()-&3 -(' ( +-   % '.+ % ' -0(+$, The structure is a very important aspect in neural network design, it is not only impossible to determine an optimal structure for a given problem, it is even impossible to prove that a given structure

More information

CHAPTER 6 HYBRID AI BASED IMAGE CLASSIFICATION TECHNIQUES

CHAPTER 6 HYBRID AI BASED IMAGE CLASSIFICATION TECHNIQUES CHAPTER 6 HYBRID AI BASED IMAGE CLASSIFICATION TECHNIQUES 6.1 INTRODUCTION The exploration of applications of ANN for image classification has yielded satisfactory results. But, the scope for improving

More information

Hybrid Differential Evolution Algorithm for Traveling Salesman Problem

Hybrid Differential Evolution Algorithm for Traveling Salesman Problem Available online at www.sciencedirect.com Procedia Engineering 15 (2011) 2716 2720 Advanced in Control Engineeringand Information Science Hybrid Differential Evolution Algorithm for Traveling Salesman

More information

The Design of Pole Placement With Integral Controllers for Gryphon Robot Using Three Evolutionary Algorithms

The Design of Pole Placement With Integral Controllers for Gryphon Robot Using Three Evolutionary Algorithms The Design of Pole Placement With Integral Controllers for Gryphon Robot Using Three Evolutionary Algorithms Somayyeh Nalan-Ahmadabad and Sehraneh Ghaemi Abstract In this paper, pole placement with integral

More information