THE STATISTICAL SOFTWARE NEWSLETTER 335

Simple Evolutionary Heuristics for Global Optimization

Josef Tvrdík and Ivan Křivý
University of Ostrava, Bráfova 7, Ostrava, Czech Republic

Abstract: The paper deals with the empirical comparison of five evolutionary algorithms. One of them is closely related to the controlled random search, another two are based on evolutionary search and the remaining two on differential evolution. The testbed consists of six functions widely used for testing of optimization algorithms. The average number of objective function evaluations and its variability are used as basic criteria for the comparison. (SSN in CSDA 30, 335-342 (May 1999))

Keywords: evolutionary algorithms, controlled random search, differential evolution, evolutionary search, global optimization, experimental testing

Received: December 1998
Revised: March 1999

I. Introduction

Some evolutionary algorithms belong to classes with a thorough theoretical background - see e.g. Bäck (1996), who distinguishes three such classes: genetic algorithms, evolution strategies and evolutionary programming. But there are many evolutionary algorithms (or heuristics) which cannot be classified according to Bäck's categories. This paper presents several such algorithms, namely the modified controlled random search (MCRS) (Křivý and Tvrdík, 1995), evolutionary search (Křivý and Tvrdík, 1996, 1997) and differential evolution (Storn and Price, 1997). The aim of this contribution is to compare these algorithms when they are applied to the solution of some well-known optimization tasks.

We consider the global optimization problem in the traditional sense: given an objective function f: Ω → R, Ω ⊆ R^d, Ω ≠ ∅, we search for the point x_opt such that

    f(x_opt) = min_{x ∈ Ω} f(x).

Then x_opt represents the global minimum of the real-valued function f (of d variables) in Ω.

II. MCRS Algorithm

The MCRS (Modified Controlled Random Search) algorithm (Křivý and Tvrdík, 1995, 1996) is based on the old ideas of the simplex method (Nelder and Mead, 1964) and the controlled random search (Price, 1976). The MCRS algorithm starts with a population P of N points taken at random in Ω. A new trial point x is generated from a simplex S (a set of d + 1 linearly independent points of the population P in Ω) by the relation

    x = g - Y (z - g)                                                  (1)
where z is one (randomly taken) pole of the simplex S, g is the centroid of the remaining d poles of the simplex and Y is a multiplication factor. The point x may be considered as resulting from the reflection of the point z with respect to the centroid g. The principal modification of the original Price's reflection consists in randomizing the multiplication factor Y. Instead of being constant, Y is considered as a random variable in the MCRS algorithm. Several distributions of Y were tested, and it was found that good optimization results were obtained with Y distributed uniformly on [0, α) with α ranging from 4 to 8 (Křivý and Tvrdík, 1995).

Starting from the procedure Reflection, which can be formally written as

    procedure Reflection(P, var x);
    repeat
        S := set of (d + 1) points selected from P at random;
        x := g - Y (z - g)
    until x ∈ Ω;

the description of the MCRS algorithm is very simple:

    procedure MCRS;
    begin
        P := population of N randomly generated points in Ω;
        repeat
            Reflection(P, x);
            if f(x) < f(x_max) then x_max := x
        until stopping condition is true;
    end {MCRS};

x_max being the point with the largest function value of the N points currently stored. No particular stopping condition is defined. However, in most optimization problems the stopping condition is defined as

    f(x_k) - f(x_min) ≤ ε                                              (2)

where x_min is the point with the smallest value of the objective function among all the N points of population P, x_k is another point of population P (e.g. the point with the largest function value) and ε is an input parameter.

The algorithm has only three input parameters: N, the size of the population; α, the parameter defining the uniform distribution of Y in (1); and the value of ε in the stopping condition (2). The right setting of these tuning parameters depends on the nature of the optimization problem to be solved. It is obvious that the higher the value of N, the more thorough is the search. An empirical recommendation suitable for many tasks is α = 8 and N = max(10, d^2).
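To make the procedure concrete, the MCRS loop can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: the function and variable names, the box-shaped Ω and the use of the population spread as the stopping test (2) are assumptions of the sketch.

```python
import numpy as np

def mcrs(f, lower, upper, N=None, alpha=8.0, eps=1e-8, rng=None):
    """Modified Controlled Random Search sketch, eq. (1): x = g - Y*(z - g),
    with the multiplication factor Y ~ U(0, alpha)."""
    rng = np.random.default_rng() if rng is None else rng
    lower = np.asarray(lower, float)
    upper = np.asarray(upper, float)
    d = lower.size
    N = max(10, d * d) if N is None else N            # recommended N = max(10, d^2)
    P = lower + rng.random((N, d)) * (upper - lower)  # random initial population in Omega
    fP = np.array([f(x) for x in P])
    while fP.max() - fP.min() > eps:                  # stopping condition (2)
        # Reflection: take a simplex of d+1 random points, reflect one pole z
        # through the centroid g of the remaining d poles; retry until x lies in Omega.
        while True:
            S = P[rng.choice(N, d + 1, replace=False)]
            z, g = S[0], S[1:].mean(axis=0)
            Y = alpha * rng.random()                  # randomized factor Y ~ U(0, alpha)
            x = g - Y * (z - g)
            if np.all((x >= lower) & (x <= upper)):
                break
        worst = int(fP.argmax())
        fx = f(x)
        if fx < fP[worst]:                            # replace the worst stored point
            P[worst], fP[worst] = x, fx
    return P[int(fP.argmin())], float(fP.min())
```

On a convex task such as the sphere function, `mcrs(lambda x: float(np.sum(x * x)), [-40, -40], [60, 60])` should return a point close to the origin.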
III. Evolutionary Search

Considering that the Reflection procedure plays the role of a generalized crossover and applying some ideas from genetic algorithms (e.g. Goldberg, 1989), we can deal with another class of stochastic optimization algorithms. Let us call them evolutionary search (ES) algorithms. Their structure can be described as follows:
    procedure ES;
    begin
        generate P (an old population of N points taken at random from Ω);
        repeat
            copy a portion of the best points of P into a new population Q;
            repeat
                repeat
                    apply Reflection to a simplex from P
                until the next trial point is better than the worst point in P;
                insert the next trial point into Q
            until Q is completed to N points;
            if condition for the mutation is true then mutate a point of Q;
            replace P by Q
        until stopping condition is true;
    end {ES};

Compared to MCRS, the ES algorithms work on two populations P and Q and have two additional tuning parameters, namely the probability of mutation and the portion of the old population P to be copied directly into the new population Q. It is evident that the algorithms meet the principles of GAs: the initial population is generated randomly in Ω. The new population inherits the properties of the old one in two ways: directly, through the survival of the best individuals (with respect to f-values), and indirectly, by applying the reflection defined by (1) to the old population. An individual with new properties (even with a larger f-value) is allowed to arise with a small probability (the mutation probability).

In this contribution two algorithms of this class, ES2 and ES3, were tested. They differ from each other in constructing the simplexes for generating the next trial points x according to (1). In the case of the ES2 algorithm the simplex is created completely at random from the current population P. Regarding the ES3 algorithm, however, the simplex points x_i, i = 1, 2, ..., d + 1, are selected with probability proportional to their fitness s_i, given by (see Goldberg, 1989)

    s_i = 1 - [(1 - q) i + q N] / N

where i denotes the rank of x_i when all the points are ordered in a non-decreasing sequence with respect to their f-values and q = 1/(N + 1).

The common inputs to both ES algorithms consist of:
    N, the size of the population,
    α, the parameter of the uniform distribution of Y in (1),
    ε, the value in the stopping condition (2),
    p_m, the probability of mutation,
    M, the number (or its upper limit) of the best points surviving from the old population into the new one.

IV. Differential Evolution

Differential evolution (DE) has appeared recently as a new heuristic approach for minimizing real-valued multimodal objective functions (Price and Storn, 1996). Extensive testing of the algorithms showed that DE converges faster and with more certainty than adaptive simulated annealing as well as the annealed Nelder-Mead approach, both of which have a reputation for being very powerful (Storn and Price, 1997). DE has some features similar to evolutionary search, so it
seems worthwhile to compare these algorithms, at least by means of empirical tests.

Like ES, DE also works with two populations P and Q. Basically, for each vector x_{i,P}, i = 1, 2, ..., N, of the old population P a mutant vector v_{i,Q} is generated by adding the weighted difference between two vectors of P to a third vector of P. In order to increase the diversity of the new vectors, crossover is introduced: the trial vector u_{i,Q} of the new population Q is created by replacing some elements of the vector x_{i,P} by the corresponding elements of the mutant vector v_{i,Q}. More formally, DE can be written as follows:

    procedure DE;
    begin
        generate P (an old population of N points taken at random from Ω);
        repeat
            for i := 1 to N do
            begin
                generate a mutant vector v_{i,Q};
                get a new trial vector u_{i,Q} by crossover of v_{i,Q} and x_{i,P};
                if f(u_{i,Q}) < f(x_{i,P}) then x_{i,Q} := u_{i,Q} else x_{i,Q} := x_{i,P}
            end;
            replace P by Q
        until stopping condition is true;
    end {DE};

Several strategies for generating the mutant vectors v_{i,Q} are described in (Price and Storn, 1996) and (Storn and Price, 1997). Two of them were used in this paper:

1. The strategy using three randomly taken vectors of P (denoted RAND):

    v_{i,Q} = x_{r1,P} + F (x_{r2,P} - x_{r3,P})                       (3)

where r_1, r_2, r_3 are integers taken at random from [1, N], mutually different and different from the running index i, and F is an input parameter, F > 0.

2. The strategy using the best vector and four randomly taken vectors of P (denoted BEST):

    v_{i,Q} = x_{rbest,P} + F (x_{r1,P} + x_{r2,P} - x_{r3,P} - x_{r4,P})    (4)

where x_{rbest,P} is the point of population P with the lowest value of the objective function, r_1, r_2, r_3, r_4 are integers taken at random from [1, N], mutually different and also different from the running index i, and F is again an input parameter, F > 0.
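The two mutation strategies, together with the crossover rule (5) that the text defines next, can be sketched as follows. This is an illustrative sketch assuming numpy; the function names and argument layout are my own, not from the paper.

```python
import numpy as np

def mutant(P, fvals, i, F, strategy, rng):
    """Mutant vector v_i: "RAND" is eq. (3), "BEST" is eq. (4)."""
    N = len(P)
    others = np.array([j for j in range(N) if j != i])   # indices distinct from i
    if strategy == "RAND":
        r1, r2, r3 = rng.choice(others, 3, replace=False)
        return P[r1] + F * (P[r2] - P[r3])
    r1, r2, r3, r4 = rng.choice(others, 4, replace=False)
    best = int(np.argmin(fvals))                         # point with the lowest f-value
    return P[best] + F * (P[r1] + P[r2] - P[r3] - P[r4])

def crossover(x, v, C, rng):
    """Trial vector u by rule (5): copy a circular run of L mutant components
    starting at a random index k, where P(L >= t) = C**(t - 1)."""
    d = x.size
    k = int(rng.integers(d))             # random starting position in [0, d-1]
    L = 1
    while L < d and rng.random() < C:    # geometric run length, capped at d
        L += 1
    u = x.copy()
    for t in range(L):
        u[(k + t) % d] = v[(k + t) % d]
    return u
```

One DE generation then applies `mutant`, `crossover` and the greedy replacement f(u_{i,Q}) < f(x_{i,P}) for each i in turn, exactly as in the procedure above.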
The new trial vector u_{i,Q} = (u_{i0,Q}, u_{i1,Q}, ..., u_{i(d-1),Q})^T of Q is generated according to the rule

    u_{ij,Q} = v_{ij,Q}  for j = k mod d, (k + 1) mod d, ..., (k + L - 1) mod d,
    u_{ij,Q} = x_{ij,P}  otherwise,                                    (5)

where j = 0, 1, ..., d - 1 and the integer k, chosen randomly from [0, d - 1], denotes the starting position (index) for the crossover. The integer L is the number of elements to be exchanged, drawn from [1, d] with the probability P(L ≥ t) = C^(t-1) for t = 1, 2, ..., d. C is an input parameter giving the crossover probability, C ∈ (0, 1).

V. Testing functions

All the evolutionary algorithms under consideration were tested on searching for the global minimum of some special functions which present a serious problem for many optimization algorithms. Our
collection of testing functions contains five well-known De Jong functions (some of them in a slightly modified form, see Storn and Price, 1997) and Ackley's function.

1. First De Jong function (sphere)

    f_1(x) = sum_{j=1}^{d} x_j^2

This function, whose global minimum is f_1(0) = 0, presents a relatively simple task for every serious optimization algorithm.

2. Second De Jong function (a generalized Rosenbrock's saddle)

    f_2(x) = sum_{j=1}^{d-1} [100 (x_j^2 - x_{j+1})^2 + (1 - x_j)^2]

The global minimum of this function is f_2(1) = 0.

3. Modified third De Jong function (step function)

    f_3(x) = sum_{j=1}^{d} floor(x_j + 0.5)^2

where floor(y) denotes the largest integer less than or equal to y. The global minimum f_3 = 0 is located at the plateau with x_j ∈ [-0.5, 0.5) for all j, j = 1, 2, ..., d.

4. Fourth De Jong function (quartic function)

    f_4(x) = sum_{j=1}^{d} (j x_j^4 + η)

where η is a random variable with the normal distribution N(0, 1). This function was taken to test the behavior of the optimization algorithms in the presence of Gaussian noise. Its minimum is f_4(0) = d E(η) = 0 in the mean.

5. Fifth De Jong function (Shekel's Foxholes)

    f_5(x) = 1 / { 0.002 + sum_{i=0}^{24} 1 / [ 1 + i + sum_{j=1}^{2} (x_j - a_{ij})^6 ] }

where the elements of the matrix A are defined as follows:

    a_{i1} = -32, -16, 0, 16, 32  for i = 0, 1, 2, 3, 4, resp., and a_{i1} = a_{i mod 5, 1} for i ≥ 5;
    a_{i2} = -32, -16, 0, 16, 32  for i = 0, 5, 10, 15, 20, resp., and a_{i+k, 2} = a_{i, 2} for k = 1, 2, 3, 4.

The global minimum is f_5(-32, -32) ≈ 0.998.

6. Ackley's function

    f_6(x) = -c_1 exp(-c_2 sqrt((1/d) sum_{j=1}^{d} x_j^2)) - exp((1/d) sum_{j=1}^{d} cos(c_3 x_j)) + c_1 + e
where e denotes the Euler number (the base of natural logarithms) and c_1 = 20, c_2 = 0.2, c_3 = 2π. According to Bäck (1996), this function provides a reasonable test for the necessary combination of path-oriented and volume-oriented characteristics of a search strategy. Its global minimum is f_6(0) = 0.

VI. Experimental Results

All the algorithms were implemented in Turbo Pascal and the corresponding programs were tested on a PC. The search space was defined in the same way for all testing functions, namely Ω = [-40, 60]^d. The ranges of the individual variables were set relatively large and asymmetric in order to make the optimization tasks a bit more difficult. The dimension d of the global optimization tasks was set to 10, except in the case of the fifth De Jong function, where d = 2 according to its definition.

For each algorithm two basic features were investigated: the rate of convergence and the reliability in reaching the global minimum. For each optimization task at least 100 independent runs were carried out. As regards the MCRS and ES algorithms, the common tuning parameters were adjusted to the same values, namely N = max(10, d^2), α = 8 and ε = 1E-8. The additional tuning parameters for the evolutionary search were set as follows: p_m = 0.01 and M = floor(N/2), i.e. the largest integer less than or equal to N/2. Concerning the DE algorithms, N and ε were set as before, F = 1.0 and C = 0.8.

For all testing functions except f_4, the optimization process was stopped when the stopping condition (2) was satisfied for k = floor(N/2), the population points being ordered in a non-decreasing sequence according to their function values. When searching for the global minimum of f_4, the stopping condition f(x_min) ≤ 0 was used. The experimental results are given in Tables 1 and 2.
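For reference, the six testing functions above can be coded directly. This is a sketch in numpy under the definitions given in Section V; the helper names and the per-call random generator in f_4 are choices of the sketch, not of the paper.

```python
import numpy as np

def f1(x):                                   # first De Jong: sphere
    return float(np.sum(x ** 2))

def f2(x):                                   # second De Jong: Rosenbrock's saddle
    return float(np.sum(100.0 * (x[:-1] ** 2 - x[1:]) ** 2 + (1.0 - x[:-1]) ** 2))

def f3(x):                                   # modified third De Jong: step function
    return float(np.sum(np.floor(x + 0.5) ** 2))

def f4(x, rng=None):                         # fourth De Jong: quartic with Gaussian noise
    rng = np.random.default_rng() if rng is None else rng
    j = np.arange(1, x.size + 1)
    return float(np.sum(j * x ** 4 + rng.normal(0.0, 1.0, x.size)))

# Foxholes matrix A: column 1 cycles (-32,-16,0,16,32), column 2 repeats each value 5 times
_A = np.array([[-32, -16, 0, 16, 32] * 5,
               [a for a in (-32, -16, 0, 16, 32) for _ in range(5)]], float).T

def f5(x):                                   # fifth De Jong: Shekel's Foxholes, d = 2
    inner = 1.0 + np.arange(25) + np.sum((x - _A) ** 6, axis=1)
    return float(1.0 / (0.002 + np.sum(1.0 / inner)))

def f6(x, c1=20.0, c2=0.2, c3=2.0 * np.pi):  # Ackley's function
    d = x.size
    return float(-c1 * np.exp(-c2 * np.sqrt(np.sum(x ** 2) / d))
                 - np.exp(np.sum(np.cos(c3 * x)) / d) + c1 + np.e)
```

The stated minima can be checked directly: f_2 vanishes at the all-ones vector, f_6 vanishes at the origin, and f_5 evaluates to roughly 0.998 at (-32, -32).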
As regards the notation, NOBJF gives the average number of objective function evaluations (calculated only from the runs in which the global minimum was reached), sd the corresponding standard deviation and failure the percentage of runs in which the algorithm stopped at a local minimum.

Table 1: Average number of objective function evaluations

    Algorithm   f_1   f_2   f_3   f_4   f_5   f_6
    MCRS
    ES2
    ES3
    DEBEST
    DERAND
Table 2: Variability of objective function evaluations (100 sd / NOBJF) and percentage of failures

                 a) variability                    b) failure
    Algorithm   f_1  f_2  f_3  f_4  f_5  f_6     f_1  f_2  f_3  f_4  f_5  f_6
    MCRS
    ES2
    ES3
    DEBEST
    DERAND

VII. Conclusions

1. Starting from the analysis of the experimental data, the following conclusions concerning the NOBJF values can be made: the MCRS algorithm exhibits the fastest convergence for five of the six testing functions; the DE algorithms are better only in the case of the fifth De Jong function. The results of DEBEST are comparable with the results of ES3; DERAND is slightly better in NOBJF. ES2 requires the highest NOBJF to reach the global minimum in all six cases.

2. The data in Table 2a illustrate that the DE algorithms show the smallest variability of the NOBJF values and, therefore, the prediction of their running times (NOBJF values) is more dependable. Due to the fact that the generation of a new trial point u involves only a partial evaluation of the vector components (see eq. (5)), the DE algorithms are less time-consuming than the others at the same NOBJF value.

3. Concerning the reliability of finding the global minimum (see Table 2b), all the algorithms sometimes failed in searching for the global minimum of the fifth De Jong function, the number of failures being lowest for ES2; MCRS also fails from time to time when applied to three other testing functions.

4. It seems that the heuristic used for generating a new trial point is not the most important feature of evolutionary algorithms. The reasons for this assertion are as follows: the MCRS and ES2 algorithms, in which the same heuristic is used, stand at opposite ends of both the NOBJF and the reliability scales, while the DE and ES3 algorithms are almost the same with respect to NOBJF values and reliability in spite of using different heuristic approaches. The feature of self-adaptation and its effectiveness seem to be more important.

5.
All the algorithms presented here are easy to implement and they have a very small number of tuning parameters. This makes their application to tasks of global optimization in continuous spaces easier as compared with many other algorithms. Some of the algorithms, namely MCRS, ES2, DEBEST and DERAND, proved to be sufficiently reliable in optimizing a collection of 14 nonlinear regression models which present a difficult task for optimization algorithms based on objective function derivatives (Křivý and Tvrdík, 1998).
References

Bäck, T. (1996): Evolutionary Algorithms in Theory and Practice. New York, Oxford University Press
Goldberg, D. E. (1989): Genetic Algorithms in Search, Optimization, and Machine Learning. Reading, Addison-Wesley
Křivý, I., Tvrdík, J. (1995): The Controlled Random Search Algorithm in Optimizing Regression Models. Comput. Statist. and Data Anal., 20
Křivý, I., Tvrdík, J. (1996): Stochastic Algorithms in Estimating Regression Models. In: A. Prat (Ed.), COMPSTAT Proceedings in Computational Statistics, Heidelberg, Physica-Verlag
Křivý, I., Tvrdík, J. (1997): Some Evolutionary Algorithms for Optimization. In: MENDEL'97, 3rd International Mendel Conference on Genetic Algorithms (Tech. Univ. of Brno, June 1997). Brno, PC-DIR
Křivý, I., Tvrdík, J. (1998): Stochastic Optimization and its Applications in Regression. In: R. Payne and P. Lane (Eds), COMPSTAT Proceedings in Computational Statistics, Short Communications and Posters, IACR Rothamsted
Nelder, J. A., Mead, R. (1964): A simplex method for function minimization. Computer J., 7, 308-313
Price, K., Storn, R. (1996): Minimizing the Real Functions of the ICEC'96 Contest by Differential Evolution. IEEE International Conference on Evolutionary Computation (ICEC'96)
Price, W. L. (1976): A controlled random search procedure for global optimization. Computer J., 20
Storn, R., Price, K. (1997): Differential Evolution - a Simple and Efficient Heuristic for Global Optimization over Continuous Spaces. J. of Global Optimization, 11
More informationA HYBRID APPROACH TO GLOBAL OPTIMIZATION USING A CLUSTERING ALGORITHM IN A GENETIC SEARCH FRAMEWORK VIJAYKUMAR HANAGANDI. Los Alamos, NM 87545, U.S.A.
A HYBRID APPROACH TO GLOBAL OPTIMIZATION USING A CLUSTERING ALGORITHM IN A GENETIC SEARCH FRAMEWORK VIJAYKUMAR HANAGANDI MS C2, Los Alamos National Laboratory Los Alamos, NM 87545, U.S.A. and MICHAEL NIKOLAOU
More informationRevision of a Floating-Point Genetic Algorithm GENOCOP V for Nonlinear Programming Problems
4 The Open Cybernetics and Systemics Journal, 008,, 4-9 Revision of a Floating-Point Genetic Algorithm GENOCOP V for Nonlinear Programming Problems K. Kato *, M. Sakawa and H. Katagiri Department of Artificial
More informationPerformance Assessment of DMOEA-DD with CEC 2009 MOEA Competition Test Instances
Performance Assessment of DMOEA-DD with CEC 2009 MOEA Competition Test Instances Minzhong Liu, Xiufen Zou, Yu Chen, Zhijian Wu Abstract In this paper, the DMOEA-DD, which is an improvement of DMOEA[1,
More informationCS5401 FS2015 Exam 1 Key
CS5401 FS2015 Exam 1 Key This is a closed-book, closed-notes exam. The only items you are allowed to use are writing implements. Mark each sheet of paper you use with your name and the string cs5401fs2015
More informationFour Methods for Maintenance Scheduling
Four Methods for Maintenance Scheduling Edmund K. Burke, University of Nottingham, ekb@cs.nott.ac.uk John A. Clark, University of York, jac@minster.york.ac.uk Alistair J. Smith, University of Nottingham,
More informationA Genetic Algorithm for Minimum Tetrahedralization of a Convex Polyhedron
A Genetic Algorithm for Minimum Tetrahedralization of a Convex Polyhedron Kiat-Choong Chen Ian Hsieh Cao An Wang Abstract A minimum tetrahedralization of a convex polyhedron is a partition of the convex
More informationThe Global Standard for Mobility (GSM) (see, e.g., [6], [4], [5]) yields a
Preprint 0 (2000)?{? 1 Approximation of a direction of N d in bounded coordinates Jean-Christophe Novelli a Gilles Schaeer b Florent Hivert a a Universite Paris 7 { LIAFA 2, place Jussieu - 75251 Paris
More informationA genetic algorithm implemented in Matlab is presented. Matlab is used for the following reasons:
A Genetic Algorithm for Function Optimization: A Matlab Implementation Christopher R. Houck North Carolina State University and Jeery A. Joines North Carolina State University and Michael G. Kay North
More informationA Design of an Active OTA-C Filter Based on DESA Algorithm
POSTER 2018, PRAGUE MAY 10 1 A Design of an Active OTA-C Filter Based on DESA Algorithm Dalibor BARRI 1,2 1 Dept. of Microelectronics, Czech Technical University, Technická 2, 166 27 Prague 6, Czech Republic
More informationHybrid Differential Evolution Algorithm for Traveling Salesman Problem
Available online at www.sciencedirect.com Procedia Engineering 15 (2011) 2716 2720 Advanced in Control Engineeringand Information Science Hybrid Differential Evolution Algorithm for Traveling Salesman
More informationReal Coded Genetic Algorithm Particle Filter for Improved Performance
Proceedings of 2012 4th International Conference on Machine Learning and Computing IPCSIT vol. 25 (2012) (2012) IACSIT Press, Singapore Real Coded Genetic Algorithm Particle Filter for Improved Performance
More informationI R TECHNICAL RESEARCH REPORT. Evolutionary Policy Iteration for Solving Markov Decision Processes
TECHNICAL RESEARCH REPORT Evolutionary Policy Iteration for Solving Markov Decision Processes by Hyeong Soo Chang, Hong-Gi Lee, Michael Fu, and Steven Marcus TR 00-31 I R INSTITUTE FOR SYSTEMS RESEARCH
More informationTHE EFFECT OF SEGREGATION IN NON- REPEATED PRISONER'S DILEMMA
THE EFFECT OF SEGREGATION IN NON- REPEATED PRISONER'S DILEMMA Thomas Nordli University of South-Eastern Norway, Norway ABSTRACT This article consolidates the idea that non-random pairing can promote the
More informationGA is the most popular population based heuristic algorithm since it was developed by Holland in 1975 [1]. This algorithm runs faster and requires les
Chaotic Crossover Operator on Genetic Algorithm Hüseyin Demirci Computer Engineering, Sakarya University, Sakarya, 54187, Turkey Ahmet Turan Özcerit Computer Engineering, Sakarya University, Sakarya, 54187,
More informationMulti-Objective Optimization Using Genetic Algorithms
Multi-Objective Optimization Using Genetic Algorithms Mikhail Gaerlan Computational Physics PH 4433 December 8, 2015 1 Optimization Optimization is a general term for a type of numerical problem that involves
More informationNetwork Visualization of Population Dynamics in the Differential Evolution
25 IEEE Symposium Series on Computational Intelligence Network Visualization of Population Dynamics in the Differential Evolution Petr Gajdoš, Pavel Krömer, Ivan Zelinka Department of Computer Science
More informationAdaptive Spiral Optimization Algorithm for Benchmark Problems
Bilecik Şeyh Edebali Üniversitesi Fen Bilimleri Dergisi, Cilt:, Sayı:, 6 ISSN: -77 (http://edergi.bilecik.edu.tr/index.php/fbd) Araştırma Makalesi/Research Article Adaptive Spiral Optimization Algorithm
More informationObject classes. recall (%)
Using Genetic Algorithms to Improve the Accuracy of Object Detection Victor Ciesielski and Mengjie Zhang Department of Computer Science, Royal Melbourne Institute of Technology GPO Box 2476V, Melbourne
More information1. Introduction. 2. Motivation and Problem Definition. Volume 8 Issue 2, February Susmita Mohapatra
Pattern Recall Analysis of the Hopfield Neural Network with a Genetic Algorithm Susmita Mohapatra Department of Computer Science, Utkal University, India Abstract: This paper is focused on the implementation
More informationCHAPTER 3.4 AND 3.5. Sara Gestrelius
CHAPTER 3.4 AND 3.5 Sara Gestrelius 3.4 OTHER EVOLUTIONARY ALGORITHMS Estimation of Distribution algorithms Differential Evolution Coevolutionary algorithms Cultural algorithms LAST TIME: EVOLUTIONARY
More informationAIRFOIL SHAPE OPTIMIZATION USING EVOLUTIONARY ALGORITHMS
AIRFOIL SHAPE OPTIMIZATION USING EVOLUTIONARY ALGORITHMS Emre Alpman Graduate Research Assistant Aerospace Engineering Department Pennstate University University Park, PA, 6802 Abstract A new methodology
More informationCHAPTER 2 CONVENTIONAL AND NON-CONVENTIONAL TECHNIQUES TO SOLVE ORPD PROBLEM
20 CHAPTER 2 CONVENTIONAL AND NON-CONVENTIONAL TECHNIQUES TO SOLVE ORPD PROBLEM 2.1 CLASSIFICATION OF CONVENTIONAL TECHNIQUES Classical optimization methods can be classified into two distinct groups:
More informationJob Shop Scheduling Problem (JSSP) Genetic Algorithms Critical Block and DG distance Neighbourhood Search
A JOB-SHOP SCHEDULING PROBLEM (JSSP) USING GENETIC ALGORITHM (GA) Mahanim Omar, Adam Baharum, Yahya Abu Hasan School of Mathematical Sciences, Universiti Sains Malaysia 11800 Penang, Malaysia Tel: (+)
More informationArtificial Bee Colony (ABC) Optimization Algorithm for Solving Constrained Optimization Problems
Artificial Bee Colony (ABC) Optimization Algorithm for Solving Constrained Optimization Problems Dervis Karaboga and Bahriye Basturk Erciyes University, Engineering Faculty, The Department of Computer
More informationAdaptive Crossover in Genetic Algorithms Using Statistics Mechanism
in Artificial Life VIII, Standish, Abbass, Bedau (eds)(mit Press) 2002. pp 182 185 1 Adaptive Crossover in Genetic Algorithms Using Statistics Mechanism Shengxiang Yang Department of Mathematics and Computer
More informationTopological Machining Fixture Layout Synthesis Using Genetic Algorithms
Topological Machining Fixture Layout Synthesis Using Genetic Algorithms Necmettin Kaya Uludag University, Mechanical Eng. Department, Bursa, Turkey Ferruh Öztürk Uludag University, Mechanical Eng. Department,
More informationThe movement of the dimmer firefly i towards the brighter firefly j in terms of the dimmer one s updated location is determined by the following equat
An Improved Firefly Algorithm for Optimization Problems Amarita Ritthipakdee 1, Arit Thammano, Nol Premasathian 3, and Bunyarit Uyyanonvara 4 Abstract Optimization problem is one of the most difficult
More informationTheoretical Foundations of SBSE. Xin Yao CERCIA, School of Computer Science University of Birmingham
Theoretical Foundations of SBSE Xin Yao CERCIA, School of Computer Science University of Birmingham Some Theoretical Foundations of SBSE Xin Yao and Many Others CERCIA, School of Computer Science University
More informationWhat is GOSET? GOSET stands for Genetic Optimization System Engineering Tool
Lecture 5: GOSET 1 What is GOSET? GOSET stands for Genetic Optimization System Engineering Tool GOSET is a MATLAB based genetic algorithm toolbox for solving optimization problems 2 GOSET Features Wide
More informationSensing Error Minimization for Cognitive Radio in Dynamic Environment using Death Penalty Differential Evolution based Threshold Adaptation
Sensing Error Minimization for Cognitive Radio in Dynamic Environment using Death Penalty Differential Evolution based Threshold Adaptation Soumyadip Das 1, Sumitra Mukhopadhyay 2 1,2 Institute of Radio
More informationHybrid Particle Swarm-Based-Simulated Annealing Optimization Techniques
Hybrid Particle Swarm-Based-Simulated Annealing Optimization Techniques Nasser Sadati Abstract Particle Swarm Optimization (PSO) algorithms recently invented as intelligent optimizers with several highly
More information3 Linkage identication. In each linkage. Intra GA Intra GA Intra GA. BB candidates. Inter GA;
Designing a Genetic Algorithm Using the Linkage Identication by Nonlinearity Check Masaharu Munetomo and David E. Goldberg IlliGAL Report No. 98014 December 1998 Illinois Genetic Algorithms Laboratory
More informationDifferential Evolution Algorithm for Likelihood Estimation
International Conference on Control, Robotics Mechanical Engineering (ICCRME'2014 Jan. 15-16, 2014 Kuala Lumpur (Malaysia Differential Evolution Algorithm for Likelihood Estimation Mohd Sapiyan bin Baba
More informationGenetic Algorithms for Solving. Open Shop Scheduling Problems. Sami Khuri and Sowmya Rao Miryala. San Jose State University.
Genetic Algorithms for Solving Open Shop Scheduling Problems Sami Khuri and Sowmya Rao Miryala Department of Mathematics and Computer Science San Jose State University San Jose, California 95192, USA khuri@cs.sjsu.edu
More informationSparse Matrices Reordering using Evolutionary Algorithms: A Seeded Approach
1 Sparse Matrices Reordering using Evolutionary Algorithms: A Seeded Approach David Greiner, Gustavo Montero, Gabriel Winter Institute of Intelligent Systems and Numerical Applications in Engineering (IUSIANI)
More informationModel Parameter Estimation
Model Parameter Estimation Shan He School for Computational Science University of Birmingham Module 06-23836: Computational Modelling with MATLAB Outline Outline of Topics Concepts about model parameter
More informationA Genetic Algorithm Based Hybrid Approach for Reliability- Redundancy Optimization Problem of a Series System with Multiple- Choice
https://dx.doi.org/10.33889/ijmems.017..3-016 A Genetic Algorithm Based Hybrid Approach for Reliability- Redundancy Optimization Problem of a Series System with Multiple- Choice Asoke Kumar Bhunia Department
More informationcontrol polytope. These points are manipulated by a descent method to compute a candidate global minimizer. The second method is described in Section
Some Heuristics and Test Problems for Nonconvex Quadratic Programming over a Simplex Ivo Nowak September 3, 1998 Keywords:global optimization, nonconvex quadratic programming, heuristics, Bezier methods,
More informationNetwork Routing Protocol using Genetic Algorithms
International Journal of Electrical & Computer Sciences IJECS-IJENS Vol:0 No:02 40 Network Routing Protocol using Genetic Algorithms Gihan Nagib and Wahied G. Ali Abstract This paper aims to develop a
More informationOPTIMIZATION EVOLUTIONARY ALGORITHMS. Biologically-Inspired and. Computer Intelligence. Wiley. Population-Based Approaches to.
EVOLUTIONARY OPTIMIZATION ALGORITHMS Biologically-Inspired and Population-Based Approaches to Computer Intelligence Dan Simon Cleveland State University Wiley DETAILED TABLE OF CONTENTS Acknowledgments
More informationList of Figures 1 The GGA algorithm Touching clusters data with articial boundaries The frequency
Clustering with a Genetically Optimized Approach L.O. Hall 1, B. Ozyurt 1, J.C. Bezdek 2 1 Department of Computer Science and Engineering University of South Florida Tampa, Fl. 33620 2 Department of Computer
More informationAn Introduction to Evolutionary Algorithms
An Introduction to Evolutionary Algorithms Karthik Sindhya, PhD Postdoctoral Researcher Industrial Optimization Group Department of Mathematical Information Technology Karthik.sindhya@jyu.fi http://users.jyu.fi/~kasindhy/
More informationRuffled by Ridges: How Evolutionary Algorithms can fail
Ruffled by Ridges: How Evolutionary Algorithms can fail Darrell Whitley and Monte Lunacek Computer Science, Colorado State University, Fort Collins, CO 80523 Abstract. The representations currently used
More informationCooperative Coevolution using The Brain Storm Optimization Algorithm
Cooperative Coevolution using The Brain Storm Optimization Algorithm Mohammed El-Abd Electrical and Computer Engineering Department American University of Kuwait Email: melabd@auk.edu.kw Abstract The Brain
More informationUnidimensional Search for solving continuous high-dimensional optimization problems
2009 Ninth International Conference on Intelligent Systems Design and Applications Unidimensional Search for solving continuous high-dimensional optimization problems Vincent Gardeux, Rachid Chelouah,
More information