A Java Implementation of the SGA, UMDA, ECGA, and HBOA

arXiv:1506.07980v1 [cs.NE] 26 Jun 2015

José C. Pereira                        Fernando G. Lobo
CENSE and DEEI-FCT                     CENSE and DEEI-FCT
Universidade do Algarve                Universidade do Algarve
Campus de Gambelas                     Campus de Gambelas
8005-139 Faro, Portugal                8005-139 Faro, Portugal
unidadeimaginaria@gmail.com            fernando.lobo@gmail.com

Abstract

The Simple Genetic Algorithm, the Univariate Marginal Distribution Algorithm, the Extended Compact Genetic Algorithm, and the Hierarchical Bayesian Optimization Algorithm are all well-known Evolutionary Algorithms. In this report we present a Java implementation of these four algorithms, with detailed instructions on how to use each of them to solve a given set of optimization problems. Additionally, we explain how to implement and integrate new problems within the provided set. The source and binary files of the Java implementations are available for free download at https://github.com/josecpereira/2015evolutionaryalgorithmsjava.

1 Introduction

Evolutionary Algorithms (EAs) are a group of optimization techniques inspired by computational models of evolutionary processes from biology, namely natural selection and survival of the fittest. The idea of using biological evolution as a paradigm for stochastic problem solvers has been around since the 1950s, and it experienced major developments in the 1970s and 1980s with the emergence of four main independent research fields: Genetic Algorithms (GA), Evolution Strategies (ES), Evolutionary Programming (EP), and Genetic Programming (GP). Over the last two decades, new classes of Evolutionary Algorithms have been developed, one of them being Estimation of Distribution Algorithms (EDAs). Currently, EAs are a very active and dynamic research field, with applications in many areas of scientific and technological knowledge.

In this report we present Java implementations of the Simple Genetic Algorithm (SGA) (Holland, 1975; Goldberg, 1989; Eiben and Smith, 2003) and of three well-known EDAs: the Univariate Marginal Distribution Algorithm (UMDA) (Mühlenbein and Paaß, 1996), the Extended Compact Genetic Algorithm (ECGA) (Harik, 1999), and the Hierarchical Bayesian Optimization Algorithm (HBOA) (Pelikan and Goldberg, 2006). Each of these algorithms is a well-known EA with a significant amount of published research. We encourage the interested reader to explore the existing rich scientific literature on the subject.

The Java implementations of these four EAs follow the same basic design, with similar I/O interfaces and parameter configuration. All implementations satisfy the same set of constraints:

1. The algorithm represents possible solutions (individuals) as strings of zeros and ones.

2. All individuals have the same string size.

3. The population size remains constant throughout a complete run of the algorithm.

In the following, we provide detailed instructions on how to use each of the algorithms to solve optimization problems. A set of well-known optimization problems is given and, additionally, we explain how to implement and integrate new problems within the provided set. The source and binary files of the Java implementations are available for free download at https://github.com/josecpereira/2015evolutionaryalgorithmsjava.

The implemented set of standard EAs presented in this report is also the basis for the Java implementation of the Parameter-less Evolutionary Algorithms, discussed in a separate arXiv report, whose source is also available for free download at https://github.com/josecpereira/2015parameterlessevolutionaryalgorithmsjava.

2 How to use the code

Herein we provide detailed instructions on how to use a Java implementation of the generic algorithm EA, where the acronym EA is a mere placeholder for any of the four algorithms: SGA, UMDA, ECGA, and HBOA. The EA is a Java application developed with the Eclipse IDE (version: Kepler Service Release 2). The available code is already compiled and can be executed from the command line.

Run the EA from a command line

1. Unzip the source file 2015EAs.zip to any directory.

2. Open your favourite terminal and execute the command

       cd [yourdirectory]/2015EvolutionaryAlgorithmsJava/EA/bin

   where [yourdirectory] is the name of the directory chosen in step 1.

3. Execute the command

       java com/EA/EA EAParameters.txt

The argument EAParameters.txt is the name of the file containing all the options concerning the parameter settings, and it can be changed at will. After each execution of a single run or of multiple runs, the EA produces one output file, EA_*_*.txt, that records how each run progressed in terms of fitness calls and best current fitness, among other relevant information. Additionally, the EA creates the file EA-STATS_*_*.txt, which stores some of the statistics necessary for analyzing the behaviour of the EA over multiple runs.

2.1 Set of optimization problems

The current code includes a set of optimization problems that can be used to test any of the four EAs. Here is the problem menu:

ZERO Problems                        ONE Problems
 0  ZeroMax                          10  OneMax
 1  Zero Quadratic                   11  Quadratic
 2  Zero 3-Deceptive                 12  3-Deceptive
 3  Zero 3-Deceptive Bipolar         13  3-Deceptive Bipolar
 4  Zero 3-Deceptive Overlapping     14  3-Deceptive Overlapping
 5  Zero Concatenated Trap-k         15  Concatenated Trap-k
 6  Zero Uniform 6-Blocks            16  Uniform 6-Blocks

Hierarchical Problems
21  Hierarchical Trap One
22  Hierarchical Trap Two

The Zero problems always have the string with all zeros as their best individual. The One problems are the same as the Zero problems, except that their best individual is now the string with all ones. A description of these problems can be found, for instance, in Pelikan et al. (2000). The Hierarchical problems are thoroughly described in Pelikan (2005).

It is also possible to define a noisy version of any of the previous problems. This is done by adding a non-zero Gaussian noise term to the fitness function.

The source code that implements all the problems mentioned in this section can be found in the file src/com/EA/Problem.java. As mentioned previously, all configuration and parameter-setting options of the EA are in the file EAParameters.txt. To choose a particular problem, the user must set the value of the following three options in that file:

Line 81:  problemType
Line 90:  stringSize
Line 107: sigmaK (defines the noise component)

A purely illustrative excerpt of such a configuration is sketched below.
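
The option names come from the report, but the key=value layout, the sample values, and the "#" comments below are our assumptions; the distributed EAParameters.txt file itself is the authoritative reference. Under those assumptions, a noiseless 90-bit OneMax run might be configured as:

    problemType = 10     # 10 selects OneMax in the problem menu above
    stringSize  = 90     # every individual is a 90-bit string
    sigmaK      = 0      # zero sigmaK: no Gaussian noise is added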

2.2 How to implement a new problem

The EA uses the design pattern strategy (Gamma et al., 1995) to decouple the implementation of a particular problem from the rest of the evolutionary algorithm (see Figure 1). As a consequence, to integrate a new problem into the given set of test problems, it is only necessary to define one class that implements the interface IProblem and to change some input options to include the new choice. The interface IProblem can be found in the file src/com/EA/Problem.java. We note that the reading of these instructions is best complemented by a careful analysis of the corresponding source code.

[Figure 1 (not reproduced): The EA uses the design pattern strategy (Gamma et al., 1995) to allow an easy implementation of new problems to be solved by the framework.]

In the following, let us consider that we want to solve a new problem called NewProblem with the EA algorithm. To plug in this problem it is necessary to take the four steps below; a code sketch illustrating steps 1 to 3 follows the list.

1. Define a class called NewProblem in the file src/com/EA/Problem.java, implementing the interface IProblem.

2. Code the body of the function computeFitness(Individual) according to the nature of the problem NewProblem. The class Individual provides all the functionality necessary for operating on the string of zeros and ones that represents an individual (e.g., getAllele(int)). This class can be found in the file src/com/EA/Individual.java.

3. To define the new problem option, add the corresponding case line to the switch command in the function initializeProblem() of the file src/com/EA/EAParameter.java. The case number 99 is a mere identifier of the new problem option; the user is free to choose another value for this purpose. The rest of the line is to be written verbatim.

4. Validate the new problem option value 99 by adding the case problemType == 99 within the conditional if(optionName.equals("problemType")) of the same EAParameter.java file.

Although not strictly necessary, it is also advisable to keep the problem menu in the file EAParameters.txt up to date.
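
The original report's listings for steps 1 and 3 are not reproduced in this transcription, so the following is only a minimal sketch under stated assumptions: IProblem is taken to declare a single method computeFitness(Individual) returning a double; getAllele(int) is documented above, while getStringSize() is a hypothetical accessor of ours; and the OneMax-style fitness body is purely a placeholder.

    // Steps 1 and 2 (sketch): a new problem class in src/com/EA/Problem.java.
    // Assumes IProblem declares: double computeFitness(Individual individual).
    class NewProblem implements IProblem {

        // Step 2: code the fitness according to the nature of NewProblem.
        // This body simply counts the ones in the string (OneMax-like);
        // replace it with the real objective function.
        public double computeFitness(Individual individual) {
            double fitness = 0;
            for (int i = 0; i < individual.getStringSize(); i++) {
                fitness += individual.getAllele(i);  // each allele is 0 or 1
            }
            return fitness;
        }
    }

    // Step 3 (sketch): the line added to the switch command of
    // initializeProblem() in src/com/EA/EAParameter.java might look like:
    //
    //     case 99: problem = new NewProblem(); break;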

As mentioned before, the four Java implementations follow the same basic design. In practice, this means that all implementations share some common core Java classes. Here is a brief description of those classes:

Population: Contains the array of individuals that constitutes the population. It provides all the functionality necessary for operating on that set of individuals. It is also responsible for computing its own statistics, such as average fitness and best fitness, which are part of the information necessary to analyze the behaviour of the EAs.

RandomPopulation: Subclass of the class Population. The constructor of this class is responsible for generating the initial random population.

SelectedSet: Subclass of the class Population. For some algorithms, depending on the selection operator, the size of this set is different from the population size.

Individual: Contains the string of zeros and ones that represents an individual. It provides all the functionality necessary for operating on that string. In particular, it is responsible for computing its own fitness, according to the problem at hand.

Stopper: Contains all the stop criteria used by the EAs. These criteria are integrated in a single function called criteria(...), whose result is in turn returned by the nextGeneration() function in the file EASolver.java. The criteria options can be changed in the file EAParameters.txt.
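
To make the relationships among these classes concrete, here is a compact, self-contained sketch of a generational loop built from stand-ins with the same names and responsibilities. Only the class names and duties described above come from the report; every field, constructor, and method signature below is our own assumption, not the actual API of the distributed code.

    import java.util.Random;

    // Hypothetical sketch of the described architecture; compiles and runs
    // on its own, but does not reproduce the real framework's signatures.
    public class EASketch {
        static final Random RNG = new Random();

        // Individual: owns a string of zeros and ones and its own fitness.
        static class Individual {
            final int[] genes;
            Individual(int stringSize) {
                genes = new int[stringSize];
                for (int i = 0; i < stringSize; i++) genes[i] = RNG.nextInt(2);
            }
            int getAllele(int i) { return genes[i]; }
            double computeFitness() {           // OneMax placeholder fitness
                double f = 0;
                for (int g : genes) f += g;
                return f;
            }
        }

        // Population: owns the array of individuals and its own statistics.
        static class Population {
            final Individual[] individuals;
            Population(Individual[] individuals) { this.individuals = individuals; }
            double bestFitness() {
                double best = Double.NEGATIVE_INFINITY;
                for (Individual ind : individuals) {
                    best = Math.max(best, ind.computeFitness());
                }
                return best;
            }
        }

        // RandomPopulation: its constructor generates the initial random population.
        static class RandomPopulation extends Population {
            RandomPopulation(int populationSize, int stringSize) {
                super(randomIndividuals(populationSize, stringSize));
            }
            private static Individual[] randomIndividuals(int n, int stringSize) {
                Individual[] inds = new Individual[n];
                for (int i = 0; i < n; i++) inds[i] = new Individual(stringSize);
                return inds;
            }
        }

        // Stopper: integrates all stop criteria behind a single call.
        static class Stopper {
            boolean criteria(Population pop, int generation) {
                return generation >= 100 || pop.bestFitness() >= 90;  // stand-ins
            }
        }

        public static void main(String[] args) {
            Population population = new RandomPopulation(200, 90);
            Stopper stopper = new Stopper();
            int generation = 0;
            // In the real framework, selection would fill a SelectedSet and
            // variation (crossover, or model building plus sampling) would
            // produce the next generation; re-sampling a random population
            // merely keeps this sketch short.
            while (!stopper.criteria(population, generation)) {
                population = new RandomPopulation(200, 90);
                generation++;
            }
            System.out.println("Best fitness found: " + population.bestFitness());
        }
    }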

Acknowledgements

All of the developed source code is one of the by-products of the activities performed by the PhD fellow José C. Pereira within the doctoral program entitled Search and Optimization with Adaptive Selection of Evolutionary Algorithms. The work program is supported by the Portuguese Foundation for Science and Technology with the doctoral scholarship SFRH/BD/78382/2011 and with the research project PTDC/EEI-AUT/2844/2012.

References

Eiben, A. E. and Smith, J. E. (2003). Introduction to Evolutionary Computing. Springer-Verlag.

Gamma, E., Helm, R., Johnson, R., and Vlissides, J. (1995). Design Patterns: Elements of Reusable Object-Oriented Software. Addison-Wesley Longman Publishing Co., Inc., Boston, MA, USA.

Goldberg, D. E. (1989). Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley, Reading, MA.

Harik, G. R. (1999). Linkage learning via probabilistic modeling in the ECGA. IlliGAL Report No. 99010, Illinois Genetic Algorithms Laboratory, University of Illinois at Urbana-Champaign, Urbana, IL.

Holland, J. H. (1975). Adaptation in Natural and Artificial Systems. University of Michigan Press, Ann Arbor, MI.

Mühlenbein, H. and Paaß, G. (1996). From recombination of genes to the estimation of distributions I. Binary parameters. In Voigt, H.-M. et al., editors, Parallel Problem Solving from Nature, PPSN IV, pages 178-187, Berlin. Kluwer Academic Publishers.

Pelikan, M. (2005). Hierarchical Bayesian Optimization Algorithm: Toward a New Generation of Evolutionary Algorithms. Springer.

Pelikan, M. and Goldberg, D. (2006). Hierarchical Bayesian optimization algorithm. In Pelikan, M., Sastry, K., and Cantú-Paz, E., editors, Scalable Optimization via Probabilistic Modeling, volume 33 of Studies in Computational Intelligence, pages 63-90. Springer, Berlin Heidelberg.

Pelikan, M., Goldberg, D. E., and Cantú-Paz, E. (2000). Linkage problem, distribution estimation, and Bayesian networks. Evolutionary Computation, 8(3):311-341.