Hybrid Adaptive Evolutionary Algorithm Hyper Heuristic


Hybrid Adaptive Evolutionary Algorithm Hyper Heuristic Jonatan Gómez Universidad Nacional de Colombia Abstract. This paper presents a hyper-heuristic that is able to adapt two low-level parameters (depth of search rate and intensity of mutation) and the probability of applying a low-level heuristic to an evolving solution in order to improve the solution quality. Basically, two random subsets of heuristics are maintained: one subset of the full set of heuristics and another subset of the local heuristics. For the full set of heuristics, the random selection is performed in such a way that at least one local heuristic is included in the subset. In each iteration, one heuristic is selected from the subset of heuristics according to its associated probability (only the heuristics in the subset are considered, and their initial probability is the same for all of them). The same process is performed for selecting a local heuristic. Then, the selected heuristic and local heuristic are applied to the candidate solution. The generated solution is compared against the candidate solution. If the solution is improved, the candidate solution is replaced by the new one and both the heuristic and the local heuristic are rewarded (their probability of being selected is increased in a random fashion). If no improvement is made, both the heuristic and the local heuristic are punished (their probability of being selected is decreased in a random fashion), the low-level heuristic parameters are adjusted, and a soft replacement policy is applied. The low-level parameter depth of search rate (starting at 0.1) is increased by a random value in the range [0.0, 0.1] up to a maximum value of 0.5. When the depth of search rate reaches 0.5, the parameter is reset to a value in [0.0, 0.1]. The intensity of mutation is set as the inverse value of the depth of search rate.
The soft replacement policy includes a control variable that determines whether the range of depth of search rates has been tested or not. If so, a new subset of heuristics and a new subset of local heuristics are selected, and a new candidate solution is created by a randomly selected crossover heuristic (applied between the best reached solution and the generated solution) followed by a local heuristic. Otherwise, the candidate solution is maintained only if the new solution has lower performance. 1 Algorithm description The Hybrid Adaptive Evolutionary Hyper Heuristic (HaEaHH) proposed in this paper is based on the Hybrid Adaptive Evolutionary Algorithm (HaEa) proposed by Gomez in [1]. Since HaEa is an individual-based approach, we

use only three individuals in the population for evolving a solution: the current candidate solution (parent), the generated solution (child), and the best solution reached during the evolution (best); see Algorithm 1. Basically, the algorithm is divided into four main steps: setting the initial variable values 1 (line 1), setting the low-level heuristic parameters (lines 3 and 4), offspring generation (line 5), and the replacement strategy (line 6). Notice that the depth of search parameter is always set in opposition to the intensity of mutation parameter: when the depth of search is high, the intensity of mutation is low. 1.1 Initializing Variable Values The initialization process is shown in Algorithm 2. We set the memory size to 3 (line 1), since we use only three individuals in the population for evolving a solution: the current candidate solution (parent), the generated solution (child), and the best solution reached during the evolution (best). We initialize two of them and select the best one as the parent and initial best solution (lines 2-6). We obtain the full set of heuristics and the set of local heuristics (lines 7, 8) and set the number of in-use heuristics to 4 and the number of in-use local heuristics to at most 4 (lines 9, 10). We decided to keep just 4 heuristics (and at most 4 local heuristics) since the probability of being selected would not be useful if a large number of heuristics were in use (the probability tends to zero as the number of heuristics increases). The mechanism for selecting and changing the in-use heuristics and local heuristics (line 11) is shown in Algorithm 3. It is applied in the replacement strategy when required; see subsection 1.3 for more details. Finally, the depth of search rate parameter is set to 0.1, meaning that at first more exploration (mutation) is performed than local search.
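The initialization step above can be sketched as follows. This is a minimal Python sketch, not the paper's implementation: the `Toy`-style problem object with `random_solution()` and `fitness()` (lower is better) is a hypothetical stand-in for the problem-domain API.

```python
import random

def init(problem):
    """Sketch of HaEaHH initialization (Algorithm 2), assuming a
    hypothetical problem object exposing random_solution() and
    fitness(s), with lower fitness being better."""
    # Memory of three solutions: parent (p), child (c), best (b).
    s0, s1 = problem.random_solution(), problem.random_solution()
    # Select the better of the two initial solutions as the parent...
    parent = s0 if problem.fitness(s0) < problem.fitness(s1) else s1
    # ...and copy it as the initial best solution.
    best = list(parent)
    child = None
    alpha = 0.1  # depth of search rate: mostly exploration at first
    return parent, child, best, alpha
```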
In the heuristics reset process (Algorithm 3), the in-use heuristics are selected from the full set of heuristics by performing a random permutation of the set that places at least one local heuristic among the first N heuristics, and taking the first N heuristics of the permuted set (line 1). A similar process is performed for the local heuristics, selecting the first M local heuristics as the in-use ones (line 2). Two heuristic rate vectors are associated with the in-use heuristic and local heuristic sets, every in-use heuristic having the same probability of being selected (lines 3, 4). 1.2 Offspring Generation The offspring generation process is shown in Algorithm 4. In each iteration, one heuristic is selected from the subset of in-use heuristics according to its associated probability (line 1) 2. The same process is performed for selecting a local heuristic (line 2). Then, the selected heuristic and local heuristic are applied (in that order) to the candidate solution (lines 3-8). Notice that when the heuristic is a crossover heuristic, it is applied between the candidate solution and the best solution (line 4). 1 See Table 1 for a reference of the set of variables used by HaEaHH. 2 Only the heuristics in the subset are considered.
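The subset selection and the probability-proportional (roulette-wheel) choice described above can be sketched as follows. This is a hedged Python sketch: heuristics are modelled as plain list items, which is an assumption; the uniform initial rates and the guarantee of one local heuristic among the first N follow Algorithm 3.

```python
import random

def reset_heuristics(heuristics, local_heuristics, N=4):
    """Pick N in-use heuristics (at least one local) and up to N in-use
    local heuristics, each with a uniform initial selection rate."""
    M = min(N, len(local_heuristics))
    pool = list(heuristics)
    random.shuffle(pool)
    # Guarantee at least one local heuristic among the first N.
    if not any(h in local_heuristics for h in pool[:N]):
        pool[N - 1] = random.choice(local_heuristics)
    in_use = pool[:N]
    in_use_local = random.sample(local_heuristics, M)
    r_h = [1.0 / N] * N   # uniform initial selection rates
    r_l = [1.0 / M] * M
    return in_use, r_h, in_use_local, r_l

def roulette(rates):
    """Roulette-wheel selection: return an index with probability
    proportional to its rate."""
    x, acc = random.random() * sum(rates), 0.0
    for i, r in enumerate(rates):
        acc += r
        if x <= acc:
            return i
    return len(rates) - 1
```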

Algorithm 1 Hybrid Adaptive Evolutionary Algorithm Hyper Heuristic
HaEaHH( P )
1. init(P)
2. while( !hasTimeExpired() ) do
3.   setDepthOfSearch(P, α)
4.   setIntensityOfMutation(P, β − α)
5.   offspring(P)
6.   replacement(P, h, l)
7. end

Algorithm 2 Initializing Variable Values
init( P )
1. setMemorySize(P, 3)
2. initialiseSolution(P, 0)
3. initialiseSolution(P, 1)
4. p = indexOfBest(0, 1)
5. c = indexOfWorst(0, 1)
6. copySolution(P, p, b)
7. heu = getHeuristics(P)
8. loc = getLocalHeuristics(P)
9. N = 4
10. M = min{N, size(loc)}
11. reset_heuristics()
12. α = 0.1

Algorithm 3 Resetting the in-use heuristics
reset_heuristics()
1. permutation'(heu)   (permutation placing at least one local heuristic among the first N)
2. permutation(loc)
3. r_h = [1/N, ..., 1/N]
4. r_l = [1/M, ..., 1/M]

Algorithm 4 Offspring Generation
offspring( P )
1. h = roulette(r_h)
2. l = roulette(r_l)
3. if heu[h] is xover
4.   applyHeuristic(P, heu[h], p, b, c)
5. else
6.   applyHeuristic(P, heu[h], p, c)
7. end
8. f_c = applyHeuristic(P, loc[l], c, c)

1.3 Replacement Strategy The generated solution is compared against the candidate solution (line 1, Algorithm 5). If the solution is improved, the candidate solution is replaced by the new one and both the heuristic and the local heuristic are rewarded (lines 2-6). If no improvement is made, both the heuristic and the local heuristic are punished (lines 8, 9), the low-level heuristic parameters are adjusted (line 10), and a soft replacement policy is applied (lines 11-29). The reward/punish scheme increases or decreases, in a random fashion, the (local) heuristic's probability of being selected; see Algorithm 6. The low-level parameter depth of search rate (starting at 0.1) is increased by a random value in the range [0.0, 0.1] up to a maximum value of 0.5. When the depth of search rate reaches 0.5, the parameter is reset to a value in [0.0, 0.1]; see Algorithm 6. The soft replacement policy starts by trying to improve the best solution by applying a local heuristic (lines 11-13), and uses a control variable to determine whether the range of depth of search rates has been tested or not (lines 14-29). If so, a new subset of heuristics and a new subset of local heuristics are selected (line 27) and a new candidate solution is created by a randomly selected crossover heuristic (applied between the best reached solution and the generated solution) followed by a local heuristic (lines 17-22). Otherwise, the candidate solution is maintained only if the new solution has lower performance (line 29).
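The reward/punish scheme and the depth of search rate update can be sketched as follows. A minimal Python sketch under stated assumptions: δ is drawn uniformly from [0, 0.1], matching the "random fashion" described in the text, and the limit β = 0.5 and wrap-around reset follow the values stated above; the function names mirror Algorithm 6 but the concrete representation is an assumption.

```python
import random

DELTA_MAX = 0.1   # random increments are drawn from [0, 0.1]
BETA = 0.5        # depth of search rate limit

def reward(rates, i):
    """Randomly increase rate i, then renormalize so rates sum to 1."""
    rates[i] *= 1.0 + random.random() * DELTA_MAX
    total = sum(rates)
    return [r / total for r in rates]

def punish(rates, i):
    """Randomly decrease rate i, then renormalize so rates sum to 1."""
    rates[i] *= 1.0 - random.random() * DELTA_MAX
    total = sum(rates)
    return [r / total for r in rates]

def update_alpha(alpha, gamma):
    """Increase the depth of search rate by a random value in [0, 0.1];
    once it exceeds BETA = 0.5, wrap it back into (0, 0.1]. The control
    variable gamma accumulates the tested range."""
    delta = random.random() * DELTA_MAX
    alpha += delta
    gamma += delta
    if alpha > BETA:
        alpha -= BETA
    return alpha, gamma
```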
Variable       Description
c, p, b        Child, parent, and best (up to now) solution indices
f_c, f_p, f_b  Fitness of the child, parent, and best solutions
N              Number of in-use heuristics
M              Number of in-use local heuristics
r_h            In-use heuristic selection rates
r_l            In-use local heuristic selection rates
P              Problem instance
h              Heuristic index
l              Local heuristic index
α              Depth of search rate
β              Depth of search rate limit
γ              Depth of search rate control variable
loc            Array of local heuristics
heu            Array of heuristics
Table 1. HaEaHH Variables

References

1. J. Gomez. Self adaptation of operator rates in evolutionary algorithms. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2004), June 2004.

Algorithm 5 Replacement Strategy
replacement( P, h, l )
1. if f_c < f_p
2.   reward(r_h, h)
3.   reward(r_l, l)
4.   if f_c < f_b copySolution(P, c, b)
5.     γ = 0
6.   copySolution(P, c, p)
7. else
8.   punish(r_h, h)
9.   punish(r_l, l)
10.  update_alpha()
11.  setDepthOfSearch(P, 0.2)
12.  setIntensityOfMutation(P, 0.2)
13.  f_b = applyHeuristic(P, loc[last], b, b)
14.  if γ ≥ β
15.    γ = 0
16.    if f_p ≥ f_b
17.      if heu[last] is xover
18.        f_p = applyHeuristic(P, heu[last], c, b, p)
19.      else
20.        f_p = applyHeuristic(P, heu[last], b, p)
21.      end
22.      f_p = applyHeuristic(P, loc[last], p, p)
23.      if f_p < f_b copySolution(P, p, b)
24.    else
25.      copySolution(P, c, p)
26.    end
27.    reset_heuristics()
28.  else
29.    if f_p = f_c copySolution(P, c, p)
30. end

Algorithm 6 Reward/punish and update of the depth of search rate

reward( r, i )
1. r[i] = (1.0 + δ) r[i]
2. normalize(r)

punish( r, i )
1. r[i] = (1.0 − δ) r[i]
2. normalize(r)

update_alpha()
1. δ = 0.1 · random()
2. α = α + δ
3. γ = γ + δ
4. if α > β then α = α − β