CHAPTER 3.4 AND 3.5. Sara Gestrelius


3.4 OTHER EVOLUTIONARY ALGORITHMS Estimation of Distribution algorithms Differential Evolution Coevolutionary algorithms Cultural algorithms

LAST TIME: EVOLUTIONARY ALGORITHMS [Diagram: EA cycle — initialise population → selection of parents → reproduction (recombination, mutation) → offspring evaluated with the objective function → replacement → repeat until stopping criteria.]

ESTIMATION OF DISTRIBUTION ALGORITHMS Main idea Variable interaction Example: PBIL

ESTIMATION OF DISTRIBUTION ALGORITHMS MAIN IDEA The population is represented by a probability distribution that is used to generate new individuals. EDAs are not yet competitive compared to more traditional metaheuristics. Example: a population of binary arrays can be represented by an array of probabilities, where each entry is the probability that the corresponding position is a 1.
Population of individuals:
0 0 1 0 1 0 1
1 1 0 0 0 0 0
1 0 0 0 1 1 1
1 0 0 0 1 1 0
0 1 0 0 1 1 0
Probability distribution:
P = (0.6, 0.4, 0.2, 0.0, 0.8, 0.6, 0.4)

ESTIMATION OF DISTRIBUTION ALGORITHMS MAIN IDEA
Algorithm 3.4 Template of the EDA algorithm
t = 1;
Generate randomly a population of n individuals;
Initialise a probability model Q(x);
While termination criteria are not met Do
  Create a population of n individuals by sampling from Q(x);
  Evaluate the objective function for each individual;
  Select m individuals according to a selection method;
  Update the probabilistic model Q(x) using the selected individuals and their f() values;
  t = t + 1;
End While
Output: Best found solution or set of solutions.
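As a concrete illustration of Algorithm 3.4, here is a minimal sketch in Python for binary strings, assuming the simplest possible model Q: one independent probability per bit, re-estimated from the marginal frequencies of the m selected individuals (the name `umda` and the parameter values are illustrative, not from the text):

```python
import random

def umda(fitness, length, n=50, m=20, generations=100):
    """Univariate EDA sketch: the model Q is one independent probability
    per bit, re-estimated from the m selected individuals each generation."""
    q = [0.5] * length                      # initialise the probability model
    best = None
    for _ in range(generations):
        # Create a population of n individuals by sampling from Q
        pop = [[1 if random.random() < q[j] else 0 for j in range(length)]
               for _ in range(n)]
        # Evaluate the objective function and select the m best individuals
        pop.sort(key=fitness, reverse=True)
        selected = pop[:m]
        if best is None or fitness(selected[0]) > fitness(best):
            best = selected[0]
        # Update Q with the marginal frequencies of the selected individuals
        q = [sum(ind[j] for ind in selected) / m for j in range(length)]
    return best
```

For example, `umda(sum, length=20)` maximises the number of ones (OneMax), since `sum` counts the ones in a binary list.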

ESTIMATION OF DISTRIBUTION ALGORITHMS MAIN IDEA [Diagram: EDA cycle — initialise the probability model → generate offspring by sampling → evaluate with the objective function → select and update the model → repeat until stopping criteria.]

ESTIMATION OF DISTRIBUTION ALGORITHMS VARIABLE INTERACTION Sometimes the variables interact; this should then be captured by the probability model. 1. Univariate EDA: no interactions between the problem variables are taken into account. 2. Bivariate EDA: interactions between pairs of variables define the probability model. 3. Multivariate EDA: interactions between many variables define the probability model.

ESTIMATION OF DISTRIBUTION ALGORITHMS EXAMPLE: POPULATION-BASED INCREMENTAL LEARNING, PBIL
Algorithm 3.5 Template of the PBIL algorithm
Initial distribution D = (0.5, ..., 0.5);
Repeat
  Repeat /*Generate population P of size n*/
    Repeat /*For every entry i in D*/
      r = uniform[0,1];
      If r < D_i Then X_i = 1 Else X_i = 0;
    Until i = |D|
  Until n individuals have been generated
  Evaluate and sort the population P; /*Find the best offspring X_best*/
  Update the distribution D = (1 − α)D + αX_best;
Until Stopping criteria
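A direct Python sketch of Algorithm 3.5; the function name `pbil`, the OneMax-style use of `fitness`, and the default parameter values are illustrative assumptions, not from the text:

```python
import random

def pbil(fitness, length, n=30, alpha=0.1, generations=200):
    """PBIL sketch (Algorithm 3.5) for binary strings, maximising `fitness`."""
    d = [0.5] * length                  # initial distribution D = (0.5, ..., 0.5)
    best, best_f = None, float("-inf")
    for _ in range(generations):
        # Generate population P of size n by sampling each entry from D
        pop = [[1 if random.random() < d[i] else 0 for i in range(length)]
               for _ in range(n)]
        # Evaluate the population and find the best offspring X_best
        x_best = max(pop, key=fitness)
        if fitness(x_best) > best_f:
            best, best_f = x_best, fitness(x_best)
        # Update the distribution: D = (1 - alpha) * D + alpha * X_best
        d = [(1 - alpha) * d[i] + alpha * x_best[i] for i in range(length)]
    return best
```

Note the contrast with the generic EDA template: PBIL does not re-estimate the model from a selected subpopulation but nudges D toward the single best offspring with learning rate α.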

DIFFERENTIAL EVOLUTION Main idea Example: DE/rand/1/bin Repair strategies Tuning

DIFFERENTIAL EVOLUTION MAIN IDEA When creating a new offspring: use one main parent and a group of other parents. For each entry of the main parent vector, randomly choose whether to use the main parent's entry or a new entry generated from the other parents, e.g. r_3 + F·(r_1 − r_2) for other parents r_1, r_2, r_3. Very successful for continuous optimisation. All entries are bounded: l_j ≤ x_ij ≤ u_j.

DIFFERENTIAL EVOLUTION MAIN IDEA [Diagram: DE cycle — initialise population → selection of the main parent (e.g. go through all individuals in turn) and sampling of the other parents → recombination → offspring evaluated with the objective function → replacement → repeat until stopping criteria.]

DIFFERENTIAL EVOLUTION EXAMPLE: DE/RAND/1/BIN
Algorithm 3.8 Template of the DE/rand/1/bin algorithm
Input: Parameters F (scaling factor) and CR (crossover constant).
Initialise the population (uniform random distribution);
Repeat
  For (i = 1; i ≤ k; i++) Do /*Each individual*/
    /*Mutate and recombine*/
    j_rand = int(rand_i[0,1] × D) + 1;
    For (j = 1; j ≤ D; j++) Do
      If (rand_i[0,1] < CR) or (j = j_rand) Then u_ij = x_r3,j + F·(x_r1,j − x_r2,j)
      Else u_ij = x_ij
    End For
    /*Replace*/
    If f(u_i(t+1)) ≤ f(x_i(t)) Then x_i(t+1) = u_i(t+1) Else x_i(t+1) = x_i(t)
  End For
Until Stopping criteria
Output: Best population or solution found.
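The template can be sketched in Python roughly as follows, assuming minimisation over box constraints, with out-of-bounds entries repaired by clipping to the violated bound; the function name and default parameter values are illustrative:

```python
import random

def de_rand_1_bin(f, bounds, pop_size=20, F=0.8, CR=0.9, generations=200):
    """DE/rand/1/bin sketch (Algorithm 3.8), minimising f over box constraints."""
    D = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct parents r1, r2, r3, all different from i
            r1, r2, r3 = random.sample([k for k in range(pop_size) if k != i], 3)
            j_rand = random.randrange(D)
            # Mutate and recombine: u_j = x_r3,j + F*(x_r1,j - x_r2,j)
            u = [pop[r3][j] + F * (pop[r1][j] - pop[r2][j])
                 if (random.random() < CR or j == j_rand) else pop[i][j]
                 for j in range(D)]
            # Repair by clipping each entry to the violated bound
            u = [min(max(u[j], bounds[j][0]), bounds[j][1]) for j in range(D)]
            # Replace the target vector if the trial is at least as good
            if f(u) <= f(pop[i]):
                pop[i] = u
    return min(pop, key=f)
```

For example, `de_rand_1_bin(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 3)` minimises a 3-dimensional sphere function.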

DIFFERENTIAL EVOLUTION REPAIR STRATEGY Extreme strategies: 1. Set the variable to the violated bound. 2. Randomly reinitialise the value. Intermediate strategy: set the value midway between the old value and the violated bound.
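The three repair strategies can be written as small helper functions; this is a sketch with hypothetical names, for a single variable with bounds [lo, hi]:

```python
import random

def repair_bound(x, lo, hi):
    """Extreme strategy 1: set the variable to the violated bound."""
    return min(max(x, lo), hi)

def repair_random(x, lo, hi):
    """Extreme strategy 2: randomly reinitialise the value inside the bounds."""
    return x if lo <= x <= hi else random.uniform(lo, hi)

def repair_midway(x_old, x_new, lo, hi):
    """Intermediate strategy: midway between the old value and the violated bound."""
    if x_new < lo:
        return (x_old + lo) / 2
    if x_new > hi:
        return (x_old + hi) / 2
    return x_new
```

E.g. with bounds [0, 5], an offspring value of 9 created from an old value of 4 becomes 5 (bound), a random value in [0, 5] (reinitialise), or 4.5 (midway).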

DIFFERENTIAL EVOLUTION TUNING To get convergence: increase the number of parents and decrease F. To increase convergence speed (and decrease robustness): increase CR. DE is much more sensitive to the value of F than to the value of CR. From the book: NP = 10 × the number of decision variables, CR = 0.9, F = 0.8.

COEVOLUTIONARY ALGORITHMS Main idea The Rosenbrock function: competitive coevolution The Rosenbrock function: cooperative coevolution

COEVOLUTIONARY ALGORITHMS MAIN IDEA A cooperative or competitive strategy involving different populations. The fitness of an individual in a given population depends on the fitness of individuals in other populations.

COEVOLUTIONARY ALGORITHMS MAIN IDEA [Diagram: several populations evolve in parallel, each with its own population objective function, under a global objective function; each runs the usual initialise → selection → offspring → replacement cycle until stopping criteria, exchanging representatives with the other populations through competition or cooperation.]

COEVOLUTIONARY ALGORITHMS THE ROSENBROCK FUNCTION f(x) = Σ_{i=1}^{n−1} [100(x_{i+1} − x_i²)² + (1 − x_i)²], x ∈ R^n. n populations with the individuals x_i. Objective functions f_i(x_i; x_{i+1}). Interaction/communication graph G_comm defined by the relations between the subcomponents.
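Assuming the standard form of the Rosenbrock function, a sketch of the global objective and of one subcomponent's local objective f_i:

```python
def rosenbrock(x):
    """f(x) = sum_{i=1}^{n-1} [ 100*(x_{i+1} - x_i^2)^2 + (1 - x_i)^2 ]."""
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (1 - x[i]) ** 2
               for i in range(len(x) - 1))

def local_objective(x_i, x_next):
    """Local objective f_i(x_i; x_{i+1}) seen by the subcomponent owning x_i."""
    return 100 * (x_next - x_i ** 2) ** 2 + (1 - x_i) ** 2
```

The global minimum is f(1, ..., 1) = 0, and f(x) is exactly the sum of the local objectives, which is what makes the coordinate-wise decomposition into n interacting populations natural.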

COEVOLUTIONARY ALGORITHMS THE ROSENBROCK FUNCTION - COMPETITIVE f(x) = Σ_{i=1}^{n−1} [100(x_{i+1} − x_i²)² + (1 − x_i)²], x ∈ R^n. Nodes x_i interact with nodes x_{i+1}. The coevolving populations compete to minimize their local function.

COEVOLUTIONARY ALGORITHMS THE ROSENBROCK FUNCTION - COMPETITIVE [Diagram: species 1, species 2, …, species n, each with its own population, exchanging representatives with the neighbouring species.]

COEVOLUTIONARY ALGORITHMS THE ROSENBROCK FUNCTION - COOPERATIVE
f(x) = Σ_{i=1}^{n−1} [100(x_{i+1} − x_i²)² + (1 − x_i)²], x ∈ R^n. Interaction graph fully connected.
Algorithm:
1. Initialise by randomly connecting individuals of the populations, and find a best solution I_i^best for each population.
2. Repeat:
   1. For each population: combine each individual with the best individuals of all other populations. Find a new best solution in the active population.
   2. For each population: match the best individual with randomly selected individuals from the other populations.
   3. For each population: choose the better of the two solutions.
3. Construct the complete solution from the best individuals of all populations.
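A compact sketch of the cooperative scheme, assuming one population per variable, simple Gaussian mutation as the evolution step, and the best individual of each frozen population as its representative; all names and parameter values are illustrative, and the `rosenbrock` helper repeats the function above so the sketch is self-contained:

```python
import random

def rosenbrock(x):
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (1 - x[i]) ** 2
               for i in range(len(x) - 1))

def cooperative_coevolution(f, n_vars, pop_size=20, generations=300, sigma=0.2):
    """Cooperative coevolution sketch: one population per variable. An
    individual is evaluated by assembling it with the best (representative)
    individuals of all other, frozen, populations."""
    best = [random.uniform(-2.0, 2.0) for _ in range(n_vars)]  # representatives
    for _ in range(generations):
        for i in range(n_vars):
            def assembled(v):
                # Complete solution: candidate v plus the other representatives
                return f(best[:i] + [v] + best[i + 1:])
            # Evolve population i around its representative (Gaussian mutation)
            pop = [best[i] + random.gauss(0.0, sigma) for _ in range(pop_size)]
            cand = min(pop, key=assembled)
            if assembled(cand) < assembled(best[i]):
                best[i] = cand
    return best  # complete solution assembled from each population's best
```

Each population only ever sees a one-dimensional problem; the coupling between variables enters solely through the frozen representatives of the other populations.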

COEVOLUTIONARY ALGORITHMS THE ROSENBROCK FUNCTION - COOPERATIVE [Diagram: the active species' population is evaluated by assembling each individual with representatives of the other, frozen, populations into a complete solution for fitness evaluation.]

CULTURAL ALGORITHMS Main idea

CULTURAL ALGORITHMS MAIN IDEA Two main elements: 1. A population space at the micro-evolutionary level. 2. A belief space at the macro-evolutionary level. Good when solutions require extensive domain knowledge.

CULTURAL ALGORITHMS MAIN IDEA [Diagram: the belief space holds knowledge (normative, domain-specific, situational, temporal, spatial), adjusted with symbolic reasoning models: logic- and rule-based reasoning models, schemata, graphical models, semantic networks, version spaces. The population promotes (votes) knowledge into the belief space, and the belief space influences the evolution of the population (e.g. an evolutionary algorithm).]

CULTURAL ALGORITHMS MAIN IDEA
Algorithm 3.9 Template of the cultural algorithm
Initialise the population Pop(0);
Initialise the belief BLF(0);
t = 0;
Repeat
  Evaluate the population Pop(t);
  Adjust(BLF(t), Accept(Pop(t)));
  Evolve(Pop(t+1), Influence(BLF(t)));
  t = t + 1;
Until Stopping criteria
Output: Best found solution or set of solutions.
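A minimal sketch of the template, assuming a purely normative belief space (one promising interval per variable, tightened from the accepted individuals) that influences evolution by bounding where offspring are sampled; this is only one possible instantiation, and all names and parameters are illustrative:

```python
import random

def cultural_algorithm(f, bounds, n=30, n_accept=6, generations=100):
    """Cultural algorithm sketch: the belief space is normative knowledge only,
    i.e. one interval per variable, adjusted from the accepted (best)
    individuals and used to influence where new offspring are generated."""
    D = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    belief = [list(b) for b in bounds]          # Initialise the belief BLF(0)
    for _ in range(generations):
        pop.sort(key=f)                         # Evaluate the population
        # Adjust(BLF, Accept(Pop)): the best individuals tighten the intervals
        accepted = pop[:n_accept]
        for j in range(D):
            vals = [ind[j] for ind in accepted]
            belief[j] = [min(vals), max(vals)]
        # Evolve(Pop, Influence(BLF)): sample offspring inside the belief intervals
        offspring = [[random.uniform(belief[j][0], belief[j][1])
                      for j in range(D)] for _ in range(n)]
        pop = sorted(pop + offspring, key=f)[:n]
    return pop[0]
```

The belief space here plays the role the slide describes: knowledge extracted from the population (the range spanned by the best solutions) is fed back to steer where evolution searches next.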

3.5 SCATTER SEARCH Scatter search Path relinking

SCATTER SEARCH Main idea Components

SCATTER SEARCH MAIN IDEA Uses a comparatively large set of parents (a.k.a. the reference set), and several parents can be combined to generate each new solution. F. Glover. Heuristics for integer programming using surrogate constraints. Decision Sciences, 8:156-166, 1977.

SCATTER SEARCH MAIN IDEA [Diagram: scatter search flow — create an initial population (quality, diversity); improve the initial population (improvement method); generate the reference set; the subset generation method produces subsets of the reference set; the solution combination method turns subsets into generated solutions; the improvement method improves them; the reference set update method rebuilds the reference set from the improved solutions.]

SCATTER SEARCH COMPONENTS Create initial population: greedy procedures are applied to diversify the search while selecting high-quality solutions. Improve the initial population: any S-metaheuristic, normally local search.

SCATTER SEARCH COMPONENTS Generate reference set: quality and diversity. Select RefSet1 with the best objective function values and RefSet2 with the best diversity.

SCATTER SEARCH COMPONENTS Subset generation method: usually selects all subsets of a fixed size r (often r = 2). It is a deterministic operator.

SCATTER SEARCH COMPONENTS Solution combination method: in the example it is a crossover method, but it can be pretty much anything that takes a subset of solutions and turns it into a new solution.

SCATTER SEARCH COMPONENTS Improvement method: in the example it is a local search method.

SCATTER SEARCH COMPONENTS Reference set update method: in the example, use the same method as in Generate reference set, applied to the solutions in the reference set and the improved generated solutions.
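Putting the components together, a sketch of a scatter search for continuous minimisation; the concrete choices (Gaussian hill-climbing as the improvement method, all subsets of size r = 2, midpoint and extrapolation as the combination method) are illustrative assumptions, and the bounds are used only to create the initial population:

```python
import itertools
import random

def scatter_search(f, bounds, pop_size=30, ref_size=6, iterations=50):
    """Scatter search sketch for minimising f over R^len(bounds)."""
    def improve(x):
        # Improvement method: crude Gaussian hill-climbing (an S-metaheuristic)
        for _ in range(20):
            y = [v + random.gauss(0.0, 0.1) for v in x]
            if f(y) < f(x):
                x = y
        return x

    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))

    # Create and improve a diversified initial population
    pop = [improve([random.uniform(lo, hi) for lo, hi in bounds])
           for _ in range(pop_size)]
    for _ in range(iterations):
        # Generate reference set: RefSet1 by quality, RefSet2 by diversity
        pop.sort(key=f)
        ref = pop[:ref_size // 2]
        rest = sorted(pop[ref_size // 2:],
                      key=lambda x: -min(dist(x, r) for r in ref))
        ref += rest[:ref_size - len(ref)]
        # Subset generation method: all subsets of size r = 2 (deterministic)
        new = []
        for a, b in itertools.combinations(ref, 2):
            # Solution combination method: midpoint and a point beyond b
            for lam in (0.5, 1.5):
                cand = [u + lam * (v - u) for u, v in zip(a, b)]
                new.append(improve(cand))
        # Reference set update: next population from ref + improved solutions
        pop = ref + new
    return min(pop, key=f)
```

Note how each named component from the slides maps to one commented step; swapping any of them (e.g. a real local search as the improvement method) leaves the skeleton unchanged.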

PATH RELINKING Main idea Components Directions Example: Binary search space

PATH RELINKING MAIN IDEA Find a path between two elite solutions s and t, and return the best solution on this path. The solutions on the path will generally share attributes with s and t.

PATH RELINKING MAIN IDEA
Algorithm 3.11 Template of the basic PR algorithm
Input: Starting solution s and target solution t.
x = s;
While dist(x, t) ≠ 0 Do
  Find the best move m which decreases dist(x + m, t);
  x = x + m; /*Apply the move m to the solution x*/
Output: Best solution found in the trajectory between s and t.

PATH RELINKING COMPONENTS Choosing the next move m (i.e. the next solution on the path): the point with minimum distance to t, the point with the best (or worst) objective function value, or a point chosen using the history of the search (tabu search).

PATH RELINKING COMPONENTS Intermediate operations: do something at each step of the path construction. E.g. after applying a move, if x is a valid solution, replace it with a local optimum found by an S-metaheuristic (If x is a valid solution Then x = local optimum).

PATH RELINKING DIRECTION How to choose which solution should be s and which t? Forward: from the worst solution to the best solution. Backward: from the best solution to the worst solution. Back-and-forward relinking: both directions constructed in parallel (might not be worth the effort). Mixed: choose an intermediate point as t and both ends as s; compute both paths in parallel.

PATH RELINKING EXAMPLE: BINARY SEARCH SPACE Move: Flip operator. Distance: Hamming distance.
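With these choices, Algorithm 3.11 can be sketched for binary strings as follows, assuming a maximisation objective `f`; each move flips one position where x still differs from t, so the Hamming distance to t decreases by exactly 1 per step:

```python
def hamming(a, b):
    """Hamming distance between two equal-length binary lists."""
    return sum(x != y for x, y in zip(a, b))

def path_relinking(f, s, t):
    """Basic PR sketch (Algorithm 3.11) on binary strings, maximising f."""
    x = list(s)
    best, best_f = list(x), f(x)
    while hamming(x, t) > 0:
        # Candidate moves: flip any position where x and t still differ;
        # each such move decreases dist(x, t) by exactly 1
        candidates = []
        for j in range(len(x)):
            if x[j] != t[j]:
                y = list(x)
                y[j] = t[j]
                candidates.append(y)
        x = max(candidates, key=f)     # take the best move
        if f(x) > best_f:              # track the best solution on the path
            best, best_f = list(x), f(x)
    return best
```

For instance, with `f = sum` (OneMax) the best solution on the path is t itself, while an objective that peaks in the middle of the path makes PR return an intermediate solution that shares attributes with both s and t.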

WWW.SICS.SE