Preliminary Background: Tabu Search, Genetic Algorithm


Preliminary Background: Tabu Search, Genetic Algorithm. Faculty of Information Technology, University of Science, Vietnam National University of Ho Chi Minh City. March 2010

Problem used to illustrate
General problem: min f(x), x ∈ X
Assignment type problem (assignment of resources j to activities i):
min f(x)
subject to ∑_{1 ≤ j ≤ m} x_ij = 1, 1 ≤ i ≤ n
x_ij = 0 or 1, 1 ≤ i ≤ n, 1 ≤ j ≤ m

Neighborhood (Local) Search Techniques (NST)
A Neighborhood (Local) Search Technique (NST) is an iterative procedure starting with an initial feasible solution x0. At each iteration:
- we move from the current solution x ∈ X to a new one x' ∈ X in its neighborhood N(x)
- x' becomes the current solution for the next iteration
- we update the best solution x* found so far
The procedure continues until some stopping criterion is satisfied.

Neighborhood
Neighborhood N(x): the neighborhood N(x) varies with the problem, but its elements are always generated by slightly modifying x. If we denote by M the set of modifications (or moves) used to generate neighboring solutions, then N(x) = { x' : x' = x ⊕ m, m ∈ M }.

Neighborhood for the assignment type problem
For the assignment type problem, let x be as follows: for each 1 ≤ i ≤ n, x_{i j(i)} = 1 and x_ij = 0 for all other j.
The elements of the neighborhood N(x) = { x' : x' = x ⊕ m, m ∈ M } are generated by slightly modifying x: each solution x' ∈ N(x) is obtained by selecting an activity i and changing its resource from j(i) to some other p (i.e., the modification can be denoted m = [i, p]):
x'_{i j(i)} = 0, x'_{i p} = 1
x'_ij = x_ij for all other i, j
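As an illustrative sketch (not from the slides), this move structure m = [i, p] can be coded in Python using the compact representation z[i] = j(i), where z[i] is the resource currently assigned to activity i; the function names are ours:

```python
def neighborhood(z, m):
    """Generate all neighbors of assignment z, where z[i] is the resource
    assigned to activity i and resources are numbered 0..m-1.
    Each move (i, p) reassigns activity i from z[i] to a different resource p."""
    for i in range(len(z)):
        for p in range(m):
            if p != z[i]:
                neighbor = list(z)
                neighbor[i] = p
                yield (i, p), neighbor

# Example: 3 activities, 2 resources -> each activity has one alternative resource
moves = list(neighborhood([0, 1, 0], 2))
assert len(moves) == 3
assert moves[0] == ((0, 1), [1, 1, 0])
```

With n activities and m resources, every solution has exactly n(m-1) neighbors, which matches the move set M above.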

Descent Method (D) At each iteration, a best solution x' ∈ N(x) is selected as the current solution for the next iteration. Stopping criterion: f(x') ≥ f(x), i.e., the current solution cannot be improved and a first local minimum is reached.
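A minimal Python sketch of the descent method, assuming a generic objective f and a neighborhood function (names are illustrative, not from the slides):

```python
def descent(f, z0, neighborhood):
    """Descent method: repeatedly move to a best neighbor;
    stop at the first local minimum, i.e., when f(x') >= f(x)."""
    z = z0
    while True:
        best = min(neighborhood(z), key=f, default=None)
        if best is None or f(best) >= f(z):
            return z  # no improving neighbor: local minimum reached
        z = best

# Toy example: minimize (x - 3)^2 over the integers, neighbors are x-1 and x+1
assert descent(lambda x: (x - 3) ** 2, 0, lambda x: [x - 1, x + 1]) == 3
```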

Tabu Search
Tabu Search is an iterative neighborhood (local) search technique. At each iteration we move from a current solution x to a new solution x' in a neighborhood of x, denoted N(x), until we reach some solution x* acceptable according to some criterion.

Selecting x'


Tabu Search (TS)
Initialize: Select an initial solution x0 ∈ X. Let x := x0 and stop := false.
While not stop:
    Determine a subset NC(x) ⊆ N(x)
    Determine x' ∈ NC(x) such that x' := argmin_{z ∈ NC(x)} f(z)
    x := x'
At each iteration, a best solution x' ∈ NC(x) is selected; x' becomes the current solution for the next iteration.

Tabu Search (TS)
Initialize: Select an initial solution x0 ∈ X. Let x := x0 and stop := false.
While not stop:
    Determine a subset NC(x) ⊆ N(x)
    Determine x' ∈ NC(x) such that x' := argmin_{z ∈ NC(x)} f(z)
    x := x'
As long as x' is better than x, the procedure behaves like the descent method. Otherwise, moving to x' as the next current solution yields no improvement, or even a deterioration of the objective function, but it allows the search to move out of a local minimum.

Tabu Search (TS)
Initialize: Select an initial solution x0 ∈ X. Let TL_k := ∅, k = 1, 2, …, p. Let x := x0 and stop := false.
While not stop:
    Determine a subset NC(x) ⊆ N(x) of solutions z = x ⊕ m such that t_k(m) is not in TL_k, k = 1, 2, …, p
    Determine x' ∈ NC(x) such that x' := argmin_{z ∈ NC(x)} f(z)
    x := x'
    Update the Tabu lists TL_k, k = 1, 2, …, p
To prevent cycling, recently visited solutions are eliminated from NC(x) using Tabu lists.

Tabu Lists (TL)
Short term Tabu lists TL_k are used to remember attributes or characteristics of the modification used to generate the new current solution.
A Tabu list often used for the assignment type problem is the following: if the new current solution x' is generated from x by changing the resource of i from j(i) to p, then the pair (i, j(i)) is introduced in the Tabu list TL. If the pair (i, j) is in TL, then any solution where resource j would be assigned to activity i is declared Tabu.
The Tabu lists are cyclic, so that an attribute remains Tabu for a fixed number n_k of iterations.
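The cyclic behavior of a short term Tabu list can be sketched as a fixed-length FIFO; in Python, `collections.deque` with `maxlen` gives exactly this: once the tenure n_k is exceeded, the oldest attribute is pushed out by newer entries. The attribute pairs (i, j(i)) below are made-up examples:

```python
from collections import deque

# A cyclic Tabu list: an attribute stays Tabu for at most n_k iterations,
# after which it is pushed out by newer entries (fixed-length FIFO).
n_k = 3
tabu = deque(maxlen=n_k)

for attribute in [(0, 2), (1, 0), (2, 1), (0, 1)]:  # pairs (i, j(i))
    tabu.append(attribute)

# (0, 2) was inserted 4 moves ago, beyond the Tabu tenure of 3:
assert (0, 2) not in tabu
assert (1, 0) in tabu
```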

Tabu Search (TS)
Initialize: Select an initial solution x0 ∈ X. Let TL_k := ∅, k = 1, 2, …, p. Let x* := x := x0 and stop := false.
While not stop:
    Determine a subset NC(x) ⊆ N(x) of solutions z = x ⊕ m such that t_k(m) is not in TL_k, k = 1, 2, …, p, or f(z) < f(x*)
    Determine x' ∈ NC(x) such that x' := argmin_{z ∈ NC(x)} f(z)
    x := x'
    Update the Tabu lists TL_k, k = 1, 2, …, p
Since Tabu lists are specified in terms of attributes of the modifications used, we require an Aspiration criterion to bypass the Tabu status of good solutions that are declared Tabu without having been visited recently: z may be included in NC(x) even if z is Tabu, whenever f(z) < f(x*), where x* is the best solution found so far.

Tabu Search (TS)
Initialize: Select an initial solution x0 ∈ X. Let TL_k := ∅, k = 1, 2, …, p. Let x* := x := x0 and stop := false.
While not stop:
    Determine a subset NC(x) ⊆ N(x) of solutions z = x ⊕ m such that t_k(m) is not in TL_k, k = 1, 2, …, p, or f(z) < f(x*)
    Determine x' ∈ NC(x) such that x' := argmin_{z ∈ NC(x)} f(z)
    x := x'
    If f(x) < f(x*) then x* := x (update x*, the best solution found so far)
    Update the Tabu lists TL_k, k = 1, 2, …, p

Tabu Search (TS)
Initialize: Select an initial solution x0 ∈ X. Let TL_k := ∅, k = 1, 2, …, p. Let x* := x := x0 and stop := false.
While not stop:
    Determine a subset NC(x) ⊆ N(x) of solutions z = x ⊕ m such that t_k(m) is not in TL_k, k = 1, 2, …, p, or f(z) < f(x*)
    Determine x' ∈ NC(x) such that x' := argmin_{z ∈ NC(x)} f(z)
    x := x'
    If f(x) < f(x*) then x* := x
    Update the Tabu lists TL_k, k = 1, 2, …, p
Note that the objective function values are no longer monotone, so an explicit stopping criterion is required.

Tabu Search (TS)
Initialize: Select an initial solution x0 ∈ X. Let TL_k := ∅, k = 1, 2, …, p. Let iter := niter := 0. Let x* := x := x0 and stop := false.
While not stop:
    iter := iter + 1; niter := niter + 1
    Determine a subset NC(x) ⊆ N(x) of solutions z = x ⊕ m such that t_k(m) is not in TL_k, k = 1, 2, …, p, or f(z) < f(x*)
    Determine x' ∈ NC(x) such that x' := argmin_{z ∈ NC(x)} f(z)
    x := x'
    If f(x) < f(x*) then x* := x and niter := 0
    If iter = itermax or niter = nitermax then stop := true
    Update the Tabu lists TL_k, k = 1, 2, …, p
x* is the best solution generated. Stopping criteria: a maximum number of iterations (itermax), or a maximum number of successive iterations where f(x*) does not improve (nitermax).
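Putting the pieces together, a hedged Python sketch of the procedure above (a single Tabu list storing the moves themselves as attributes; the parameter names itermax/nitermax follow the slides, everything else is our choice):

```python
def tabu_search(f, z0, moves, apply_move, tenure=7, itermax=200, nitermax=50):
    """Tabu Search sketch: a best non-Tabu neighbor is chosen each iteration;
    the aspiration criterion admits a Tabu move that improves on x*;
    stop after itermax iterations or nitermax without improving x*."""
    x = x_best = z0
    tabu = []  # recent move attributes; cyclic with fixed tenure
    it = niter = 0
    while it < itermax and niter < nitermax:
        it += 1
        niter += 1
        candidates = []
        for m in moves(x):
            z = apply_move(x, m)
            if m not in tabu or f(z) < f(x_best):  # aspiration criterion
                candidates.append((f(z), z, m))
        if not candidates:
            break  # every neighbor is Tabu and none beats x*
        fz, x, m = min(candidates, key=lambda t: t[0])
        tabu.append(m)
        if len(tabu) > tenure:
            tabu.pop(0)  # attribute stays Tabu for `tenure` iterations
        if f(x) < f(x_best):
            x_best = x
            niter = 0
    return x_best

# Toy example: minimize (x - 5)^2 over the integers with moves -1 and +1
best = tabu_search(lambda x: (x - 5) ** 2, 0, lambda x: [-1, 1], lambda x, m: x + m)
assert best == 5
```

Note that unlike the descent method, the search keeps moving after reaching x = 5 and relies on the Tabu list and stopping criteria to terminate.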

Improving Strategies
- Intensification
- Multistart diversification strategies: Random Diversification (RD), First Order Diversification (FOD)
- Variable Neighborhood Search (VNS)
- Exchange Procedure

Intensification
An intensification strategy is used to search a promising region more extensively.

Diversification
The diversification principle is complementary to intensification. Its objective is to search the feasible domain more extensively by leading the NST to unexplored regions of the feasible domain. Numerical experiments indicate that it is often better to apply a short NST several times using different initial solutions rather than a single long NST.

Genetic Algorithm (GA)
Population based algorithm. At each iteration (generation), three different operators are first applied to generate a set of new (offspring) solutions using the N solutions of the current population:
- selection operator: selects from the current population the parent-solutions that reproduce themselves
- crossover (reproduction) operator: produces offspring-solutions from each pair of parent-solutions
- mutation operator: modifies (improves) individual offspring-solutions
A fourth operator (culling operator) is then applied to determine a new population of size N by selecting among the solutions of the current population and the offspring-solutions according to some strategy.

Encoding the solution
The phenotype form of the solution x ∈ R^n is encoded (represented) as a genotype form vector z ∈ R^m (or chromosome), where m may be different from n. For example, in the assignment type problem, let x be the following solution: for each 1 ≤ i ≤ n, x_{i j(i)} = 1 and x_ij = 0 for all other j. Then x ∈ R^{n×m} can be encoded as z ∈ R^n where z_i = j(i), i = 1, 2, …, n, i.e., z_i is the index of the resource j(i) assigned to activity i.
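A small Python sketch of this encoding and decoding for the assignment type problem (function names are ours; resources are 0-indexed):

```python
# Phenotype: n x m 0/1 matrix x with x[i][j(i)] = 1.
# Genotype: vector z of length n with z[i] = j(i), the resource of activity i.

def encode(x):
    """Matrix form -> chromosome: z[i] is the index j with x[i][j] == 1."""
    return [row.index(1) for row in x]

def decode(z, m):
    """Chromosome -> matrix form with m resources."""
    return [[1 if j == zi else 0 for j in range(m)] for zi in z]

x = [[0, 1, 0],
     [1, 0, 0],
     [0, 0, 1]]
z = encode(x)
assert z == [1, 0, 2]
assert decode(z, 3) == x  # the encoding is invertible
```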

Selection operator
This operator is used to select an even number (2, or 4, …, or N) of parent-solutions. Each parent-solution is selected from the current population according to some strategy or selection operator. Note that the same solution can be selected more than once. The parent-solutions are then paired two by two to reproduce themselves.
Selection operators:
- Random selection operator
- Proportional (or roulette wheel) selection operator
- Tournament selection operator
- Diversity preserving selection operator
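A sketch of the proportional (roulette wheel) operator in Python, assuming a positive, higher-is-better fitness (the slides do not fix a fitness convention; for a minimization objective f one would first transform it into such a fitness):

```python
import random

def roulette_wheel(population, fitness, rng=random):
    """Proportional (roulette wheel) selection: each solution is picked with
    probability fitness / total fitness (fitness assumed positive)."""
    weights = [fitness(z) for z in population]
    total = sum(weights)
    r = rng.uniform(0, total)  # spin the wheel
    acc = 0.0
    for z, w in zip(population, weights):
        acc += w
        if r <= acc:
            return z
    return population[-1]  # guard against floating point rounding

random.seed(1)
picks = [roulette_wheel([0, 1], lambda z: [1.0, 9.0][z]) for _ in range(500)]
# solution 1 has 9x the fitness of solution 0, so it is picked ~90% of the time
assert picks.count(1) > 400
```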

Crossover (recombination) operators
The crossover operator is used to generate new solutions that include interesting components contained in different solutions of the current population. The objective is to guide the search toward promising regions of the feasible domain X while maintaining some level of diversity in the population. Pairs of parent-solutions are combined to generate offspring-solutions according to different crossover (recombination) operators.

One point crossover
The one point crossover generates two offspring-solutions from the two parent-solutions
z^1 = [z_1^1, z_2^1, …, z_m^1]
z^2 = [z_1^2, z_2^2, …, z_m^2]
as follows:
i) Select randomly a position (index) ρ, 0 ≤ ρ ≤ m.
ii) The offspring-solutions are then specified as follows:
oz^1 = [z_1^1, z_2^1, …, z_ρ^1, z_{ρ+1}^2, …, z_m^2]
oz^2 = [z_1^2, z_2^2, …, z_ρ^2, z_{ρ+1}^1, …, z_m^1]
Hence the first ρ components of offspring oz^1 (offspring oz^2) are the corresponding ones of parent 1 (parent 2), and the rest of the components are the corresponding ones of parent 2 (parent 1).
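A Python sketch of the one point crossover, where list slicing does the splice (function name and parameters are ours):

```python
import random

def one_point_crossover(p1, p2, rho=None, rng=random):
    """One point crossover: offspring 1 takes the first rho genes of parent 1
    and the rest from parent 2; offspring 2 is the mirror image."""
    m = len(p1)
    if rho is None:
        rho = rng.randint(0, m)  # position rho, 0 <= rho <= m as in the slides
    return p1[:rho] + p2[rho:], p2[:rho] + p1[rho:]

c1, c2 = one_point_crossover([1, 2, 3, 4], [5, 6, 7, 8], rho=2)
assert c1 == [1, 2, 7, 8]
assert c2 == [5, 6, 3, 4]
```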

Two points crossover
The two points crossover generates two offspring-solutions from the two parent-solutions
z^1 = [z_1^1, z_2^1, …, z_m^1]
z^2 = [z_1^2, z_2^2, …, z_m^2]
as follows:
i) Select randomly two positions (indices) µ, ν, 1 ≤ µ ≤ ν ≤ m.
ii) The offspring-solutions are then specified as follows:
oz^1 = [z_1^1, …, z_{µ-1}^1, z_µ^2, …, z_ν^2, z_{ν+1}^1, …, z_m^1]
oz^2 = [z_1^2, …, z_{µ-1}^2, z_µ^1, …, z_ν^1, z_{ν+1}^2, …, z_m^2]
Hence the offspring oz^1 (offspring oz^2) has components µ, µ+1, …, ν of parent 2 (parent 1), and the rest of the components are the corresponding ones of parent 1 (parent 2).
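The two points crossover admits the same kind of sketch, with the 1-based positions µ, ν of the slides converted to Python slice bounds (names are ours):

```python
def two_point_crossover(p1, p2, mu, nu):
    """Two points crossover: offspring 1 copies genes mu..nu (1-based,
    inclusive) from parent 2 and the rest from parent 1; offspring 2
    is the mirror image."""
    a, b = mu - 1, nu  # convert 1-based inclusive positions to 0-based slices
    return (p1[:a] + p2[a:b] + p1[b:],
            p2[:a] + p1[a:b] + p2[b:])

c1, c2 = two_point_crossover([1, 2, 3, 4, 5], [6, 7, 8, 9, 10], mu=2, nu=4)
assert c1 == [1, 7, 8, 9, 5]
assert c2 == [6, 2, 3, 4, 10]
```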

Uniform crossover
The uniform crossover requires a vector of bits (0 or 1) of dimension m to generate two offspring-solutions from the two parent-solutions z^1 = [z_1^1, z_2^1, …, z_m^1], z^2 = [z_1^2, z_2^2, …, z_m^2]:
i) Generate randomly a vector of bits, for example [0, 1, 1, 0, …, 1, 0]
ii) The offspring-solutions are then specified as follows:
parent 1:       [z_1^1, z_2^1, z_3^1, z_4^1, …, z_{m-1}^1, z_m^1]
parent 2:       [z_1^2, z_2^2, z_3^2, z_4^2, …, z_{m-1}^2, z_m^2]
vector of bits: [0, 1, 1, 0, …, 1, 0]
offspring oz^1: [z_1^1, z_2^2, z_3^2, z_4^1, …, z_{m-1}^2, z_m^1]
offspring oz^2: [z_1^2, z_2^1, z_3^1, z_4^2, …, z_{m-1}^1, z_m^2]
Hence the i-th component of oz^1 (oz^2) is the i-th component of parent 1 (parent 2) if the i-th component of the vector of bits is 0; otherwise, it is equal to the i-th component of parent 2 (parent 1).
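A Python sketch of the uniform crossover, where the bit vector decides, gene by gene, whether to keep or swap (names are ours):

```python
import random

def uniform_crossover(p1, p2, bits=None, rng=random):
    """Uniform crossover: bit 0 keeps the gene from the same parent,
    bit 1 swaps the genes between the two offspring."""
    if bits is None:
        bits = [rng.randint(0, 1) for _ in p1]
    c1 = [a if b == 0 else x for a, x, b in zip(p1, p2, bits)]
    c2 = [x if b == 0 else a for a, x, b in zip(p1, p2, bits)]
    return c1, c2

c1, c2 = uniform_crossover([1, 2, 3, 4], [5, 6, 7, 8], bits=[0, 1, 1, 0])
assert c1 == [1, 6, 7, 4]
assert c2 == [5, 2, 3, 8]
```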

Mutation operator
The mutation operator is an individual process used to modify offspring-solutions. In traditional variants of the Genetic Algorithm, the mutation operator modifies each component z_i arbitrarily with a small probability:
For i = 1 to m
    Generate a random number β ∈ [0, 1]
    If β < βmax then select randomly a new value for z_i
where βmax is small enough that z_i is modified only with a small probability.
The mutation operator simulates random events perturbing the natural evolution process. It is not essential, but the randomness it introduces promotes diversity in the current population and may prevent premature convergence to a bad local minimum.
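The mutation loop above can be sketched in Python for the chromosome encoding z[i] = j(i) of the assignment type problem (βmax becomes beta_max; the resource count m and function names are ours):

```python
import random

def mutate(z, m, beta_max=0.05, rng=random):
    """Mutation operator from the slides: each gene z[i] is replaced by a
    random resource in 0..m-1 with small probability beta_max."""
    out = list(z)
    for i in range(len(out)):
        if rng.random() < beta_max:  # beta < beta_max: mutate this gene
            out[i] = rng.randrange(m)
    return out

random.seed(0)
child = mutate([0, 1, 2, 1], m=3, beta_max=0.5)
assert len(child) == 4 and all(0 <= g < 3 for g in child)
assert mutate([0, 1, 2], m=3, beta_max=0.0) == [0, 1, 2]  # beta_max = 0: no change
```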