Algorithm Design (4) Metaheuristics

Algorithm Design (4) Metaheuristics. Takashi Chikayama, School of Engineering, The University of Tokyo

Formalization of Constraint Optimization. Minimize (or maximize) an objective function f(x_1, …, x_n) over value assignments ⟨x_1, …, x_n⟩ with x_k ∈ D_k that satisfy a condition C(x_1, …, x_n). An objective function to be minimized is also called a cost function. A value assignment ⟨x_1, …, x_n⟩ that satisfies the constraint but may not give the minimum (or maximum) of the objective function is called a feasible solution.

Algorithms for Combinatorial Optimization. Strict algorithms: strictly the best solution is found, i.e., no other feasible solution is better; this often requires large computational cost. Approximate algorithms: find a solution hopefully close to the best, i.e., not necessarily the true best; this often reduces the computational cost.

Iterative Improvement Methods. 1. Find an initial feasible solution, which satisfies the constraints but may be far from optimal. 2. Modify the solution slightly without violating the constraints, making the next feasible solution (a neighbor solution). 3. Repeat the process until some termination condition is reached. Small modifications are expected to lead to better feasible solutions.

Simple Iterative Improvement. In step 2 of the previous page, always choose the best among the neighbor solutions. Simple and efficient. Known under several names: Local Search, Greedy Search, Hill Climbing.

Local Search. 1. If there exists a better feasible solution in the neighborhood of the current solution, make it the current solution. 2. Repeat this until there are no better solutions in the neighborhood. Neighborhood: a set of feasible solutions that can be easily derived from the current solution; usually, only some of the variables comprising the solution are modified. A broad neighborhood means a high cost in each step. The neighborhood should be able to cover all the feasible solutions.
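
A minimal sketch of this greedy loop in Python, assuming hypothetical problem-specific functions neighbors(x) (yielding the neighborhood of x) and cost(x):

    def local_search(initial, neighbors, cost):
        """Greedy local search: move to the best neighbor until no neighbor improves."""
        current = initial
        while True:
            best = min(neighbors(current), key=cost, default=None)
            if best is None or cost(best) >= cost(current):
                return current  # no improving neighbor: a local optimum
            current = best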

Convergence to Local Optima. Local search may end in a locally optimal solution which is far from the global optimum. (Figure: cost over the solution space; from the initial solution the search descends to one of the local optima, not the global optimum.)

Repeated Local Search. Repeat local search from multiple randomly chosen initial solutions. (Figure: cost over the solution space; searches started from different initial solutions reach different local optima, and one of them may reach the global optimum.)

Repeated Local Search. Quite simple and efficient. High parallelism, like a parameter survey: parallel trials with different parameters, with no communication or synchronization except for data distribution and solution gathering. Effectiveness depends on the characteristics of the search space and the distribution of initial solutions: can initial solutions be placed close to the optimum? There is no reason to use more complicated metaheuristics if repeated local search is enough.
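
A sketch of the restart wrapper, assuming a hypothetical random_solution() that generates feasible initial solutions and reusing the local_search function sketched above:

    def repeated_local_search(random_solution, neighbors, cost, restarts=100):
        """Run local search from many random initial solutions; keep the best result."""
        best = None
        for _ in range(restarts):
            candidate = local_search(random_solution(), neighbors, cost)
            if best is None or cost(candidate) < cost(best):
                best = candidate
        return best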

Metaheuristics. Heuristics: methods likely to lead to good solutions, with no guarantee, however, of finding one; usually specific to problem areas. Metaheuristics: heuristics independent of problem areas; the same formulation can be applied widely. Search in a space with some appropriate neighborhood notion is considered here.

Simulated Annealing (SA). Local search always takes the best neighbor, which often leads to local optima; allowing slightly worse solutions may help. Annealing (metallurgy): heating and then slowly cooling increases crystal size and releases defects. Giving the system higher energy makes its state jump out of a locally lowest energy state, so that it can move toward lower energy states.

Annealing analogy: balls piled on a tray are flattened by shaking; the shaking temporarily raises their potential energy, letting them settle into a lower, flatter arrangement.

Notions in Algorithms and Physical Counterparts.
Algorithm ↔ Physics (metallurgy):
Cost ↔ Energy level
Feasible solutions ↔ Physical states
Optimum ↔ Ground state
Local search ↔ Quenching
Annealing ↔ Annealing (slow cooling)

Local Search vs. Simulated Annealing. (Figure: cost over the solution space; from the same initial solution, local search stops at a local optimum, while simulated annealing can climb out of it and reach the global optimum.)

Choice of Next Solution in SA. A randomly chosen neighbor solution is accepted if it is not too bad. 1. In the neighborhood of the current solution X, randomly choose one solution Y. 2. With a random number r in the range [0, 1] and some constant T (the temperature), Y is accepted if the cost improvement Δ = cost(X) − cost(Y) satisfies r ≤ e^(Δ/T). 3. If not, go back to step 1 and repeat.
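
A minimal sketch of this acceptance rule in Python (note that Δ ≥ 0, an improvement, makes e^(Δ/T) ≥ 1, so improvements are always accepted):

    import math
    import random

    def accept(delta, temperature):
        """Accept an improvement (delta >= 0) unconditionally; accept a worsening
        move with probability exp(delta / T), which shrinks as T decreases."""
        if delta >= 0:
            return True
        return random.random() <= math.exp(delta / temperature)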

Temperature and Acceptance Probability. (Plot: acceptance probability e^(Δ/T) as a function of the cost improvement Δ in [−10, 0], for temperatures T = 1 to 10; higher temperatures give higher acceptance probabilities for worsening moves.)

Temperature Scheduling. The temperature value is critical: a low temperature does not allow the solution to escape from local optima; a high temperature may make an already good solution much worse. Allow relatively large worsening in the beginning and gradually decrease the allowance. Temperature scheduling: start at a high temperature and cool down to a low temperature.

Difficulty of Temperature Scheduling. Cooling down too rapidly may leave the search trapped in a local optimum; cooling down slowly takes more computation time. With a large enough constant c, setting the temperature of the n-th step to T_n = c / log n guarantees convergence to the true optimum solution. Unfortunately, such convergence is too slow for practical use.

Conventional Scheduling. A frequently used scheme is to decrease the temperature by a constant ratio α (0 < α < 1): T ← αT. There is no good general scheme for deciding the value of α; usually, a constant close to 1 (0.999, for example) is used.
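
Putting the acceptance rule and geometric cooling together, a compact sketch of the whole loop; random_neighbor(x), the start and end temperatures, and α are hypothetical choices, not values from the slides:

    def simulated_annealing(initial, random_neighbor, cost,
                            t_start=10.0, t_end=0.01, alpha=0.999):
        """Simulated annealing with geometric cooling T <- alpha * T."""
        current = best = initial
        temperature = t_start
        while temperature > t_end:
            candidate = random_neighbor(current)
            delta = cost(current) - cost(candidate)  # positive means improvement
            if accept(delta, temperature):           # accept() as sketched above
                current = candidate
                if cost(current) < cost(best):
                    best = current
            temperature *= alpha
        return best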

Temperature-Parallel SA. Run annealing in parallel with different temperatures. Decent solutions found at high temperatures are swapped with not-so-good solutions at lower temperatures. No temperature scheduling is needed. (Figure: a chain of annealing processes from high temperature to low temperature, exchanging solutions.)

Tabu Search. If local search has ended up at a local optimum X, we may select some feasible solution in the neighborhood of X as the next candidate. Simply doing this will lead back to X, resulting in an infinite loop. Make recently visited candidates taboo: keep already visited candidates in a list and exclude them from the candidates. The list, however, may grow too long.

A Simple Taboo List may Become Too Large. Putting all feasible solutions around a local optimum into the list may be required to escape it. (Figure: cost over the space of feasible solutions, showing an initial solution, a broad local optimum, and the global optimum.)

More Efficient Taboo Condition Settings. diff(X, Y): the changes made to move from a feasible solution X to another one Y, e.g., which variables have different values. For a certain period after a move X → Y, diff(Y, X) is kept as a taboo change in the taboo list; such changes are much smaller than the solutions themselves. With a taboo period of L steps, the search will never have a loop shorter than 2L steps.
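
A sketch of this scheme in Python, assuming a hypothetical neighbors_with_moves(x) that yields (neighbor, move, reverse_move) triples, where the moves play the role of diff(X, Y) and diff(Y, X):

    from collections import deque

    def tabu_search(initial, neighbors_with_moves, cost, tenure=20, max_steps=1000):
        """Tabu search: forbid reversing recent moves for `tenure` steps."""
        current = best = initial
        tabu = deque(maxlen=tenure)        # recently forbidden (reverse) moves
        for _ in range(max_steps):
            candidates = [(y, rev) for y, move, rev in neighbors_with_moves(current)
                          if move not in tabu]
            if not candidates:
                break                      # every move in the neighborhood is taboo
            current, rev = min(candidates, key=lambda c: cost(c[0]))
            tabu.append(rev)               # forbid undoing this move for a while
            if cost(current) < cost(best):
                best = current
        return best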

Difficulty in Setting the Taboo Period. A period that is too short makes the search likely to loop around local optima. A period that is too long makes the cost of taboo checking larger and leaves fewer non-taboo moves within the neighborhood; in the worst case, all moves within the neighborhood may become taboo.

Design of the Neighborhood is Essential. Both simulated annealing and tabu search can escape from small local optima, but larger local optima are hard to escape from. It is essential to design neighborhoods so that a smooth sequence of neighborhood moves can lead to the global optimum. With a good neighborhood design, simple local search may also lead to a good solution.

Good Neighborhood Design

Different Choices of Neighborhoods for TSP: 2-opt, 3-opt, Or-opt.
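
As an illustration, a sketch of the 2-opt neighborhood for a tour stored as a Python list of city indices; each move reverses one contiguous segment, which removes two edges of the tour and reconnects it with two new ones (the list representation is our assumption):

    def two_opt_neighbors(tour):
        """Yield every tour reachable by one 2-opt move (reversal of a segment)."""
        n = len(tour)
        for i in range(1, n - 1):
            for j in range(i + 2, n + 1):
                yield tour[:i] + tour[i:j][::-1] + tour[j:]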

Multiple Searches in Parallel. Local search, annealing, and tabu search all try to gradually improve a single solution. Parallelization is possible by improving multiple feasible solutions in parallel, but then information about the solutions visited during the improvement process is not utilized. → Group optimization.

Particle Swarm Optimization Particles move around in a multi-dimensional space, as insects swarm around food Positions in the space have fitness values Particles can exchange information

Particle Swarm Optimization. Assumption: solutions within the neighborhood of a good solution are also likely to be good. A particle is given an acceleration that is a linear combination of the following: the direction of the best solution found so far by the particle itself; the direction of the best solution found in the neighborhood of the particle; the direction of the best solution ever found globally; and some randomness.
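
A sketch of one velocity update with NumPy, keeping only the personal-best and global-best terms for brevity (the neighborhood-best term would be one more term of the same form); the coefficient values are common textbook defaults, not from the slides:

    import numpy as np

    def update_velocity(velocity, position, personal_best, global_best,
                        inertia=0.7, c1=1.5, c2=1.5, rng=None):
        """One PSO velocity update: inertia plus random pulls toward the particle's
        own best position and the best position found by the swarm so far."""
        if rng is None:
            rng = np.random.default_rng()
        r1 = rng.random(len(position))
        r2 = rng.random(len(position))
        return (inertia * velocity
                + c1 * r1 * (personal_best - position)
                + c2 * r2 * (global_best - position))

    # one step of a particle: velocity = update_velocity(...); position = position + velocity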

Utilizing Parts of Solutions. When solutions can be decomposed into parts, a feasible solution may have good parts and not-so-good parts, and updating a feasible solution may destroy the good parts. Combining good parts of multiple feasible solutions becomes possible if the search is conducted over a group of solutions. → Genetic Algorithms.

Genetic Algorithms. An algorithm mimicking evolution. 1. Start with a group (population) of initial solutions (individuals). 2. Make offspring through some alterations: decompose individuals and recombine the parts to make new individuals (mating), and apply some random alterations (mutation). 3. Pick better individuals to form the next generation (selection). 4. Loop back to 2 until an appropriate solution is obtained.

Gene, Mating, and Mutation. A gene is a list of variables. Mating (crossover): with two genes X = {x_1, x_2, …, x_n} and Y = {y_1, y_2, …, y_n}, choose a cross point k (1 ≤ k < n) at random and make the crossed genes Z = {x_1, x_2, …, x_k, y_(k+1), y_(k+2), …, y_n} and W = {y_1, y_2, …, y_k, x_(k+1), x_(k+2), …, x_n}. Mutation: random changes of values.
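
A minimal sketch of one-point crossover and mutation for genes represented as Python lists; the mutation rate and value set are hypothetical parameters:

    import random

    def crossover(x, y):
        """One-point crossover: split both parents at a random point k and swap tails."""
        k = random.randint(1, len(x) - 1)   # cross point, 1 <= k < n
        return x[:k] + y[k:], y[:k] + x[k:]

    def mutate(gene, values, rate=0.01):
        """Mutation: replace each position with a random value with probability `rate`."""
        return [random.choice(values) if random.random() < rate else g for g in gene]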

Selection. In principle, individuals with higher fitness (those with lower costs) are chosen. Strict application of this principle damages gene diversity, which may impede later improvements: not-so-good individuals may still have good parts in their genes, and through mating with other individuals those good parts may become apparent. Therefore, introduce some randomness into the selection.
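
One common way to introduce such randomness is tournament selection; a sketch (the tournament size is a hypothetical parameter, and lower cost means higher fitness):

    import random

    def select(population, cost, size, tournament=3):
        """Tournament selection: pick the best of a few random contenders, `size` times.
        Better individuals are favoured, but randomness preserves some diversity."""
        chosen = []
        for _ in range(size):
            contenders = random.sample(population, tournament)
            chosen.append(min(contenders, key=cost))
        return chosen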

Points in Gene Design. The results of mating and mutation should frequently represent feasible solutions; the search is inefficient if only a few offspring meet the constraints, and even extinction may result. Relaxing the constraints and reflecting their violation in the cost may be useful. Crossover should preserve meaningful parts of genes: closely related variables should have nearby locations on the gene.

Genes with Explicit Structures. Genes can have structures other than lists, which can make it easier to obtain feasible solutions through mating and mutation. Tree structures: an explicit correspondence between parts of solutions and the gene structure.

Island Model GA. The population is divided into groups; GA is applied to each group individually, and good individuals are exchanged occasionally, like evolution in an area consisting of islands. Each group develops distinctive sets of genes. Good parallelism with little communication. Reported to be effective even in sequential environments.
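
A sketch of the occasional exchange step, assuming islands are Python lists of individuals arranged in a ring and ranked by the cost function (the ring topology and migrant count are our assumptions):

    def migrate(islands, cost, migrants=2):
        """Ring migration: each island's best individuals replace the worst of the next."""
        best_per_island = [sorted(pop, key=cost)[:migrants] for pop in islands]
        for i, pop in enumerate(islands):
            target = islands[(i + 1) % len(islands)]
            target.sort(key=cost, reverse=True)   # worst individuals first
            target[:migrants] = best_per_island[i]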

Genetic Programming. Automatic programming through GA: programs are tree-structured genes, where nodes are primitive operations and leaves are constants and variables; fitness measures how close the program comes to the specification. The algorithm is the same as GA. Good for problems whose fitness to the specification can be stated quantitatively, e.g., finding an expression that explains a data sequence.

There Ain't No Such Thing As A Free Lunch. It is impossible to get something for nothing. In a 19th-century tradition, saloons in the US provided a free lunch to patrons who had purchased at least one drink; the food was salty, so customers usually ended up paying for a lot of beer. When we obtain something for free, it is actually at the expense of something else.

No-Free-Lunch Theorem (Wolpert and Macready, 1995). When objective functions are drawn uniformly at random, all algorithms have identical mean performance. Algorithms that perform better for some kinds of objective functions must perform worse for some other kinds. Algorithms that fit the characteristics of the objective functions should be chosen.

Metaheuristics. Heuristics not specific to problem domains: they can be applied to a variety of problems. But heuristics are still heuristics: there is no guarantee of finding a good solution, and whether the formulation fits the problem is the real question. Complicated algorithms have higher computational costs; simple repeated local search may work better.