Outline of the module

Similar documents
ARTIFICIAL INTELLIGENCE (CSCU9YE ) LECTURE 5: EVOLUTIONARY ALGORITHMS

Non-deterministic Search techniques. Emma Hart

Heuristic Optimisation

Heuristic Optimisation

Outline. Search-based Approaches and Hyper-heuristics. Optimisation problems 24/06/2014. Optimisation problems are everywhere!

Pre-requisite Material for Course Heuristics and Approximation Algorithms

Heuristic Optimization Introduction and Simple Heuristics

Hybridization EVOLUTIONARY COMPUTING. Reasons for Hybridization - 1. Naming. Reasons for Hybridization - 3. Reasons for Hybridization - 2

TABU search and Iterated Local Search classical OR methods

Outline. TABU search and Iterated Local Search classical OR methods. Traveling Salesman Problem (TSP) 2-opt

Introduction to Optimization

SLS Methods: An Overview

Escaping Local Optima: Genetic Algorithm

Introduction to Optimization Using Metaheuristics. Thomas J. K. Stidsen

Introduction to Optimization Using Metaheuristics. The Lecturer: Thomas Stidsen. Outline. Name: Thomas Stidsen: Nationality: Danish.

MVE165/MMG630, Applied Optimization Lecture 8 Integer linear programming algorithms. Ann-Brith Strömberg

INF Biologically inspired computing Lecture 1: Marsland chapter 9.1, Optimization and Search Jim Tørresen

Introduction to Optimization

Local Search and Optimization Chapter 4. Mausam (Based on slides of Padhraic Smyth, Stuart Russell, Rao Kambhampati, Raj Rao, Dan Weld )

Heuristis for Combinatorial Optimization

Heuristis for Combinatorial Optimization

CS:4420 Artificial Intelligence

Local Search and Optimization Chapter 4. Mausam (Based on slides of Padhraic Smyth, Stuart Russell, Rao Kambhampati, Raj Rao, Dan Weld )

Evolutionary Computation for Combinatorial Optimization

METAHEURISTICS. Introduction. Introduction. Nature of metaheuristics. Local improvement procedure. Example: objective function

Machine Learning for Software Engineering

A Course on Meta-Heuristic Search Methods for Combinatorial Optimization Problems

Methods and Models for Combinatorial Optimization Heuristis for Combinatorial Optimization

CS 331: Artificial Intelligence Local Search 1. Tough real-world problems

Introduction to Genetic Algorithms

February 19, Integer programming. Outline. Problem formulation. Branch-andbound

MVE165/MMG631 Linear and integer optimization with applications Lecture 9 Discrete optimization: theory and algorithms

Optimization Techniques for Design Space Exploration

Heuristic Optimisation

Local Search. CS 486/686: Introduction to Artificial Intelligence Winter 2016

Fundamentals of Integer Programming

Metaheuristic Development Methodology. Fall 2009 Instructor: Dr. Masoud Yaghini

Two approaches. Local Search TSP. Examples of algorithms using local search. Local search heuristics - To do list

Fuzzy Inspired Hybrid Genetic Approach to Optimize Travelling Salesman Problem

Lecture: Iterative Search Methods

Local Search and Optimization Chapter 4. Mausam (Based on slides of Padhraic Smyth, Stuart Russell, Rao Kambhampati, Raj Rao, Dan Weld )

Local Search. CS 486/686: Introduction to Artificial Intelligence

Introduction to Artificial Intelligence 2 nd semester 2016/2017. Chapter 4: Beyond Classical Search

CMU-Q Lecture 8: Optimization I: Optimization for CSP Local Search. Teacher: Gianni A. Di Caro

A Steady-State Genetic Algorithm for Traveling Salesman Problem with Pickup and Delivery

Search Algorithms for Regression Test Suite Minimisation

Heuristic Optimisation Lecture Notes

Data Mining Chapter 8: Search and Optimization Methods Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University

Algorithms & Complexity

GRASP. Greedy Randomized Adaptive. Search Procedure

Construction of Minimum-Weight Spanners Mikkel Sigurd Martin Zachariasen

SPATIAL OPTIMIZATION METHODS

Genetic Algorithms Variations and Implementation Issues

Job Shop Scheduling Problem (JSSP) Genetic Algorithms Critical Block and DG distance Neighbourhood Search

n Informally: n How to form solutions n How to traverse the search space n Systematic: guarantee completeness

Outline. Best-first search. Greedy best-first search A* search Heuristics Local search algorithms

SLS Algorithms. 2.1 Iterative Improvement (revisited)

CHAPTER 2 CONVENTIONAL AND NON-CONVENTIONAL TECHNIQUES TO SOLVE ORPD PROBLEM

Beyond Classical Search: Local Search. CMPSCI 383 September 23, 2011

ARTIFICIAL INTELLIGENCE. Informed search

The Heuristic (Dark) Side of MIP Solvers. Asja Derviskadic, EPFL Vit Prochazka, NHH Christoph Schaefer, EPFL

Genetic Algorithms and Genetic Programming Lecture 7

Lecture 6: The Building Block Hypothesis. Genetic Algorithms and Genetic Programming Lecture 6. The Schema Theorem Reminder

Lecture 6: Genetic Algorithm. An Introduction to Meta-Heuristics, Produced by Qiangfu Zhao (Since 2012), All rights reserved

Outline. Informed Search. Recall: Uninformed Search. An Idea. Heuristics Informed search techniques More on heuristics Iterative improvement

Note: In physical process (e.g., annealing of metals), perfect ground states are achieved by very slow lowering of temperature.

Algorithmic problem-solving: Lecture 2. Algorithmic problem-solving: Tractable vs Intractable problems. Based on Part V of the course textbook.

Artificial Intelligence

Geometric Semantic Genetic Programming ~ Theory & Practice ~

a local optimum is encountered in such a way that further improvement steps become possible.

Artificial Intelligence

Module 1 Lecture Notes 2. Optimization Problem and Model Formulation

Simple mechanisms for escaping from local optima:

ACO and other (meta)heuristics for CO

PROBLEM SOLVING AND SEARCH IN ARTIFICIAL INTELLIGENCE

Introduction to Stochastic Optimization Methods (meta-heuristics) Modern optimization methods 1

Algorithms for Integer Programming

GENETIC ALGORITHM with Hands-On exercise

15.083J Integer Programming and Combinatorial Optimization Fall Enumerative Methods

Informed search algorithms. (Based on slides by Oren Etzioni, Stuart Russell)

Introduction to Computer Science and Programming for Astronomers

ARTIFICIAL INTELLIGENCE

Exploration vs. Exploitation in Differential Evolution

Parallel Computing in Combinatorial Optimization

Massively Parallel Approximation Algorithms for the Traveling Salesman Problem

Multiobjective Optimisation. Why? Panorama. General Formulation. Decision Space and Objective Space. 1 of 7 02/03/15 09:49.

Algorithm Design (4) Metaheuristics

An Empirical Investigation of Meta-heuristic and Heuristic Algorithms for a 2D Packing Problem

Dynamically Configured λ-opt Heuristics for Bus Scheduling

Travelling salesman problem using reduced algorithmic Branch and bound approach P. Ranjana Hindustan Institute of Technology and Science

Theorem 2.9: nearest addition algorithm

A Parallel Architecture for the Generalized Traveling Salesman Problem

Informed Search Algorithms. Chapter 4

General Purpose Methods for Combinatorial Optimization

An efficient evolutionary algorithm for the orienteering problem

Lecture Plan. Best-first search Greedy search A* search Designing heuristics. Hill-climbing. 1 Informed search strategies. Informed strategies

TDDC17. Intuitions behind heuristic search. Recall Uniform-Cost Search. Best-First Search. f(n) =... + h(n) g(n) = cost of path from root node to n

Integer Programming. Xi Chen. Department of Management Science and Engineering International Business School Beijing Foreign Studies University

An evolutionary annealing-simplex algorithm for global optimisation of water resource systems

Transcription:

Evolutionary and Heuristic Optimisation (ITNPD8)
Lecture 2: Heuristics and Metaheuristics
Gabriela Ochoa (goc@stir.ac.uk), http://www.cs.stir.ac.uk/~goc/
Computing Science and Mathematics, School of Natural Sciences, University of Stirling, Stirling, Scotland

Outline of the module
1. Optimisation problems
   - Optimisation and search
   - Classic mathematical models
   - Example problems (Knapsack, TSP, others)
2. Optimisation methods
   - Heuristics and metaheuristics
   - Single-point algorithms
   - Population-based algorithms
3. Advanced topics
   - Fitness landscape analysis
   - Multi-objective optimisation

Optimisation/search algorithms

Exact algorithms:
- Guarantee finding the optimal solution
- Useful when problems can be solved in polynomial time, or for small instances
- Special purpose: generating bounds (dual ascent, Lagrangean relaxation)
- General purpose: branch and bound, cutting planes

Approximate algorithms:
- Do not guarantee finding the optimal solution
- For most interesting optimisation problems no polynomial methods are known
- Special purpose: approximation algorithms, greedy/constructive heuristics
- General purpose: meta- and hyper-heuristics (single point, population based)

Approximation algorithms: an attempt to formalise heuristics (emerged from the field of theoretical computer science); polynomial-time heuristics that provide some sort of guarantee on the quality of the solution.

Terminology and dates
- Heuristic: from the Greek word heuriskein, the art of discovering new strategies to solve problems
  - Heuristics for solving optimization problems: G. Polya (1945)
  - A method for helping to solve a problem, commonly informal rules of thumb, educated guesses, or simply common sense
- The prefix meta: Greek for an upper-level methodology
- Metaheuristics: the term was introduced by Fred Glover (1986). Other terms: modern heuristics, heuristic optimisation, stochastic local search

G. Polya, How to Solve It. Princeton University Press, Princeton NJ, 1945.
F. Glover, Future Paths for Integer Programming and Links to Artificial Intelligence, Computers & Ops. Res., Vol. 13, No. 5, pp. 533-549, 1986.

What is a heuristic?
An optimisation method that tries to exploit problem-specific knowledge, and for which we have no guarantee of finding the optimal solution.

Construction heuristics:
- Search space: partial candidate solutions
- Search step: extension with one or more solution components
- Example in TSP: nearest neighbour

Improvement heuristics:
- Search space: complete candidate solutions
- Search step: modification of one or more solution components
- Example in TSP: 2-opt

Video for TSP: https://www.youtube.com/watch?v=sc5cx8dratu

What is a metaheuristic?
- Extended variants of improvement heuristics
- General-purpose solvers, usually applicable to a large variety of problems
- Use two phases during search:
  - Intensification (exploitation): focuses the application of operators on high-quality solutions
  - Diversification (exploration): systematically modifies existing solutions so that new areas of the search space are explored

Genealogy of metaheuristics
A family tree of metaheuristics, starting from the Simplex Algorithm (G. Dantzig, 1947) and early local search (J. Edmonds, 1971). Source: Metaheuristics: From Design to Implementation, El-Ghazali Talbi (2009).

Classification of metaheuristics
Different ways of classifying metaheuristics:
- Nature-inspired vs. non-nature-inspired
- Population-based vs. single-point search
- Dynamic vs. static objective function
- One vs. various neighbourhood structures
- Memory usage vs. memory-less methods

Key components of metaheuristics
- Problem representation: describes the encoding of solutions; determines the application of search operators
- Fitness function: often the same as the objective function; extensions might be necessary (e.g. for infeasible solutions)
- Search/variation operators: closely related to the representation; mutation, recombination, ruin-recreate
- Initial solution(s): created randomly, or seeded with higher-quality or biased solutions
- Search strategy: defines the intensification/diversification mechanisms; many possibilities and alternatives!

The knapsack problem
A thief breaks into a store and wants to fill his knapsack with as much value in goods as possible before making his escape. Given the following list of items available, what should he take?
- Item A, weighing wa kg and valued at va
- Item B, weighing wb kg and valued at vb
- Item C, weighing wc kg and valued at vc

Formulation as a mathematical model (example instance):
maximise 4x1 + 2x2 + x3 + 10x4 + 2x5
subject to 12x1 + 2x2 + x3 + 4x4 + x5 <= 15
x1, x2, x3, x4, x5 in {0, 1}
where xi = 1 if we select item i, and 0 otherwise.
Search space size = 2^n; for n = 100, 2^100 is about 10^30.
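The representation and fitness function above can be sketched in Python for the 5-item model instance. The reject-style fitness (infeasible solutions score 0) is one of the constraint-handling choices discussed later, picked here for brevity; it is an assumption, not the lecture's prescription.

```python
# The 5-item model instance from the slide: maximise value, weight <= 15.
VALUES = [4, 2, 1, 10, 2]
WEIGHTS = [12, 2, 1, 4, 1]
CAPACITY = 15

def evaluate(x):
    """Fitness of a binary solution x: total value if feasible, else 0."""
    weight = sum(w for w, bit in zip(WEIGHTS, x) if bit)
    value = sum(v for v, bit in zip(VALUES, x) if bit)
    return value if weight <= CAPACITY else 0

print(evaluate([0, 1, 1, 1, 1]))  # feasible: weight 8 <= 15, value 15
print(evaluate([1, 1, 1, 1, 1]))  # infeasible: weight 20 > 15, scores 0
```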

The knapsack problem
- Input: a capacity K, and n items with weights wi and values vi
- Goal: output a set of items s such that the sum of weights of the items in s is at most K and the sum of values of the items in s is maximised

Exhaustive enumeration
- Try out all possible ways of packing/leaving out the items
- For each way, it is easy to calculate the total weight carried and the total value carried

Consider the following knapsack problem instance:

3
1 5 4
2 12 10
3 8 5
11

Where: the first line gives the number of items; the last line gives the capacity of the knapsack; the remaining lines give the index, value and weight of each item.

Question: How many possible solutions can you enumerate?
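Exhaustive enumeration of this 3-item instance can be sketched in Python (there are 2^3 = 8 candidate solutions):

```python
from itertools import product

# The 3-item instance from the slide: values, weights, capacity 11.
VALUES = [5, 12, 8]
WEIGHTS = [4, 10, 5]
CAPACITY = 11

def enumerate_knapsack(values, weights, capacity):
    """Try all 2^n packings; return (best_value, best_bits)."""
    best_value, best_bits = 0, (0,) * len(values)
    for bits in product((0, 1), repeat=len(values)):
        weight = sum(w * b for w, b in zip(weights, bits))
        value = sum(v * b for v, b in zip(values, bits))
        if weight <= capacity and value > best_value:
            best_value, best_bits = value, bits
    return best_value, best_bits

print(enumerate_knapsack(VALUES, WEIGHTS, CAPACITY))  # (13, (1, 0, 1))
```

The run agrees with the enumeration table on the next slide: items 1 and 3 together (bit string 101) give value 13 at weight 9.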

Knapsack, full enumeration

Items  Value  Weight  Feasible?
000    0      0       Yes
001    8      5       Yes
010    12     10      Yes
011    20     15      No
100    5      4       Yes
101    13     9       Yes  <- Optimal!
110    17     14      No
111    25     19      No

The notion of neighbourhood
- A region of the search space that is near to some particular point in that space
- Define a distance function dist on the search space S: dist: S x S -> R
- N(x) = { y in S : dist(x, y) <= epsilon }
- Examples:
  - Euclidean distance, for search spaces defined over continuous variables
  - Hamming distance, for search spaces defined over binary strings (the Hamming distance between two strings is the number of positions at which the corresponding symbols are different)
- Picture: a search space S, a potential solution x, and its neighbourhood N(x)
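The Hamming distance defined above is a one-liner in Python; a minimal sketch:

```python
def hamming(a, b):
    """Number of positions at which the corresponding symbols differ."""
    assert len(a) == len(b), "Hamming distance needs equal-length strings"
    return sum(x != y for x, y in zip(a, b))

print(hamming("11001", "01001"))  # 1: the strings are 1-flip neighbours
print(hamming("11001", "01011"))  # 2: a 2-flip neighbour
```

Under this distance, the 1-flip neighbourhood of the next slide is exactly N(x) with epsilon = 1.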

Defining neighbourhoods: binary representation
- 1-flip: solutions generated by flipping a single bit in the given bit string. If the string length is n, how many neighbours does each solution have?
  Example: 1 1 0 0 1 -> 0 1 0 0 1
- 2-flip: solutions generated by flipping two bits in the given bit string. If the string length is n, how many neighbours does each solution have?
  Example: 1 1 0 0 1 -> 0 1 0 1 1
- k-flip: can be generalised for larger k, with k < n

Defining neighbourhoods: permutation representation
- 2-swap: solutions generated by swapping two cities in a given tour. Every solution has n(n-1)/2 neighbours.
  Examples: 2 4 5 3 1 -> 2 3 5 4 1; 1 3 5 2 6 4 7 8 -> 1 3 7 2 6 4 5 8
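Both neighbourhoods can be generated explicitly, which also answers the counting questions above (n neighbours for 1-flip, n(n-1)/2 for 2-swap). A sketch:

```python
from itertools import combinations

def one_flip_neighbours(bits):
    """All solutions obtained by flipping exactly one bit: n neighbours."""
    return [bits[:i] + [1 - bits[i]] + bits[i+1:] for i in range(len(bits))]

def two_swap_neighbours(tour):
    """All tours obtained by swapping two cities: n(n-1)/2 neighbours."""
    result = []
    for i, j in combinations(range(len(tour)), 2):
        t = list(tour)
        t[i], t[j] = t[j], t[i]   # swap the cities at positions i and j
        result.append(t)
    return result

print(len(one_flip_neighbours([1, 1, 0, 0, 1])))  # 5 neighbours for n = 5
print(len(two_swap_neighbours([2, 4, 5, 3, 1])))  # 10 = 5*4/2 neighbours
```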

The notions of local optimum and global optimum
- A solution is a local optimum if no solution in its neighbourhood has a better evaluation.
- A solution is the global optimum if no solution in the whole search space has a better evaluation.
- Maximisation problem: local maximum and global maximum
- Minimisation problem: local minimum and global minimum
- Plurals: optima, maxima, minima

Hill climbing (iterative improvement) algorithm (first improvement, random mutation)

procedure first-hill-climbing
begin
  s = random initial solution
  repeat
    evaluate solution s
    s' = random neighbour of s
    if evaluation(s') is better than evaluation(s) then s = s'
  until stopping criterion satisfied
  return s
end

The stopping criterion can be a fixed number of iterations.
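The first-improvement pseudocode above can be sketched in Python on the 3-item knapsack instance used earlier. Scoring infeasible solutions as 0 is an assumption (a reject-style fitness); note that under the 1-flip neighbourhood, 010 (value 12) is a local optimum while 101 (value 13) is the global optimum.

```python
import random

# The slide's 3-item instance: values, weights, capacity 11.
VALUES = [5, 12, 8]
WEIGHTS = [4, 10, 5]
CAPACITY = 11

def evaluate(x):
    weight = sum(w * b for w, b in zip(WEIGHTS, x))
    value = sum(v * b for v, b in zip(VALUES, x))
    return value if weight <= CAPACITY else 0  # reject-style fitness

def first_hill_climbing(iterations, rng):
    """First improvement with a random 1-flip mutation, as in the pseudocode."""
    s = [rng.randint(0, 1) for _ in VALUES]
    for _ in range(iterations):
        s2 = list(s)
        i = rng.randrange(len(s2))
        s2[i] = 1 - s2[i]               # random neighbour: flip one bit
        if evaluate(s2) > evaluate(s):  # accept only improvements
            s = s2
    return s

best = first_hill_climbing(500, random.Random(1))
print(best, evaluate(best))  # ends at a local optimum: value 12 or 13
```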

Hill climbing algorithm (best improvement)

procedure best-hill-climbing
begin
  s = random initial solution
  repeat
    evaluate solution s
    s' = best solution in the neighbourhood of s
    if evaluation(s') is better than evaluation(s) then s = s'
  until s is a local optimum
  return s
end

Constraint handling
Dealing with constraints is not trivial. Common strategies:
- Reject strategies: only feasible solutions are kept
- Penalising strategies: penalty functions
- Repairing strategies: repairing infeasible solutions
- Decoding strategies: only feasible solutions are generated
- Preserving strategies: specific representations and search operators which preserve feasibility
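A best-improvement sketch on the same 3-item instance (again with a reject-style fitness, an assumption for brevity). The start solution is a parameter here so each run is deterministic, which makes the dependence on the starting point easy to see:

```python
# Best-improvement hill climbing on the slide's 3-item knapsack instance.
VALUES = [5, 12, 8]
WEIGHTS = [4, 10, 5]
CAPACITY = 11

def evaluate(x):
    weight = sum(w * b for w, b in zip(WEIGHTS, x))
    value = sum(v * b for v, b in zip(VALUES, x))
    return value if weight <= CAPACITY else 0  # reject-style fitness

def best_hill_climbing(start):
    s = list(start)
    while True:
        neighbours = []
        for i in range(len(s)):        # full 1-flip neighbourhood
            n = list(s)
            n[i] = 1 - n[i]
            neighbours.append(n)
        best = max(neighbours, key=evaluate)
        if evaluate(best) > evaluate(s):
            s = best                   # move to the best neighbour
        else:
            return s                   # s is a local optimum

print(best_hill_climbing([0, 0, 1]))  # climbs to [1, 0, 1]: the global optimum, 13
print(best_hill_climbing([0, 0, 0]))  # stuck at [0, 1, 0]: a local optimum, 12
```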

Hill-climbing search
Problem: depending on the initial state, hill climbing can get stuck in local maxima.
[Figure: a fitness landscape with two hill climbers; one reaches only a local optimum, e.g. the tour (3,4,1,2) with cost 395, while the global optimum sits on another peak.]

The hill-climbing algorithm is blind
Like climbing a mountain in thick fog with amnesia?

Hill-climbing methods: weaknesses
- Usually terminate at solutions that are local optima
- Give no information as to how much the discovered local optimum deviates from the global optimum (or even from other local optima)
- The obtained optimum depends on the starting point
- Usually no upper bound on computation time

Hill-climbing methods: advantages
- Very easy to apply: only needs a representation, the evaluation function, and a neighbourhood
- Used in combination with other heuristics (memetic algorithms, iterated local search)

Implementation: binary representation
1-flip mutation: flipping a single bit in the given bit string. For strings of length n, every solution has n neighbours.
Example: 1 1 0 0 1 -> 0 1 0 0 1
(Python implementation shown on the original slide.)

Practical exercises
- Lab 1: Random search to solve the simple knapsack problem; the random mutation operation. Optional: exhaustive enumeration.
- Lab 2: Implement variants of hill climbing (best improvement, first improvement) to solve the simple knapsack problem; implement a multi-start hill climbing. Optional: iterated local search.
- Lab 3: Implement a simple genetic algorithm to solve the simple knapsack problem. Optional: hybridise with hill climbing.
- Extra lab: catch up with checkpoints. Which of the implemented algorithms is best for this problem, and why?
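The Python implementation from the original slide is not reproduced in this transcript; a minimal 1-flip mutation sketch (the function name and `rng` parameter are my own) might look like:

```python
import random

def one_flip_mutation(bits, rng=random):
    """Return a copy of the bit string with one randomly chosen bit flipped."""
    i = rng.randrange(len(bits))
    child = list(bits)
    child[i] = 1 - child[i]
    return child

parent = [1, 1, 0, 0, 1]
child = one_flip_mutation(parent, random.Random(0))
print(parent, "->", child)  # child differs from parent in exactly one position
```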

What do we mean by random search?
Attempting several solutions at random, and keeping the best found.

procedure random-search
begin
  s = random initial solution
  repeat
    evaluate solution s
    s' = random solution
    if evaluation(s') is better than evaluation(s) then s = s'
  until stopping criterion satisfied
  return s
end

The stopping criterion can be a fixed number of iterations.
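A random-search sketch for the 3-item knapsack instance, matching the pseudocode above (the reject-style fitness, scoring infeasible samples as 0, is again an assumption):

```python
import random

# The slide's 3-item knapsack instance: values, weights, capacity 11.
VALUES = [5, 12, 8]
WEIGHTS = [4, 10, 5]
CAPACITY = 11

def evaluate(x):
    weight = sum(w * b for w, b in zip(WEIGHTS, x))
    value = sum(v * b for v, b in zip(VALUES, x))
    return value if weight <= CAPACITY else 0  # reject-style fitness

def random_search(iterations, rng):
    """Sample solutions uniformly at random; keep the best found."""
    s = [rng.randint(0, 1) for _ in VALUES]
    for _ in range(iterations):
        s2 = [rng.randint(0, 1) for _ in VALUES]
        if evaluate(s2) > evaluate(s):
            s = s2
    return s

best = random_search(200, random.Random(42))
print(best, evaluate(best))
```

On this tiny search space (8 solutions), 200 samples almost surely find the global optimum [1, 0, 1] with value 13; on realistically large spaces, random search is the baseline the hill-climbing variants are meant to beat.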