Introduction to Stochastic Optimization Methods (meta-heuristics)
Modern optimization methods

Efficiency of optimization methods
[figure: efficiency versus problem type (combinatorial, unimodal, multimodal) for enumeration or Monte Carlo, robust methods and specialized methods]

Classification of optimization methods

History of metaheuristics [Wikipedia]

Direct search methods
Work without knowledge of gradients (zero-order methods).
Heuristics: "... a heuristic is an algorithm that ignores whether the solution to the problem can be proven to be correct, but which usually produces a good solution or solves a simpler problem that contains or intersects with the solution of the more complex problem. Heuristics are typically used when there is no known way to find an optimal solution, or when it is desirable to give up finding the optimal solution for an improvement in run time." [Wikipedia] For instance: the Nelder-Mead algorithm.
Metaheuristics: a metaheuristic "designates a computational method that optimizes a problem by iteratively trying to improve a candidate solution." [Wikipedia]

Nelder-Mead method
A heuristic method, also known as the (downhill) simplex method. Suitable for low dimensions (roughly Dim < 10).

Nelder-Mead method [Gioda & Maier, 1980]

Nelder-Mead method
Basic simplex operations: 0) original triangle, 1) expansion, 2) reflection, 3) outer contraction, 4) inner contraction, 5) reduction [Čermák & Hlavička, VUT, 2006]

Nelder-Mead method
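As an illustration only (not part of the slides), one common way to run the Nelder-Mead method in Python is through SciPy; the quadratic test objective, starting point and tolerances below are assumed for the example.

# Illustrative sketch: Nelder-Mead on a simple 2-D test function via SciPy.
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Smooth test function with its minimum at (1, 2).
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

x0 = np.array([5.0, -3.0])                     # starting point of the initial simplex
result = minimize(objective, x0, method="Nelder-Mead",
                  options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 1000})
print(result.x, result.fun)                    # expected: approximately [1, 2] and 0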

Meta-heuristics
Also called stochastic optimization methods: they use random numbers, so their behavior is random (non-deterministic). They work by altering existing solutions through local changes. Examples: the Monte Carlo method, the dynamic hill-climbing algorithm, simulated annealing, threshold acceptance, tabu search.

Monte Carlo method
Also called the blind algorithm; it also serves as a benchmark (brute-force search).

t = 0
Create P, evaluate P
while (not stopping_criterion) {
  t = t + 1
  Randomly create N, evaluate N
  If N is better than P, then P = N
}
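A minimal Python sketch of this blind search for a box-constrained continuous minimization problem; the objective, bounds and evaluation budget are assumptions for illustration.

# Blind Monte Carlo search following the pseudocode above.
import numpy as np

def monte_carlo_search(f, lower, upper, max_evals=10_000, rng=None):
    rng = rng or np.random.default_rng(0)
    P = rng.uniform(lower, upper)              # create P
    fP = f(P)                                  # evaluate P
    for _ in range(max_evals):                 # stopping criterion: evaluation budget
        N = rng.uniform(lower, upper)          # randomly create N
        fN = f(N)                              # evaluate N
        if fN < fP:                            # if N is better than P, then P = N
            P, fP = N, fN
    return P, fP

best_x, best_f = monte_carlo_search(lambda x: np.sum(x ** 2),
                                    lower=np.array([-5.0, -5.0]),
                                    upper=np.array([5.0, 5.0]))
print(best_x, best_f)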

Global view on optimization

Local view on optimization

Hill-climbing algorithm
Search in the vicinity of the current solution. [figure: searching for max f(x) around the current solution]

Hill-climbing algorithm
Search in the vicinity of the current solution.

t = 0
Create P, evaluate P
while (not stopping_criterion) {
  t = t + 1
  Randomly or systematically create N_i in the vicinity of P, evaluate N_i
  The best N_i replaces P
}
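A minimal Python sketch of this hill-climbing scheme for a continuous minimization problem; the step size, the number of neighbours per iteration and the test objective are assumptions, and unlike the pseudocode this variant replaces P only when the best neighbour actually improves it.

# Hill climbing: sample random neighbours of P and keep the best improving one.
import numpy as np

def hill_climbing(f, x0, step=0.1, n_neighbours=20, max_iter=1000, rng=None):
    rng = rng or np.random.default_rng(0)
    P = np.asarray(x0, dtype=float)
    fP = f(P)
    for _ in range(max_iter):
        # randomly create N_i in the vicinity of P and evaluate them
        neighbours = P + rng.normal(0.0, step, size=(n_neighbours, P.size))
        values = np.array([f(n) for n in neighbours])
        best = values.argmin()
        if values[best] < fP:                  # the best N_i replaces P (if it improves)
            P, fP = neighbours[best], values[best]
    return P, fP

print(hill_climbing(lambda x: np.sum(x ** 2), x0=[4.0, -3.0]))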

Simulated annealing (SA)
Proposed in the 1980s independently by [Kirkpatrick et al., 1983] and [Černý, 1985]. It is based on a physical analogy, the annealing of metals, where slow cooling brings the material to a state of minimum energy; this corresponds to global minimization. A mathematical proof of convergence exists.

Physical background
The theory is based on observations of metallic crystals at different temperatures. [figure: crystal structure at high temperature vs. low temperature]

Principles of the method
A stochastic optimization method with a probability of escaping from local minima: a worse solution is accepted with probability Pr(ΔE) = exp(-ΔE / (k_B T)), and the temperature T is decreased consecutively according to a so-called cooling schedule.

Temperature influence
Probability Pr(ΔE) = exp(-ΔE / (k_B T)). [figure: behavior of this formula for temperatures T = 5, 4, 3, 2, 1 and 0.5, with Pr plotted against ΔE]
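To make the temperature influence concrete, a tiny numeric sketch (k_B folded into T, the ΔE values are assumed) of the acceptance probability for worsening moves at a few of the plotted temperatures.

# Acceptance probability exp(-dE/T) for worsening moves (dE > 0) at several temperatures.
import math

for T in (5.0, 2.0, 1.0, 0.5):
    probs = [math.exp(-dE / T) for dE in (0.5, 1.0, 5.0, 10.0)]
    print(f"T={T}: " + ", ".join(f"{p:.3f}" for p in probs))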

Algorithm

T = T_max, create P, evaluate P
while (not stopping_criterion) {
  count = succ = 0
  while (count < count_max and succ < succ_max) {
    count = count + 1
    Alter P by operator O, the result is N
    p = exp(-(F(N) - F(P)) / T)   (acceptance probability)
    if (random number u[0, 1] < p) {
      succ = succ + 1
      P = N
    }
  }
  Decrease T (cooling schedule)
}

Algorithm
The initial temperature T_max is recommended to be set so that, e.g., about 50% of random solutions are accepted; in practice this is found experimentally by trial and error. The parameter count_max is the maximum number of all iterations at a fixed temperature level and succ_max is the maximum number of successful iterations at that level (a so-called Markov chain). The recommended ratio is count_max = 10 · succ_max.

Algorithm
The stopping criterion is usually a maximum number of function calls. As a mutation operator, the addition of a Gaussian random number with zero mean is usually used, i.e. N = P + G(0, σ), where the standard deviation σ is usually obtained by trial and error.
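Putting the algorithm and the parameter recommendations above together, here is a minimal Python sketch (not from the slides): it assumes a continuous minimization problem, folds k_B into T, uses geometric cooling T <- T_mult · T as described below, and stops at a final temperature rather than a function-call budget; all numeric settings (σ, T_max, T_min, T_mult, succ_max) are illustrative.

# Minimal simulated-annealing sketch with Gaussian mutation and geometric cooling.
import numpy as np

def simulated_annealing(f, x0, sigma=0.5, T_max=10.0, T_min=0.01,
                        T_mult=0.95, succ_max=20, rng=None):
    rng = rng or np.random.default_rng(0)
    count_max = 10 * succ_max                  # recommended ratio count_max = 10 * succ_max
    P = np.asarray(x0, dtype=float)
    fP = f(P)
    T = T_max
    while T > T_min:                           # stopping criterion (here: final temperature)
        count = succ = 0
        while count < count_max and succ < succ_max:
            count += 1
            N = P + rng.normal(0.0, sigma, size=P.size)   # mutation: N = P + G(0, sigma)
            fN = f(N)
            p = np.exp(-(fN - fP) / T)         # acceptance probability
            if rng.random() < p:               # improvements give p >= 1, so they are always accepted
                succ += 1
                P, fP = N, fN
        T *= T_mult                            # cooling schedule
    return P, fP

print(simulated_annealing(lambda x: np.sum(x ** 2), x0=[4.0, -3.0]))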

Cooling algorithms
Classical (geometric) cooling: T_{i+1} = T_mult · T_i, where T_mult = 0.99 or 0.999. Other possibilities exist, see e.g. [Ingber]. These settings usually do not provide enough flexibility to control the algorithm.

Cooling algorithms
Proposed methodology [Lepš, MSc. Thesis], [Lepš, Šejnoha]: choose T_mult so that, after the maximum number of iterations iter_max, the minimal temperature T_min = 0.01 · T_max is reached (the defining relation appeared as an equation on the slide and was lost in transcription). For instance: for succ_max/iter_max = 0.001, T_mult = 0.995.
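The geometric-cooling factor consistent with the quoted example (T_min = 0.01 · T_max and succ_max/iter_max = 0.001 giving T_mult of about 0.995) would be T_mult = (T_min/T_max)^(succ_max/iter_max); a tiny Python sketch under that assumption.

# Assumed reconstruction of the cooling-factor relation: with roughly one cooling step
# per succ_max successful iterations and an overall budget of iter_max iterations,
# T_mult = (T_min / T_max) ** (succ_max / iter_max) reaches T_min at the end of the run.
def cooling_factor(T_min_over_T_max=0.01, succ_over_iter=0.001):
    return T_min_over_T_max ** succ_over_iter

print(round(cooling_factor(), 4))   # ~0.9954, i.e. the 0.995 quoted on the slide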

Note on probability
For population-based algorithms: Pr(ΔE) = 1 / (1 + exp(ΔE / (k_B T))). [figure: comparison of the formulas p = exp(-ΔE/T) and p = 1/(1 + exp(ΔE/T)) for T = 10, with Pr plotted against ΔE]
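A small numeric sketch (ΔE values assumed, k_B again folded into T) comparing the two acceptance formulas at T = 10, mirroring the plotted comparison.

# Compare exp(-dE/T) with the population-oriented form 1/(1 + exp(dE/T)) at T = 10.
import math

T = 10.0
for dE in (-5, -2, 0, 2, 5):
    p_exp = math.exp(-dE / T)
    p_logistic = 1.0 / (1.0 + math.exp(dE / T))
    print(f"dE={dE:+d}: exp={p_exp:.3f}, logistic={p_logistic:.3f}")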

SA convergence

Threshold acceptance
Simpler than SA: instead of an acceptance probability there is a threshold, or range, within which a worse solution may replace the old one. The threshold is decreased in the same way as the temperature in SA. There is a proof that this method does not guarantee finding the global minimum.
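For contrast with SA's probabilistic rule, a minimal Python sketch of threshold acceptance on a continuous problem; the mutation operator, the threshold schedule and all numeric settings are assumptions for illustration.

# Threshold acceptance: accept N whenever it is not worse than P by more than the
# current threshold; the threshold shrinks like the SA temperature.
import numpy as np

def threshold_accepting(f, x0, sigma=0.5, threshold=5.0, thr_mult=0.95,
                        iters_per_level=100, levels=100, rng=None):
    rng = rng or np.random.default_rng(0)
    P = np.asarray(x0, dtype=float)
    fP = f(P)
    for _ in range(levels):
        for _ in range(iters_per_level):
            N = P + rng.normal(0.0, sigma, size=P.size)
            fN = f(N)
            if fN - fP < threshold:            # deterministic acceptance rule
                P, fP = N, fN
        threshold *= thr_mult                  # decrease the threshold, as with cooling
    return P, fP

print(threshold_accepting(lambda x: np.sum(x ** 2), x0=[4.0, -3.0]))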

References
[1] Kirkpatrick, S., Gelatt, Jr., C. D., and Vecchi, M. P. (1983). Optimization by Simulated Annealing. Science, 220:671-680.
[2] Černý, V. (1985). Thermodynamical approach to the travelling salesman problem: An efficient simulation algorithm. J. Opt. Theory Appl., 45:41-51.
[3] Ingber, L. (1993). Simulated annealing: Practice versus theory. Mathematical and Computer Modelling, 18(11):29-57.
[4] Ingber, L. (1995). Adaptive simulated annealing (ASA): Lessons learned. Control and Cybernetics, 25(1):33-54.
[5] Vidal, R. V. V. (1993). Applied Simulated Annealing, volume 396 of Lecture Notes in Economics and Mathematical Systems. Springer-Verlag.

References
[6] Dréo, J., Pétrowski, A., Siarry, P., Taillard, E., and Chatterjee, A. (2005). Metaheuristics for Hard Optimization: Methods and Case Studies. Springer.
[7] Lepš, M. and Šejnoha, M. (2003). New approach to optimization of reinforced concrete beams. Computers & Structures, 81(18-19):1957-1966.
[8] Burke, E. K. and Kendall, G. (2005). Search Methodologies. Springer Science+Business Media.
[9] Weise, T., et al. (2009). Why is optimization difficult? In Nature-Inspired Algorithms for Optimisation, pages 1-50. Springer Berlin Heidelberg.
[10] Venkataraman, S. and Haftka, R. T. (2004). Structural optimization complexity: What has Moore's law done for us? Struct Multidisc Optim, 28:375-387.

Travelling Salesman Problem (TSP)
One of the most famous discrete optimization problems. The goal is to find the shortest closed path (a tour visiting every city exactly once) in an edge-weighted graph. For n cities there are (n-1)!/2 possible solutions. It is an NP-hard task (its decision version is NP-complete).

Travelling Salesman Problem (TSP): Example
[figure: four cities A, B, C, D with edge weights A-B = 20, A-C = 42, A-D = 35, B-C = 30, B-D = 34, C-D = 12]

Distance matrix:
     A    B    C    D
A    0   20   42   35
B   20    0   30   34
C   42   30    0   12
D   35   34   12    0

Travelling Salesman Problem (TSP)
[figure: the three possible tours, Solutions I, II and III, drawn on the graph of cities A, B, C, D]

No.   Order of the cities   Distance
I.    A B C D               97
II.   A B D C               108
III.  A C B D               141
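The table above can be reproduced by brute-force enumeration of the (n-1)!/2 = 3 distinct closed tours; a small Python sketch.

# Brute-force check of the 4-city example: fix A as the starting city, skip reversed
# duplicates, and compute each tour length from the distance matrix.
from itertools import permutations

D = {"A": {"A": 0, "B": 20, "C": 42, "D": 35},
     "B": {"A": 20, "B": 0, "C": 30, "D": 34},
     "C": {"A": 42, "B": 30, "C": 0, "D": 12},
     "D": {"A": 35, "B": 34, "C": 12, "D": 0}}

def tour_length(tour):
    # closed tour: the last leg returns to the starting city
    return sum(D[a][b] for a, b in zip(tour, tour[1:] + tour[:1]))

tours = [("A",) + p for p in permutations("BCD") if p <= p[::-1]]
for t in sorted(tours, key=tour_length):
    print(" ".join(t), tour_length(t))
# prints: A B C D 97, A B D C 108, A C B D 141 (solutions I, II, III)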

Travelling Salesman Problem (TSP)

No. of cities   Possible paths   Computational demands
4               3                3.0 · 10^-9 seconds
5               12               1.2 · 10^-8 seconds
6               60               6.0 · 10^-8 seconds
7               360              3.6 · 10^-7 seconds
8               2520             2.5 · 10^-6 seconds
9               20160            2.0 · 10^-5 seconds
10              181440           1.8 · 10^-4 seconds
20              6.1 · 10^16      19.3 years
50              3.0 · 10^62      9.6 · 10^46 years
100             4.7 · 10^157     1.5 · 10^140 years

Based on a computer speed of 10^9 operations per second.

Travelling Salesman Problem (TSP)
Software: The OPTNET Procedure (1), Mathematica (2), http://gebweb.net/optimap/
(1) http://support.sas.com/documentation/cdl/en/ornoaug/65289/html/default/viewer.htm#ornoaug_optnet_examples07.htm
(2) http://mathematica.stackexchange.com/questions/15985/solving-the-travelling-salesman-problem

Example: tsp.m

Local operator
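The slide presumably illustrates a local move on a TSP tour; as one illustrative example (not necessarily the operator shown in the lecture), a 2-opt-style move that reverses a randomly chosen segment of the tour.

# Illustrative local operator for a TSP tour: reverse a random segment (2-opt move).
import random

def two_opt_move(tour, rng=random):
    i, j = sorted(rng.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

print(two_opt_move(["A", "B", "C", "D", "E"], random.Random(1)))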

Some parts of this presentation were kindly prepared by Ing. Adéla Pospíšilová from the Faculty of Civil Engineering, CTU in Prague. A humble plea: please feel free to e-mail any suggestions, errors and typos to matej.leps@fsv.cvut.cz. News 21.10.2013: Added the Nelder-Mead method and redid the introduction. News 5.11.2014: Added TSP at the end. News 10.11.2015: Enhancement of the TSP part. Date of the last version: 10.11.2015. Version: 003