Introduction to Optimization

Introduction to Optimization: Approximation Algorithms and Heuristics
November 21, 2016, École Centrale Paris, Châtenay-Malabry, France
Dimo Brockhoff, Inria Saclay Ile-de-France

Exercise: The Knapsack Problem and Dynamic Programming
http://chercheurs.lille.inria.fr/~brockhof/introoptimization/exercises.php

Conclusions

I hope it became clear...
...what the algorithm design idea of dynamic programming is,
...for which problem types it is supposed to be suitable,
...and how to apply the idea to the knapsack problem.

Course Overview

Fri, 7.10.2016: Introduction
Fri, 28.10.2016: (D) Introduction to Discrete Optimization + Greedy algorithms I
Fri, 4.11.2016: (D) Greedy algorithms II + Branch and bound
Fri, 18.11.2016: (D) Dynamic programming
Mon, 21.11.2016 (in S103-S105): (D) Approximation algorithms and heuristics
Fri, 25.11.2016 (in S103-S105): (C) Introduction to Continuous Optimization I
Mon, 28.11.2016: (C) Introduction to Continuous Optimization II
Mon, 5.12.2016: (C) Gradient-based Algorithms
Fri, 9.12.2016: (C) Stochastic Optimization and Derivative Free Optimization I
Mon, 12.12.2016: (C) Stochastic Optimization and Derivative Free Optimization II
Fri, 16.12.2016: (C) Benchmarking Optimizers with the COCO platform
Wed, 4.1.2017: Exam

All classes last 3h15 and take place in S115-S117 (see the exceptions noted above).

Overview of Today's Lecture

Approximation algorithms:
- a greedy approximation algorithm for bin packing
- an FPTAS for the knapsack problem (KP)

Overview of (randomized) search heuristics:
- randomized local search
- variable neighborhood search
- tabu search
- evolutionary algorithms

[Exercise: simple randomized algorithms for the knapsack problem]

Potential Master's/PhD thesis projects

Approximation Algorithms

Coping with Difficult Problems

Exact approaches: brute force is often too slow; better strategies such as dynamic programming and branch and bound help, but the runtime often remains exponential.

Approximation algorithms (now): guarantee of low runtime and guarantee of a high-quality solution; the obstacle is that these guarantees can be difficult to prove.

Heuristics (later today): intuitive algorithms that are guaranteed to run in short time, but often come with no guarantees on solution quality.

Approximations, PTAS, and FPTAS

An algorithm is a ρ-approximation algorithm for problem Π if, for each problem instance of Π, it outputs a feasible solution whose objective value is within a ratio ρ of the true optimum for that instance.

An algorithm A is an approximation scheme for a minimization problem Π (maximization is analogous) if, for any instance I of Π and a parameter ε > 0, it outputs a solution s with f_Π(I, s) ≤ (1 + ε)·OPT.

An approximation scheme is called a polynomial-time approximation scheme (PTAS) if, for every fixed ε > 0, its running time is polynomially bounded in the size of the instance I. Note: the runtime might actually be exponential in 1/ε!

An approximation scheme is a fully polynomial-time approximation scheme (FPTAS) if its runtime is bounded polynomially in both the size of the input I and 1/ε.

Today's Lecture

Only example algorithms, and no detailed proofs, due to the time restrictions of the class; we rather spend more time on the heuristics part and the exercise.

Two examples:
- a greedy approximation algorithm for bin packing
- an FPTAS for the knapsack problem

Bin Packing (BP)

Bin Packing Problem: Given a set of n items with sizes a_1, a_2, ..., a_n, find an assignment of the a_i to bins of size V such that the number of bins is minimal and the sum of the sizes of all items assigned to each bin is at most V.

Known facts:
- no polynomial-time optimization algorithm reaches a better than 3/2 approximation unless P = NP (not shown here)
- the greedy first-fit approach already yields an approximation algorithm with a ρ-ratio of 2

First-Fit Approach

First-Fit algorithm: without sorting the items, put each item into the first bin where it fits; if it does not fit anywhere, open a new bin.

Example (bin size 1.0): the items 0.5, 0.3, 0.4, 0.8, 0.2, 0.2, 0.2 are packed, in this order, into three bins: {0.5, 0.3, 0.2}, {0.4, 0.2, 0.2}, and {0.8}.

Theorem: The First-Fit algorithm is a 2-approximation algorithm.

Proof: Assume First Fit uses m bins. Then at least m−1 bins are more than half full (otherwise, the items of one such bin could be moved into another). Hence the total item size exceeds (m−1)·V/2, so OPT > (m−1)/2, and because m and OPT are integers, m ≤ 2·OPT.
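
To make the procedure concrete, here is a minimal Python sketch of first-fit (my own illustration, not code from the lecture); it uses the slide's example sizes scaled by 10 to integers, so that exact comparisons avoid floating-point rounding issues:

```python
def first_fit(sizes, capacity):
    """First-fit: place each item, in the given order and without sorting,
    into the first bin with enough remaining space; open a new bin if none fits."""
    free = []         # remaining capacity of each open bin
    assignment = []   # bin index chosen for each item
    for size in sizes:
        for i in range(len(free)):
            if size <= free[i]:
                free[i] -= size
                assignment.append(i)
                break
        else:         # no existing bin fits: open a new one
            free.append(capacity - size)
            assignment.append(len(free) - 1)
    return assignment, len(free)

# the slide's example, scaled by 10: items 5,3,4,8,2,2,2 into bins of size 10
# yields 3 bins: {5,3,2}, {4,2,2}, {8}
print(first_fit([5, 3, 4, 8, 2, 2, 2], 10))  # ([0, 0, 1, 2, 0, 1, 1], 3)
```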

An FPTAS for the Knapsack Problem

[Slide shows an illustration of the knapsack problem; image credit: Dake]

An FPTAS for the Knapsack Problem

Similar to this week's exercise, we can design a dynamic programming algorithm in which a subproblem restricts the items to {1, ..., k} and searches for the lightest packing with a prescribed profit P. It runs in O(n²·P_max).

What strange runtime is O(n²·P_max)? Answer: pseudo-polynomial (it would be polynomial if P_max were polynomial in the input size).

Idea behind the FPTAS:
- scale the profits smartly, to p_i' = ⌊p_i·n / (ε·P_max)⌋, to make the maximal profit polynomially bounded
- prove that the dynamic programming approach on the scaled instance computes a solution of profit at least (1 − ε)·OPT (not shown here)
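
A small Python sketch of the resulting FPTAS (an illustration under the assumptions above, not the lecture's code; positive profits and ε > 0 are assumed): it scales the profits by K = ε·P_max/n and then solves the scaled instance exactly with dynamic programming over profit values, which keeps the number of states bounded by roughly n²/ε.

```python
def knapsack_fptas(profits, weights, capacity, eps):
    """FPTAS sketch for the 0/1 knapsack problem. Returns the chosen item
    indices; the solution's (original) profit is at least (1 - eps) * OPT."""
    n = len(profits)
    K = eps * max(profits) / n              # scaling factor
    scaled = [int(p // K) for p in profits]
    # dp maps a reachable scaled profit to (minimal weight, chosen items)
    dp = {0: (0, ())}
    for i in range(n):
        updates = {}
        for prof, (w, items) in dp.items():
            w2 = w + weights[i]
            if w2 <= capacity:              # only keep feasible packings
                p2 = prof + scaled[i]
                if p2 not in dp or w2 < dp[p2][0]:
                    if p2 not in updates or w2 < updates[p2][0]:
                        updates[p2] = (w2, items + (i,))
        dp.update(updates)
    best = max(dp)                          # highest reachable scaled profit
    return list(dp[best][1])

# tiny instance: with eps = 0.1 this returns items [1, 2] (profit 220 = OPT)
print(knapsack_fptas([60, 100, 120], [10, 20, 30], capacity=50, eps=0.1))
```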

(Randomized) Search Heuristics

Motivation for General Search Heuristics

Often, the problem is complicated and not much time is available to develop a problem-specific algorithm. Search heuristics are then a good choice:
- relatively easy to implement
- easy to adapt/change/improve, e.g. when the problem formulation changes in an early product design phase, or when slightly different problems need to be solved over time

Lecture Overview

- Randomized Local Search (RLS)
- Variable Neighborhood Search (VNS)
- Tabu Search (TS)
- Evolutionary Algorithms (EAs)

Neighborhoods

For most (stochastic) search heuristics in discrete domains, we need to define a neighborhood structure: which search points are close to each other?

Example: k-bit-flip / Hamming distance k neighborhood
- search space: bitstrings of length n (Ω = {0,1}^n)
- two search points are neighbors if their Hamming distance is k
- in other words: x and y are neighbors if we can flip exactly k bits in x to obtain y

For example, 0001001101 is a neighbor of
- 0001000101 for k = 1
- 0101000101 for k = 2
- 1101000101 for k = 3
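
As a small illustration (my own, not from the lecture), the following Python generator enumerates the Hamming-distance-k neighborhood of a bitstring:

```python
from itertools import combinations

def k_flip_neighbors(x, k):
    """Yield all bitstrings at Hamming distance exactly k from x."""
    for positions in combinations(range(len(x)), k):
        y = list(x)
        for i in positions:
            y[i] = '1' if y[i] == '0' else '0'  # flip the chosen bits
        yield ''.join(y)

# the slide's example: 0001000101 is a 1-bit-flip neighbor of 0001001101
print('0001000101' in set(k_flip_neighbors('0001001101', 1)))  # True
```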

Randomized Local Search (RLS)

Idea behind (randomized) local search: explore the local neighborhood of the current solution (randomly).

- Pure random search: go to a randomly chosen neighbor (independent of the objective function).
- First-improvement local search / randomized local search (RLS): go to the first (randomly) chosen neighbor which is better.
- Best-improvement strategy: always go to the best neighbor; this is not random anymore and computationally expensive if the neighborhood is large.
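
A minimal RLS sketch in Python for the 1-bit-flip neighborhood on bitstrings (an illustration; maximization is assumed, and neighbors of equal quality are accepted, a common choice for RLS):

```python
import random

def rls(f, n, max_evals=10_000):
    """Randomized local search on {0,1}^n (maximization)."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx = f(x)
    for _ in range(max_evals):
        y = x[:]
        i = random.randrange(n)
        y[i] = 1 - y[i]        # random 1-bit-flip neighbor
        fy = f(y)
        if fy >= fx:           # accept if better or equally good
            x, fx = y, fy
    return x, fx

# e.g. on OneMax (count of ones), RLS quickly reaches the all-ones string
best, value = rls(lambda bits: sum(bits), n=20)
```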

Variable Neighborhood Search (VNS) [Mladenovic and Hansen, 1997]

Main idea: change the neighborhood from time to time.
- local optima are not the same for different neighborhood operators, but they are often close to each other
- the global optimum is a local optimum for all neighborhoods
- VNS is rather a framework than a concrete algorithm, allowing e.g. deterministic and stochastic neighborhood changes
- typically combined with (i) first improvement, (ii) a random order in which the neighbors are visited, and (iii) restarts

N. Mladenovic and P. Hansen (1997). "Variable neighborhood search". Computers and Operations Research 24 (11): 1097-1100.
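
A bare-bones VNS sketch in Python (an illustration of the framework only, assuming maximization and a user-supplied list of neighborhood operators; a full VNS would additionally run a local search after each such "shaking" step):

```python
def vns(f, x0, neighborhoods, max_iters=1_000):
    """Variable neighborhood search sketch (maximization). `neighborhoods`
    is a list of operators, each mapping a solution to a random neighbor."""
    x, fx = x0, f(x0)
    for _ in range(max_iters):
        k = 0
        while k < len(neighborhoods):
            y = neighborhoods[k](x)   # random neighbor under operator k
            fy = f(y)
            if fy > fx:
                x, fx = y, fy
                k = 0                 # improvement: restart with the first neighborhood
            else:
                k += 1                # no luck: move on to the next neighborhood
    return x, fx
```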

Tabu Search

Disadvantages of local searches (with or without varying neighborhoods):
- they get stuck in local optima
- they have problems traversing large plateaus of equal objective function value ("random walk")

Tabu search addresses these issues by
- allowing worsening moves if all neighbors have been explored
- introducing a tabu list of temporarily disallowed moves; these restricted moves are problem-specific and can be specific solutions or non-permitted "search directions" such as "don't include this edge anymore" or "do not flip this specific bit"
- restricting the tabu list in size, so that after a while restricted moves become permitted again
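
A tabu search sketch in Python for bitstrings with 1-bit-flip moves (my own illustration; here the tabu list stores forbidden bit positions, one of the possible kinds of restricted moves mentioned above):

```python
from collections import deque

def tabu_search(f, x0, tabu_size=10, max_iters=1_000):
    """Tabu search sketch on bitstrings (maximization): always move to the
    best neighbor whose bit is not tabu, even if the move is worsening, and
    forbid flipping the same bit again for `tabu_size` iterations."""
    x = x0[:]
    best, fbest = x[:], f(x)
    tabu = deque(maxlen=tabu_size)   # recently flipped bit positions
    for _ in range(max_iters):
        candidates = [i for i in range(len(x)) if i not in tabu]
        if not candidates:
            break

        def neighbor_value(i):
            y = x[:]
            y[i] = 1 - y[i]
            return f(y)

        i = max(candidates, key=neighbor_value)  # best admissible move
        x[i] = 1 - x[i]
        tabu.append(i)               # oldest entry expires automatically
        fx = f(x)
        if fx > fbest:               # remember the best solution seen so far
            best, fbest = x[:], fx
    return best, fbest
```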

Stochastic Optimization Algorithms

One class of (bio-inspired) stochastic optimization algorithms: evolutionary algorithms (EAs), a class of optimization algorithms originally inspired by the idea of biological evolution (Darwin, 1859): selection, mutation, recombination.

Metaphors

Classical optimization ↔ evolutionary computation:
- variables or parameters ↔ variables or chromosomes
- candidate solution / vector of decision variables (design variables, object variables) ↔ individual, offspring, parent
- set of candidate solutions ↔ population
- objective function (loss function, cost function, error function) ↔ fitness function
- iteration ↔ generation

Generic Framework of an EA

[Diagram: the evolutionary cycle]
- initialization of the population, followed by evaluation
- stop? (stopping criteria) → if yes, return the best individual
- mating selection of potential parents ("Darwinism")
- crossover/mutation of the parents (stochastic operators), producing offspring
- evaluation of the offspring
- environmental selection, then back to the stopping test

Important: the representation (search space).

The Historic Roots of EAs

- Genetic Algorithms (GA): J. Holland 1975 and D. Goldberg (USA)
- Evolution Strategies (ES): I. Rechenberg and H.-P. Schwefel, 1965 (Berlin)
- Evolutionary Programming (EP): L. J. Fogel 1966 (USA)
- Genetic Programming (GP): J. Koza 1990 (USA)

Nowadays there is one umbrella term: evolutionary algorithms.

Genotype-Phenotype Mapping

The genotype-phenotype mapping is related to the question of how to come up with a fitness for each individual from its representation; it is the analogue of DNA vs. the actual animal (which then has a fitness).

The fitness of an individual is not always simply f(x); it may
- include constraints,
- include diversity,
- include other aspects,
but what is always needed is a total order on the solutions.

Handling Constraints

Several possible ways to handle constraints, e.g.:
- resampling until a new feasible point is found (often a bad idea)
- penalty function approach: add a constraint violation term (potentially scaled; see also the Lagrangian in the continuous part of the lecture)
- repair approach: after generating a new point, repair it if infeasible (e.g. with a heuristic) to make it feasible again; one can either continue to use the repaired solution in the population or use it only for the evaluation
- multiobjective approach: keep the objective function and the constraint functions separate and try to optimize all of them in parallel
- some more...
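
A minimal sketch of the penalty function approach in Python (an illustration, assuming maximization; all names in the commented usage example are hypothetical placeholders, not names from the lecture):

```python
def penalized(f, violations, weight=1_000.0):
    """Penalty-function sketch: subtract a scaled sum of constraint
    violations from the objective. Each entry of `violations` maps a
    solution to its violation amount (0 when the constraint holds)."""
    def f_pen(x):
        return f(x) - weight * sum(v(x) for v in violations)
    return f_pen

# hypothetical knapsack usage (total_profit, total_weight and capacity are
# placeholders): penalize overweight solutions instead of discarding them
# f_pen = penalized(total_profit,
#                   [lambda x: max(0, total_weight(x) - capacity)])
```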

Examples for Some EA Parts (for Discrete Search Spaces)

Selection

Selection is the major determinant for the trade-off between exploitation and exploration. Selection is either stochastic or deterministic, e.g.:
- fitness-proportional selection (stochastic); disadvantage: it depends on the scaling of f
- tournament selection (stochastic)
- (µ+λ)-selection: the best µ from offspring and parents together; (µ,λ)-selection: the best µ from the offspring only (both deterministic)

Mating selection (selection for variation) is usually stochastic; environmental selection (selection for survival) is often deterministic.
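
Two of these selection schemes as minimal Python sketches (my own illustrations, assuming maximization):

```python
import random

def tournament_select(population, fitness, k=2):
    """Tournament selection: draw k individuals uniformly at random and
    return the fittest. Unlike fitness-proportional selection, the outcome
    is invariant to the scaling of f."""
    return max(random.sample(population, k), key=fitness)

def plus_selection(parents, offspring, fitness, mu):
    """(mu + lambda) environmental selection: deterministically keep the
    best mu individuals from parents and offspring together."""
    return sorted(parents + offspring, key=fitness, reverse=True)[:mu]
```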

Variation Operators

Variation aims at generating new individuals on the basis of the individuals selected for mating. Variation = mutation and recombination/crossover:
- mutation: mut: Ω → Ω (one parent produces one offspring)
- recombination: recomb: Ω^r → Ω^s, where r ≥ 2 and s ≥ 1 (several parents produce one or more offspring)

The choice always depends on the problem and the chosen representation; however, there are some operators that are applicable to a wide range of problems and tailored to standard representations such as vectors, permutations, trees, etc.

Variation Operators: Guidelines

Two desirable properties for mutation operators:
- "exhaustiveness": every solution can be generated from every other with a probability greater than 0
- "locality": small changes to a solution should be more likely than large ones

A desirable property for recombination operators is "in-between-ness": the offspring should combine features of its parents rather than differing more from each parent than the parents differ from each other.

Examples of Mutation Operators on Permutations

- Swap: exchange the entries at two randomly chosen positions
- Scramble: randomly reorder the entries within a randomly chosen segment
- Invert: reverse a randomly chosen segment
- Insert: remove one entry and reinsert it at another position
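
Minimal Python sketches of these four operators (my own illustrations; each returns a mutated copy of a permutation given as a list):

```python
import random

def swap(p):
    """Exchange the entries at two randomly chosen positions."""
    p = p[:]
    i, j = random.sample(range(len(p)), 2)
    p[i], p[j] = p[j], p[i]
    return p

def scramble(p):
    """Randomly shuffle the segment between two random cut points."""
    p = p[:]
    i, j = sorted(random.sample(range(len(p)), 2))
    segment = p[i:j + 1]
    random.shuffle(segment)
    p[i:j + 1] = segment
    return p

def invert(p):
    """Reverse the segment between two random cut points."""
    p = p[:]
    i, j = sorted(random.sample(range(len(p)), 2))
    p[i:j + 1] = reversed(p[i:j + 1])
    return p

def insert(p):
    """Remove one entry and reinsert it at another random position."""
    p = p[:]
    i, j = random.sample(range(len(p)), 2)
    p.insert(j, p.pop(i))
    return p
```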

Examples of Recombination Operators on {0,1}^n

- 1-point crossover: cut both parents at one common position and exchange the tails
- n-point crossover: cut at n common positions and alternate the segments taken from each parent
- uniform crossover: choose each bit independently from one parent or the other
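
Minimal Python sketches of the three operators (my own illustrations; the parents a and b are assumed to be equal-length lists of bits, and for n-point crossover 1 ≤ n < len(a)):

```python
import random

def one_point(a, b):
    """1-point crossover: cut both parents once and exchange the tails."""
    c = random.randrange(1, len(a))
    return a[:c] + b[c:], b[:c] + a[c:]

def n_point(a, b, n):
    """n-point crossover: alternate parent segments between the cut points."""
    cuts = sorted(random.sample(range(1, len(a)), n)) + [len(a)]
    c1, c2, prev, swapped = [], [], 0, False
    for c in cuts:
        c1 += (b if swapped else a)[prev:c]
        c2 += (a if swapped else b)[prev:c]
        swapped, prev = not swapped, c
    return c1, c2

def uniform(a, b):
    """Uniform crossover: take each bit independently from either parent."""
    return [x if random.random() < 0.5 else y for x, y in zip(a, b)]
```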

A Canonical Genetic Algorithm

- search space: all binary strings of length n; maximization
- uniform initialization
- generational cycle of the population:
  - evaluation of the solutions
  - mating selection (e.g. roulette wheel)
  - crossover (e.g. 1-point)
  - environmental selection (e.g. plus-selection)
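
Putting the pieces together, here is a compact Python sketch of such a canonical GA (an illustration, not the lecture's code; bit-flip mutation with the standard rate 1/n is added to the cycle, and f is assumed to be non-negative, as roulette-wheel selection requires):

```python
import random

def canonical_ga(f, n, pop_size=50, pc=0.7, pm=None, generations=100):
    """Canonical GA sketch on {0,1}^n (maximization): roulette-wheel mating
    selection, 1-point crossover, bit-flip mutation, plus-selection."""
    pm = pm if pm is not None else 1.0 / n          # standard mutation rate
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        fits = [f(x) for x in pop]
        total = sum(fits)

        def roulette():
            """Fitness-proportional choice of one parent."""
            r, acc = random.uniform(0, total), 0.0
            for x, fx in zip(pop, fits):
                acc += fx
                if acc >= r:
                    return x
            return pop[-1]                          # numerical fallback

        offspring = []
        while len(offspring) < pop_size:
            a, b = roulette(), roulette()
            if random.random() < pc:                # 1-point crossover
                c = random.randrange(1, n)
                a, b = a[:c] + b[c:], b[:c] + a[c:]
            for child in (a, b):                    # bit-flip mutation
                offspring.append(
                    [1 - g if random.random() < pm else g for g in child])
        # plus-selection: keep the best pop_size of parents and offspring
        pop = sorted(pop + offspring, key=f, reverse=True)[:pop_size]
    return max(pop, key=f)

# e.g. OneMax: the GA quickly approaches the all-ones string
best = canonical_ga(lambda bits: sum(bits), n=30)
```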

Conclusions

EAs are generic algorithms (randomized search heuristics, meta-heuristics, ...) for black-box optimization: they make no, or almost no, assumptions on the objective function. They are typically less efficient than problem-specific (exact) algorithms in terms of the number of function evaluations (this is not the case in the continuous domain, as we will see later). On the other hand, they allow for an easy and rapid implementation, and therefore for finding good solutions fast, and it is easy to incorporate problem-specific knowledge to improve the algorithm.

Potential Research Topics for Master's/PhD Theses

http://randopt.gforge.inria.fr/thesisprojects/

Potential Research Topics for Master's/PhD Theses (continued)

More projects without the involvement of companies:
- stopping criteria in multiobjective optimization
- large-scale variants of CMA-ES
- algorithms for expensive optimization based on CMA-ES
All of the above are relatively flexible between theoretical and practical projects.

COCO-related:
- implementing and benchmarking algorithms for expensive optimization
- data mining of performance results

Conclusions

I hope it became clear...
...that approximation algorithms are often the best we can hope for in practice (though it might be difficult to prove the guarantees),
...and that heuristics are what we can typically afford in practice (no guarantees and no proofs).