Automatic Algorithm Configuration based on Local Search


Frank Hutter (1), Holger Hoos (1), Thomas Stützle (2)
(1) Department of Computer Science, University of British Columbia, Canada
(2) IRIDIA, Université Libre de Bruxelles, Belgium

Motivation for automatic algorithm configuration

- Want to design the best algorithm to solve a problem; many design choices need to be made
- Some choices are deferred to later: free parameters of the algorithm; set parameters to maximise empirical performance
- Finding the best parameter configuration is non-trivial:
  - many parameters, discrete and continuous
  - dependencies between parameters
  - many test instances needed to generalise
  - many runs per instance needed for randomised algorithms
- Algorithm configuration / tuning is still often done manually, using ad-hoc methods: tedious and time-consuming, with sub-optimal results; a big incentive for automation

Real-world example

- Application: solving SAT-encoded software verification problems
- Tune 26 parameters of a new DPLL-type SAT solver (Spear): 7 categorical, 3 boolean, 12 continuous, 4 integer
  - variable/value heuristics, clause learning, restarts, ...
- Objective: minimise expected run-time
- Problems: default settings: ~300 seconds / run; good performance on a few instances may not generalise

Standard algorithm configuration approach

- Choose a representative benchmark set for tuning
- Perform iterative manual tuning:
    start with some parameter configuration
    repeat
        modify a single parameter
        if results on the tuning set improve, then keep the new configuration
    until no more improvement is possible (or the result is good enough)
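The manual procedure above is plain first-improvement hill climbing over configurations. A minimal sketch, where `evaluate` (tuning-set cost) and `neighbours` (configurations reachable by changing a single parameter) are hypothetical stand-ins for whatever the practitioner does by hand:

```python
def manual_style_tuning(config, evaluate, neighbours):
    """One-parameter-at-a-time iterative improvement, as described above.
    `evaluate` and `neighbours` are hypothetical stand-ins: the cost of a
    configuration on the tuning set, and the configurations obtained by
    changing a single parameter."""
    best_cost = evaluate(config)
    improved = True
    while improved:
        improved = False
        for cand in neighbours(config):   # each candidate differs in one parameter
            cost = evaluate(cand)
            if cost < best_cost:          # keep only improving changes
                config, best_cost = cand, cost
                improved = True
                break                     # first improvement: restart the scan
    return config, best_cost
```

As the next slide notes, this stops at the first local optimum of the one-exchange neighbourhood.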

Problems:

- the cost of evaluating a configuration depends on the number and hardness of the problem instances
- constraints on tuning time (per iteration and overall): typically few and fairly easy instances, few iterations
- manual search = iterative improvement (hill climbing): finds a local optimum only
- slow and tedious, requires significant human time; the procedure is often performed in an ad-hoc way

Solution: automate the process and use a more powerful search method

Related work

- Search approaches: [Minton 1993, 1996], [Hutter 2004], [Cavazos & O'Boyle 2005], [Adenso-Díaz & Laguna 2006], [Audet & Orban 2006]
- Racing algorithms / bandit solvers: [Birattari et al. 2002], [Smith et al. 2004, 2006]
- Stochastic optimisation: [Kiefer & Wolfowitz 1952], [Spall 1987]
- Learning approaches: regression trees [Bartz-Beielstein et al. 2004]; response surface models, DACE [Bartz-Beielstein et al. 2004-2006]
- Lots of work on per-instance / reactive tuning, orthogonal to the approach followed here

Outline

1. Introduction
2. Iterated local search over parameter configurations
3. The FocusedILS algorithm
4. Sample applications and performance results
5. Conclusions and future work

ILS in parameter configuration space (ParamILS):

- choose initial parameter configuration θ
- perform subsidiary local search on θ
- while tuning time is left:
    θ' := θ
    perform perturbation on θ
    perform subsidiary local search on θ
    based on acceptance criterion, keep θ or revert to θ := θ'
    with probability p_restart, randomly pick a new θ
- performs a biased random walk over local minima
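The outer loop above can be sketched as follows. This is only an illustration of the iterated-local-search skeleton, not the actual ParamILS implementation: `local_search`, `perturb`, `evaluate`, and `random_config` are hypothetical stand-ins, and an iteration count replaces the CPU-time budget that ParamILS actually uses.

```python
import random

def param_ils(initial, local_search, perturb, evaluate, budget_iters,
              p_restart=0.01, random_config=None):
    """Sketch of the ParamILS outer loop: iterated local search over
    parameter configurations. All callables are hypothetical stand-ins."""
    incumbent = local_search(initial)          # subsidiary local search
    best = incumbent
    for _ in range(budget_iters):
        candidate = local_search(perturb(incumbent))
        if evaluate(candidate) <= evaluate(incumbent):   # acceptance criterion:
            incumbent = candidate                        # keep the better one
        if evaluate(incumbent) < evaluate(best):
            best = incumbent                   # track best configuration seen
        if random_config is not None and random.random() < p_restart:
            incumbent = local_search(random_config())    # random restart
    return best
```

Because every incumbent is the output of `local_search`, the walk moves from local optimum to local optimum, biased towards better ones.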

Details on ParamILS:

- subsidiary local search: iterative first improvement; change one parameter in each step
- perturbation: change 3 randomly chosen parameters
- acceptance criterion: always select the better configuration
- initialisation: pick the best of the default and R random configurations

Evaluation of a parameter configuration θ (based on N runs):

- sample N instances from the given set (with repetition)
- for each of the N instances, execute the algorithm with configuration θ
- record the scalar cost of each of the N runs: sc_1, ..., sc_N (run-time, solution quality, ...) → empirical cost distribution ĈD
- compute a scalar statistic ĉ_N(θ) of ĈD (mean, median, ...)

Note: for large N, ĉ_N(θ) approaches the true cost c(θ)
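The evaluation procedure above is straightforward to sketch. Here `run_algorithm` is a hypothetical stand-in for one run of the target algorithm, returning a scalar cost:

```python
import random
import statistics

def evaluate_config(theta, run_algorithm, instances, n_runs,
                    statistic=statistics.mean, seed=0):
    """Empirical cost estimate c_hat_N(theta) as described above.
    `run_algorithm(theta, instance, run_seed)` is a hypothetical stand-in
    returning a scalar cost (run-time, solution quality, ...)."""
    rng = random.Random(seed)
    # sample N instances from the given set, with repetition
    sampled = [rng.choice(instances) for _ in range(n_runs)]
    # record the scalar cost of each of the N runs
    costs = [run_algorithm(theta, inst, rng.random()) for inst in sampled]
    # scalar statistic of the empirical cost distribution (mean, median, ...)
    return statistic(costs)
```

Passing `statistics.median` instead of the default mean gives the alternative statistic mentioned on the slide; by the law of large numbers the estimate approaches the true cost c(θ) as N grows.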

[Figure: solution quality over time achieved by ParamILS. BasicILS(100) performance on the training set: runlength (median, 10% and 90% quantiles) vs CPU time in seconds.]

Question: how many runs/instances?

- too many: evaluating a configuration is very expensive; the optimisation process is very slow
- too few: very noisy approximations ĉ_N(θ); poor generalisation to independent test runs

[Figure: generalisation to independent test runs (1). BasicILS(100) performance on training and test sets: runlength (median, 10% and 90% quantiles) vs CPU time in seconds.]

[Figure: generalisation to independent test runs (2). BasicILS(1) performance on training and test sets: runlength (median, 10% and 90% quantiles) vs CPU time in seconds.]

The FocusedILS algorithm

Given a budget of available CPU time for tuning, use different numbers of runs, N(θ), for each configuration θ.

Idea: use high N(θ) only for good θ
- start with N(θ) = 0 for all θ
- increment N(θ) whenever θ is visited
- perform additional runs upon finding a new, better configuration θ
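The per-configuration bookkeeping above can be sketched as follows. This is a simplified illustration of the idea, not the exact FocusedILS domination criterion; `do_run` is a hypothetical stand-in returning one scalar run cost.

```python
from collections import defaultdict

class FocusedEvaluator:
    """Sketch of the FocusedILS idea: each configuration theta keeps its own
    run count N(theta), incremented when theta is visited, and a comparison
    is trusted only once the preferred configuration has at least as many
    runs as the other. `do_run(theta, i)` is a hypothetical stand-in
    returning the scalar cost of run number i for theta."""
    def __init__(self, do_run):
        self.do_run = do_run
        self.costs = defaultdict(list)    # theta -> recorded run costs

    def n(self, theta):
        return len(self.costs[theta])     # N(theta)

    def add_run(self, theta):             # increment N(theta) on a visit
        self.costs[theta].append(self.do_run(theta, self.n(theta)))

    def mean(self, theta):
        return sum(self.costs[theta]) / max(1, self.n(theta))

    def better(self, a, b):
        """Prefer a over b only with N(a) >= N(b) and lower empirical cost."""
        while self.n(a) < self.n(b):      # match run counts before comparing
            self.add_run(a)
        return self.mean(a) <= self.mean(b)
```

Good configurations are compared often and so accumulate many runs, while poor ones are discarded cheaply after a few; this is what makes the comparisons increasingly precise exactly where precision matters.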

The FocusedILS algorithm

Theorem: as the number of FocusedILS iterations goes to infinity, FocusedILS converges to the true optimal configuration θ*.

Key ideas in the proof:
1. As N(θ), N(θ') → ∞, comparisons between θ and θ' become precise.
2. The underlying ILS eventually reaches any configuration θ.

[Figure: performance of FocusedILS vs BasicILS (test performance on SAPS-QWH). Median runlength of SAPS in steps vs CPU time for ParamILS in seconds; curves for FocusedILS, BasicILS(1), BasicILS(10), and BasicILS(100).]

Sample applications and performance results

Experiment: comparison to CALIBRA

Scenario    Default        CALIBRA(100)       BasicILS(100)    FocusedILS
GLS-GRID    ε = 1.81       1.234 ± 0.492      0.951 ± 0.004    0.949 ± 0.0001
SAPS-QWH    85.5K steps    10.7K ± 1.1K       10.9K ± 0.6K     10.6K ± 0.5K
SAPS-SW     5.60 s         0.053 ± 0.010      0.046 ± 0.01     0.043 ± 0.005
SAT4J-SW    7.02 s         (too many param.)  1.19 ± 0.58      0.65 ± 0.2

[Figure: speedup obtained by automated tuning. Scatter plot of SAPS run-time in seconds on test set SW-GCP: automatically tuned parameters vs default parameters.]

Two real-world applications

- New DPLL-type SAT solver Spear (26 parameters): 500-fold speedup on software verification, 4.5-fold speedup on hardware verification; new state of the art for those instances. [Hutter, Babić, Hoos & Hu: FMCAD'07, to appear]
- New replica exchange Monte Carlo algorithm for protein structure prediction (3 parameters): 2-fold improvement; new state of the art for 2D/3D protein structure prediction. [Thachuk, Shmygelska & Hoos, under review]

Conclusions

- ParamILS: a simple and efficient framework for automatic parameter optimisation
- FocusedILS: provably converges towards the optimal configuration, with no over-confidence; excellent performance in practice (outperforms BasicILS and CALIBRA)
- Huge speedups: 100-fold for SAPS (local search) on graph colouring; 500-fold for Spear (tree search) on software verification
- Publicly available at: http://www.cs.ubc.ca/labs/beta/projects/paramils

Future work

- Continuous parameters (currently discretised)
- Statistical tests (cf. racing algorithms)
- Per-instance tuning
- Automatic algorithm design