Automatic Algorithm Configuration
Thomas Stützle
IRIDIA, CoDE, Université Libre de Bruxelles, Brussels, Belgium
iridia.ulb.ac.be/~stuetzle
TPNC 2017, Prague, Czech Republic

Outline
1. Context
2. Automatic algorithm configuration
3. Applications
4. Concluding remarks
The algorithmic solution of hard optimization problems is one of the CS/OR success stories!

Exact (systematic search) algorithms
- branch & bound, branch & cut, constraint programming, ...
- powerful general-purpose software available
- guarantees on optimality, but often time/memory consuming

Approximate algorithms
- heuristics, local search, metaheuristics, hyperheuristics, ...
- typically special-purpose software
- rarely provable guarantees, but often fast and accurate

Much active research on hybrids between exact and approximate algorithms!

Design choices and parameters everywhere

Today's high-performance optimizers involve a large number of design choices and parameter settings.

Exact solvers
- design choices include alternative models, pre-processing, variable selection, value selection, branching rules, ...
- many design choices have associated numerical parameters
- example: SCIP (the fastest non-commercial MIP solver) has more than 200 relevant parameters that influence its search mechanism

Approximate algorithms
- design choices include solution representation, operators, neighborhoods, pre-processing, strategies, ...
- many design choices have associated numerical parameters
- example: multi-objective ACO algorithms with 22 parameters (plus several still hidden ones)
LS design choices and numerical parameters

Choice of constructive procedures
- random vs. greedy construction
- static vs. adaptive construction
- how many initial solutions?

Perturbation
- type of perturbation
- fixed vs. variable size of destruction
- random vs. biased perturbation

Acceptance criterion
- strength of bias towards best solutions
- use of history information and, if yes, how

Local search: ... many choices ...

Numerical parameters
- perturbation size
- parameters associated with the type of perturbation
- parameters related to the acceptance criterion

Parameter types

Categorical parameters (design)
- choice of constructive procedure, choice of recombination operator, choice of branching strategy, ...

Ordinal parameters (design)
- neighborhoods, lower bounds, ...

Numerical parameters (tuning, calibration)
- integer or real-valued parameters
- weighting factors, population sizes, temperature, hidden constants, ...
- numerical parameters may be conditional on specific values of categorical or ordinal parameters

Design and configuration of algorithms involves setting categorical, ordinal, and numerical parameters.
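A mixed configuration space of this kind can be written down concretely. The following sketch is purely illustrative (the parameter names and the dictionary layout are assumptions, not the format of any particular tool); it shows categorical, ordinal, and numerical parameters, with one numerical parameter conditional on a categorical choice:

```python
# Illustrative description of a mixed configuration space:
# categorical, ordinal, and numerical parameters, with one
# numerical parameter conditional on a categorical choice.
space = {
    "construction": {"type": "categorical",
                     "values": ["random", "greedy", "adaptive"]},
    "neighborhood": {"type": "ordinal",
                     "values": ["1-exchange", "2-exchange", "3-exchange"]},
    "perturb_size": {"type": "integer", "range": (1, 50)},
    "acceptance":   {"type": "categorical",
                     "values": ["better", "metropolis", "always"]},
    "accept_temp":  {"type": "real", "range": (0.0, 5.0),
                     # only meaningful under the Metropolis criterion
                     "condition": ("acceptance", "metropolis")},
}

def is_active(param, config):
    """A parameter is active unless it has a condition referring to a
    categorical parameter that is set to a different value."""
    cond = space[param].get("condition")
    return cond is None or config.get(cond[0]) == cond[1]
```

A configurator only needs to sample and evaluate the active parameters of each candidate configuration; inactive ones can be ignored.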
Designing optimization algorithms

Challenges
- many alternative design choices
- nonlinear interactions among algorithm components and/or parameters
- performance assessment is difficult

Traditional design approach
- trial and error, design guided by expertise/intuition
- prone to over-generalizations, implicit independence assumptions, and limited exploration of design alternatives

Can we make this approach more principled and automatic?

Towards automatic algorithm configuration

Automated algorithm configuration
- apply powerful search techniques to design algorithms
- use computation power to explore design spaces
- assist the algorithm designer in the design process
- free human creativity for higher-level tasks
Offline configuration and online parameter control

Offline configuration
- configure the algorithm before deploying it
- configuration on training instances
- related to algorithm design

Online parameter control
- adapt parameter settings while solving an instance
- typically limited to a set of known crucial algorithm parameters
- related to parameter calibration

Offline configuration techniques can be helpful to configure (online) parameter control strategies.

Offline configuration

Typical performance measures
- maximize solution quality (within a given computation time)
- minimize computation time (to reach an optimal solution)
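These performance measures make offline configuration an optimization problem in its own right. In the standard formulation from the configuration literature (not spelled out on the slide), one seeks a configuration

```latex
\theta^{*} \in \operatorname*{arg\,min}_{\theta \in \Theta}\;
  \mathbb{E}_{i \sim \mathcal{I}}\left[\, c(\theta, i) \,\right]
```

where \(\Theta\) is the configuration space, \(\mathcal{I}\) the distribution of (training) instances, and \(c(\theta, i)\) the cost of running configuration \(\theta\) on instance \(i\), e.g. the solution quality reached within the time budget or the run time to reach an optimal solution; in practice the expectation is estimated on a finite set of training instances.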
Configurators

Approaches to configuration
- experimental design techniques: e.g. CALIBRA [Adenso-Díaz, Laguna, 2006], [Ridge & Kudenko, 2007], [Coy et al., 2001], [Ruiz, Stützle, 2005]
- numerical optimization techniques: e.g. MADS [Audet & Orban, 2006], various [Yuan et al., 2012]
- heuristic search methods: e.g. meta-GA [Grefenstette, 1985], ParamILS [Hutter et al., 2007, 2009], gender-based GA [Ansótegui et al., 2009], linear GP [Oltean, 2005], REVAC(++) [Eiben et al., 2007, 2009, 2010], ...
- model-based optimization approaches: e.g. SPO [Bartz-Beielstein et al., 2005, 2006, ...], SMAC [Hutter et al., 2011, ...], GGA++ [Ansótegui et al., 2015]
- sequential statistical testing: e.g. F-Race, iterated F-Race [Birattari et al., 2002, 2007, ...]

General, domain-independent methods are required: (i) applicable to all variable types, (ii) multiple training instances, (iii) high performance, (iv) scalable.
The racing approach
- start with a set of initial candidates
- consider a stream of instances
- sequentially evaluate candidates
- discard inferior candidates as sufficient evidence is gathered against them
- ... repeat until a winner is selected or until the computation time expires
The F-Race algorithm

Statistical testing
1. family-wise test for differences among configurations: Friedman two-way analysis of variance by ranks
2. if the Friedman test rejects H0, perform pairwise comparisons to the best configuration: apply the Friedman post-test

Iterated race

Racing is a method for selecting the best configuration; it is independent of how the set of configurations is sampled.

Iterated race:
  sample configurations from the initial distribution
  while not terminate():
    apply race
    modify the sampling distribution
    sample configurations
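One elimination step of this two-stage scheme can be sketched in code. This is a simplified illustration, not the actual F-Race implementation: SciPy's Friedman test serves as the family-wise test, and a crude mean-rank threshold stands in for the Friedman post-test (the real post-test uses a properly derived critical value):

```python
import numpy as np
from scipy.stats import friedmanchisquare, rankdata

def frace_step(results, alpha=0.05):
    """One elimination step of a simplified F-Race.

    results: (n_instances, n_candidates) matrix of costs gathered so far.
    Returns the indices of the surviving candidates.

    The Friedman test checks for any difference among candidates; if it
    rejects, candidates are eliminated by a simplified mean-rank
    criterion (a stand-in for the Friedman post-test)."""
    n_inst, n_cand = results.shape
    if n_cand < 3 or n_inst < 2:       # Friedman needs >= 3 groups, > 1 block
        return list(range(n_cand))
    _, p = friedmanchisquare(*results.T)
    if p >= alpha:                     # no evidence of differences yet
        return list(range(n_cand))
    ranks = np.vstack([rankdata(row) for row in results])
    mean_ranks = ranks.mean(axis=0)
    best = mean_ranks.min()
    # keep candidates whose mean rank is within one rank of the best
    # (illustrative threshold, not the real critical value)
    return [j for j in range(n_cand) if mean_ranks[j] <= best + 1.0]
```

Calling `frace_step` after each new instance, and running only the survivors on the next instance, yields the race loop described above.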
Iterated race: sampling

(figure: sampling distributions over the parameter range)

Iterated racing: sampling distributions

Numerical parameter X_d ∈ [x_d_min, x_d_max] ⇒ truncated normal distribution N(mu_d^z, sigma_d^i), restricted to [x_d_min, x_d_max]
- mu_d^z = value of parameter d in elite configuration z
- sigma_d^i decreases with the number of iterations

Categorical parameter X_d ∈ {x_1, x_2, ..., x_{n_d}} ⇒ discrete probability distribution Pr^z{X_d = x_j}
- updated by increasing the probability of the parameter value in the elite configuration
- other probabilities are reduced
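Both sampling rules are short to implement. A minimal sketch, assuming simple rejection sampling for the truncation and a fixed learning rate for the categorical update (both are illustrative choices, not the exact irace update rules):

```python
import random

def sample_numerical(mu, sigma, lo, hi):
    """Sample from a normal centred on the elite configuration's value,
    truncated to [lo, hi] by rejection sampling."""
    while True:
        x = random.gauss(mu, sigma)
        if lo <= x <= hi:
            return x

def update_categorical(probs, elite_value, rate=0.5):
    """Shift probability mass towards the value used by the elite
    configuration; all other values are scaled down accordingly."""
    new = {v: (1 - rate) * p for v, p in probs.items()}
    new[elite_value] += rate
    return new
```

Shrinking `sigma` (and raising `rate`) over iterations makes the sampling converge around the elite configurations, as described above.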
The irace package

Manuel López-Ibáñez, Jérémie Dubois-Lacoste, Thomas Stützle, and Mauro Birattari. The irace package: Iterated Race for Automatic Algorithm Configuration. Technical Report TR/IRIDIA/…, IRIDIA, Université Libre de Bruxelles, Belgium, 2011; extended version in Operations Research Perspectives. The irace Package: User Guide, 2016, Technical Report TR/IRIDIA/…

- implementation of Iterated Racing in R
- Goal 1: flexible; Goal 2: easy to use (no knowledge of R necessary)
- parallel evaluation (MPI, multi-core, grid engine, ...)
- capping for run-time optimization
- irace has been shown to be effective for configuration tasks with several hundred variables

The irace package: usage
- inputs: training instances, parameter space, configuration scenario
- irace calls targetRunner with a configuration θ and an instance i; targetRunner returns the cost c(θ, i) to irace
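The targetRunner is just a small user-provided script. A hedged sketch in Python: the argument order follows the irace user guide (configuration id, instance id, seed, instance, then the parameters), but check it against your irace version, and note that `my_solver` and its output format are hypothetical:

```python
#!/usr/bin/env python3
"""Skeleton of an irace targetRunner.

irace invokes this script once per (configuration, instance) pair and
reads the cost c(theta, i) from standard output.  The argument order
below follows the irace user guide: configuration id, instance id,
seed, instance, then the parameters; verify it for your irace version.
'my_solver' is a hypothetical solver command."""
import subprocess
import sys

def parse_args(argv):
    """Split the irace command line into fixed fields and parameters."""
    conf_id, instance_id, seed, instance = argv[1:5]
    params = argv[5:]            # e.g. ['--alpha', '0.7', '--beta', '2']
    return conf_id, instance_id, seed, instance, params

def run(argv):
    _, _, seed, instance, params = parse_args(argv)
    cmd = ["my_solver", "--seed", seed, "--instance", instance] + params
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    # The solver is assumed to print the best objective value last.
    cost = float(out.stdout.split()[-1])
    print(cost)                  # irace reads this single number

if __name__ == "__main__" and len(sys.argv) >= 5:
    run(sys.argv)
```

The configuration scenario then points irace at this script; everything problem-specific (how to call the solver, how to extract the cost) stays inside it.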
Other tools: ParamILS, SMAC

ParamILS
- iterated local search in configuration space
- requires discretization of numerical parameters

SMAC
- surrogate-model-assisted search process
- encouraging results for large configuration spaces

Capping: an effective speed-up technique when the configuration target is run time.

Applications of automatic configuration tools
- configuration of black-box solvers, e.g. mixed-integer programming solvers, continuous optimizers
- supporting tool in algorithm engineering, e.g. metaheuristics for the probabilistic TSP, re-engineering PSO
- bottom-up generation of heuristic algorithms, e.g. heuristics for SAT, FSP, etc.; metaheuristic framework design
- configurable algorithm frameworks, e.g. SATenstein, MOACO, UACOR
Example: configuration of black-box solvers

Mixed-integer programming (MIP) solvers
[Hutter, Hoos, Leyton-Brown, Stützle, 2009; Hutter, Hoos, Leyton-Brown, 2010]
- MIP modelling is widely used for tackling optimization problems
- powerful commercial (e.g. CPLEX) and non-commercial (e.g. SCIP) solvers available
- large number of parameters (tens to hundreds)

Benchmark set: default, configured, speedup
- Regions: (11.4 ± 0.9), 6.8
- Conic.SCH: (2.4 ± 0.29), 2.51
- CLS: (327 ± 860)
- MK: (301 ± 948)
- QP: (827 ± 306), 1.85

FocusedILS, 10 runs, 2 CPU days, 63 parameters
Example: bottom-up generation of algorithms

Automatic design of hybrid SLS algorithms

Approach
Main approaches

Top-down approaches
- develop a flexible framework following a fixed algorithm template with alternatives
- apply high-performing configurators
- examples: SATenstein, MOACO, MOEA, MIP solvers?(!)

Bottom-up approaches
- flexible framework implementing algorithm components
- define rules for composing algorithms from components, e.g. through grammars
- frequent use of genetic programming, grammatical evolution, etc.

Automatic design of hybrid SLS algorithms
[Marmion, Mascia, López-Ibáñez, Stützle, 2013]

Approach
- decompose single-point SLS methods into components
- derive a generalized metaheuristic structure
- component-wise implementation of the metaheuristic part

Implementation
- represent possible algorithm compositions by a grammar
- instantiate the grammar using a parametric representation
- allows use of standard automatic configuration tools
- shows good performance when compared to, e.g., grammatical evolution [Mascia, López-Ibáñez, Dubois-Lacoste, Stützle, 2013]
General local search structure: ILS

  s0 := initSolution
  s* := ls(s0)
  repeat
    s' := perturb(s*, history)
    s' := ls(s')
    s* := accept(s*, s', history)
  until termination criterion met

- many SLS methods are instantiable from this structure
- capabilities: hybridization, recursion
- problem-specific implementation at the low level

Grammar

<algorithm> ::= <initialization> <ils>
<initialization> ::= random | <pbs_initialization>
<ils> ::= ILS(<perturb>, <ls>, <accept>, <stop>)
<perturb> ::= none | <initialization> | <pbs_perturb>
<ls> ::= <ils> | <descent> | <sa> | <rii> | <pii> | <vns> | <ig> | <pbs_ls>
<accept> ::= alwaysAccept | improvingAccept(<comparator>) | prob(<value_prob_accept>) | probRandom | <metropolis> | threshold(<value_threshold_accept>) | <pbs_accept>
<descent> ::= bestDescent(<comparator>, <stop>) | firstImprDescent(<comparator>, <stop>)
<sa> ::= ILS(<pbs_move>, no_ls, <metropolis>, <stop>)
<rii> ::= ILS(<pbs_move>, no_ls, probRandom, <stop>)
<pii> ::= ILS(<pbs_move>, no_ls, prob(<value_prob_accept>), <stop>)
<vns> ::= ILS(<pbs_variable_move>, firstImprDescent(improvingStrictly), improvingAccept(improvingStrictly), <stop>)
<ig> ::= ILS(<deconst-construct_perturb>, <ls>, <accept>, <stop>)
<comparator> ::= improvingStrictly | improving
<value_prob_accept> ::= [0, 1]
<value_threshold_accept> ::= [0, 1]
<metropolis> ::= metropolisAccept(<init_temperature>, <final_temperature>, <decreasing_temperature_ratio>, <span>)
<init_temperature> ::= {1, 2, ..., 10000}
<final_temperature> ::= {1, 2, ..., 100}
<decreasing_temperature_ratio> ::= [0, 1]
<span> ::= {1, 2, ..., 10000}
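The ILS template above maps directly onto code. A minimal sketch: the components (initSolution, ls, perturb, accept, termination) become function-valued parameters, and the toy instantiation below (a 1-D quadratic with a crude step-wise descent) is purely illustrative:

```python
import random

def iterated_local_search(init, local_search, perturb, accept, terminate):
    """Generic ILS skeleton mirroring the slide's pseudocode; different
    choices of the component functions yield different SLS methods."""
    history = []
    s = local_search(init())
    while not terminate(history):
        s_new = local_search(perturb(s, history))
        s = accept(s, s_new, history)
        history.append(s)
    return s

# Toy instantiation: minimise f(x) = (x - 3)^2 over the reals.
f = lambda x: (x - 3.0) ** 2
init = lambda: random.uniform(-10, 10)

def local_search(x):
    """Crude greedy descent with shrinking step sizes."""
    for step in (1.0, 0.1, 0.01):
        while f(x - step) < f(x): x -= step
        while f(x + step) < f(x): x += step
    return x

perturb = lambda s, h: s + random.uniform(-2, 2)          # random kick
accept = lambda s, s_new, h: s_new if f(s_new) < f(s) else s
terminate = lambda h: len(h) >= 20                         # 20 iterations

best = iterated_local_search(init, local_search, perturb, accept, terminate)
```

Swapping the `accept` function for, say, a Metropolis criterion turns the same skeleton into a simulated-annealing-like method, which is exactly the compositionality the grammar exploits.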
Flow-shop problem with makespan objective

Automatic configuration:
- max. three levels of recursion
- biased / unbiased grammar, resulting in 262 and 502 parameters, respectively
- budget: trials of n · m · 0.03 seconds

(figure: ARPD with 95% confidence limits for Grs, Gtb, irace1, irace2, irace3, irace4)

Results are clearly superior to the state of the art.

Flow-shop problem with total tardiness objective

Automatic configuration:
- max. three levels of recursion
- biased / unbiased grammar, resulting in 262 and 502 parameters, respectively
- budget: trials of n · m · 0.03 seconds

(figure: relative deviation index for TSM63, TSMe63, irace4)

Results are clearly superior to the state of the art.
Conclusions

Results
- design and analysis of (hybrid) metaheuristics can be automated
- not a silver bullet: it needs the right components, especially low-level problem-specific ones
- better or equal performance compared to the state of the art for PFSP-WT, UBQP, TSP-TW
- directly extendable for unbiased comparisons of metaheuristics

Future work
- extensions to other methods and templates
- dealing with the complexity of hybrid algorithms
- increasing generality, tackling wide problem classes

Why automatic algorithm configuration?
- improvement over manual, ad-hoc methods for tuning
- reduction of development time and human intervention
- increases the number of degrees of freedom that can be considered
- empirical studies, comparisons of algorithms
- support for end users of algorithms
- ... and it has become feasible due to the increase in computational power!
Configuring configurators

What about automatically configuring the configurator? ... and configuring the configurator of the configurator?
- it can be done (for an example, see [Hutter et al., 2009]), but ...
- it is costly, and iterating further leads to diminishing returns

Towards a shift of paradigm in algorithm design
Conclusions

Status
- using automatic configuration tools is rewarding in terms of development time and algorithm performance
- interactive usage of configurators allows humans to focus on the creative part of algorithm design
- many application opportunities, also in areas other than optimization

Future work
- more powerful configurators
- more, and more complex, applications
- best practice
More informationMO-ParamILS: A Multi-objective Automatic Algorithm Configuration Framework
MO-ParamILS: A Multi-objective Automatic Algorithm Configuration Framework Aymeric Blot, Holger Hoos, Laetitia Jourdan, Marie-Éléonore Marmion, Heike Trautmann To cite this version: Aymeric Blot, Holger
More informationAutomatic Configuration of MIP Solver: Case Study in Vertical Flight Planning
Université Libre de Bruxelles Institut de Recherches Interdisciplinaires et de Développements en Intelligence Artificielle Automatic Configuration of MIP Solver: Case Study in Vertical Flight Planning
More informationAlgorithms for Integer Programming
Algorithms for Integer Programming Laura Galli November 9, 2016 Unlike linear programming problems, integer programming problems are very difficult to solve. In fact, no efficient general algorithm is
More informationA Steady-State Genetic Algorithm for Traveling Salesman Problem with Pickup and Delivery
A Steady-State Genetic Algorithm for Traveling Salesman Problem with Pickup and Delivery Monika Sharma 1, Deepak Sharma 2 1 Research Scholar Department of Computer Science and Engineering, NNSS SGI Samalkha,
More informationThe Impact of Initial Designs on the Performance of MATSuMoTo on the Noiseless BBOB-2015 Testbed: A Preliminary Study
The Impact of Initial Designs on the Performance of MATSuMoTo on the Noiseless BBOB-2015 Testbed: A Preliminary Study Dimo Brockhoff INRIA Lille Nord Europe Bernd Bischl LMU München Tobias Wagner TU Dortmund
More informationUniversité Libre de Bruxelles
Université Libre de Bruxelles Institut de Recherches Interdisciplinaires et de Développements en Intelligence Artificielle Exposing a Bias Toward Short-Length Numbers in Grammatical Evolution Marco A.
More informationCombining Meta-EAs and Racing for Difficult EA Parameter Tuning Tasks
Combining Meta-EAs and Racing for Difficult EA Parameter Tuning Tasks Bo Yuan and Marcus Gallagher School of Information Technology and Electrical Engineering University of Queensland, Qld. 4072, Australia
More informationA Study of Neighborhood Structures for the Multiple Depot Vehicle Scheduling Problem
A Study of Neighborhood Structures for the Multiple Depot Vehicle Scheduling Problem Benoît Laurent 1,2 and Jin-Kao Hao 2 1 Perinfo SA, Strasbourg, France 2 LERIA, Université d Angers, Angers, France blaurent@perinfo.com,
More informationCHAPTER 4 GENETIC ALGORITHM
69 CHAPTER 4 GENETIC ALGORITHM 4.1 INTRODUCTION Genetic Algorithms (GAs) were first proposed by John Holland (Holland 1975) whose ideas were applied and expanded on by Goldberg (Goldberg 1989). GAs is
More informationProgramming by Optimisation:
Programming by Optimisation: Towards a new Paradigm for Developing High-Performance Software Holger H. Hoos BETA Lab Department of Computer Science University of British Columbia Canada ICAPS 2013 Rome,
More informationInternational Journal of Information Technology and Knowledge Management (ISSN: ) July-December 2012, Volume 5, No. 2, pp.
Empirical Evaluation of Metaheuristic Approaches for Symbolic Execution based Automated Test Generation Surender Singh [1], Parvin Kumar [2] [1] CMJ University, Shillong, Meghalya, (INDIA) [2] Meerut Institute
More informationThe irace Package: User Guide
Université Libre de Bruxelles Institut de Recherches Interdisciplinaires et de Développements en Intelligence Artificielle The irace Package: User Guide M. López-Ibáñez, L. Pérez Cáceres, J. Dubois-Lacoste,
More informationarxiv: v1 [cs.ai] 7 Oct 2013
Bayesian Optimization With Censored Response Data Frank Hutter, Holger Hoos, and Kevin Leyton-Brown Department of Computer Science University of British Columbia {hutter, hoos, kevinlb}@cs.ubc.ca arxiv:.97v
More informationGraph Coloring via Constraint Programming-based Column Generation
Graph Coloring via Constraint Programming-based Column Generation Stefano Gualandi Federico Malucelli Dipartimento di Elettronica e Informatica, Politecnico di Milano Viale Ponzio 24/A, 20133, Milan, Italy
More informationMeta- Heuristic based Optimization Algorithms: A Comparative Study of Genetic Algorithm and Particle Swarm Optimization
2017 2 nd International Electrical Engineering Conference (IEEC 2017) May. 19 th -20 th, 2017 at IEP Centre, Karachi, Pakistan Meta- Heuristic based Optimization Algorithms: A Comparative Study of Genetic
More informationProceedings of the 2012 International Conference on Industrial Engineering and Operations Management Istanbul, Turkey, July 3 6, 2012
Proceedings of the 2012 International Conference on Industrial Engineering and Operations Management Istanbul, Turkey, July 3 6, 2012 Solving Assembly Line Balancing Problem in the State of Multiple- Alternative
More informationComputers & Operations Research
Computers & Operations Research 36 (2009) 2619 -- 2631 Contents lists available at ScienceDirect Computers & Operations Research journal homepage: www.elsevier.com/locate/cor Design and analysis of stochastic
More informationTuning Differential Evolution for Cheap, Medium, and Expensive Computational Budgets
Tuning Differential Evolution for Cheap, Medium, and Expensive Computational Budgets Ryoji Tanabe and Alex Fukunaga Graduate School of Arts and Sciences The University of Tokyo Abstract This paper presents
More informationAn experimental analysis of design choices of multi-objective ant colony optimization algorithms
Swarm Intell (2012) 6:207 232 DOI 10.1007/s11721-012-0070-7 An experimental analysis of design choices of multi-objective ant colony optimization algorithms Manuel López-Ibáñez Thomas Stützle Received:
More informationPre-scheduled and adaptive parameter variation in MAX-MIN Ant System
Pre-scheduled and adaptive parameter variation in MAX-MIN Ant System Michael Maur, Manuel López-Ibáñez, and Thomas Stützle Abstract MAX-MIN Ant System (MMAS) is an ant colony optimization (ACO) algorithm
More informationHybridization EVOLUTIONARY COMPUTING. Reasons for Hybridization - 1. Naming. Reasons for Hybridization - 3. Reasons for Hybridization - 2
Hybridization EVOLUTIONARY COMPUTING Hybrid Evolutionary Algorithms hybridization of an EA with local search techniques (commonly called memetic algorithms) EA+LS=MA constructive heuristics exact methods
More informationA tabu search approach for makespan minimization in a permutation flow shop scheduling problems
A tabu search approach for makespan minimization in a permutation flow shop scheduling problems Sawat Pararach Department of Industrial Engineering, Faculty of Engineering, Thammasat University, Pathumthani
More informationThe aims of this tutorial
A.E. Eiben Free University Amsterdam (google: eiben) The aims of this tutorial Creating awareness of the tuning issue Providing guidelines for tuning EAs Presenting a vision for future development CEC
More informationAutomatic Configuration of Sequential Planning Portfolios
Automatic Configuration of Sequential Planning Portfolios Jendrik Seipp and Silvan Sievers and Malte Helmert University of Basel Basel, Switzerland {jendrik.seipp,silvan.sievers,malte.helmert}@unibas.ch
More informationParallel Computing in Combinatorial Optimization
Parallel Computing in Combinatorial Optimization Bernard Gendron Université de Montréal gendron@iro.umontreal.ca Course Outline Objective: provide an overview of the current research on the design of parallel
More informationOutline. TABU search and Iterated Local Search classical OR methods. Traveling Salesman Problem (TSP) 2-opt
TABU search and Iterated Local Search classical OR methods Outline TSP optimization problem Tabu Search (TS) (most important) Iterated Local Search (ILS) tks@imm.dtu.dk Informatics and Mathematical Modeling
More informationData Mining Chapter 8: Search and Optimization Methods Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University
Data Mining Chapter 8: Search and Optimization Methods Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University Search & Optimization Search and Optimization method deals with
More informationTABU search and Iterated Local Search classical OR methods
TABU search and Iterated Local Search classical OR methods tks@imm.dtu.dk Informatics and Mathematical Modeling Technical University of Denmark 1 Outline TSP optimization problem Tabu Search (TS) (most
More informationOrdered Racing Protocols for Automatically Configuring Algorithms for Scaling Performance
Ordered Racing Protocols for Automatically Configuring Algorithms for Scaling Performance James Styles University of British Columbia jastyles@cs.ubc.ca Holger Hoos University of British Columbia hoos@cs.ubc.ca
More informationA Survey of Solving Approaches for Multiple Objective Flexible Job Shop Scheduling Problems
BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 15, No 2 Sofia 2015 Print ISSN: 1311-9702; Online ISSN: 1314-4081 DOI: 10.1515/cait-2015-0025 A Survey of Solving Approaches
More informationConsultant-Guided Search A New Metaheuristic for Combinatorial Optimization Problems
Consultant-Guided Search A New Metaheuristic for Combinatorial Optimization Problems Serban Iordache SCOOP Software GmbH Am Kielshof 29, 51105 Köln, Germany siordache@acm.org ABSTRACT In this paper, we
More informationFeature Selection. CE-725: Statistical Pattern Recognition Sharif University of Technology Spring Soleymani
Feature Selection CE-725: Statistical Pattern Recognition Sharif University of Technology Spring 2013 Soleymani Outline Dimensionality reduction Feature selection vs. feature extraction Filter univariate
More informationGRASP WITH PATH-RELINKING FOR THE GENERALIZED QUADRATIC ASSIGNMENT PROBLEM
GRASP WITH PATH-RELINKING FOR THE GENERALIZED QUADRATIC ASSIGNMENT PROBLEM GERALDO R. MATEUS, MAURICIO G.C. RESENDE, AND RICARDO M. A. SILVA Abstract. The generalized quadratic assignment problem (GQAP)
More informationBoosting as a Metaphor for Algorithm Design
Boosting as a Metaphor for Algorithm Design Kevin Leyton-Brown, Eugene Nudelman, Galen Andrew, Jim McFadden, and Yoav Shoham {kevinlb;eugnud;galand;jmcf;shoham}@cs.stanford.edu Stanford University, Stanford
More informationUsing CODEQ to Train Feed-forward Neural Networks
Using CODEQ to Train Feed-forward Neural Networks Mahamed G. H. Omran 1 and Faisal al-adwani 2 1 Department of Computer Science, Gulf University for Science and Technology, Kuwait, Kuwait omran.m@gust.edu.kw
More informationOPTIMIZING THE TOTAL COMPLETION TIME IN PROCESS PLANNING USING THE RANDOM SIMULATION ALGORITHM
OPTIMIZING THE TOTAL COMPLETION TIME IN PROCESS PLANNING USING THE RANDOM SIMULATION ALGORITHM Baskar A. and Anthony Xavior M. School of Mechanical and Building Sciences, VIT University, Vellore, India
More informationUniversité Libre de Bruxelles
Université Libre de Bruxelles Institut de Recherches Interdisciplinaires et de Développements en Intelligence Artificielle An Incremental Particle Swarm for Large-Scale Continuous Optimization Problems:
More informationDesigning a Portfolio of Parameter Configurations for Online Algorithm Selection
Designing a Portfolio of Parameter Configurations for Online Algorithm Selection Aldy Gunawan and Hoong Chuin Lau and Mustafa Mısır Singapore Management University 80 Stamford Road Singapore 178902 {aldygunawan,
More informationReactive Stochastic Local Search Algorithms for the Genomic Median Problem
Reactive Stochastic Local Search Algorithms for the Genomic Median Problem Renaud Lenne 1,2, Christine Solnon 1, Thomas Stützle 2, Eric Tannier 3, and Mauro Birattari 2 1 LIRIS, UMR CNRS 5205, Université
More informationPre-requisite Material for Course Heuristics and Approximation Algorithms
Pre-requisite Material for Course Heuristics and Approximation Algorithms This document contains an overview of the basic concepts that are needed in preparation to participate in the course. In addition,
More informationAutomatic Component-wise Design of Multi-Objective Evolutionary Algorithms
Université Libre de Bruxelles Institut de Recherches Interdisciplinaires et de Développements en Intelligence Artificielle Automatic Component-wise Design of Multi-Objective Evolutionary Algorithms L.
More informationAnt Colony Optimization Exercises
Outline DM6 HEURISTICS FOR COMBINATORIAL OPTIMIZATION Lecture 11 Ant Colony Optimization Exercises Ant Colony Optimization: the Metaheuristic Application Examples Connection between ACO and other Metaheuristics
More informationConstructive meta-heuristics
Constructive meta-heuristics Heuristic algorithms Giovanni Righini University of Milan Department of Computer Science (Crema) Improving constructive algorithms For many problems constructive algorithms
More informationCover Page. The handle holds various files of this Leiden University dissertation.
Cover Page The handle http://hdl.handle.net/1887/22055 holds various files of this Leiden University dissertation. Author: Koch, Patrick Title: Efficient tuning in supervised machine learning Issue Date:
More information