Meta-algorithmic techniques in SAT solving: Automated configuration, selection and beyond
1 Meta-algorithmic techniques in SAT solving: Automated configuration, selection and beyond Holger H. Hoos BETA Lab Department of Computer Science University of British Columbia Canada SAT/SMT Summer School Trento, Italy, 2012/06/12
2 What fuels progress in SAT solving? Insights into SAT (theory) Creativity of algorithm designers heuristics Advanced debugging techniques fuzz testing, delta-debugging Principled experimentation statistical techniques, experimentation platforms SAT competitions Holger Hoos: Meta-algorithmic techniques in SAT solving 2
3 2009: 5 of 27 medals; 2011: 28 of 54 medals
4 Meta-algorithmic techniques algorithms that operate upon other algorithms (SAT solvers) meta-heuristics here: algorithms whose inputs include one or more SAT solvers configurators (e.g., ParamILS, GGA, SMAC) selectors (e.g., SATzilla, 3S) schedulers (e.g., aspeed; also: 3S, SATzilla) (parallel) portfolios (e.g., ManySAT, ppfolio) run-time predictors experimentation platforms (e.g., EDACC, HAL)
5 Why are meta-algorithmic techniques important? no one knows how to best solve SAT (or any other NP-hard problem) no single dominant solver state-of-the-art performance often achieved by combinations of various heuristic choices (e.g., pre-processing; variable/value selection heuristic; restart rules; data structures;... ) complex interactions, unexpected behaviour performance can be tricky to assess due to differences in behaviour across problem instances stochasticity potential for suboptimal choices in solver development, applications
6 Why are meta-algorithmic techniques important? human intuitions can be misleading, abilities are limited substantial benefit from augmentation with computational techniques use of fully specified procedures (rather than intuition / ad hoc choices) can improve reproducibility, facilitate scientific analysis / understanding
7 SAT is a hotbed for meta-algorithmic work! Drosophila problem for computing science (and beyond) prototypical NP-hard problem prominent in various areas of CS and beyond important applications conceptual simplicity aids solver design / development active and diverse community SAT competitions
8 Outline 1. Introduction 2. Algorithm configuration 3. Algorithm selection 4. Algorithm scheduling 5. Parallel algorithm portfolios 6. Programming by Optimisation (PbO)
9 Traditional algorithm design approach: iterative, manual process designer gradually introduces/modifies components or mechanisms performance is tested on benchmark instances design often starts from a generic or broadly applicable problem-solving method (e.g., an evolutionary algorithm)
10 Note: During the design process, many decisions are made. Some choices take the form of parameters, others are hard-coded. Design decisions interact in complex ways.
11 Problems: Design process is labour-intensive. Design decisions often made in an ad hoc fashion, based on limited experimentation and intuition. Human designers typically over-generalise observations, explore few designs. Implicit assumptions of independence, monotonicity are often incorrect. Number of components and mechanisms tends to grow in each stage of the design process. Result: complicated designs, unfulfilled performance potential
12 Solution: Automated Algorithm Configuration Key idea: expose design choices as parameters, then use an automated procedure to effectively explore the resulting configuration space. Given: algorithm A with parameters p and configuration space C; set Π of benchmark instances, performance metric m Objective: find configuration c ∈ C for which A performs best on Π according to metric m
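The configuration problem just stated can be illustrated with a toy configurator. The following sketch samples configurations at random and keeps the one with the best mean value of the performance metric on the benchmark set; all names (configure, run_target, the parameter space) are hypothetical, and real configurators such as ParamILS or SMAC are far more sophisticated.

```python
import random

def configure(run_target, config_space, instances, budget=100, seed=0):
    """Toy random-search configurator: sample configurations and keep
    the one with the best (lowest) mean metric on the benchmark set."""
    rng = random.Random(seed)
    best_config, best_score = None, float("inf")
    for _ in range(budget):
        config = {p: rng.choice(vals) for p, vals in config_space.items()}
        score = sum(run_target(config, i) for i in instances) / len(instances)
        if score < best_score:
            best_config, best_score = config, score
    return best_config, best_score

# Hypothetical target algorithm: its "runtime" depends on two
# parameters and (mildly) on the instance.
def run_target(config, instance):
    return abs(config["restart"] - 32) + 0.1 * config["level"] + instance % 3

space = {"restart": [1, 2, 4, 8, 16, 32, 64], "level": [0, 1, 2]}
best, score = configure(run_target, space, instances=range(10), budget=200)
print(best, round(score, 2))
```

With enough budget the sampler reliably finds the configuration restart=32, level=0, whose mean cost on these toy instances is 0.9; the point is only the black-box protocol (configuration in, performance out), not the search strategy.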
13 Algorithm configuration [diagram]
14 Example: SAT-based software verification Hutter, Babic, HH, Hu (2007) Goal: solve a suite of SAT-encoded software verification instances as fast as possible. New DPLL-style SAT solver Spear (by Domagoj Babic) = highly parameterised heuristic algorithm (26 parameters, yielding a vast space of configurations) manual configuration by the algorithm designer vs automated configuration using ParamILS, a generic algorithm configuration procedure Hutter, HH, Stützle (2007)
15 Spear: Empirical results on software verification benchmarks
solver / num. solved / mean run-time
MiniSAT / / CPU sec
Spear, original / 298/ / CPU sec
Spear, generic optimised config. / 302/ / CPU sec
Spear, specific optimised config. / 302/ / CPU sec
500-fold speedup through use of an automated algorithm configuration procedure (ParamILS) new state of the art (winner of 2007 SMT Competition, QF BV category)
16 SPEAR on software verification [scatter plot: run-time of default config. vs auto-config., in CPU sec, log scale]
17 Advantages of using automated configurators: enables better exploration of larger design spaces better performance lets human designer focus on higher-level issues more effective use of human expertise / creativity uses principled, fully formalised methods to find good configurations better reproducibility, fairer comparisons, insights can be used to customise algorithms for use in specific application context with minimal human effort expanded range of successful applications
18 Approaches to automated algorithm configuration Standard optimisation techniques (e.g., CMA-ES Hansen & Ostermeier 2001; MADS Audet & Orban 2006) Advanced sampling methods (e.g., REVAC Nannen & Eiben) Racing (e.g., F-Race Birattari, Stützle, Paquete, Varrentrapp 2002; Iterative F-Race Balaprakash, Birattari, Stützle 2007) Model-free search (e.g., ParamILS Hutter, HH, Stützle 2007; Hutter, HH, Leyton-Brown, Stützle 2009; GGA Ansótegui, Sellmann, Tierney 2009) Sequential model-based optimisation (e.g., SPO Bartz-Beielstein 2006; SMAC Hutter, HH, Leyton-Brown)
19 Iterated Local Search [animation frames: Initialisation; Local Search; Perturbation; Local Search; Selection (using Acceptance Criterion); Perturbation]
26 ParamILS iterated local search in configuration space initialisation: pick best of default + R random configurations subsidiary local search: iterative first improvement, change one parameter in each step perturbation: change s randomly chosen parameters acceptance criterion: always select better configuration number of runs per configuration increases over time; ensure that incumbent always has same number of runs as challengers
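The ParamILS components listed above can be sketched as a simplified BasicILS-style loop. Everything here is a toy (hypothetical names, a fixed evaluation function, no run-capping or adaptive numbers of target-algorithm runs, both of which the real ParamILS relies on):

```python
import random

def paramils_sketch(evaluate, space, iters=30, s=2, seed=0):
    """Minimal ParamILS-style configurator: iterative first-improvement
    in the one-exchange neighbourhood, perturbation of s random
    parameters, acceptance of the better configuration."""
    rng = random.Random(seed)
    params = list(space)

    def local_search(c):
        improved = True
        while improved:
            improved = False
            for p in params:
                for v in space[p]:
                    if v != c[p]:
                        n = dict(c, **{p: v})        # change one parameter
                        if evaluate(n) < evaluate(c):  # first improvement
                            c, improved = n, True
                            break
                if improved:
                    break
        return c

    incumbent = local_search({p: space[p][0] for p in params})
    for _ in range(iters):
        cand = dict(incumbent)
        for p in rng.sample(params, min(s, len(params))):  # perturbation
            cand[p] = rng.choice(space[p])
        cand = local_search(cand)
        if evaluate(cand) < evaluate(incumbent):  # acceptance criterion
            incumbent = cand
    return incumbent

# Toy configuration space with a separable quadratic cost surface.
space = {"a": [0, 1, 2, 3, 4, 5], "b": [0, 1, 2, 3]}
best = paramils_sketch(lambda c: (c["a"] - 3) ** 2 + (c["b"] - 1) ** 2, space)
print(best)
```

On this toy surface the one-exchange local search alone already reaches the optimum a=3, b=1; the perturbation/acceptance loop matters on the rugged, high-dimensional spaces of real solvers.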
27 SATenstein: Automatically Building Local Search Solvers for SAT KhudaBukhsh, Xu, HH, Leyton-Brown (2009) Frankenstein: create perfect human being from scavenged body parts SATenstein: create perfect SAT solvers using components scavenged from existing solvers General approach: components from GSAT, WalkSAT, dynamic local search and G2WSAT algorithms flexible SLS framework (derived from UBCSAT) find performance-optimising instantiations using ParamILS
28 Challenge: 41 parameters (mostly categorical) spanning a huge space of configurations 6 well-known distributions of SAT instances (QCP, SW-GCP, R3SAT, HGEN, FAC, CBMC-SE) 11 challenger algorithms (including all winning SLS solvers from the SAT competitions)
29 Result: large factor performance improvements over the best challengers on QCP, HGEN, CBMC-SE smaller factor performance improvements over the best challengers on SW-GCP, R3SAT, FAC
30 SATenstein-LS vs VW on CBMC-SE [scatter plot: median runtime (CPU sec) of VW vs SATenstein-LS[CBMC-SE]; points marked as solved by both or solved by one]
31 SATenstein-LS vs Oracle on CBMC-SE [scatter plot: median runtime (CPU sec) of Oracle vs SATenstein-LS[CBMC-SE]; points marked as solved by both or solved by one]
32 Algorithm selection Off-line (per-distribution) algorithm selection: Given: set S of algorithms for a problem, representative distribution (or set) Π of problem instances Objective: select from S the algorithm expected to solve instances from Π most efficiently (w.r.t. an aggregate performance measure) = single best solver (cf. SAT competitions)
33 Methods for off-line algorithm selection exhaustive evaluation: run all solvers on all instances racing methods: eliminate solvers as soon as they fall significantly behind the current leader (Maron & Moore 1994; Birattari, Stützle, Paquete, Varrentrapp 2002)
34 Racing (for Algorithm Selection) [animation: algorithms race on successive problem instances, with losing algorithms eliminated until a winner remains]
37 Potential problem: inhomogeneous benchmark sets / distributions may not correctly identify single best solver Solutions: use sets of instances at each stage of the race racing within (reasonably) homogeneous subsets order instances according to difficulty (ordered racing) (Styles & HH in preparation) Note: racing can also be used for algorithm configuration (requires special techniques for handling large configuration spaces) (Balaprakash, Birattari, Stützle 2007)
38 Instance-based algorithm selection (Rice 1976): Given: set S of algorithms for a problem, problem instance π Objective: select from S the algorithm expected to solve π most efficiently, based on (cheaply computable) features of π Note: Best case performance bounded by oracle, which selects the best s ∈ S for each π = virtual best solver (VBS)
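The two baselines just introduced, the single best solver and the VBS oracle bound, can be made concrete with a small sketch over a matrix of hypothetical runtimes (solver and instance names are made up):

```python
def single_best_solver(runtimes):
    """Off-line (per-distribution) selection: the solver with the lowest
    total runtime over the whole instance set. runtimes[s] is the list
    of solver s's runtimes, one entry per instance."""
    return min(runtimes, key=lambda s: sum(runtimes[s]))

def virtual_best_runtimes(runtimes):
    """Oracle (virtual best solver, VBS): per-instance minimum runtime,
    a lower bound for any selector built from these solvers."""
    n = len(next(iter(runtimes.values())))
    return [min(runtimes[s][i] for s in runtimes) for i in range(n)]

# Hypothetical runtimes of two solvers on three instances.
runtimes = {"cdcl": [1.0, 9.0, 2.0], "sls": [8.0, 1.0, 4.0]}
sbs = single_best_solver(runtimes)     # totals: 12.0 vs 13.0
vbs = virtual_best_runtimes(runtimes)
print(sbs, sum(vbs))
```

Here the single best solver needs 12.0 seconds in total, while the oracle needs only 4.0; the gap between the two is exactly the room that instance-based selection tries to exploit.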
39 Instance-based algorithm selection [diagram: a feature extractor feeds a selector, which chooses one of the component algorithms]
40 Key components: set of (state-of-the-art) solvers set of cheaply computable, informative features efficient procedure for mapping features to solvers (selector) training data procedure for building good selector based on training data (selector builder)
41 Methods for instance-based selection: classification-based: predict the best solver, e.g., using... decision trees (Guerri & Milano 2004) case-based reasoning (Gebruers, Guerri, Hnich, Milano 2004) (weighted) k-nearest neighbours (Malitsky, Sabharwal, Samulowitz, Sellmann 2011; Kadioglu, Malitsky, Sabharwal, Samulowitz, Sellmann 2011) pairwise cost-sensitive decision forests + voting (Xu, Hutter, HH, Leyton-Brown 2012) regression-based: predict running time for each solver, select the one predicted to be fastest (Leyton-Brown, Nudelman, Shoham 2003; Xu, Hutter, HH, Leyton-Brown)
42 Instance features: Use generic and problem-specific features that correlate with performance and can be computed (relatively) cheaply: number of clauses, variables,... constraint graph features local & complete search probes Use as features statistics of distributions, e.g., variation coefficient of node degree in constraint graph Consider combinations of features (e.g., pairwise products, i.e. a quadratic basis function expansion).
43 SATzilla (Xu, Hutter, HH, Leyton-Brown): use state-of-the-art complete (DPLL/CDCL) and incomplete (local search) SAT solvers extract (up to) 84 polytime-computable instance features use ridge regression on selected features to predict solver run-times from instance features (one model per solver) run solver with best predicted performance
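A stripped-down version of this regression-based selection idea, assuming a single instance feature and plain least squares in place of SATzilla's ridge regression over up to 84 features (all names and data are hypothetical):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b; a stand-in for the
    per-solver ridge-regression runtime models."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) or 1e-12  # guard against zero variance
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

def build_selector(features, runtimes_per_solver):
    """One runtime model per solver; at selection time, run the solver
    with the lowest predicted runtime for the given feature value."""
    models = {s: fit_line(features, ys) for s, ys in runtimes_per_solver.items()}
    def select(x):
        return min(models, key=lambda s: models[s][0] * x + models[s][1])
    return select

# Hypothetical training data: one feature, two solvers with
# complementary strengths.
select = build_selector([1.0, 2.0, 3.0, 4.0],
                        {"A": [2.0, 4.0, 6.0, 8.0],   # good on small instances
                         "B": [9.0, 8.0, 7.0, 6.0]})  # good on large instances
print(select(1.0), select(5.0))
```

The selector picks A for small feature values and B for large ones, which is the whole point: neither model needs to be accurate in absolute terms, only good enough to rank the solvers per instance.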
44 Some bells and whistles: use pre-solvers to solve easy instances quickly build run-time predictors for various types of instances, use classifier to select best predictor based on instance features predict time required for feature computation; if that time is too long (or an error occurs), use back-up solver use the method by Schmee & Hahn (1979) to deal with censored run-time data Won prizes in 5 of the 9 main categories of the 2009 SAT Solver Competition (3 gold, 2 silver medals)
45 The problem with standard classification approaches Crucial assumption: solvers behave similarly on instances with similar features But do they really? uninformative features correlated features feature normalisation (tricky!) cost of misclassification
46 SATzilla (Xu, Hutter, HH, Leyton-Brown 2012): uses cost-based decision forests to directly select solver based on features one predictive model for each pair of solvers (which is better?) majority voting (over pairwise predictions) to select solver to be run (further details, results next Monday, 11:30 session)
47 Hydra: Automatically Configuring Algorithms for Portfolio-Based Selection Xu, HH, Leyton-Brown (2010) Note: SATenstein builds solvers that work well on average on a given set of SAT instances but: may have to settle for compromises on broad, heterogeneous sets SATzilla builds an algorithm selector based on a given set of SAT solvers but: success entirely depends on the quality of the given solvers Idea: Combine the two approaches portfolio-based selection from a set of automatically constructed solvers
48 Configuration + Selection = Hydra [diagram: feature extractor and selector operating over multiple configurations of a single parametric algorithm]
49 Simple combination: 1. build solvers for various types of instances using automated algorithm configuration 2. construct portfolio-based selector from these Drawback: Requires suitably defined sets of instances Better solution: iteratively build & add solvers that improve performance of given portfolio Hydra Note: Builds portfolios solely using generic, highly configurable solver (e.g., SATenstein) instance features (as used in SATzilla)
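The iterative build-and-add loop can be sketched as a greedy construction over pre-evaluated candidate configurations. This is a simplification: real Hydra invokes a configurator in each round, scoring candidates by their marginal contribution to the current portfolio, rather than choosing from a fixed candidate pool. All names and runtime vectors below are hypothetical.

```python
def hydra_sketch(candidates, max_steps):
    """Greedy Hydra-style portfolio construction: repeatedly add the
    candidate configuration that most reduces the portfolio's oracle
    (per-instance best) total runtime; stop when nothing helps."""
    names = list(candidates)
    n = len(candidates[names[0]])
    portfolio, current = [], [float("inf")] * n

    def total_with(name):  # oracle total runtime if `name` were added
        return sum(min(a, b) for a, b in zip(current, candidates[name]))

    for _ in range(max_steps):
        remaining = [c for c in names if c not in portfolio]
        if not remaining:
            break
        best = min(remaining, key=total_with)
        if portfolio and total_with(best) >= sum(current):
            break  # no candidate improves the portfolio
        portfolio.append(best)
        current = [min(a, b) for a, b in zip(current, candidates[best])]
    return portfolio, sum(current)

# Three hypothetical configurations, each strong on a different
# kind of instance.
candidates = {"cfg1": [1, 9, 9], "cfg2": [9, 1, 9], "cfg3": [8, 8, 8]}
portfolio, total = hydra_sketch(candidates, max_steps=3)
print(portfolio, total)
```

The greedy loop first adds the two specialists and then the all-rounder, because each still reduces the portfolio's per-instance-best total; a configuration that helped on no instance would be rejected by the improvement test.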
50 Results on mixture of 6 well-known benchmark sets [scatter plot: PAR score of Hydra[BM, 1] vs Hydra[BM, 7]]
51 Results on mixture of 6 well-known benchmark sets [plot: PAR score vs number of Hydra steps, for Hydra[BM] on training and test sets and SATenFACT on training and test sets]
52 Note: Hydra can use arbitrary algorithm configurators, selector builders different approaches are possible: e.g., ISAC, based on feature-based instance clustering, distance-based selection (Kadioglu, Malitsky, Sellmann, Tierney 2010)
53 Algorithm scheduling Note: SATzilla and 3S use a sequence of (pre-)solvers before instance-based selection. Kadioglu, Malitsky, Sabharwal, Samulowitz, Sellmann (2011)
54 Algorithm Scheduling [diagram: solvers run one after another according to a schedule]
55 Questions: 1. How to determine that sequence? 2. How much performance can be obtained from solver scheduling only?
56 Methods for algorithm scheduling exhaustive search (as done in SATzilla) expensive; limited to few solvers, cutoff times optimisation-based: using integer programming (IP) techniques (3S Kadioglu et al. 2011) using an answer-set-programming (ASP) formulation + solver (aspeed HH, Kaminski, Schaub, Schneider 2012)
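The exhaustive-search option can be sketched directly over a small runtime matrix (hypothetical data and names; the IP- and ASP-based formulations exist precisely because this enumeration explodes with more solvers and cutoff times):

```python
from itertools import permutations, product

def best_schedule(runtimes, budget, slices):
    """Exhaustive search for a sequential solver schedule: enumerate
    solver orders and per-solver time slices, keep the schedule that
    solves the most instances (ties broken by total time spent)."""
    solvers = list(runtimes)
    n = len(runtimes[solvers[0]])
    best_key, best_sched = (-1, 0.0), None
    for order in permutations(solvers):
        for cuts in product(slices, repeat=len(solvers)):
            if sum(cuts) > budget:
                continue
            solved, spent = 0, 0.0
            for i in range(n):
                elapsed = 0.0
                for s, cut in zip(order, cuts):
                    if runtimes[s][i] <= cut:   # solved within this slice
                        solved += 1
                        spent += elapsed + runtimes[s][i]
                        break
                    elapsed += cut
            if (solved, -spent) > best_key:
                best_key, best_sched = (solved, -spent), (order, cuts)
    return best_sched

# Two hypothetical solvers with complementary strengths.
runtimes = {"a": [1.0, 100.0], "b": [100.0, 2.0]}
order, cuts = best_schedule(runtimes, budget=10.0, slices=[0.0, 2.0, 5.0])
print(order, cuts)
```

On this instance pair the optimum gives each solver a short 2-second slice, which already solves everything; no single solver within the full budget could.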
57 Preliminary performance results: Performance of pure scheduling can be surprisingly close to that of combined scheduling + selection (full SATzilla). (HH, Kaminski, Schaub, Schneider in preparation; Xu, Hutter, HH, Leyton-Brown in preparation)
58 Notes: the ASP solver clasp used by aspeed is powered by a (state-of-the-art) SAT solver core pure algorithm scheduling (e.g., aspeed) does not require instance features sequential schedules can be parallelised easily and effectively (HH, Kaminski, Schaub, Schneider 2012)
61 Parallel algorithm portfolios Key idea: Exploit complementary strengths by running multiple algorithms (or instances of a randomised algorithm) concurrently. risk vs reward (expected running time) tradeoff, robust performance on a wide range of instances Huberman, Lukose, Hogg (1997); Gomes & Selman (1997, 2000) Note: can be realised through time-sharing / multi-tasking particularly attractive for multi-core / multi-processor architectures
62 Application to decision problems (like SAT, SMT): Concurrently run given component solvers until the first of them solves the instance. running time on instance π = (# solvers) × (running time of VBS on π) Examples: ManySAT (Hamadi, Jabbour, Sais 2009; Guo, Hamadi, Jabbour, Sais 2010) Plingeling (Biere) ppfolio (Roussel 2011) excellent performance (see 2009, 2011 SAT competitions)
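A minimal cost model for such a portfolio (an idealisation, with made-up names and data): with one core per solver the wall-clock time equals the VBS time at a CPU cost of (# solvers) times that, while under fair time-sharing on fewer cores the wall-clock time degrades by the sharing factor, matching the (# solvers) × VBS formula on a single processor.

```python
def portfolio_times(runtimes, instance, cores):
    """(wall-clock, total CPU) time of a parallel portfolio on a
    decision instance: all component solvers run concurrently and we
    stop as soon as the first one succeeds."""
    vbs = min(rt[instance] for rt in runtimes.values())
    k = len(runtimes)
    if cores >= k:                    # one core per solver
        return vbs, k * vbs
    # idealised fair time-sharing: everything slows down by k / cores
    return vbs * k / cores, k * vbs

# Hypothetical runtimes of three solvers on two instances.
runtimes = {"s1": [3.0, 40.0], "s2": [12.0, 2.0], "s3": [30.0, 30.0]}
print(portfolio_times(runtimes, 0, cores=3))  # one core per solver
print(portfolio_times(runtimes, 1, cores=1))  # time-shared single core
```

With three cores the portfolio matches the VBS on both instances; on a single core it is still robust (no instance costs more than 3× its VBS time), which is the risk-vs-reward tradeoff from the previous slide in miniature.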
63 Types of parallel portfolios: static vs instance-based vs dynamic with or without communication (SAT: clause sharing) Methods for constructing (static) parallel portfolios: manual (by human expert, based on experimentation) classification maximisation (Petrik & Zilberstein 2006) case-based reasoning + greedy construction heuristic (Yun & Epstein 2012)
64 Constructing portfolios from a single parametric solver HH, Leyton-Brown, Schaub, Schneider (under review) Key idea: Take a single parametric solver, find configurations that make an effective parallel portfolio (analogous to Hydra). Note: This makes it possible to automatically obtain parallel solvers from sequential sources (automatic parallelisation) Methods for constructing such portfolios: global optimisation: simultaneous configuration of all component solvers greedy construction: add + configure one component at a time
65 Preliminary results on competition application instances (4 components)
solver PAR1 PAR10 #timeouts
ManySAT (1.1) /679
ManySAT (2.0) /679
Plingeling (276) /679
Plingeling (587) /679
Greedy-MT4(Lingeling) /679
ppfolio /679
CryptoMiniSat /679
VBS over all of the above /679
66 Algorithm configuration [diagram]
68 The next step: Programming by Optimisation (PbO) HH Key idea: specify large, rich design spaces of solvers for a given problem avoid premature, uninformed, possibly detrimental design choices active development of promising alternatives for design components automatically make choices to obtain a solver optimised for a given use context
69 [diagram: a design space of solvers, optimised for an application context, yields an optimised solver, an instance-based selector or a parallel portfolio]
72 Levels of PbO: Level 4: Make no design choice prematurely that cannot be justified compellingly. Level 3: Strive to provide design choices and alternatives. Level 2: Keep and expose design choices considered during software development. Level 1: Expose design choices hardwired into existing code (magic constants, hidden parameters, abandoned design alternatives). Level 0: Optimise settings of parameters exposed by existing software.
73 Success in optimising speed (application, design choices, speedup, PbO level):
SAT-based software verification (Spear), 41 Hutter, Babić, HH, Hu (2007)
AI Planning (LPG), 62 Vallati, Fawcett, Gerevini, HH, Saetti (2011)
Mixed integer programming (CPLEX), 76 Hutter, HH, Leyton-Brown (2010)
and solution quality:
University timetabling, 18 design choices, PbO level 2-3 new state of the art; UBC exam scheduling Fawcett, Chiarandini, HH (2009)
74 Software development in the PbO paradigm [diagram: PbO-<L> source(s) and a design space description are transformed by the PbO-<L> weaver into parametric <L> source(s); the PbO design optimiser, using benchmark inputs and the use context, produces instantiated <L> source(s) and a deployed executable]
75 Design space specification Option 1: use language-specific mechanisms command-line parameters conditional execution conditional compilation (ifdef) Option 2: generic programming language extension Dedicated support for... exposing parameters specifying alternative blocks of code
76 Advantages of generic language extension: reduced overhead for programmer clean separation of design choices from other code dedicated PbO support in software development environments Key idea: augmented sources: PbO-Java = Java + PbO constructs,... tool to compile down into target language: weaver
78 Exposing parameters
numerator -= (int) (numerator / (adjfactor+1) * 1.4);
becomes
##PARAM(float multiplier=1.4)
numerator -= (int) (numerator / (adjfactor+1) * ##multiplier);
parameter declarations can appear at arbitrary places (before or after first use of parameter) access to parameters is read-only (values can only be set/changed via command-line or config file)
79 Specifying design alternatives Choice: set of interchangeable fragments of code that represent design alternatives (instances of the choice) Choice point: location in a program at which a choice is available
##BEGIN CHOICE preprocessing
<block 1>
##END CHOICE preprocessing
Named design alternatives:
##BEGIN CHOICE preprocessing=standard
<block S>
##END CHOICE preprocessing
##BEGIN CHOICE preprocessing=enhanced
<block E>
##END CHOICE preprocessing
Multiple choice points for the same choice:
##BEGIN CHOICE preprocessing
<block 1>
##END CHOICE preprocessing
...
##BEGIN CHOICE preprocessing
<block 2>
##END CHOICE preprocessing
Nested choices:
##BEGIN CHOICE preprocessing
<block 1a>
##BEGIN CHOICE extrapreprocessing
<block 2>
##END CHOICE extrapreprocessing
<block 1b>
##END CHOICE preprocessing
84 The Weaver transforms PbO-<L> code into <L> code (<L> = Java, C++,... ) parametric mode: expose parameters make choices accessible via (conditional, categorical) parameters (partial) instantiation mode: hardwire (some) parameters into code (expose others) hardwire (some) choices into code (make others accessible via parameters)
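The weaver's parameter handling can be illustrated by a toy implementation covering only the ##PARAM construct shown earlier. Everything here is an assumption for illustration: the getParam lookup, the function names, and the whole transformation are hypothetical; the real weaver also handles ##CHOICE blocks, conditional parameters and several target languages.

```python
import re

def weave(src, hardwire=None):
    """Toy weaver for ##PARAM only: in parametric mode (hardwire=None)
    every ##name use becomes a run-time parameter lookup; in (partial)
    instantiation mode, hardwired parameters are replaced by their
    fixed values."""
    hardwire = hardwire or {}
    # Collect declared parameters and their default values.
    defaults = dict(re.findall(r"##PARAM\(\w+\s+(\w+)=([^)]+)\)", src))
    out = []
    for line in src.splitlines():
        if "##PARAM" in line:
            continue  # declarations produce no target code
        def replace(m):
            name = m.group(1)
            if name in hardwire:
                return str(hardwire[name])
            return f'getParam("{name}", {defaults[name]})'
        out.append(re.sub(r"##(\w+)", replace, line))
    return "\n".join(out)

src = ('##PARAM(float multiplier=1.4)\n'
       'numerator -= (int) (numerator / (adjfactor+1) * ##multiplier);')
print(weave(src))                       # parametric mode
print(weave(src, {"multiplier": 2.0}))  # instantiation mode
```

Parametric mode emits a lookup with the declared default, so the value can be set via command line or config file; instantiation mode bakes the chosen value straight into the target code.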
86 Design space optimisation Use previously discussed techniques for... automated algorithm configuration (e.g., ParamILS,... ) instance-based selector construction (e.g., Hydra) parallel portfolio construction (e.g., Schneider et al.) Much room for further improvements: better optimisation procedures (e.g., for configuration) meta-optimisation (optimising the optimisers) scenario-based optimiser selection parallel portfolios of design optimisation procedures
87 Cost & concerns But what about... Computational complexity? Cost of development? Limitations of scope?
88 Computationally too expensive? Spear revisited: total configuration time on software verification benchmarks: 30 CPU days wall-clock time on 10-CPU cluster: 3 days cost on Amazon Elastic Compute Cloud (EC2): an amount in USD that pays for 1:45 hours of an average software engineer or 8:26 hours at minimum wage
89 Too expensive in terms of development? Design and coding: tradeoff between performance/flexibility and overhead overhead depends on level of PbO traditional approach: cost from manual exploration of design choices! Testing and debugging: design alternatives for individual mechanisms and components can be tested separately effort linear (rather than exponential) in the number of design choices
90 Limited to the niche of NP-hard problem solving? Some PbO-flavoured work in the literature: computing-platform-specific performance optimisation of linear algebra routines (Whaley et al. 2001) optimisation of sorting algorithms using genetic programming (Li et al. 2005) compiler optimisation (Pan & Eigenmann 2006, Cavazos et al. 2007) database server configuration (Diao et al. 2003)
91 The road ahead Support for PbO-based software development Weavers for PbO-C, PbO-C++, PbO-Java PbO-aware development platforms Improved / integrated PbO design optimiser Best practices Many further applications Scientific insights
92 Communications of the ACM, 55(2), February 2012
93 Meta-algorithmic techniques... leverage computational power to construct better algorithms liberate human designers from boring, menial tasks and let them focus on higher-level design issues facilitate principled design of heuristic algorithms profoundly change how we build and use algorithms
94 Three (slightly provocative?) predictions Meta-algorithmic techniques will become indispensable for solving SAT, SMT and all other NP-hard problems. Programming by Optimisation (or a closely related approach) will become as ubiquitous as compilation. Driven by SAT, SMT, PbO and advances in automated testing / debugging, program synthesis from higher-level designs will become practical and widely used.
Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence (AAAI-17) Efficient Parameter Importance Analysis via Ablation with Surrogates André Biedenkapp, Marius Lindauer, Katharina Eggensperger,
More informationAI-Augmented Algorithms
AI-Augmented Algorithms How I Learned to Stop Worrying and Love Choice Lars Kotthoff University of Wyoming larsko@uwyo.edu Warsaw, 17 April 2019 Outline Big Picture Motivation Choosing Algorithms Tuning
More informationSATenstein: Automatically Building Local Search SAT Solvers from Components
SATenstein: Automatically Building Local Search SAT Solvers from Components Ashiqur R. KhudaBukhsh* 1a, Lin Xu* b, Holger H. Hoos b, Kevin Leyton-Brown b a Department of Computer Science, Carnegie Mellon
More informationAutomated Configuration of MIP solvers
Automated Configuration of MIP solvers Frank Hutter, Holger Hoos, and Kevin Leyton-Brown Department of Computer Science University of British Columbia Vancouver, Canada {hutter,hoos,kevinlb}@cs.ubc.ca
More informationOrdered Racing Protocols for Automatically Configuring Algorithms for Scaling Performance
Ordered Racing Protocols for Automatically Configuring Algorithms for Scaling Performance James Styles University of British Columbia jastyles@cs.ubc.ca Holger Hoos University of British Columbia hoos@cs.ubc.ca
More informationAutomatic solver configuration using SMAC
Automatic solver configuration using SMAC How to boost performance of your SAT solver? Marius Lindauer 1 University of Freiburg SAT Industrial Day 2016, Bordeaux 1 Thanks to Frank Hutter! Ever looked into
More informationProgramming by Optimisation:
Programming by Optimisation: A Practical Paradigm for Computer-Aided Algorithm Design Holger H. Hoos & Frank Hutter Leiden Institute for Advanced CS Universiteit Leiden The Netherlands Department of Computer
More informationAI-Augmented Algorithms
AI-Augmented Algorithms How I Learned to Stop Worrying and Love Choice Lars Kotthoff University of Wyoming larsko@uwyo.edu Boulder, 16 January 2019 Outline Big Picture Motivation Choosing Algorithms Tuning
More informationProblem Solving and Search in Artificial Intelligence
Problem Solving and Search in Artificial Intelligence Algorithm Configuration Nysret Musliu Database and Artificial Intelligence Group, Institut of Logic and Computation, TU Wien Motivation Metaheuristic
More informationA Portfolio Solver for Answer Set Programming: Preliminary Report
A Portfolio Solver for Answer Set Programming: Preliminary Report M. Gebser, R. Kaminski, B. Kaufmann, T. Schaub, M. Schneider, and S. Ziller Institut für Informatik, Universität Potsdam Abstract. We propose
More informationA Modular Multiphase Heuristic Solver for Post Enrolment Course Timetabling
A Modular Multiphase Heuristic Solver for Post Enrolment Course Timetabling Marco Chiarandini 1, Chris Fawcett 2, and Holger H. Hoos 2,3 1 University of Southern Denmark, Department of Mathematics and
More informationAutoFolio: Algorithm Configuration for Algorithm Selection
Algorithm Configuration: Papers from the 2015 AAAI Workshop AutoFolio: Algorithm Configuration for Algorithm Selection Marius Lindauer Holger H. Hoos lindauer@cs.uni-freiburg.de University of Freiburg
More informationFast Downward Cedalion
Fast Downward Cedalion Jendrik Seipp and Silvan Sievers Universität Basel Basel, Switzerland {jendrik.seipp,silvan.sievers}@unibas.ch Frank Hutter Universität Freiburg Freiburg, Germany fh@informatik.uni-freiburg.de
More informationPredicting the runtime of combinatorial problems CS6780: Machine Learning Project Final Report
Predicting the runtime of combinatorial problems CS6780: Machine Learning Project Final Report Eoin O Mahony December 16, 2010 Abstract Solving combinatorial problems often requires expert knowledge. These
More informationSATzilla: Portfolio-based Algorithm Selection for SAT
Journal of Artificial Intelligence Research 32 (2008) 565-606 Submitted 11/07; published 06/08 SATzilla: Portfolio-based Algorithm Selection for SAT Lin Xu Frank Hutter Holger H. Hoos Kevin Leyton-Brown
More informationIdentifying Key Algorithm Parameters and Instance Features using Forward Selection
Identifying Key Algorithm Parameters and Instance Features using Forward Selection Frank Hutter, Holger H. Hoos and Kevin Leyton-Brown University of British Columbia, 66 Main Mall, Vancouver BC, V6T Z,
More informationImproving the State of the Art in Inexact TSP Solving using Per-Instance Algorithm Selection
Improving the State of the Art in Inexact TSP Solving using Per-Instance Algorithm Selection Lars Kotthoff 1, Pascal Kerschke 2, Holger Hoos 3, and Heike Trautmann 2 1 Insight Centre for Data Analytics,
More informationSATenstein: Automatically Building Local Search SAT Solvers From Components
SATenstein: Automatically Building Local Search SAT Solvers From Components by Ashiqur Rahman KhudaBukhsh A THESIS SUBMITTED IN PARTIAL FULFILMENT OF THE REQUIREMENTS FOR THE DEGREE OF Master of Science
More informationReACT: Real-time Algorithm Configuration through Tournaments
ReACT: Real-time Algorithm Configuration through Tournaments Tadhg Fitzgerald and Yuri Malitsky and Barry O Sullivan Kevin Tierney Insight Centre for Data Analytics Department of Business Information Systems
More informationAutomatic Algorithm Configuration
Automatic Algorithm Configuration Thomas Stützle RDA, CoDE, Université Libre de Bruxelles Brussels, Belgium stuetzle@ulb.ac.be iridia.ulb.ac.be/~stuetzle Outline 1. Context 2. Automatic algorithm configuration
More informationAutomating Meta-algorithmic Analysis and Design
Automating Meta-algorithmic Analysis and Design by Christopher Warren Nell B. Sc., University of Guelph, 2005 A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF Master of Science
More informationDesigning a Portfolio of Parameter Configurations for Online Algorithm Selection
Designing a Portfolio of Parameter Configurations for Online Algorithm Selection Aldy Gunawan and Hoong Chuin Lau and Mustafa Mısır Singapore Management University 80 Stamford Road Singapore 178902 {aldygunawan,
More informationFrom CATS to SAT: Modeling Empirical Hardness to Understand and Solve Hard Computational Problems. Kevin Leyton Brown
From CATS to SAT: Modeling Empirical Hardness to Understand and Solve Hard Computational Problems Kevin Leyton Brown Computer Science Department University of British Columbia Intro From combinatorial
More informationUnderstanding the Empirical Hardness of NP-Complete Problems
Using machine learning to predict algorithm runtime. BY KEVIN LEYTON-BROWN, HOLGER H. HOOS, FRANK HUTTER, AND LIN XU DOI:.45/59443.59444 Understanding the Empirical Hardness of NP-Complete Problems PROBLEMS
More informationCS-E3200 Discrete Models and Search
Shahab Tasharrofi Department of Information and Computer Science, Aalto University Lecture 7: Complete and local search methods for SAT Outline Algorithms for solving Boolean satisfiability problems Complete
More informationarxiv: v2 [cs.ai] 11 Oct 2018
Journal of Artificial Intelligence Research X (XXX) X-XX Submitted X/XX; published X/XX Pitfalls and Best Practices in Algorithm Configuration Katharina Eggensperger Marius Lindauer Frank Hutter University
More information2. Blackbox hyperparameter optimization and AutoML
AutoML 2017. Automatic Selection, Configuration & Composition of ML Algorithms. at ECML PKDD 2017, Skopje. 2. Blackbox hyperparameter optimization and AutoML Pavel Brazdil, Frank Hutter, Holger Hoos, Joaquin
More informationCaptain Jack: New Variable Selection Heuristics in Local Search for SAT
Captain Jack: New Variable Selection Heuristics in Local Search for SAT Dave Tompkins, Adrian Balint, Holger Hoos SAT 2011 :: Ann Arbor, Michigan http://www.cs.ubc.ca/research/captain-jack Key Contribution:
More informationLearning to Choose Instance-Specific Macro Operators
Learning to Choose Instance-Specific Macro Operators Maher Alhossaini Department of Computer Science University of Toronto Abstract The acquisition and use of macro actions has been shown to be effective
More informationPerformance Prediction and Automated Tuning of Randomized and Parametric Algorithms
Performance Prediction and Automated Tuning of Randomized and Parametric Algorithms Frank Hutter, Youssef Hamadi, Holger H. Hoos, and Kevin Leyton-Brown University of British Columbia, 366 Main Mall, Vancouver
More informationAutomatically Configuring Algorithms. Some recent work
Automatically Configuring Algorithms Some recent work Thomas Stützle, Leonardo Bezerra, Jérémie Dubois-Lacoste, Tianjun Liao, Manuel López-Ibáñez, Franco Mascia and Leslie Perez Caceres IRIDIA, CoDE, Université
More informationHierarchical Hardness Models for SAT
Hierarchical Hardness Models for SAT Lin Xu, Holger H. Hoos, and Kevin Leyton-Brown University of British Columbia, 66 Main Mall, Vancouver BC, V6T Z, Canada {xulin7,hoos,kevinlb}@cs.ubc.ca Abstract. Empirical
More informationarxiv: v1 [cs.ai] 7 Oct 2013
Bayesian Optimization With Censored Response Data Frank Hutter, Holger Hoos, and Kevin Leyton-Brown Department of Computer Science University of British Columbia {hutter, hoos, kevinlb}@cs.ubc.ca arxiv:.97v
More informationASAP.V2 and ASAP.V3: Sequential optimization of an Algorithm Selector and a Scheduler
ASAP.V2 and ASAP.V3: Sequential optimization of an Algorithm Selector and a Scheduler François Gonard, Marc Schoenauer, Michele Sebag To cite this version: François Gonard, Marc Schoenauer, Michele Sebag.
More informationAutomated Configuration of Mixed Integer Programming Solvers
Automated Configuration of Mixed Integer Programming Solvers Frank Hutter, Holger H. Hoos, and Kevin Leyton-Brown University of British Columbia, 2366 Main Mall, Vancouver BC, V6T 1Z4, Canada {hutter,hoos,kevinlb}@cs.ubc.ca
More informationMulti-Level Algorithm Selection for ASP
Multi-Level Algorithm Selection for ASP Marco Maratea 1, Luca Pulina 2, and Francesco Ricca 3 1 DIBRIS, University of Genova 2 POLCOMING, University of Sassari 3 DeMaCS, University of Calabria 13th International
More informationBias in Algorithm Portfolio Performance Evaluation
Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence (IJCAI-16) Bias in Algorithm Portfolio Performance Evaluation Chris Cameron, Holger H. Hoos, Kevin Leyton-Brown
More informationBoosting as a Metaphor for Algorithm Design
Boosting as a Metaphor for Algorithm Design Kevin Leyton-Brown, Eugene Nudelman, Galen Andrew, Jim McFadden, and Yoav Shoham {kevinlb;eugnud;galand;jmcf;shoham}@cs.stanford.edu Stanford University, Stanford
More informationMO-ParamILS: A Multi-objective Automatic Algorithm Configuration Framework
MO-ParamILS: A Multi-objective Automatic Algorithm Configuration Framework Aymeric Blot, Holger Hoos, Laetitia Jourdan, Marie-Éléonore Marmion, Heike Trautmann To cite this version: Aymeric Blot, Holger
More informationParallel Algorithm Configuration
Parallel Algorithm Configuration Frank Hutter, Holger H. Hoos and Kevin Leyton-Brown University of British Columbia, 366 Main Mall, Vancouver BC, V6T Z4, Canada {hutter,hoos,kevinlb}@cs.ubc.ca Abstract.
More informationParamILS: An Automatic Algorithm Configuration Framework
Journal of Artificial Intelligence Research 36 (2009) 267-306 Submitted 06/09; published 10/09 ParamILS: An Automatic Algorithm Configuration Framework Frank Hutter Holger H. Hoos Kevin Leyton-Brown University
More informationMachine Learning for Constraint Solving
Machine Learning for Constraint Solving Alejandro Arbelaez, Youssef Hamadi, Michèle Sebag TAO, Univ. Paris-Sud Dagstuhl May 17th, 2011 Position of the problem Algorithms, the vision Software editor vision
More informationData Science meets Optimization. ODS 2017 EURO Plenary Patrick De Causmaecker EWG/DSO EURO Working Group CODeS KU Leuven/ KULAK
Data Science meets Optimization ODS 2017 EURO Plenary Patrick De Causmaecker EWG/DSO EURO Working Group CODeS KU Leuven/ KULAK ODS 2017, Sorrento www.fourthparadigm.com Copyright 2009 Microsoft Corporation
More informationMassively Parallel Seesaw Search for MAX-SAT
Massively Parallel Seesaw Search for MAX-SAT Harshad Paradkar Rochester Institute of Technology hp7212@rit.edu Prof. Alan Kaminsky (Advisor) Rochester Institute of Technology ark@cs.rit.edu Abstract The
More informationHierarchical Hardness Models for SAT
Hierarchical Hardness Models for SAT Lin Xu, Holger H. Hoos, and Kevin Leyton-Brown University of British Columbia, 66 Main Mall, Vancouver BC, V6T Z, Canada {xulin7,hoos,kevinlb}@cs.ubc.ca Abstract. Empirical
More informationHandbook of Constraint Programming 245 Edited by F. Rossi, P. van Beek and T. Walsh c 2006 Elsevier All rights reserved
Handbook of Constraint Programming 245 Edited by F. Rossi, P. van Beek and T. Walsh c 2006 Elsevier All rights reserved Chapter 8 Local Search Methods Holger H. Hoos and Edward Tsang Local search is one
More informationParallel Algorithm Configuration
Parallel Algorithm Configuration Frank Hutter, Holger H. Hoos, and Kevin Leyton-Brown University of British Columbia, 366 Main Mall, Vancouver BC, V6T Z4, Canada {hutter,hoos,kevinlb}@cs.ubc.ca Abstract.
More informationInstance-Specific Algorithm Configuration
Instance-Specific Algorithm Configuration by Yuri Malitsky B.Sc., Cornell University, Ithaca, NY, 2007 M.Sc., Brown University, Providence, RI, 2009 A dissertation submitted in partial fulfillment of the
More informationAn Adaptive Noise Mechanism for WalkSAT
From: AAAI-02 Proceedings. Copyright 2002, AAAI (www.aaai.org). All rights reserved. An Adaptive Noise Mechanism for WalkSAT Holger H. Hoos University of British Columbia Computer Science Department 2366
More informationMetaheuristic Development Methodology. Fall 2009 Instructor: Dr. Masoud Yaghini
Metaheuristic Development Methodology Fall 2009 Instructor: Dr. Masoud Yaghini Phases and Steps Phases and Steps Phase 1: Understanding Problem Step 1: State the Problem Step 2: Review of Existing Solution
More informationTuning Algorithms for Tackling Large Instances: An Experimental Protocol.
Université Libre de Bruxelles Institut de Recherches Interdisciplinaires et de Développements en Intelligence Artificielle Tuning Algorithms for Tackling Large Instances: An Experimental Protocol. Franco
More informationarxiv: v1 [cs.ai] 6 Jan 2014
Under consideration for publication in Theory and Practice of Logic Programming 1 aspeed: Solver Scheduling via Answer Set Programming arxiv:1401.1024v1 [cs.ai] 6 Jan 2014 Holger Hoos Department of Computer
More informationEECS 219C: Computer-Aided Verification Boolean Satisfiability Solving. Sanjit A. Seshia EECS, UC Berkeley
EECS 219C: Computer-Aided Verification Boolean Satisfiability Solving Sanjit A. Seshia EECS, UC Berkeley Project Proposals Due Friday, February 13 on bcourses Will discuss project topics on Monday Instructions
More informationAn Empirical Study of Hoeffding Racing for Model Selection in k-nearest Neighbor Classification
An Empirical Study of Hoeffding Racing for Model Selection in k-nearest Neighbor Classification Flora Yu-Hui Yeh and Marcus Gallagher School of Information Technology and Electrical Engineering University
More informationStochastic Local Search Methods for Dynamic SAT an Initial Investigation
Stochastic Local Search Methods for Dynamic SAT an Initial Investigation Holger H. Hoos and Kevin O Neill Abstract. We introduce the dynamic SAT problem, a generalisation of the satisfiability problem
More informationUsing Machine Learning to Optimize Storage Systems
Using Machine Learning to Optimize Storage Systems Dr. Kiran Gunnam 1 Outline 1. Overview 2. Building Flash Models using Logistic Regression. 3. Storage Object classification 4. Storage Allocation recommendation
More informationReliability of Computational Experiments on Virtualised Hardware
Reliability of Computational Experiments on Virtualised Hardware Ian P. Gent and Lars Kotthoff {ipg,larsko}@cs.st-andrews.ac.uk University of St Andrews arxiv:1110.6288v1 [cs.dc] 28 Oct 2011 Abstract.
More informationCS227: Assignment 1 Report
1 CS227: Assignment 1 Report Lei Huang and Lawson Wong April 20, 2008 1 Introduction Propositional satisfiability (SAT) problems have been of great historical and practical significance in AI. Despite
More informationAlgorithm Selection via Ranking
Proceedings of the Twenty-Ninth AAAI Conference on Artificial Intelligence Algorithm Selection via Ranking Richard J. Oentaryo, Stephanus Daniel Handoko, and Hoong Chuin Lau Living Analytics Research Centre,
More informationAutomated Design of Metaheuristic Algorithms
Université Libre de Bruxelles Institut de Recherches Interdisciplinaires et de Développements en Intelligence Artificielle Automated Design of Metaheuristic Algorithms T. Stützle and M. López-Ibáñez IRIDIA
More informationCS264: Beyond Worst-Case Analysis Lecture #20: Application-Specific Algorithm Selection
CS264: Beyond Worst-Case Analysis Lecture #20: Application-Specific Algorithm Selection Tim Roughgarden March 16, 2017 1 Introduction A major theme of CS264 is to use theory to obtain accurate guidance
More informationBoosting Verification by Automatic Tuning of Decision Procedures
Boosting Verification by Automatic Tuning of Decision Procedures Frank Hutter, Domagoj Babić, Holger H. Hoos, and Alan J. Hu Department of Computer Science, University of British Columbia {hutter, babic,
More informationThe MAX-SAX Problems
STOCHASTIC LOCAL SEARCH FOUNDATION AND APPLICATION MAX-SAT & MAX-CSP Presented by: Wei-Lwun Lu 1 The MAX-SAX Problems MAX-SAT is the optimization variant of SAT. Unweighted MAX-SAT: Finds a variable assignment
More informationSimple mechanisms for escaping from local optima:
The methods we have seen so far are iterative improvement methods, that is, they get stuck in local optima. Simple mechanisms for escaping from local optima: I Restart: re-initialise search whenever a
More informationEvolving Variable-Ordering Heuristics for Constrained Optimisation
Griffith Research Online https://research-repository.griffith.edu.au Evolving Variable-Ordering Heuristics for Constrained Optimisation Author Bain, Stuart, Thornton, John, Sattar, Abdul Published 2005
More informationStochastic greedy local search Chapter 7
Stochastic greedy local search Chapter 7 ICS-275 Winter 2016 Example: 8-queen problem Main elements Choose a full assignment and iteratively improve it towards a solution Requires a cost function: number
More informationWarmstarting of Model-Based Algorithm Configuration
The Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18) Warmstarting of Model-Based Algorithm Configuration Marius Lindauer, Frank Hutter University of Freiburg {lindauer,fh}@cs.uni-freiburg.de
More informationPortfolios of Subgraph Isomorphism Algorithms
Portfolios of Subgraph Isomorphism Algorithms Lars Kotthoff 1, Ciaran McCreesh 2, and Christine Solnon 3 1 University of British Columbia, Vancouver, Canada 2 University of Glasgow, Glasgow, Scotland 3
More informationEECS 219C: Formal Methods Boolean Satisfiability Solving. Sanjit A. Seshia EECS, UC Berkeley
EECS 219C: Formal Methods Boolean Satisfiability Solving Sanjit A. Seshia EECS, UC Berkeley The Boolean Satisfiability Problem (SAT) Given: A Boolean formula F(x 1, x 2, x 3,, x n ) Can F evaluate to 1
More informationRecommender Systems New Approaches with Netflix Dataset
Recommender Systems New Approaches with Netflix Dataset Robert Bell Yehuda Koren AT&T Labs ICDM 2007 Presented by Matt Rodriguez Outline Overview of Recommender System Approaches which are Content based
More informationOn the Run-time Behaviour of Stochastic Local Search Algorithms for SAT
From: AAAI-99 Proceedings. Copyright 1999, AAAI (www.aaai.org). All rights reserved. On the Run-time Behaviour of Stochastic Local Search Algorithms for SAT Holger H. Hoos University of British Columbia
More informationWORKSHOP ON COMBINING CONSTRAINT SOLVING WITH MINING AND LEARNING
20TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE ECAI 2012 Proceedings WORKSHOP ON COMBINING CONSTRAINT SOLVING WITH MINING AND LEARNING CoCoMile 2012 August 27, 2012 Montpellier, France Workshop Organization
More informationUnderstanding Random SAT: Beyond the Clauses-to-Variables Ratio
Understanding Random SAT: Beyond the Clauses-to-Variables Ratio Eugene Nudelman 1, Kevin Leyton-Brown 2, Holger H. Hoos 2, Alex Devkar 1, and Yoav Shoham 1 1 Computer Science Department, Stanford University,Stanford,
More informationPredictive Analytics: Demystifying Current and Emerging Methodologies. Tom Kolde, FCAS, MAAA Linda Brobeck, FCAS, MAAA
Predictive Analytics: Demystifying Current and Emerging Methodologies Tom Kolde, FCAS, MAAA Linda Brobeck, FCAS, MAAA May 18, 2017 About the Presenters Tom Kolde, FCAS, MAAA Consulting Actuary Chicago,
More informationa local optimum is encountered in such a way that further improvement steps become possible.
Dynamic Local Search I Key Idea: Modify the evaluation function whenever a local optimum is encountered in such a way that further improvement steps become possible. I Associate penalty weights (penalties)
More informationn Informally: n How to form solutions n How to traverse the search space n Systematic: guarantee completeness
Advanced Search Applications: Combinatorial Optimization Scheduling Algorithms: Stochastic Local Search and others Analyses: Phase transitions, structural analysis, statistical models Combinatorial Problems
More informationAppendix: Generic PbO programming language extension
Holger H. Hoos: Programming by Optimization Appendix: Generic PbO programming language extension As explained in the main text, we propose three fundamental mechanisms to be covered by a generic PbO programming
More informationOn the impact of small-world on local search
On the impact of small-world on local search Andrea Roli andrea.roli@unibo.it DEIS Università degli Studi di Bologna Campus of Cesena p. 1 Motivation The impact of structure whatever it is on search algorithms
More informationLearning techniques for Automatic Algorithm Portfolio Selection
Learning techniques for Automatic Algorithm Portfolio Selection Alessio Guerri and Michela Milano 1 Abstract. The purpose of this paper is to show that a well known machine learning technique based on
More informationNote: In physical process (e.g., annealing of metals), perfect ground states are achieved by very slow lowering of temperature.
Simulated Annealing Key idea: Vary temperature parameter, i.e., probability of accepting worsening moves, in Probabilistic Iterative Improvement according to annealing schedule (aka cooling schedule).
More informationBig Data Methods. Chapter 5: Machine learning. Big Data Methods, Chapter 5, Slide 1
Big Data Methods Chapter 5: Machine learning Big Data Methods, Chapter 5, Slide 1 5.1 Introduction to machine learning What is machine learning? Concerned with the study and development of algorithms that
More informationData Knowledge and Optimization. Patrick De Causmaecker CODeS-imec Department of Computer Science KU Leuven/ KULAK
Data Knowledge and Optimization Patrick De Causmaecker CODeS-imec Department of Computer Science KU Leuven/ KULAK Whishes from West-Flanders, Belgium 10 rules for success in supply chain mngmnt & logistics
More informationPackage llama. R topics documented: July 11, Type Package
Type Package Package llama July 11, 2018 Title Leveraging Learning to Automatically Manage Algorithms Version 0.9.2 Date 2018-07-11 Author Lars Kotthoff [aut,cre], Bernd Bischl [aut], Barry Hurley [ctb],
More informationCombinational Equivalence Checking Using Incremental SAT Solving, Output Ordering, and Resets
ASP-DAC 2007 Yokohama Combinational Equivalence Checking Using Incremental SAT Solving, Output ing, and Resets Stefan Disch Christoph Scholl Outline Motivation Preliminaries Our Approach Output ing Heuristics
More information