The Branch & Move algorithm: Improving Global Constraints Support by Local Search


Thierry Benoist
Bouygues e-lab, 1 av. Eugène Freyssinet, 78061 St Quentin en Yvelines Cedex, France
tbenoist@bouygues.com

Abstract. Most global constraints maintain a support for their filtering algorithm, namely a tuple consistent with both the constraint and the current domains. However, this highly valuable information is rarely used outside of the constraint. In this paper we propose a generic hybridization scheme that we tested on a real-world application in the field of TV advertisement. The principle of this Branch and Move approach is to use the support of the main global constraint of the problem as a guide for the branching strategy. The accuracy of this oracle is enhanced by local search improvements of this support tuple at each node of the search tree.

1. Introduction

Constraint propagation is based on infeasible values. The filtering algorithm of a constraint aims at removing infeasible values from the domains of its variables, i.e. values belonging to no tuple of the relation consistent with the current domains. When all such inconsistent values are detected, the algorithm is said to be complete. In the past few years, in order to perform more accurate filtering on rich n-ary constraints, several global filtering algorithms have been developed, usually based on polynomial algorithms from Operations Research. Most of these global constraints maintain a feasible tuple consistent with the current domains of the variables. When no such support tuple exists, the constraint fails (detects a contradiction); otherwise the tuple is used by the filtering algorithm. For instance, the reference matching of the AllDifferent constraint (Régin 1994) ensures the feasibility of the constraint, and the strongly connected component decomposition based on this matching yields complete filtering.

As more and more CP models are centred on a small number of global constraints (often one), the central role played by global constraints highlights the relevance of exploiting their support. For instance, constrained TSP, constrained knapsack and constrained flow problems can respectively be modelled with (in addition to problem-specific constraints) TSP constraints (Beldiceanu and Contejean 1994), knapsack constraints (Trick 2001; Fahle and Sellmann 2002) or flow constraints (Bockmayr et al. 2001; Benoist et al. 2002).

For such problems where a principal global constraint can be identified, we propose a support-guided branching scheme and an associated cooperation between Constraint Programming and Local Search. We name this procedure Branch and Move and expose it in section 3, after introducing some definitions in section 2. As an illustration of the support notion, the sketch below shows how a support tuple can be maintained for AllDifferent.
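The following Python sketch is not from the paper; it is a minimal illustration, in the spirit of Régin (1994), of how an AllDifferent constraint can maintain a support tuple as a maximum bipartite matching between variables and values (Kuhn's augmenting-path algorithm). The full filtering step (strongly connected component decomposition) is omitted.

    # Minimal sketch: a support tuple for AllDifferent is a matching assigning
    # each variable a distinct value from its domain; no matching = failure.

    def alldifferent_support(domains):
        """domains: list of sets of values. Returns a support tuple
        (one distinct value per variable) or None if the constraint fails."""
        match = {}  # value -> index of the variable currently using it

        def augment(i, seen):
            for v in domains[i]:
                if v in seen:
                    continue
                seen.add(v)
                # v is free, or its current owner can be re-matched elsewhere
                if v not in match or augment(match[v], seen):
                    match[v] = i
                    return True
            return False

        for i in range(len(domains)):
            if not augment(i, set()):
                return None  # no support tuple: the constraint fails
        support = [None] * len(domains)
        for v, i in match.items():
            support[i] = v
        return tuple(support)

    # Three variables with domains {0,1}, {0,1}, {1,2}: prints e.g. (0, 1, 2)
    print(alldifferent_support([{0, 1}, {0, 1}, {1, 2}]))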

2. Definitions

Given K an ordered set and D = D_1 × D_2 × … × D_n with D_i ⊆ K for all i ∈ [1,n] (the domains of the variables), we define the following objects:

1. Constraint¹: a constraint R is a relation on K^n (R ⊆ K^n).
2. Support set: the support set of relation R on D is supp(R,D) = R ∩ D. Omitting D, we write the support set of a constraint R as supp(R) = supp(R,D) with D equal to the current domains.
3. Support tuple: a support of R is a tuple x ∈ supp(R).
4. Constraint Satisfaction Problem: a constraint satisfaction problem on K is a triplet (n,D,P) where P = {R_1, R_2, …, R_m} is a collection of constraints on K^n.
5. Solutions: the solutions of P are sol(P) = R_1 ∩ R_2 ∩ … ∩ R_m ∩ D.
6. Potential: with δ a distance on K, we define the potential Δ(R,x) of constraint R with respect to a tuple x ∈ K^n as the L1 distance from x to the current support set of R:

   Δ(R,x) = min_{y ∈ supp(R)} Σ_{i=1..n} δ(x_i, y_i)   (1)

   Note that Δ(R,x) equals 0 if x ∈ supp(R), and +∞ if supp(R) = ∅. This potential literally measures the distance to feasibility of a tuple x for constraint R. For a binary constraint R involving two variables with domains D_1 and D_2, Δ(R,x) can be computed by enumeration in complexity O(|D_1|·|D_2|). Moreover, the potential of a linear constraint like X_i ≥ X_0 merely equals max(0, x_0 − x_i) (its slack). When an exact computation of Δ(R,x) would be too costly, the distance from x to any feasible tuple of R (one that is empirically close) is an upper bound of Δ(R,x).
7. Corrective decision: a corrective decision for a conflicting pair (R,x) (a pair with Δ(R,x) > 0) is a constraint A such that x ∉ A and A ∩ supp(R,D) ≠ ∅. In other words, it is a constraint discarding x whilst remaining consistent with R. A non-empty support for R is sufficient to ensure its existence.
8. Neighbourhood: a neighbourhood structure for R is a function N_{R,D} associating to each support tuple x of R a subset N_{R,D}(x) ⊆ supp(R,D) such that x ∈ N_{R,D}(x).
9. Selection: a function move defined on supp(R,D) is a selection function for neighbourhood N_{R,D} if and only if move(x) ∈ N_{R,D}(x).
10. Improvement: given a strict order < on K^n, move is an improving selection function if and only if for all x, either move(x) = x or move(x) < x. For instance the potential order <_P defined by x <_P y ⟺ Σ_{R ∈ P} Δ(R,x) < Σ_{R ∈ P} Δ(R,y) is a strict order on K^n. For optimization problems, a second criterion based on the objective function can be added.
11. Descent: a descent from x ∈ supp(R,D) is the iteration of an improving selection function move until a fixed point is reached²:

   descent(x): while (move(x) ≠ x) x := move(x); return x

¹ Without loss of generality we extend constraints of smaller arity to relations on K^n.
² Such a fixed point is necessarily reached since K^n is finite and <_P is a strict order.
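To make definitions 2, 6 and 11 concrete, here is a minimal Python sketch (not from the paper) in which relations are assumed to be explicitly enumerated as sets of tuples, which is only practical for small domains; δ is taken as |a − b|.

    from itertools import product

    def supp(R, domains):
        """Support set supp(R,D) = R ∩ D (definition 2)."""
        D = set(product(*domains))
        return R & D

    def potential(R, domains, x, delta=lambda a, b: abs(a - b)):
        """Δ(R,x): L1 distance from tuple x to supp(R,D) (definition 6)."""
        S = supp(R, domains)
        if x in S:
            return 0
        if not S:
            return float("inf")
        return min(sum(delta(xi, yi) for xi, yi in zip(x, y)) for y in S)

    def descent(x, move):
        """Iterate an improving selection function to a fixed point (def. 11)."""
        while move(x) != x:
            x = move(x)
        return x

    # Matches the worked example that follows:
    R = {(0, 0), (0, 1), (1, 0), (2, 2)}
    print(supp(R, [{0, 1}, {0, 1}]))               # {(0, 0), (0, 1), (1, 0)}
    print(potential(R, [{0, 1}, {0, 1}], (1, 1)))  # 1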

Example: consider relations on K^n = {0,1,2}², with D_1 = D_2 = {0,1} and R = {(0,0),(0,1),(1,0),(2,2)}. The support set of R is supp(R,D) = R ∩ (D_1 × D_2) = {(0,0),(0,1),(1,0)}, and (0,1) is a support of R (among the three possible ones). With R' = {(0,0),(1,1),(2,2)} a second relation on {0,1,2}², the solutions of problem P = {R,R'} are sol(P) = R ∩ R' ∩ (D_1 × D_2) = {(0,0)}. The potential of R with respect to tuple (1,1) is Δ(R,(1,1)) = 1, because the closest support is (0,1) ∈ supp(R,D) and δ((0,1),(1,1)) = 1. A possible corrective decision for the pair (R,(1,1)) is A = {(0,y) : y ∈ K}, since (1,1) ∉ A and A ∩ supp(R,D) = {(0,0),(0,1)} ≠ ∅. A possible neighbourhood for R is N_{R,D}: (x,y) ↦ {(x,0),(0,y),(x,y)}. For this simple neighbourhood, move: (x,y) ↦ (0,0) is a possible selection function, since (0,0) belongs to the neighbourhood of every tuple of supp(R,D). This selection function is strictly improving with respect to the potential order <_P, since (0,0) is the only solution of P.

3. Branching strategy and local search improvements

Let R_0 be the main constraint of a problem P and x an element of its support. If the potential of x with respect to all other constraints is null (∀R ∈ P, Δ(R,x) = 0) then x is a solution of P. Otherwise we select, among all additional constraints, the one with the highest potential, and we branch on an associated corrective decision³. More precisely, we select R maximizing Δ(R,x) and A a corrective decision for the pair (R,x). Then we successively consider the subproblems P ∧ A and P ∧ ¬A, with ¬A = K^n − A (the complementary decision).

In order to make this branching strategy more efficient, the global appropriateness of the support tuple x is improved by local search before each choice: according to a neighbourhood structure suitable for constraint R_0, a descent procedure is applied on x. The resulting Branch and Move algorithm reads as follows:

   solve(P)                                      // returns true if P is feasible, false otherwise
     if not propagate(P) return false            // constraint propagation; false in case of inconsistency
     else let x := descent(getSupport(R_0)),     // improvement of the support tuple of R_0 by local search
              R_max := argmax{Δ(R,x), R ∈ P} in  // selection of the most unhappy constraint w.r.t. x
       if Δ(R_max,x) = 0 return true             // if x satisfies all constraints it is a solution of P
       else let A := selectCorrective(R_max,x) in // selection of a corrective decision for R_max
         return solve(P ∧ A) or solve(P ∧ ¬A)    // recursive exploration

Fig. 1. The Branch and Move algorithm

³ This approach is directly inspired by what is done in a MIP branch and bound when a continuous solution is found: a violated integrality constraint is selected, and a decision is taken that forces the LP solver to modify this best continuous solution accordingly.
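Below is a minimal executable rendering of Fig. 1 in Python. It is a sketch only: the interface (propagate, get_support, select_corrective, add, negation) is an assumption standing in for whatever the host solver provides, not the paper's code.

    # Sketch of Fig. 1 under an assumed solver interface:
    # - P.propagate() filters domains, returns False on inconsistency;
    # - P.get_support(R0) returns the main constraint's support tuple;
    # - select_corrective(R, x) returns a decision A discarding x yet
    #   consistent with R; A.negation() is its complement.

    def branch_and_move(P, R0, descent, potential, select_corrective):
        if not P.propagate():
            return False                    # dead node: inconsistency detected
        x = descent(P.get_support(R0))      # improve R0's support by local search
        R_max = max(P.constraints, key=lambda R: potential(R, x))
        if potential(R_max, x) == 0:
            return True                     # x satisfies every constraint: solution
        A = select_corrective(R_max, x)     # decision discarding x, consistent with R_max
        # explore P with A, then P with its negation (complete enumeration)
        return (branch_and_move(P.add(A), R0, descent, potential, select_corrective)
                or branch_and_move(P.add(A.negation()), R0, descent, potential,
                                   select_corrective))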

It is important to note that the improvement of the support by local search is completely non-destructive: it has no impact at all on the current domains, since only the support tuple is modified. Instead of jumping to another node of equal depth, modifying already instantiated variables as in Incremental Local Optimization (Caseau and Laburthe 1999), we modify the values taken by the remaining variables in the support tuple. This absence of impact on the current domains keeps the search tree complete, whilst the embedded problem-specific local search algorithm avoids wasting time in a costly subtree searching for a solution lying just a few moves away from the current support tuple, or trying to ensure the satisfaction of constraints that are consistent with a nearby support tuple. The latter case points out that the improvement process helps to identify critical points: constraints remaining in conflict with the obtained support (resisting local moves) may be the most critical ones.

The Branch and Move algorithm can be compared to the Branch and Greed (Sourd and Chrétienne 1999) and Local Probing (Kamarainen and El Sakkout 2002) techniques, which are also based on heuristic computations at each node, searching for possible extensions of the current partial assignment. From a more general point of view, these algorithms belong to the family of CP/LS hybrids (Focacci et al. 2003; Focacci and Shaw 2002).

From a local search point of view, the whole process can be seen as a systematic exploration of the support set of the main constraint. In this context, the initial improvement process is a standard greedy descent starting from the support tuple. The local optimum escaping strategy, however, significantly differs from the tabu or simulated annealing paradigms. Instead of changing the solution by non-improving moves, the landscape itself is modified thanks to a corrective decision. The propagation of this decision removes many tuples from the landscape, including the current one. The filtering algorithm then computes a new support tuple close to the discarded one, and a new greedy descent is applied from this new starting point to reach the closest local optimum in this new, filtered landscape. Since these decisions are embedded in a CP search tree, a complete enumeration of possible landscapes is performed, yielding an exact local search algorithm. When such a complete enumeration would be too costly, the CP framework gives us the opportunity to use the well-tried partial tree search techniques developed in the last decade: restricted candidate lists, discrepancy-based search, credit search, barrier limit, etc. are likely to guide the enumeration through a diversified set of promising landscapes (Focacci et al. 2003). One such variant is sketched below.
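As one hedged illustration of such a partial search (an assumption, not the paper's method), the sketch of Fig. 1 above can be bounded in the spirit of limited discrepancy search: following the corrective decision costs nothing, contradicting it consumes one unit of a discrepancy budget.

    # Hypothetical discrepancy-limited variant of the sketch above (not from
    # the paper): nodes whose path contradicts more than `budget` corrective
    # decisions are pruned, keeping the search focused on the support's advice.

    def branch_and_move_lds(P, R0, descent, potential, select_corrective, budget):
        if budget < 0 or not P.propagate():
            return False
        x = descent(P.get_support(R0))
        R_max = max(P.constraints, key=lambda R: potential(R, x))
        if potential(R_max, x) == 0:
            return True
        A = select_corrective(R_max, x)
        # trusting the corrective decision is free; its negation costs one unit
        return (branch_and_move_lds(P.add(A), R0, descent, potential,
                                    select_corrective, budget)
                or branch_and_move_lds(P.add(A.negation()), R0, descent,
                                       potential, select_corrective, budget - 1))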

For a more comprehensive description of Branch and Move principles, practice (on the so-called TV-Break Packing Problem) and related work, we refer the reader to (Benoist and Bourreau 2003).

References

N. Beldiceanu and E. Contejean, 1994. Introducing global constraints in CHIP. Mathematical and Computer Modelling, 20(12):97-123.

T. Benoist and E. Bourreau, 2003. Improving global constraints support by local search. In COSOLV'03.

T. Benoist, E. Gaudin and B. Rottembourg, 2002. Constraint programming contribution to Benders decomposition: a case study. In CP-02, pages 603-617.

A. Bockmayr, N. Pisaruk and A. Aggoun, 2001. Network flow problems in constraint programming. In CP-01, pages 196-210.

Y. Caseau and F. Laburthe, 1999. Heuristics for large constrained vehicle routing problems. Journal of Heuristics, 5(3), Kluwer.

T. Fahle and M. Sellmann, 2002. Cost based filtering for the constrained knapsack problem. Annals of Operations Research, 115:73-94.

F. Focacci, F. Laburthe and A. Lodi, 2003. Local search and constraint programming. In Handbook of Metaheuristics, pages 369-403, Kluwer.

F. Focacci and P. Shaw, 2002. Pruning sub-optimal search branches using local search. In Proceedings of CP-AI-OR-02, pages 181-189.

O. Kamarainen and H. El Sakkout, 2002. Local probing applied to scheduling. In CP-02, pages 155-171.

U. Montanari, 1974. Networks of constraints: fundamental properties and applications to picture processing. Information Science, 7:95-132.

J.-C. Régin, 1994. A filtering algorithm for constraints of difference in CSPs. In AAAI-94, Twelfth National Conference on Artificial Intelligence, pages 362-367, Seattle, Washington.

F. Sourd and P. Chrétienne, 1999. Fiber-to-object assignment heuristics. European Journal of Operational Research, 117.

M. Trick, 2001. A dynamic programming approach for consistency and propagation for knapsack constraints. In Proceedings of CP-AI-OR-01, pages 113-124.