Learning Implied Global Constraints

Christian Bessiere (LIRMM-CNRS, U. Montpellier, France), Remi Coletta (LRD, Montpellier, France), Thierry Petit (LINA-CNRS, École des Mines de Nantes, France)

Abstract

Finding a constraint network that will be efficiently solved by a constraint solver requires strong expertise in Constraint Programming. Hence, there is an increasing interest in automatic reformulation. This paper presents a general framework for learning implied global constraints in a constraint network assumed to be provided by a non-expert user. The learned global constraints can then be added to the network to improve the solving process. We apply our technique to global cardinality constraints. Experiments show the significance of the approach.

1 Introduction

Encoding a problem as a constraint network (called a model) that will be solved efficiently by a constraint toolkit is a difficult task. It requires strong expertise in Constraint Programming (CP), and this is known as one of the main bottlenecks to the widespread use of CP technology [Puget, 2004]. CP experts often start by building a first model that expresses the problem correctly, without any other requirement. Then, if the time to solve this model is too high, they refine it by adding new constraints that do not change the set of solutions but that increase the propagation capabilities of the solver. In the Ilog Solver 5.0 user's manual, a Gcc constraint is added to improve the solving time of a graph coloring problem (page 279 in [Ilog, 2000]). In [Régin, 2001], new constraints are added in a sports league scheduling problem. Such constraints are called implied constraints [Smith et al., 1999]. The community has studied automatic ways of generating implied constraints from a model. Colton and Miguel [Colton and Miguel, 2001] exploit a theory formation program to provide concepts and theorems which can be translated into implied constraints. In Hnich et al. [Hnich et al., 2001], some implied linear constraints are derived by using an extension of the Prolog Equation Solving System [Sterling et al., 1989]. In all these frameworks the newly generated constraints are derived from structural properties of the original constraints. They do not depend on the domains of the variables.

This paper presents a framework based on ideas presented in [Bessiere et al., 2005]. It contains two original contributions to the automatic generation of implied constraints: first, it permits learning global constraints with parameters; second, the implied constraints are learned according to the actual domains of the variables in the model. Learning global constraints is important because global constraints are a key feature of constraint programming: they permit reducing the search space with efficient propagators. Global constraints generally involve parameters. For instance, consider the NValue(k, [x_1,..., x_n]) constraint. It holds if k is equal to the number of different values taken by x_1,..., x_n. Here k can be seen as the parameter, which leads to different sets of allowed tuples on the x_i's depending on the value it takes. The increase of expressiveness provided by the parameters increases the chances to find implied constraints in a network. A learning algorithm will try, for example, to learn the smallest range of possible values for k such that NValue is an implied constraint. The tighter the range for k, the more the learned constraint can reduce the search space. Many constraints from [Beldiceanu et al., 2005] may be used as implied constraints involving parameters.
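To make the NValue example concrete, the sketch below enumerates the solutions of a tiny, invented network and reads off the tightest range for the parameter k such that NValue stays implied. The toy domains and constraints are illustrative assumptions, not part of the paper's framework.

```python
from itertools import product

# Toy network (invented for illustration): domains {1,2,3} and x1 < x2, x2 <= x3.
domains = {"x1": [1, 2, 3], "x2": [1, 2, 3], "x3": [1, 2, 3]}
constraints = [lambda a: a["x1"] < a["x2"], lambda a: a["x2"] <= a["x3"]]

def solutions():
    """Enumerate all solutions of the toy network by brute force."""
    names = list(domains)
    for values in product(*(domains[n] for n in names)):
        a = dict(zip(names, values))
        if all(c(a) for c in constraints):
            yield a

# NValue(k, [x1, x2, x3]) holds iff k equals the number of distinct values taken
# by the variables.  The smallest range for the parameter k that keeps NValue
# implied is the range of distinct-value counts observed over all solutions.
counts = [len(set(sol.values())) for sol in solutions()]
print("k is implied to lie in", [min(counts), max(counts)])   # -> [2, 3]
```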
Learning implied constraints according to the actual domains is important because constraints learned that way take into account more than just the structure of the network and the constraints. They are closer to the real problem to be solved. For instance, if we know that X, Y and Z are 0/1 variables, we can derive X = Z from the set of constraints X ≠ Y, Y ≠ Z. Without knowledge of the domains, we cannot derive anything. Even an expert in CP has difficulties finding such constraints because they depend on complex interactions between domains and constraints.

2 Motivation Example

Let us focus on a simple problem where n tasks have to be scheduled. All tasks have the same duration d. Each task i requires a given amount m_i of resource. Steps of time are represented by integers starting at 0. At any time, the amount of resource used by all tasks is bounded by maxR. The starting times of any two tasks must be separated by at least two steps of time. Random precedence constraints between start times of tasks are added. The makespan (maximum time) is equal to 2(n−1)+d, the best possible makespan w.r.t. the constraint on starting times. The question is to find a feasible schedule.

We implemented a naive model with Choco [Choco, 2005]. The start time of task i is represented by a variable x_i with domain [0..2(n−1)] (because of the makespan), and the resource capacity is represented at each unit of time t by an array R[t][1..n] of variables, where R[t][i] = m_i if task i is active at time t, and R[t][i] = 0 otherwise. Constraints ensuring consistency between x_i and R[t][i] and constraints on separation of starting times are primitive ones. Sum constraints are used for expressing the resource capacity (Σ_{i=1..n} R[t][i] ≤ maxR). We place ourselves in the context of a non-expert user, so we run the default search heuristic of Choco (the variable with minimum domain first, assigned with its smallest available value). We put a cutoff at 60 seconds. Table 1 (left) reports results on instances where n = 4, d = 4 and n_j is the number of tasks requiring j resources. Only two precedence constraints are added. It shows that even on these small instances, our naive model can be hard to solve with standard solving techniques.

Table 1: Results on the naive model (left) and when an implied Gcc is added to the model (right), for n tasks of duration d among which n_j tasks use j resources. Columns: n / d / n_1 / n_2 / maxR, then #nodes / time (sec.) for each of the two models.

If we want to solve this problem with more tasks, we definitely need to improve the model. An expert user will see that the constraints on the starting times of tasks induce limits on the number of tasks that can start at each point of time.¹ The makespan is indeed equal to the minimal possible value for scheduling all the tasks if they are separated by at least two units. We therefore have to place tasks as soon as possible. This means that one task starts at time zero, one task at time 2, etc. This can be expressed explicitly with a global cardinality constraint (Gcc). Gcc([x_1,..., x_n], [p_{v_1},..., p_{v_k}]) holds iff |{i | x_i = v_j}| = p_{v_j} for all j. So, we add to our model a Gcc constraint that involves all starting times x_1,..., x_n and guarantees that the n first even values of time are taken at least once: Gcc([x_1,..., x_n], [p_0,..., p_{2(n−1)}]), where p_v ∈ 1..n if v is even, and p_v ∈ 0..n otherwise. As shown in Table 1 (right), adding this simple Gcc constraint to the naive model leads to significant improvements. This example shows that for a non-expert user, even very small problems can lead to bad performance of the solver. Adding simple implied constraints to the model can dramatically reduce the solving cost.

¹ The automatic generation of robust search heuristics is out of the scope of this paper.

3 Basic Definitions

A constraint network N is defined by a set X = {x_1,..., x_n} of variables, a set D = {D(x_1),..., D(x_n)} of domains of values for the variables in X, and a set C of constraints. A constraint c is defined on its scope X(c) ⊆ X. c specifies which combinations of values (or tuples) are allowed on the variables in X(c). A solution to N is an assignment of values from their domains to all variables in X s.t. all constraints in C are satisfied. sol(N) denotes the set of solutions of N. An implied constraint for a constraint network N is a constraint that does not change the solutions of N. That is, given the constraint network N = (X, D, C), the constraint c with X(c) ⊆ X is an implied constraint for N iff sol(N) = sol(N + c), where N + c denotes the network (X, D, C ∪ {c}).
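As a concrete reading of this definition, the toy derivation mentioned in the introduction (X = Z is implied by X ≠ Y, Y ≠ Z over 0/1 domains) can be checked by brute-force enumeration. This is only an illustrative sketch of the definition sol(N) = sol(N + c); it is of course not how a solver would proceed.

```python
from itertools import product

# The 0/1 toy network from the introduction: X != Y and Y != Z.
domains = {"X": [0, 1], "Y": [0, 1], "Z": [0, 1]}
network = [lambda a: a["X"] != a["Y"], lambda a: a["Y"] != a["Z"]]

def sol(constraints):
    """Brute-force solution set of a network over the domains above."""
    names = list(domains)
    out = set()
    for values in product(*(domains[n] for n in names)):
        a = dict(zip(names, values))
        if all(c(a) for c in constraints):
            out.add(values)
    return out

candidate = lambda a: a["X"] == a["Z"]               # constraint suspected to be implied
# Definition: c is implied in N iff sol(N) = sol(N + c).
print(sol(network) == sol(network + [candidate]))    # -> True, so X = Z is implied
```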
Global constraints are defined on any number of variables. For instance, alldifferent is a global constraint that can be instantiated on scopes of variables of any size. For most global constraints, the signature is not completely fixed. Some of their variables can be seen as parameters, that is, external variables that are not involved in any other constraint of the problem. For instance, the Gcc constraint Gcc([x_1,..., x_n], [p_{v_1},..., p_{v_k}]) involves the x_i's and the p_v's. The p_v's can be variables of the problem (as in [Quimper et al., 2003]), or parameters that take a value in a range (as in [Régin, 1996]). Another example is the Among constraint. Among([x_1,..., x_n], [v_1,..., v_k], N) holds iff |{i | ∃j ∈ 1..k, x_i = v_j}| = N. The v_j's and N can be parameters or variables [Beldiceanu and Contejean, 1994; Bessiere et al., 2006]. So, a global constraint C defined on a scheme scheme(C) can be specialized into different types of parametric constraints depending on which elements in scheme(C) are variables or parameters.

Definition 1 (Parametric constraint) Given a global constraint C defined on scheme(C), a parametric constraint derived from C is a global constraint C[P] s.t. P = {p_1,..., p_m} is the set of parameters of C[P], with P ⊆ scheme(C), and X(C[P]) is the scope of C[P], with X(C[P]) = scheme(C) \ P.

So, in addition to its variables, a constraint can be defined by a set P of parameters. The set of tuples that are allowed by the constraint depends on the values the parameters can take.

Definition 2 (Instance of a parametric constraint) Given a constraint network N = (X, D, C), an instance of a parametric constraint C[P] in N is a constraint C[P_S] s.t.: X(C[P_S]) = X(C[P]) ⊆ X, P ∩ X = ∅, S is a set of tuples on P, and C[P_S] is satisfied by an instantiation e on X(C[P]) iff there exists t ∈ S s.t. the tuple (e + t) on scheme(C) is a satisfying tuple for the global constraint C from which C[P] is derived.

Example 1 Let C[P] be the parametric constraint derived from Among([x_1,..., x_4], [v], N), with scope (x_1,..., x_4) and parameters P = (v, N). Let S = {(1, 3), (2, 1)} be a set of combinations for P. A tuple on (x_1,..., x_4) is accepted by the instance C[P_S] iff it has three occurrences of value 1 or one occurrence of value 2. Tuples (1, 1, 3, 1) and (1, 1, 2, 3) satisfy the constraint. Tuples (3, 2, 2, 2) and (2, 1, 1, 2) do not.

We assume here that a parametric constraint lets us specify which combinations of parameters are allowed. In practice, propagation algorithms often put restrictions on how parameters can be expressed. Given a parametric constraint C[P], different sets S and S' of tuples for the parameters lead to different constraints C[P_S] and C[P_S']. Hence, when adding an instance of a parametric constraint C[P] as an implied constraint in a network N, it is worth using a set S of tuples for the parameters which is as tight as possible while C[P_S] remains implied in N.
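The membership test of Definition 2 on Example 1 can be spelled out in a few lines. The sketch below is only an illustration of the semantics (it is not a propagator), and the helper names are invented.

```python
def among(xs, values, n):
    """Among([x1..xk], values, N): exactly N of the xs take a value in `values`."""
    return sum(1 for x in xs if x in values) == n

def satisfies_instance(e, S):
    """e satisfies C[P_S] iff some allowed parameter tuple t = (v, N) in S makes
    the underlying Among constraint true on (e + t) (Definition 2)."""
    return any(among(e, [v], n) for (v, n) in S)

S = {(1, 3), (2, 1)}                                  # allowed parameter combinations of Example 1
for e in [(1, 1, 3, 1), (1, 1, 2, 3), (3, 2, 2, 2), (2, 1, 1, 2)]:
    print(e, satisfies_instance(e, S))
# -> the first two tuples are accepted, the last two are rejected, as in Example 1
```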

4 Learning Implied Parametric Constraints

The objective is to learn a target set T, as small as possible, s.t. C[P_T] is still an implied constraint. Any tuple that is not necessary to accept some solution of N can be discarded from T. The tighter the learned constraint is, the more promising its filtering power is.

Definition 3 (Learning an implied parametric constraint) Given a constraint network N = (X, D, C) and a parametric constraint C[P] with P ∩ X = ∅ and X(C[P]) ⊆ X, learning an instance of the parametric constraint C[P] implied in N consists in finding a set S as tight as possible s.t. C[P_S] is implied in N. A set of tuples T s.t. C[P_T] is implied in N and there does not exist any S ⊊ T with C[P_S] implied in N is called a target set for C[P] in N.

We now give a general process to learn the parameters of implied global constraints. Let Π denote the set of all possible tuples of values for the parameters in P. Since our goal is to deal with any type of constraint network, we focus on global constraints C and sets of parameters P ⊆ scheme(C) s.t. C[P_Π] is the universal constraint. This means that if N contains the variables in scheme(C) \ P, then C[P_Π] is an implied constraint in N. The learning task is to find a small superset of a target set T (see Definition 3). Given a problem N = (X, D, C) and a parametric constraint C[P], we denote by:

ub_T a subset of Π s.t. C[P_{ub_T}] is an implied constraint;
lb_T a subset of ub_T s.t. for any t ∈ lb_T, C[P_{ub_T \ {t}}] is not an implied constraint.

In other words, lb_T represents those combinations of values for the parameters that are necessary in ub_T to preserve the set of solutions of N, and ub_T those for which we have not yet proved that we would not lose solutions without them. The following propositions give general conditions on how to incrementally tighten the bounds lb_T and ub_T while keeping the invariant lb_T ⊆ T ⊆ ub_T, where T is a target set.

Proposition 1 Let C[P_{ub_T}] be an implied constraint in a network N. If there exists S ⊊ ub_T s.t. sol(N) = sol(N + C[P_S]), then C[P_S] is an implied constraint in N. So ub_T can be replaced by S.

Testing equivalence of the sets of solutions of two networks is coNP-complete. This cannot be handled by classical constraint solvers. Hence, we have to relax the condition to one that can more easily be checked by a solver.

Corollary 1 Let C[P_{ub_T}] be an implied constraint in a network N. If there exists S ⊆ ub_T s.t. N + C[P_S] is inconsistent, then C[P_{ub_T \ S}] is an implied constraint in N. So ub_T can be replaced by ub_T \ S.

The weakness of this corollary is that we have no clue about which subsets S of ub_T to test for inconsistency. But we know that an implied constraint should not remove solutions when we add it to the network. Hence, solutions can help us.

Proposition 2 Let C[P_{ub_T}] be an implied constraint in a network N and e be a solution of N. For any S ⊆ ub_T s.t. C[P_S] is an implied constraint for N, there exists t in S s.t. e satisfies C[P_{{t}}].

If a given solution is accepted by a unique combination of values for the parameters, then this combination is necessary for having an implied constraint (i.e., it belongs to lb_T).

Corollary 2 Let C[P_{ub_T}] be an implied constraint in a network N and e be a solution of N. If there is a unique t in ub_T s.t. e satisfies C[P_{{t}}], then t can be put in lb_T.

By Corollaries 1 and 2 we can shrink the bounds of an initial set of possibilities for the parameters. Algorithm 1 performs a brute-force computation of a set ub_T s.t. C[P_{ub_T}] is an implied constraint for a network N. This algorithm works when no extra information is known about the parameters.
It uses Corollaries 1 and 2. The input is a parametric constraint and a constraint network (and optionally an initial upper bound UB on the target set). The output is a set ub_T of tuples for the parameters s.t. no solution of the problem is lost if we add C[P_{ub_T}] to the network.

Algorithm 1: Brute-force learning algorithm
  Input: C[P], N (optionally UB)
  Output: ub_T
  begin
      lb_T ← ∅; ub_T ← Π (or the tighter UB)
      while lb_T ≠ ub_T and not time-out do
          choose S ⊆ ub_T \ lb_T
1         if sol(N + C[P_S]) = ∅ then
              ub_T ← ub_T \ S                      /* Corollary 1 */
          else
              pick e in sol(N + C[P_S])
2             if there is a unique t in ub_T s.t. e satisfies C[P_{{t}}] then
                  lb_T ← lb_T ∪ {t}                /* Corollary 2 */
  end

It is highly predictable that Algorithm 1 can be inefficient. The space of possible combinations of values to explore is exponential in the number of parameters. In addition, checking whether N + C[P_S] is inconsistent (line 1) is NP-complete. Even if N in this learning phase is only a subpart of the whole problem, it is necessary to use some heuristics to improve the process. Fortunately, in practice, constraints have characteristics that can be exploited. The next section studies some of these properties.
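Before turning to those properties, here is a minimal Python reading of Algorithm 1. The helper `solve_with(S)`, which returns a solution of N + C[P_S] or None, the `satisfies` test, and the candidate-selection strategy are placeholders invented for illustration; the paper leaves all of them open.

```python
import time

def learn_ub(PI, solve_with, satisfies, timeout=10.0):
    """Brute-force learning of ub_T (a sketch of Algorithm 1).

    PI         -- iterable of all candidate parameter tuples (the set Π, or a tighter UB)
    solve_with -- solve_with(S) returns a solution e of N + C[P_S], or None if inconsistent
    satisfies  -- satisfies(e, t) tells whether solution e satisfies C[P_{t}]
    """
    ub_T, lb_T = set(PI), set()
    start = time.time()
    while lb_T != ub_T and time.time() - start < timeout:
        S = {next(iter(ub_T - lb_T))}        # naive choice: test one untested candidate tuple
        e = solve_with(S)
        if e is None:                        # line 1: N + C[P_S] is inconsistent
            ub_T -= S                        # Corollary 1
        else:
            accepting = [t for t in ub_T if satisfies(e, t)]
            if len(accepting) == 1:          # line 2: a unique accepting tuple
                lb_T.add(accepting[0])       # Corollary 2
    return ub_T
```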

5 Using Properties on the Constraints

For a given parametric constraint C[P], different representations of the parameters may exist. The complexity of the associated filtering algorithms generally differs according to the representation used. The most general case, studied in Section 4, consists in considering that the allowed tuples of parameters for C[P] are given in extension as a set S.

5.1 Partitioning parameters

In practice, many parametric constraints satisfy the following property: any two different combinations of values for the parameters correspond to disjoint sets of solutions on X(C[P]). For instance, this is the case for the Gcc([x_1,..., x_n], [p_{v_1},..., p_{v_k}]) constraint: each combination of values for the parameters p_{v_1},..., p_{v_k} imposes a fixed number of occurrences for each value v_1,..., v_k in x_1,..., x_n.

Definition 4 A parametric constraint C[P] is called parameter-partitioning iff for any t, t' in Π, if t ≠ t' then C[P_{{t}}] ∩ C[P_{{t'}}] = ∅. Given an instantiation e on X(C[P]), the unique tuple t on P s.t. e satisfies C[P_{{t}}] is denoted by t_e.

When a constraint is parameter-partitioning, we have the nice property that there is a unique target set.

Lemma 1 Given a constraint network N, if a parametric constraint C[P] is parameter-partitioning, then there exists a unique target set T for C[P] in N.

The next proposition tells us that when a constraint is parameter-partitioning, updating the lower bound on the target set is easier than in the general case.

Proposition 3 Let C[P_{ub_T}] be an implied constraint in a network N and e be a solution of N. If C[P] is a parameter-partitioning constraint, then the unique tuple t_e on P s.t. e satisfies C[P_{{t_e}}] can be put in lb_T.

This proposition is a first way to improve Algorithm 1. It allows a faster construction of lb_T because any solution of N contributes to the lower bound lb_T in line 2.

5.2 Parameters as sets of integers

As far as we know, existing parametric constraints are defined by sets of possible values for their parameters p_1,..., p_m taken separately. This means that the target set T must be a Cartesian product T(p_1) × ... × T(p_m), where T(p_i) is the target set of values for parameter p_i. In other words, T is derived from the sets of possible values of each parameter. This is less expressive than directly considering any subset of Π, but the learning process is easier to handle. Let lb_T(p_i) and ub_T(p_i) be the required and possible values in T(p_i). Corollaries 1 and 2 are specialized into Corollaries 3 and 4.

Corollary 3 Let C[P_{ub_T}] be an implied constraint in a constraint network N, with ub_T = ub_T(p_1) × ... × ub_T(p_m). Let p_i ∈ P, S_i ⊆ ub_T(p_i) \ lb_T(p_i), and S the Cartesian product in which S_i takes the place of ub_T(p_i) while every other parameter p_j keeps ub_T(p_j). If N + C[P_S] has no solution, then the values in S_i can be removed from ub_T(p_i).

This corollary says that if a parametric constraint instance C[P_S] is inconsistent with the network whereas all its parameters except one (p_i) can take all the values in their upper bound, then all the values in S_i can be discarded from ub_T(p_i).

Corollary 4 Let C[P_{ub_T}] be an implied constraint in a network N and e be a solution of N. Given a parameter p_i ∈ P and a value v ∈ ub_T(p_i), if we have t[p_i] = v for every tuple t in ub_T s.t. e satisfies C[P_{{t}}], then v can be put in lb_T(p_i).

As in Section 5.1, Algorithm 1 can be modified to deal with the specific case where parameters are represented as sets of integers. Corollary 4 tells us when a value must go in lb_T(p_i) for some parameter p_i. Corollary 3 tells us when a value can be removed from ub_T(p_i) for some parameter p_i.

5.3 Parameters as ranges

The possible values for a parameter p_i can be represented by a range of integers, that is, a special case of set where all values must be consecutive w.r.t. the total order on integers. The only possibility for modifying the sets lb_T(p_i) and ub_T(p_i) of a parameter p_i is to shrink their bounds min(lb_T(p_i)), max(lb_T(p_i)), min(ub_T(p_i)) and max(ub_T(p_i)). This case restricts even more the possible combinations of parameters, and simplifies the learning process. Corollaries 1 and 2 can be specialized as Corollaries 5 and 6.
Corollary 5 Let C[P_{ub_T}] be an implied constraint in a constraint network N, where parameters are represented as ranges. Let p_i ∈ P and v < min(lb_T(p_i)) (resp. v > max(lb_T(p_i))), and let S be the Cartesian product in which the half-line (−∞..v] (resp. [v..+∞)) takes the place of ub_T(p_i) while every other parameter p_j keeps ub_T(p_j). If N + C[P_S] has no solution, then min(ub_T(p_i)) > v (resp. max(ub_T(p_i)) < v).

Corollary 6 Let C[P_{ub_T}] be an implied constraint in a network N and e be a solution of N. Given a parameter p_i ∈ P and a value v ∈ ub_T(p_i), if we have t[p_i] = v for every tuple t in ub_T s.t. e satisfies C[P_{{t}}], then min(lb_T(p_i)) ≤ v ≤ max(lb_T(p_i)).

Thanks to Corollary 5, tightening the upper bounds for a given p_i is simply done by checking whether forcing it to take values smaller than the minimum (or greater than the maximum) value of its lower bound leads to an inconsistency in the network. If yes, we know that all these values smaller than the minimum of lb_T(p_i) (or greater than the maximum of lb_T(p_i)) can be removed from ub_T(p_i).

6 Making the Learning Process Tractable

One of the operations performed by the learning technique described in Sections 4 and 5 is a call to a solver to check whether the network containing a given instance of the parametric constraint is still consistent (line 1 in Algorithm 1). This is NP-complete. A key idea to tackle this problem is that implied constraints learned on a relaxation of the original network are still implied constraints on the original network.

Proposition 4 Given a network N = (X, D, C) and a network N' = (X, D, C'), if a constraint c is implied in N' and sol(N) ⊆ sol(N'), then c is implied in N.

Selecting a subset of the constraints. Thanks to Proposition 4, we can select any subset of the constraints of the network on which we want to learn an implied constraint, and the constraint learned on that subnetwork will be implied in the original network. In the example of Section 2, the Gcc constraint is posted only on the variables representing tasks and is an implied constraint on the subnetwork where neither the resource constraints nor the precedence constraints are taken into account. Thus, the underlying network was a simple one that could easily be solved.
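Putting Corollaries 5 and 6 and the subnetwork idea together, a range-based learner for the cardinalities of a Gcc could be organised as in the sketch below. The relaxed subnetwork is accessed only through a hypothetical `solve(restriction)` callback that returns one of its solutions satisfying an extra restriction (or None); the names and the simple linear scan are illustrative assumptions, not the paper's implementation, and the subnetwork is assumed satisfiable.

```python
def learn_gcc_ranges(variables, values, solve):
    """Learn cardinality ranges so that Gcc(variables, {v: lo_v..hi_v}) is implied
    in the relaxed subnetwork.

    solve(restriction) must return a solution (dict var -> value) of the relaxed
    subnetwork that also satisfies restriction(solution), or None if there is none.
    """
    n = len(variables)
    count = lambda sol, v: sum(1 for x in variables if sol[x] == v)
    e = solve(lambda sol: True)                           # one positive example to start from
    lb = {v: [count(e, v), count(e, v)] for v in values}  # Proposition 3 / Corollary 6
    ub = {v: [0, n] for v in values}

    for v in values:
        # Raise min(ub): try to force fewer than min(lb) occurrences of v (Corollary 5).
        while ub[v][0] < lb[v][0]:
            w = lb[v][0] - 1
            sol = solve(lambda s, v=v, w=w: count(s, v) <= w)
            if sol is None:
                ub[v][0] = lb[v][0]          # no solution below lb: lift the lower bound
            else:
                lb[v][0] = count(sol, v)     # a solution with fewer occurrences exists
        # Lower max(ub) symmetrically.
        while ub[v][1] > lb[v][1]:
            w = lb[v][1] + 1
            sol = solve(lambda s, v=v, w=w: count(s, v) >= w)
            if sol is None:
                ub[v][1] = lb[v][1]
            else:
                lb[v][1] = count(sol, v)
    return ub        # a Gcc with these cardinality ranges is implied in the subnetwork
```

In the Section 2 example, `solve` would only need to handle the starting-time separation constraints, since the Gcc is learned on the relaxed subnetwork that ignores the resource and precedence constraints.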

Optimization problems. In an optimization problem, the set of solutions is the set of instantiations that satisfy all the constraints and for which a cost function f has minimal (or maximal) value. If N = (X, D, C) is the network and f is the cost function, let N' = (X, D, C ∪ {c_UB}), where c_UB is a constraint accepting all tuples on X with cost below a fixed upper bound UB (we concentrate on minimization). If UB is greater than the minimal cost, then any implied constraint in N' accepts all solutions of N that are optimal w.r.t. f. The idea is thus to run a branch and bound on N for a fixed (short) amount of time. UB is set to the value of the best solution found. Then, the classical learning process is launched on the network N + c_UB. Note that if the learned implied constraint improves the solving phase, it can permit to quickly find a bound UB' better than UB. Using this new bound UB', we can continue the learning process on N + c_UB' instead of N + c_UB. The learned implied constraint will be tighter because N + c_UB is a relaxation of N + c_UB'. We observe a nice cooperation between the learning process and the solving process.

Time limit. It can be the case that finding a subnetwork easy to solve is not possible. In this case, we have to find other ways to decrease the cost of the consistency test in line 1 of Algorithm 1. A simple way is to put a time limit on the consistency test. If the solver has not finished its search before the limit, ub_T is not updated and the algorithm goes to the next loop iteration.

7 Experiments

We evaluate our learning technique on the Gcc constraint (see Section 2). The Gcc global constraint is NP-hard to propagate when all elements in its scheme are variables [Quimper et al., 2003]. However, when the cardinalities of the values are parameters that take values in a range, efficient algorithms exist to propagate it [Régin, 1996; Quimper et al., 2004]. In addition, the Gcc constraint can express many different features on the cardinalities of the values. It is thus a good candidate for being added as an implied constraint. In all these experiments we learn a Gcc where the cardinalities are ranges of integers. The results of Sections 5.1 and 5.3 apply directly. We used the Choco constraint solver [Choco, 2005].

Satisfaction Problem. The first experiment was performed on the problem of Section 2. The learning algorithm was run on the subnetwork where resource constraints and precedence constraints are discarded. The learning is thus fast (300 to 800 milliseconds for n = 4 or n = 5). We fixed to 30 the limit on the number of consistency tests of the subnetwork with the added Gcc (line 1 in Algorithm 1). Table 2 shows the results on a model containing the learned implied Gcc. The first time number corresponds to the learning process and the second to the solving time.

Table 2: Learning an implied Gcc on the naive model of Section 2: time to learn and solving time, for n tasks of duration d among which n_j tasks use j resources. Columns: n / d / n_1 / n_2 / maxR, #nodes, learning (sec.) + solve (sec.).
Results in Table 2 are almost the same as the results in Table 1, where the implied Gcc was added by the expert user. This shows that a short run of the learning algorithm gives some robustness to the model w.r.t. the solving process: the problem is consistently easy to solve on all types of instances, whereas some of them could not be solved with the naive model of Section 2. Tables 3 and 4 show the results when we increase the number of tasks and their length (n = d = 5 instead of 4) and the number of random precedence constraints (4 instead of 2). We stopped the search at 60 seconds. We were also able to solve all problems of size n = 6 in a few seconds (including the learning time), while most of them are not solvable without the implied Gcc even after several minutes.

Table 3: Naive model with 5 tasks (n tasks of duration d among which n_j tasks use j resources). Columns: n / d / n_1 / n_2 / maxR, #nodes, solve (sec.).

Table 4: Learning an implied Gcc on the problems with 5 tasks: time to learn and solving time (n tasks of duration d among which n_j tasks use j resources). Columns: n / d / n_1 / n_2 / maxR, #nodes, learning (sec.) + solve (sec.).

This first experiment shows that when taking a naive model with a naive search strategy, our learning technique can improve the robustness of the model in terms of solving time. The effort asked of the user was just to have the intuition that there is maybe some hidden Gcc related to the ordering of tasks. This is a much lower effort than studying by hand the possible implied constraints for each number of tasks and durations n and d.

Optimization Problem. At the Institute of Technology of the University of Montpellier (IUT), n students each provide a totally ordered list (u_1,..., u_m) of the m projects they prefer: project u_i is strictly preferred to project u_{i+1}. The goal is to assign projects s.t. two students do not share a same project, while maximizing their satisfaction: a student is very satisfied if she obtains her first choice, less if she obtains the second one, etc. Obtaining no project of her list is the worst possible satisfaction. Additional constraints exist, such as a limit of 6 projects selected in the set of projects proposed by a teacher.

A first model was provided by students from that institute who are novices in constraint programming: students are variables, domains are their sets of preferred projects plus an extra value 0, which means that no project in the preferred list was obtained. Constraints (x_i ≠ x_j) ∨ (x_i = x_j = 0) between each pair (x_i, x_j) of students express mutual exclusion on projects. Each student has a preference variable measuring her satisfaction. Preference variables are linked to student variables by table constraints {(u_1, 0),..., (u_m, m−1), (0, m)}. The objective is to minimize the sum of the preference variables. This model is referred to as the initial model.

We wish to learn an implied Gcc on the student variables. The solver is first launched on the initial model for a short time (1 sec.) to get a first solution S_0 and an upper bound on the quality of solutions, as described in Section 6. We had only one real instance (with 60 students), so we derived from the real data some sub-instances with fewer students (preserving the ratio of the number of projects to the number of students and the distribution of the choices). The left-hand side of Table 5 shows the time and number of nodes with the initial model according to the number of students n. The right-hand side of Table 5 shows the results obtained when learning an implied Gcc on the student variables. We fixed to 10n the limit on the number of consistency tests (line 1 in Algorithm 1), with a cutoff of 0.5 seconds for each of them. Thus, the maximum learning time grows linearly with the size of the problem. The learning time includes the generation of an initial positive example. The results show a tremendous speed-up on the model refined with a learned implied Gcc compared to the initial model. This shows that learning implied global constraints in an optimization problem can greatly pay off when the model has been designed by a non-CP expert.

Table 5: Learning an implied Gcc on the projects optimization problem; n is the number of students. Columns: n, then #nodes and solve (sec.) for the initial model, and #nodes and learning (sec.) + solve (sec.) for the model with the implied constraint.

8 Conclusion

Global constraints are essential for improving the efficiency of solving constraint models. We proposed a generic framework for automatically learning implied global constraints with parameters. This is both the first approach that permits deriving implied global constraints and the first approach that learns implied constraints according to the actual domains, not just constraints derived from structural or syntactical properties. We applied the generic framework to special properties that global constraints often satisfy. Our experiments show that a very small effort spent learning implied constraints with our technique can dramatically improve the solving time.

References

[Beldiceanu and Contejean, 1994] N. Beldiceanu and E. Contejean. Introducing global constraints in CHIP. Mathl. Comput. Modelling, 20(12):97-123, 1994.

[Beldiceanu et al., 2005] N. Beldiceanu, M. Carlsson, and J.-X. Rampon. Global constraint catalog. SICS Technical Report, 2005.

[Bessiere et al., 2005] C. Bessiere, R. Coletta, and T. Petit. Acquiring parameters of implied global constraints. Proceedings CP, 2005.

[Bessiere et al., 2006] C. Bessiere, E. Hebrard, B. Hnich, Z. Kiziltan, and T. Walsh. Among, common and disjoint constraints. In CSCLP: Recent Advances in Constraints, volume 3978 of LNCS. Springer, 2006.

[Choco, 2005] Choco. A Java library for constraint satisfaction problems, constraint programming and explanation-based constraint solving. 2005.
[Colton and Miguel, 2001] S. Colton and I. Miguel. Constraint generation via automated theory formation. Proceedings CP, 2001.

[Hnich et al., 2001] B. Hnich, J. Richardson, and P. Flener. Towards automatic generation and evaluation of implied constraints. Uppsala University technical report, 2001.

[Ilog, 2000] Ilog. Solver 5.0 User's Manual. 2000.

[Puget, 2004] J.-F. Puget. Constraint programming next challenge: Simplicity of use. Proceedings CP, pages 5-8, 2004.

[Quimper et al., 2003] C.-G. Quimper, P. van Beek, A. López-Ortiz, A. Golynski, and S. B. Sadjad. An efficient bounds consistency algorithm for the global cardinality constraint. Proceedings CP, 2003.

[Quimper et al., 2004] C.-G. Quimper, A. López-Ortiz, P. van Beek, and A. Golynski. Improved algorithms for the global cardinality constraint. Proceedings CP, 2004.

[Régin, 1996] J.-C. Régin. Generalized arc consistency for global cardinality constraint. Proceedings AAAI, 1996.

[Régin, 2001] J.-C. Régin. Minimization of the number of breaks in sports scheduling problems using constraint programming. DIMACS Series in Discrete Mathematics and Theoretical Computer Science, 57, 2001.

[Smith et al., 1999] B. Smith, K. Stergiou, and T. Walsh. Modelling the Golomb ruler problem. In J.-C. Régin and W. Nuijten, editors, Proceedings IJCAI'99 Workshop on Non-binary Constraints, Stockholm, Sweden, 1999.

[Sterling et al., 1989] L. Sterling, A. Bundy, L. Byrd, R. O'Keefe, and B. Silver. Solving symbolic equations with PRESS. Journal of Symbolic Computation, pages 71-84, 1989.


More information

Query-driven Constraint Acquisition

Query-driven Constraint Acquisition Query-driven Constraint Acquisition Christian Bessiere LIRMM-CNRS U. Montpellier, France bessiere@lirmm.fr Remi Coletta LRD Montpellier, France coletta@l-rd.fr Barry O Sullivan 4C, Computer Science Dept.

More information

CS261: A Second Course in Algorithms Lecture #16: The Traveling Salesman Problem

CS261: A Second Course in Algorithms Lecture #16: The Traveling Salesman Problem CS61: A Second Course in Algorithms Lecture #16: The Traveling Salesman Problem Tim Roughgarden February 5, 016 1 The Traveling Salesman Problem (TSP) In this lecture we study a famous computational problem,

More information

Theorem 2.9: nearest addition algorithm

Theorem 2.9: nearest addition algorithm There are severe limits on our ability to compute near-optimal tours It is NP-complete to decide whether a given undirected =(,)has a Hamiltonian cycle An approximation algorithm for the TSP can be used

More information

Arc Consistency for Dynamic CSPs

Arc Consistency for Dynamic CSPs Arc Consistency for Dynamic CSPs Malek Mouhoub mouhoubm@cs.uregina.ca Department of Computer Science, University of Regina 3737 Waskana Parkway, Regina SK, Canada, S4S 0A2 ABSTRACT Constraint Satisfaction

More information

foldr CS 5010 Program Design Paradigms Lesson 5.4

foldr CS 5010 Program Design Paradigms Lesson 5.4 oldr CS 5010 Program Design Paradigms Lesson 5.4 Mitchell Wand, 2012-2014 This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License. 1 Introduction In this lesson,

More information

(67686) Mathematical Foundations of AI July 30, Lecture 11

(67686) Mathematical Foundations of AI July 30, Lecture 11 (67686) Mathematical Foundations of AI July 30, 2008 Lecturer: Ariel D. Procaccia Lecture 11 Scribe: Michael Zuckerman and Na ama Zohary 1 Cooperative Games N = {1,...,n} is the set of players (agents).

More information

Some Applications of Graph Bandwidth to Constraint Satisfaction Problems

Some Applications of Graph Bandwidth to Constraint Satisfaction Problems Some Applications of Graph Bandwidth to Constraint Satisfaction Problems Ramin Zabih Computer Science Department Stanford University Stanford, California 94305 Abstract Bandwidth is a fundamental concept

More information

A Constraint Seeker: Finding and Ranking Global Constraints from Examples

A Constraint Seeker: Finding and Ranking Global Constraints from Examples A Constraint Seeker: Finding and Ranking Global Constraints from Examples Nicolas Beldiceanu 1 and Helmut Simonis 2 1 TASC team (INRIA/CNRS), Mines de Nantes, France Nicolas.Beldiceanu@mines-nantes.fr

More information

A New Algorithm for Singleton Arc Consistency

A New Algorithm for Singleton Arc Consistency A New Algorithm for Singleton Arc Consistency Roman Barták, Radek Erben Charles University, Institute for Theoretical Computer Science Malostranské nám. 2/25, 118 Praha 1, Czech Republic bartak@kti.mff.cuni.cz,

More information

Ad-hoc Workflow: Problems and Solutions

Ad-hoc Workflow: Problems and Solutions Ad-hoc Worklow: Problems and Solutions M. Voorhoeve and W. van der Aalst Dept. o Mathematics and Computing Science Eindhoven University o Technology Eindhoven, The Netherlands, 5600MB Abstract The paper

More information

Improved algorithm for finding (a,b)-super solutions

Improved algorithm for finding (a,b)-super solutions Improved algorithm for finding (a,b)-super solutions Emmanuel Hebrard National ICT Australia and University of New South Wales Sydney, Australia. ehebrard@cse.unsw.edu.au Brahim Hnich Cork Constraint Computation

More information

Binary Morphological Model in Refining Local Fitting Active Contour in Segmenting Weak/Missing Edges

Binary Morphological Model in Refining Local Fitting Active Contour in Segmenting Weak/Missing Edges 0 International Conerence on Advanced Computer Science Applications and Technologies Binary Morphological Model in Reining Local Fitting Active Contour in Segmenting Weak/Missing Edges Norshaliza Kamaruddin,

More information

Formal Model. Figure 1: The target concept T is a subset of the concept S = [0, 1]. The search agent needs to search S for a point in T.

Formal Model. Figure 1: The target concept T is a subset of the concept S = [0, 1]. The search agent needs to search S for a point in T. Although this paper analyzes shaping with respect to its benefits on search problems, the reader should recognize that shaping is often intimately related to reinforcement learning. The objective in reinforcement

More information

Binary Encodings of Non-binary Constraint Satisfaction Problems: Algorithms and Experimental Results

Binary Encodings of Non-binary Constraint Satisfaction Problems: Algorithms and Experimental Results Journal of Artificial Intelligence Research 24 (2005) 641-684 Submitted 04/05; published 11/05 Binary Encodings of Non-binary Constraint Satisfaction Problems: Algorithms and Experimental Results Nikolaos

More information

Computation with No Memory, and Rearrangeable Multicast Networks

Computation with No Memory, and Rearrangeable Multicast Networks Discrete Mathematics and Theoretical Computer Science DMTCS vol. 16:1, 2014, 121 142 Computation with No Memory, and Rearrangeable Multicast Networks Serge Burckel 1 Emeric Gioan 2 Emmanuel Thomé 3 1 ERMIT,

More information

The Complexity of Reasoning with Global Constraints

The Complexity of Reasoning with Global Constraints Constraints DOI 10.1007/s10601-006-9007-3 The Complexity of Reasoning with Global Constraints Christian Bessiere Emmanuel Hebrard Brahim Hnich Toby Walsh Springer Science + Business Media, LLC 2006 Abstract

More information

Finite Domain Bounds Consistency Revisited

Finite Domain Bounds Consistency Revisited Finite Domain Bounds Consistency Revisited C.W. Choi 1, W. Harvey 2, J.H.M. Lee 1, and P.J. Stuckey 3 1 Department of Computer Science and Engineering, The Chinese University of Hong Kong, Shatin, N.T.,

More information

Connecting Definition and Use? Tiger Semantic Analysis. Symbol Tables. Symbol Tables (cont d)

Connecting Definition and Use? Tiger Semantic Analysis. Symbol Tables. Symbol Tables (cont d) Tiger source program Tiger Semantic Analysis lexical analyzer report all lexical errors token get next token parser construct variable deinitions to their uses report all syntactic errors absyn checks

More information

Objective Function Designing Led by User Preferences Acquisition

Objective Function Designing Led by User Preferences Acquisition Objective Function Designing Led by User Preerences Acquisition Patrick Taillandier and Julien Gauri Abstract Many real world problems can be deined as optimisation problems in which the aim is to maximise

More information

PLANAR GRAPH BIPARTIZATION IN LINEAR TIME

PLANAR GRAPH BIPARTIZATION IN LINEAR TIME PLANAR GRAPH BIPARTIZATION IN LINEAR TIME SAMUEL FIORINI, NADIA HARDY, BRUCE REED, AND ADRIAN VETTA Abstract. For each constant k, we present a linear time algorithm that, given a planar graph G, either

More information