Solving the Maximum Satisfiability Problem Using an Evolutionary Local Search Algorithm
The International Arab Journal of Information Technology, Vol. 2, No. 2, April 2005

Mohamed El Bachir Menai 1 and Mohamed Batouche 2
1 Artificial Intelligence Laboratory, University of Paris 8, France
2 Computer Science Department, University Mentouri of Constantine, Algeria

Abstract: The MAXimum propositional SATisfiability problem (MAXSAT) is a well-known NP-hard optimization problem with many theoretical and practical applications in artificial intelligence and mathematical logic. Heuristic local search algorithms are widely recognized as the most effective approaches to solving it. However, their performance depends on both their complexity and their tuning parameters, which are controlled experimentally and remain difficult to set. Extremal Optimization (EO) is one of the simplest heuristic methods, with only one free parameter, and has proved competitive with more elaborate general-purpose methods on graph partitioning and coloring. It is inspired by the dynamics of physical systems with emergent complexity and their ability to self-organize toward an optimal adaptation state. In this paper, we propose an extremal optimization procedure for MAXSAT and assess its effectiveness through computational experiments on a benchmark of random instances. Comparative tests showed that this procedure significantly improves on previous results obtained on the same benchmark with other modern local search methods such as WSAT, simulated annealing and Tabu Search (TS).

Keywords: Constraint satisfaction, MAXSAT, heuristic local search, extremal optimization.

Received February 29, 2004; accepted June 30, 2004.

1. Introduction

An expression is satisfiable if it is true under some interpretation. The SATisfiability problem (SAT) involves finding a truth assignment that satisfies a CNF formula in propositional logic. There are two main approaches to SAT.
One class of solutions uses a systematic approach: each possible assignment of truth values is tested until a solution is found. The other involves randomly testing assignments, and is called the stochastic approach. The first approach is guaranteed to find a solution if one exists, but in the worst case it can be very inefficient. The second approach is more effective for practically solving large SAT instances, but it cannot be used to prove that a given problem instance is unsatisfiable. The class of decision problems whose solutions can be verified in polynomial time is called Nondeterministic Polynomial (NP), and its hardest problems are called NP-complete; for these, all known algorithms require exponential time in the worst case to obtain optimal solutions. SAT is the archetypal NP-complete problem: all problems in NP can be polynomially reduced to it (the concept of polynomial-time reducibility) [8]. Indeed, many problems in various areas such as artificial intelligence, computer-aided design and databases involve the solution of SAT instances or its variants. An extension of SAT is the MAXimum SATisfiability problem (MAXSAT), which consists of satisfying the maximum number of clauses of a propositional formula. MAXSAT is of considerable interest from both the theoretical and the practical side. It plays an important role in the characterization of different approximation classes such as Polynomial Time Approximation Algorithm (PTAA) and Polynomial Time Approximation Scheme (PTAS) [34], and many optimization problems can be formulated in this form of satisfiability. MAXSAT is known to be NP-hard (an optimization problem whose related decision version is NP-complete) even when each clause contains exactly two literals (MAX2SAT) [9, 17]. Since finding an exact solution to this problem requires exponential time, approximation algorithms that find near-optimal solutions in polynomial time appear to be viable.
Developing efficient algorithms and heuristics for MAXSAT can then lead to general approaches for solving combinatorial optimization problems. Extremal Optimization (EO) was recently introduced as a heuristic search method [5] for approximating solutions to hard optimization problems. It is inspired by the Bak-Sneppen model of biological evolution and natural selection [1], in which the high degree of adaptation of most species emerges from the dynamics through selection against the extremely bad ones [5]. The present work was encouraged by the high quality results obtained with EO on graph coloring [6] and graph partitioning problems [5, 7]. This method has been proved competitive and even
better than more elaborate general-purpose heuristics such as simulated annealing [19] or Tabu Search (TS) [13, 14] on these hard optimization problems. In this paper, we extend the EO method to solve MAXSAT problems and compare it to the local search methods most frequently used for this problem, namely simulated annealing [32], TS [21] and the well-known procedure WSAT [29]. This paper is organized as follows. Section 2 presents the MAXSAT problem and related methods to solve it. Section 3 presents the Extremal Optimization method. Its adaptation to handle MAXSAT problems and its implementation are discussed in section 4. Experimental results and the conclusion follow in sections 5 and 6, respectively.

2. MAXSAT Problem

Formally, SAT is defined as follows. Given a set of n variables X = {x1, x2, ..., xn}, the set of literals over X is L = {x, ¬x | x ∈ X}. A clause C on X is a disjunction of literals. An assignment of Boolean variables is a substitution of these variables by a vector v ∈ {0, 1}^n. A clause C is satisfied by an assignment v if the value of the clause equals 1 (C(v) = 1); otherwise the clause is unsatisfied (C(v) = 0). Given m clauses C1, C2, ..., Cm, the SAT problem asks to determine an assignment v0 ∈ {0, 1}^n that satisfies the Boolean formula in conjunctive normal form (CNF) C1 ∧ C2 ∧ ... ∧ Cm, or to derive its infeasibility. A weighted formula is a pair WF = (CF, W), where CF = (Ci)_{i ≤ m} is a clause form and W = (wi)_{i ≤ m} is an integer vector; for each i ≤ m, wi is the weight of the clause Ci. An assignment v ∈ {0, 1}^n determines a weight value in the weighted formula WF as:

  wc(CF, W, v) = Σ_{i : Ci(v) = 1} wi   (1)

The MAXSAT problem asks to determine an assignment v0 ∈ {0, 1}^n that maximizes the sum of the weights of satisfied clauses:

  wc(CF, W, v0) = max { wc(CF, W, v) | v ∈ {0, 1}^n }   (2)

MAXkSAT is the subset of MAXSAT instances in which each clause has exactly k literals.
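To make the definitions above concrete, here is a minimal Python sketch (the helper names eval_clause and weighted_count are ours, not from the paper); clauses are encoded as lists of signed, 1-indexed literals:

```python
def eval_clause(clause, v):
    """A clause is a list of ints: +i means x_i, -i means NOT x_i (1-indexed).
    Returns C(v): 1 if some literal is true under assignment v, else 0."""
    return int(any(v[abs(l) - 1] == (1 if l > 0 else 0) for l in clause))

def weighted_count(clauses, weights, v):
    """wc(CF, W, v): total weight of the clauses satisfied by assignment v."""
    return sum(w for c, w in zip(clauses, weights) if eval_clause(c, v) == 1)

# Example: (x1 OR NOT x2) AND (x2 OR x3), unit weights, v = (1, 0, 0):
# the first clause is satisfied, the second is not, so wc = 1.
print(weighted_count([[1, -2], [2, 3]], [1, 1], [1, 0, 0]))  # -> 1
```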
If each weight is equal to one, we call the problem unweighted MAXSAT; otherwise we speak of weighted MAXSAT, or simply MAXSAT. Note that SAT is a special case of MAXSAT in which all clauses have unit weight and one wants to decide if there is a truth assignment of total weight m. MAXSAT algorithms can be classified as complete or incomplete, depending on whether they can determine the optimal assignment. Complete algorithms can determine optimal assignments, but they are computationally expensive and are not appropriate for solving large instances. These include the Davis-Putnam procedure [10], and various heuristics using different branching rules and reinforcement techniques such as equivalence reasoning [20] or backbone detection [11]. Incomplete methods are usually faster and can solve some large problems of thousands of variables that complete methods cannot handle. In this context, stochastic local search has become an important general-purpose method for solving satisfiability problems: It starts with a random initial solution and tries to improve it by moving to neighbouring solutions. It can be trapped in poor local minima or on plateaus, and it therefore requires a strategy both to escape from these local minima and to guide the search toward good solutions. Its performance lies in the choice of the initial solution and the transformation applied to the current solution. Many stochastic local search methods have been designed, such as simulated annealing [32], TS [21], the Greedy Randomized Adaptive Search Procedure (GRASP) [26], the Discrete Lagrangian based search Method (DLM) [31] and Reactive search [3]. Although the list of competitive heuristics is not exhaustive, the most established one is Greedy SATisfiability (GSAT), defined initially for SAT [18, 27, 28, 30] and applied next to MAXSAT. It begins with a randomly generated initial assignment and, at each iteration, flips the variable that yields the largest decrease in the number of unsatisfied clauses.
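GSAT's greedy move can be sketched as follows (an illustrative re-implementation, not the original code; the name gsat_flip is ours):

```python
def satisfied(clause, v):
    # clause: list of signed 1-indexed literals; v: list of booleans
    return any(v[abs(l) - 1] == (l > 0) for l in clause)

def num_unsat(clauses, v):
    return sum(not satisfied(c, v) for c in clauses)

def gsat_flip(clauses, v):
    """One GSAT step: flip the variable whose flip leaves the fewest
    unsatisfied clauses (ties broken by lowest index here)."""
    best_i, best_score = None, None
    for i in range(len(v)):
        v[i] = not v[i]                    # try flipping variable i
        score = num_unsat(clauses, v)
        v[i] = not v[i]                    # undo the trial flip
        if best_score is None or score < best_score:
            best_i, best_score = i, score
    v[best_i] = not v[best_i]              # commit the best flip
    return v, best_score

# (x1 OR x2) AND (NOT x1 OR x2) AND (NOT x2), starting from all-false:
v, unsat = gsat_flip([[1, 2], [-1, 2], [-2]], [False, False])
```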
It also accepts moves which either leave the objective function value unchanged or increase it. This process is repeated until a maximum number of non-improving moves is reached. Different noise strategies to escape from local optima have been added to GSAT, such as WSAT [29], Novelty and R-Novelty [22], and UnitWalk [15]. The WalkSAT (WSAT) procedure [29] is a particularly powerful variant of GSAT. It mainly consists of a random walk on the unsatisfied clauses. The experimental results obtained by Jiang et al. [16] with an adapted version of WSAT on MAXSAT encodings of the Steiner tree problem showed that this approach is competitive with the best current specialized Steiner tree algorithms. In the TS method [13, 14], a memory of forbidden moves called the tabu list is used to prevent the search process from revisiting previously found solutions. At each iteration, a configuration once visited is made tabu for a given number of iterations. Different history-based heuristics have been proposed to intensify and diversify the search into previously unexplored regions of the search space, using information collected during the previous phase. HSAT [12] introduces a tie-breaking rule into GSAT so that if several moves produce the same best result, the preferred move is the one that has not been applied for the longest period. The results obtained on some SAT benchmark tasks show better performance with respect to WSAT. TSAT is a Tabu search procedure for SAT problems [21] that has
also been shown to be competitive with WSAT on hard random 3-SAT instances. Simulated annealing is a method inspired by natural systems [19]. It emulates the behaviour of frustrated physical systems in thermal equilibrium: A state of minimum energy may be attained by cooling the system slowly according to a temperature schedule. The simulated annealing local search method moves through the space of configurations according to the Metropolis algorithm [24], driving the system toward equilibrium dynamics. An experimental comparison between GSAT and simulated annealing [32] on hard satisfiability problems indicated that simulated annealing satisfied at least as many clauses as GSAT; however, simulated annealing requires careful tuning of the temperature schedule parameters. Selman et al. [27] report that they were unable to find a cooling schedule that outperformed GSAT.

3. Extremal Optimization Method

The concept of Self-Organized Criticality (SOC) was introduced by Bak et al. in 1987 [2] to describe open, dissipative, spatially extended systems, such as biological evolution, earthquakes and solar flares, that spontaneously reach a critical state characterised by a power-law distribution of event sizes [33]. The Bak-Sneppen model [1] is a SOC model which was proposed to describe the self-organization phenomenon in the biological evolution of species. In this model, each species has an associated value between 0 and 1 called its fitness, and a selection process against the extremely bad ones is applied. At each iteration, the species with the smallest fitness value has its fitness replaced by a random value, which also impacts the fitness of interconnected species. After a sufficient number of steps, the system reaches a highly correlated SOC state in which all species have reached a fitness of optimal adaptation.
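One update of the Bak-Sneppen model can be sketched as follows (an illustrative sketch; the function name and the choice of a ring of species, whose two neighbours play the role of the "interconnected" species, are our assumptions):

```python
import random

def bak_sneppen_step(fitness):
    """One Bak-Sneppen update on a ring of species: the least-fit species
    and its two neighbours all receive new random fitness values in [0, 1)."""
    n = len(fitness)
    worst = min(range(n), key=fitness.__getitem__)
    for i in (worst - 1, worst, worst + 1):
        fitness[i % n] = random.random()
    return fitness

species = [random.random() for _ in range(50)]
for _ in range(10000):
    bak_sneppen_step(species)
# After many steps most fitness values lie above a critical threshold:
# the self-organized critical state described in the text.
```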
This coevolutionary process has been converted into an optimization algorithm called Extremal Optimization (EO), introduced by Boettcher and Percus [5]. The basic structure of the EO algorithm proceeds as follows. Consider a system described by a set of N variables xi, each with an associated fitness value (or individual cost) [5]:

1. Choose at random an initial state of the system.
2. Rank each variable xi of the system according to its fitness value.
3. Update the variable with the smallest fitness according to some move class.
4. Repeat from (2) a preset number of times.

It consists of updating the extremely undesirable variables of a sub-optimal solution, replacing them by random values to ensure efficient exploration of many local minima. The rank ordering allows EO to maintain well-adapted pieces of a solution, while updating an unfit variable gives the system enough flexibility to explore various space configurations. In addition, EO gives no consideration to the outcome of a move. A general modification of EO [5, 6], denoted τ-EO, consists of ranking all variables from rank n = 1 for the worst fitness to rank n = N for the best fitness. For a given value of τ, a power-law probability distribution over the rank order is considered:

  P(n) ∝ n^(−τ)   (1 ≤ n ≤ N)   (3)

At each update, a rank k is selected according to P(k) and the state of the variable xk is changed. The worst variable (with rank 1) will be chosen most frequently, while the better ones will sometimes be updated. In this way, a bias against the worst variables is maintained and no rank is completely excluded from the selection process. However, the search performance depends on the parameter τ. For τ = 0, the algorithm is simply a random walk through the search space, while for too large values of τ the process approaches a deterministic local search where only a small number of variables with particularly bad fitness would be chosen at each iteration.
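The power-law rank selection of equation (3) can be sketched as follows (our illustrative code, not the authors'; the helper optimal_tau encodes the estimate τ ≈ 1 + ln(A/ln N)/ln N established by Boettcher and Percus for a run time t_max = A·N with 1 ≪ A ≪ N):

```python
import math
import random

def sample_rank(n_ranks, tau, rng=random):
    """Draw a rank k in {1, ..., N} with probability proportional to k^(-tau):
    rank 1 (the worst variable) is chosen most often, but no rank is excluded."""
    weights = [k ** (-tau) for k in range(1, n_ranks + 1)]
    r = rng.random() * sum(weights)
    for k, w in enumerate(weights, start=1):
        r -= w
        if r <= 0:
            return k
    return n_ranks

def optimal_tau(n_vars, a):
    # tau ~ 1 + ln(A / ln N) / ln N, valid for 1 << A << N
    return 1.0 + math.log(a / math.log(n_vars)) / math.log(n_vars)

print(round(optimal_tau(300, 100), 2))  # -> 1.5
```

With MaxSteps = 100n as in the experiments of section 5 (i.e., A = 100), this estimate gives roughly 1.67 for n = 100, 1.50 for n = 300 and 1.45 for n = 500, close to the experimentally observed optima.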
The optimal value of τ lies at a point where τ is large enough to descend into local minima while remaining small enough not to get trapped in the basin of any single local minimum [4]. Boettcher and Percus [4, 7] have established a relation between τ, the run time t_max and the number of variables N of the system to estimate the optimal value of τ. Let t_max = A·N, where A is a constant; then:

  τ ~ 1 + ln(A / ln N) / ln N   (1 ≪ A ≪ N)   (4)

At this optimal value, the best-fitness variables are not completely excluded from the selection process; hence more space configurations can be reached, and the greatest performance can be obtained. It may appear that the strategy of EO is similar to an ineffective random search, but in fact, by persistent selection against the worst fitness, the algorithm frequently returns to near-optimal configurations. A disadvantage of EO lies in the haziness of the definition of a variable's individual fitness. Also, in highly connected systems, EO is slowed down significantly by the fitness re-evaluation process at each iteration. However, these disadvantages may be negligible for problems that have a natural choice of fitness function and low variable connectivity, like satisfiability problems. EO complements approximation methods inspired by equilibrium statistical physics, like simulated annealing [19]: where EO fluctuates to take the system far from the equilibrium state [5, 6], simulated annealing drives the system to equilibrium dynamics. The experimental results presented by Boettcher and Percus [6, 7] compare EO with simulated annealing on
benchmarks of the Graph Partitioning (GP) problem and the Travelling Salesman Problem (TSP). For the GP problem, EO appears to be more successful over a large range of graphs, while simulated annealing is preferable for highly connected graphs. In the Euclidean TSP case, EO results trail those of simulated annealing; but in the non-Euclidean TSP case, EO outperforms simulated annealing.

4. Extremal Optimization for the MAXSAT Problem

Given a MAXSAT problem instance of n Boolean variables and m weighted clauses, the definition of the fitness λi of a variable xi depends on its relation with the other variables [4]. In a previous study [23] we defined λi as the sum of the weights of the clauses satisfied by the variable xi over the total sum of weights. The results obtained with such a definition were not very satisfying, because a lowly connected variable is deemed unfit and keeps being updated even when it satisfies all of its clauses. For example, given an unweighted MAXSAT problem of 100 clauses (all weights equal to 1) and 20 variables, a variable that appears in 5 clauses and satisfies all of them will have a fitness of 5/100, while another variable which appears in all the clauses but satisfies only 10 of them will have a fitness of 10/100 and be considered better than the previous variable. Hence, the algorithm will keep updating a variable that has already satisfied all of its clauses. This waste of search time explains the low quality of the solutions obtained with that definition. In this new implementation, we consider a more effective definition of a variable's fitness as the negation of the sum of the weights of the clauses unsatisfied by xi:

  λi = − Σ_{j : xi ∈ Cj and Cj(v) = 0} wj   (5)

The cost function C(S) is the sum of the individual cost contributions λi of each variable xi; C(S) has to be minimized in order to maximize the weight of satisfied clauses.
Then, the following equation holds:

  C(S) = − Σ_{i=1..n} λi   (6)

Procedure: EO-MAXSAT(WeightedClauses, MaxSteps, Tau)
1. S := a random truth assignment over the variables that appear in the clauses.
2. Sbest := S.
3. UnsatClauses := clauses not satisfied by S.
4. WeightUnsat := sum of the weights of UnsatClauses.
5. For k := 1 to MaxSteps do
   a. If S satisfies WeightedClauses then return (S).
   b. Evaluate λi for each xi in WeightedClauses according to equation (5).
   c. Rank the xi according to λi from the worst to the best.
   d. Select a rank j according to P(j) ∝ j^(−Tau).
   e. S' := S in which the value of xj is flipped.
   f. If C(S') < C(Sbest) then Sbest := S'.
   g. S := S'.
   h. Update (UnsatClauses, WeightUnsat).
   Endfor
6. Return (Sbest, C(Sbest), UnsatClauses, WeightUnsat).
End {EO-MAXSAT}

Figure 1. General structure of the procedure EO for MAXSAT.

The pseudocode of the procedure EO-MAXSAT (EO for MAXSAT) is described in Figure 1:

1. Lines 1-4: The search begins at a random truth assignment S, and the current best assignment, Sbest, is set to S. Further, the initial set of unsatisfied clauses and their total weight are calculated.
2. Line 5: A fixed number of steps, MaxSteps, is executed. Each step of the search corresponds to flipping the truth value assigned to a variable according to the EO strategy.
3. Lines b-c: The individual variable fitnesses are evaluated and sorted, using a hash table to accelerate the EO process.
4. Lines d-e: A variable is selected according to the power-law distribution (equation (3)) and its value is flipped.
5. Lines f-h: Sbest is set to the current assignment whenever it achieves the minimum cost so far (equation (6)). The current assignment, the set of unsatisfied clauses and their total weight are then maintained.
6. Line 6: The current best assignment is returned when no model can be found.

5. Experimental Results

We now present experimental results obtained with a prototype of the procedure EO-MAXSAT. We compare it to WSAT, simulated annealing, and TS. The code of WSAT was taken from the web site e.tar.
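The procedure of Figure 1 can be condensed into the following Python sketch (an illustrative re-implementation, not the authors' C code; for simplicity it recomputes the fitnesses at each step and minimizes the total unsatisfied weight directly):

```python
import random

def satisfied(clause, v):
    # clause: list of signed 1-indexed literals; v: list of booleans
    return any(v[abs(l) - 1] == (l > 0) for l in clause)

def eo_maxsat(clauses, weights, n_vars, max_steps, tau, seed=0):
    """EO for MAXSAT: lambda_i is the negated weight of the unsatisfied
    clauses containing x_i (equation (5)); ranks are drawn from the
    power law P(k) ~ k^(-tau) (equation (3))."""
    rng = random.Random(seed)
    v = [rng.random() < 0.5 for _ in range(n_vars)]
    best, best_cost = v[:], float("inf")
    rank_p = [k ** (-tau) for k in range(1, n_vars + 1)]
    for _ in range(max_steps):
        unsat = [(c, w) for c, w in zip(clauses, weights) if not satisfied(c, v)]
        cost = sum(w for _, w in unsat)  # total unsatisfied weight, minimized
        if cost < best_cost:
            best, best_cost = v[:], cost
        if cost == 0:
            break
        lam = [0.0] * n_vars             # lambda_i of equation (5)
        for c, w in unsat:
            for l in c:
                lam[abs(l) - 1] -= w
        order = sorted(range(n_vars), key=lam.__getitem__)  # worst first
        k = rng.choices(range(n_vars), weights=rank_p)[0]   # power-law rank
        v[order[k]] = not v[order[k]]                       # flip that variable
    return best, best_cost
```

A variable appearing in several unsatisfied clauses accumulates a more negative λ, so it is ranked worse and flipped more often, which is exactly the extremal selection bias described above.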
A version of TS was developed by us, while for simulated annealing we refer to the experimental results presented in [29]. The algorithms were coded in the C language and run on a Pentium III computer (450 MHz clock, 512 MB RAM). The test suite consists of randomly generated MAX3SAT and MAX4SAT instances, mostly unsatisfiable [25]. They are simply generated and do not hold any hidden structure inherent to a specific problem. The generator of such instances is available at the URL given above. For random MAX3SAT, the instances considered are (100, 200), (100, 500), (100, 700), (300, 600), (300, 800), (300, 1500), (300, 2000) and (500, 5000), where each couple (n, m) means n
variables and m clauses. For random MAX4SAT, the instances considered are (100, 700), (300, 1500), (300, 2500) and (300, 3000). For each couple (n, m), 50 instances were randomly generated. In the following, we refer to MAX3SAT and MAX4SAT instances as U3-n:m and U4-n:m respectively. The first experiments investigate the optimal value of τ. The algorithm was run 10 times for each instance and the parameter MaxSteps was fixed to 100n. Figure 2 presents the average number of unsatisfied clauses for each instance while varying τ between 1.1 and 2. The results show that optimal performance was observed for τ ranging from 1.4 to 1.6 for all instances. For U3 instances, the optimal τ is 1.6 for n = 100 (Figure 2-a) and 1.5 for n = 300 or n = 500 (Figures 2-b and 2-c), while for U4 instances it is 1.5 for n = 100 and between 1.4 and 1.5 for n = 300 (Figures 2-d and 2-e). These numerical values are relatively close to those predicted by equation (4) for n = 100, n = 300 and n = 500. We note that equation (4) cannot be applied for long run times (A > n). Subsequently, we examined the effectiveness of the procedure EO-MAXSAT on the benchmark instances, where τ was set to the default value of 1.6 for n = 100 and to 1.5 for n = 300 or n = 500. Table 1 shows the average, minimum and maximum number of unsatisfied clauses for each instance obtained over 10 runs, and the maximum CPU time allowed for each run. Satisfying assignments were found for 5 out of the 6 known satisfiable problems (U3-100:200, U3-300:600, U3-300:800, U4-300:1500 and U4-300:2500). The good robustness of the algorithm is suggested by the empirical values obtained for the standard deviation.

Table 1. Average results over 10 runs achieved by EO-MAXSAT on MAX3SAT (U3-n:m) and MAX4SAT (U4-n:m) instances. Columns: Problem Identifier, Avg, Min, Max, Std. Dev., CPU secs. [table body not recoverable]
Table 2 reports the average number of unsatisfied clauses for the U3 and U4 instances. The column denoted simulated annealing is derived from [29]; it represents reported results obtained with simulated annealing on the same instances. The additional columns refer to our experiments with EO-MAXSAT, TS and WSAT, where the maximum allowed run times are the same as reported in Table 1. The tests show that EO-MAXSAT improves significantly upon the results obtained with simulated annealing and TS on all the benchmark instances. EO-MAXSAT was able to satisfy 5 out of the 6 satisfiable problems, while WSAT found models for all of them. However, EO-MAXSAT outperformed WSAT on 5 out of the 6 remaining unsatisfiable problems.

Table 2. Average number of unsatisfied clauses for the benchmark instances; the data for simulated annealing are derived from [29]. Columns: Problem Identifier, EO-MAXSAT, Simulated Annealing, TS, WSAT (p = 0.5). [table body not recoverable]

Next, we compare the variation of the average number of unsatisfied clauses as a function of the number of iterations for EO-MAXSAT and WSAT. Figure 3 shows the results obtained on two hard instances, U3-300:2000 and U3-500:5000. It is easy to observe that EO-MAXSAT converges faster to a local optimum than WSAT. For example, on U3-300:2000, assignments containing on average 44 violated clauses were obtained at the 400th iteration by EO-MAXSAT, but only after the 1000th iteration by WSAT. On the instance U3-500:5000, EO-MAXSAT was able to reduce the number of violated clauses to 175 after 1000 iterations, while WSAT reached this local optimum only after many more iterations. As a result, the average performance of EO-MAXSAT exceeds that of simulated annealing, TS and WSAT on the tested benchmarks.
Its high quality solutions can be explained as follows. On the one hand, the large fluctuations of EO allow the search to escape quickly from local minima and to explore other assignments in the search space; on the other hand, the extremal selection process obliges the search to visit near-optimal assignments frequently. Like WSAT, simulated annealing and TS, the procedure EO-MAXSAT is incomplete, which means that there is no guarantee that it can find a model of the problem if one exists. However, when a solution is found, we are sure that it is correct, which means that the procedure is sound. EO-MAXSAT is constructed around a local
search scheme like the WSAT family of algorithms. The main difference between EO and WSAT, simulated annealing and TS resides in the fact that EO makes moves using a fitness that is based not on the anticipated outcome but purely on the current state of each variable. Also, EO needs only one tuning parameter, while simulated annealing requires careful tuning of the temperature schedule parameters [19].

Figure 2. The effect of the parameter τ on the average number of unsatisfied clauses (log scale; panels a-c cover the U3 instances, panels d-e the U4 instances).

Figure 3. Average number of unsatisfied clauses as a function of the number of iterations (log scale) for EO-MAXSAT and WSAT on U3-300:2000 and U3-500:5000.

6. Conclusion

Extremal optimization is a surprisingly simple and powerful method for finding high quality solutions to hard optimization problems. It belongs to the family of stochastic local search algorithms such as WSAT, simulated annealing and TS. Its straightforward implementation and testing provide motivation to adapt it to various classes of these problems. In this paper, we have shown how to apply this method to the MAXSAT problem, one of the most important problems in the encoding-resolution paradigm. The derived procedure EO-MAXSAT is sound but incomplete.
It has been tested on random MAX3SAT and MAX4SAT problems. The experimental results demonstrate its superiority with respect to simulated annealing, TS and even WSAT on the considered instances. The high performance achieved is due to the flexibility maintained by the EO process to explore more space configurations while retaining well-adapted pieces of a solution. Its fast convergence to a near-optimal solution may be useful in practice, especially for large instances. We are currently performing additional tests on other SAT and MAXSAT instances from the DIMACS archive at the
URL /challenges/ and comparisons with other local search methods like DLM.

Acknowledgments

We are particularly indebted to Stefan Boettcher at Emory University for his valuable comments on our study. We thank Prof. B. Selman at Cornell University, who made his MAXSAT random generator software available. The comments of the anonymous referees are gratefully acknowledged.

References

[1] Bak P. and Sneppen K., "Punctuated Equilibrium and Criticality in a Simple Model of Evolution," Physical Review Letters, vol. 71.
[2] Bak P., Tang C., and Wiesenfeld K., "Self-Organized Criticality: An Explanation of 1/f-Noise," Physical Review Letters, vol. 86, no. 23.
[3] Battiti R. and Protasi M., "Reactive Search: A History-Sensitive Heuristic for MAX-SAT," ACM Journal of Experimental Algorithmics, vol. 2.
[4] Boettcher S. and Grigni M., "Jamming Model for the Extremal Optimization Heuristic," Journal of Physics A: Mathematical and General, vol. 35.
[5] Boettcher S. and Percus A. G., "Nature's Way of Optimizing," Artificial Intelligence, Elsevier Science, vol. 119.
[6] Boettcher S. and Percus A. G., "Optimization with Extremal Dynamics," Physical Review Letters, vol. 86, no. 23.
[7] Boettcher S. and Percus A. G., "Extremal Optimization for Graph Partitioning," Physical Review E, vol. 64, pp. 1-13.
[8] Cook S. A., "The Complexity of Theorem Proving Procedures," in Proceedings of the 3rd Annual ACM Symposium on the Theory of Computing.
[9] Crescenzi P. and Kann V., "How to Find the Best Approximation Results: A Follow-up to Garey and Johnson," ACM SIGACT News, vol. 29, no. 4.
[10] Davis M. and Putnam M., "A Computing Procedure for Quantification Theory," Journal of the ACM, vol. 7.
[11] Dubois O. and Dequen G., "A Backbone-Search Heuristic for Efficient Solving of Hard 3-SAT Formulae," in Proceedings of IJCAI'01, San Francisco, CA.
[12] Gent I.
and Walsh T., "Unsatisfied Variables in Local Search," in Hybrid Problems, Hybrid Solutions, Hallam J. (Ed), IOS Press, Amsterdam.
[13] Glover F., "Tabu Search: Part I," ORSA Journal on Computing, vol. 1, no. 3.
[14] Glover F., "Tabu Search: Part II," ORSA Journal on Computing, vol. 2, no. 1.
[15] Hirsch E. A. and Kojevnikov A., "UnitWalk: A New SAT Solver That Uses Local Search Guided by Unit Clause Elimination," in Proceedings of the 5th International Symposium on the Theory and Applications of Satisfiability Testing (SAT 2002), Cincinnati, Ohio, USA.
[16] Jiang Y., Kautz H. A., and Selman B., "Solving Problems with Hard and Soft Constraints Using a Stochastic Algorithm for MAX-SAT," in Proceedings of the 1st International Joint Workshop on Artificial Intelligence and Operations Research.
[17] Johnson D., "Approximation Algorithms for Combinatorial Problems," Journal of Computer and System Sciences, vol. 9.
[18] Kautz H. A. and Selman B., "Pushing the Envelope: Planning, Propositional Logic, and Stochastic Search," in Proceedings of AAAI'96, vol. 2, MIT Press.
[19] Kirkpatrick S., Gelatt C. D., and Vecchi M. P., "Optimization by Simulated Annealing," Science, vol. 220, no. 4598.
[20] Li C. M., "Integrating Equivalence Reasoning into Davis-Putnam Procedure," in Proceedings of AAAI'00.
[21] Mazure B., Sais L., and Gregoire E., "Tabu Search for SAT," in Proceedings of the 14th National Conference on Artificial Intelligence and 9th Innovative Applications of Artificial Intelligence Conference.
[22] McAllester D., Selman B., and Kautz H., "Evidence for Invariants in Local Search," in Proceedings of IJCAI-97.
[23] Menaï M. B. and Batouche M., "EO for MAXSAT," in Proceedings of the International Conference on Artificial Intelligence (IC-AI'02), Las Vegas, USA, vol. 3.
[24] Metropolis N., Rosenbluth A. W., Rosenbluth M. N., Teller A.
H., and Teller E., "Equation of State Calculations by Fast Computing Machines," Journal of Chemical Physics, vol. 21.
[25] Mitchell D., Selman B., and Levesque H. J., "Hard and Easy Distributions of SAT Problems," in Proceedings of the Tenth National Conference on Artificial Intelligence, AAAI, San Jose, CA.
[26] Resende M. G. C., Pitsoulis L. S., and Pardalos P. M., "Approximate Solution of Weighted MAX-SAT Problems Using GRASP," DIMACS Series on Discrete Mathematics and Theoretical
Computer Science, American Mathematical Society, vol. 35.
[27] Selman B. and Kautz H. A., "An Empirical Study of Greedy Local Search for Satisfiability Testing," in Proceedings of the 11th National Conference on Artificial Intelligence.
[28] Selman B. and Kautz H. A., "Domain Independent Extensions to GSAT: Solving Large Structured Satisfiability Problems," in Proceedings of IJCAI-93.
[29] Selman B., Kautz H. A., and Cohen B., "Noise Strategies for Improving Local Search," in Proceedings of the 12th National Conference on Artificial Intelligence.
[30] Selman B., Levesque H., and Mitchell D., "A New Method for Solving Hard Satisfiability Problems," in Proceedings of the 10th National Conference on Artificial Intelligence.
[31] Shang Y. and Wah B. W., "A Discrete Lagrangian-Based Global-Search Method for Solving Satisfiability Problems," Journal of Global Optimization, vol. 12.
[32] Spears W. M., "Simulated Annealing for Hard Satisfiability Problems," in Johnson D. S. and Trick M. A. (Eds), DIMACS Series in Discrete Mathematics and Theoretical Computer Science, American Mathematical Society, vol. 26.
[33] Vieira M. S., "Are Avalanches in Sandpiles a Chaotic Phenomenon?," Research Report.
[34] Yannakakis M., "On the Approximation of Maximum Satisfiability," Journal of Algorithms, vol. 17.

Mohamed El Bachir Menai is currently a researcher at the AI Laboratory of the University of Paris 8, France. From 1988 to 2003, he was a researcher and an assistant professor in the Computer Science Department of the University of Tebessa. He received his BSc in computing systems from the University of Annaba. His main interests are in the areas of problem complexity, heuristic search, evolutionary computation, and quantum computing.
Mohamed Batouche received his MSc and PhD degrees in computer science from the Institut National Polytechnique de Lorraine (INPL), France, in 1989 and 1993, respectively. Currently, he is a full-time professor at the University Mentouri of Constantine, Algeria. His research areas include artificial intelligence and pattern recognition.