A Simple ¾-Approximation Algorithm for MAX SAT


A Simple ¾-Approximation Algorithm for MAX SAT David P. Williamson Joint work with Matthias Poloczek (Cornell), Georg Schnitger (Frankfurt), and Anke van Zuylen (William & Mary)

Maximum Satisfiability. Input: n Boolean variables x_1, …, x_n; m clauses C_1, …, C_m with weights w_j ≥ 0; each clause is a disjunction of literals, e.g. C_1 = x_1 ∨ x_2 ∨ ¬x_3. Goal: a truth assignment to the variables that maximizes the total weight of the satisfied clauses.
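The objective is easy to state in code. A minimal sketch (the DIMACS-style integer encoding of literals and the function name are my own conventions, not from the talk):

```python
def satisfied_weight(clauses, weights, assignment):
    """Total weight of the clauses satisfied by a truth assignment.
    Literals are DIMACS-style integers: k stands for x_k, -k for its
    negation; `assignment` maps each variable index to a bool."""
    return sum(w for clause, w in zip(clauses, weights)
               if any((lit > 0) == assignment[abs(lit)] for lit in clause))

# C1 = x1 v x2 v not-x3 (weight 2), C2 = not-x1 (weight 1)
clauses, weights = [[1, 2, -3], [-1]], [2, 1]
print(satisfied_weight(clauses, weights, {1: True, 2: False, 3: True}))  # 2
```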

Approximation Algorithms. An α-approximation algorithm runs in polynomial time and returns a solution of value at least α times the optimal value. For a randomized algorithm, we ask that the expected value of its solution is at least α times the optimal value.

A ½-approximation algorithm. Set each x_i to true independently with probability ½. Then if l_j is the number of literals in clause j, the probability that clause j is satisfied is 1 − 2^(−l_j) ≥ ½, so the expected weight of satisfied clauses is at least ½ Σ_j w_j ≥ ½ OPT.
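The expectation can be computed clause by clause; a small sketch (the instance below is chosen only for illustration):

```python
def expected_weight_random(clauses, weights):
    """Expected weight satisfied when each variable is independently
    true with probability 1/2: a clause with l_j literals on distinct
    variables is satisfied with probability 1 - 2**(-l_j) >= 1/2."""
    return sum(w * (1.0 - 2.0 ** (-len(clause)))
               for clause, w in zip(clauses, weights))

# Four unit-weight clauses on two variables; E[W] = 4 * (3/4) = 3.0
clauses = [[1, 2], [-1, 2], [1, -2], [-1, -2]]
print(expected_weight_random(clauses, [1, 1, 1, 1]))  # 3.0
```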

What about a deterministic algorithm? Use the method of conditional expectations (Erdős and Selfridge '73, Spencer '87). If E[W | x_1 ← true] ≥ E[W | x_1 ← false], then set x_1 true, otherwise false. Similarly, if X_{i−1} is the event recording how the first i−1 variables are set, then if E[W | X_{i−1}, x_i ← true] ≥ E[W | X_{i−1}, x_i ← false], set x_i true. Show inductively that E[W | X_i] ≥ E[W] ≥ ½ OPT.
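The derandomization can be sketched directly: the conditional expectation is computable clause by clause, since Pr[C_j unsatisfied] is a product over its literals. (Function names and the literal encoding are my own; this is not code from the talk.)

```python
def cond_expected_weight(clauses, weights, fixed):
    """E[W] given a partial assignment `fixed` (var -> bool); each
    unset variable is true independently with probability 1/2."""
    total = 0.0
    for clause, w in zip(clauses, weights):
        p_unsat = 1.0
        for lit in clause:
            var = abs(lit)
            if var in fixed:
                if (lit > 0) == fixed[var]:   # literal satisfied
                    p_unsat = 0.0
                    break
                # literal falsified: factor 1, clause needs another literal
            else:
                p_unsat *= 0.5                # unset literal false w.p. 1/2
        total += w * (1.0 - p_unsat)
    return total

def derandomized_half(clauses, weights, n):
    """Method of conditional expectations: E[W] never decreases."""
    fixed = {}
    for i in range(1, n + 1):
        fixed[i] = (cond_expected_weight(clauses, weights, {**fixed, i: True})
                    >= cond_expected_weight(clauses, weights, {**fixed, i: False}))
    return fixed
```

Each step keeps the branch whose conditional expectation is at least the current one, so the final deterministic value is at least E[W] ≥ ½ Σ_j w_j.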

An LP relaxation

Nonlinear randomized rounding (Goemans, W '94). Pick any function f such that 1 − 4^(−x) ≤ f(x) ≤ 4^(x−1). Set x_i true with probability f(y_i), where y is an optimal LP solution.

Analysis

Integrality gap. The result is tight, since the LP solution z_1 = z_2 = z_3 = z_4 = 1 and y_1 = y_2 = ½ is feasible for the instance above (the four clauses x_1 ∨ x_2, ¬x_1 ∨ x_2, x_1 ∨ ¬x_2, ¬x_1 ∨ ¬x_2, each of weight 1), but OPT = 3. Chan, Lee, Raghavendra, Steurer (STOC '13) show that no polynomially sized LP can give a better integrality gap.

Current status. NP-hard to approximate better than 0.875 (Håstad '01). Combinatorial approximation algorithms: Johnson's algorithm (1974) is a simple ½-approximation algorithm (a greedy version of the randomized algorithm); improved analysis shows Johnson's algorithm is a ⅔-approximation [Chen, Friesen, Zheng '99; Engebretsen '04]; randomizing the variable order improves the guarantee slightly [Costello, Shapira, Tetali SODA '11]. Algorithms using linear or semidefinite programming: Yannakakis '94 and Goemans, W '94 give ¾-approximations; the best known guarantee is 0.7969 [Avidor, Berkovitch, Zwick '05]. Question [W '98]: Is it possible to obtain a ¾-approximation algorithm without solving a linear program?

(Selected) results. Poloczek, Schnitger (SODA '11): "randomized Johnson", a combinatorial ¾-approximation algorithm. Van Zuylen (WAOA '11): simplification of the randomized Johnson probabilities and analysis. Buchbinder, Feldman, Naor, and Schwartz (FOCS '12): another ¾-approximation algorithm for MAX SAT, as a special case of submodular function maximization. It can be shown that their MAX SAT algorithm is equivalent to van Zuylen's.

(Selected) results. Poloczek, Schnitger '11; Van Zuylen '11; Buchbinder, Feldman, Naor, and Schwartz '12. Common properties: they iteratively set the variables in an online fashion; the probability of setting x_i to true depends on the clauses containing x_i or ¬x_i that are still undecided.

Today. Give a textbook version of Buchbinder et al.'s algorithm with an even simpler analysis (Poloczek, van Zuylen, W, LATIN '14). Give a simple deterministic version of the algorithm (Poloczek, Schnitger, van Zuylen, W, manuscript). Give an experimental analysis showing that the algorithm works very well in practice (Poloczek, W, SEA 2016).

Buchbinder et al.'s approach. Keep two bounds on the solution: a lower bound LB = weight of clauses already satisfied, and an upper bound UB = weight of clauses not yet unsatisfied. Greedy can focus on maximizing either LB or UB, but either choice has bad examples. E.g.: x_1 ∨ x_2 (weight 1+ε), ¬x_1 (weight 1); or x_1 ∨ x_2 (weight 1+ε), ¬x_1 (weight ε), ¬x_2 (weight 1). Key idea: make choices to increase B = ½(LB + UB).

[Diagrams: B is shown on a number line running from LB_0 (= 0) to UB_0 (= Σ_j w_j), with B_0 = ½(LB_0 + UB_0) at the midpoint. Setting x_1 to true raises LB_0 to LB_1 by the weight of undecided clauses satisfied by x_1 = true, and lowers UB_0 to UB_1 by the weight of undecided clauses unsatisfied by x_1 = true, giving the new midpoint B_1; setting x_1 to false gives LB_1', UB_1', and B_1' analogously. The two choices are guaranteed to satisfy (B_1 − B_0) + (B_1' − B_0) ≥ 0; these two increments are named t_1 and f_1.]

In general, let t_i = B_i − B_{i−1} when x_i is set to true and f_i = B_i − B_{i−1} when x_i is set to false, where each is ½ times (the weight of undecided clauses satisfied by that setting minus the weight of undecided clauses unsatisfied by it). It is guaranteed that t_i + f_i ≥ 0, since any clause killed by one setting of x_i is satisfied by the other. Algorithm: if t_i < 0, set x_i to false; if f_i < 0, set x_i to true; else, set x_i to true with probability t_i/(t_i + f_i). Remark: this is the algorithm proposed independently by BFNS '12 and vZ '11.
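This greedy rule is short enough to sketch in full. A minimal implementation of the rule just described, assuming DIMACS-style literals and my own clause-state bookkeeping (not code from the talk):

```python
import random

def randomized_greedy(clauses, weights, rng=None):
    """Set x_1, ..., x_n one at a time: if t_i < 0 set x_i false, if
    f_i < 0 set x_i true, else set x_i true w.p. t_i / (t_i + f_i).
    Literals are DIMACS-style integers (k for x_k, -k for its negation)."""
    rng = rng or random.Random(0)
    n = max(abs(lit) for c in clauses for lit in c)
    remaining = [set(c) for c in clauses]   # literals on still-unset variables
    state = ["alive"] * len(clauses)        # alive / sat / dead
    assignment = {}

    def half_delta(i, value):
        """t_i (value=True) or f_i (value=False): the change in B = (LB+UB)/2."""
        good = i if value else -i
        d = 0.0
        for j, lits in enumerate(remaining):
            if state[j] == "alive":
                if good in lits:
                    d += 0.5 * weights[j]   # clause becomes satisfied: LB rises
                elif lits == {-good}:
                    d -= 0.5 * weights[j]   # last literal falsified: UB drops
        return d

    for i in range(1, n + 1):
        t, f = half_delta(i, True), half_delta(i, False)
        if t < 0:
            value = False
        elif f < 0:
            value = True
        else:
            value = rng.random() < (t / (t + f) if t + f > 0 else 0.5)
        assignment[i] = value
        good = i if value else -i
        for j, lits in enumerate(remaining):
            if state[j] == "alive":
                if good in lits:
                    state[j] = "sat"
                else:
                    lits.discard(-good)
                    if not lits:
                        state[j] = "dead"

    return assignment, sum(w for w, s in zip(weights, state) if s == "sat")
```

On the instance ¬x_1 (weight 2), x_1 ∨ x_2 (weight 1), ¬x_2 ∨ x_3 (weight 3), every run sets x_1 false and collects weight 5 or 6, matching the worked example.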

Example. Clauses: ¬x_1 (weight 2), x_1 ∨ x_2 (weight 1), ¬x_2 ∨ x_3 (weight 3). Initialize: LB = 0, UB = 6. Step 1: t_1 = ½(ΔLB + ΔUB) = ½(1 + (−2)) = −½; f_1 = ½(ΔLB + ΔUB) = ½(2 + 0) = 1. Set x_1 to false.

Example (continued). Step 2: t_2 = ½(ΔLB + ΔUB) = ½(1 + 0) = ½; f_2 = ½(ΔLB + ΔUB) = ½(3 + (−1)) = 1. Set x_2 to true with probability 1/3 and to false with probability 2/3.

Example (continued). Algorithm's solution: x_1 = false; x_2 = true w.p. 1/3 and false w.p. 2/3; x_3 = true. Expected weight of satisfied clauses: 5⅓.

Relating Algorithm to Optimum. Let x*_1, x*_2, …, x*_n be an optimal truth assignment. Let OPT_i = weight of clauses satisfied by setting x_1, …, x_i as the algorithm does, and x_{i+1} = x*_{i+1}, …, x_n = x*_n. Key Lemma: E[B_i − B_{i−1}] ≥ E[OPT_{i−1} − OPT_i].

[Diagram: the number line from LB_0 to UB_0 with B_0 at the midpoint and OPT (= OPT_0) marked above it; as variables are set, the Key Lemma says the expected increase in B_i at least offsets the expected decrease from OPT_{i−1} to OPT_i.]

Note OPT_n = B_n = weight of ALG's solution, and B_0 = ½(LB_0 + UB_0) = ½ Σ_j w_j ≥ ½ OPT. Summing the Key Lemma over i gives E[B_n − B_0] ≥ E[OPT_0 − OPT_n] = OPT − E[B_n]. Conclusion: the expected weight of ALG's solution is E[B_n] ≥ B_0 + ½(OPT − B_0) = ½(OPT + B_0) ≥ ¾ OPT.

Relating Algorithm to Optimum. Suppose x*_i = true. Want to show: if the algorithm sets x_i to true, then B_i − B_{i−1} = t_i and OPT_{i−1} − OPT_i = 0; if the algorithm sets x_i to false, then B_i − B_{i−1} = f_i and OPT_{i−1} − OPT_i ≤ (LB_i^t − LB_{i−1}) + (UB_i^t − UB_{i−1}) = 2(B_i^t − B_{i−1}) = 2t_i, where the superscript t denotes the values that would result from setting x_i to true. Together these give the Key Lemma: E[B_i − B_{i−1}] ≥ E[OPT_{i−1} − OPT_i].

Relating Algorithm to Optimum. Want to show the Key Lemma: E[B_i − B_{i−1}] ≥ E[OPT_{i−1} − OPT_i]. Know (for x*_i = true): if the algorithm sets x_i to true, B_i − B_{i−1} = t_i and OPT_{i−1} − OPT_i = 0; if it sets x_i to false, B_i − B_{i−1} = f_i and OPT_{i−1} − OPT_i ≤ 2t_i. Case 1: f_i < 0 (algorithm sets x_i to true): E[B_i − B_{i−1}] = t_i > 0 = E[OPT_{i−1} − OPT_i], since t_i + f_i ≥ 0 forces t_i > 0. Case 2: t_i < 0 (algorithm sets x_i to false): E[B_i − B_{i−1}] = f_i > 0 > 2t_i ≥ E[OPT_{i−1} − OPT_i].

Relating Algorithm to Optimum. Case 3: t_i ≥ 0, f_i ≥ 0 (algorithm sets x_i to true w.p. t_i/(t_i + f_i)): E[B_i − B_{i−1}] = t_i · t_i/(t_i + f_i) + f_i · f_i/(t_i + f_i) = (t_i² + f_i²)/(t_i + f_i), while E[OPT_{i−1} − OPT_i] ≤ 0 · t_i/(t_i + f_i) + 2t_i · f_i/(t_i + f_i) = 2t_i f_i/(t_i + f_i). Since t_i² + f_i² = (t_i − f_i)² + 2t_i f_i ≥ 2t_i f_i, the Key Lemma follows.
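The heart of Case 3 is the elementary inequality (t² + f²)/(t + f) ≥ 2tf/(t + f) for t, f ≥ 0 with t + f > 0, which is just (t − f)² ≥ 0; a quick numeric spot-check (sample values of my choosing):

```python
from itertools import product

def expected_gain(t, f):       # E[B_i - B_{i-1}] in Case 3
    return (t * t + f * f) / (t + f)

def opt_loss_bound(t, f):      # upper bound on E[OPT_{i-1} - OPT_i]
    return 2 * t * f / (t + f)

checks = [(t, f) for t, f in product([0.0, 0.1, 0.5, 1.0, 2.5], repeat=2)
          if t + f > 0]
assert all(expected_gain(t, f) >= opt_loss_bound(t, f) for t, f in checks)
print("Case 3 inequality holds on all", len(checks), "sample pairs")
```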

Question Is there a simple combinatorial deterministic ¾-approximation algorithm?

Deterministic variant? Greedily maximizing B_i is not good enough. Clauses: x_1 (weight 1), ¬x_1 ∨ x_2 (weight 2+ε), x_2 (weight 1), ¬x_2 ∨ x_3 (weight 2+ε), …, x_{n−1} (weight 1), ¬x_{n−1} ∨ x_n (weight 2+ε). The optimal assignment sets all variables to true: OPT = (n−1)(3+ε). Greedily increasing B_i sets variables x_1, …, x_{n−1} to false: GREEDY = (n−1)(2+ε).

A negative result. Poloczek (ESA '11): no deterministic priority algorithm can be a ¾-approximation algorithm, using the scheme introduced by Borodin, Nielsen, and Rackoff '03. Such an algorithm makes one pass over the variables and sets them; it only looks at the weights of clauses in which the current variable appears positively and negatively (not at the other variables in such clauses); and it is restricted in the information used to choose the next variable to set.

But... it is possible with a two-pass algorithm (thanks to Ola Svensson). First pass: set the variables x_i fractionally (i.e., to the probability that x_i is true), so that E[W] ≥ ¾ OPT. Second pass: use the method of conditional expectations to get a deterministic solution of value at least as much.

Buchbinder et al.'s approach, fractionally. Keep two bounds on the fractional solution: a lower bound LB = expected weight of clauses already satisfied, and an upper bound UB = expected weight of clauses not yet unsatisfied. As before, greedily maximizing either bound alone has bad examples; the key idea is again to make choices that increase B = ½(LB + UB).

As before. Let t_i be the (expected) increase in the bound B_{i−1} if we set x_i true, and f_i the (expected) increase if we set x_i false. First pass: for i = 1 to n: if t_i < 0, set x_i to 0; if f_i < 0, set x_i to 1; else, set x_i to t_i/(t_i + f_i). Second pass: for i = 1 to n: if E[W | X_{i−1}, x_i ← true] ≥ E[W | X_{i−1}, x_i ← false], set x_i true; else set x_i false.
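Both passes can be sketched together. This is my own implementation of the scheme just described, under the assumption that no clause contains both a variable and its negation; it is not the authors' code:

```python
def two_pass_max_sat(clauses, weights):
    """Pass 1 gives each x_i a fractional value p[i]; pass 2 rounds the
    fractions deterministically by conditional expectations.  Literals
    are DIMACS-style integers (k for x_k, -k for its negation)."""
    n = max(abs(lit) for c in clauses for lit in c)
    p = [0.0] * (n + 1)
    prob_unsat = [1.0] * len(clauses)  # Pr[no already-set literal is true]
    unset = [set(abs(lit) for lit in c) for c in clauses]

    for i in range(1, n + 1):           # first pass: fractional values
        t = f = 0.0
        for j, c in enumerate(clauses):
            if i not in unset[j]:
                continue
            w = weights[j] * prob_unsat[j]
            last = unset[j] == {i}      # x_i is the clause's last unset variable
            if i in c:                  # literal x_i
                t += 0.5 * w
                if last:
                    f -= 0.5 * w        # setting x_i false kills the clause
            else:                       # literal not-x_i
                f += 0.5 * w
                if last:
                    t -= 0.5 * w
        if t < 0:
            p[i] = 0.0
        elif f < 0:
            p[i] = 1.0
        else:
            p[i] = t / (t + f) if t + f > 0 else 0.5
        for j, c in enumerate(clauses):
            if i in unset[j]:
                unset[j].discard(i)
                prob_unsat[j] *= (1.0 - p[i]) if i in c else p[i]

    def cond_exp(fixed):                # E[W] with unset x_k true w.p. p[k]
        total = 0.0
        for c, w in zip(clauses, weights):
            pu = 1.0
            for lit in c:
                pr_true = ((1.0 if fixed[abs(lit)] else 0.0)
                           if abs(lit) in fixed else p[abs(lit)])
                pu *= (1.0 - pr_true) if lit > 0 else pr_true
            total += w * (1.0 - pu)
        return total

    fixed = {}
    for i in range(1, n + 1):           # second pass: derandomize
        fixed[i] = cond_exp({**fixed, i: True}) >= cond_exp({**fixed, i: False})
    return fixed, cond_exp(fixed)
```

On the instance ¬x_1 (weight 2), x_1 ∨ x_2 (weight 1), ¬x_2 ∨ x_3 (weight 3), the first pass produces p = (0, 1/3, 1) with E[W] = 16/3, and the second pass returns the assignment (false, true, true) of weight 6.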

Analysis. The proof that after the first pass E[W] ≥ ¾ OPT is almost the same as before. The proof that the final solution has value at least E[W] ≥ ¾ OPT is via the method of conditional expectations. The algorithm can be implemented in linear time.

Experimental Analysis How well do these algorithms work on structured instances? How do they compare to other types of algorithms (e.g. local search)? Can we use the randomization to our advantage?

The Instances. From the SAT and MAX SAT competitions in 2014 and 2015, all unweighted. Industrial/applications: formal verification, crypto attacks, etc. (300 + 55 instances). Crafted: max cut, graph isomorphism, etc. (300 + 402 instances). Random: with various ratios of clauses to variables (225 + 702 instances). Sizes: the industrial instances average 0.5M variables in 2M clauses; the largest has 14M variables in 53M clauses; the SAT instances are larger than the MAX SAT instances.

The Measure. Rather than the approximation ratio, we use the totality ratio: the ratio of the number of satisfied clauses to the number of clauses in the input.

Greedy Algorithms. On SAT/industrial instances: Johnson's algorithm (JA) versus Randomized Greedy (RG) versus the 2-pass algorithm (2Pass).

Local Search. We compared the greedy algorithms against a number of local search algorithms studied by Pankratov and Borodin (SAT 2010): WalkSAT (Selman, Kautz, Cohen 1993; Kautz 2014); Non-Oblivious Local Search (NOLS) (Khanna, Motwani, Sudan, Vazirani 1998); Simulated Annealing (SA) (Spears 1993).

A Hybrid Algorithm. Running the last 10 iterations of simulated annealing on top of 2-Pass worked really well and was not much slower. The last 10 iterations by themselves were slightly faster, but slightly worse.

Randomization. Suppose we randomize over the variable orderings? Costello, Shapira, and Tetali (SODA '11) show this improves the worst-case performance of Johnson's algorithm. For industrial instances, however, it makes the performance of the greedy algorithms worse: Johnson's algorithm drops from 98% to 95.8%, and RG from 95.7% to 92.8%.

Randomization. What about multiple trials of RG (10x)? This increases the average fraction of satisfied clauses by only 0.07%.

Conclusion. We show this two-pass idea works for other problems as well (e.g., a deterministic ½-approximation algorithm for MAX DICUT and MAX NAE SAT). Can we characterize the problems for which it works?

Conclusion. More broadly, are there other places in which we can reduce the computation needed for approximation algorithms and make them practical? E.g., Trevisan '13 and Soto '15 give a 0.614-approximation algorithm for MAX CUT using a spectral algorithm. Can we beat ¾ using a spectral algorithm, even just for MAX 2SAT? We can get 0.817 for balanced instances (Paul, Poloczek, W, LATIN '16). Curiously, that algorithm seems to beat the GW SDP algorithm on average in practice (Paul et al.).

Thanks for your time and attention.