Hardness of Approximation for the TSP
Michael Lampis, LAMSADE, Université Paris Dauphine
Sep 2, 2015
Overview
- Hardness of Approximation: What is it? How to do it? (Easy) Examples
- The PCP Theorem: What is it? How to use it?
- The Traveling Salesman Problem: Approximation algorithms, Strategy for Proving Hardness
- Other tools: Expander Graphs, Bounded-Occurrence CSPs
- A full reduction for the TSP
Parameterized Approximation Schemes 2 / 39
Hardness of Approximation
Hardness of Approximation
Day Summary: Approximation Algorithms with a performance guarantee
Reminder: We have an (NP-hard) optimization problem. We want to design an algorithm that gives a good enough solution, and we want a guarantee of this: for all instances I we have SOL(I)/OPT(I) ≤ r.
How close to 1 can we get the approximation ratio r?
Typical situation: An initial algorithm gives some (bad) r. Then someone comes up with an improvement. Repeat... until we are stuck! Now what?
Goal of the theory of Hardness of Approximation: Prove that we are not incompetent!
Hardness of Approximation: Can we do it?
Approximation Algorithms vs. Hardness
Poly-time algorithms vs. NP-completeness
Algorithms vs. Complexity
The two are related! The main tool will be algorithmic: Reductions.
Reminder: Basic tool of NP-hardness: A is NP-hard. Reduce A to B. Then B is NP-hard.
Approximation version: Approximation-Preserving Reductions
Idea: A has no good approximation algorithm. We reduce A to B. Conclusion: B has no good approximation algorithm.
Hardness of Approximation: Can we do it?
There are a couple of serious problems with this approach.
What is the first hard-to-approximate problem? Recall: Cook's theorem gives us a first NP-hard problem, and then we reduce from that. Here, we don't have a problem to begin from...
How can we prove that a problem does not have a good approximation algorithm? This implies that it does not have a poly-time exact algorithm, i.e. P ≠ NP!!
Hardness of Approximation: How to do it
We cannot avoid the second problem (without resolving P=NP). We will prove all our hardness results assuming P ≠ NP.
We can solve the first problem using gap-introducing reductions.
A gap-introducing reduction from SAT to a problem A has the following properties:
Given a SAT formula φ it produces in polynomial time an instance I of A.
(Completeness): If φ is satisfiable then OPT(I) ≥ c.
(Soundness): If φ is not satisfiable then OPT(I) < s.
This establishes that no algorithm can achieve an approximation ratio better than c/s.
Gap introduction: An easy example
Recall the NP-hard Graph Coloring problem: Given a graph G(V,E) we want to find a coloring of the vertices such that any two neighbors have different colors. Objective: Minimize the number of colors used.
Suppose my friend Bob claims to have designed an algorithm for Graph Coloring with approximation ratio 1.1. This proves that P=NP!
Recall: Deciding if a graph can be colored with 3 colors is NP-hard.
Translation: there is a reduction which, given a SAT formula φ, produces either a graph that can be 3-colored, or one that needs more colors.
Run Bob's algorithm on this graph. If the graph can be 3-colored, the algorithm is guaranteed to produce a solution with at most 3 × 1.1 = 3.3, hence at most 3, colors (!!). Otherwise, the algorithm will return a solution with at least 4 colors. From the number of colors of the solution we can deduce if the formula was satisfiable!
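To make the gap argument concrete, here is a minimal sketch in Python. The names are mine, not from the talk; `exact_coloring` is a brute-force stand-in for Bob's hypothetical algorithm (ratio 1 ≤ 1.1), but any 1.1-approximation would serve in its place.

```python
from itertools import product

def exact_coloring(graph):
    """Brute-force optimal coloring (stand-in for Bob's 1.1-approximation)."""
    n = len(graph)
    for k in range(1, n + 1):
        for assignment in product(range(k), repeat=n):
            if all(assignment[u] != assignment[v] for u in graph for v in graph[u]):
                return dict(enumerate(assignment))

def decides_3_colorability(approx_color, graph):
    # On a 3-colorable graph a 1.1-approximation uses at most 3 * 1.1 = 3.3,
    # i.e. at most 3, colors; on any other graph it must use >= 4 colors.
    coloring = approx_color(graph)
    return len(set(coloring.values())) <= 3

triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}                 # 3-colorable
k4 = {u: [v for v in range(4) if v != u] for u in range(4)}  # needs 4 colors
print(decides_3_colorability(exact_coloring, triangle))  # True
print(decides_3_colorability(exact_coloring, k4))        # False
```

So a poly-time 1.1-approximation would decide an NP-hard problem in polynomial time.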
TSP
Traveling Salesman Problem:
Given: Edge-weighted complete graph; weights follow the triangle inequality
Output: A tour that visits each vertex exactly once
Objective: Minimize total cost
TSP
Traveling Salesman Problem: What if we don't have the triangle inequality?
Reduction from Hamiltonian Cycle. Ham. Cycle: Given a graph, is there a cycle that visits each vertex exactly once?
Given a graph G(V,E), construct an instance of TSP: each edge of E has weight 1, each non-edge has weight w.
YES: There is a TSP tour with weight |V|.
NO: Any TSP tour has weight at least |V| − 1 + w.
No algorithm can have ratio better than (|V| − 1 + w)/|V|. We can now set w to something huge! (e.g. w = 2^n)
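A sketch of this gap-introducing reduction (the encoding and helper names are my own; the brute-force `best_tour` is only there to check the gap on toy instances):

```python
from itertools import permutations

def ham_cycle_to_tsp(n, edges):
    """Non-metric TSP weights from a Hamiltonian Cycle instance:
    edges get weight 1, non-edges the huge weight w = 2**n."""
    w = 2 ** n
    edge_set = {frozenset(e) for e in edges}
    weight = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            weight[i][j] = weight[j][i] = 1 if frozenset((i, j)) in edge_set else w
    return weight

def best_tour(weight):
    """Optimal tour cost by brute force (toy instances only)."""
    n = len(weight)
    return min(sum(weight[p[i]][p[(i + 1) % n]] for i in range(n))
               for p in permutations(range(n)))

square = [(0, 1), (1, 2), (2, 3), (3, 0)]  # C4: Hamiltonian
star = [(0, 1), (0, 2), (0, 3)]            # K_{1,3}: not Hamiltonian
print(best_tour(ham_cycle_to_tsp(4, square)))                 # 4  (= |V|)
print(best_tour(ham_cycle_to_tsp(4, star)) >= 4 - 1 + 2 ** 4)  # True
```

A Hamiltonian cycle exists iff some tour costs exactly n; otherwise every tour costs at least n − 1 + 2^n, so the YES/NO gap grows without bound.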
Gap Introduction: A non-trivial example
Graph Balancing / Scheduling with Restricted Assignment
Given: n machines and m jobs. Each job has a duration and a set of machines it is allowed to run on.
Output: An assignment of jobs to machines.
Objective: Minimize makespan (time needed for the last machine to finish all its jobs).
We are mainly interested in the case of the problem where each job can run on two machines.
Example: (figure)
Gap Introduction: A non-trivial example
Target Theorem: There is no approximation algorithm for Graph Balancing with ratio better than 3/2.
Plan: Gap-introducing reduction from 3-SAT. Satisfiable formula ⇒ maximum load 2. Unsatisfiable formula ⇒ maximum load 3.
Thm: 3-OCC-3-SAT is NP-hard. This is the version of 3-SAT where each variable appears at most 3 times and each literal at most twice.
Proof: Replace each appearance of variable x with a fresh variable x_1, x_2, ..., x_n. Add the clauses (x_1 → x_2) ∧ (x_2 → x_3) ∧ ... ∧ (x_n → x_1).
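The occurrence-reducing step above can be sketched as follows; I use DIMACS-style integer literals (positive = variable, negative = its negation), an encoding assumed for illustration:

```python
def reduce_occurrences(clauses, num_vars):
    """Replace the k occurrences of each variable x by fresh copies x_1..x_k
    and add the implication cycle (x_1 -> x_2), ..., (x_k -> x_1),
    i.e. the 2-clauses (-x_i v x_{i+1}). Each copy then occurs at most
    3 times: once in its clause and twice in the cycle."""
    next_var = num_vars + 1
    new_clauses = []
    copies = {v: [] for v in range(1, num_vars + 1)}
    for clause in clauses:
        new_clause = []
        for lit in clause:
            copies[abs(lit)].append(next_var)  # fresh copy per occurrence
            new_clause.append(next_var if lit > 0 else -next_var)
            next_var += 1
        new_clauses.append(new_clause)
    for cs in copies.values():                 # implication cycle
        for a, b in zip(cs, cs[1:] + cs[:1]):
            if a != b:
                new_clauses.append([-a, b])
    return new_clauses, next_var - 1

from collections import Counter
f = [[1, -2, 3], [-1, 2, 3], [1, 2, -3]]
new_f, n = reduce_occurrences(f, 3)
occ = Counter(abs(l) for c in new_f for l in c)
print(all(v <= 3 for v in occ.values()))  # True: at most 3 occurrences each
```

Satisfying the cycle forces all copies of a variable to agree, so the transformed formula is satisfiable iff the original is.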
Example continued
Reduction: 3-OCC-3-SAT → Graph Balancing
For each variable, create an edge of weight 2 and two vertices.
For each clause, create a vertex and connect it with its literals, e.g. c_1 = (x_1 ∨ x_2 ∨ x_3).
A truth assignment orients heavy edges towards the false literal.
In order to achieve load = 2 we must find a true literal in each clause.
Recap
Gap-introducing reductions: Reduce an NP-hard problem to instances of our problem which are very different in the YES/NO cases. This implies hardness of approximation for our problem. Next step: reduce to other problems...
Unfortunately, direct gap-introducing reductions are very rare. They usually work for problems of the form Max-Min, but not for Min-Avg, Min-Sum, ... How can we prove that such problems are hard?
The PCP Theorem
Min-Max or Min-Sum?
Consider the MAX-3-SAT problem. Given: a 3-SAT formula. Objective: Find an assignment that satisfies the most clauses.
We can try the same trick to prove it's hard to approximate:
YES: OPT(I) = m
NO: OPT(I) ≤ m − 1
No approximation better than (m − 1)/m.
Unfortunately, this ratio is basically 1... Generally, direct gap-introducing reductions are hard to do for problems where a bad instance needs to have many problems. To prove that such problems are hard we generally need the famous PCP theorem.
The PCP theorem: approximation view
Theorem: There is a polynomial-time reduction from 3-SAT to 3-SAT with the following properties:
If the original formula φ is satisfiable then the new formula φ' is also satisfiable.
If the original formula is not satisfiable, then any assignment satisfies at most an r fraction of the clauses of φ', where r < 1 is a constant independent of φ.
Translation: The PCP theorem gives a gap-introducing reduction to MAX-3-SAT. This produces a starting problem from which we can do reductions to show that other problems are hard. In this way, the PCP theorem is to approximation hardness what Cook's theorem is to NP-completeness.
But it is also much more...
The PCP theorem: proof-checking view
Problems in NP have a short proof for YES instances. E.g. SAT, 3-Coloring... Mathematical theorems themselves!?!?
Given such a proof/certificate, how can we verify it's correct? We have to read it, of course. All of it???
PCP theorem (informal statement): There is a way to write the proof so that its size stays roughly the same but it can be verified with high probability by reading a constant number of bits.
This is unbelievable! (and it made the NY Times)
The PCP theorem: implications
Equivalence of two forms: a 3-SAT formula for which it is easy to verify a certificate (assignment) is a formula for which every assignment makes many clauses false.
Using the PCP theorem we have some (tiny) constant for the hardness of MAX-3-SAT. Is this all?
[Håstad 2001]:
There is no better than 7/8-approximation for MAX-E3-SAT.
There is no better than 1/2-approximation for MAX-E3-LIN2.
In MAX-E3-LIN2 we are given equations of the form x ⊕ y ⊕ z = 0 and want to satisfy as many as possible. MAX-E3-LIN2 is a common starting point for inapproximability reductions.
These results match the performance of the trivial algorithm!
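The "trivial algorithm" here is a uniformly random assignment. A quick enumeration (a sketch, not from the slides) confirms the two matching ratios:

```python
from itertools import product

# MAX-E3-LIN2: an equation x ^ y ^ z = 0 is satisfied by exactly half of the
# 8 assignments to (x, y, z), so a random assignment achieves ratio 1/2.
lin2 = sum(1 for x, y, z in product((0, 1), repeat=3) if x ^ y ^ z == 0)
print(lin2 / 8)   # 0.5

# MAX-E3-SAT: a clause on 3 distinct variables is falsified by exactly 1 of
# the 8 assignments, so a random assignment achieves ratio 7/8.
sat3 = sum(1 for x, y, z in product((0, 1), repeat=3) if x or y or z)
print(sat3 / 8)   # 0.875
```

By linearity of expectation these per-clause probabilities give the overall expected fraction of satisfied constraints, which Håstad's results show is the best possible in polynomial time (unless P = NP).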
The Traveling Salesman Problem
The Traveling Salesman Problem
Input: An edge-weighted graph G(V,E)
Objective: Find an ordering of the vertices v_1, v_2, ..., v_n such that d(v_1,v_2) + d(v_2,v_3) + ... + d(v_n,v_1) is minimized, where d(v_i,v_j) is the shortest-path distance of v_i, v_j on G.
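Since d is the shortest-path metric, evaluating a tour's cost requires all-pairs shortest paths. A minimal sketch (helper names mine) using Floyd-Warshall:

```python
def tour_cost(n, weighted_edges, order):
    """Cost of visiting the vertices in `order`, where d is the
    shortest-path metric induced by the edge weights (Floyd-Warshall)."""
    INF = float("inf")
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for u, v, w in weighted_edges:
        d[u][v] = d[v][u] = min(d[u][v], w)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                d[i][j] = min(d[i][j], d[i][k] + d[k][j])
    return sum(d[order[i]][order[(i + 1) % n]] for i in range(n))

# A 4-cycle with unit weights: the tour 0,1,2,3 uses only edges (cost 4);
# the tour 0,2,1,3 needs two 2-step shortest paths (cost 6).
edges = [(0, 1, 1), (1, 2, 1), (2, 3, 1), (3, 0, 1)]
print(tour_cost(4, edges, [0, 1, 2, 3]))  # 4
print(tour_cost(4, edges, [0, 2, 1, 3]))  # 6
```

This formulation is equivalent to metric TSP on the complete graph of shortest-path distances.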
TSP Approximations
Upper bounds:
3/2-approximation (Christofides 1976)
For the graphic (unweighted) case:
3/2 − ε approximation (Oveis Gharan et al. FOCS'11)
1.461-approximation (Mömke and Svensson FOCS'11)
13/9-approximation (Mucha STACS'12)
1.4-approximation (Sebö and Vygen arXiv'12)
For ATSP the best ratio is O(log n / log log n) (Asadpour et al. SODA'10)
TSP Approximations
Lower bounds:
Problem is APX-hard (Papadimitriou and Yannakakis '93)
5381/5380-inapproximable; ATSP: 2805/2804 (Engebretsen STACS'99)
3813/3812-inapproximable (Böckenhauer et al. STACS'00)
220/219-inapproximable; ATSP: 117/116 (Papadimitriou and Vempala STOC'00, Combinatorica'06)
Current best (Karpinski, L., Schmied): Theorem: It is NP-hard to approximate TSP better than 123/122 and ATSP better than 75/74.
Notice the huge distance between the best algorithm (50% error) and hardness (0.8% error)...
Reduction Technique
We reduce some inapproximable CSP (e.g. MAX-3-SAT) to TSP.
First, design some gadgets to represent the clauses.
Then, add some choice vertices to represent truth assignments to variables.
For each variable, create a path through the clauses where it appears positive... and another path for its negative appearances.
A truth assignment dictates a general path.
We must make sure that gadgets are cheaper to traverse if the corresponding clause is satisfied. If a clause is not satisfied, we will pay more. We need many clauses to be unsatisfied in a NO instance to have a big gap (PCP theorem).
For the converse direction we must also make sure that cheating tours are not optimal!
How to ensure consistency
Basic idea: consistency would be easy if each variable occurred at most c times, for c a constant. Cheating would only help a tour fix a bounded number of clauses.
We will rely on techniques and tools used to prove inapproximability for bounded-occurrence CSPs. This is where expander graphs are important. Main tool: amplifier graph constructions due to Berman and Karpinski.
Expander graphs are a generally useful tool, so let's take a look at what they are...
Expander and Amplifier Graphs
Expander Graphs
Informal description: An expander graph is a well-connected and sparse graph.
Definition: A graph G(V,E) is an expander if, for some constant c:
For all S ⊆ V with |S| ≤ |V|/2 we have |E(S, V∖S)| ≥ c·|S|
The maximum degree is bounded
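The quantity min |E(S, V∖S)| / |S| is the graph's edge expansion. A brute-force sketch (only feasible for tiny graphs; helper names mine) makes the definition concrete:

```python
from itertools import combinations

def edge_expansion(n, edges):
    """min over nonempty S with |S| <= n/2 of |E(S, V\\S)| / |S|.
    Exponential-time brute force, for small examples only."""
    best = float("inf")
    for size in range(1, n // 2 + 1):
        for S in combinations(range(n), size):
            S = set(S)
            cut = sum(1 for u, v in edges if (u in S) != (v in S))
            best = min(best, cut / len(S))
    return best

cycle = lambda n: [(i, (i + 1) % n) for i in range(n)]
print(edge_expansion(4, cycle(4)))    # 1.0
print(edge_expansion(12, cycle(12)))  # long cycles expand badly: 2/6
```

A long cycle is sparse but a poor expander: cutting it into two arcs crosses only 2 edges, so its expansion tends to 0 as n grows.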
In any possible partition of the vertices into two sets, there are many edges crossing the cut. This is achieved even though the graph has low degree, therefore few edges.
Examples:
A complete bipartite graph is well-connected but not sparse.
A grid is sparse but not well-connected.
An infinite binary tree is a good expander.
Applications of Expanders
Expander graphs have a number of applications:
Proof of the PCP theorem
Derandomization
Error-correcting codes
... and inapproximability of bounded-occurrence CSPs!
Applications of Expanders
Expanders and inapproximability
Consider the standard reduction from 3-SAT to 3-OCC-3-SAT: Replace each appearance of variable x with a fresh variable x_1, x_2, ..., x_n and add the clauses (x_1 → x_2) ∧ (x_2 → x_3) ∧ ... ∧ (x_n → x_1).
Problem: This does not preserve inapproximability! (An assignment can cut the implication cycle in one place and set many copies inconsistently, while violating only a single clause.) We could add (x_i → x_j) for all i,j. This ensures consistency but adds too many clauses and does not decrease the number of occurrences!
Applications of Expanders: Expanders and inapproximability. We modify this using a 1-expander [Papadimitriou Yannakakis 91]. Recall: a 1-expander is a graph such that in each partition of the vertices, the number of edges crossing the cut is larger than the number of vertices in the smaller part.
Applications of Expanders: Expanders and inapproximability. We modify this using a 1-expander [Papadimitriou Yannakakis 91]: replace each appearance of variable x with a fresh variable x_1, x_2, ..., x_n, construct an n-vertex 1-expander, and for each edge (i, j) add the clauses (x_i → x_j) and (x_j → x_i).
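A minimal sketch of this clause-generation step, assuming a hypothetical DIMACS-style encoding (positive integers for variables, negation by sign) and a 4-cycle standing in for the expander:

```python
def consistency_clauses(copies, expander_edges):
    # For each expander edge (i, j) add the clauses (not x_i or x_j)
    # and (not x_j or x_i); both hold exactly when x_i == x_j.
    clauses = []
    for i, j in expander_edges:
        clauses.append([-copies[i], copies[j]])
        clauses.append([-copies[j], copies[i]])
    return clauses

# four copies of x, connected by a 4-cycle (a toy stand-in expander)
cls = consistency_clauses([1, 2, 3, 4], [(0, 1), (1, 2), (2, 3), (3, 0)])
print(len(cls))  # 8
```

Each expander edge contributes two size-two clauses, and each copy occurs in as many clauses as its degree in the expander, which is why low degree matters.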
Applications of Expanders: Why does this work? Suppose that in the new instance the optimal assignment sets some of the x_i's to 0 and others to 1. This gives a partition of the 1-expander, and each edge cut by the partition corresponds to an unsatisfied clause. The number of cut edges is larger than the number of minority-assigned vertices, which is at least the number of clauses lost by switching them to the majority value. Hence it is always optimal to give the same value to all the x_i's. Also, because expander graphs are sparse, only a linear number of clauses is added. This gives some inapproximability constant.
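The 1-expander property can be checked by brute force on tiny graphs; this exponential-time sketch is for illustration only (real constructions need low-degree graphs, not the dense complete graph used here):

```python
from itertools import combinations

def is_one_expander(n, edges):
    # 1-expander: every proper bipartition (S, V \ S) cuts strictly
    # more edges than the number of vertices on the smaller side.
    for k in range(1, n // 2 + 1):
        for part in combinations(range(n), k):
            S = set(part)
            cut = sum(1 for (u, v) in edges if (u in S) != (v in S))
            if cut <= min(len(S), n - len(S)):
                return False
    return True

K4 = [(u, v) for u, v in combinations(range(4), 2)]  # complete graph
P4 = [(0, 1), (1, 2), (2, 3)]                        # path
print(is_one_expander(4, K4))  # True
print(is_one_expander(4, P4))  # False
```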
Limits of expanders. Expanders sound useful, but how good expanders can we get? We want low degree (hence few edges) and high expansion (at least 1). These are conflicting goals! The smallest degree for which we currently know we can have expansion 1 is Δ = 6 [Bollobás 88]. Problem: Δ = 6 is too large, and Δ = 5 probably won't work...
Amplifiers. Amplifiers are expanders for only some of the vertices; the other vertices are thrown in to make consistency easier to achieve. This allows us to get Δ smaller. 5-regular amplifier [Berman Karpinski 03]: a bipartite graph with n vertices on the left and 0.8n vertices on the right, 4-regular on the left and 5-regular on the right, constructed randomly. Crucial property: w.h.p. any partition cuts more edges than the number of left vertices on its smaller side. 3-regular wheel amplifier [Berman Karpinski 01]: start with a cycle on 7n vertices; every seventh vertex is a contact vertex, the other vertices are checkers; take a random perfect matching of the checkers.
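The wheel amplifier is easy to generate; a sketch (vertex numbering and seeding are my own choices, and the clause connections that later attach to the contact vertices are omitted):

```python
import random

def wheel_amplifier(n, seed=0):
    # Cycle on 7n vertices; every 7th vertex (v % 7 == 0) is a
    # "contact", the rest are "checkers", paired by a random matching.
    rng = random.Random(seed)
    N = 7 * n
    edges = [(i, (i + 1) % N) for i in range(N)]  # the cycle
    checkers = [v for v in range(N) if v % 7 != 0]
    rng.shuffle(checkers)
    edges += list(zip(checkers[::2], checkers[1::2]))  # 3n matching edges
    return edges

edges = wheel_amplifier(4)
deg = {}
for u, v in edges:
    deg[u] = deg.get(u, 0) + 1
    deg[v] = deg.get(v, 0) + 1
# contacts have degree 2 here (their third edge comes from the clauses);
# checkers have degree 3, so the final construction is 3-regular
```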
Back to the Reduction
Overview. We start from an instance of MAX-E3-LIN2: given a set of linear equations (mod 2), each with exactly three variables, satisfy as many as possible. The problem is known to be 2-inapproximable [Håstad].
Overview. We use the Berman-Karpinski amplifier construction to obtain an instance where each variable appears exactly 5 times (and most equations have size 2).
Overview. A simple trick reduces this to the 1in3 predicate.
Overview. From this instance we construct a graph.
1in3-SAT. Input: a set of clauses (l_1 ∨ l_2 ∨ l_3), where l_1, l_2, l_3 are literals. Objective: a clause is satisfied if exactly one of its literals is true; satisfy as many clauses as possible. It is easy to reduce MAX-LIN2 to this problem, especially for size-two equations: x + y = 1 becomes the clause (x ∨ y), satisfied exactly when one of x, y is true. This naturally gives a gadget for TSP: in TSP we would like to visit each vertex at least once, but not more than once (to save cost).
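The size-two translation can be checked exhaustively; a small sketch with a hypothetical literal encoding (pairs of variable name and polarity):

```python
def one_in_three(literals, assignment):
    # literal = (var, positive?); the clause is satisfied iff
    # exactly one of its literals evaluates to true
    vals = [assignment[v] == pos for v, pos in literals]
    return sum(vals) == 1

# x + y = 1 (mod 2) holds iff exactly one of x, y is true,
# i.e. iff the 1in3-style clause (x OR y) is satisfied
eq_holds = lambda a: (a['x'] + a['y']) % 2 == 1
clause = [('x', True), ('y', True)]
for x in (0, 1):
    for y in (0, 1):
        a = {'x': x, 'y': y}
        assert eq_holds(a) == one_in_three(clause, a)
```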
TSP and Euler tours. A TSP tour gives an Eulerian multigraph composed of edges of G, and conversely an Eulerian multigraph composed of edges of G gives a TSP tour. TSP restated: select a multiplicity for each edge so that the resulting multigraph is Eulerian and the total cost is minimized. Note: no edge needs to be used more than twice.
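The Eulerian condition behind this reformulation (all degrees even, support connected and spanning) admits a simple check; a sketch with a hypothetical edge-multiplicity encoding, assuming vertex 0 has positive degree:

```python
from collections import defaultdict

def is_eulerian(n, edge_mult):
    # edge_mult: {(u, v): multiplicity}; the multigraph is Eulerian
    # (and usable as a tour visiting all n cities) iff every degree is
    # even and the used edges connect all n vertices.
    deg = defaultdict(int)
    adj = defaultdict(set)
    for (u, v), m in edge_mult.items():
        if m > 0:
            deg[u] += m; deg[v] += m
            adj[u].add(v); adj[v].add(u)
    if any(d % 2 for d in deg.values()):
        return False
    seen, stack = set(), [0]  # BFS/DFS from vertex 0
    while stack:
        u = stack.pop()
        if u in seen:
            continue
        seen.add(u)
        stack.extend(adj[u])
    return len(seen) == n

tri = {(0, 1): 1, (1, 2): 1, (2, 0): 1}  # triangle, each edge once
print(is_eulerian(3, tri))  # True
```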
Gadget: Forced Edges. We would like to be able to dictate in our construction that a certain edge has to be used at least once.
Gadget: Forced Edges. If we had directed edges, this could be achieved by adding a dummy intermediate vertex.
Gadget: Forced Edges. Here, we add many intermediate vertices and evenly distribute the weight w among them. Think of B as very large.
Gadget: Forced Edges. At most one of the new edges may be unused, and in that case all the others are used twice.
Gadget: Forced Edges. In that case, adding two copies of the unused edge to the solution doesn't hurt much (for B sufficiently large).
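One way to realize this gadget as code (my own encoding: B intermediate vertices give B + 1 sub-edges, each of weight w / (B + 1), so the total weight of the path equals w and skipping one sub-edge saves only w / (B + 1)):

```python
from itertools import count

_fresh = count(10**6)  # supply of new intermediate vertex ids

def force_edge(u, v, w, B):
    # Replace edge (u, v) of weight w by a path through B new
    # intermediate vertices; each sub-edge gets weight w / (B + 1).
    path = [u] + [next(_fresh) for _ in range(B)] + [v]
    return [(path[i], path[i + 1], w / (B + 1)) for i in range(B + 1)]

sub = force_edge('u', 'v', 12.0, 3)
print([c for _, _, c in sub])  # [3.0, 3.0, 3.0, 3.0]
```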
1in3 Gadget. Let's design a gadget for the clause (x ∨ y ∨ z).
1in3 Gadget. First, three entry/exit points; connect them with forced edges.
1in3 Gadget. The gadget is a connected component, and a good tour visits it once.
1in3 Gadget. This traversal corresponds to an unsatisfied clause.
1in3 Gadget. This traversal corresponds to a dishonest tour.
1in3 Gadget. The dishonest tour pays this edge twice. How expensive must the edge be before cheating becomes suboptimal? Note that w = 10 suffices, since the two cheated variables appear in at most 10 clauses.
Construction. High-level view: construct an origin s and two terminal vertices for each variable.
Construction. Connect them with forced edges.
Construction. Add the gadgets.
Construction. An honest traversal for x_2 looks like this.
Construction. A dishonest traversal looks like this...
Construction... but there must be cheating in two places There are as many doubly-used forced edges as affected variables w 5 Parameterized Approximation Schemes 36 / 39
Construction... but there must be cheating in two places There are as many doubly-used forced edges as affected variables w 5 In fact, no need to write off affected clauses. Use random assignment for cheated variables and some of them will be satisfied Parameterized Approximation Schemes 36 / 39
Under the carpet. Many details are missing: the dishonest variables are set randomly but not independently, to ensure that some clauses are satisfied with probability 1, and the structure of the instance (coming from the BK amplifier) must be taken into account to calculate the final constant. Theorem: there is no 185/184-approximation algorithm for TSP, unless P = NP. Can we do better?
Summary. Hardness of approximation theory is the evil twin of the theory of approximation algorithms. It relies on some deep mathematical tools: the PCP theorem, expander graphs, ... We discussed some general common patterns: local vs. global errors, gaps, ... The area is still under construction! We are still far from the answer for TSP and many other prominent problems; for Graph Balancing the answer is between 1.5 and 1.75. Can we make more progress?
The end. Questions?