NP Completeness. Andreas Klappenecker [partially based on slides by Jennifer Welch]


Dealing with NP-Complete Problems

Dealing with NP-Completeness Suppose the problem you need to solve is NP-complete. What do you do? You might be able to show that the bad running time does not occur for the inputs of interest. You might find heuristics that improve the running time in many cases (but offer no guarantees). Or you may be able to find a polynomial-time algorithm that is guaranteed to give an answer close to optimal.

Optimization Problems We concentrate on approximation algorithms for optimization problems, where every candidate solution has a positive cost. Minimization problem: the goal is to find a smallest-cost solution. Ex: vertex cover problem, where the cost is the size of the VC. Maximization problem: the goal is to find a largest-cost solution. Ex: clique problem, where the cost is the size of the clique.

Approximation Algorithms An approximation algorithm for an optimization problem runs in polynomial time and always returns a candidate solution

Ratio Bound Ratio bound: bound the ratio of the cost of the solution returned by the approximation algorithm to the cost of an optimal solution. Minimization problem: cost of approx solution / cost of optimal solution. Maximization problem: cost of optimal solution / cost of approx solution. So the ratio is always at least 1; the goal is to get it as close to 1 as we can.

Approximation Algorithms A poly-time algorithm A is called a δ-approximation algorithm for a minimization problem P if and only if for every problem instance I of P with an optimal solution value OPT(I), it delivers a solution of value A(I) satisfying A(I) ≤ δ·OPT(I).

Approximation Algorithms A poly-time algorithm A is called a δ-approximation algorithm for a maximization problem P if and only if for every problem instance I of P with an optimal solution value OPT(I), it delivers a solution of value A(I) satisfying δ·A(I) ≥ OPT(I).

Vertex Cover

Approximation Algorithm for Minimum Vertex Cover Problem input: G = (V,E) C := ∅ E' := E while E' ≠ ∅ do pick any (u,v) in E' C := C ∪ {u,v} remove from E' every edge incident on u or v endwhile return C

Min VC Approx Algorithm Time is O(E), which is polynomial. How good an approximation does it provide? Let's look at an example.

Min VC Approx Alg Example (figure: a graph on vertices a through g) choose (b,c): remove (b,c), (b,a), (c,e), (c,d); choose (e,f): remove (e,f), (e,d), (d,f); choose (d,g): remove (d,g). Answer: {b,c,e,f,d,g}. Optimal answer: {b,d,e}. The algorithm's ratio on this instance is 6/3 = 2.

Ratio Bound of Min VC Alg Theorem: the Min VC approximation algorithm has ratio bound 2. Proof: Let A be the set of edges picked by the algorithm. The size of the VC returned is 2·|A|, since no two edges in A share an endpoint. Also, |A| is at most the size of a minimum VC, since a minimum VC must contain at least one node for each edge in A. Thus the cost of the approximate solution is at most twice the cost of an optimal solution.

More on Min VC Approx Alg Why not run the approx alg and then divide by 2 to get the optimal cost? Because the answer is not always exactly twice the optimal, just never more than twice the optimal. For instance, a different choice of edges gives a different answer: choosing (d,e) and then (b,c) produces the answer {b,c,d,e} with cost 4, as opposed to the optimal cost 3.
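The vertex cover pseudocode above can be sketched in Python. This is a minimal illustration, not part of the slides: the edge-list representation and all names are my own choices.

```python
def approx_vertex_cover(edges):
    """2-approximation for minimum vertex cover: repeatedly pick any
    remaining edge, add both endpoints to the cover, and discard every
    edge incident on either endpoint."""
    cover = set()
    remaining = list(edges)
    while remaining:
        u, v = remaining[0]
        cover.update((u, v))
        # remove every edge incident on u or v
        remaining = [(a, b) for (a, b) in remaining
                     if a not in (u, v) and b not in (u, v)]
    return cover

# Tiny example: a path a-b-c-d, whose optimal cover {b, c} has size 2.
edges = [("a", "b"), ("b", "c"), ("c", "d")]
cover = approx_vertex_cover(edges)
```

On this path the algorithm picks (a,b) and then (c,d), returning a cover of size 4, exactly twice the optimum, matching the ratio bound of 2.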

Metric TSP

Triangle Inequality Assume TSP inputs that satisfy the triangle inequality: for all cities a, b, and c, dist(a,c) ≤ dist(a,b) + dist(b,c), i.e., the shortest route between 2 cities is the direct one. This metric TSP is still NP-hard. Depending on what you are modeling with the distances, an application might or might not satisfy this condition.

TSP Approximation Algorithm input: a set of cities and distances between them that satisfy the triangle inequality. Create the complete graph G = (V,E), where V is the set of cities and the weight of edge (a,b) is dist(a,b). Compute an MST of G. Go around the MST twice to get a tour (which will have duplicates). Remove the duplicates so that no city is visited more than once.

Example (figure: two copies of an 8-vertex instance) Form MST. DFS traversal, double edges, Eulerian tour of the 2-multigraph. Skip cities that were already visited.

Analysis of TSP Approx Alg Running time is polynomial: creating the complete graph takes O(V²) time, and Kruskal's MST algorithm takes O(E log E) = O(V² log V) time. How good is the quality of the solution?

Analysis of TSP Approx Alg cost of approx solution ≤ 2·(weight of MST), by the triangle inequality. Why? (Figure: three vertices a, b, c.) When the tour created by going around the MST is adjusted to remove duplicate nodes, the two tree edges (a,b) and (b,c) are replaced with the direct edge (a,c), which by the triangle inequality is no longer.

Analysis of TSP Approx Alg weight of MST < length of min tour. Why? The min tour minus one edge is a spanning tree T, whose weight must be at least the weight of the MST. And since edge costs are positive, the weight of the min tour is greater than the weight of T.

Analysis of TSP Approx Alg Putting the pieces together: cost of approx solution ≤ 2·(weight of MST) ≤ 2·(cost of min tour). So the approximation ratio is at most 2.
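A runnable sketch of the MST-based 2-approximation described above, assuming Euclidean points so the triangle inequality holds. The use of Prim's algorithm and all names here are illustrative choices, not from the slides.

```python
import math

def mst_tsp_tour(points):
    """Metric TSP 2-approximation: build an MST with Prim's algorithm,
    then return its preorder walk, which is exactly the doubled-MST
    Euler tour with already-visited cities skipped."""
    n = len(points)
    dist = lambda i, j: math.dist(points[i], points[j])
    # Prim's algorithm, recording the tree as a children map
    children = {i: [] for i in range(n)}
    best = {i: (dist(0, i), 0) for i in range(1, n)}  # (key, parent)
    while best:
        v = min(best, key=lambda i: best[i][0])
        _, parent = best.pop(v)
        children[parent].append(v)
        for u in best:
            if dist(v, u) < best[u][0]:
                best[u] = (dist(v, u), v)
    # preorder walk of the MST = shortcut tour
    tour, stack = [], [0]
    while stack:
        v = stack.pop()
        tour.append(v)
        stack.extend(reversed(children[v]))
    return tour

# Unit square: the optimal tour has length 4.
points = [(0, 0), (0, 1), (1, 1), (1, 0)]
tour = mst_tsp_tour(points)
cost = sum(math.dist(points[tour[i]], points[tour[(i + 1) % 4]])
           for i in range(4))
```

On this instance the preorder walk happens to visit the corners in order, so the tour cost equals the optimum 4; the analysis above guarantees only that it is at most twice the optimum.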

(non-metric) TSP

TSP Without Triangle Inequality Theorem: If P ≠ NP, then no polynomial-time approximation algorithm for TSP (without the triangle inequality) can have a constant ratio bound. Proof: We show that if such an approximation algorithm existed, we could solve a known NP-complete problem (Hamiltonian cycle) in polynomial time, so P would equal NP.

HC Exact Algorithm using TSP Approximation Algorithm Input: G = (V,E) 1. convert G to this TSP input: one city for each node in V; the distance between cities u and v is 1 if (u,v) is in E, and r·|V| if (u,v) is not in E, where r is the ratio bound of the TSP approx alg. Note: this TSP input does not satisfy the triangle inequality.

HC (Exact) Algorithm Using TSP Approximation Algorithm 2. run the TSP approx alg on the input just created. 3. if the cost of the approx solution returned in step 2 is ≤ r·|V| then return YES else return NO. The running time is polynomial.

Correctness of HC Algorithm If G has a HC, then the optimal tour in the constructed TSP input corresponds to that cycle and has weight |V|. The approx algorithm then returns an answer with cost at most r·|V|. So if G has a HC, the algorithm returns YES.

Correctness of HC Algorithm If G has no HC, then any tour for the constructed TSP input must use at least one edge not in G, which has weight r·|V|. So the weight of the optimal tour is > r·|V|, and the answer returned by the approx alg also has weight > r·|V|. So if G has no HC, the algorithm returns NO.
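The construction above can be checked on tiny graphs. In this sketch a brute-force tour search stands in for the hypothetical approximation algorithm (the reduction itself never needs such a search); the helper name and graph encoding are my own.

```python
from itertools import permutations

def has_hc_via_tsp_instance(n, graph_edges, r=2):
    """Build the reduction's TSP instance on cities 0..n-1: edges of G
    get distance 1, non-edges get r*n. G has a Hamiltonian cycle iff
    the optimal tour costs exactly n (any non-edge pushes it past r*n)."""
    E = {frozenset(e) for e in graph_edges}
    d = lambda u, v: 1 if frozenset((u, v)) in E else r * n
    # brute-force optimal tour; only viable for tiny n
    best = min(
        sum(d(p[i], p[(i + 1) % n]) for i in range(n))
        for p in permutations(range(n))
    )
    return best == n

# A 4-cycle has a Hamiltonian cycle; a star K_{1,3} does not.
hc_yes = has_hc_via_tsp_instance(4, [(0, 1), (1, 2), (2, 3), (3, 0)])
hc_no = has_hc_via_tsp_instance(4, [(0, 1), (0, 2), (0, 3)])
```

The gap between cost n (a Hamiltonian cycle exists) and cost > r·n (none exists) is what lets the YES/NO threshold in step 3 work.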

Set Cover

Set Cover Given a universe U of n elements, a collection S = {S1, S2, ..., Sk} of subsets of U, and a cost function c: S → Q⁺, find a minimum-cost subcollection of S that covers all the elements of U.

Example We might want to select a committee consisting of people who, between them, have all the required skills.

Cost-Effectiveness We are going to pick sets according to their cost-effectiveness. Let C be the set of elements that are already covered. The cost-effectiveness of a set S is the average cost at which it covers new elements: c(S)/|S − C|.

Greedy Set Cover C := ∅ while C ≠ U do: find the most cost-effective set in the current iteration, say S, and pick it; for each newly covered e in S − C, set price(e) = c(S)/|S − C|; C := C ∪ S. Output the picked sets.
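A minimal Python sketch of the greedy rule above. The representation (frozensets, a cost dict) and the example instance are assumptions of this illustration, not from the slides.

```python
def greedy_set_cover(universe, sets, cost):
    """Greedy set cover: repeatedly pick the set S with the best
    cost-effectiveness c(S)/|S - C| over the still-uncovered elements."""
    covered = set()
    chosen = []
    while covered < universe:
        # most cost-effective set among those that add new elements
        s = min((s for s in sets if s - covered),
                key=lambda s: cost[s] / len(s - covered))
        chosen.append(s)
        covered |= s
    return chosen

universe = {1, 2, 3, 4}
sets = [frozenset({1, 2}), frozenset({3}), frozenset({2, 3, 4})]
cost = {sets[0]: 1.0, sets[1]: 1.0, sets[2]: 1.0}
picked = greedy_set_cover(universe, sets, cost)
```

Here the greedy rule first takes {2,3,4} (cost 1/3 per new element), then {1,2} to cover the remaining element, returning a cover of two sets.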

Theorem Greedy Set Cover is an Hm-approximation algorithm, where m = max{|Si| : 1 ≤ i ≤ k} and Hm = 1 + 1/2 + ... + 1/m is the m-th harmonic number.

Lemma For every set T in S, we have Σ_{e in T} price(e) ≤ c(T)·Hx with x = |T|. Proof: Consider the iteration in which an element e of T is covered for the first time, say e in T ∩ (Si \ ∪_{j<i} Sj), and let Vi = T \ ∪_{j<i} Sj be the part of T that is still uncovered just before this iteration.

Lemma (2) Since T itself was available in that iteration, the greedy choice implies price(e) ≤ c(T)/|Vi|. Let e1, ..., e|T| be the elements of T in the order in which they are covered by the greedy algorithm. When ek is covered, at least |T| − k + 1 elements of T are still uncovered, so price(ek) ≤ c(T)/(|T| − k + 1). Summing over all k yields the claim.

Proof of the Theorem Let A be an optimal set cover and B the cover returned by the greedy algorithm. The cost of B equals Σ_{e in U} price(e) ≤ Σ_{S in A} Σ_{e in S} price(e), since A covers every element. By the lemma, this is bounded by Σ_{T in A} c(T)·H|T| ≤ Hm · Σ_{T in A} c(T) = Hm · OPT, where Hm is the harmonic number of the cardinality of the largest set in S.

Example Let U = {e1, ..., en}, S = {{e1}, ..., {en}, {e1, ..., en}}, c({ei}) = 1/i, and c({e1, ..., en}) = 1 + ε. In each iteration a singleton is slightly more cost-effective than the big set, so the greedy algorithm computes a cover of cost Hn = 1 + 1/2 + ... + 1/n, while the optimal cover {e1, ..., en} costs only 1 + ε. So the Hm bound is essentially tight.
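The tight example above can be checked numerically. This self-contained sketch (n = 5 and ε = 0.01 are arbitrary small choices of mine) runs the greedy rule on that instance and confirms the greedy cost is exactly the harmonic number Hn.

```python
from math import isclose

n = 5
eps = 0.01

# the instance from the example: n singletons of cost 1/i,
# plus the whole universe at cost 1 + eps
U = set(range(1, n + 1))
sets = [frozenset({i}) for i in range(1, n + 1)] + [frozenset(U)]
cost = {frozenset({i}): 1.0 / i for i in range(1, n + 1)}
cost[frozenset(U)] = 1 + eps

covered, total = set(), 0.0
while covered != U:
    # most cost-effective set among those adding new elements
    s = min((s for s in sets if s - covered),
            key=lambda s: cost[s] / len(s - covered))
    covered |= s
    total += cost[s]
# greedy picks the singletons e_n, e_{n-1}, ..., e_1, paying H_n in total,
# while the optimal cover (the big set) costs only 1 + eps
harmonic = sum(1.0 / i for i in range(1, n + 1))
```

At every step the cheapest remaining singleton {ei} has cost-effectiveness 1/i, just below the big set's (1 + ε)/i, so greedy never takes the big set.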