Greedy algorithms, or: Do the right thing

March 1, 2005

1 Greedy Algorithm

Basic idea: when solving a problem, do locally the right thing.
Problem: usually this does not work.

VertexCover (optimization version)
Instance: a graph G.
Question: return a smallest subset S ⊆ V(G) such that S touches all the edges of G.

Example: greedy algorithm for VertexCover. Always take the vertex with the highest degree, add it to the cover, remove it from the graph, and repeat.

Counterexample (figure omitted): the optimal solution is the three black vertices, but greedy would pick the four white vertices.

Well, maybe we do not get the optimal vertex cover, but we still get some kind of vertex cover, which is good?
Q: What is "good"?

Definition 1.1 A minimization problem is an optimization problem where we look for a valid solution that minimizes a certain target function.

Example 1.2 In VertexCover, the target function is the size of the cover. Formally,

    Opt(G) = min { |S| : S ⊆ V(G), S a cover of G }.

The VertexCover(G) is just a set S realizing this minimum.
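A minimal sketch of the greedy heuristic just described, assuming an adjacency-dict representation of the graph (the function name and representation are illustrative choices, not from the notes):

```python
# Sketch of the greedy VertexCover heuristic: repeatedly take a
# highest-degree vertex, add it to the cover, and delete it from the graph.

def greedy_vertex_cover(adj):
    """adj maps each vertex to the set of its neighbors; returns a cover."""
    adj = {v: set(ns) for v, ns in adj.items()}   # work on a mutable copy
    cover = []
    while any(adj.values()):                      # while some edge remains
        v = max(adj, key=lambda u: len(adj[u]))   # highest-degree vertex
        cover.append(v)
        for u in adj[v]:                          # delete v from the graph
            adj[u].discard(v)
        adj[v] = set()
    return cover
```

On a star graph this rule picks the center, which is optimal; the counterexample above shows it can also be fooled, and (as shown below) it can be off by a logarithmic factor in the worst case.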

Definition 1.3 Let Opt(G) denote the value of the target function for the optimal solution. "Good" = a vertex cover of size close to the optimal one.

Definition 1.4 An algorithm A for a minimization problem achieves an approximation factor α if for all inputs G we have

    A(G) / Opt(G) ≤ α.

Example 1.5 An algorithm is a 2-approximation for VertexCover if it outputs a vertex cover that is at most twice the size of the optimal vertex cover.

Q: How good is the greedy VertexCover algorithm?
In the counterexample above, the optimal solution is the three black vertices, but greedy would pick the four white vertices. Thus, the approximation factor is at least 4/3.

Example 1.6 Is the greedy VertexCover algorithm a 2-approximation?
Answer: no. (Figures omitted: greedy removes the blue vertex and adds it to the cover, then removes the red vertex, and so on.) In this example the greedy algorithm picks 8 vertices, while the optimal vertex cover uses 6 vertices. This only shows that the approximation factor is at least 8/6 = 4/3. However, extending this example to larger n (homework exercise) shows that in the worst case the greedy algorithm is an Ω(log n)-approximation.

Theorem 1.7 The greedy algorithm for VertexCover achieves a Θ(log n) approximation, where n is the number of vertices in the graph.

Proof: The lower bound follows from the example indicated above. The upper bound follows from proofs similar to the ones we will do shortly, and is omitted.

1.1 Two for the price of one (pay more, get less)

Q: Is there a better approximation algorithm for vertex cover?

Algorithm Approx-Vertex-Cover: choose an edge of G, add both its endpoints to the vertex cover, remove the two vertices (together with all edges adjacent to them) from G, and repeat until no edges remain.

Theorem 1.8 Approx-Vertex-Cover achieves approximation factor 2.

Proof: The picked edges share no vertices, and the optimal solution must contain at least one endpoint of every picked edge. As such, the cover generated is at most twice as large as the optimal one.

2 Traveling Salesman Problem

Theorem 2.1 TSP cannot be approximated within any constant factor unless P = NP.

Proof: Consider the reduction from Hamiltonian cycle to TSP. Set the weight of an edge to 1 if it is present in the Hamiltonian-cycle instance G, and to 2 otherwise. In the resulting complete graph, there is a tour of price n if and only if there is a Hamiltonian cycle in G; if, on the other hand, G has no Hamiltonian cycle, then the cheapest tour has price at least n + 1.

Instead of 2, use weight cn for the non-edges, where c is an arbitrary constant. Clearly, if G does not contain a Hamiltonian cycle then the price of the optimal tour is at least cn + 1. If one could compute a c-approximation in polynomial time, then running it on this TSP instance would return a tour of price at most cn whenever a tour of price n exists. But a tour of price at most cn exists iff G has a Hamiltonian cycle, so we could decide Hamiltonian cycle in polynomial time.

2.1 Traveling salesman problem with the triangle inequality

G = (V, E) is the graph, and c(e) is the cost of traveling on the edge e.

Definition 2.2 (Triangle inequality) For any u, v, w in V(G) we have

    c(u, v) ≤ c(u, w) + c(w, v).

Purpose: develop a 2-approximation algorithm.

Observations:
1. Let C_opt be the optimal TSP cycle in G.
2. C_opt is a spanning graph of G.
3. Hence w(C_opt) ≥ w(cheapest spanning graph of G).
4. The cheapest spanning graph of G is just the minimum spanning tree, so w(C_opt) ≥ w(MST(G)).
5. The MST can be computed in O(n log n + m) time.
6. Convert the MST into a tour:
   (a) Duplicate each edge of T = MST(G), and let T' denote the resulting multigraph. Then w(T') = 2 w(MST(G)).
   (b) Every vertex v ∈ V(T') has even degree d(v).
   (c) Therefore the graph T' is Eulerian.
   (d) Let C denote an Eulerian cycle in T' (figure omitted). Then w(C) = w(T') = 2 w(MST(G)).

We next normalize C by shortcutting through vertices we have already visited, obtaining a cycle D (figure omitted).
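The steps above can be sketched as follows. One standard observation used here: a DFS preorder of the MST visits the vertices in exactly the order of the shortcut Euler tour, so we can output that directly. The Prim-style MST routine and all names are illustrative, not from the notes:

```python
# 2-approximation for metric TSP: build an MST, (conceptually) double its
# edges to get an Eulerian multigraph, and output the shortcut Euler tour,
# which coincides with a DFS preorder of the MST.
import heapq

def metric_tsp_2approx(points, dist):
    """points: list of sites; dist: a symmetric metric on them.
    Returns a tour as a list of indices (the closing edge is implicit)."""
    n = len(points)
    parent = [0] * n
    in_tree = [False] * n
    best = [float('inf')] * n
    best[0] = 0.0
    children = [[] for _ in range(n)]
    heap = [(0.0, 0)]
    while heap:                                   # Prim's algorithm
        _, v = heapq.heappop(heap)
        if in_tree[v]:
            continue
        in_tree[v] = True
        if v != 0:
            children[parent[v]].append(v)         # record the MST edge
        for u in range(n):
            if not in_tree[u]:
                w = dist(points[v], points[u])
                if w < best[u]:
                    best[u] = w
                    parent[u] = v
                    heapq.heappush(heap, (w, u))
    tour, stack = [], [0]                         # DFS preorder = shortcut tour
    while stack:
        v = stack.pop()
        tour.append(v)
        stack.extend(reversed(children[v]))
    return tour
```

By the analysis in the notes, the returned tour costs at most twice the optimal one whenever dist obeys the triangle inequality.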

By the triangle inequality, w(uw) ≤ w(uv) + w(vw), so shortcutting never increases the length. Thus, the resulting cycle D satisfies

    w(MST) ≤ w(TSP) ≤ w(D) ≤ w(C) = 2 · w(MST) ≤ 2 · w(TSP).

Theorem 2.3 TSP with the triangle inequality can be approximated up to a factor of 2 in O(n log n + m) time.

3 Max Exact 3SAT

Problem: Max 3SAT
Instance: a collection of clauses C_1, ..., C_m, each with exactly three literals.
Question: find the assignment to x_1, ..., x_n that satisfies the maximum number of clauses.

For example, the 3SAT formula F = (x_1 + x_2 + x_3)(x_4 + x_1 + x_2) becomes, as a Max 3SAT instance, the collection of clauses

    x_1 + x_2 + x_3
    x_4 + x_1 + x_2.

Max 3SAT is NP-Hard. Note that this is a maximization problem.

Definition 3.1 An algorithm A for a maximization problem achieves an approximation factor α if for all inputs we have

    A(G) / Opt(G) ≥ α.

Theorem 3.2 There is a polynomial-time algorithm that achieves a (7/8)-approximation. Namely, if the instance has m clauses, the algorithm satisfies (in expectation) (7/8)m of them.

Algorithm RandMax3SAT: set x_i ← 1 with probability 1/2 and x_i ← 0 otherwise, independently for each i; return x_1, ..., x_n.

Let Y_i be an indicator variable for the event that the i-th clause is satisfied by the random assignment:

    Y_i = 1 if C_i is satisfied by the random assignment, and Y_i = 0 otherwise.

Clearly, the number of clauses satisfied by the given assignment is

    Y = Σ_{i=1}^{m} Y_i.

Lemma 3.3 E[Y] = (7/8)m, where m is the number of clauses in the input.

Proof: We have

    E[Y] = E[ Σ_{i=1}^{m} Y_i ] = Σ_{i=1}^{m} E[Y_i]

by linearity of expectation. Now, what is the probability that Y_i = 0? This is the probability that C_i is not satisfied; since C_i is made out of exactly three literals (over distinct variables), all three must be false, so

    Pr[Y_i = 0] = (1/2) · (1/2) · (1/2) = 1/8.

Hence

    Pr[Y_i = 1] = 1 − Pr[Y_i = 0] = 7/8,

and therefore

    E[Y_i] = Pr[Y_i = 0] · 0 + Pr[Y_i = 1] · 1 = 7/8.

Namely,

    E[# of clauses satisfied] = E[Y] = Σ_{i=1}^{m} E[Y_i] = (7/8) m.
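A sketch of RandMax3SAT. The signed-integer (DIMACS-style) clause encoding and all names are my assumptions, not from the notes:

```python
# RandMax3SAT: set each variable to True/False by a fair coin flip.
# A literal k (nonzero int) denotes x_|k|, negated when k < 0.
import random

def rand_max_3sat(clauses, n, seed=None):
    """clauses: list of 3-tuples of nonzero ints over variables 1..n.
    Returns (assignment, number of satisfied clauses)."""
    rng = random.Random(seed)
    assignment = [rng.random() < 0.5 for _ in range(n + 1)]  # index 0 unused
    satisfied = sum(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in clauses
    )
    return assignment, satisfied
```

Averaging the satisfied count over many independent runs should approach (7/8)m, matching Lemma 3.3, provided each clause uses three distinct variables.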