Technische Universität München, Zentrum Mathematik
Lehrstuhl für Angewandte Geometrie und Diskrete Mathematik
Combinatorial Optimization (MA 4502)


Dr. Michael Ritter
Problem Sheet 4

Homework Problems

Problem 4.1 (An Example of Lifting)
Consider the polytope

    P := conv{ x ∈ {0, 1}^7 : 11x_1 + 6x_2 + 6x_3 + 5x_4 + 5x_5 + 4x_6 + x_7 ≤ 19 }.

a) Show that the inequality x_3 + x_4 + x_5 + x_6 ≤ 3 is valid for P.

b) Compute liftings for the above inequality by sequentially lifting the seventh, second and first coordinate, in that order. Is the result different from what was obtained in the lecture?

Answer to Problem 4.1

a) Suppose the inequality is not valid for P. Then there is some integral point x ∈ P that violates it. Any 0-1 point violating the inequality must satisfy x_3 = x_4 = x_5 = x_6 = 1, and hence

    11x_1 + 6x_2 + 6x_3 + 5x_4 + 5x_5 + 4x_6 + x_7 ≥ 6 + 5 + 5 + 4 = 20 > 19,

clearly a contradiction to x ∈ P. (Of course, we already knew this from the lecture. Alternatively, one can simply observe that this is a knapsack cover inequality.)

b) We start by lifting the seventh coordinate, i.e., we determine the maximum α_7 ≥ 0 such that x_3 + x_4 + x_5 + x_6 + α_7·x_7 ≤ 3 is still valid for P. For x_7 = 0 the value of α_7 can be chosen arbitrarily, hence we only need to consider vectors in P with x_7 = 1. We then get α_7 ≤ 3 − (x_3 + x_4 + x_5 + x_6) for all x ∈ P ∩ { x ∈ R^7 : x_7 = 1 }. Thus we have to solve the following optimization problem to determine α_7:

    max  x_3 + x_4 + x_5 + x_6
    s.t. 11x_1 + 6x_2 + 6x_3 + 5x_4 + 5x_5 + 4x_6 ≤ 19 − 1 = 18,
         x ∈ {0, 1}^7.

It is easy to see that the optimal objective value for this problem is 3, as any three of the items {3, 4, 5, 6} may be chosen, but not all four. (We do not even need to exhibit an optimal solution; the objective value suffices.) Thus α_7 = 3 − 3 = 0 and the inequality remains unchanged: x_3 + x_4 + x_5 + x_6 ≤ 3.
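Since P lives in {0, 1}^7, the whole sequential lifting procedure can be verified by brute-force enumeration. The following sketch (plain Python; coordinates are 0-indexed, so x_7 is index 6) checks the validity of the starting inequality and computes the lifting coefficients for coordinates 7, 2, 1 in that order:

```python
from itertools import product

# Knapsack data from Problem 4.1: P = conv{ x in {0,1}^7 : sum(w_i x_i) <= 19 }
w = [11, 6, 6, 5, 5, 4, 1]          # weights for x_1, ..., x_7 (0-indexed below)
P = [x for x in product((0, 1), repeat=7)
     if sum(wi * xi for wi, xi in zip(w, x)) <= 19]

# Start from the inequality x_3 + x_4 + x_5 + x_6 <= 3 (indices 2..5).
coeff = [0, 0, 1, 1, 1, 1, 0]
rhs = 3
assert all(sum(c * xi for c, xi in zip(coeff, x)) <= rhs for x in P)  # part a)

# Sequentially lift coordinates 7, 2, 1 (indices 6, 1, 0):
# alpha_j = rhs - max{ sum(coeff_i x_i) : x in P, x_j = 1 }.
for j in (6, 1, 0):
    best = max(sum(c * xi for c, xi in zip(coeff, x)) for x in P if x[j] == 1)
    coeff[j] = rhs - best

print(coeff, rhs)   # -> [2, 1, 1, 1, 1, 1, 0] 3
```

The final coefficient vector encodes the lifted inequality 2x_1 + x_2 + x_3 + x_4 + x_5 + x_6 ≤ 3 derived below.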

Next, we lift the second coordinate into the inequality x_3 + x_4 + x_5 + x_6 ≤ 3, i.e., we determine the maximum α_2 such that α_2·x_2 + x_3 + x_4 + x_5 + x_6 ≤ 3 is valid for P. With the same arguments as above, this leads to the following optimization problem:

    max  x_3 + x_4 + x_5 + x_6
    s.t. 11x_1 + 6x_3 + 5x_4 + 5x_5 + 4x_6 + x_7 ≤ 19 − 6 = 13,
         x ∈ {0, 1}^7.

The optimal objective value is 2, hence α_2 = 3 − 2 = 1 and the lifted inequality becomes x_2 + x_3 + x_4 + x_5 + x_6 ≤ 3.

That leaves the first coordinate to lift into the inequality, so we determine the maximum α_1 such that α_1·x_1 + x_2 + x_3 + x_4 + x_5 + x_6 ≤ 3 is valid for P. This yields the optimization problem

    max  x_2 + x_3 + x_4 + x_5 + x_6
    s.t. 6x_2 + 6x_3 + 5x_4 + 5x_5 + 4x_6 + x_7 ≤ 19 − 11 = 8,
         x ∈ {0, 1}^7,

which has an optimal objective value of 1. Therefore α_1 = 3 − 1 = 2 and the lifted inequality becomes

    2x_1 + x_2 + x_3 + x_4 + x_5 + x_6 ≤ 3.

Tutorial Problems

Problem 4.2 (Inapproximability of TSP)
Show that there does not exist a k-approximation algorithm for TSP for any constant factor k ≥ 1, unless P = NP.

Answer to Problem 4.2
As in the lecture, we consider an instance G = (V, E) of the Hamilton Circuit problem, i.e., we have to decide whether G is Hamiltonian or not. Assume there is a k-approximation algorithm for TSP for some fixed constant k ≥ 1. We define an instance of TSP on the complete graph K_n (with V = [n]) by assigning the following weights to the edges of K_n:

    c(e) := 1, if e ∈ E;    c(e) := kn + 1, otherwise.

Applying the k-approximation algorithm to this instance yields a solution with objective value z_approx such that z_approx ≤ k·z* (where z* is the optimal objective value). If G is Hamiltonian, there is a tour with objective value n in our instance (and this is optimal, because no edge has weight less than 1), so our approximation algorithm returns a tour of length at most kn. Such a tour certainly cannot use any edge of weight kn + 1, thus all edges it uses are present in G already.
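The weight assignment of this reduction is easy to sketch in code. In the demo below (function names are mine; no k-approximation for TSP is known, so a brute-force exact solver, which is in particular a k-approximation for every k ≥ 1, stands in for the hypothetical oracle) the decision rule is exactly the one from the argument: G is Hamiltonian if and only if the returned tour has length at most k·n.

```python
from itertools import combinations, permutations

def reduction_weights(n, edges, k):
    """Weights of the reduction: edges of G cost 1, all other edges cost k*n + 1."""
    E = {frozenset(e) for e in edges}
    return {frozenset(e): 1 if frozenset(e) in E else k * n + 1
            for e in combinations(range(n), 2)}

def decide_hamiltonian(n, edges, k, approx_tsp):
    """G is Hamiltonian iff a k-approximate tour on the weighted K_n
    has length at most k*n (see the argument above)."""
    cost = reduction_weights(n, edges, k)
    tour = approx_tsp(n, cost)
    length = sum(cost[frozenset(e)] for e in zip(tour, tour[1:] + tour[:1]))
    return length <= k * n

def exact_tsp(n, cost):
    """Stand-in oracle: exact brute force over all tours anchored at node 0."""
    tours = ([0] + list(p) for p in permutations(range(1, n)))
    return min(tours,
               key=lambda t: sum(cost[frozenset(e)] for e in zip(t, t[1:] + t[:1])))

print(decide_hamiltonian(4, [(0, 1), (1, 2), (2, 3), (3, 0)], k=2, approx_tsp=exact_tsp))  # True: C_4
print(decide_hamiltonian(4, [(0, 1), (0, 2), (0, 3)], k=2, approx_tsp=exact_tsp))          # False: a star
```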

The converse is also true: if the approximation algorithm returns a tour of length strictly greater than kn, then z* > n (otherwise the algorithm would have to return a solution of length at most k·z* ≤ kn), and the original graph G is not Hamiltonian. Hence, if there were a k-approximation for TSP, we could use it to construct a polynomial-time algorithm for Hamilton Circuit, effectively proving P = NP.

Problem 4.3
In the lecture you have proved that the minimum spanning tree heuristic (MST heuristic) is a 2-approximation for Metric TSP. Give an example that shows that this bound is asymptotically tight, i.e., construct a family of TSP instances, each with an MST solution, such that the approximation ratio of these solutions converges to 2 as the number of vertices grows.

Answer to Problem 4.3
Consider a 2×n grid in the Euclidean plane R^2 with edge length 1, see Figure 1. An optimal tour (as shown in the figure) has length 2n (a shorter tour that visits all nodes is not possible). On the other hand, consider the spanning tree and corresponding Euler tour depicted in Figure 2. Starting at the upper right vertex and following the Euler tour downwards results in the MST solution depicted in Figure 3. For the 2×n grid with Euclidean distances, this yields a TSP tour of length

    3·(n − 1) + 1 + (n − 2)·√2 > 4n − 5.

Thus, the approximation ratio of this solution is at least

    (4n − 5) / (2n) = 2 − 5/(2n) → 2    for n → ∞.

Figure 1: A TSP instance where the MST heuristic has approximation ratio 2, together with an optimal tour.

Figure 2: A TSP instance with MST and Euler tour.

Problem 4.4 (Nearest Neighbour Heuristic)
In this exercise we will look at two common heuristics for the construction of a traveling salesman tour, the Nearest Neighbour (NN) and the Nearest Insert (NI) algorithm. Throughout the exercise, assume we have an instance of Metric TSP given by a complete graph K_n = ([n], E) on n ∈ N nodes with edge weights c ≥ 0.
The Nearest Neighbour heuristic constructs a feasible tour as follows:
(1) Randomly choose a start node and start with the tour that consists of just that node.

Figure 3: A TSP instance with an MST heuristic solution, starting at the upper right vertex.

(2) Repeat the following step until all nodes are contained in the tour: among all nodes not yet in the tour, choose one that is closest to the last inserted node and append it to the tour.

The Nearest Insert heuristic constructs a feasible tour as follows:
(1) Randomly choose a start node, determine the node closest to it, and start with the tour that consists of just these two nodes.
(2) Repeat the following steps until all nodes are contained in the tour: among all nodes not yet in the tour, choose one that is closest to the set of nodes already contained in the tour. Find an edge of the tour where the new node can be inserted at minimum cost and insert it there.

a) Show that Nearest Neighbour does not yield a k-approximation for any k ≥ 1.

b) Show that Nearest Insert is a 2-approximation for Metric TSP.
Hint: The algorithm chooses the new nodes in the same order that Prim's algorithm would use. Try to bound the cost of each insertion by the cost of adding the corresponding new edge in Prim's spanning tree construction.

Answer to Problem 4.4

a) Let p ∈ N and consider the complete graph G_p = (V_p, E_p) with |V_p| = 2·(8·2^p − 3) nodes arranged on a subgrid of Z^2 in the following way (see also Figure 4):

    V_p = { (x, 0) : x = 1, ..., 8·2^p − 3 } ∪ { (x, 1) : x = 1, ..., 8·2^p − 3 }.

Figure 4: The graph G_0 and a partial nearest neighbour tour, starting at the lower left node l_0 and ending at the upper middle node m_0.
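Nearest Neighbour is short to implement, and its suboptimality is easy to observe numerically. The sketch below (plain Python; it uses a small ad-hoc collinear instance rather than the G_p family, whose bad tours rely on specific tie-breaking) starts NN at x = 1.1, which strands the point at x = −3 and forces an expensive backtrack:

```python
import math
from itertools import permutations

def nearest_neighbour_tour(points, start=0):
    """Greedy NN: repeatedly append the unvisited point closest to the last one."""
    dist = lambda a, b: math.dist(points[a], points[b])
    tour, todo = [start], set(range(len(points))) - {start}
    while todo:
        nxt = min(todo, key=lambda v: dist(tour[-1], v))
        tour.append(nxt)
        todo.remove(nxt)
    return tour

def tour_length(points, tour):
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

# Five collinear points; every optimal tour sweeps the line once (length 18).
pts = [(0, 0), (1.1, 0), (2, 0), (6, 0), (-3, 0)]
nn = tour_length(pts, nearest_neighbour_tour(pts, start=1))
opt = min(tour_length(pts, [0] + list(p)) for p in permutations(range(1, 5)))
print(round(nn, 6), round(opt, 6))   # 19.8 18.0
```

Here NN visits 1.1 → 2 → 0 → −3 → 6 and closes the tour, giving length 19.8 against the optimum of 18.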

We denote the lower left node by l_k and the upper middle vertex by m_k. We will now inductively construct a series of graphs G_0, G_1, ... with a partial nearest neighbour tour similar to the one shown for G_0, starting in the lower left node l_p and ending in the upper middle node m_p. The length of such a tour will be exactly 2^p·(12 + 4p) − 3, while the optimal tour has length |V_p| = 2·(8·2^p − 3). This means the approximation ratio of nearest neighbour is not better than

    (2^p·(12 + 4p) − 3) / (2·(8·2^p − 3)) = ((12 + 4p) − 3·2^(−p)) / (16 − 6·2^(−p)) → ∞    for p → ∞,

thus for every assumed approximation factor k there is a sufficiently large p that contradicts the assumption. (Note that the partial nearest neighbour tour exhibited here is of course not uniquely determined. However, we may make it unique by adjusting the edge lengths a little.)

We prove the claim on the tour length by induction. For p = 0, the tour depicted above has length 9, which is exactly 1·(12 + 0) − 3, so the claim is true for p = 0. Now consider some p ≥ 1. The graph G_p consists of two copies of G_{p−1} (denoted by G′_{p−1} and G″_{p−1}) and 6 extra nodes arranged as indicated in Figure 5. The length of the partial nearest neighbour tour depicted there is twice the length of the partial tours in the smaller graphs G_{p−1}, plus 5 edges of length 1 each, plus two longer edges of length 4·2^(p−1) − 1 each. In total, we get a length of

    2·(2^(p−1)·(12 + 4(p−1)) − 3) + 5 + 2·(4·2^(p−1) − 1)
      = 2^p·(12 + 4p − 4) − 6 + 5 + 4·2^p − 2
      = 2^p·(12 + 4p) − 3.

Figure 5: The graph G_p and a partial nearest neighbour tour; the labels m′_{p−1}, m_p, m″_{p−1} and l_p = l′_{p−1}, l″_{p−1} mark the distinguished nodes of the two copies.

Remark: The idea for this construction is due to Stefan Hougardy and Mirko Wilde. More details can be found in their paper "On the Nearest Neighbor Rule for the Metric Traveling Salesman Problem", available at http://arxiv.org/abs/1401.2071.

b) Consider Prim's algorithm for the determination of a minimum spanning tree and let the algorithm start with the same start node that NI is using.
Prim's algorithm would then determine the closest node and add it to the spanning tree, just as NI does, so the first two nodes chosen are the same for both algorithms. In each of the following steps, Prim's algorithm determines a node that is closest to the set of nodes already contained in the partial tree but is not itself contained in the tree, so the chosen node is again the same for NI and Prim. We conclude that both algorithms may be assumed to add the nodes in the same order if they use the same start node (and break ties consistently). Now consider the j-th step of the NI algorithm and denote the node added in that step by v_j. Adding this node to the tour incurs an additional cost of tourcost(v_j). Furthermore, let t be the node where the minimum distance of v_j to the set of nodes already in the tour is attained,

thus {t, v_j} is the edge that is added to the partial spanning tree by Prim's algorithm in step j. Finally, let u, w denote the two nodes already in the tour such that v_j will be inserted between u and w. The cost of adding v_j is then

    tourcost(v_j) = c(u, v_j) + c(v_j, w) − c(u, w).

As the node t is already contained in the partial tour in step j, there is some node t′ that is a neighbour of t in the partial tour. As v_j was not inserted between t and t′, we know that

    tourcost(v_j) ≤ c(t, v_j) + c(v_j, t′) − c(t, t′).

Employing the triangle inequality c(v_j, t′) ≤ c(t, t′) + c(t, v_j), we conclude

    tourcost(v_j) ≤ c(t, v_j) + c(v_j, t′) − c(t, t′) ≤ c(t, v_j) + c(t, v_j) = 2·c(t, v_j),

thus the cost of inserting v_j into the partial tour is bounded by twice the cost of adding the new edge {t, v_j} to the partial spanning tree in each step. Summing over all steps yields c(τ) ≤ 2·c(T), where τ is the NI tour constructed by the algorithm and T is a minimum spanning tree. As the optimal tour contains a spanning tree, the cost of T is less than or equal to that of an optimal tour τ_OPT, hence

    c(τ) ≤ 2·c(T) ≤ 2·c(τ_OPT),

proving the desired approximation ratio of 2.
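The bound c(τ) ≤ 2·c(T) can also be checked empirically. Below is a minimal sketch (plain Python on random Euclidean points; the function names are mine) that runs Nearest Insert exactly as described in the exercise, computes the MST weight with Prim's algorithm, and verifies the inequality:

```python
import math
import random

def nearest_insert_tour(points):
    """Nearest Insert: grow a tour, always inserting the node closest to the
    current tour nodes (the same order Prim uses) at the cheapest position."""
    d = lambda a, b: math.dist(points[a], points[b])
    n = len(points)
    start = 0
    second = min((v for v in range(n) if v != start), key=lambda v: d(start, v))
    tour = [start, second]
    outside = set(range(n)) - set(tour)
    while outside:
        v = min(outside, key=lambda u: min(d(u, t) for t in tour))
        i = min(range(len(tour)),
                key=lambda i: d(tour[i], v) + d(v, tour[(i + 1) % len(tour)])
                              - d(tour[i], tour[(i + 1) % len(tour)]))
        tour.insert(i + 1, v)
        outside.remove(v)
    return tour

def prim_mst_cost(points):
    """Prim's algorithm, O(n^2), returning the weight of a minimum spanning tree."""
    d = lambda a, b: math.dist(points[a], points[b])
    best = {v: d(0, v) for v in range(1, len(points))}
    cost = 0.0
    while best:
        v = min(best, key=best.get)
        cost += best.pop(v)
        for u in best:
            best[u] = min(best[u], d(u, v))
    return cost

random.seed(1)
pts = [(random.random(), random.random()) for _ in range(40)]
tour = nearest_insert_tour(pts)
length = sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % 40]]) for i in range(40))
print(length <= 2 * prim_mst_cost(pts))   # True: the bound from part b)
```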