Practice Final Exam 2: Solutions


Algorithm Design Techniques

1. The Simplex Algorithm.

(a) Take the LP

    max  x_1 + 2x_2
    s.t. 2x_1 + x_2 ≤ 3
         x_1 − x_2 ≤ 2
         x_1, x_2 ≥ 0

and write it in dictionary form:

    z   = x_1 + 2x_2
    x_3 = 3 − 2x_1 − x_2
    x_4 = 2 − x_1 + x_2

Pivot: add x_1 to the basis, remove x_3. From the x_3 row, 2x_1 = 3 − x_3 − x_2, so

    z   = (1/2)(3 − x_3 − x_2) + 2x_2       = 3/2 − (1/2)x_3 + (3/2)x_2
    x_1 = (1/2)(3 − x_3 − x_2)              = 3/2 − (1/2)x_3 − (1/2)x_2
    x_4 = 2 − (1/2)(3 − x_3 − x_2) + x_2    = 1/2 + (1/2)x_3 + (3/2)x_2

Pivot: add x_2 to the basis, remove x_1. From the x_1 row, x_2 = 3 − x_3 − 2x_1, so

    z   = 3/2 − (1/2)x_3 + (3/2)(3 − x_3 − 2x_1)    = 6 − 2x_3 − 3x_1
    x_2 = 3 − x_3 − 2x_1
    x_4 = 1/2 + (1/2)x_3 + (3/2)(3 − x_3 − 2x_1)    = 5 − x_3 − 3x_1

Every coefficient in the z row is now non-positive, so the algorithm terminates. The optimal solution is (x_1, x_2) = (0, 3) with objective value z = x_1 + 2x_2 = 6.
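As a sanity check (not part of the exam solution), the optimum of this small LP can be confirmed by brute force: a bounded two-variable LP attains its optimum at a vertex, i.e. at the intersection of two constraint boundaries. A minimal sketch:

```python
from itertools import combinations

# Each constraint is written as a*x1 + b*x2 <= rhs, including the
# non-negativity constraints -x1 <= 0 and -x2 <= 0.
constraints = [
    (2.0, 1.0, 3.0),    # 2x1 +  x2 <= 3
    (1.0, -1.0, 2.0),   #  x1 -  x2 <= 2
    (-1.0, 0.0, 0.0),   #  x1 >= 0
    (0.0, -1.0, 0.0),   #  x2 >= 0
]

def intersect(c1, c2):
    """Solve the 2x2 system where both constraints hold with equality."""
    (a, b, e), (c, d, f) = c1, c2
    det = a * d - b * c
    if abs(det) < 1e-12:        # parallel boundaries: no vertex
        return None
    return ((e * d - b * f) / det, (a * f - e * c) / det)

def feasible(pt, eps=1e-9):
    x1, x2 = pt
    return all(a * x1 + b * x2 <= rhs + eps for a, b, rhs in constraints)

# Enumerate all feasible vertices and take the one maximising x1 + 2*x2.
vertices = [p for c1, c2 in combinations(constraints, 2)
            if (p := intersect(c1, c2)) is not None and feasible(p)]
best = max(vertices, key=lambda p: p[0] + 2 * p[1])
print(best, best[0] + 2 * best[1])   # (0.0, 3.0) 6.0
```

This agrees with the simplex run above: the optimum is at the vertex where the first constraint and x_1 ≥ 0 are tight.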

(b) The dual is

    min  3y_1 + 2y_2
    s.t. 2y_1 + y_2 ≥ 1
         y_1 − y_2 ≥ 2
         y_1, y_2 ≥ 0

The top row of the final dictionary was z = 6 − 3x_1 − 2x_3. We know that the dual variables y_1, y_2 should be set equal to the (negated) coefficients of the slack variables x_3, x_4 in this row. Thus (y_1, y_2) = (2, 0). This is dual feasible: 2(2) + 0 = 4 ≥ 1 and 2 − 0 = 2 ≥ 2. It has dual value 3y_1 + 2y_2 = 6. So, by weak duality, our primal solution of value 6 from part (a) shows that this dual solution must be optimal.
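The weak-duality certificate can be checked mechanically. A small sketch (the solutions are the ones derived above; nothing new is computed):

```python
# Primal: max  x1 + 2x2  s.t. 2x1 + x2 <= 3,  x1 - x2 <= 2,  x >= 0
# Dual:   min 3y1 + 2y2  s.t. 2y1 + y2 >= 1,  y1 - y2 >= 2,  y >= 0
x = (0.0, 3.0)   # primal solution from part (a)
y = (2.0, 0.0)   # dual solution read off the final dictionary

# Primal feasibility
assert 2 * x[0] + x[1] <= 3 and x[0] - x[1] <= 2 and min(x) >= 0
# Dual feasibility
assert 2 * y[0] + y[1] >= 1 and y[0] - y[1] >= 2 and min(y) >= 0

primal_value = x[0] + 2 * x[1]
dual_value = 3 * y[0] + 2 * y[1]
# Weak duality gives primal_value <= dual_value for any feasible pair;
# equality certifies that both solutions are optimal.
print(primal_value, dual_value)   # 6.0 6.0
```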

2. Local Search.

(a) Given an undirected graph G = (V, E), the maximum cut problem is to find a set S ⊆ V such that |δ(S)| is maximised, where δ(S) is the set of edges with exactly one endpoint in S.

(b) Consider any vertex v with degree deg(v). The final cut δ(S_t) contains at least (1/2)·deg(v) of the edges incident to v; otherwise the local search algorithm would not have terminated, since we could improve the cut by moving v to the other side. Summing over all vertices (each edge is counted from both endpoints), we see that δ(S_t) contains at least half the edges in the graph. Obviously, the optimal cut δ(S*) contains at most the total number of edges in the graph. The result follows.

3. Parameterised Complexity.

(a) A problem is fixed-parameter tractable if it has an algorithm that runs in time f(k)·p(n), where n is the problem input size and k is the parameter (here, the size of the optimal solution). Here p(·) is a polynomial function but f(·) need not be.

(b) Randomly colour the vertices with colours {1, 2, ..., k}. We now search for a cycle whose vertices are coloured 1, 2, ..., k in that order. Let V_1 be the vertices coloured 1; let V_2 be the vertices coloured 2 that have an edge to some vertex in V_1; let V_3 be the vertices coloured 3 that have an edge to some vertex in V_2; and so on. Thus V_k is the set of vertices at the end of a path coloured 1, 2, ..., k. For each vertex v in V_k, do a reverse search to find all the vertices u in V_1 that begin multi-coloured paths ending at v, and check whether (u, v) is an edge. If so, we have found a multi-coloured k-cycle. This process can be done in time polynomial in the graph size. The probability that a given k-cycle is multi-coloured (in the right cyclic order) is at least (1/k)^k, so repeating this colouring experiment independently enough times (say k^k·log n times) will let us find a k-cycle with high probability in time O(f(k)·p(n)) as desired.
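The local search algorithm from 2(b) fits in a few lines. A toy implementation (not from the exam; the graph below is an assumed example):

```python
import random

def local_search_max_cut(vertices, edges, seed=0):
    """Repeatedly move a vertex across the cut while doing so strictly
    increases the cut size. Terminates because the cut size is an integer
    bounded by |E| and increases on every move."""
    rng = random.Random(seed)
    side = {v: rng.random() < 0.5 for v in vertices}

    def cut_size():
        return sum(side[u] != side[v] for u, v in edges)

    improved = True
    while improved:
        improved = False
        for v in vertices:
            before = cut_size()
            side[v] = not side[v]       # try moving v to the other side
            if cut_size() <= before:
                side[v] = not side[v]   # no improvement: undo the move
            else:
                improved = True
    return side, cut_size()

# A 4-cycle: the argument in (b) guarantees a cut with >= |E|/2 = 2 edges.
V = [0, 1, 2, 3]
E = [(0, 1), (1, 2), (2, 3), (3, 0)]
_, size = local_search_max_cut(V, E)
assert size >= len(E) / 2
```

Note that a local optimum can genuinely achieve only half the edges (e.g. splitting the 4-cycle into two adjacent pairs), so the factor 1/2 in the analysis is tight for this algorithm.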

4. Branch and Bound. The branch and bound tree is shown below. Dashed circles correspond to feasible solutions (that is, perfect matchings); dotted circles correspond to suboptimal subtrees that can be pruned away once we have already found better integral solutions. Note that by the branching rule we first explore the branch for A, finding the feasible solutions ABCD and ABDC. The latter has value 22, which allows us to prune the rest of that subtree as well as the remainder of the nodes in the subtree rooted at AB. Next we branch down AC, but this leads to no better solutions. Finally we search the branch for AD, leading to the solutions ADBC and ADCB, both of value 21. Everything remaining in the subtree of AD can now be pruned. So we have found an optimal solution, of value 21.

[Figure: the branch and bound tree, branching on the assignment for Job 1, then Job 2, then Job 3; nodes are labelled with partial assignments and bounds, and the leaves include ABCD : 23, ABDC : 22, ADBC : 21 and ADCB : 21.]
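The pruning rule described above can be sketched generically. The cost matrix below is a made-up example (the exam's actual costs are not reproduced in this transcription); the structure — branch on the machine for each job in turn, prune a partial assignment as soon as its cost cannot beat the incumbent — is what matters:

```python
# Hypothetical costs: cost[j][m] is the cost of giving job j to machine m,
# where the machines are labelled A, B, C, D.
MACHINES = "ABCD"
cost = [
    [7, 9, 6, 8],   # job 1
    [8, 7, 9, 6],   # job 2
    [9, 6, 8, 7],   # job 3
]

best_value = float("inf")
best_assignment = None

def branch(job, used, partial_cost, assignment):
    """Depth-first branch and bound over assignments of jobs to distinct
    machines, pruning any subtree whose partial cost already reaches the
    incumbent (costs are non-negative, so the bound is valid)."""
    global best_value, best_assignment
    if partial_cost >= best_value:          # bound: prune this subtree
        return
    if job == len(cost):                    # leaf: complete feasible solution
        best_value, best_assignment = partial_cost, assignment
        return
    for m in range(len(MACHINES)):
        if m not in used:
            branch(job + 1, used | {m}, partial_cost + cost[job][m],
                   assignment + MACHINES[m])

branch(0, frozenset(), 0, "")
print(best_assignment, best_value)   # CDB 18
```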

5. NP-Completeness.

(a) Given n positive integers x_1, x_2, ..., x_n, the partition problem asks whether there is a subset S ⊆ [n] such that

    ∑_{i∈S} x_i = ∑_{i∉S} x_i

(b) i. Bin Packing is in NP: we can easily check a proposed solution to confirm a YES instance. The reduction from Partition is as follows. Set k = 2 and C = (1/2)∑_i s_i, where s_i = x_i. Clearly, there is a bin packing that uses two bins if and only if there is a partition S ⊆ [n] with ∑_{i∈S} x_i = ∑_{i∉S} x_i.

ii. Bin Covering is in NP: we can easily check a proposed solution to confirm a YES instance. The reduction from Partition is as follows. Set k = 2 and R = (1/2)∑_i s_i, where s_i = x_i. Clearly, there is a bin covering that covers two bins if and only if there is a partition S ⊆ [n] with ∑_{i∈S} x_i = ∑_{i∉S} x_i.

6. Approximation Algorithms.

(a) An α-approximation algorithm for a maximisation problem P always returns, in polynomial time, a feasible solution S to any instance I of P such that the value of the optimal solution for that instance is at most an α factor greater than the value of S.

(b) We use a greedy algorithm. First observe that any item i with s_i ≥ R will be placed in its own bin in the optimal solution, so our greedy algorithm will do that as well. We may therefore assume that s_i < R for all i. The greedy algorithm simply places items in Bin 1 until it is overfull, then places items in Bin 2 until it is overfull, and so on. The greedy algorithm clearly runs in polynomial time and returns a feasible solution. Let's show it gives a factor-3 approximation guarantee. Assume that the optimal solution fills k bins, and the greedy algorithm fills ℓ bins. So
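Partition is only weakly NP-complete: the standard subset-sum dynamic program decides it in pseudo-polynomial time O(n·∑_i x_i). A short sketch of that DP (a side note, not part of the reduction above):

```python
def has_partition(xs):
    """Decide the partition problem: reachable[s] is True iff some subset
    of the items seen so far sums to s. We only need sums up to half the
    total, and iterate s downwards so each item is used at most once."""
    total = sum(xs)
    if total % 2:                     # odd total: no equal split exists
        return False
    target = total // 2
    reachable = [True] + [False] * target
    for x in xs:
        for s in range(target, x - 1, -1):
            reachable[s] = reachable[s] or reachable[s - x]
    return reachable[target]

assert has_partition([3, 1, 1, 2, 2, 1])   # 3+1+1 = 2+2+1 = 5
assert not has_partition([2, 2, 5])
```

This does not contradict the NP-completeness shown in (b): the running time is polynomial in the numeric value of the input, not in its bit length.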

∑_i s_i ≥ kR. Now since s_i < R for all i, the greedy algorithm overfills a bin by at most max_i s_i < R. Thus each filled bin has items of total size at most R + R = 2R. The greedy algorithm may also use one extra bin that is not full, that is, one bin with items of total size at most R. As every item is used, we have

    ∑_i s_i < 2Rℓ + R

But by the optimal solution we know

    ∑_i s_i ≥ kR

So kR < 2Rℓ + R, and thus

    ℓ ≥ (1/2)(k − 1) ≥ (1/3)k

provided k ≥ 3. If k < 3 then we trivially get a 2-approximation algorithm by placing all the items in one bin. So we have a 3-approximation algorithm.

[Remark. We can actually refine the analysis to show that this is a 2-approximation algorithm. To do this, let item x_i be the item that first overfills bin i. Items x_1, ..., x_ℓ are clearly in at most ℓ bins of the optimal solution. The remaining items have total size less than (ℓ + 1)R because they don't fill bins 1 to ℓ + 1 (recall one bin in greedy could be left unfull). These items thus may cover at most ℓ bins of the optimal solution. So the optimum covers at most ℓ + ℓ = 2ℓ bins.]
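The greedy algorithm from (b) is short enough to write out. A sketch, assuming "overfull" means total size at least R (the item sizes below are an assumed example):

```python
def greedy_bin_cover(sizes, R):
    """Greedy bin covering from part (b): any big item (size >= R) covers a
    bin on its own; the remaining items are poured into the current bin
    until it is overfull, at which point the bin is closed."""
    bins = [[x] for x in sizes if x >= R]          # big items: one bin each
    current, current_size = [], 0.0
    for x in (x for x in sizes if x < R):
        current.append(x)
        current_size += x
        if current_size >= R:                      # bin overfull: close it
            bins.append(current)
            current, current_size = [], 0.0
    return bins        # any leftover unfull bin is simply not counted

sizes = [0.6, 0.7, 0.4, 0.5, 0.3, 0.9, 0.2]
covered = greedy_bin_cover(sizes, R=1.0)
assert all(sum(b) >= 1.0 for b in covered)
print(len(covered))   # 3
```

On this instance the greedy covers 3 bins; the analysis above guarantees at least a third (and, by the remark, at least half) of what the optimal solution covers.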