College of Computer & Information Science Fall 2007 Northeastern University 14 September 2007


CS G399: Algorithmic Power Tools I
Scribe: Eric Robinson

Lecture Outline:
  - Linear Programming: Vertex Definitions
  - Linear Programming: Optimality at a Vertex
  - LP: Duality
  - LP: Duality: Weak Duality
  - LP: Duality: Set Cover

This lecture gives an introduction to the geometry of linear programming. We discuss three equivalent definitions of a vertex of a polytope, and then show that every feasible, bounded LP has an optimal solution at a vertex of the underlying polytope. We then introduce the important concept of LP duality and present an alternative analysis of the greedy set cover algorithm using the idea of dual fitting. References for the material covered in this lecture are [Goe94, Kar91, Vaz03].

1 Linear Programming

Linear programming was described in the previous class. Recall the standard form:

    min  c^T x
    s.t. Ax ≥ b
         x ≥ 0

1.1 Vertex Definitions

As discussed in the previous class, any of three equivalent definitions of a vertex x of a convex polytope P can be used:

1. There is no y ≠ 0 such that both x + y ∈ P and x − y ∈ P.
2. There are no y, z ∈ P with y ≠ z and no α ∈ (0, 1) such that αy + (1 − α)z = x.
3. There are n linearly independent constraints that are tight (hold with equality) at x.

The proof that definitions 1 and 2 are equivalent is left to the reader. Here we prove that definitions 1 and 3 are equivalent.
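As a quick sanity check of the standard form and of definition 3, the sketch below solves a small LP and then verifies that the optimum returned sits at a vertex, i.e. that n linearly independent constraints are tight there. The instance is hypothetical (not from the notes), and scipy is assumed to be available; the nonnegativity constraints are folded into A so the rank test can see them.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical instance of the standard form  min c^T x  s.t.  Ax >= b, x >= 0.
A = np.array([[1.0, 1.0],   # x1 + x2 >= 2
              [1.0, 0.0],   # x1 >= 0
              [0.0, 1.0]])  # x2 >= 0
b = np.array([2.0, 0.0, 0.0])
c = np.array([1.0, 2.0])

# linprog expects <= rows, so negate the >= system; bounds are free because
# nonnegativity is already encoded as rows of A.
res = linprog(c, A_ub=-A, b_ub=-b, bounds=[(None, None)] * 2, method="highs")
x = res.x

# Definition 3: x is a vertex iff n linearly independent constraints are tight.
tight = np.isclose(A @ x, b)
assert np.linalg.matrix_rank(A[tight]) == len(x)  # the optimum is at a vertex
print(x, res.fun)  # optimum (2, 0) with value 2
```

Here the tight constraints at the optimum are x1 + x2 = 2 and x2 = 0, whose rows have rank n = 2.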

1.1.1 1 implies 3

Suppose definition 3 fails. Let A′ be the submatrix corresponding to the tight inequalities and b′ the corresponding right-hand sides, so that A′x = b′; let A″ and b″ denote the remaining inequalities, for which A″x > b″. Since definition 3 fails, the rank of A′ is strictly less than n, which implies there exists y ≠ 0 such that A′y = 0. Consider x + λy: A′(x + λy) = A′x + λA′y = b′, so the tight constraints stay tight for every λ. Now consider any one of the non-tight inequalities j, with A″_j x > b″_j. Applying λ gives A″_j(x + λy) = A″_j x + λ A″_j y. Because inequality j holds strictly at x, there is some λ_j > 0 such that both x + λ_j y and x − λ_j y satisfy it (in one direction we can move only a short distance before the inequality becomes tight; in the other, infinitely far). Finally, select λ = min_j λ_j. Because the associated constraints are not tight, λ cannot be zero. Then x + λy ∈ P and x − λy ∈ P with λy ≠ 0, which directly contradicts definition 1.

1.1.2 3 implies 1

Let A′ denote the submatrix of tight inequalities, as above; by definition 3, the rank of A′ is n. Suppose definition 1 fails: there is y ≠ 0 such that x + y ∈ P and x − y ∈ P. Then A′(x + y) ≥ b′ and A′(x − y) ≥ b′, and since A′x = b′, this forces A′y = 0. But then the rank of A′ is less than n, contradicting the assertion made earlier.

1.2 Optimality at a Vertex

The claim was made in the previous class that an optimal solution must occur at a vertex. More precisely, if the linear program is feasible and bounded, then there exists v ∈ Vertices(P) such that c^T v ≤ c^T x for all x ∈ P. Besides stating that an optimal solution must occur at a vertex, this also shows that solving the system is decidable, as one can simply check all (finitely many) vertices. A proof follows.

1.2.1 Proof

The proof proceeds by taking a non-vertex point and exhibiting an at-least-as-good solution with at least one more tight constraint. Applying this step iteratively makes progress towards a vertex; since there are n + m constraints in total, at most n + m iterations are needed to reach an optimal vertex.

Suppose x ∈ P and x ∉ Vertices(P). Then there is y ≠ 0 such that x + y ∈ P and x − y ∈ P. Note that x + y ≥ 0 and x − y ≥ 0 together imply that x_i = 0 forces y_i = 0. Consider v = x + λy, so that c^T v = c^T x + λ c^T y. Assume without loss of generality that c^T y ≤ 0; if this were not the case, the other direction (x − λy) could simply be selected. Now let A′ and b′ give the set of tight constraints, as before. We know A′(x ± y) ≥ b′, which implies A′y = 0, and hence A′(x + λy) = b′ for all λ. Once again, we compute the values λ_j at which the non-tight constraints become tight in the direction y. There are two possibilities:

  - Some λ_j is finite: in this case, we select the smallest such λ_j value as λ. The point x + λy makes at least one additional inequality tight while keeping the previously tight inequalities tight, and since c^T y ≤ 0 its cost is no worse than that of x. We have made progress towards a vertex.

  - No λ_j is finite (movement in the direction y is unbounded): here, either c^T y < 0, in which case the LP is unbounded, or c^T y = 0, in which case we move in the direction −y instead and proceed as in the previous case; the cost function remains the same no matter which direction we move in.

1.3 Duality

One can bound any minimization linear program from below, and the best such bound is itself a maximization problem. Consider the following linear program:

    min   3x1 + 2x2 + 8x3
    s.t.  x1 −  x2 + 2x3 ≥ 5
          x1 + 2x2 + 4x3 ≥ 10
          x1, x2, x3 ≥ 0

Letting Z denote OPT, we know Z = 3x1 + 2x2 + 8x3 for some (x1, x2, x3) ∈ P. Adding the two inequalities gives 2x1 + x2 + 6x3 ≥ 15. Since x1, x2, x3 ≥ 0 and each objective coefficient is at least the corresponding coefficient on the left-hand side (3 ≥ 2, 2 ≥ 1, 8 ≥ 6), we know that Z ≥ 15. But we aren't limited to addition: we can also multiply the inequalities by nonnegative numbers before combining them. How good a bound can such combinations give? Finding the best multipliers is itself a linear program, the dual formulation D of the minimization, which for this problem is:

    max   5y1 + 10y2
    s.t.   y1 +  y2 ≤ 3
          −y1 + 2y2 ≤ 2
          2y1 + 4y2 ≤ 8
          y1, y2 ≥ 0

The theory of LP duality (sometimes referred to as the Strong Duality Theorem) says that if the primal LP P is feasible and bounded, then the optimal value of the primal LP equals the optimal value of the dual LP.

1.3.1 Weak Duality

Weak duality makes only the claim that the value of the (minimization) primal LP is at least the value of the (maximization) dual LP. Consider the primal P and its dual D:

    P: min  c^T x        D: max  b^T y
       s.t. Ax ≥ b          s.t. A^T y ≤ c
            x ≥ 0                y ≥ 0

Suppose that x* is an optimal solution to P and y* is an optimal solution to D. We need only

show that c^T x* ≥ b^T y*:

    c^T x* ≥ (A^T y*)^T x* = y*^T A x*      (since A^T y* ≤ c and x* ≥ 0)
    b^T y* ≤ (A x*)^T y* = (y*^T A x*)^T    (since A x* ≥ b and y* ≥ 0)

Noting that the last expression in each of these comparisons is identical (since the transpose of a scalar is the scalar itself) leads to the desired conclusion.

1.3.2 Set cover analysis using dual fitting

The set cover linear programming relaxation, P, can be recast into its dual, D:

    P: min  Σ_{s∈S} c_s x_s
       s.t. Σ_{s: e∈s} x_s ≥ 1   for all e ∈ U
            x_s ≥ 0              for all s ∈ S

    D: max  Σ_{e∈U} y_e
       s.t. Σ_{e∈s} y_e ≤ c_s   for all s ∈ S
            y_e ≥ 0             for all e ∈ U

Analysis. We seek to construct a dual-feasible solution whose value is close to the cost of the greedy solution. For each element e, let

    price(e) = cost(s) / (number of uncovered elements covered by s),

where s is the set with which the greedy algorithm first covers e. The cost of the greedy solution is then Σ_{e∈U} price(e). The problem is that the prices themselves need not satisfy the dual constraints: within a single set, elements covered late are charged more, since fewer uncovered elements remain to share the cost of each chosen set, so a set's elements can accumulate a total price exceeding c_s. So we uniformly scale the y_e's down: y_e = price(e)/α. The claim is that y is dual feasible if α = H_n, where n is the number of elements in U and H_n = 1 + 1/2 + · · · + 1/n.

To prove this, we need to show that Σ_{e∈s} y_e ≤ c_s for every s ∈ S. Pick any s ∈ S with k = |s| elements, and order its elements e_1, ..., e_k by the time at which greedy first covers them. Just before e_i is covered, at least k − i + 1 elements of s are still uncovered, so the set s itself would cover each of them at a price of at most c_s/(k − i + 1); the set greedy actually picks is at least as cost-effective, so price(e_i) ≤ c_s/(k − i + 1). This means that:

    y_{e_1} ≤ c_s / (k · H_n)
    y_{e_2} ≤ c_s / ((k − 1) · H_n)
    ...

Or,

    Σ_{e∈s} y_e ≤ (c_s / H_n) (1/k + 1/(k−1) + · · · + 1) = c_s H_k / H_n ≤ c_s.
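The dual-fitting argument can be checked end to end on a small example. The instance below is hypothetical (not from the notes): the sketch runs greedy set cover while recording price(e), scales the prices down by H_n, and confirms both that the scaled prices are dual feasible and that the greedy cost is within a factor H_n of the optimal cover (found by brute force).

```python
from itertools import combinations
from math import isclose

# Hypothetical set cover instance: universe U, and sets with costs.
U = {1, 2, 3, 4, 5, 6}
sets = {"a": ({1, 2, 3, 4}, 4.0),
        "b": ({1, 2, 5}, 2.0),
        "c": ({4, 5, 6}, 3.0)}

def H(k):
    """Harmonic number H_k = 1 + 1/2 + ... + 1/k."""
    return sum(1.0 / i for i in range(1, k + 1))

# Greedy: repeatedly pick the most cost-effective set, recording
# price(e) = cost(s) / (# newly covered) when e is first covered.
price, uncovered, chosen, greedy_cost = {}, set(U), [], 0.0
while uncovered:
    name, (elems, cost) = min(
        ((nm, sc) for nm, sc in sets.items() if sc[0] & uncovered),
        key=lambda kv: kv[1][1] / len(kv[1][0] & uncovered))
    new = elems & uncovered
    for e in new:
        price[e] = cost / len(new)
    chosen.append(name)
    greedy_cost += cost
    uncovered -= new

# Dual fitting: y_e = price(e) / H_n satisfies every dual constraint.
n = len(U)
y = {e: price[e] / H(n) for e in U}
assert all(sum(y[e] for e in elems) <= cost + 1e-9
           for elems, cost in sets.values())        # y is dual feasible

# Brute-force optimal cover; any dual-feasible y lower-bounds its cost.
opt = min(sum(sets[nm][1] for nm in combo)
          for r in range(1, len(sets) + 1)
          for combo in combinations(sets, r)
          if set().union(*(sets[nm][0] for nm in combo)) == U)
assert isclose(greedy_cost, H(n) * sum(y.values()))  # greedy cost = H_n * cost(y)
assert sum(y.values()) <= opt + 1e-9                 # weak duality in action
assert greedy_cost <= H(n) * opt + 1e-9              # the H_n guarantee
print(chosen, greedy_cost, opt)
```

On this instance greedy pays 9.0 against an optimum of 7.0, comfortably inside the H_6 ≈ 2.45 guarantee.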

So, from this, we can show that the greedy solution is an H_n-approximation:

    cost of greedy = H_n · cost(y)    (where cost(y) = Σ_{e∈U} y_e)
                   ≤ H_n · OPT(D)     (y is dual feasible, by the proof above)
                   ≤ H_n · OPT(P)     (weak duality)
                   ≤ H_n · OPT        (the fractional optimum is at most the integral optimum)

References

[Goe94] M. Goemans. Introduction to Linear Programming. Lecture notes, available from http://www-math.mit.edu/~goemans/, October 1994.

[Kar91] H. Karloff. Linear Programming. Birkhäuser, Boston, MA, 1991.

[Vaz03] V. Vazirani. Approximation Algorithms. Springer-Verlag, 2003.