CS 473: Algorithms
Ruta Mehta
University of Illinois, Urbana-Champaign
Spring 2018

CS 473: Algorithms, Spring 2018
LP Duality
Lecture 20, April 3, 2018
Some of the slides are courtesy of Prof. Chekuri.

Part I: Recall

Feasible Solutions and Lower Bounds

Consider the program

  maximize    4x_1 + 2x_2
  subject to   x_1 + 3x_2 ≤ 5
              2x_1 - 4x_2 ≤ 10
               x_1 +  x_2 ≤ 7
               x_1        ≤ 5

1. (2, 0) is also feasible, and gives a better lower bound of 8.
2. How good is 8 when compared with the optimum value σ?

Obtaining Upper Bounds

1. Let us multiply the first constraint by 2 and the second by 1, and add them:

     2( x_1 + 3x_2) ≤ 2(5)
   + 1(2x_1 - 4x_2) ≤ 1(10)
   ---------------------------
     4x_1 + 2x_2    ≤ 20

2. Thus, 20 is an upper bound on the optimum value!

Generalizing...

1. Multiply the first constraint by y_1, the second by y_2, the third by y_3 and the fourth by y_4 (all of y_1, y_2, y_3, y_4 nonnegative) and add:

     y_1( x_1 + 3x_2) ≤ y_1(5)
   + y_2(2x_1 - 4x_2) ≤ y_2(10)
   + y_3( x_1 +  x_2) ≤ y_3(7)
   + y_4( x_1       ) ≤ y_4(5)
   -------------------------------------------------------------------------
     (y_1 + 2y_2 + y_3 + y_4)x_1 + (3y_1 - 4y_2 + y_3)x_2 ≤ 5y_1 + 10y_2 + 7y_3 + 5y_4

2. So 5y_1 + 10y_2 + 7y_3 + 5y_4 is an upper bound, provided the coefficients of the x_i are the same as in the objective function, i.e.,
     y_1 + 2y_2 + y_3 + y_4 = 4
     3y_1 - 4y_2 + y_3 = 2
3. The best upper bound is obtained when 5y_1 + 10y_2 + 7y_3 + 5y_4 is minimized!

Dual LP: Example

Thus, the optimum value of the program

  maximize    4x_1 + 2x_2
  subject to   x_1 + 3x_2 ≤ 5
              2x_1 - 4x_2 ≤ 10
               x_1 +  x_2 ≤ 7
               x_1        ≤ 5

is upper bounded by the optimal value of the program

  minimize    5y_1 + 10y_2 + 7y_3 + 5y_4
  subject to   y_1 + 2y_2 + y_3 + y_4 = 4
              3y_1 - 4y_2 + y_3 = 2
               y_1, y_2, y_3, y_4 ≥ 0
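To see this concretely, here is a minimal numerical check, assuming SciPy is installed (the solver call is an illustration, not part of the slides). Both programs turn out to have optimal value 20, so the upper bound of 20 derived above is in fact tight for this example.

```python
# Solve the primal and the dual of the running example with scipy's linprog.
# linprog minimizes, so the primal objective is negated.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 3.0],
              [2.0, -4.0],
              [1.0, 1.0],
              [1.0, 0.0]])
b = np.array([5.0, 10.0, 7.0, 5.0])
c = np.array([4.0, 2.0])

# Primal: max c.x  subject to  Ax <= b,  x free.
primal = linprog(-c, A_ub=A, b_ub=b, bounds=[(None, None)] * 2)

# Dual: min y.b  subject to  yA = c,  y >= 0.
dual = linprog(b, A_eq=A.T, b_eq=c, bounds=[(0, None)] * 4)

print(-primal.fun, dual.fun)   # both 20.0, as weak and strong duality predict
```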

Dual Linear Program

Given a linear program Π in canonical form

  maximize    Σ_{j=1}^{d} c_j x_j
  subject to  Σ_{j=1}^{d} a_{ij} x_j ≤ b_i    for i = 1, 2, ..., n

the dual Dual(Π) is given by

  minimize    Σ_{i=1}^{n} b_i y_i
  subject to  Σ_{i=1}^{n} y_i a_{ij} = c_j    for j = 1, 2, ..., d
              y_i ≥ 0                          for i = 1, 2, ..., n

Proposition: Dual(Dual(Π)) is equivalent to Π.
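This dualization is mechanical, as the following small helper illustrates (a sketch, assuming NumPy; the name dual_of and the return convention are illustrative, not from the slides).

```python
# Build the data of Dual(Pi) for the canonical form above:
#   max c.x s.t. Ax <= b   becomes   min b.y s.t. (A^T)y = c, y >= 0.
import numpy as np

def dual_of(A, b, c):
    """Return (dual_objective, A_eq, b_eq): minimize dual_objective . y
    subject to A_eq @ y = b_eq and y >= 0."""
    A = np.asarray(A, dtype=float)
    return np.asarray(b, dtype=float), A.T, np.asarray(c, dtype=float)

# For the running example this reproduces exactly the dual written above.
obj, A_eq, b_eq = dual_of([[1, 3], [2, -4], [1, 1], [1, 0]], [5, 10, 7, 5], [4, 2])
print(obj, b_eq)   # [ 5. 10.  7.  5.] [4. 2.]
```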

Duality Theorems

Theorem (Weak Duality)
If x is a feasible solution to Π and y is a feasible solution to Dual(Π), then c · x ≤ y · b.

Theorem (Strong Duality)
If x* is an optimal solution to Π and y* is an optimal solution to Dual(Π), then c · x* = y* · b.

Many applications! For example, the Maxflow-Mincut theorem can be deduced from duality.

Weak Duality

Theorem (Weak Duality)
If x is a feasible solution to Π and y is a feasible solution to Dual(Π), then c · x ≤ y · b.

We already saw the proof in the way we derived the dual, but let us do it again formally.

Proof.
Since y is feasible in Dual(Π), we have yA = c.
Therefore c · x = (yA)x = y(Ax).
Since x is feasible in Π, Ax ≤ b, and since y ≥ 0, it follows that c · x = y(Ax) ≤ y · b.
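As a quick sanity check on the running example (assuming NumPy): x = (2, 0) from the lower-bound slide is primal-feasible with value 8, the multipliers (2, 1, 0, 0) corresponding to the upper-bound derivation are dual-feasible with value 20, and indeed 8 ≤ 20.

```python
# Verify weak duality numerically for one primal-feasible and one dual-feasible point.
import numpy as np

A = np.array([[1, 3], [2, -4], [1, 1], [1, 0]], dtype=float)
b = np.array([5, 10, 7, 5], dtype=float)
c = np.array([4, 2], dtype=float)

x = np.array([2, 0], dtype=float)            # feasible in Pi, value 8
y = np.array([2, 1, 0, 0], dtype=float)      # feasible in Dual(Pi), value 20

assert np.all(A @ x <= b)                            # x feasible
assert np.allclose(y @ A, c) and np.all(y >= 0)      # y feasible
print(c @ x, y @ b)                                  # 8.0 20.0
```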

Strong Duality via Complementary Slackness

  Primal Π:  maximize c · x  subject to  Ax ≤ b
  Dual:      minimize y · b  subject to  yA = c,  y ≥ 0

Definition (Complementary Slackness)
x* feasible in Π and y* feasible in Dual(Π) satisfy complementary slackness if, for every i = 1, ..., n,
  y*_i > 0  ⟹  (Ax*)_i = b_i.

Theorem
If (x*, y*) satisfies complementary slackness, then strong duality holds for this pair, i.e., c · x* = y* · b.

Q: Why is complementary slackness satisfied at the optimum?
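Here is a numerical illustration on the running example, assuming NumPy. The optimal pair used below, x* = (5, 0) and y* = (2, 1, 0, 0), was computed for this example and is not printed on the slides: the third constraint is slack at x*, y*_3 = 0, and both objective values equal 20.

```python
# Check complementary slackness and the resulting equality of objective values.
import numpy as np

A = np.array([[1, 3], [2, -4], [1, 1], [1, 0]], dtype=float)
b = np.array([5, 10, 7, 5], dtype=float)
c = np.array([4, 2], dtype=float)

x_star = np.array([5, 0], dtype=float)
y_star = np.array([2, 1, 0, 0], dtype=float)

tight = np.isclose(A @ x_star, b)      # constraints that hold with equality at x*
assert np.all(y_star[~tight] == 0)     # y*_i > 0 only where (Ax*)_i = b_i
print(c @ x_star, y_star @ b)          # 20.0 20.0: strong duality for this pair
```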

Complementary Slackness: Geometric View

  Primal Π:  maximize c · x  subject to  Ax ≤ b
  Dual:      minimize y · b  subject to  yA = c,  y ≥ 0
  Complementary slackness: for i = 1, ..., n,  y_i > 0 ⟹ (Ax)_i = b_i.

x*: optimum vertex. Suppose the first d inequalities are tight at x*. Then c is in the cone of the rows of Â (the tight constraints), as argued below.

Optimality implies Complementary Slackness

x*: optimum vertex. Suppose the first d inequalities are tight at x*, so that Ax ≤ b splits into Âx* = b̂ and Ãx* < b̃.

Suppose, for contradiction, that c is NOT in the cone of the rows of Â.
  - Then there exists a hyperplane separating c from the cone.
  - Suppose the cone is on the negative side and c is on the positive side. If d is the normal vector of the hyperplane, then formally
      Âd < 0  and  c · d > 0.
    (This separation step is also known as Farkas' Lemma.)
  - Choose a very, very tiny ε > 0 such that Ã(x* + εd) ≤ b̃. Moreover Â(x* + εd) = Âx* + εÂd < b̂.
  - Hence (x* + εd) is feasible in Π.
  - But c · (x* + εd) = c · x* + ε(c · d) > c · x*, so x* is NOT optimum. Contradiction.
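The separation step above is where Farkas' Lemma enters. For reference, one standard statement of it is the following (this exact wording, and the use of z for the direction, are not from the slides):

```latex
% Farkas' lemma, cone form, stated for reference.
\textbf{Farkas' Lemma.} Let $\hat{A} \in \mathbb{R}^{k \times d}$ and $c \in \mathbb{R}^{d}$.
Exactly one of the following holds:
\begin{enumerate}
  \item $c$ lies in the cone of the rows of $\hat{A}$, i.e., there exists
        $y \in \mathbb{R}^{k}$ with $y \ge 0$ and $y\hat{A} = c$; or
  \item there exists a direction $z \in \mathbb{R}^{d}$ with
        $\hat{A}z \le 0$ and $c \cdot z > 0$.
\end{enumerate}
```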

Proof of Strong Duality

x*: optimum vertex. First d inequalities tight at x*, so Ax ≤ b splits into Âx* = b̂ and Ãx* < b̃.
  - By the previous argument, c IS in the cone of the rows of Â.
  - Writing c = ŷÂ with ŷ ≥ 0, and setting y* equal to ŷ on the tight constraints and 0 on the rest, gives a y* feasible in Dual(Π) such that (x*, y*) satisfies complementary slackness.
  - Hence (x*, y*) satisfies strong duality: c · x* = y* · b (which in turn implies that y* is optimal in Dual(Π)).

Theorem (Strong Duality)
If x* is an optimal solution to Π and y* is an optimal solution to Dual(Π), then c · x* = y* · b.

Optimization vs Feasibility

Suppose we want to solve an LP of the form

  max c · x  subject to  Ax ≤ b.

This is an optimization problem. Can we reduce it to a decision problem?

Yes, via binary search: find the largest value σ such that the system of inequalities

  Ax ≤ b,  c · x ≥ σ

is feasible. (Feasible means that the system has at least one solution.)

Caveat: to do binary search we need to know the range of values to search over, and we would also need to worry about precision issues, etc. We skip these details for now.
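The reduction can be sketched in a few lines, assuming SciPy; here linprog with a zero objective plays the role of the feasibility oracle, and the search range and tolerance are illustrative choices, not from the slides.

```python
# Reduce LP optimization to repeated feasibility checks via binary search on sigma.
import numpy as np
from scipy.optimize import linprog

def feasible(A, b, c, sigma):
    """Is the system { Ax <= b, c.x >= sigma } nonempty?"""
    A_ub = np.vstack([A, -c])                 # c.x >= sigma  <=>  -c.x <= -sigma
    b_ub = np.append(b, -sigma)
    res = linprog(np.zeros(A.shape[1]), A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * A.shape[1])
    return res.status == 0                    # status 0: a feasible point was found

def maximize_by_search(A, b, c, lo, hi, tol=1e-6):
    """Binary search for the largest sigma that keeps the system feasible."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if feasible(A, b, c, mid) else (lo, mid)
    return lo

A = np.array([[1, 3], [2, -4], [1, 1], [1, 0]], dtype=float)
b = np.array([5, 10, 7, 5], dtype=float)
c = np.array([4, 2], dtype=float)
print(maximize_by_search(A, b, c, lo=0.0, hi=100.0))   # approximately 20
```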

Certificate for (In)feasibility

Suppose we have a system of m inequalities in n variables defined by Ax ≤ b.
  - How can we convince someone that there is a feasible solution?
  - How can we convince someone that there is no feasible solution?

Theorem of the Alternatives

Theorem
Let A ∈ R^{m×n} and b ∈ R^m. Exactly one of the following two holds:
(i) The system Ax ≤ b is feasible.
(ii) There is a y ∈ R^m such that y ≥ 0, yA = 0, and y · b < 0.

In other words, if Ax ≤ b is infeasible we can demonstrate it as follows: find a nonnegative combination of the rows of A (given by the certificate y) that yields the contradiction 0 < 0. Indeed, if some x satisfied Ax ≤ b, then

  0 = (yA)x = y(Ax) ≤ y · b < 0,

a contradiction. (Such a y is also, possibly, a direction along which the dual is unbounded!)

Farkas' Lemma is another such theorem; the two are closely related. The preceding theorem can also be used to prove strong duality.
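A tiny concrete instance, assuming NumPy (the system and certificate below are made up for illustration): the system { x ≤ 1, -x ≤ -2 }, i.e., x ≤ 1 and x ≥ 2, is infeasible, and y = (1, 1) certifies this.

```python
# Check an infeasibility certificate: y >= 0, yA = 0, y.b < 0.
import numpy as np

A = np.array([[1.0], [-1.0]])     # constraints:  x <= 1  and  -x <= -2
b = np.array([1.0, -2.0])
y = np.array([1.0, 1.0])          # the certificate

print(np.all(y >= 0), y @ A, y @ b)   # True [0.] -1.0  -- so Ax <= b is infeasible
```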

Duality for Another Canonical Form

Compactly: for the primal LP Π

  max c · x  subject to  Ax ≤ b,  x ≥ 0

the dual LP Dual(Π) is

  min y · b  subject to  yA ≥ c,  y ≥ 0.

Definition (Complementary Slackness)
x* feasible in Π and y* feasible in Dual(Π) satisfy complementary slackness if
  for i = 1, ..., n:  y*_i > 0 ⟹ (Ax*)_i = b_i
  for j = 1, ..., d:  x*_j > 0 ⟹ (y*A)_j = c_j

In General... from Jeff's notes (slide content not reproduced in this transcription).

Some Useful Duality Properties

Assume the primal LP is a maximization LP.
  - For a given LP, the dual is another LP. The variables in the dual correspond to the non-trivial primal constraints and vice versa. The dual of the dual LP gives us back the primal LP.
  - Weak and strong duality theorems hold.
  - If the primal is unbounded (the objective achieves infinity), then the dual LP is infeasible. Why? If the dual LP had a feasible solution, it would upper bound the primal LP, which is not possible.
  - Primal and dual optimum solutions satisfy the complementary slackness conditions (discussed above).

Part II: Examples of Duality

Network flow

s-t flow in a directed graph G = (V, E) with capacities c. Assume for simplicity that there are no incoming edges into s.

  max  Σ_{(s,v) ∈ E} x(s, v)
  s.t. Σ_{(u,v) ∈ E} x(u, v) - Σ_{(v,w) ∈ E} x(v, w) = 0   for all v ∈ V \ {s, t}
       x(u, v) ≤ c(u, v)                                    for all (u, v) ∈ E
       x(u, v) ≥ 0                                          for all (u, v) ∈ E

Network flow: Equivalent formulation

Directed graph G = (V, E), with capacities on the edges. Add a t-to-s edge with infinite capacity, and now maximize the flow on this edge.

  max  x(t, s)
  s.t. Σ_{(u,v) ∈ E} x(u, v) - Σ_{(v,w) ∈ E} x(v, w) = 0   for all v ∈ V
       x(u, v) ≤ c(u, v)                                    for all (u, v) ∈ E \ {(t, s)}
       x(u, v) ≥ 0                                          for all (u, v) ∈ E
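This formulation is easy to hand to an LP solver. Here is a small sketch, assuming SciPy; the graph and its capacities are made up for illustration and are not from the slides.

```python
# Max flow as an LP: add an uncapacitated t->s edge, enforce conservation at
# every node, and maximize the flow on the t->s edge.
import numpy as np
from scipy.optimize import linprog

nodes = ["s", "a", "b", "t"]
edges = [("s", "a", 3), ("s", "b", 2), ("a", "b", 1),
         ("a", "t", 2), ("b", "t", 3), ("t", "s", None)]   # t->s uncapacitated

# Conservation at every node v: (flow into v) - (flow out of v) = 0.
A_eq = np.zeros((len(nodes), len(edges)))
for j, (u, v, _) in enumerate(edges):
    A_eq[nodes.index(v), j] += 1.0    # edge (u, v) enters v
    A_eq[nodes.index(u), j] -= 1.0    # edge (u, v) leaves u
b_eq = np.zeros(len(nodes))

bounds = [(0, cap) for (_, _, cap) in edges]   # 0 <= x_e <= c_e (None = no cap)
c = np.zeros(len(edges)); c[-1] = -1.0         # maximize x(t, s)

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(-res.fun)   # 5.0, the maximum s-t flow value of this toy graph
```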

Dual of Network Flow

Part III: Integer Linear Programming

Integer Linear Programming

Problem: Find a vector x ∈ Z^d (integer valued) that

  maximizes   Σ_{j=1}^{d} c_j x_j
  subject to  Σ_{j=1}^{d} a_{ij} x_j ≤ b_i    for i = 1, ..., n

Input: matrix A = (a_{ij}) ∈ R^{n×d}, column vector b = (b_i) ∈ R^n, and row vector c = (c_j) ∈ R^d.

Factory Example

  maximize    x_1 + 6x_2
  subject to  x_1 ≤ 200
              x_2 ≤ 300
              x_1 + x_2 ≤ 400
              x_1, x_2 ≥ 0

Suppose we want x_1, x_2 to be integer valued.

Factory Example

[Figure: the feasible region, bounded by x_1 ≤ 200 and x_2 ≤ 300; not reproduced in this transcription.]

1. The feasible values of x_1 and x_2 are the integer points in the shaded region.
2. The optimization function is a line; moving the line until it just leaves the last integer point in the feasible region gives the optimal values.
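As a quick check of this picture, assuming SciPy: solving just the LP relaxation of the factory example already yields the integral optimum (100, 300) with value 1900, so for this particular instance imposing integrality does not change the optimum.

```python
# LP relaxation of the factory example; the optimum happens to be integral.
import numpy as np
from scipy.optimize import linprog

c = np.array([1.0, 6.0])
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([200.0, 300.0, 400.0])

res = linprog(-c, A_ub=A, b_ub=b, bounds=[(0, None)] * 2)   # maximize c.x
print(res.x, -res.fun)   # [100. 300.] 1900.0
```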

Integer Programming

1. Many difficult discrete optimization problems can be modeled as integer programs! Therefore integer programming is a hard problem: it is NP-hard.
2. One can relax an integer program to a linear program and approximate.
3. In practice, integer programs are solved by a variety of methods:
   - branch and bound
   - branch and cut
   - adding cutting planes
   Linear programming plays a fundamental role in all of them.

Example: Maximum Independent Set

Definition
Given an undirected graph G = (V, E), a subset of nodes S ⊆ V is an independent set (also called a stable set) if there are no edges between nodes in S. That is, if u, v ∈ S then (u, v) ∉ E.

Input: Graph G = (V, E)
Goal: Find a maximum-sized independent set in G.
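For concreteness, the standard integer program for this problem (a textbook formulation; the slide itself only states the problem) is shown below, with a 0/1 variable x_v indicating whether v is picked.

```latex
% Maximum independent set as an integer program.
\begin{align*}
\text{maximize}\quad   & \sum_{v \in V} x_v \\
\text{subject to}\quad & x_u + x_v \le 1 && \text{for every edge } (u, v) \in E, \\
                       & x_v \in \{0, 1\} && \text{for every } v \in V.
\end{align*}
```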

Example: Dominating Set

Definition
Given an undirected graph G = (V, E), a subset of nodes S ⊆ V is a dominating set if for all v ∈ V, either v ∈ S or a neighbor of v is in S.

Input: Graph G = (V, E), weights w(v) ≥ 0 for v ∈ V
Goal: Find a minimum-weight dominating set in G.

Example: s-t minimum cut and implicit constraints

Input: Graph G = (V, E), edge capacities c(e) for e ∈ E, and s, t ∈ V
Goal: Find a minimum-capacity s-t cut in G.

Linear Programs with Integer Vertices

Suppose we know that for a linear program all vertices have integer coordinates. Then solving the linear program is the same as solving the integer program. We know how to solve linear programs efficiently (in polynomial time), and hence we get an integer solution for free!

Luck or structure?
1. The linear program for flows with integer capacities has integer vertices.
2. The linear program for matchings in bipartite graphs has integer vertices.
3. A more complicated linear program for matchings in general graphs has integer vertices.
All of the above problems can hence be solved efficiently.

Linear Programs with Integer Vertices

Meta-Theorem: A combinatorial optimization problem can be solved efficiently if and only if there is a linear program for the problem with integer vertices.

This is a consequence of the Ellipsoid method for solving linear programs. In a sense, linear programming and other geometric generalizations such as convex programming are the most general problems that we can solve efficiently.

Summary

1. Linear programming is a useful and powerful modeling tool.
2. LPs can be solved in polynomial time. Practical solvers are available, both commercial and open source. Whether there is a strongly polynomial time algorithm is a major open problem.
3. Geometry and linear algebra are important for understanding the structure of LPs and for algorithm design. Vertex solutions imply that LPs have polynomial-sized optimum solutions; this implies that LP is in NP.
4. Duality is a critical tool in the theory of linear programming. Duality implies that linear programming is in co-NP. Do you see why?
5. Integer programming is NP-complete. LP-based techniques are critical in heuristically solving integer programs.