CS6841: Advanced Algorithms IIT Madras, Spring 2016 Lecture #1: Properties of LP Optimal Solutions + Machine Scheduling February 24, 2016


Lecturer: Ravishankar Krishnaswamy    Scribe: Anand Kumar (CS15M010)

1 Linear Programming

Linear programming is a technique for optimizing a linear objective function subject to linear equality and inequality constraints. In equational form:

    minimize   c^T x
    subject to Ax = b
               x >= 0

Here A is a given m x n matrix, where m is the number of constraints and n is the number of variables, b = (b_1, b_2, ..., b_m)^T is a given m-vector, and c = (c_1, c_2, ..., c_n)^T is a given n-vector. The goal is to find an n-vector x = (x_1, x_2, ..., x_n)^T minimizing c^T x subject to the constraints Ax = b and x >= 0.

The feasible region defined by the constraints of a linear program is a polytope. A linear programming algorithm finds a point in the feasible region, if one exists, at which the objective function attains its minimum (or maximum) value.

Solutions of LPs can be categorized into three useful types.

Basic feasible solution: Suppose the rows of A are linearly independent, i.e., m <= n (otherwise we can delete redundant rows of A). A basic solution is defined as follows. Let S ⊆ [n] be such that |S| = m and the columns of A_S are linearly independent; then A_S is an m x m matrix of rank m. Define x_S = (A_S)^{-1} b, and let x agree with x_S on the indices in S and be 0 on all other indices, i.e., x_{[n]\S} = 0. All such completed x's are called basic solutions. If additionally x_S >= 0, the solution is called a basic feasible solution (bfs).

Theorem 1.1. If the polytope is feasible, then there exists a basic feasible solution.

Theorem 1.2. If the linear program has an optimal solution, then it has an optimal solution that is also a basic feasible solution.

Proof. Covered in Lecture 08.
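Theorem 1.2 suggests a (very inefficient but instructive) algorithm: enumerate every size-m column subset S, compute x_S = (A_S)^{-1} b, keep the feasible ones, and take the best. Below is a minimal sketch in Python with NumPy; the matrix A, vector b, and cost c are made-up illustration data, not from the lecture.

```python
import itertools
import numpy as np

def basic_feasible_solutions(A, b, tol=1e-9):
    """Enumerate all basic feasible solutions of {x : Ax = b, x >= 0}."""
    m, n = A.shape
    for S in itertools.combinations(range(n), m):
        A_S = A[:, S]
        if abs(np.linalg.det(A_S)) < tol:   # columns of A_S not independent
            continue
        x_S = np.linalg.solve(A_S, b)       # x_S = (A_S)^{-1} b
        if (x_S >= -tol).all():             # feasibility: x_S >= 0
            x = np.zeros(n)
            x[list(S)] = x_S                # x_{[n]\S} = 0
            yield x

A = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 2.0]])   # m = 2 constraints, n = 3 variables
b = np.array([4.0, 2.0])
c = np.array([1.0, 3.0, 2.0])

# By Theorem 1.2, the LP optimum is attained at some bfs.
best = min(basic_feasible_solutions(A, b), key=lambda x: c @ x)
print(best, c @ best)   # -> [3. 0. 1.] 5.0
```

Note the support of each bfs has size at most m, as in the definition above.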

Now we define the second and third kinds of solutions, which take a geometric viewpoint. Below, P denotes the polytope {x : Ax = b, x >= 0}.

Definition 1 (vertex/corner point): x ∈ P is called a vertex (corner point) iff there exists a cost vector c such that (for the minimization problem) c^T x < c^T x' for all x' ∈ P with x' ≠ x.

Definition 2 (extreme point): x ∈ P is called an extreme point if x cannot be written as λx_1 + (1 − λ)x_2 for x_1, x_2 ∈ P \ {x} and 0 < λ < 1.

Definition 3 (basic feasible solution): x ∈ P is a basic feasible solution if there exists S ⊆ [n] with |S| = m such that A_S has rank m, x_S = (A_S)^{-1} b >= 0, and x_{[n]\S} = 0.

Theorem 1.3. vertex ⟹ extreme ⟹ bfs ⟹ vertex, so the three notions coincide.

Proof.

1. x is a vertex ⟹ x is extreme.

We argue by contradiction. Suppose x is a vertex, witnessed by the cost vector c, but not extreme, so x = λx_1 + (1 − λ)x_2 with x_1, x_2 ∈ P \ {x} and 0 < λ < 1. Since x is the unique minimizer of c over P, we have c^T x_1 > c^T x and c^T x_2 > c^T x, hence

    c^T x = λ c^T x_1 + (1 − λ) c^T x_2 > λ c^T x + (1 − λ) c^T x = c^T x,

a contradiction.

2. x is extreme ⟹ x is a bfs.

Let S = {j : x_j > 0}, so all entries of x_S are strictly positive. We need to show that (a) |S| <= m, (b) S extends to a set of m column indices of rank m, and (c) the resulting basic solution is exactly x.

Suppose first that |S| > m. Since rank(A) = m, the columns of A_S (more than m of them) must be linearly dependent, so there exists w_S ≠ 0 with A_S w_S = 0. Extend w_S to w ∈ R^n by setting w_j = 0 for j ∉ S; then Aw = 0 and w ≠ 0.
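Part 3 of the proof (given below) is constructive: for a bfs with support in S, the cost vector that is 0 on S and 1 off S certifies it as a vertex. A small numerical sketch, reusing a made-up A and b (illustration data, not from the lecture):

```python
import numpy as np

A = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 2.0]])   # m = 2, n = 3, rank(A) = 2
b = np.array([4.0, 2.0])

# The bfs with support set S = {0, 2}: x_S = (A_S)^{-1} b, zero elsewhere.
S = [0, 2]
x = np.zeros(3)
x[S] = np.linalg.solve(A[:, S], b)       # x = (3, 0, 1)

# The certifying cost vector: c_i = 0 for i in S, c_i = 1 otherwise.
c = np.ones(3)
c[S] = 0.0                                # c = (0, 1, 0)
print(c @ x)                              # 0.0: x pays nothing

# Any other feasible point pays strictly more; e.g. the bfs with S = {0, 1}:
y = np.zeros(3)
y[[0, 1]] = np.linalg.solve(A[:, [0, 1]], b)   # y = (2, 2, 0)
print(c @ y)                              # 2.0 > 0
```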

Set x_1 = x − εw and x_2 = x + εw; then Ax_1 = Ax_2 = b (since Aw = 0), and if ε is small enough, x_1 >= 0 and x_2 >= 0 (the coordinates of x on S are strictly positive, and w vanishes off S). But then x = (1/2)x_1 + (1/2)x_2, contradicting that x is extreme. The same argument rules out linearly dependent columns of A_S even when |S| <= m. So |S| <= m and the columns of A_S are linearly independent.

If |S| < m, extend S to a set S' ⊇ S with |S'| = m and rank(A_{S'}) = m (possible since rank(A) = m). Then x_{S'} = (A_{S'})^{-1} b gives the desired basic feasible solution: x_{S'} is the unique solution of A_{S'} y = b, it agrees with x on S', and x vanishes outside S'. Hence x is a bfs.

3. x is a bfs ⟹ x is a vertex.

Let S witness that x is a bfs: |S| = m, A_S has full rank m, x_S = (A_S)^{-1} b, and x_{[n]\S} = 0. Choose c with c_i = 1 for i ∈ [n] \ S and c_i = 0 for i ∈ S. Then c^T x = 0, while c^T x' >= 0 for every x' ∈ P. We claim c^T x' > 0 for every x' ∈ P with x' ≠ x: if c^T x' = 0 then x'_{[n]\S} = 0, so A_S x'_S = b, and since this system has the unique solution x_S, we get x' = x. Hence x is the unique minimizer of c over P, i.e., a vertex. □

2 Application: Machine Scheduling

There are n jobs and m machines, and we must find an assignment of jobs to machines that minimizes the maximum load (the makespan), where p_j is the processing time of job j. This scheduling problem is NP-hard (it contains the subset-sum/partition problem). One approach is Graham's greedy list-scheduling algorithm, which is a 2-approximation; the greedy algorithm performs worst on instances such as m(m − 1) jobs of size 1 followed by one job of size m.

Makespan minimization on unrelated machines: let p_{i,j} be the processing time of job j on machine i. (Machines are identical if p_{i,j} = p_j for all i.) Let x_{i,j} be the variable for assigning job j to machine i. We minimize the makespan T using the LP relaxation:

    minimize   T
    subject to sum_{i=1}^{m} x_{i,j} = 1       for every job j
               sum_j p_{i,j} x_{i,j} <= T      for every machine i
               x_{i,j} >= 0                    for all i, j

Integrality gap: consider the instance with m machines and a single job, where every machine needs time m to process the job (p_{i,1} = m for all i). The optimal integral solution assigns the job to some machine, with makespan m, while the optimal LP solution is fractional: it splits the job into equal parts across all the machines, x_{i,1} = 1/m, giving each machine load p_j / m = 1, so T = 1. The integrality gap is therefore at least m.
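The gap instance above is easy to verify numerically. A minimal sketch (the choice m = 5 is arbitrary illustration data):

```python
import numpy as np

m = 5                        # machines; a single job with p[i] = m on every machine
p = np.full(m, float(m))

# Integral assignment: the job goes entirely to one machine.
integral_makespan = p[0]             # = m

# Fractional LP solution: split the job equally, x_i = 1/m on each machine.
x = np.full(m, 1.0 / m)
assert abs(x.sum() - 1.0) < 1e-12    # the job is fully assigned
fractional_makespan = (p * x).max()  # each machine has load m * (1/m) = 1

print(integral_makespan / fractional_makespan)   # integrality gap = m
```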

So by itself the LP is not useful for multiplicative comparison: it cannot directly be used to find a schedule of makespan cT for any constant c. To circumvent the LP gap, we guess the optimal makespan value T, so the optimization problem becomes a feasibility-checking problem LP(T):

    sum_i x_{i,j} = 1                    for every job j
    sum_j p_{i,j} x_{i,j} + s_i = T      for every machine i (slack s_i >= 0)
    x_{i,j} >= 0                         for all i, j

(Implicitly, we also discard any variable x_{i,j} with p_{i,j} > T; this is what removes the gap above, since a job can no longer be split across machines that are individually too slow for it.)

How to round this LP? Solve LP(T) and let X be an optimal solution; we may take X to be a basic feasible point (equivalently, an extreme point), with variables x_{i,j} >= 0 and s_i >= 0. The number of constraints is M = m (machines) + n (jobs), so the number of nonzero x_{i,j}'s is at most m + n; in particular at most m + n variables are truly fractional, 0 < x_{i,j} < 1.

Represent the job scheduling by a graph G(X): it has an edge (i, j) iff x_{i,j} > 0. This is a bipartite graph with machines on one side and jobs on the other.

Figure 1.1: Bipartite graph of machines and jobs.

Lemma. Any extreme point of LP(T) has at most m fractionally assigned jobs.

Proof. Let X be an extreme point of LP(T), and let k be the number of job-machine assignments in X, i.e., the number of nonzero variables x_{i,j}; by the bound above, k <= n + m. Each job needs at least one assignment, and it needs more than one iff it is fractionally assigned. Therefore the number of fractionally assigned jobs in X is at most k − n <= m. □

Lemma. For any extreme point X, every connected component of the graph G(X) is a tree with at most one cycle.

Proof. Fix an extreme point X. The graph G(X) is bipartite, with m + n vertices and at most m + n edges. Consider two cases.

If G(X) is connected, then it is a connected bipartite graph (one vertex set the m machines, the other the n jobs) on m + n vertices with at most m + n edges: a spanning tree plus at most one extra edge, i.e., a tree with at most one cycle.
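The counting in the first lemma can be sketched on a toy fractional solution. The dictionary x below is made-up illustration data: two machines, three jobs, with job 2 split across both machines.

```python
from collections import defaultdict

# A hypothetical fractional solution x[(i, j)]: jobs 0 and 1 are integrally
# assigned, job 2 is split half-and-half across machines 0 and 1.
x = {(0, 0): 1.0, (0, 2): 0.5, (1, 1): 1.0, (1, 2): 0.5}
m, n = 2, 3

# G(X): an edge (machine i, job j) whenever x[i, j] > 0.
edges = [(i, j) for (i, j), v in x.items() if v > 0]

# A job is fractionally assigned iff it has more than one incident edge.
deg = defaultdict(int)
for _, j in edges:
    deg[j] += 1
fractional_jobs = [j for j, d in deg.items() if d > 1]

assert len(edges) <= m + n                     # support bound for extreme points
assert len(fractional_jobs) <= len(edges) - n  # the counting in the lemma
print(fractional_jobs)   # -> [2]
```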

If G(X) is not connected, we must prove that every connected component is a tree with at most one cycle. We argue by contradiction: suppose there is a connected component C that is not a tree with at most one cycle, and let C' be the rest of G(X). Then C and C' correspond to two separate scheduling instances, and X restricts to feasible solutions X_C and X_{C'} of them. But X_C cannot be an extreme point of its instance: G(X_C) = C is connected and is not a tree with at most one cycle, which would contradict the first case. So X_C can be written as a convex combination of other feasible points, X_C = λy_1 + (1 − λ)y_2 with y_1 ≠ y_2. Extending y_1 and y_2 by X_{C'} on the remaining coordinates gives two distinct feasible points of the original instance whose convex combination is X, contradicting the fact that X is an extreme point. Therefore every component of G(X) is a tree with at most one cycle. □

Rounding. Let X be the optimal extreme point of LP(T), which gives a fractional schedule of makespan T = OPT. Consider the graph G(X) restricted to the machines and the fractionally assigned jobs, where an edge represents a (fractional) assignment of a job to a machine; each component is a tree with at most one cycle, and all leaves are machines (a leaf job would have a single incident edge and hence be integrally assigned). We just need to find a matching that covers all the jobs:

- While there are leaves in the graph, pick a leaf (a machine), add its unique incident edge to the matching, and remove from the graph the matched job together with all its edges.
- When there are no more leaves, each remaining component is either empty or an even cycle (since G(X) is bipartite); on an even cycle, take alternate edges into the matching.

In the end we obtain a matching that covers all the fractionally assigned jobs, so each machine receives at most one extra job.
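The leaf-peeling procedure above can be sketched as follows. This is a hypothetical helper, not code from the lecture; node labels 'M'/'J' distinguish machines from jobs, and the input is assumed to satisfy the structure just proved (every component a tree with at most one cycle, every leaf a machine, every job of degree at least 2).

```python
from collections import defaultdict

def cover_fractional_jobs(edges):
    """Match every fractionally assigned job to one machine.
    edges: iterable of (machine, job) pairs forming the bipartite support
    graph of an extreme point (trees with at most one cycle)."""
    adj = defaultdict(set)
    for i, j in edges:
        adj[('M', i)].add(('J', j))
        adj[('J', j)].add(('M', i))

    def remove(u):
        for v in adj.pop(u, set()):
            adj[v].discard(u)
            if not adj[v] and v[0] == 'M':
                del adj[v]               # drop machines that become isolated

    matching = {}
    while any(u[0] == 'J' for u in adj):
        # Prefer a leaf machine; if none exists the remaining components are
        # even cycles, and matching any job to either neighbor creates leaves.
        leaf = next((u for u in adj if u[0] == 'M' and len(adj[u]) == 1), None)
        if leaf is not None:
            machine = leaf
            (job,) = adj[leaf]
        else:
            job = next(u for u in adj if u[0] == 'J')
            machine = next(iter(adj[job]))
        matching[job[1]] = machine[1]
        remove(job)
        remove(machine)
    return matching

# A path component m0-j0-m1-j1-m2 (its leaves m0, m2 are machines):
print(cover_fractional_jobs([(0, 0), (1, 0), (1, 1), (2, 1)]))
```

Each job ends up matched to a distinct machine, so every machine gains at most one extra job; the cycle case (e.g. edges (0,0), (1,0), (0,1), (1,1)) is handled by the else branch.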