Geometry of LPs

Consider the following LP: $\min\{c^T x : a_i^T x \ge b_i,\ i = 1,\dots,m\}$.

The feasible region is
$X := \{x \in \mathbb{R}^n : a_i^T x \ge b_i,\ i = 1,\dots,m\} = \bigcap_{i=1}^m \{x \in \mathbb{R}^n : a_i^T x \ge b_i\} = \bigcap_{i=1}^m X_i$.

The set $X_i$ is a Half-space. The set $H_i = \{x \in \mathbb{R}^n : a_i^T x = b_i\}$ is a Hyperplane. The feasible region $X$ is given by the intersection of $m$ half-spaces and is known as a Polyhedron. A Polyhedron is a closed convex set (verify?). If it is bounded, it is called a Polytope.

LP: Minimize a linear function over a polyhedral set. Move in the direction $-c$ as far as possible while staying within the feasible region $X$. Verify that any LP can be written in this form.
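The half-space description of $X$ is easy to check numerically. A minimal sketch (the constraint data below is a hypothetical instance, the region $x_1 + x_2 \ge 1$, $x \ge 0$, chosen for illustration):

```python
import numpy as np

# X is an intersection of half-spaces {x : a_i^T x >= b_i};
# the rows of A are the a_i^T and the entries of b are the b_i.
A = np.array([[1.0, 1.0],   # x1 + x2 >= 1
              [1.0, 0.0],   # x1 >= 0
              [0.0, 1.0]])  # x2 >= 0
b = np.array([1.0, 0.0, 0.0])

def in_polyhedron(x, A, b, tol=1e-9):
    """x lies in X iff every half-space constraint a_i^T x >= b_i holds."""
    return bool(np.all(A @ x >= b - tol))

print(in_polyhedron(np.array([0.5, 0.5]), A, b))  # True: on the hyperplane x1 + x2 = 1
print(in_polyhedron(np.array([0.2, 0.2]), A, b))  # False: violates x1 + x2 >= 1
```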

Example

$\min\ c_1 x_1 + c_2 x_2$ s.t. $x_1 + x_2 \ge 1$, $x_1 \ge 0$, $x_2 \ge 0$.

[Figure: four panels, each showing the feasible region bounded below by the line $x_1 + x_2 = 1$ and the coordinate axes, with a different cost vector in each panel.]

$c = (1, 1)^T$: every point on the segment $\{x : x_1 + x_2 = 1,\ x \ge 0\}$ is optimal.

$c = (1, 0)^T$: $x^* = (0, x_2)^T$ for every $x_2 \ge 1$.

$c = (0, 1)^T$: $x^* = (x_1, 0)^T$ for every $x_1 \ge 1$.

$c = (-1, -1)^T$: Unbounded.

Extreme Point Optimality

An optimal solution (if it exists) always lies on the boundary of the feasible region $X$.

If there is an optimal solution, then there is one at a corner (also called a vertex or an extreme point) of the polyhedron $X$.

A point $x \in X$ is an extreme point of $X$ if it cannot be expressed as a convex combination of other points $x_1, \dots, x_m \in X$.

A polyhedron has a finite number of extreme points.

Representation theorem: When $X$ is a polytope, any point $x \in X$ can be expressed as a convex combination of the extreme points of $X$, i.e. $x = \sum_{i=1}^I \lambda_i x_i$, where $x_i \in \mathrm{vert}(X)$ for all $i = 1,\dots,I$, $\sum_{i=1}^I \lambda_i = 1$, and $\lambda_i \ge 0$.
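The representation theorem can be verified numerically on a small polytope. A sketch on a hypothetical example (the unit square, whose extreme points are its four corners; the bilinear weights used here are one valid choice of the multipliers $\lambda_i$ for a point $(u, v)$, not part of the notes):

```python
import numpy as np

# Extreme points of the unit square, listed counter-clockwise.
verts = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])

def square_weights(u, v):
    # Convex multipliers: nonnegative and summing to 1 for 0 <= u, v <= 1.
    return np.array([(1 - u) * (1 - v), u * (1 - v), u * v, (1 - u) * v])

p = np.array([0.25, 0.5])
lam = square_weights(*p)
print(lam.sum())                     # 1.0: the multipliers sum to one
print(np.allclose(lam @ verts, p))   # True: the convex combination recovers p
```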

Proof of Extreme Point Optimality

Consider the LP $\min\{c^T x : x \in X\}$, where $X$ is a polytope. Let $x^*$ be an optimal solution, and suppose that no extreme point of $X$ is optimal. Then $c^T x^* < c^T x_i$ for all $x_i \in \mathrm{vert}(X)$. By the representation theorem, $x^* = \sum_{i=1}^I \lambda_i x_i$, so

$c^T x^* = \sum_{i=1}^I \lambda_i c^T x_i > \sum_{i=1}^I \lambda_i c^T x^* = c^T x^*$,

and we have a contradiction.

Unbounded Polyhedra

A feasible direction of an unbounded polyhedron $X \subseteq \mathbb{R}^n$ is a non-zero vector $d \in \mathbb{R}^n$ such that if $x_0 \in X$, then $(x_0 + \lambda d) \in X$ for all $\lambda \ge 0$.

An extreme direction of an unbounded polyhedron $X \subseteq \mathbb{R}^n$ is a direction $d \in \mathbb{R}^n$ that cannot be expressed as a convex combination of other directions of $X$. A polyhedron has a finite number of extreme directions.

Representation theorem for general polyhedra: Any point $x$ in a non-empty polyhedron can be expressed as a convex combination of the extreme points of $X$ plus a nonnegative combination of the extreme directions of $X$, i.e. $x = \sum_{i=1}^I \lambda_i x_i + \sum_{j=1}^J \mu_j d_j$, where $x_i \in \mathrm{vert}(X)$ for all $i = 1,\dots,I$, $\sum_{i=1}^I \lambda_i = 1$, $\lambda_i \ge 0$, the $d_j$ are the extreme directions of $X$ for all $j = 1,\dots,J$, and $\mu_j \ge 0$.

Extreme point optimality for general polyhedra can be proven using the above result.

Characterizing Extreme Points

Consider an LP in standard form: $\min\{c^T x : Ax = b,\ x \ge 0\}$, where $c \in \mathbb{R}^n$, $A \in \mathbb{R}^{m \times n}$, and $b \in \mathbb{R}^m$. Assume that $\mathrm{rank}(A) = m \le n$.

Let $A_B = [a_{i_1}, \dots, a_{i_m}]$ be a matrix formed by $m$ linearly independent columns of $A$ (called a Basis). Let $A_N$ be the matrix of the remaining columns, i.e. $A = [A_B\ A_N]$.

Partition the vector $x = \begin{bmatrix} x_B \\ x_N \end{bmatrix}$, where $x_B = (x_{i_1}, \dots, x_{i_m})^T$ (Basic variables) and $x_N$ is the vector of the remaining components of $x$ (Non-basic variables).

Set $x_B = A_B^{-1} b$ and $x_N = 0$. Note that $x = \begin{bmatrix} A_B^{-1} b \\ 0 \end{bmatrix}$ is a solution to $Ax = b$, known as a Basic solution.

How many basic solutions? Ans: At most $\binom{n}{m}$.

If $x_B \ge 0$, i.e. $x \ge 0$, then the solution is a Basic Feasible Solution (BFS).

Example

Consider the following polyhedral set: $X := \{x \in \mathbb{R}^2 : x_1 + x_2 \le 6,\ x_2 \le 3,\ x_1 \ge 0,\ x_2 \ge 0\}$.

In standard form:
$x_1 + x_2 + x_3 = 6$
$x_2 + x_4 = 3$
$x_1, x_2, x_3, x_4 \ge 0$

Note $A = [a_1, a_2, a_3, a_4] = \begin{bmatrix} 1 & 1 & 1 & 0 \\ 0 & 1 & 0 & 1 \end{bmatrix}$.

Choose $A_B = [a_1, a_2]$. Then $x_B = (x_1, x_2)^T = (3, 3)^T$. The solution $x = (3, 3, 0, 0)^T$ is a basic solution (also a BFS).

Choose $A_B = [a_2, a_4]$. Then $x_B = (x_2, x_4)^T = (6, -3)^T$. The solution $x = (0, 6, 0, -3)^T$ is a basic solution (not a BFS).

Note that there are $\binom{4}{2} - 1 = 5$ basic solutions (since $a_1$ and $a_3$ are not linearly independent).

Example (contd.)

$A_B$ | $x$ | BFS?
$[a_1, a_2]$ | $(3, 3, 0, 0)^T$ | Yes
$[a_1, a_4]$ | $(6, 0, 0, 3)^T$ | Yes
$[a_2, a_3]$ | $(0, 3, 3, 0)^T$ | Yes
$[a_2, a_4]$ | $(0, 6, 0, -3)^T$ | No
$[a_3, a_4]$ | $(0, 0, 6, 3)^T$ | Yes

[Figure: the feasible region with extreme points $(0,0)$, $(6,0)$, $(3,3)$, $(0,3)$, bounded by the lines $x_1 + x_2 = 6$ and $x_2 = 3$; the basic solution with $x_2 = 6$ lies outside the region.]

Each BFS corresponds to an extreme point of the feasible region.

Algorithm? Check all BFS (extreme points) and choose the best. Exponential, and hence impractical for large problems.
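The table above can be reproduced by brute-force enumeration of basis choices, exactly as described: pick $m$ linearly independent columns, solve $A_B x_B = b$, and test $x_B \ge 0$. A minimal sketch using the example's data:

```python
import itertools
import numpy as np

# Standard form of X = {x1 + x2 <= 6, x2 <= 3, x >= 0} with slacks x3, x4.
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0]])
b = np.array([6.0, 3.0])
m, n = A.shape

basic, feasible = [], []
for cols in itertools.combinations(range(n), m):
    A_B = A[:, cols]
    if abs(np.linalg.det(A_B)) < 1e-9:       # columns not linearly independent
        continue
    x = np.zeros(n)
    x[list(cols)] = np.linalg.solve(A_B, b)  # x_B = A_B^{-1} b, x_N = 0
    basic.append(x)
    if np.all(x >= -1e-9):                   # BFS iff x_B >= 0
        feasible.append(x)

print(len(basic), len(feasible))  # 5 basic solutions, 4 of them feasible
```

Enumerating all $\binom{n}{m}$ bases like this is exactly the exponential approach the notes warn against; it is only practical for tiny instances.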

Optimization Strategy

Idea: Move from one extreme point to a better (adjacent) extreme point.

Need to characterize adjacent extreme points.
Need a strategy to move to a better adjacent extreme point.
Need a stopping condition.
Need to detect unboundedness and infeasibility.

Adjacent BFS

Two extreme points are adjacent if they share an edge of the polyhedron. Algebraically, two BFS $x^1$ and $x^2$ are adjacent if exactly one of the basic variables in BFS $x^1$ is non-basic in BFS $x^2$, and vice versa.

Moving from a BFS to an adjacent BFS, one of the non-basic variables takes on a positive value (enters the basis), while one of the basic variables drops to zero (leaves the basis). The values of the other basic variables change too.

[Figure: the adjacency graph on the BFS $(0, 3, 3, 0)$, $(3, 3, 0, 0)$, $(0, 0, 6, 3)$, and $(6, 0, 0, 3)$ of the example.]
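The algebraic adjacency test reduces to comparing sets of basic indices: two BFS are adjacent exactly when their basic index sets differ in one element. A sketch with plain Python sets (the dictionary below is an illustrative encoding of the example's BFS, with indices 0..3 standing for $x_1$..$x_4$):

```python
# Each BFS of the example mapped to the indices of its basic variables.
bases = {
    (3, 3, 0, 0): {0, 1},   # basic: x1, x2
    (6, 0, 0, 3): {0, 3},   # basic: x1, x4
    (0, 3, 3, 0): {1, 2},   # basic: x2, x3
    (0, 0, 6, 3): {2, 3},   # basic: x3, x4
}

def adjacent(B1, B2):
    # One variable enters and one leaves: symmetric difference of size 2.
    return len(B1 ^ B2) == 2

print(adjacent(bases[(3, 3, 0, 0)], bases[(0, 3, 3, 0)]))  # True: they share x2
print(adjacent(bases[(3, 3, 0, 0)], bases[(0, 0, 6, 3)]))  # False: opposite corners
```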

Improving Direction

Which non-basic variable should enter the basis? Ans: One that improves the objective function value $z$.

Let $N := \{j : x_j \text{ is nonbasic, i.e. } a_j \text{ is a column of } A_N\}$. Also partition $c = \begin{bmatrix} c_B \\ c_N \end{bmatrix}$.

Now, $A_B x_B + A_N x_N = b \Rightarrow x_B = A_B^{-1} b - A_B^{-1} A_N x_N$. The objective function value is

$z = c_B^T x_B + c_N^T x_N = c_B^T A_B^{-1} b + (c_N^T - c_B^T A_B^{-1} A_N) x_N = c_B^T A_B^{-1} b + \sum_{j \in N} \underbrace{(c_j - c_B^T A_B^{-1} a_j)}_{r_j} x_j = c_B^T A_B^{-1} b + \sum_{j \in N} r_j x_j$.

Note that for the current BFS the objective value is $z = c_B^T x_B = c_B^T A_B^{-1} b$. If a nonbasic variable $x_j$ becomes positive, the objective value changes by $r_j$ for every unit increase in $x_j$. Therefore choose $j$ such that $r_j < 0$ to reduce the objective function. The quantity $r_j$ is known as the reduced cost of the nonbasic variable $x_j$. If $r_j \ge 0$ for all $j \in N$, then the current solution is optimal.
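The reduced-cost formula $r_j = c_j - c_B^T A_B^{-1} a_j$ is direct to evaluate. A sketch on the example's standard form; the cost vector $c = (-1, -1, 0, 0)$ (minimize $-x_1 - x_2$) is a hypothetical choice, and the starting basis is the slack basis $\{x_3, x_4\}$, i.e. the BFS $(0, 0, 6, 3)$:

```python
import numpy as np

A = np.array([[1.0, 1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0]])
c = np.array([-1.0, -1.0, 0.0, 0.0])
B, N = [2, 3], [0, 1]                      # basic / nonbasic column indices

y = c[B] @ np.linalg.inv(A[:, B])          # simplex multipliers c_B^T A_B^{-1}
r = {j: c[j] - y @ A[:, j] for j in N}     # reduced cost of each nonbasic var
print(r)  # both reduced costs are -1: increasing x1 or x2 lowers the objective
```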

Step Length

How much should the value of the (chosen) nonbasic variable be increased? Ans: As long as all of the basic variables remain $\ge 0$.

Suppose we chose the nonbasic variable $x_j$ to increase (enter the basis). Then $x_B = A_B^{-1} b - A_B^{-1} a_j x_j$.

Let $B = \{i : x_i \text{ is basic}\}$, and let $y_{ij} = (A_B^{-1} a_j)_i$, where $i \in B$ and $j \in N$. Then $x_i = (A_B^{-1} b)_i - y_{ij} x_j$ for all $i \in B$.

Suppose $y_{ij} > 0$; then increasing $x_j$ would reduce $x_i$. For feasibility we must have $x_i \ge 0$ for all $i \in B$, i.e. if $y_{ij} > 0$, then $(A_B^{-1} b)_i - y_{ij} x_j \ge 0$, i.e. $x_j \le (A_B^{-1} b)_i / y_{ij}$ for all such $i \in B$.

Thus we can increase $x_j$ to: $x_j = \min_{\{i \in B\,:\, y_{ij} > 0\}} (A_B^{-1} b)_i / y_{ij}$.
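The minimum-ratio rule above is a one-liner in code. Continuing the same hypothetical setup (slack basis $\{x_3, x_4\}$, so $A_B = I$ and $x_B = b = (6, 3)$, with entering column $a_2 = (1, 1)^T$):

```python
import numpy as np

A_B_inv = np.eye(2)            # basis matrix is the identity here
b = np.array([6.0, 3.0])
a_j = np.array([1.0, 1.0])     # column of the entering variable x2

x_B = A_B_inv @ b              # current basic values (A_B^{-1} b)_i
y_j = A_B_inv @ a_j            # y_ij = (A_B^{-1} a_j)_i

# x_j may grow until the first basic variable with y_ij > 0 hits zero.
step, row = min((x_B[i] / y_j[i], i) for i in range(len(b)) if y_j[i] > 0)
print(step, row)  # 3.0 1: x_j can rise to 3, and the basic variable in row 1 leaves
```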

Step Length (contd.)

Suppose the minimum in the above expression is achieved for $\hat{i} \in B$. Then we increase $x_j$ to $(A_B^{-1} b)_{\hat{i}} / y_{\hat{i}j}$, and consequently the basic variable $x_{\hat{i}}$ drops to zero (leaves the basis).

Suppose $y_{ij} \le 0$ for all $i \in B$; then increasing $x_j$ would not cause any of the basic variables to decrease in value. We can then increase $x_j$ as much as we want without losing feasibility, and keep on improving the objective. In this case the problem is unbounded.

What if $\min_{\{i \in B\,:\, y_{ij} > 0\}} (A_B^{-1} b)_i / y_{ij} = 0$? This implies that one of the basic variables is at value zero. Such a BFS is known as a degenerate BFS and may cause problems in the algorithm. However, there are ways of recovering from degeneracy problems.

Finding an Initial BFS

Consider an LP $\min\{c^T x : Ax = b,\ x \ge 0\}$. If $A$ has an $m \times m$ identity submatrix $I$, then use $I$ as the initial basis.

Otherwise, suitably multiply the constraints by $\pm 1$ so that $b \ge 0$. Introduce auxiliary variables $y = (y_1, \dots, y_m)^T$ and construct the following LP:

$\min\ e^T y$ s.t. $Ax + Iy = b$, $x, y \ge 0$.

Use $I$ as the starting basis, and solve the above problem to optimality. If the optimal objective value is zero ($y = 0$), then we have a basis consisting of only the columns of $A$, which we can use as an initial BFS for the original problem. If the optimal objective value is $> 0$, then the original problem is infeasible.

The Simplex Method - G. B. Dantzig (1947)

1. Find an initial feasible extreme point solution (BFS). If none exists, STOP: the LP is infeasible.
2. Find a direction (along an edge) in which the objective function improves (i.e. find $j \in N$ such that $r_j < 0$). If none exists, STOP: the current solution is optimal.
3. Move along the direction until you reach an adjacent feasible extreme point (BFS). If you can move along the edge of the feasible region without reaching an adjacent feasible extreme point (i.e. $y_{ij} \le 0$), STOP: the LP is unbounded.
4. Go to step 2 with the current feasible extreme point.
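The four steps can be sketched as a compact revised-simplex routine. This is a minimal illustration only (smallest-index entering rule, a starting BFS must be supplied so there is no Phase I, and no anti-cycling safeguards), not a production implementation:

```python
import numpy as np

def simplex(c, A, b, B):
    """Minimize c^T x s.t. Ax = b, x >= 0, starting from the BFS whose basic
    column indices are B.  Returns (x, value), or (None, -inf) if unbounded."""
    A, b, c = np.asarray(A, float), np.asarray(b, float), np.asarray(c, float)
    m, n = A.shape
    B = list(B)
    while True:
        A_B_inv = np.linalg.inv(A[:, B])
        x_B = A_B_inv @ b                       # current basic values
        y = c[B] @ A_B_inv                      # simplex multipliers c_B^T A_B^{-1}
        N = [j for j in range(n) if j not in B]
        r = {j: c[j] - y @ A[:, j] for j in N}  # reduced costs
        enter = next((j for j in N if r[j] < -1e-9), None)
        if enter is None:                       # step 2: all r_j >= 0, optimal
            x = np.zeros(n)
            x[B] = x_B
            return x, float(c @ x)
        y_j = A_B_inv @ A[:, enter]             # step 3: move along the edge
        if np.all(y_j <= 1e-9):                 # no ratio bound: unbounded
            return None, -np.inf
        ratio, row = min((x_B[i] / y_j[i], i) for i in range(m) if y_j[i] > 1e-9)
        B[row] = enter                          # pivot: entering replaces leaving

# Running example from the notes: minimize -x1 - x2 over the standard-form
# polyhedron, starting from the slack basis {x3, x4}.
x, z = simplex([-1, -1, 0, 0],
               [[1, 1, 1, 0], [0, 1, 0, 1]],
               [6, 3], B=[2, 3])
print(x, z)   # reaches the BFS (6, 0, 0, 3) with objective value -6
```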

Remarks

In the absence of degeneracy, the Simplex method terminates finitely with an optimal solution to the LP. In the worst case, it might require $\binom{n}{m}$ steps to terminate. In most practical problems the performance is much better. The performance is sensitive to the number of rows.

Degeneracy might cause the method to cycle. However, this can be avoided by anti-cycling rules for choosing the variables entering and leaving the basis.

The optimal solution produced by the Simplex algorithm is a BFS; therefore, at most $m$ of the $n$ variables are positive.