An iteration of the simplex method (a "pivot")


Recap, and outline of Lecture 13

Previously: developed and justified all the steps in a typical iteration (a "pivot") of the Simplex Method.

Today: the Simplex Method.

Initialization: Start with a basis B and a corresponding BFS x.
Iteration: Perform a pivot starting at x. If a termination condition is met during the pivot, stop the algorithm, either with an optimal solution or with a proof that the problem is unbounded. Otherwise, let B̄ and y be as computed at the end of the pivot.
Update: Set B ← B̄ and x ← y, and repeat the iteration.

Remaining questions:
- Will the algorithm always terminate with a solution or a claim of unboundedness?
- How long will it take?
- How do we find a starting BFS? And what if the problem is infeasible?

(IOE 510: Linear Programming I, Fall 2010, Lecture 13.)

An iteration of the simplex method (a "pivot")

1. The iteration starts with a basis of columns A_B(1), ..., A_B(m) and an associated BFS x.
2. Compute the reduced costs c̄_j = c_j − c_B B⁻¹ A_j for all nonbasic indices j. If c̄_j ≥ 0 for all j, terminate: x is an optimal solution. Otherwise, pick some j with c̄_j < 0.
3. Compute u = B⁻¹ A_j. If u ≤ 0, then θ* = ∞; terminate the algorithm and declare the problem unbounded.
4. If at least one component of u is positive, let θ* = min over {i : u_i > 0} of x_B(i) / u_i.
5. Let l be such that θ* = x_B(l) / u_l. Form a new basis by replacing A_B(l) with A_j. If y is the new basic feasible solution, the values of the new basic variables are y_j = θ* and y_B(i) = x_B(i) − θ* u_i for i ≠ l.
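The five steps of the pivot can be sketched in code. This is a minimal numpy illustration, not a reference implementation: `simplex_pivot` is a hypothetical helper name, the LP is assumed to be min c'x s.t. Ax = b, x ≥ 0, and the given basis is assumed nonsingular with a feasible x_B.

```python
import numpy as np

def simplex_pivot(A, b, c, basis):
    """One pivot of the simplex method (a sketch of the five steps).

    A, b, c define min c'x s.t. Ax = b, x >= 0; `basis` lists the indices
    of the current basic columns. Returns ("optimal", x),
    ("unbounded", None), or ("pivoted", new_basis).
    """
    m, n = A.shape
    B = A[:, basis]
    x_B = np.linalg.solve(B, b)                 # values of the basic variables
    # Step 2: reduced costs c_bar_j = c_j - c_B' B^{-1} A_j, via y' = c_B' B^{-1}
    y = np.linalg.solve(B.T, c[basis])
    reduced = c - A.T @ y
    nonbasic = [j for j in range(n) if j not in basis]
    entering = next((j for j in nonbasic if reduced[j] < -1e-9), None)
    if entering is None:                        # all c_bar_j >= 0: optimal
        x = np.zeros(n)
        x[basis] = x_B
        return "optimal", x
    # Step 3: u = B^{-1} A_j; if u <= 0, theta* = infinity and the LP is unbounded
    u = np.linalg.solve(B, A[:, entering])
    if np.all(u <= 1e-9):
        return "unbounded", None
    # Steps 4-5: ratio test theta* = min over {i : u_i > 0} of x_B(i) / u_i
    theta, l = min((x_B[i] / u[i], i) for i in range(m) if u[i] > 1e-9)
    new_basis = basis.copy()
    new_basis[l] = entering                     # A_B(l) leaves, A_j enters
    return "pivoted", new_basis
```

Repeatedly calling this function until it reports "optimal" or "unbounded" is exactly the Initialization / Iteration / Update loop above.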

Results of the Simplex Method

Theorem 3.3, Part I. Assume that the feasible set is nonempty. Then, if the simplex method terminates, there are the following two possibilities at termination:
(a) We have an optimal basis B and an associated basic feasible solution, which is optimal.
(b) We have found a vector d satisfying Ad = 0, d ≥ 0, and c'd < 0, and the optimal cost is −∞.

Proof: If the algorithm stops, do we have a (correct) solution?
- If the algorithm terminates in Step 2 of some pivot, B is an optimal basis, and hence the current BFS is optimal.
- If the algorithm terminates in Step 3 of some pivot, x + θd ∈ P for all θ ≥ 0, and since c'd = c̄_j < 0, the cost of x + θd can be made arbitrarily negative by taking θ sufficiently large.

Convergence of the Simplex Method on non-degenerate problems

Theorem 3.3. Assume that the feasible set is nonempty and that every basic feasible solution is nondegenerate. Then the simplex method terminates after a finite number of iterations. At termination, there are the following two possibilities:
(a) We have an optimal basis B and an associated basic feasible solution, which is optimal.
(b) We have found a vector d satisfying Ad = 0, d ≥ 0, and c'd < 0, and the optimal cost is −∞.

Proof: The remaining question is whether the algorithm stops.
- In each completed pivot, θ* > 0 (since x is non-degenerate); thus c'y = c'x + θ* c'd = c'x + θ* c̄_j < c'x.
- Thus no basic feasible solution is visited twice, and therefore no basis is visited twice.
- Since there is a finite number of bases, the algorithm has to stop after a finite number of iterations.

Degenerate BFSs and the Simplex method

- If, during a pivot, more than one of the original basic variables becomes 0 at the point x + θ* d, we have a tie for which basic variable should exit the basis. Only one of these (zero-valued) variables exits the basis; the others remain in the basis at zero level, hence the new basis is degenerate.
- If the current BFS is degenerate, θ* may be equal to zero (when x_B(i) = 0 and u_i > 0 for some i). In this case we can still perform the change of basis, and Theorem 3.2 is still valid. The new basic variable enters the basis at value 0: we changed the basis, but haven't changed the corresponding BFS, i.e., haven't improved the cost.
- Unlike in Theorem 3.3, the algorithm might cycle, i.e., repeatedly visit the same basis.
- See Example 3.6 for a situation where the simplex method cycles.
- See Example 3.5 for a situation where the simplex method does not cycle, even though it visits a degenerate BFS.

Pivoting rules (choosing the entering and leaving variables)

Entering: any x_j with c̄_j < 0 is eligible to enter.
- Choose the j with the smallest c̄_j.
- Choose the j with the smallest resulting cost change θ* c̄_j. Careful: the value of θ* depends on the choice of j!
- Smallest subscript rule: choose the smallest j with c̄_j < 0.
- ...and many others.

Leaving: any x_B(i) with x_B(i) / u_i = θ* is eligible to leave.
- Smallest subscript rule: of all variables eligible to leave, choose the one with the smallest subscript. This means the smallest value of B(i), not the smallest value of i.
- Lexicographic rule (see Section 3.4).

Bland's rule: choose both the entering and the leaving variables according to the appropriate smallest subscript rule. (Robert Bland, 1977.) The simplex method implemented using Bland's pivoting rule never cycles; hence it is finite. It may stay at the same degenerate BFS for several iterations, but always with a different basis.
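Bland's two smallest-subscript choices are easy to state in code. This is a small illustrative sketch (the function names are my own); note how the leaving rule breaks ties among minimum-ratio rows by the basic variable's subscript B(i), not by the row index i.

```python
def blands_entering(reduced, tol=1e-9):
    """Bland's entering rule: the smallest index j with c_bar_j < 0.

    Returns None if no reduced cost is negative (the BFS is optimal)."""
    for j, rc in enumerate(reduced):
        if rc < -tol:
            return j
    return None

def blands_leaving(x_B, u, basis, tol=1e-9):
    """Bland's leaving rule for the ratio test.

    Among the rows i with u_i > 0 attaining theta* = min x_B(i)/u_i,
    pick the one whose basic variable basis[i] = B(i) has the smallest
    subscript. Returns None if u <= 0 (problem unbounded)."""
    ratios = [(x_B[i] / u[i], i) for i in range(len(u)) if u[i] > tol]
    if not ratios:
        return None
    theta = min(r for r, _ in ratios)
    eligible = [i for r, i in ratios if abs(r - theta) <= tol]
    return min(eligible, key=lambda i: basis[i])
```

In a degenerate tie (several ratios equal to θ*), the choice of B(i) rather than i is what makes the anti-cycling argument work.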

Number of iterations needed by the simplex method

- Computational effort (running time) of an algorithm: (work per iteration) × (number of iterations).
- In practice, the simplex method is extremely fast. Conventional wisdom suggests that the number of iterations is usually proportional to the number of constraints, unless one is solving special "bad" LPs.
- The number of iterations in the worst case is exponential (1972). Theorem 3.5 considers an LP over a perturbed cube in R^n and the simplex method initialized at a BFS adjacent to the optimal one. For a particular pivoting rule, the simplex method requires 2^n − 1 pivots on this example. But under a different pivoting rule, this example needs only one pivot!
- Is there an exponential example for every pivoting rule?
- Hirsch Conjecture: Δ(n, m) ≤ m − n, where Δ(n, m) is the maximal diameter of all bounded polyhedra in R^n that can be represented by m inequalities. The known result: Δ(n, m) ≤ (2n)^{log₂ m}.

Finding an initial BFS / proving infeasibility

Consider the problem

(LP)  min c'x  s.t. Ax = b, x ≥ 0.

WLOG, b ≥ 0 (multiply constraints with b_i < 0 by −1).

- We are not given a starting BFS. We don't even know if the problem is feasible.
- Introduce artificial variables y ∈ R^m and consider the auxiliary problem

(AUX)  min y_1 + ... + y_m  s.t. Ax + Iy = b, x ≥ 0, y ≥ 0.

- There is an obvious BFS for the auxiliary problem (x = 0, y = b), so it can be solved with the simplex method.
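Constructing (AUX) from (LP) is purely mechanical, as the following sketch shows (the helper name `make_auxiliary` is my own). Rows with b_i < 0 are flipped first so that y = b is a feasible starting point.

```python
import numpy as np

def make_auxiliary(A, b):
    """Build the auxiliary problem min 1'y s.t. [A I][x; y] = b, x, y >= 0.

    Constraints with b_i < 0 are multiplied by -1 so that b >= 0, making
    x = 0, y = b an obvious BFS. Returns (A_aux, b, c_aux, basis), where
    `basis` indexes the artificial columns (the obvious starting basis).
    """
    A = A.astype(float)            # astype copies, so the inputs are untouched
    b = b.astype(float)
    neg = b < 0
    A[neg] *= -1                   # flip rows with negative right-hand side
    b[neg] *= -1
    m, n = A.shape
    A_aux = np.hstack([A, np.eye(m)])            # [A | I]
    c_aux = np.concatenate([np.zeros(n), np.ones(m)])  # cost = y_1 + ... + y_m
    basis = list(range(n, n + m))                # start from x = 0, y = b
    return A_aux, b, c_aux, basis
```

The returned basis is exactly the "obvious BFS" mentioned above: the artificial columns form an identity matrix, so the basic solution is y = b ≥ 0.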

Solving the auxiliary problem

(AUX)  min y_1 + ... + y_m  s.t. Ax + Iy = b, x ≥ 0, y ≥ 0.

Possible outcomes of applying the simplex method to (AUX):
- (AUX) cannot be unbounded (its objective is bounded below by 0).
- If the optimal cost of (AUX) is > 0, the original LP is infeasible.
- If the optimal cost of (AUX) is 0, the original LP is feasible:
  - If, after termination, all artificial variables are non-basic, we have found a BFS of the original LP.
  - If, after termination, some artificial variables are basic (at zero level), we need to drive them out of the basis to obtain a BFS of the original LP.

Driving an artificial variable out of the basis

- Suppose solving (AUX) leads to a basis B consisting of k < m columns of the original matrix A, namely A_B(i_1), ..., A_B(i_k), and m − k columns corresponding to artificial variables (at zero level).
- If A has rank m, it is possible to select m − k additional columns of A to form a basis corresponding to a (degenerate) BFS of (LP). How? (Recall: we need linear independence of the basic columns!)
- Suppose the l-th basic variable is an artificial variable. Among the vectors B⁻¹ A_j, j = 1, ..., n, find one that has a nonzero l-th component. If one is found, have the corresponding (original) variable replace the artificial one in the basis.
- Claim: the new column and the columns of A already in the basis are linearly independent.
- Update B; now there is one fewer artificial basic variable.
- What if (B⁻¹ A_j)_l = 0 for all j = 1, ..., n? Then the l-th constraint was redundant, i.e., rank(A) < m. Remove the l-th constraint from (LP) and (AUX), and set m ← m − 1.
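The search for a replacement column can be sketched directly (the helper name is my own; `A` is the full [A | I] matrix of (AUX), `original_cols` the indices of the original, non-artificial columns to scan):

```python
import numpy as np

def drive_out_artificial(A, basis_cols, l, original_cols, tol=1e-9):
    """Try to replace the artificial variable in basic position l.

    Scans the vectors B^{-1} A_j over candidate original columns j and
    returns the first j whose l-th component is nonzero; swapping that
    column in keeps the basis nonsingular. Returns None if all l-th
    components vanish, i.e., constraint l is redundant and rank(A) < m.
    """
    B = A[:, basis_cols]
    for j in original_cols:
        u = np.linalg.solve(B, A[:, j])   # u = B^{-1} A_j
        if abs(u[l]) > tol:
            return j                      # A_j can replace the artificial column
    return None                           # row l is redundant: drop it, m <- m - 1
```
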

Two-phase simplex method: Phase I

Given an LP in standard form:
1. Ensure that b ≥ 0 (multiply some constraints by −1).
2. Introduce artificial variables y ∈ R^m and apply the simplex method to the auxiliary problem of minimizing Σ_{i=1}^m y_i.
3. If the optimal cost of the auxiliary problem is > 0, the original problem is infeasible; terminate.
4. If the optimal cost of the auxiliary problem is 0, the original problem is feasible. If no artificial variables are in the final basis, a BFS for the original problem is available.
5. If the l-th basic variable is an artificial one, examine the l-th entry of the vectors B⁻¹ A_j, j = 1, ..., n. If all these entries are 0, the l-th row represents a redundant constraint and is eliminated. If the l-th entry of the j-th vector is nonzero, change the basis, with A_j becoming the l-th basic column. Repeat until no artificial variables remain in the basis.

Two-phase simplex method: Phase II

1. Let the final basis obtained from Phase I be the initial basis for Phase II. By construction, the corresponding basic solution is a BFS.
2. Compute the reduced costs of all variables for this initial basis, using the cost coefficients of the original problem.
3. Apply the simplex method to the original problem.

Now we have a complete method, assuming that we use a pivoting rule that avoids cycling. In particular:
- If the problem is infeasible, this is detected in Phase I.
- If the problem is feasible but rank(A) < m, this is detected and corrected at the end of Phase I, by identifying and eliminating redundant equality constraints.
- If the optimal cost is equal to −∞, this is detected while running Phase II.
- Otherwise, Phase II terminates with an optimal solution.
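The Phase-I feasibility certificate can be checked numerically even without pivoting. The toy sketch below (my own construction, only sensible for tiny examples) enumerates all bases of [A | I] and returns the optimal auxiliary cost: cost 0 certifies feasibility of {Ax = b, x ≥ 0}, cost > 0 certifies infeasibility, exactly as in Step 3 vs. Step 4 above.

```python
import itertools
import numpy as np

def phase_one_bruteforce(A, b, tol=1e-9):
    """Optimal cost of the auxiliary problem, by brute-force enumeration.

    A stand-in for Phase I: instead of running simplex, enumerate every
    basis of [A | I], keep those yielding a BFS, and take the smallest
    sum of artificial-variable values. 0 <=> the original LP is feasible.
    """
    A = A.astype(float)
    b = b.astype(float)
    neg = b < 0
    A[neg] *= -1                       # ensure b >= 0, as in Step 1
    b[neg] *= -1
    m, n = A.shape
    A_aux = np.hstack([A, np.eye(m)])  # [A | I]
    best = np.inf
    for cols in itertools.combinations(range(n + m), m):
        B = A_aux[:, cols]
        if abs(np.linalg.det(B)) < tol:
            continue                   # singular: not a basis
        x_B = np.linalg.solve(B, b)
        if np.any(x_B < -tol):
            continue                   # basic solution not feasible
        # auxiliary cost = sum of basic variables that are artificial (index >= n)
        best = min(best, sum(x_B[i] for i, j in enumerate(cols) if j >= n))
    return best
```

A real Phase I reaches the same optimum by pivoting; the brute force only makes the certificate concrete.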