Advanced Operations Research Techniques IE316. Quiz 2 Review. Dr. Ted Ralphs

IE316 Quiz 2 Review 1: Reading for the Quiz

- Material covered in detail in lecture: Bertsimas 4.1-4.5, 4.8, 5.1-5.5, 6.1-6.3
- Material covered briefly in lecture: Bertsimas 4.6, 4.9

IE316 Quiz 2 Review 2: Deriving the Dual Problem

Consider a standard form LP $\min\{c^T x : Ax = b,\ x \geq 0\}$. To derive the dual problem, we use Lagrangian relaxation and consider the function
$$g(p) = \min_{x \geq 0}\left[c^T x + p^T(b - Ax)\right],$$
in which infeasibility is penalized by a vector of dual prices $p$. For every vector $p$, $g(p)$ is a lower bound on the optimal value of the original LP. To achieve the best bound, we consider maximizing $g(p)$, which is equivalent to
$$\max\ p^T b \quad \text{s.t.}\quad p^T A \leq c^T.$$
This LP is the dual of the original one.
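As a quick numerical sanity check (a sketch of my own, not part of the original slides), a small standard form primal and its dual can both be solved with scipy.optimize.linprog; the data below are made up for illustration, and the two optimal values coincide as strong duality predicts.

```python
import numpy as np
from scipy.optimize import linprog

# Standard form primal: min c^T x  s.t.  Ax = b, x >= 0  (illustrative data).
c = np.array([-3.0, -5.0, 0.0, 0.0])
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 2.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])

primal = linprog(c, A_eq=A, b_eq=b, bounds=[(0, None)] * 4, method="highs")

# Dual: max p^T b  s.t.  p^T A <= c^T, p free.  linprog minimizes, so we
# minimize -b^T p subject to A^T p <= c with unrestricted p.
dual = linprog(-b, A_ub=A.T, b_ub=c, bounds=[(None, None)] * 2, method="highs")

print("primal optimal value:", primal.fun)   # -16.0
print("dual optimal value:  ", -dual.fun)    # -16.0, as strong duality predicts
```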

IE316 Quiz 2 Review 3: From the Primal to the Dual

We can dualize general LPs as follows:

    PRIMAL (minimize)                 DUAL (maximize)
    constraints  >= b_i               variables    >= 0
                 <= b_i                            <= 0
                  = b_i                            free
    variables    >= 0                 constraints  <= c_j
                 <= 0                              >= c_j
                 free                               = c_j
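As a concrete illustration of these rules (an example constructed here, not taken from the slides): a >= constraint gets a nonnegative dual variable, an equality constraint a free one, a nonnegative primal variable a <= dual constraint, and a free primal variable an equality dual constraint.

```latex
\[
\begin{array}{llll}
\text{Primal:} & \min\ 2x_1 + 3x_2 & \text{Dual:} & \max\ 5p_1 + 4p_2 \\
& \text{s.t. } x_1 + x_2 \ge 5 & & \text{s.t. } p_1 + 2p_2 \le 2 \\
& \phantom{\text{s.t. }} 2x_1 - x_2 = 4 & & \phantom{\text{s.t. }} p_1 - p_2 = 3 \\
& \phantom{\text{s.t. }} x_1 \ge 0,\ x_2 \text{ free} & & \phantom{\text{s.t. }} p_1 \ge 0,\ p_2 \text{ free}
\end{array}
\]
```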

IE316 Quiz 2 Review 4: Relationship of the Primal and the Dual

The following are the possible relationships between the primal and the dual:

                      Finite Optimum   Unbounded    Infeasible
    Finite Optimum    Possible         Impossible   Impossible
    Unbounded         Impossible       Impossible   Possible
    Infeasible        Impossible       Possible     Possible

IE316 Quiz 2 Review 5: Strong Duality and Complementary Slackness

Theorem 1 (Strong Duality). If a linear programming problem has an optimal solution, so does its dual, and the respective optimal costs are equal.

Theorem 2. If $x$ and $p$ are feasible primal and dual solutions, then $x$ and $p$ are optimal if and only if
$$p^T(Ax - b) = 0, \qquad (c^T - p^T A)x = 0.$$

From complementary slackness, we can derive a number of alternative optimality conditions. The simplex algorithm always maintains complementary slackness.
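The two conditions are easy to check numerically. The sketch below (with illustrative data and candidate solutions of my own, not from the slides) verifies complementary slackness for a feasible primal-dual pair:

```python
import numpy as np

# Check complementary slackness for candidate solutions of
# min{c^T x : Ax = b, x >= 0} and its dual max{p^T b : p^T A <= c^T}.
# Data and candidate solutions are illustrative assumptions.
c = np.array([-3.0, -5.0, 0.0, 0.0])
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 2.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])

x = np.array([2.0, 2.0, 0.0, 0.0])   # candidate primal solution (feasible)
p = np.array([-1.0, -2.0])           # candidate dual solution (feasible)

cond1 = p @ (A @ x - b)              # p^T (Ax - b)
cond2 = (c - p @ A) @ x              # (c^T - p^T A) x

print("p^T(Ax - b)     =", cond1)
print("(c^T - p^T A) x =", cond2)
print("optimal pair?", np.isclose(cond1, 0) and np.isclose(cond2, 0))
```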

IE316 Quiz 2 Review 6: LPs with General Upper and Lower Bounds

In many problems, the variables have explicit nonzero upper or lower bounds. These upper and lower bounds can be dealt with implicitly instead of being included as constraints. In this more general framework, all nonbasic variables are fixed at either their upper or lower bounds.

For minimization, variables eligible to enter the basis are either
- variables at their lower bounds with negative reduced costs, or
- variables at their upper bounds with positive reduced costs.

When no such variables exist, we are at optimality. For maximization, we can just reverse the signs.
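A minimal sketch of this entering-variable test, with made-up reduced costs and bound statuses (not Ralphs' implementation):

```python
import numpy as np

# Entering-variable test for the bounded-variable simplex (minimization).
# Nonbasic variables sit at a bound; eligibility depends on which bound and
# on the sign of the reduced cost. Data below are illustrative only.
reduced_cost = np.array([ 1.5, -2.0,  0.5, -0.3])
at_lower     = np.array([True, True, False, False])   # False -> at upper bound

eligible = (at_lower & (reduced_cost < 0)) | (~at_lower & (reduced_cost > 0))

if eligible.any():
    print("nonbasic variables eligible to enter:", np.flatnonzero(eligible))
else:
    print("no eligible variables: the current basis is optimal")
```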

IE316 Quiz 2 Review 7: Economic Interpretation

The dual variables tell us the marginal change in the objective function per unit change in the right-hand side of a constraint. Hence, we can interpret the dual variables as prices associated with the resources defined by each constraint.

Using the shadow prices, we can determine how much each product costs in terms of its constituent resources. The reduced cost of a product is the difference between its selling price and the (implicit) cost of the constituent resources.

If we discover a product whose cost is less than its selling price, we try to manufacture more of that product to increase profit. With the new product mix, the demand for various resources is changed and their prices are adjusted. We continue until there is no product with cost less than its selling price. This is the same as having the reduced costs nonnegative. Complementary slackness says that we should only manufacture products for which cost and selling price are equal.

IE316 Quiz 2 Review 8: The Dual Simplex Method

This leads to a dual version of the simplex method in tableau form. Recall the simplex tableau
$$\begin{array}{c|c}
-c_B^T x_B & \bar{c}_1 \;\cdots\; \bar{c}_n \\
\hline
x_{B(1)} & \\
\vdots & B^{-1}A_1 \;\cdots\; B^{-1}A_n \\
x_{B(m)} &
\end{array}$$

In the dual simplex method, the basic variables are allowed to take on negative values, but we keep the reduced costs nonnegative. The pivot row is any row with a negative right-hand side. We choose the pivot column using a ratio test that ensures the reduced costs remain nonnegative (dual feasibility).
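A sketch of the pivot selection just described, on a small made-up tableau (the most negative right-hand side is used as the pivot row, one common choice):

```python
import numpy as np

# One pivot-selection step of the dual simplex method on a tableau.
# rhs is B^{-1} b, T holds the tableau rows B^{-1} A, and rc are the
# reduced costs (nonnegative, i.e. dual feasible). Illustrative data only.
rhs = np.array([ 2.0, -3.0])
T   = np.array([[ 1.0,  0.5,  1.0,  0.0],
                [-2.0, -1.0,  0.0,  1.0]])
rc  = np.array([ 4.0,  3.0,  0.0,  0.0])

row = int(np.argmin(rhs))                 # a row with negative right-hand side
assert rhs[row] < 0, "already primal feasible: nothing to do"

candidates = np.flatnonzero(T[row] < 0)   # only negative entries may pivot
assert candidates.size > 0, "no candidates: the problem is infeasible"

ratios = rc[candidates] / (-T[row, candidates])   # keeps reduced costs >= 0
col = candidates[int(np.argmin(ratios))]
print("pivot on row", row, "and column", col)
```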

IE316 Quiz 2 Review 9: Optimal Bases

Note that a given basis determines both a unique solution to the primal and a unique solution to the dual:
$$x_B = B^{-1}b, \qquad p^T = c_B^T B^{-1}.$$

Both the primal and dual solutions are basic, and either one, or both, may be feasible. If they are both feasible, then they are both optimal and the basis is also optimal. Both versions of the simplex method go from one basic solution to an adjacent one until reaching optimality. Both versions either terminate in a finite number of steps or cycle.
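In code, the two basic solutions and the resulting optimality test look as follows; the data and the candidate basis are illustrative assumptions, not from the slides:

```python
import numpy as np

# Primal and dual basic solutions determined by a basis (column indices),
# and the check that the basis is optimal when both are feasible.
c = np.array([-3.0, -5.0, 0.0, 0.0])
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 2.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])
basis = [0, 1]                      # candidate basis (assumption)

B = A[:, basis]
x_B = np.linalg.solve(B, b)         # x_B = B^{-1} b
p = np.linalg.solve(B.T, c[basis])  # p^T = c_B^T B^{-1}

primal_feasible = np.all(x_B >= 0)
dual_feasible = np.all(c - p @ A >= -1e-9)   # reduced costs nonnegative

print("x_B =", x_B, " p =", p)
print("optimal basis?", primal_feasible and dual_feasible)
```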

IE316 Quiz 2 Review 10: Geometric Interpretation of Optimality

Suppose we have a problem in inequality form, so that the dual is in standard form, and a basis B. If I is the index set of binding constraints at the corresponding (nondegenerate) BFS, and we enforce complementary slackness, then dual feasibility is equivalent to
$$\sum_{i \in I} p_i a_i = c, \qquad p_i \geq 0.$$

In other words, the objective function must be a nonnegative combination of the binding constraints. We can easily picture this graphically.

IE316 Quiz 2 Review 11: Farkas Lemma

Proposition 1. Let $A \in \mathbb{R}^{m \times n}$ and $b \in \mathbb{R}^m$ be given. Then exactly one of the following holds:
1. There exists $x \geq 0$ such that $Ax = b$.
2. There exists $p$ such that $p^T A \geq 0^T$ and $p^T b < 0$.

This is closely related to the geometric interpretation of optimality. In words, this says that either
- the vector b is in the cone C formed by all nonnegative linear combinations of the columns of A, or
- there is a hyperplane separating b from C.

From this lemma, we can derive much of optimization theory.
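One way to see which alternative holds for given data is simply to test feasibility of the first system with an LP solver; the sketch below uses scipy.optimize.linprog on made-up data (an illustration of the statement, not a proof of the lemma):

```python
import numpy as np
from scipy.optimize import linprog

# Farkas alternative checked numerically: exactly one system has a solution.
A = np.array([[1.0, 2.0],
              [3.0, 1.0]])
b = np.array([-1.0, 2.0])

# System 1: is there an x >= 0 with Ax = b?  Solve a pure feasibility LP.
res = linprog(np.zeros(A.shape[1]), A_eq=A, b_eq=b,
              bounds=[(0, None)] * A.shape[1], method="highs")

if res.status == 0:
    print("System 1 holds, e.g. x =", res.x)
else:
    # By the lemma, System 2 must hold: some p with p^T A >= 0 and p^T b < 0.
    # Such a certificate can be read off a phase-one / dual solution.
    print("System 1 is infeasible, so a vector p separating b from the cone exists")
```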

IE316 Quiz 2 Review 12: Recession Cones and Extreme Rays

The recession cone associated with a polyhedron in inequality form is $\{d \in \mathbb{R}^n : Ad \geq 0\}$. The extreme rays of this cone are also called the extreme rays of the polyhedron. Note that a polyhedron has a finite number of non-equivalent extreme rays.

Theorem 3. Consider the LP $\min\{c^T x : Ax \geq b\}$ and assume the feasible region has at least one extreme point. The optimal cost is equal to $-\infty$ if and only if some extreme ray d satisfies $c^T d < 0$.

When the simplex method terminates due to unboundedness, there is a column j with negative reduced cost for which the jth basic direction belongs to the recession cone. It is easy to show that this basic direction is an extreme ray of the recession cone.
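Checking whether a given direction certifies unboundedness is a one-line computation; the sketch below uses illustrative data and an assumed direction d:

```python
import numpy as np

# For P = {x : Ax >= b}, check whether a direction d lies in the recession
# cone {d : Ad >= 0} and certifies an unbounded LP (c^T d < 0).
# The data and the direction d are illustrative assumptions.
A = np.array([[ 1.0, -1.0],
              [ 1.0,  0.0]])
b = np.array([0.0, 1.0])
c = np.array([-1.0, -1.0])
d = np.array([1.0, 1.0])

in_recession_cone = bool(np.all(A @ d >= 0))
certifies_unbounded = in_recession_cone and c @ d < 0
print("d in recession cone:", in_recession_cone)
print("certifies optimal cost -inf:", certifies_unbounded)
```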

IE316 Quiz 2 Review 13: Representation of Polyhedra

Theorem 4. Let $P \subseteq \mathbb{R}^n$ be a nonempty polyhedron with at least one extreme point. Let $x^1, \ldots, x^k$ be its extreme points and $w^1, \ldots, w^r$ be its extreme rays. Then
$$P = \left\{ \sum_{i=1}^{k} \lambda_i x^i + \sum_{j=1}^{r} \theta_j w^j \;:\; \lambda_i \geq 0,\ \theta_j \geq 0,\ \sum_{i=1}^{k} \lambda_i = 1 \right\}.$$

Corollary 1. A nonempty polyhedron is bounded if and only if it has no extreme rays.

Corollary 2. A nonempty bounded polyhedron is the convex hull of its extreme points.

Corollary 3. Every element of a polyhedral cone can be expressed as a nonnegative linear combination of extreme rays.

IE316 Quiz 2 Review 14: The Converse of the Representation Theorem

Definition 1. A set Q is finitely generated if it is of the form
$$Q = \left\{ \sum_{i=1}^{k} \lambda_i x^i + \sum_{j=1}^{r} \theta_j w^j \;:\; \lambda_i \geq 0,\ \theta_j \geq 0,\ \sum_{i=1}^{k} \lambda_i = 1 \right\}$$
for given vectors $x^1, \ldots, x^k$ and $w^1, \ldots, w^r$ in $\mathbb{R}^n$.

Theorem 5. Every finitely generated set is a polyhedron. The convex hull of finitely many vectors is a bounded polyhedron, also called a polytope.

IE316 Quiz 2 Review 15: The Fundamental Idea of Sensitivity Analysis

Using the simplex algorithm to solve a standard form problem, we know that if B is an optimal basis, then two conditions are satisfied:
$$B^{-1} b \geq 0, \qquad c^T - c_B^T B^{-1} A \geq 0^T.$$

When the problem is changed, we can check to see how these conditions are affected. When using the simplex method, we always have $B^{-1}$ available, so we can easily recompute the appropriate quantities. If the change causes the optimality conditions to be violated, we can usually re-solve from the current basis using either primal or dual simplex.
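For example, after a change in the right-hand side only the first condition can fail, so the test reduces to a single matrix-vector product; the data and the basis below are illustrative assumptions:

```python
import numpy as np

# After a change b -> b + delta, the current basis stays optimal as long as
# B^{-1}(b + delta) >= 0 (a rhs change does not affect the reduced costs).
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 2.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])
basis = [0, 1]                       # assumed optimal basis for the original b
B_inv = np.linalg.inv(A[:, basis])

for delta in (np.array([0.5, 0.0]), np.array([-5.0, 0.0])):
    still_optimal = bool(np.all(B_inv @ (b + delta) >= 0))
    print("delta =", delta, "-> basis still optimal:", still_optimal,
          "(otherwise re-solve with dual simplex)")
```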

IE316 Quiz 2 Review 16: Adding New Variables and Constraints

To add a new column $A_{n+1}$:
- Simply compute the reduced cost, $c_{n+1} - c_B^T B^{-1} A_{n+1}$, and the new column in the tableau, $B^{-1} A_{n+1}$.
- Re-optimize using primal simplex.

To add an inequality constraint $a_{m+1}^T x \geq b_{m+1}$:
- Add a slack variable into the basis and compute the new tableau row, $[\,a_B^T B^{-1} A - a_{m+1}^T \;\; 1\,]$, where $a_B$ contains the components of $a_{m+1}$ corresponding to the basic variables.
- Re-optimize using dual simplex.

To add an equality constraint $a_{m+1}^T x = b_{m+1}$:
- Follow the same steps as above, except introduce an artificial variable instead of a slack variable.
- The artificial variable should be given a large cost in the objective function to drive it out of the basis.
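A sketch of pricing a new column under an assumed optimal basis; the data, the new column A_new, and its cost c_new are hypothetical, chosen only to illustrate the two quantities above:

```python
import numpy as np

# Price a new column: compute its reduced cost and its tableau column from
# the current basis. A negative reduced cost means primal simplex should
# re-optimize after adding it.
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 2.0, 0.0, 1.0]])
c = np.array([-3.0, -5.0, 0.0, 0.0])
basis = [0, 1]                       # assumed optimal basis
B_inv = np.linalg.inv(A[:, basis])

A_new = np.array([1.0, 3.0])         # hypothetical new column A_{n+1}
c_new = -8.0                         # its objective coefficient c_{n+1}

p = c[basis] @ B_inv                 # dual prices c_B^T B^{-1}
reduced_cost = c_new - p @ A_new     # c_{n+1} - c_B^T B^{-1} A_{n+1}
tableau_col = B_inv @ A_new          # B^{-1} A_{n+1}

print("reduced cost:", reduced_cost)
print("tableau column:", tableau_col)
print("worth bringing into the basis:", reduced_cost < 0)
```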

IE316 Quiz 2 Review 17: Local Sensitivity Analysis

For changes in the right-hand side:
- Recompute the values of the basic variables, $B^{-1} b$.
- Re-solve using dual simplex if necessary.

For changes in the cost vector:
- Recompute the reduced costs.
- Re-solve using primal simplex.

For changes in a nonbasic column $A_j$:
- Recompute the reduced cost, $c_j - c_B^T B^{-1} A_j$.
- Recompute the column in the tableau, $B^{-1} A_j$.

For all of these changes, we can compute ranges within which the current basis remains optimal.

IE316 Quiz 2 Review 18: Global Dependence on the Cost and Right-hand Side Vectors

Consider a family of polyhedra parameterized by the vector b,
$$P(b) = \{x \in \mathbb{R}^n : Ax = b,\ x \geq 0\},$$
and the function $F(b) = \min_{x \in P(b)} c^T x$.

Consider the extreme points $p^1, \ldots, p^N$ of the dual polyhedron. There must be an extremal optimum to the dual, and so
$$F(b) = \max_{i=1,\ldots,N} (p^i)^T b.$$
Hence, $F(b)$ is a piecewise linear convex function.

Theorem 6. Suppose that the linear program $\min\{c^T x : Ax = \hat{b},\ x \geq 0\}$ is feasible and that the optimal cost is finite. Then p is an optimal solution to the dual if and only if it is a subgradient of F at $\hat{b}$.

A similar analysis applies to the cost vector.

IE316 Quiz 2 Review 19: Parametric Programming

Fix a matrix A, vectors b and c, and a direction d, and consider the problem
$$\min\ (c + \theta d)^T x \quad \text{s.t.}\quad Ax = b,\ x \geq 0.$$

If $g(\theta)$ is the optimal cost for a given $\theta$, then as before
$$g(\theta) = \min_{i=1,\ldots,N} (c + \theta d)^T x^i,$$
where $x^1, \ldots, x^N$ are the extreme points of the feasible set. Hence, $g(\theta)$ is piecewise linear and concave.
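A brute-force way to see the piecewise linear, concave shape of $g(\theta)$ is to solve the LP on a grid of $\theta$ values; the sketch below uses illustrative data and scipy.optimize.linprog:

```python
import numpy as np
from scipy.optimize import linprog

# Trace g(theta) = min{(c + theta*d)^T x : Ax = b, x >= 0} over a grid.
# The optimal values trace a piecewise linear concave curve. Data are
# illustrative only.
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 2.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])
c = np.array([-3.0, -5.0, 0.0, 0.0])
d = np.array([ 1.0,  0.0, 0.0, 0.0])

for theta in np.linspace(0.0, 4.0, 9):
    res = linprog(c + theta * d, A_eq=A, b_eq=b,
                  bounds=[(0, None)] * 4, method="highs")
    print(f"theta = {theta:4.1f}   g(theta) = {res.fun:8.3f}")
```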

IE316 Quiz 2 Review 20: The Parametric Simplex Method

- Determine an initial feasible basis.
- Determine the interval $[\theta_1, \theta_2]$ for which this basis is optimal.
- Determine a variable j whose reduced cost is nonpositive for $\theta \geq \theta_2$. If the corresponding column has no positive entries, then the problem is unbounded for $\theta > \theta_2$. Otherwise, bring column j into the basis.
- Determine a new interval $[\theta_2, \theta_3]$ in which the current basis is optimal.
- Iterate to find all breakpoints $\theta \geq \theta_1$; then repeat the process to find the breakpoints $\theta \leq \theta_1$.

IE316 Quiz 2 Review 21: Large-scale Linear Programming

For large LPs, we really need only consider
- constraints that are binding at optimality, and
- variables that are basic at optimality.

We derived methods for dynamically generating variables and constraints during the solution process. For variables, the methods are based on calculating the reduced costs for columns that do not currently appear in the tableau. Finding the column with the most negative reduced cost is an optimization problem that can sometimes be solved efficiently. For constraints, the methods are similar but are based on finding the constraint with the most negative slack value. These methods are needed for efficiency and in cases where we don't want to generate the constraint matrix a priori.
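A toy version of the pricing step used in column generation; the column pool, costs, and dual prices below are made up, and real applications typically solve the pricing problem with a problem-specific algorithm rather than an explicit scan:

```python
import numpy as np

# Column-generation pricing sketch: given current dual prices p, find the
# column (from a pool too large to keep in the tableau) with the most
# negative reduced cost.
rng = np.random.default_rng(0)
p = np.array([-1.0, -2.0])                                    # current dual prices

pool_cols = rng.integers(0, 3, size=(2, 1000)).astype(float)  # candidate columns
pool_costs = rng.uniform(-8.0, 0.0, size=1000)                # objective coefficients

reduced_costs = pool_costs - p @ pool_cols
j = int(np.argmin(reduced_costs))
if reduced_costs[j] < 0:
    print(f"add column {j} (reduced cost {reduced_costs[j]:.3f}) and re-optimize")
else:
    print("no improving column: the current solution is optimal over the full pool")
```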