Crash-Starting the Simplex Method

Transcription:

Crash-Starting the Simplex Method
Ivet Galabova, Julian Hall
School of Mathematics, University of Edinburgh
Optimization Methods and Software, December 2017

Overview

A linear programming problem is

    min  c^T x
    s.t. Ax = b
         x ≥ 0

[Diagram: the feasible region, showing a "Feasibility crash" towards a feasible point and a "Better crash" towards a near-optimal point.]

Pipeline:  x^0 --Crash--> x^k --Crossover--> x^b --Simplex--> x^*
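As a concrete illustration of this standard form (a toy instance made up for this transcript, not one from the talk), the following solves a tiny equality-constrained LP with SciPy; the crash and crossover stages discussed on the following slides aim to give the simplex solver a good starting point for much larger problems of exactly this form.

```python
import numpy as np
from scipy.optimize import linprog

# min c^T x  s.t.  Ax = b, x >= 0   (tiny made-up instance)
c = np.array([-1.0, -2.0, 0.0, 0.0])
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])

res = linprog(c, A_eq=A, b_eq=b, bounds=(0, None), method="highs")
print(res.x, res.fun)   # optimal vertex (3, 1, 0, 0) and objective -5
```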

Motivation

A good starting basis is essential for the solution of large problems.
The Idiot crash is a heuristic aiming to improve feasibility.
Implemented by John Forrest in CLP.
Seeking primal feasibility prior to primal simplex.

x^0 --Crash--> x^k --Crossover--> x^b --Simplex--> x^*

The Idiot crash

No documentation, only the code. John Forrest: "so bad that I called it the Idiot algorithm".
At each iteration, minimize a weighted average of the objective and the infeasibilities.
Looks like a quadratic penalty function approach, but there is an extra term in the function minimized.
Looks like an Augmented Lagrangian (AL) method, but with approximate minimization, parameter adjustment and other details.

Augmented Lagrangian

In iteration k:

    x^{k+1} = arg min_x L_A = c^T x - λ^T r(x) + (µ/2) r(x)^T r(x),

with residual vector r(x) = b - Ax.

Parameters µ and λ: λ_i corresponds to the i-th row; the λ converge to the optimal Lagrange multipliers.

Parameter adjustment:
    At start:                  µ_0 = 2,  λ_0 = 0
    After some iterations l:   µ_{l+1} := 10 µ_l
    After some iterations s:   λ_{s+1} := λ_s - µ r(x_s)
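For later reference (a routine calculation implied by, but not shown on, the slide), the gradient of L_A with respect to x follows from r(x) = b - Ax:

```latex
\nabla_x L_A(x) \;=\; c + A^T\lambda - \mu A^T r(x),
\qquad\text{since }\ \nabla_x\bigl(-\lambda^T r(x)\bigr) = A^T\lambda
\ \text{ and }\ \nabla_x\Bigl(\tfrac{\mu}{2}\,r(x)^T r(x)\Bigr) = -\mu A^T r(x).
```

Setting this to zero couples all components of x through A^T A, which is why exact minimization is expensive and motivates the alternating-variables approach on the next slide.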

Alternating variables approach

    L_A = c^T x - λ^T r(x) + (µ/2) r(x)^T r(x)

Exact minimization: singular Hessian, computationally expensive.

Alternating variables approach: approximate solution, cheap.

    min_{x_1} h(x),  min_{x_2} h(x),  ...,  min_{x_n} h(x)

repeated 100 times to improve accuracy.
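Since h is a convex quadratic in any single coordinate, each one-variable minimization has a closed form. The following is a minimal sketch of such a step under the slide's definitions (it is not the CLP implementation), with the bound x_j ≥ 0 enforced by projection and the residual r = b - Ax maintained incrementally.

```python
import numpy as np

def coordinate_step(x, r, j, A, c, lam, mu):
    """Exactly minimize h(x) = c^T x - lam^T r(x) + (mu/2) r(x)^T r(x)
    over the single coordinate x_j (others fixed), then project onto x_j >= 0.
    r is the current residual b - A x and is updated to stay consistent."""
    a_j = A[:, j]
    denom = mu * (a_j @ a_j)            # second derivative of h w.r.t. x_j
    if denom == 0.0:                    # empty column: skip in this sketch
        return x, r
    # Zero of dh/dx_j = c_j + lam^T a_j - mu a_j^T r(x) along the coordinate
    delta = (mu * (a_j @ r) - c[j] - lam @ a_j) / denom
    x_j_new = max(0.0, x[j] + delta)    # projection onto the bound x_j >= 0
    r = r - (x_j_new - x[j]) * a_j      # residual after the coordinate change
    x[j] = x_j_new
    return x, r
```

Each step costs one sparse column operation, which is why a full sweep over the n coordinates is cheap compared with solving the coupled system.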

The Idiot Crash

The Idiot Crash structure:

    Initialize x^0, µ_0, λ_0 = 0
    For I = 1..nMajor
        For i = 1..nMinor
            For j in {1..n}:  min_{x_j} h(x) = c^T x - λ^T r(x) + (µ/2) r(x)^T r(x)
        end
        Choose new µ or λ
    end

Every few major iterations µ is increased.
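Combining this structure with the coordinate step sketched above gives the following skeleton; it is an illustrative reading of the slide rather than CLP's code, and `update_parameters` is a placeholder for the µ/λ schedule described on the next slide.

```python
import numpy as np

def idiot_style_crash(A, b, c, n_major=30, n_minor=105, mu0=2.0, seed=0):
    """Skeleton of an Idiot-style crash: approximately minimize the
    penalty/AL function by coordinate sweeps, with periodic mu/lambda updates."""
    m, n = A.shape
    rng = np.random.default_rng(seed)
    x = rng.random(n)                   # random starting point (see 'Idiot nature')
    lam = np.zeros(m)
    mu = mu0
    r = b - A @ x                       # residual b - A x
    state = {}                          # bookkeeping for the parameter schedule
    for major in range(n_major):
        for minor in range(n_minor):
            for j in range(n):          # cycle through all columns
                x, r = coordinate_step(x, r, j, A, c, lam, mu)
        mu, lam = update_parameters(mu, lam, r, state)
    return x
```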

Parameters

At the end of each major iteration the values of λ or µ are re-adjusted:

    Start with µ_0, λ_0 = 0, nMinor = 2.
    At the end of each major iteration:
        Until a 10% reduction in primal infeasibility is achieved, set µ_{k+1} = 10 µ_k.
        Then set nMinor = 105.
        In the first major iteration with a new µ, λ is unchanged.
        In the next few major iterations, set λ_i^{k+1} = µ r_i(x^k).
        Increase µ and repeat.
    λ converges to 0.
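One plausible reading of this schedule, written as the `update_parameters` placeholder used in the skeleton above; the exact rules in CLP are undocumented, so the thresholds and the bookkeeping here are assumptions made for illustration.

```python
import numpy as np

def update_parameters(mu, lam, r, state):
    """Illustrative mu/lambda schedule following the slide: increase mu tenfold
    until primal infeasibility has dropped by 10%, otherwise keep mu and set
    lambda_i = mu * r_i(x)."""
    infeas = float(np.linalg.norm(r, 1))   # primal infeasibility ||b - Ax||_1
    prev = state.get("prev_infeas")
    if prev is None or infeas > 0.9 * prev:
        mu *= 10.0                         # not enough progress: larger penalty,
    else:                                  # leaving lambda unchanged this time
        lam = mu * r                       # multiplier update lambda_i = mu r_i(x)
    state["prev_infeas"] = infeas
    return mu, lam
```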

Idiot nature

Not Augmented Lagrangian: during the initial major iterations µ is smaller, so λ ensures that reducing the primal infeasibility has enough weight in

    min_{x_j} h(x) = c^T x - λ^T r(x) + (µ/2) r(x)^T r(x)

Alternating variables method: cycles through all columns, from a random starting point.
Early termination of a major iteration is possible, via a condition on a moving average of expected progress (one illustrative form is sketched below).
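The slide does not specify the early-termination test; one illustrative form of a moving-average check (purely an assumption, written here only to make the idea concrete) is:

```python
def should_terminate_major(reductions, expected, alpha=0.5, tol=0.1):
    """Illustrative test: keep an exponential moving average of the per-sweep
    reduction in infeasibility and end the major iteration early once it falls
    below a fraction of the expected progress. The actual Idiot-crash condition
    is undocumented."""
    ema = 0.0
    for d in reductions:                # recent per-sweep reductions, oldest first
        ema = alpha * d + (1.0 - alpha) * ema
    return ema < tol * expected
```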

Relevant work

An Augmented Lagrangian Method for Linear Programming. Beale, Hattersley, James (1985)
The LP dual active set algorithm. Hager (1998)
An approximate, efficient LP solver for LP rounding. Sridhar, Wright et al. (2013)
Sparse linear programming via primal and dual augmented coordinate descent. Yen, Zhong et al. (2015)

Performance

Generally useless, owing to the single-variable-change limitation.
But it solves Quadratic Assignment Problems (QAPs):
    a combinatorial optimization problem, a special case of facility location,
    well known for being hard to solve.
The Idiot crash reduces the solution time of the QAP linearisation significantly.
Many zero Lagrange multipliers.

Quadratic Assignment Problems

[Figure: solution time (s) for QAP12 (0-70 s) and QAP15 (0-900 s) against nMajor, split into idiot, crossover and simplex phases.]

Accuracy

Residual error of order 10^{-4}: the Idiot crash on QAPs.
Residual error of order 10^{-1}: An approximate, efficient LP solver for LP rounding. Sridhar, Wright et al. (2013)
Residual error of order 10^{-3}: Sparse linear programming via primal and dual augmented coordinate descent. Yen, Zhong et al. (2015)

Change in objective value

    Iteration   Penalty     Idiot       AL
    0             0           0           0
    1           449.307     449.307       0
    2           447.728     447.672       4.759
    3           446.731     446.676       0
    4           520.996     520.936     236.608
    5           519.322     519.324     110.543
    6           518.447     518.448     492.086
    7           527.641     527.642     457.191
    9           527.391     527.393     530.405

Penalty method vs Idiot vs Augmented Lagrangian: objective value.

Idiot alone

    (P)  min  f_P = c^T x    s.t. Ax = b, x ≥ 0
    (D)  max  f_D = b^T y    s.t. A^T y ≤ c

Idiot applied to (P): primal feasible, objective f_P ≥ f^*; near-optimal, but not provably.
Idiot applied to (D): dual feasible, objective f_D ≤ f^*.
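The reason these two runs bracket the optimal value f^* is weak duality (a standard argument, spelled out here for completeness):

```latex
\text{If } Ax = b,\ x \ge 0 \text{ and } A^T y \le c, \text{ then }\quad
f_D = b^T y = (Ax)^T y = x^T(A^T y) \le x^T c = f_P,
\quad\text{so}\quad f_D \le f^* \le f_P .
```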

Branch & bound

Branch & bound tree, with LP problems at each node.
Lower bounds on f^* allow for pruning.

Observations

                                               Time (s)
    Problem     LB         UB         f^*      Bounds   Idiot+Crossover   Simplex
    qap08      141.39     204.57     203.50     0.55         0.61           1.80
    qap12      481.50     523.60     522.89     3.85        12.64          65.21
    qap15      950.05    1042.05    1040.99    11.53        64.48         803.30

QAPs require an integer solution to the linearization.
Fast bounds on f^* accelerate branch & bound algorithms.

Conclusion

Finding a good starting basis is beneficial but difficult.
The Idiot crash is somewhere between a quadratic penalty function method and an Augmented Lagrangian method.
Limitations: sensitive to parameters, which can have disastrous effects; single variable change.
Effective on some problems; scope for accelerating B&B.