Network Flows. 7. Multicommodity Flow Problems. Fall 2010 Instructor: Dr. Masoud Yaghini


In the name of God. Network Flows. 7. Multicommodity Flow Problems. 7.2 Lagrangian Relaxation Approach. Fall 2010 Instructor: Dr. Masoud Yaghini

The multicommodity flow problem formulation:
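The formulation referred to here is the standard arc-flow model with bundle (shared-capacity) constraints, written here in the notation used throughout this section (x^k the flow vector of commodity k, u_ij the shared capacity of arc (i, j)):

```latex
\begin{aligned}
\min \quad & \sum_{k} \sum_{(i,j) \in A} c_{ij}^{k}\, x_{ij}^{k} \\
\text{s.t.} \quad & \sum_{k} x_{ij}^{k} \le u_{ij} && \text{for all } (i,j) \in A \quad \text{(bundle constraints)} \\
& \mathcal{N} x^{k} = b^{k} && \text{for each commodity } k \quad \text{(flow balance)} \\
& 0 \le x_{ij}^{k} \le u_{ij}^{k} && \text{for all } (i,j) \in A \text{ and each } k
\end{aligned}
```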

We associate nonnegative Lagrange multipliers w_ij with the bundle constraints, creating the following Lagrangian subproblem:

L(w) = min Σ_k Σ_(i,j) c_ij^k x_ij^k + Σ_(i,j) w_ij (Σ_k x_ij^k − u_ij)

or, equivalently,

L(w) = min Σ_k Σ_(i,j) (c_ij^k + w_ij) x_ij^k − Σ_(i,j) w_ij u_ij

subject to the flow balance constraints and the individual flow bounds 0 ≤ x_ij^k ≤ u_ij^k of each commodity k.

Since the term Σ_(i,j) w_ij u_ij is a constant for any given choice of the Lagrange multipliers, we can ignore it. The resulting objective function has a cost c_ij^k + w_ij associated with every flow variable x_ij^k.

Since none of the constraints in this problem contains the flow variables of more than one commodity, the problem decomposes into separate minimum cost flow problems, one for each commodity. Consequently, to apply the subgradient optimization procedure to this problem, we alternately (1) solve a set of minimum cost flow problems (for a fixed value of the Lagrange multipliers w) with the modified cost coefficients c_ij^k + w_ij, and (2) update the multipliers by the subgradient formula.

If x^kq denotes the optimal solution to the minimum cost flow subproblems when the Lagrange multipliers have the value w^q at the qth iteration, the subgradient update formula becomes:

w_ij^(q+1) = [w_ij^q + θ_q (Σ_k x_ij^kq − u_ij)]^+

In this expression, the notation [a]^+ denotes the positive part of a, that is, max(a, 0). The scalar θ_q is a step size specifying how far we move from the current solution w^q.
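One update step can be sketched in code as follows (a minimal sketch; the dict-keyed-by-arc representation is an illustrative assumption, not from the slides):

```python
# One subgradient step on the bundle-constraint multipliers:
#   w_ij <- [w_ij + theta * (total flow on (i,j) - u_ij)]^+
# Arcs are keyed by arbitrary hashable labels (an assumption for this sketch).

def update_multipliers(w, theta, flow, capacity):
    """w, flow, capacity: dicts mapping arc -> value, where flow[arc] is the
    total flow of all commodities on that arc in the current subproblem
    solutions. Returns the updated multiplier dict."""
    return {arc: max(w[arc] + theta * (flow[arc] - capacity[arc]), 0.0)
            for arc in w}
```

A multiplier rises on overloaded arcs, falls on underloaded ones, and is projected back to zero if the decrease would make it negative.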

If the subproblem solutions use more than the available capacity u_ij of arc (i, j), the update formula increases the multiplier on that arc by the amount θ_q (Σ_k x_ij^kq − u_ij). If the subproblem solutions use less than the available capacity of that arc, the update formula reduces the Lagrange multiplier of arc (i, j) by the amount θ_q (u_ij − Σ_k x_ij^kq).

If this decrease would cause the multiplier to become negative, we instead reduce the multiplier to the value zero. We choose the step sizes θ_q for iterations q = 1, 2, ..., so that θ_q → 0 as q → ∞ while Σ_q θ_q = ∞ (for example, θ_q = 1/q).
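As a quick numerical check, the classical choice θ_q = 1/q satisfies both conditions: the steps shrink to zero while their partial sums (the harmonic series) grow without bound.

```python
# theta_q = 1/q: the steps vanish, yet the partial sums diverge (harmonic series)
def theta(q):
    return 1.0 / q

steps = [theta(q) for q in range(1, 10001)]
print(steps[-1])      # the 10000th step is already tiny: 0.0001
print(sum(steps))     # yet the partial sum keeps growing (about 9.79 here)
```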

Whenever we apply Lagrangian relaxation to any linear program, such as the multicommodity flow problem, the optimal value of the Lagrangian multiplier problem equals the optimal objective function value z* of the linear program.
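In symbols, writing L(w) for the optimal value of the Lagrangian subproblem, this LP duality fact reads:

```latex
L(w) \le z^* \quad \text{for all } w \ge 0,
\qquad \text{and} \qquad
\max_{w \ge 0} L(w) = z^* .
```

The first relation (weak duality) is what makes every L(w) a valid lower bound; the second says that for linear programs there is no duality gap.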

Example: Lagrangian Relaxation. Consider a two-commodity problem.

For any choice of the Lagrange multipliers w_s1t1 and w_12 for the two capacitated arcs (s1, t1) and (1, 2), the problem decomposes into two shortest path problems. If we start with the Lagrange multipliers w_s1t1 = w_12 = 0, then in the subproblem solutions: the shortest path s1-t1 carries 10 units of flow at a cost of 1(10) = 10, and the shortest path s2-1-2-t2 carries 20 units of flow at a cost of 3(20) = 60. Therefore, L(0) = 10 + 60 = 70 is a lower bound on the optimal objective function value for the problem.

Since the flows exceed the capacities of both capacitated arcs, the update formulas become:

w_s1t1^1 = [0 + θ_0 (10 − 5)]^+ and w_12^1 = [0 + θ_0 (20 − 10)]^+

If we choose θ_0 = 1, then w_s1t1^1 = 5 and w_12^1 = 10.

The new shortest path solutions send: 10 units on the path s1-t1 at a cost of (1 + 5)(10) = 60, and 20 units on the path s2-t2 at a cost of 5(20) = 100. The new lower bound is L(w^1) = 60 + 100 − Σ_(i,j) w_ij^1 u_ij = 160 − 125 = 35. The value of the lower bound has decreased.

At this point the subproblem flow on arc (s1, t1) is 10 and the flow on arc (1, 2) is 0, so the update formulas become:

w_s1t1^2 = [5 + θ_1 (10 − 5)]^+ and w_12^2 = [10 + θ_1 (0 − 10)]^+

If we choose θ_1 = 1, then w_s1t1^2 = 10 and w_12^2 = 0.

The new shortest path solutions send: 10 units on the path s1-t1 at a cost of (1 + 10)(10) = 110, and 20 units on the path s2-1-2-t2 at a cost of 3(20) = 60. If we continue by choosing the step size for the kth iteration as θ_k = 1/k, we obtain a sequence of iterates.

From iteration 14 on, the values of the Lagrange multipliers oscillate about, and converge to, their optimal values. The Lagrangian lower bound oscillates about its optimal value 150, which equals the optimal objective function value of the multicommodity flow problem.
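The whole scheme can be run end to end on a small two-commodity instance. The transcription omits the network data, so the instance below (arc (s1, t1): cost 1, capacity 5; arc (1, 2): capacity 10 on the path s2-1-2-t2 of per-unit cost 3; uncapacitated alternatives of cost 13 for commodity 1 and cost 5 for commodity 2) is a reconstruction chosen so the arithmetic matches the numbers quoted above (L(0) = 70, next bound 35, optimum 150); treat it as an illustrative sketch rather than the slides' exact figure.

```python
# Subgradient optimization for the Lagrangian dual of a small
# two-commodity flow problem (instance data reconstructed, see lead-in).

# capacitated (bundle-constrained) arcs and their capacities u_ij
cap = {'s1t1': 5.0, '12': 10.0}

# per commodity: (demand, candidate paths); each path is
# (base cost per unit, capacitated arcs it uses)
commodities = {
    1: (10, [(1.0, ['s1t1']), (13.0, [])]),   # s1 -> t1: direct arc, or detour
    2: (20, [(3.0, ['12']),   (5.0, [])]),    # s2 -> t2: via 1-2, or direct
}

def solve_subproblems(w):
    """Shortest-path subproblem per commodity under modified costs c + w.
    Returns (L(w), total flow on each capacitated arc)."""
    bound = -sum(w[a] * cap[a] for a in cap)   # constant term -sum w_ij u_ij
    flow = {a: 0.0 for a in cap}
    for demand, paths in commodities.values():
        c, used = min(paths, key=lambda p: p[0] + sum(w[a] for a in p[1]))
        bound += demand * (c + sum(w[a] for a in used))
        for a in used:
            flow[a] += demand
    return bound, flow

def subgradient(iterations=200):
    """Alternate subproblem solves with multiplier updates; returns the
    best Lagrangian lower bound found and the final multipliers."""
    w = {a: 0.0 for a in cap}
    best = float('-inf')
    for q in range(1, iterations + 1):
        bound, flow = solve_subproblems(w)
        best = max(best, bound)
        theta = 1.0 if q <= 2 else 1.0 / q     # theta = 1 early, then 1/k
        w = {a: max(w[a] + theta * (flow[a] - cap[a]), 0.0) for a in cap}
    return best, w
```

For example, `solve_subproblems({'s1t1': 0.0, '12': 0.0})` reproduces the bound L(0) = 70 from the example, and `subgradient()` climbs toward the optimal bound 150 while the multiplier on arc (1, 2) settles near 2.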

This figure shows how the values of the Lagrange multipliers vary during the execution of the algorithm

This figure shows how the values of Lagrangian lower bounds vary during the execution of the algorithm

Advantages of subgradient optimization for solving the Lagrangian multiplier problem: (1) This solution approach permits us to exploit the underlying network flow structure. (2) The formulas for updating the Lagrange multipliers w_ij are computationally inexpensive and very easy to encode in a computer program.

Limitations of subgradient optimization: (1) To ensure convergence we need to take small step sizes; as a result, the method does not converge very fast. (2) Although the optimal flows solve the Lagrangian subproblem, these subproblems might also have other optimal solutions that do not satisfy the bundle constraints. For example, for the problem we have just solved, with the optimal Lagrange multipliers w_12 = 2 and w_s1t1 = 12, the shortest path subproblems have alternative solutions with 10 units on the path s1-t1 and 20 units on the path s2-t2. This solution violates the capacity of the arc (s1, t1). In general, obtaining optimal flows, even after we have solved the Lagrangian multiplier problem, requires additional work.

The End