Solving the Linear Transportation Problem by Modified Vogel Method


D. Almaatani, S. G. Diagne, Y. Gningue and P. M. Takouda

Abstract  In this chapter, we propose a modification of the Vogel Approximation Method (VAM) used to obtain near-optimal solutions to linear transportation problems. This method, called the Modified Vogel Method (MVM), consists of performing the row and column reduction of the cost matrix and then applying the classical Vogel method to the equivalent transportation problem with the reduced cost matrix. We prove that when no further reduction of the cost matrix is required, we obtain an optimal solution, not an approximate one. We identify some cases in which this behavior occurs and provide rules that allow for fast new reductions and penalty calculations when they are needed. The method also allows us to make multiple assignments of variables. Numerical tests run on small instances show that the MVM outperforms the original method in all cases while requiring comparable computing times. The tests also support the intuition that the new method provides optimal solutions almost all the time, making it a viable alternative to the classical transportation simplex.

D. Almaatani
Department of Mathematics and Computer Sciences, Laurentian University, Sudbury, Canada
e-mail: dalmaatani@laurentian.ca

S. G. Diagne
Département de Mathématiques, Université Cheikh Anta Diop, Dakar, Sénégal
e-mail: gueyesalli@yahoo.com

Y. Gningue
Department of Mathematics and Computer Sciences, Laurentian University, Sudbury, ON, Canada
e-mail: ygningue@cs.laurentian.ca

P. M. Takouda
School of Commerce & Administration, Laurentian University, Sudbury, Canada
e-mail: mtakouda@laurentian.ca

© Springer International Publishing Switzerland 2015
M. G. Cojocaru et al. (eds.), Interdisciplinary Topics in Applied Mathematics, Modeling and Computational Science, Springer Proceedings in Mathematics & Statistics 117, DOI 10.1007/978-3-319-12307-3_3

1 Linear Transportation Problem

The linear transportation problem (LTP) consists in shipping a commodity from supply centers, called sources, to receiving centers, called destinations, while minimizing the total distribution cost. Assuming that we have m sources i = 1, ..., m and n destinations j = 1, ..., n, we denote by C_ij the cost of shipping one unit of commodity from source i to destination j, by a_i the capacity of source i and by b_j the demand at destination j. The LTP can then be formulated as follows:

\begin{equation}
\text{(LTP)}\qquad
\begin{aligned}
\min\; CT &= \sum_{i=1}^{m}\sum_{j=1}^{n} C_{ij}\,X_{ij}\\
\text{s.t.}\;\; \sum_{j=1}^{n} X_{ij} &= a_i, \qquad i = 1,\ldots,m\\
\sum_{i=1}^{m} X_{ij} &= b_j, \qquad j = 1,\ldots,n\\
X_{ij} &\ge 0, \qquad i = 1,\ldots,m,\; j = 1,\ldots,n
\end{aligned}
\tag{1}
\end{equation}

It is a linear program with m + n constraints and m × n variables, X_ij representing the quantity shipped from source i to destination j. The costs C_ij form a matrix C = (C_ij) called the cost matrix of the problem.

The LTP is usually solved to optimality with the transportation simplex algorithm (see [3]). This algorithm has to be provided with a starting basic feasible solution, and several methods exist that compute such starting points. One of the most efficient is the Vogel Approximation Method (VAM) [6]. Indeed, VAM produces good near-optimal solutions, which reduces the number of iterations that the transportation simplex has to perform.

Assuming for the rest of this chapter that a line of a matrix refers to either a row or a column of the matrix, the VAM runs as follows. For each line (row or column) of the cost matrix, compute its penalty, which is the difference between the two least costs of the line. Then, locate the line with the largest penalty (called the penalty line) and, in that line, look for the lowest shipping cost; this cost lies at the intersection of the penalty line and another line, called the complementary line. Assign to the corresponding variable X the maximum quantity of commodity that can be shipped at that cost, and update the corresponding demand and supply information. One of them is updated to 0; the corresponding line is said to be saturated and is removed from the cost matrix, which is shrunk. The process is repeated until all the demands have been satisfied. Additional rules exist that help deal with special situations such as ties for the largest penalty or degeneracy (when the penalty and complementary lines are saturated at the same time before the end of the algorithm). The interested reader should refer to [6].

Modifications of the original VAM have been proposed, most of them for unbalanced transportation problems. For balanced problems, one proposed variant modifies the cost matrix as follows: from the cost matrix, obtain the row (respectively column) opportunity cost matrix by subtracting, in each row (respectively column), the smallest cost of the row (column) from all entries of the row (column); then add the row and column opportunity cost matrices to obtain a new cost matrix, on which the original VAM is applied. In addition, some tie-breaking rules are proposed. These variants are proposed and analyzed in [5] and [7]. Our modification of the VAM extends the one proposed in [4] and is called the Modified Vogel Method (MVM).
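To make this procedure concrete, here is a minimal Java sketch of the classical VAM for a balanced problem. It is our own illustration under simplifying assumptions, not the authors' implementation: the class and method names (VogelSketch, penalty, vogel) are ours, ties for the largest penalty are broken by first occurrence rather than by the additional rules of [6], and degeneracy is handled by crossing out only the row when a supply and a demand reach zero simultaneously.

/**
 * Minimal sketch of the classical Vogel Approximation Method (VAM)
 * for a balanced transportation problem (sum of supplies = sum of demands).
 */
public class VogelSketch {

    /** Penalty of a line = difference between its two smallest costs over the available cells. */
    static int penalty(int[][] c, boolean[] rowDone, boolean[] colDone, int line, boolean isRow) {
        int min1 = Integer.MAX_VALUE, min2 = Integer.MAX_VALUE;
        int len = isRow ? c[0].length : c.length;
        for (int k = 0; k < len; k++) {
            if (isRow ? colDone[k] : rowDone[k]) continue;        // skip crossed-out lines
            int v = isRow ? c[line][k] : c[k][line];
            if (v < min1) { min2 = min1; min1 = v; } else if (v < min2) { min2 = v; }
        }
        return (min2 == Integer.MAX_VALUE) ? 0 : min2 - min1;     // one cell left -> penalty 0
    }

    /** Returns a starting assignment X for cost matrix c, supplies a and demands b. */
    static int[][] vogel(int[][] c, int[] a, int[] b) {
        int m = a.length, n = b.length;
        int[] supply = a.clone(), demand = b.clone();
        boolean[] rowDone = new boolean[m], colDone = new boolean[n];
        int[][] x = new int[m][n];

        while (true) {
            int rowsLeft = 0, colsLeft = 0;
            for (boolean d : rowDone) if (!d) rowsLeft++;
            for (boolean d : colDone) if (!d) colsLeft++;
            if (rowsLeft == 0 || colsLeft == 0) break;

            if (rowsLeft == 1 || colsLeft == 1) {                 // fill the last remaining line
                for (int i = 0; i < m; i++) for (int j = 0; j < n; j++)
                    if (!rowDone[i] && !colDone[j]) {
                        int q = Math.min(supply[i], demand[j]);
                        x[i][j] += q; supply[i] -= q; demand[j] -= q;
                    }
                break;
            }

            // 1. line (row or column) with the largest penalty
            int bestPen = -1, bestLine = -1; boolean bestIsRow = true;
            for (int i = 0; i < m; i++) if (!rowDone[i]) {
                int p = penalty(c, rowDone, colDone, i, true);
                if (p > bestPen) { bestPen = p; bestLine = i; bestIsRow = true; }
            }
            for (int j = 0; j < n; j++) if (!colDone[j]) {
                int p = penalty(c, rowDone, colDone, j, false);
                if (p > bestPen) { bestPen = p; bestLine = j; bestIsRow = false; }
            }

            // 2. cheapest available cell on that line
            int bi = -1, bj = -1, bestCost = Integer.MAX_VALUE;
            for (int i = 0; i < m; i++) for (int j = 0; j < n; j++) {
                if (rowDone[i] || colDone[j]) continue;
                if ((bestIsRow ? i : j) != bestLine) continue;
                if (c[i][j] < bestCost) { bestCost = c[i][j]; bi = i; bj = j; }
            }

            // 3. ship as much as possible through that cell and cross out the saturated line
            int q = Math.min(supply[bi], demand[bj]);
            x[bi][bj] = q; supply[bi] -= q; demand[bj] -= q;
            if (supply[bi] == 0) rowDone[bi] = true; else colDone[bj] = true;
        }
        return x;
    }
}

The MVM described in the next section applies this same loop to the row-and-column-reduced cost matrix instead of C, and re-reduces the shrunk matrix when necessary, which is what makes its penalties sharper.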

The rest of the chapter is organized as follows. The next section presents the MVM. The method is then illustrated on an example with four sources and five destinations in Sect. 3. Numerical tests are presented in Sect. 4, followed by concluding remarks in Sect. 5.

2 The Modified Vogel Method

The MVM can be described as follows. First, compute the reduced cost matrix R by applying successively a row and a column reduction to the cost matrix C, and define a reduced transportation problem (RTP) by replacing C in LTP by R. Then, apply the VAM to the RTP. At each iteration, the new shrunk matrix is reduced again if needed.

A reduced cost matrix has a zero in each line. Therefore, the penalties in MVM are simply the second lowest costs of the lines. Due to the row and column reductions, each entry of the reduced cost matrix already contains information about the gaps between the original costs in its row and column. Hence, the associated Vogel penalties are qualitatively better than those of VAM. Note also that, using MVM, we compute solutions in which some of the assigned variables are associated with zero reduced costs of the LTP. This makes them particularly appealing as starting points for the transportation simplex algorithm [4]. In fact, in several instances, the transportation simplex algorithm [2] is not needed: MVM is guaranteed to provide the optimal solution.

Theorem 1  The reduced transportation problem (RTP) is equivalent to the linear transportation problem (LTP), and if its optimal cost is zero, then the optimal solution of RTP is optimal for LTP.

Proof  The row and column reductions that have been applied to the cost matrix to obtain the reduced cost matrix are admissible transformations as defined in [1].

At each iteration of MVM, one line of the current reduced matrix is removed. The remaining shrunk matrix may not be in reduced form. However, if the highest penalty is nonzero, the penalty line is saturated, and the penalty of the complementary line is zero, then the shrunk cost matrix remains reduced. Indeed, all the lines parallel to the penalty line are unaltered: they stay reduced, and their penalties are unchanged. Moreover, the highest penalty being nonzero, there was only one zero entry on the penalty line, and that zero is also on the complementary line. Crossing out the penalty line therefore does not remove a zero from the lines parallel to the complementary line. Hence, they stay reduced, and their penalty changes only if it was attained on the crossed line; in such a case, the new penalty is simply the next smallest nonzero cost. Finally, the complementary line, since its penalty is 0, had at least two zero entries. Therefore, it has at least one zero remaining, it stays reduced, and its penalty has to be recalculated. As a result, only a few penalties have to be recalculated.

If the penalty of the complementary line was not zero, all the previous remarks remain valid except for the complementary line, which is now not reduced. Again, it is easy to reduce: subtract its penalty from all its entries. Note that such an operation is equivalent to applying an admissible transformation (see [1]) to the reduced cost matrix, so as to solve an equivalent problem in which the complementary line has a zero penalty. In summary, when the saturated line is the penalty line, the shrunk matrix is always reduced, up to an admissible transformation. Hence, the following results hold.

Theorem 2  During the application of MVM, the two following assertions hold.
1. If no new reduction is necessary during the iterations, the solution obtained is optimal for LTP.
2. If all the successive line removals are associated with a unique largest penalty, then the solution obtained is optimal for LTP.

Remark 1  The only time we require the use of the transportation algorithm to obtain the optimal solution is when there are ties for the largest penalty and a new matrix reduction was needed. In the MVM algorithm, we keep track of the occurrences of tied largest penalties and of new reductions.

The MVM algorithm is hence described as follows.

Modified Vogel Algorithm
Step 0. Initialization. Compute the reduced cost matrix R. Set Nred := 1, TieLpen := 0 and UniqueLpen := 1.
Step 1. Penalty Determination. Determine the penalties p_i for each row i and q_j for each column j. Find the largest penalty Lpen = max{p_i, q_j}. If there is a tie for the largest penalty, set TieLpen := 1.
Step 2. Assigning a Variable. Find a zero reduced cost R_kr in the penalty line; the variable X_kr will be assigned a value.
Step 3. Updating. Assign the value of X_kr and update a_k and b_r. Eliminate the saturated line (supply or demand fully satisfied).
Step 4. Stopping Test. If there is only one remaining line, fill it and go to Step 6.
Step 5. New Reduction of the Remaining Matrix. Reduce the remaining matrix if needed; if a new reduction is performed, set Nred := Nred + 1. If TieLpen = 1, set UniqueLpen := 0. Go to Step 1.
Step 6. Optimality Test. If Nred = 1, the MVM solution is optimal; else, if UniqueLpen = 1, the MVM solution is optimal; else, find the dual variables and test the optimality.
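The two MVM-specific ingredients, the reduction of Step 0 and the simplified penalties of Step 1, can be sketched in Java as follows. Again, this is our own illustrative fragment and not the authors' code: the names MvmSketch, reduce and reducedPenalty are ours, and the integration with the assignment loop, the handling of crossed-out lines and the re-reduction of Step 5 are omitted. It is meant only to show that, once every line of the matrix contains a zero, the Vogel penalty of a line is simply its second-smallest entry.

import java.util.Arrays;

/** Sketch of the MVM-specific steps: row/column reduction and reduced-matrix penalties. */
public class MvmSketch {

    /** Step 0: subtract each row minimum, then each column minimum, so every line has a zero. */
    static int[][] reduce(int[][] c) {
        int m = c.length, n = c[0].length;
        int[][] r = new int[m][n];
        for (int i = 0; i < m; i++) {
            int rowMin = Integer.MAX_VALUE;
            for (int j = 0; j < n; j++) rowMin = Math.min(rowMin, c[i][j]);
            for (int j = 0; j < n; j++) r[i][j] = c[i][j] - rowMin;
        }
        for (int j = 0; j < n; j++) {
            int colMin = Integer.MAX_VALUE;
            for (int i = 0; i < m; i++) colMin = Math.min(colMin, r[i][j]);
            for (int i = 0; i < m; i++) r[i][j] -= colMin;
        }
        return r;
    }

    /** Step 1: on a reduced matrix the smallest entry of a line is 0,
     *  so the Vogel penalty of the line is just its second-smallest entry. */
    static int reducedPenalty(int[][] r, int line, boolean isRow) {
        int min1 = Integer.MAX_VALUE, min2 = Integer.MAX_VALUE;
        int len = isRow ? r[0].length : r.length;
        for (int k = 0; k < len; k++) {
            int v = isRow ? r[line][k] : r[k][line];
            if (v < min1) { min2 = min1; min1 = v; } else if (v < min2) { min2 = v; }
        }
        return (min2 == Integer.MAX_VALUE) ? 0 : min2;
    }

    // Bookkeeping of Steps 0, 1 and 6, kept here as plain fields:
    // nred      = number of reductions performed (1 after Step 0, +1 per re-reduction),
    // tieLpen   = set whenever the largest penalty is not unique,
    // uniqueLpen = stays true only if every removal was driven by a unique largest penalty.
    int nred = 1;
    boolean tieLpen = false, uniqueLpen = true;

    /** Tiny demonstration: reduce a 2 x 2 matrix (expected output: [[0, 0], [0, 3]]). */
    public static void main(String[] args) {
        int[][] r = reduce(new int[][] { {4, 6}, {3, 8} });
        System.out.println(Arrays.deepToString(r));
        System.out.println("row penalties: " + reducedPenalty(r, 0, true) + ", " + reducedPenalty(r, 1, true));
    }
}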

3 Illustrative Example

Let us illustrate the MVM on the following transportation problem with four sources (supplies 15, 25, 10 and 16) and five destinations (demands 10, 15, 12, 14 and 15). The cost matrix, bordered by the supplies (last column) and the demands (last row), is:

     1   3   8   7   6  |  15
     8  10   7   9   6  |  25
     2  11   6   8   9  |  10
     5   9   8   7   7  |  16
    ---------------------
    10  15  12  14  15

In each row, we identify the smallest cost and subtract it from all the row's entries. As a result, there is a zero in each row; we say that the row is reduced. We obtain the following matrix:

     0   2   7   6   5  |  15
     2   4   1   3   0  |  25
     0   9   4   6   7  |  10
     0   4   3   2   2  |  16
    ---------------------
    10  15  12  14  15

Since columns 1 and 5 already have a zero, they are already reduced. We reduce columns 2, 3 and 4 (again by subtracting the least cost of the column from all the column's entries). Then, we compute the penalties for each row and each column and start the MVM. The reduced cost matrix, with the initial penalties p_i (rows) and q_j (columns), is:

     0   0   6   4   5  |  15   p_1 = 0
     2   2   0   1   0  |  25   p_2 = 0
     0   7   3   4   7  |  10   p_3 = 3
     0   2   2   0   2  |  16   p_4 = 0
    ---------------------
    10  15  12  14  15
    q_1 = 0   q_2 = 2   q_3 = 2   q_4 = 1   q_5 = 2

(The original table also records how the penalties evolve over the iterations: after iteration 1, p_1 = 4, p_2 = 0, p_4 = 2 and q_2 = 2, q_3 = 2, q_4 = 1, q_5 = 2; after iteration 2, p_2 = 0, p_4 = 2 and q_3 = 2, q_4 = 1, q_5 = 2; after iteration 3, q_3 = 2 and q_4 = 1.)
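As a sanity check, the reduction and the initial penalties above can be reproduced mechanically. The short, self-contained Java program below is our own illustration (not part of the chapter; the class name ExampleReduction and the helper secondSmallest are ours). It reduces the 4 x 5 cost matrix of the example and prints the reduced matrix together with the penalties computed as second-smallest entries; its output matches the matrix above, with p_1 = p_2 = p_4 = 0, p_3 = 3 and q_1 = 0, q_2 = q_3 = q_5 = 2, q_4 = 1.

/** Reduces the 4 x 5 cost matrix of the illustrative example and prints the
 *  reduced matrix with the initial MVM penalties (second-smallest entry per line). */
public class ExampleReduction {

    static int secondSmallest(int[] v) {
        int min1 = Integer.MAX_VALUE, min2 = Integer.MAX_VALUE;
        for (int x : v) { if (x < min1) { min2 = min1; min1 = x; } else if (x < min2) min2 = x; }
        return min2;
    }

    public static void main(String[] args) {
        int[][] c = { {1, 3, 8, 7, 6}, {8, 10, 7, 9, 6}, {2, 11, 6, 8, 9}, {5, 9, 8, 7, 7} };
        int m = c.length, n = c[0].length;
        // row reduction followed by column reduction (Step 0 of MVM)
        for (int i = 0; i < m; i++) {
            int rowMin = Integer.MAX_VALUE;
            for (int j = 0; j < n; j++) rowMin = Math.min(rowMin, c[i][j]);
            for (int j = 0; j < n; j++) c[i][j] -= rowMin;
        }
        for (int j = 0; j < n; j++) {
            int colMin = Integer.MAX_VALUE;
            for (int i = 0; i < m; i++) colMin = Math.min(colMin, c[i][j]);
            for (int i = 0; i < m; i++) c[i][j] -= colMin;
        }
        // print reduced matrix with row penalties, then column penalties
        for (int i = 0; i < m; i++) {
            for (int j = 0; j < n; j++) System.out.printf("%3d", c[i][j]);
            System.out.printf("   p%d = %d%n", i + 1, secondSmallest(c[i]));
        }
        for (int j = 0; j < n; j++) {
            int[] col = new int[m];
            for (int i = 0; i < m; i++) col[i] = c[i][j];
            System.out.printf("q%d = %d  ", j + 1, secondSmallest(col));
        }
        System.out.println();
    }
}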

Iteration 1  Lpen = 3, leading to X_31 = 10. There is no tie for Lpen. Both the penalty line (row 3) and the complementary line (column 1) are saturated. Remove the line of largest penalty (row 3). Then, using the least initial cost in column 1, assign X_11 = 0 and remove column 1. No new reduction is needed.

Iteration 2  Lpen = 4, leading to X_12 = 15. There is no tie for Lpen. Both lines are saturated. Remove the line of largest penalty (row 1). Again, the least initial cost in column 2 leads to X_42 = 0; remove column 2. No new reduction is needed.

Iteration 3  Lpen = 2, and we have a three-way tie: row 4 and columns 3 and 5. The lowest initial cost breaks the tie: assign X_25 = 15 and remove column 5. No new reduction is needed.

Iteration 4  We assign X_44 = 14 (saturating column 4) and fill the last line (column 3), which gives X_23 = 10 and X_43 = 2. We obtain the optimal solution X_11 = 0, X_12 = 15, X_23 = 10, X_25 = 15, X_31 = 10, X_43 = 2, X_44 = 14, with total cost TC = 339. The solution is optimal for the LTP since no new reduction was ever required. Note that, in general, we would need to test optimality by evaluating the dual variables and the reduced costs of the nonbasic variables.

4 Numerical Tests

We ran tests to compare VAM and MVM. Both codes were written in Java. We solved two sets of randomly generated LTPs. The first set consists of 12 problems having the same number (5, 10 or 15) of sources and destinations. The second set contains 15 transportation problems with different numbers of sources and destinations, each ranging from three to five.

The MVM outperformed the VAM in all cases. The improvement rate, measuring by how much MVM improved the solution obtained by VAM, ranged from 0 to 20.92 %. At the same time, MVM required less computing time: 11.17 s on average for VAM, compared to 9.72 s for MVM. In all these problems, MVM provided a solution at least as good as the VAM one, and we noticed that the MVM solution is optimal whenever it equals the VAM solution. This is consistent with results from the literature [5, 7], according to which VAM provides an optimal solution at least 80 % of the time. Our tests suggest that MVM should provide optimal solutions in an even higher proportion of cases; comparing the MVM solutions with the LTP optimal ones is our next objective.

5 Conclusion

We introduced with MVM a new algorithm, based on VAM, to compute approximate solutions to LTPs. MVM always outperforms VAM without requiring significantly more time, and it provides qualitatively better starting points for the transportation algorithms. It is proven to provide optimal solutions in several cases, and this optimality can be checked directly within the MVM algorithm. In the future, we would like to identify more of, if not all, the cases in which MVM is guaranteed to provide an optimal solution. This would allow MVM to become a viable alternative to the transportation simplex.

References

1. Burkard, R.E.: Admissible transformations and assignment problems. Vietnam J. Math. 35(4), 373-386 (2007)
2. Charnes, A., Cooper, W.W.: The stepping-stone method for explaining linear programming calculations in transportation problems. Manage. Sci. 1(1), 49-69 (1954)
3. Dantzig, G.B.: Linear Programming and Extensions. Princeton University Press, Princeton (1963)
4. Diagne, S.G., Gningue, Y.: Méthode de Vogel modifiée pour la résolution des problèmes de transport simple. Appl. Math. Sci. 5(48), 2373-2388 (2011)
5. Mathirajan, M., Meenakshi, B.: Experimental analysis of some variants of Vogel's approximation method. Asia Pac. J. Oper. Res. 21(4), 447-462 (2004)
6. Reinfeld, N.V., Vogel, W.R.: Mathematical Programming. Prentice-Hall, Englewood Cliffs (1958)
7. Singh, S., Dubey, G.C., Shrivastava, R.: Optimization and analysis of some variants through Vogel's approximation method (VAM). IOSR J. Eng. (IOSRJEN) 2(9), 20-30 (2012)