L2: Algorithms: Knapsack Problem & BnB


This tutorial first covers the basic topics of creating a forms application: the common form controls and the user interface for the optimization models, algorithms and heuristics, and simulation models that will be created in the following tutorials. Secondly, an exact algorithm for the continuous knapsack problem and a heuristic for the integer knapsack problem will be implemented. Finally, the integer knapsack problem will be solved by the Branch-and-Bound (BnB) algorithm, which uses the implemented exact algorithm as the solution approach for the continuous relaxation of the integer problem. For a better understanding of the BnB algorithm and related strategies, the iterations will be carried out step by step using the form application created here. In the following tutorials, a general BnB algorithm will be implemented to automate these iterations.

(Wikipedia) The knapsack problem or rucksack problem is a problem in combinatorial optimization: Given a set of items, each with a weight and a value, determine the number of each item to include in a collection so that the total weight is less than or equal to a given limit and the total value is as large as possible. It derives its name from the problem faced by someone who is constrained by a fixed-size knapsack and must fill it with the most valuable items.

Notation
I : Set of items.
v_i : Unit value of item i.
w_i : Unit weight of item i.
W : Total weight capacity of the knapsack.
l_i : Minimum quantity of item i to be placed in the knapsack.
u_i : Available quantity of item i.
x_i : Quantity of item i placed in the knapsack (decision variable).

1.1 Mathematical Formulation

max_{x_i, i∈I}  ∑_{i∈I} v_i x_i
subject to      ∑_{i∈I} w_i x_i ≤ W,
                l_i ≤ x_i ≤ u_i,  ∀ i ∈ I.

In classical knapsack problems the x_i are integer variables and l_i = 0 for all i ∈ I. When the x_i are continuous, there exists a polynomial-time algorithm that solves the problem optimally.

1.2 Feasibility Condition

The knapsack problem is feasible if the following condition is satisfied:

∑_{i∈I} l_i w_i ≤ W.

1.3 Greedy Algorithm for Continuous Knapsack Problem

Let r_i = v_i / w_i, i ∈ I, be the value-to-weight ratio of item i. The following algorithm optimally solves the knapsack problem when the decision variables are continuous.

S0. Set x_i = l_i for all i ∈ I; W̄ = ∑_{i∈I} l_i w_i; V̄ = ∑_{i∈I} l_i v_i. If W̄ > W then stop: the problem is infeasible.
S1. While I ≠ ∅ do
      i* = argmax_{i∈I} r_i
      x_{i*} = x_{i*} + min{ u_{i*} − l_{i*}, (W − W̄) / w_{i*} }
      W̄ = W̄ + (x_{i*} − l_{i*}) w_{i*}
      V̄ = V̄ + (x_{i*} − l_{i*}) v_{i*}
      I = I ∖ {i*}
    EndWhile
S2. Return x_i, W̄, V̄.
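As a preview of the implementation in Section 4.1, the algorithm above can be written as a short C# method. The following is a minimal, self-contained sketch on plain arrays; the class name, method signature, and variable names are illustrative assumptions, not the tutorial's exact listing.

```csharp
using System;
using System.Linq;

static class ContinuousKnapsackSketch
{
    // Greedy algorithm of Section 1.3 (sketch). Returns the solution x and,
    // through Ws/Vs, the total weight and value of the packed knapsack.
    public static double[] Solve(double[] v, double[] w, double[] l, double[] u,
                                 double W, out double Ws, out double Vs)
    {
        int n = v.Length;
        double[] x = new double[n];
        double weight = 0.0, value = 0.0;

        // S0: start every item at its lower bound and check feasibility.
        for (int i = 0; i < n; i++)
        {
            x[i] = l[i];
            weight += l[i] * w[i];
            value += l[i] * v[i];
        }
        if (weight > W)
            throw new InvalidOperationException("The problem is infeasible.");

        // S1: fill the remaining capacity in decreasing order of r_i = v_i / w_i.
        foreach (int i in Enumerable.Range(0, n).OrderByDescending(k => v[k] / w[k]))
        {
            double add = Math.Min(u[i] - l[i], (W - weight) / w[i]);
            x[i] += add;
            weight += add * w[i];
            value += add * v[i];
        }

        // S2: report the accumulated weight and value.
        Ws = weight; Vs = value;
        return x;
    }
}
```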

1.4 Greedy Algorithm for Integer Knapsack Problem

The operation ⌊a⌋ rounds the real number a down to the nearest integer. In the integer knapsack problem we assume that the bounds are integer. The following algorithm provides a polynomial-time discrete solution to the integer knapsack problem; however, the solution is not necessarily optimal.

S0. Set x_i = l_i for all i ∈ I; W̄ = ∑_{i∈I} l_i w_i; V̄ = ∑_{i∈I} l_i v_i. If W̄ > W then stop: the problem is infeasible.
S1. While I ≠ ∅ do
      i* = argmax_{i∈I} r_i
      x_{i*} = x_{i*} + ⌊ min{ u_{i*} − l_{i*}, (W − W̄) / w_{i*} } ⌋
      W̄ = W̄ + (x_{i*} − l_{i*}) w_{i*}
      V̄ = V̄ + (x_{i*} − l_{i*}) v_{i*}
      I = I ∖ {i*}
    EndWhile
S2. Return x_i, W̄, V̄.

STEP 1:

Your program will be created with a single class, Program.cs, and a single form, Form1.cs. To run your program, press the Start button (or F5). You will see an empty form. The program stops when the form is closed. You can manually resize the form using the [Design] view.

STEP 1: Click the Toolbox in the left menu (if you cannot see the Toolbox, click View > Toolbox from the top menu); select All Windows Forms; drag and drop a DataGridView, a Label (twice), a TextBox, and a Button (twice) into the form.

2.1 Label

(msdn.microsoft.com) Label represents a standard Windows label.

STEP 2: Select label1 in the form; right-click and click Properties. In the Properties window, change the Text property from label1 to W.

STEP 3: Select label2 in the form; right-click and click Properties. In the Properties window, change the Name property from label2 to lblOutput, and the AutoSize property from True to False.

The Text property of a Label control is the text displayed in the form, while the Name property will be used to call the control from code. Switching the AutoSize property to False allows you to edit and fix the size of the control.

2.2 TextBox

(msdn.microsoft.com) You can both display text and retrieve text from users by using a TextBox control.

STEP 4: Select the TextBox in the form; right-click and click Properties. In the Properties window, change the Name property from textBox1 to tbxW.

2.3 Button

(msdn.microsoft.com) A Button control reacts to the ButtonBase.Click event.

STEP 5: Select button1 in the form; right-click and click Properties. In the Properties window, change the Name property from button1 to btnCont, and the Text property from button1 to Continuous. Select button2 in the form; right-click and click Properties. In the Properties window, change the Name property from button2 to btnInt, and the Text property from button2 to Integer.

The first button, btnCont, will be used to run the Greedy algorithm for the continuous problem, and btnInt will be used to run the Greedy heuristic for the integer problem.

2.4 DataGridView

(msdn.microsoft.com) The DataGridView control provides a powerful and flexible way to display data in a tabular format. You can use the DataGridView control to show read-only views of a small amount of data, or you can scale it to show editable views of very large sets of data.

STEP 6: Select the DataGridView in the form; right-click and click Properties. In the Properties window, change the Name property from dataGridView1 to dgv. In the Properties window, click the Columns (...) button; the Edit Columns window will appear. Click Add to add a column for the weight (w). Repeat the last step to add columns for the value, lower bound, upper bound, and quantity.

The DataGridView will be used to enter the data of the knapsack problem and to display the solution (x).
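The tutorial adds the columns through the designer's Edit Columns dialog. As a hedged alternative sketch, the same five columns could also be created in code, for example right after InitializeComponent() in the Form1 constructor; the column names below are assumptions chosen to match the headers described above.

```csharp
// Alternative to the Edit Columns dialog: create the columns in code.
// Call this once, e.g. right after InitializeComponent() in Form1's constructor.
private void AddGridColumns()
{
    dgv.Columns.Add("w", "w (weight)");
    dgv.Columns.Add("v", "v (value)");
    dgv.Columns.Add("l", "l (lower bound)");
    dgv.Columns.Add("u", "u (upper bound)");
    dgv.Columns.Add("x", "x (quantity)");
}
```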

STEP 7: Resize the form, and resize and relocate the controls properly. Run your program (F5) and check the controls (test adding rows to the DataGridView).

3.1 Data Representation

The data of the knapsack problem can be represented by a list of Item objects, where the Item class holds the variables for the weight, value, lower bound, and upper bound. We will also create a field for the value-to-weight ratio.

STEP 8: Right-click L2 in the Solution Explorer; click Add > Class; and add the Item class with the following fields, constructor, and methods. The index field will represent the index of the item in the DataGridView, and hence, in the item list.
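The code listing for STEP 8 is not reproduced in this transcription, so the sketch below shows one plausible shape of the Item class with the fields described above; apart from getR(), which is referenced later in the text, the member names are assumptions.

```csharp
// Sketch of the Item class added in STEP 8 (the original listing is not shown here).
public class Item
{
    public int index;  // index of the item in the DataGridView and the item list
    public double w;   // unit weight
    public double v;   // unit value
    public double l;   // lower bound (minimum quantity to be placed)
    public double u;   // upper bound (available quantity)
    public double x;   // quantity placed in the knapsack (filled by the algorithms)

    public Item(int index, double w, double v, double l, double u)
    {
        this.index = index;
        this.w = w;
        this.v = v;
        this.l = l;
        this.u = u;
        this.x = 0.0;
    }

    // Value-to-weight ratio r_i = v_i / w_i used by the greedy algorithms.
    public double getR()
    {
        return v / w;
    }
}
```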

3.2 Code within the Windows Form

STEP 9: Right-click Form1.cs in the Solution Explorer and click View Code. You will see the code of the form class, with a single constructor that calls the auto-generated InitializeComponent method. Notice the inheritance declaration (Form1 : Form), which states that Form1 inherits from the Form class.

To represent the data of the knapsack problem, we will add a list of Item objects and a double for the capacity W. Finally, we will add a method to read the data from the form.

STEP 10: Add the following fields and method in Form1.cs. Note that we use the Name properties (tbxW, dgv) to call the controls in the form. tbxW.Text returns the user input in the TextBox. The line this.items = new List<Item>(); creates an empty list of Item objects; therefore, if the list contained Item objects before the readData method is called, the list is cleared at this line. dgv.Rows.Count returns the number of rows of the DataGridView; the -1 is to exclude the last, empty row of the DataGridView that is reserved for new entries. dgv.Rows[i].Cells[j].Value returns the value entered in row i and column j of the DataGridView.
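Since the STEP 10 listing itself is only described above, here is a hedged sketch of the two fields and the readData() method, assuming the Item class sketched earlier and the column order of STEP 6 (w, v, l, u, x); it belongs inside the Form1 class.

```csharp
// Fields and readData() method described in STEP 10 (sketch; goes inside Form1).
List<Item> items;
double W;

private void readData()
{
    // Capacity entered by the user in the TextBox.
    this.W = Convert.ToDouble(tbxW.Text);

    // Any previously read items are discarded here.
    this.items = new List<Item>();

    // dgv.Rows.Count - 1 skips the empty last row reserved for new entries.
    for (int i = 0; i < dgv.Rows.Count - 1; i++)
    {
        double w = Convert.ToDouble(dgv.Rows[i].Cells[0].Value);
        double v = Convert.ToDouble(dgv.Rows[i].Cells[1].Value);
        double l = Convert.ToDouble(dgv.Rows[i].Cells[2].Value);
        double u = Convert.ToDouble(dgv.Rows[i].Cells[3].Value);
        this.items.Add(new Item(i, w, v, l, u));
    }
}
```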

3.3 Button Click Event

STEP 11: Go back to Form1.cs [Design] and double-click the Continuous button. The method btnCont_Click will automatically be added in Form1.cs. Add the readData(); line in the method and add a breakpoint at this line (F9). Run the program (F5); add two items and enter the capacity as shown below. Then click the Continuous button. The program will break at the readData() line. Press F10 to Step Over (or press F11 to Step Into and check the code line by line). Once the program leaves the readData() line, check whether the data is read correctly (place the cursor on the items and W fields to see their current values). Then press F5 to resume the program and go back to the form. Close the form and remove the breakpoint (F9).

4.1 Greedy Algorithm for Continuous Knapsack Problem

STEP 12: Add the code of the Greedy Algorithm for Continuous Knapsack Problem (Section 1.3) in the btnCont_Click method. Check the equivalence of the code with the algorithm given in Section 1.3; the Ws and Vs variables correspond to W̄ and V̄ in the algorithm. // can be used to add comment lines to your code. The line List<Item> list = this.items.OrderByDescending(t => t.getR()).ToList(); creates a new list called list of Item objects in which the items are ordered in descending order of the values returned by their getR() method. OrderByDescending (or OrderBy for ascending) is a method for lists that orders objects with respect to the desired field(s). Since the list is ordered, the first item, i.e., list[0], is the item with the maximum value-to-weight ratio. The output is written to lblOutput (either "infeasible" or the W̄ and V̄ values) and to the x column of the DataGridView.
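The STEP 12 listing is not included in this transcription; the sketch below shows one way the handler could look, combining the readData() and Item sketches above with the OrderByDescending line quoted in the text. Treat it as an illustration of the approach, not the tutorial's exact code.

```csharp
// Sketch of the Continuous button handler (STEP 12).
private void btnCont_Click(object sender, EventArgs e)
{
    readData();

    // S0: start at the lower bounds and accumulate their weight and value.
    double Ws = 0.0, Vs = 0.0;
    foreach (Item t in this.items) { t.x = t.l; Ws += t.l * t.w; Vs += t.l * t.v; }
    if (Ws > W) { lblOutput.Text = "The problem is infeasible."; return; }

    // S1: items in descending order of the value-to-weight ratio.
    List<Item> list = this.items.OrderByDescending(t => t.getR()).ToList();
    foreach (Item t in list)
    {
        double add = Math.Min(t.u - t.l, (W - Ws) / t.w);
        t.x += add;
        Ws += add * t.w;
        Vs += add * t.v;
    }

    // S2: write the result to the label and to the x column of the grid.
    lblOutput.Text = "W = " + Ws + " ; V = " + Vs;
    foreach (Item t in this.items)
        dgv.Rows[t.index].Cells[4].Value = t.x;
}
```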

STEP 13: Test the code of the Greedy Algorithm for Continuous Knapsack Problem by entering the following inputs and pressing the Continuous button. Then update both of the lower bounds to 3 and press the Continuous button again.

4.2 Greedy Algorithm for Integer Knapsack Problem

STEP 14: Go back to Form1.cs [Design] and double-click the Integer button. The method btnInt_Click will automatically be added in Form1.cs. Add the readData(); line and the code of the Greedy Algorithm for Integer Knapsack Problem (Section 1.4) in the btnInt_Click method. Note that the only difference from the continuous version is the use of the Floor function of the Math library, which is equivalent to the ⌊a⌋ operation defined in Section 1.4. Also, the output message is slightly different.
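For STEP 14, only the statement that increases x needs to change relative to the continuous handler sketched above. A hedged sketch of that change, using the same assumed variable names:

```csharp
// Integer greedy (Section 1.4): floor the added quantity so x stays integer.
double add = Math.Floor(Math.Min(t.u - t.l, (W - Ws) / t.w));
t.x += add;
Ws += add * t.w;
Vs += add * t.v;
```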

STEP 15: Test the code of the Greedy Algorithm for Integer Knapsack Problem by entering the following inputs and pressing the Integer button. Then update both of the lower bounds to 3 and press the Integer button again.

(Wikipedia) Branch and bound (BB, B&B, or BnB) is an algorithm design paradigm for discrete and combinatorial optimization problems, as well as general real-valued problems. A branch-and-bound algorithm consists of a systematic enumeration of candidate solutions by means of state space search: the set of candidate solutions is thought of as forming a rooted tree with the full set at the root. The algorithm explores branches of this tree, which represent subsets of the solution set. Before enumerating the candidate solutions of a branch, the branch is checked against upper and lower estimated bounds on the optimal solution, and is discarded if it cannot produce a better solution than the best one found so far by the algorithm.

In this tutorial, we will focus on using the BnB algorithm to solve a program with integer variables for which it is efficient to solve the program's continuous relaxation.

Notation
I : Set of integer variables of the original problem.
i, j : Variable indices for the integer variables, i, j ∈ I.
x_i : Integer variable, i ∈ I.
l_i : Lower bound of the integer variable x_i in the original problem.
u_i : Upper bound of the integer variable x_i in the original problem.
N : Set of active nodes in the branch-and-bound tree.
n, m : Node indices for the nodes in the BnB tree; n, m ∈ N.
z̄_n : Lower bound on the objective function value attainable at node n.
l_ni : Lower bound for variable x_i at node n.
u_ni : Upper bound for variable x_i at node n.
z_n : Optimal objective function value of the continuous relaxation at node n.
z* : Objective function value of the best integer feasible solution found during the execution of the algorithm.
N* : Set of nodes that provide the best integer feasible solution(s).

5.1 The Algorithm

The following BnB algorithm finds the optimal solution to a minimization problem with integer variables by sequentially solving its continuous relaxations.

S0.
1.  z* = ∞ and N* = ∅.
2.  Create the root node n with:
3.    l_ni = l_i; u_ni = u_i, ∀ i ∈ I.
4.    z̄_n = −∞.
5.  N = {n}.
S1.
6.  While N ≠ ∅ do
7.    Select a node m in N.
8.    N = N ∖ {m}.
9.    Solve the relaxed problem with the bounds l_mi and u_mi.
10.   If the relaxation is feasible then
11.     z_m is the optimal obj. function value of the relaxation at node m.
12.     If z_m ≤ z* then
13.       x_mi is the optimal value of x_i in the relaxation at node m.
14.       I_frac = {i ∈ I : x_mi ∉ Z}.
15.       If I_frac = ∅ then
16.         If z_m = z* then
17.           N* = N* ∪ {m}
18.         ElseIf z_m < z* then
19.           N* = {m}
20.           z* = z_m
21.           N = N ∖ {n ∈ N : z̄_n > z*}
22.         EndIf
23.       ElseIf I_frac ≠ ∅ then
24.         Select a variable index j from I_frac.
25.         Create the child node n′ with z̄_{n′} = z_m and
26.           l_{n′i} = l_mi, u_{n′i} = u_mi, ∀ i ∈ I ∖ {j}; l_{n′j} = l_mj, u_{n′j} = ⌊x_mj⌋.
27.         Create the child node n″ with z̄_{n″} = z_m and
28.           l_{n″i} = l_mi, u_{n″i} = u_mi, ∀ i ∈ I ∖ {j}; l_{n″j} = ⌈x_mj⌉, u_{n″j} = u_mj.
29.         N = N ∪ {n′, n″}.
30.       EndIf
31.     EndIf
32.   EndIf
33. EndWhile
S2. Return z*, N*.

In the initialization step, the following operations are carried out. Since there is no integer feasible solution yet, the best feasible objective function value z* is set to ∞ and the set of optimal nodes N* is initialized as an empty set (line 1). The root node is created (lines 2-4); it uses the bounds of the original problem, and its best possible objective function value z̄_n is set to −∞. The node queue is initialized containing only the root node (line 5).

The algorithm keeps only the nodes that might improve the best feasible objective function value, and it terminates when no such node exists (line 6). At each iteration of the While loop, a node m is selected and removed from the node set N, and the relaxation of the problem with respect to the bounds of this node is solved (lines 7-9).

If the relaxation of the problem is infeasible, the discrete version of the problem is certainly infeasible as well; in this case no further operation is needed and the node is eliminated (line 10). If the relaxation is feasible but the optimal objective function value of the relaxation z_m is worse than the best feasible solution found so far, the node is again eliminated (line 12).

On the other hand, if z_m ≤ z*, the node and its children may improve the best solution found so far. In this case we first check the integer feasibility of the solution (line 15). Note that I_frac is the set of variable indices for which the variable takes a fractional value (Z is the set of integers); therefore, whenever I_frac = ∅, the solution is integer feasible.

If an integer feasible solution is found at node m (line 15), there is no need to create its children. However, its objective function value is compared with the best objective function value found so far. If z_m = z*, the node is added to the set of nodes providing the best integer feasible solutions (line 17). On the other hand, if z_m < z*, all nodes previously added to N* are removed since they cannot be optimal (line 19), and the best objective function value is updated (line 20). Moreover, due to the strict improvement in the best objective function value, the nodes in the queue N are checked as well: if a node n ∈ N has z̄_n > z*, the children of that node cannot lead to an optimal solution, so such nodes are fathomed (line 21).

Finally, whenever the solution of the relaxation is feasible and the optimal objective function value promises improvement (z_m ≤ z*) but the solution of the relaxation is not integer feasible (line 23), two children of node m are created. A variable with a fractional value, j, is selected from I_frac. Both children (n′ and n″) inherit the lower and upper bounds of the parent m, except that u_{n′j} = ⌊x_mj⌋ for node n′ and l_{n″j} = ⌈x_mj⌉ for node n″ (lines 26 and 28). Moreover, the best possible objective function value that these children can lead to is z_m; therefore, we set z̄_{n′} = z̄_{n″} = z_m (lines 25 and 27). Finally, these children are added to the queue (line 29).
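The next tutorial implements a generic BnB algorithm; as a preview, the sketch below shows one possible shape of the node data and the main loop for the minimization form of the algorithm above. Everything here (class names, the relax delegate, FIFO node selection, the fractionality tolerance) is an assumption made for illustration, not the tutorial's code.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative sketch of the BnB algorithm of Section 5.1 in its minimization form.
class Node
{
    public double zBar;   // bound inherited from the parent relaxation (z̄_n)
    public double[] l;    // lower bounds l_ni at this node
    public double[] u;    // upper bounds u_ni at this node
}

class RelaxationResult
{
    public bool Feasible;
    public double z;      // optimal value z_n of the continuous relaxation
    public double[] x;    // optimal relaxed solution at the node
}

static class BranchAndBoundSketch
{
    // relax(l, u) must solve the continuous relaxation under the given bounds,
    // e.g. the greedy algorithm of Section 1.3 for the knapsack problem.
    public static double Solve(double[] l0, double[] u0,
                               Func<double[], double[], RelaxationResult> relax,
                               out List<Node> bestNodes)
    {
        double zStar = double.PositiveInfinity;                        // line 1
        bestNodes = new List<Node>();                                  // N*
        var N = new List<Node>
        {
            new Node { zBar = double.NegativeInfinity,                 // lines 2-5
                       l = (double[])l0.Clone(), u = (double[])u0.Clone() }
        };

        while (N.Count > 0)                                            // line 6
        {
            Node m = N[0]; N.RemoveAt(0);                              // lines 7-8 (FIFO)
            RelaxationResult r = relax(m.l, m.u);                      // line 9
            if (!r.Feasible || r.z > zStar) continue;                  // lines 10 and 12

            // Index of a fractional variable; -1 means integer feasible.
            int j = Array.FindIndex(r.x, xi => Math.Abs(xi - Math.Round(xi)) > 1e-9);
            if (j < 0)                                                 // line 15
            {
                if (r.z < zStar)                                       // lines 18-21
                {
                    bestNodes.Clear();
                    zStar = r.z;
                    N.RemoveAll(n => n.zBar > zStar);
                }
                bestNodes.Add(m);                                      // lines 17 and 19
            }
            else                                                       // lines 23-29
            {
                var lower = new Node { zBar = r.z, l = (double[])m.l.Clone(),
                                       u = (double[])m.u.Clone() };
                var upper = new Node { zBar = r.z, l = (double[])m.l.Clone(),
                                       u = (double[])m.u.Clone() };
                lower.u[j] = Math.Floor(r.x[j]);                       // line 26
                upper.l[j] = Math.Ceiling(r.x[j]);                     // line 28
                N.Add(lower);
                N.Add(upper);
            }
        }
        return zStar;                                                  // S2
    }
}
```

For the knapsack exercise below, relax would wrap the greedy algorithm of Section 1.3 with the objective negated, since the exercise maximizes while this skeleton minimizes.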

When the algorithm terminates, z* corresponds to the optimal objective function value of the original program with integer variables. Moreover, the continuous relaxation solution of each node in N* provides an optimal solution to the original problem.

5.2 Exercise 1

Note that the Greedy Algorithm for Continuous Knapsack Problem efficiently solves the LP relaxation of the integer knapsack problem. Therefore, the optimal solution of the integer knapsack problem can be obtained by the BnB algorithm where the continuous algorithm is used to solve the relaxation at the nodes (line 9). Moreover, the Windows Forms application created in the previous sections can be used to update the lower and upper bounds using the DataGridView and to obtain the solution of the relaxations using the Continuous button. Since the knapsack problem is a maximization problem, the algorithm of Section 5.1 is applied in the exercise below with the inequalities reversed: z* is initialized to −∞, z̄_n is an upper bound on the objective value attainable at node n, a node is explored only if z_m ≥ z*, and a node n in the queue is fathomed when z̄_n < z*.

STEP 16: The original problem with integer x_i variables has the following data:

Note that the integer feasible solution obtained by the Greedy Algorithm for Integer Knapsack Problem has value 37.

STEP 17: Initialization step:
1. z* = −∞, N* = ∅.
2. Create node 0 with l_01 = 0, u_01 = 5; l_02 = 0, u_02 = 3; z̄_0 = +∞.
5. N = {0}.

First iteration:
7. Select node 0 from N = {0}.
8. N = N ∖ {0} = ∅.
9. Solve the relaxation (Continuous button):
10. The relaxation is feasible.
11. z_0 = 46.
12. z_0 = 46 ≥ z* = −∞.
14. I_frac = {1}.
24. Select variable index 1 from I_frac.
25. Create lower branch node 1 with l_11 = 0, u_11 = 1; l_12 = 0, u_12 = 3; z̄_1 = z_0 = 46.
27. Create upper branch node 2 with l_21 = 2, u_21 = 5; l_22 = 0, u_22 = 3; z̄_2 = z_0 = 46.
29. Add the children to the queue: N = N ∪ {1, 2} = {1, 2}.

STEP 18: Second iteration:
7. Select node 1 from N = {1, 2}.
8. N = N ∖ {1} = {2}.
9. Solve the relaxation (update the bounds and use the Continuous button):
10. The relaxation is feasible.
11. z_1 = 37.
12. z_1 = 37 ≥ z* = −∞.
14. I_frac = ∅.
18. z_1 = 37 > z* = −∞.
19. N* = {1}.
20. z* = z_1 = 37.
21. Check whether the nodes in N can be fathomed: N = N ∖ {n ∈ N : z̄_n < z* = 37} = {2} ∖ ∅ = {2}.

At the second iteration an integer feasible solution is obtained. However, none of the nodes in N could be fathomed, since node 2 can still lead to a better solution (z̄_2 = 46 ≥ z* = 37). Note that the solution at this node is the solution obtained by the Greedy Heuristic.

STEP 19: Third iteration:
7. Select node 2 from N = {2}.
8. N = N ∖ {2} = ∅.
9. Solve the relaxation (update the bounds and use the Continuous button):
10. The relaxation is feasible.
11. z_2 is the optimal objective function value of the relaxation at node 2.
12. z_2 ≥ z* = 37.
14. I_frac = {2}.
24. Select variable index 2 from I_frac.
25. Create lower branch node 3 with l_31 = 2, u_31 = 5; l_32 = 0, u_32 = 2; z̄_3 = z_2.
27. Create upper branch node 4 with l_41 = 2, u_41 = 5; l_42 = 3, u_42 = 3; z̄_4 = z_2.
29. Add the children to the queue: N = N ∪ {3, 4} = {3, 4}.

Note that the children inherit the bounds of the parent node (l_31 = l_41 = 2), except for the new bounds on the second variable (u_32 = 2 and l_42 = 3).

STEP 20: Fourth iteration:
7. Select node 3 from N = {3, 4}.
8. N = N ∖ {3} = {4}.
9. Solve the relaxation (update the bounds and use the Continuous button):
10. The relaxation is feasible.
11. z_3 = 44.
12. z_3 = 44 ≥ z* = 37.
14. I_frac = {1}.
24. Select variable index 1 from I_frac.
25. Create lower branch node 5 with l_51 = 2, u_51 = 2; l_52 = 0, u_52 = 2; z̄_5 = z_3 = 44.
27. Create upper branch node 6 with l_61 = 3, u_61 = 5; l_62 = 0, u_62 = 2; z̄_6 = z_3 = 44.
29. Add the children to the queue: N = N ∪ {5, 6} = {4, 5, 6}.

STEP 21: Fifth iteration:
7. Select node 4 from N = {4, 5, 6}.
8. N = N ∖ {4} = {5, 6}.
9. Solve the relaxation (update the bounds and use the Continuous button):
10. The relaxation is infeasible.

STEP 22: Sixth iteration:
7. Select node 5 from N = {5, 6}.
8. N = N ∖ {5} = {6}.
9. Solve the relaxation (update the bounds and use the Continuous button):
10. The relaxation is feasible.
11. z_5 = 38.
12. z_5 = 38 ≥ z* = 37.
14. I_frac = ∅.
18. z_5 = 38 > z* = 37.
19. N* = {5}.
20. z* = z_5 = 38.
21. Check whether the nodes in N can be fathomed: N = N ∖ {n ∈ N : z̄_n < z* = 38} = {6} ∖ ∅ = {6}.

Note that the solution at node 1, i.e., the solution of the Greedy Heuristic (z_1 = 37), is not optimal, and hence node 1 is removed from N* (line 19). Since z̄_6 = 44 ≥ z*, node 6 cannot be fathomed.

STEP 23: Seventh iteration:
7. Select node 6 from N = {6}.
8. N = N ∖ {6} = ∅.
9. Solve the relaxation (update the bounds and use the Continuous button):
10. The relaxation is feasible.
11. z_6 is the optimal objective function value of the relaxation at node 6.
12. z_6 ≥ z* = 38.
14. I_frac = {2}.
24. Select variable index 2 from I_frac.
25. Create lower branch node 7 with l_71 = 3, u_71 = 5; l_72 = 0, u_72 = 1; z̄_7 = z_6.
27. Create upper branch node 8 with l_81 = 3, u_81 = 5; l_82 = 2, u_82 = 2; z̄_8 = z_6.
29. Add the children to the queue: N = N ∪ {7, 8} = {7, 8}.

STEP 24: Eighth iteration:
7. Select node 7 from N = {7, 8}.
8. N = N ∖ {7} = {8}.
9. Solve the relaxation (update the bounds and use the Continuous button):
10. The relaxation is feasible.
11. z_7 = 42.
12. z_7 = 42 ≥ z* = 38.
14. I_frac = {1}.
24. Select variable index 1 from I_frac.
25. Create lower branch node 9 with l_91 = 3, u_91 = 3; l_92 = 0, u_92 = 1; z̄_9 = z_7 = 42.
27. Create upper branch node 10 with l_10,1 = 4, u_10,1 = 5; l_10,2 = 0, u_10,2 = 1; z̄_10 = z_7 = 42.
29. Add the children to the queue: N = N ∪ {9, 10} = {8, 9, 10}.

STEP 25: Ninth iteration:
7. Select node 8 from N = {8, 9, 10}.
8. N = N ∖ {8} = {9, 10}.
9. Solve the relaxation (update the bounds and use the Continuous button):
10. The relaxation is infeasible.

STEP 26: Tenth iteration:
7. Select node 9 from N = {9, 10}.
8. N = N ∖ {9} = {10}.
9. Solve the relaxation (update the bounds and use the Continuous button):
10. The relaxation is feasible.
11. z_9 = 39.
12. z_9 = 39 ≥ z* = 38.
14. I_frac = ∅.
18. z_9 = 39 > z* = 38.
19. N* = {9}.
20. z* = z_9 = 39.
21. Check whether the nodes in N can be fathomed: N = N ∖ {n ∈ N : z̄_n < z* = 39} = {10} ∖ ∅ = {10}.

Since z̄_10 = 42 ≥ z*, node 10 cannot be fathomed.

STEP 27: Eleventh iteration:
7. Select node 10 from N = {10}.
8. N = N ∖ {10} = ∅.
9. Solve the relaxation (update the bounds and use the Continuous button):
10. The relaxation is feasible.
11. z_10 = 40.
12. z_10 = 40 ≥ z* = 39.
14. I_frac = ∅.
18. z_10 = 40 > z* = 39.
19. N* = {10}.
20. z* = z_10 = 40.
21. Check whether the nodes in N can be fathomed: N = N ∖ {n ∈ N : z̄_n < z* = 40} = ∅ ∖ ∅ = ∅.

Since N = ∅, the algorithm terminates at the eleventh iteration. Since N* = {10}, the relaxed solution at node 10 is the unique optimal solution of the original integer knapsack problem, with x_1 = 4, x_2 = 0 and optimal objective function value 40.

The BnB tree of the exercise is summarized in the figure below:

5.3 Discussion on BnB Strategies

In the exercise in Section 5.2, we selected the first element of the set N at each iteration (first-in-first-out). This is not a common strategy. Two classical strategies are the best-first and depth-first strategies, and a combination of the two is commonly used. The depth-first (last-in-first-out) strategy aims to select a node whose children will soon lead to an integer feasible solution; finding good integer feasible solutions in the early iterations of the algorithm improves the bounds and helps fathom more nodes. The best-first strategy, on the other hand, selects the node with the most promising bound z̄_n. While this strategy is promising for finding a good integer feasible solution, the algorithm may require a greater number of iterations to find the initial incumbent bounds. Implementing a good node selection strategy is not trivial. There exist heuristics that may help improve the convergence of the BnB algorithm, and problem-specific information might help in implementing strategies that work well for the particular problem. A code sketch of the selection rules is given after this section.

In the exercise, we always observed at most a single variable attaining a fractional value (|I_frac| ≤ 1) due to the nature of the knapsack problem and the given bounds. However, this is generally not the case, and the modeler needs to decide which variable to branch on. Implementing a good variable selection strategy is another nontrivial problem.

Finally, note that the BnB algorithm might solve a great number of relaxations of the original problem. However, when there exists an efficient approach to solve the relaxed problem, and when the node and variable selection strategies are successfully implemented, the algorithm might converge considerably fast.
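As a small illustration of the node selection discussion above, line 7 of the algorithm is the only place where the strategies differ. The sketch below reuses the illustrative Node type and zBar field from the earlier BnB skeleton (minimization); the method and strategy names are assumptions.

```csharp
// Line 7 of Section 5.1 under different node-selection strategies (sketch).
Node SelectNode(List<Node> N, string strategy)
{
    switch (strategy)
    {
        case "fifo":        // the rule used in the exercise: first node in the queue
            return N[0];
        case "depth-first": // last-in-first-out: take the most recently created node
            return N[N.Count - 1];
        case "best-first":  // most promising bound (smallest z̄_n for minimization)
            return N.OrderBy(n => n.zBar).First();
        default:
            throw new ArgumentException("unknown strategy");
    }
}
```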

5.4 Exercise 2

In the next tutorial, we will implement a generic BnB algorithm. Therefore, it is important to understand how the algorithm works and how its strategies affect convergence on a well-known example before developing the generic code. Below is the data of another integer knapsack problem to practice on. Think about how the steps of the BnB algorithm change when all integer variables of the problem are binary variables, or when some of the integer variables have negative lower bounds.
