Discrete (and Continuous) Optimization WI4 131


Discrete (and Continuous) Optimization WI4 131
Kees Roos
Technische Universiteit Delft
Faculteit Elektrotechniek, Wiskunde en Informatica
Afdeling Informatie, Systemen en Algoritmiek
URL: roos
November-December, A.D. 2004

Course Schedule
1. Formulations (18 pages)
2. Optimality, Relaxation, and Bounds (10 pages)
3. Well-solved Problems (13 pages)
4. Matching and Assignments (10 pages)
5. Dynamic Programming (11 pages)
6. Complexity and Problem Reduction (8 pages)
7. Branch and Bound (17 pages)
8. Cutting Plane Algorithms (21 pages)
9. Strong Valid Inequalities (22 pages)
10. Lagrangian Duality (14 pages)
11. Column Generation Algorithms (16 pages)
12. Heuristic Algorithms (15 pages)
13. From Theory to Solutions (20 pages)
Optimization Group

Chapter 12: Heuristic Algorithms

Introduction
Many practical problems are NP-hard. In these cases one may choose (or even be forced) to use a heuristic or approximation algorithm: a smart method that hopefully finds a good feasible solution quickly. In designing a heuristic, various questions arise:
Should one accept any feasible solution, or should one ask a posteriori how far it is from optimal?
Can one guarantee a priori that the heuristic produces a solution within ε (or α%) of optimal?
Can one guarantee a priori that, for the class of problems considered, the heuristic on average produces a solution within ε (or α%) of optimal?

Greedy and Local Search Revisited
We suppose that the problem can be written as a COP in the form:
min { c(S) : v(S) ≥ k, S ⊆ N }.
Consider for example the 0-1 knapsack problem:
z = max { Σ_{j=1}^n c_j x_j : Σ_{j=1}^n a_j x_j ≤ b, x ∈ {0,1}^n }.
Here x is the indicator vector of the set S. Thus, defining
c(S) = Σ_{j∈S} c_j,  v(S) = Σ_{j∈S} a_j,  k = b,
the 0-1 knapsack problem gets the above form. Also the uncapacitated facility location problem
min { Σ_{i∈M} Σ_{j∈N} c_ij x_ij + Σ_{j∈N} f_j y_j : Σ_{j=1}^n x_ij = 1 (i ∈ M), x_ij ≤ y_j, x_ij ∈ [0,1], y_j ∈ {0,1} }
fits in this model if we take
c(S) = Σ_{i∈M} min_{j∈S} c_ij + Σ_{j∈S} f_j,  v(S) = |S|,  k = 1.

Greedy Heuristic
For min { c(S) : v(S) ≥ k, S ⊆ N }:
Step 1: Set S^0 = ∅ and t = 1.
Step 2: Set
j_t = argmin_{j ∉ S^{t-1}} [ c(S^{t-1} ∪ {j}) - c(S^{t-1}) ] / [ v(S^{t-1} ∪ {j}) - v(S^{t-1}) ].
Step 3: If S^{t-1} is feasible, and the cost does not decrease when passing to S^t = S^{t-1} ∪ {j_t}, stop with S^G = S^{t-1}.
Step 4: Otherwise, set S^t = S^{t-1} ∪ {j_t}. If S^t is feasible and the cost function is nondecreasing, or t = n: stop with S^G = S^t.
Step 5: If t = n, no feasible solution has been found. Stop.
Step 6: Set t ← t + 1, and return to Step 2.
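The steps above can be sketched in a few lines of Python. This is a simplified sketch, not the book's pseudocode: the names (`greedy`, `c`, `v`, `k`) are mine, the set functions are supplied by the caller, and Steps 3-5 are compressed into one loop.

```python
def greedy(n, c, v, k):
    """Greedy heuristic sketch for min {c(S) : v(S) >= k, S subset of {0,...,n-1}}.

    c and v are set functions; k is the coverage threshold.
    Returns a feasible set, or None if none was found (Step 5)."""
    S = frozenset()
    remaining = set(range(n))
    while remaining:
        # Step 2: element with the smallest cost increase per unit of coverage gained
        def ratio(j):
            dv = v(S | {j}) - v(S)
            return (c(S | {j}) - c(S)) / dv if dv > 0 else float("inf")
        j = min(remaining, key=ratio)
        # Step 3: already feasible, and adding j would not decrease the cost: stop
        if v(S) >= k and c(S | {j}) >= c(S):
            return S
        # Step 4: otherwise add the element and continue
        S = S | {j}
        remaining.discard(j)
    return S if v(S) >= k else None
```

With additive c and v this reduces to the classical ratio rule for covering-type problems.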

Example: Uncapacitated Facility Location
Consider the UFL with m = 6 clients and n = 4 depots, and costs as shown below:
(c_ij) = [6×4 cost matrix]   and   (f_j) = (21, 16, 11, 24)
Greedy solution:
Step 1: Set S^0 = ∅ and t = 1. S^0 is infeasible.
Step 2: We compute j_1 = argmin_j c({j}) / v({j}). One has c({1}) = (…) + 21 = 63, c({2}) = 66, c({3}) = 31, c({4}) = 64. So j_1 = 3, S^1 = {3} and c(S^1) = 31. S^1 is feasible.
Step 3: Since S^1 is feasible, we check whether the cost decreases when passing to S^2 = S^1 ∪ {j} for some j ∉ S^1. One has c({3,1}) - c({3}) = 18, c({3,2}) - c({3}) = 11, and c({3,4}) - c({3}) = 11. So the cost is nondecreasing; hence we stop with S^G = S^1 = {3}.
In many cases the heuristic must be adapted to the problem structure. We give an example for the STSP.
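The slide's cost matrix did not survive transcription, but the set function c(S) = Σ_{i∈M} min_{j∈S} c_ij + Σ_{j∈S} f_j used above is easy to evaluate. A minimal sketch on a small hypothetical instance (the data below is illustrative, not the slide's 6×4 instance):

```python
# Hypothetical UFL data: 3 clients, 3 depots (illustrative, not the slide's data).
c = [[4, 7, 2],
     [5, 1, 6],
     [3, 8, 4]]
f = [10, 12, 9]

def ufl_cost(S):
    """c(S): serve each client from its cheapest open depot, plus fixed costs."""
    if not S:
        return float("inf")  # infeasible: at least one depot must be open
    return sum(min(c[i][j] for j in S) for i in range(len(c))) + sum(f[j] for j in S)

# e.g. opening only depot 2 costs (2 + 6 + 4) + 9 = 21
```

Feeding this c(S) together with v(S) = |S| and k = 1 into the greedy scheme reproduces the computation on the slide.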

Example: Symmetric TSP
Distance matrix: [6×6 distance matrix]
Greedy Solution: The pure greedy heuristic has been applied to this instance in Chapter 2. Here we use a nearest neighbor insertion heuristic: starting from an arbitrary node we successively build paths P_t containing that node, inserting the node nearest to the current path, until we obtain a tour (this happens after n - 1 insertions).
Let us start at node 1. The nearest neighbor of node 1 is 3, yielding P_1 = 1-3. The nearest neighbor of the node set {1,3} is 6, with distance 6 to 3, yielding the three possible paths shown in the table. We use the cheapest insertion: P_2 = 1-3-6. The nearest neighbor of the node set {1,3,6} is 4, with distance 6 to node 6. Possible insertions are shown in the table. Now node 2 is nearest to P_3, and the possible insertions are as indicated. We use the cheapest one, P_4, which has length 20. Finally, the last node (node 5) must be connected to this path so as to obtain a tour. This can be done in 5 ways: [insertion table]. The shortest tour results when inserting 5 between the nodes 2 and 1 in P_4 (and adding the arc {4,2}); the length of this tour is 52.
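The insertion idea can be sketched as follows. This is the common cycle-based variant (pick the non-tour node nearest to the tour, insert it at the cheapest position), a slight simplification of the path-based version on the slide; the function name and the test data are mine, not the slide's instance.

```python
def nearest_insertion(d, start=0):
    """Nearest-insertion heuristic sketch for the symmetric TSP.

    d: symmetric distance matrix. Builds a cycle by repeatedly picking the
    non-tour node nearest to the tour and inserting it at the cheapest position.
    Returns (tour, length); the tour is closed implicitly."""
    n = len(d)
    tour = [start]
    rest = set(range(n)) - {start}
    while rest:
        # node nearest to any tour node
        j = min(rest, key=lambda v: min(d[v][u] for u in tour))
        # cheapest position: between tour[i] and tour[i+1] (cyclically)
        best_i = min(range(len(tour)),
                     key=lambda i: d[tour[i]][j] + d[j][tour[(i + 1) % len(tour)]]
                                   - d[tour[i]][tour[(i + 1) % len(tour)]])
        tour.insert(best_i + 1, j)
        rest.discard(j)
    length = sum(d[tour[i]][tour[(i + 1) % n]] for i in range(n))
    return tour, length
```

On a metric instance this insertion rule tends to produce tours close to those obtained by the path-based variant.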

Other Variants of Greedy Insertion for the Symmetric TSP
The choice of the node to insert in the current path can be made in many different ways: nearest node (as done in the example); random node; farthest node. When solving the same STSP instance with cheapest insertion of the farthest node we obtain (starting at node 1 again): P_1 = 1-5 (length 12); P_2 = … (length 20); P_3 = … (length 28); P_4 = … (length 27); tour (length 49). The tour is shorter than when inserting the nearest node! Surprising, but this often happens. Can you understand why?

Local Search Heuristic
Local search can be more conveniently discussed when the problem has the form
min { c(S) : g(S) = 0, S ⊆ N },
where g(S) ≥ 0 represents a measure for the infeasibility of S. E.g., the constraint v(S) ≥ k can be represented by using g(S) = (k - v(S))^+.
For a local search heuristic we need:
a solution S ⊆ N;
a local neighborhood Q(S) for each solution S ⊆ N;
a goal function f(S), which is either c(S) when S is feasible and infinite otherwise, or a composite function of the form c(S) + αg(S) (α > 0).
Local Search Heuristic: Choose an initial solution S. Search for a solution S' ∈ Q(S) that minimizes f(S'). If f(S') = f(S), stop; then S^H = S is locally optimal. Otherwise, set S = S' and repeat.

Local Search Heuristic (cont.)
The appropriate choice of the neighborhoods depends on the problem structure. A simple choice is just to add or remove one element to or from S. This neighborhood has O(n) elements, so finding the best solution in the neighborhood can be done in O(n) time. If all feasible sets have the same size, a useful neighborhood is obtained by replacing one element of S by an element not in S; this requires O(n²) time. In the case of the STSP this leads to the well-known 2-exchange heuristic.
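The generic local search loop described above can be written compactly; `neighbors` and `f` are supplied by the caller (the names are mine, a sketch rather than the book's pseudocode).

```python
def local_search(S0, neighbors, f):
    """Generic local search sketch: move to the best neighbor while f strictly improves."""
    S = S0
    while True:
        best = min(neighbors(S), key=f, default=S)
        if f(best) >= f(S):
            return S  # locally optimal with respect to the neighborhood
        S = best
```

The add-or-remove-one-element neighborhood mentioned above is then the one-liner `lambda S: [S ^ {j} for j in range(n)]` over frozensets, since the symmetric difference with a singleton adds the element if absent and removes it if present.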

2-Exchange Heuristic for the STSP
With the greedy insertion heuristic we found the tour of length 49. A 2-exchange removes two (nonadjacent, why?) edges. The two resulting pieces are reconnected by two other edges (this can be done in only one way, why?!). If this yields a better tour, we accept it and repeat, until we find a so-called 2-optimal tour.
We have a 6-city problem, so a tour has 6 edges, and there are 9 possible 2-exchanges. For each 2-exchange the table lists the two deleted edges, the two new edges, and the increment of the tour length:
1: {1,3}, {2,5} → {1,2}, {3,5}
2: {1,3}, {5,6} → {1,5}, {3,6}
3: {1,3}, {6,4} → {1,6}, {3,4}
4: {3,2}, {5,6} → {3,5}, {2,6}
5: {3,2}, {6,4} → {3,6}, {2,4}
6: {3,2}, {4,1} → {3,4}, {2,1}
7: {2,5}, {6,4} → {2,6}, {5,4}
8: {2,5}, {4,1} → {2,4}, {5,1}
9: {5,6}, {4,1} → {5,4}, {6,1}
The second exchange gives an improvement: it decreases the length of the tour to z^L = 46. One may verify that the new tour is 2-optimal.
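The 2-exchange loop can be sketched as follows (a standard 2-opt implementation, with my own naming; a 2-exchange corresponds to reversing the segment between the two removed edges).

```python
def two_opt(tour, d):
    """Apply improving 2-exchanges until the tour is 2-optimal.

    A 2-exchange removes edges (tour[i], tour[i+1]) and (tour[j], tour[j+1])
    and reconnects the two pieces by reversing the segment in between."""
    n = len(tour)
    tour = tour[:]
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            # skip adjacent edge pairs (for i == 0 the wrap-around edge is adjacent)
            for j in range(i + 2, n if i > 0 else n - 1):
                a, b = tour[i], tour[i + 1]
                c, e = tour[j], tour[(j + 1) % n]
                delta = d[a][c] + d[b][e] - d[a][b] - d[c][e]
                if delta < 0:  # shorter tour: accept the exchange
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour

def tour_length(tour, d):
    return sum(d[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))
```

For n cities the inner double loop enumerates exactly n(n-3)/2 exchanges, which gives the 9 exchanges of the 6-city example above.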

Improved Local Search Heuristics
How do we escape from a local minimum, and thus potentially do better than a local search heuristic? This is the question addressed by two heuristics that we first discuss briefly:
Tabu search
Simulated annealing
After this we deal with genetic algorithms. Rather than working with individual solutions, genetic algorithms work with a finite population (set of solutions) S_1, …, S_k, and the population evolves (changes somewhat randomly) from one generation (iteration) to the next.
Finally we discuss two special-purpose heuristics for the STSP and some heuristics for MIPs.

Tabu Search
In order to escape from local minima one has to accept every now and then a solution with a worse value than the incumbent. Since the (old and better) incumbent may belong to the neighborhood of the new incumbent, it may happen that cycling occurs, i.e., that the algorithm returns to the same solution every two or three steps: S^0 → S^1 → S^0 → S^1 → … To avoid cycling, certain solutions or moves are forbidden, or tabu. Comparing the new solution with all previous incumbents would require much memory space and may be very time consuming. Instead, a tabu list of recent solutions, or solution modifications, is kept. A basic version of the algorithm is:
Step 1: Initialize a tabu list.
Step 2: Get an initial solution S.
Step 3: While the stopping criterion is not satisfied:
3.1: Choose a subset Q'(S) ⊆ Q(S) of non-tabu solutions.
3.2: Let S' = argmin { f(T) : T ∈ Q'(S) }.
3.3: Replace S by S'.
Step 4: On termination, the best solution found is the heuristic solution.

Tabu Search (cont.)
The parameters specific to tabu search are:
(i) The choice of Q'(S). If Q(S) is small, one takes the whole neighborhood. Otherwise, Q'(S) can be a fixed number of neighbors of S, chosen randomly or by some heuristic rule.
(ii) The tabu list consists of a small number t of most recent solutions (or modifications). If t = 1 or t = 2, it is not surprising that cycling is still common. The magic value t = 7 is often a good choice.
(iii) The stopping rule is often just a fixed number of iterations, or a certain number of iterations without any improvement of the goal value of the best solution found.

Tabu Search (cont.)
If the neighborhood consists of single-element switches,
Q(S) = { T ⊆ N : |T Δ S| = 1 },  S ⊆ N,
then the tabu list might be a list of the last t elements {i_1, …, i_t} added to the incumbent and a list of the last t elements {j_1, …, j_t} removed from the incumbent. A neighbor T is then tabu if T = S \ {i_q} or if T = S ∪ {j_q} for some q = 1, …, t. So, in forming the new incumbent one is not allowed to modify the current incumbent S by removing some i_q or by adding some j_q.
When implementing tabu search, the performance can be improved by using common sense. E.g., there is no justification to keep a solution tabu if it is the best solution found so far. In other words, tabu search can be viewed as a search strategy that tries to take advantage of the history of the search and the problem structure intelligently.
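The single-element-switch version with a short tabu list can be sketched as follows; the names and the simple list-based tabu memory are mine, a minimal sketch rather than a full implementation (no aspiration criterion).

```python
def tabu_search(S0, n, f, t=7, iters=100):
    """Tabu search sketch over subsets of {0,...,n-1} with single-element flips.

    The tabu list stores the last t flipped elements; flipping them back is
    forbidden. Returns the best solution found."""
    S = set(S0)
    best, best_val = set(S), f(S)
    tabu = []  # the t most recently flipped elements
    for _ in range(iters):
        candidates = [j for j in range(n) if j not in tabu]
        if not candidates:
            break
        # best non-tabu flip, even if it makes the solution worse
        j = min(candidates, key=lambda j: f(S ^ {j}))
        S ^= {j}
        tabu.append(j)
        if len(tabu) > t:
            tabu.pop(0)
        if f(S) < best_val:
            best, best_val = set(S), f(S)
    return best
```

Note that, unlike plain local search, the move in each iteration is taken even when it worsens f; the tabu list is what prevents immediately undoing it.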

Simulated Annealing
Simulated annealing is less direct, and less intelligent. The basic idea is to choose a neighbor randomly. The neighbor replaces the incumbent with probability 1 if it is better, and with some (changing) probability in (0,1) if it has a worse value. The probability of accepting a worse solution depends on the difference in goal values. So, if the number of iterations is large enough, one can escape from every local minimum. On the other hand, to guarantee convergence of the algorithm, the probability of accepting worse solutions decreases over time. A more formal description is as follows:
Step 1: Get an initial solution S.
Step 2: Get an initial temperature T and a cooling ratio r (0 < r < 1).
Step 3: While not yet frozen, do the following:
3.1: Perform the following loop L times:
3.1.1: Pick a random neighbor S' of S.
3.1.2: Let Δ = f(S') - f(S).
3.1.3: If Δ ≤ 0, replace S by S'.
3.1.4: If Δ > 0, replace S by S' with probability e^{-Δ/T}.
3.2: Set T ← rT. (Reduce the temperature.)
Step 4: Return the best solution found; this is the heuristic solution.

Simulated Annealing (cont.)
Note that the probability of accepting worse solutions decreases over time, because the temperature T decreases over time. Also, the probability e^{-Δ/T} decreases as Δ increases, i.e., as the quality of the solution becomes worse. Just as for other local search heuristics, one has to define an initial solution, a neighborhood for each solution, and the value f(S) of a solution. The parameters specific to SA are:
(i) the initial temperature T;
(ii) the cooling ratio r;
(iii) the loop length L;
(iv) the definition of frozen, i.e., a stopping rule.
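Steps 1-4 translate almost line by line into code. A minimal sketch, with my own names and with "frozen" taken to mean T dropping below a threshold:

```python
import math
import random

def simulated_annealing(S0, neighbors, f, T=10.0, r=0.9, L=50, T_min=1e-3, seed=42):
    """Simulated annealing sketch: random neighbor, Metropolis acceptance,
    geometric cooling T <- r*T until frozen (T < T_min)."""
    rng = random.Random(seed)
    S = S0
    best, best_val = S, f(S)
    while T > T_min:                       # Step 3: while not yet frozen
        for _ in range(L):                 # 3.1: inner loop of length L
            Sp = rng.choice(neighbors(S))  # 3.1.1: pick a random neighbor
            delta = f(Sp) - f(S)           # 3.1.2
            if delta <= 0 or rng.random() < math.exp(-delta / T):
                S = Sp                     # 3.1.3 / 3.1.4: (probabilistic) acceptance
            if f(S) < best_val:
                best, best_val = S, f(S)   # Step 4 needs the best solution seen
        T *= r                             # 3.2: cool down
    return best
```

The four SA-specific parameters listed above appear directly as the arguments T, r, L and T_min.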

Genetic Algorithms
Rather than working with individual solutions, genetic algorithms work with a finite population (set of solutions) S_1, …, S_k, and the population evolves (changes somewhat randomly) from one generation (iteration) to the next. An iteration consists of the following steps:
(i) Evaluation. The fitness of the individuals is evaluated.
(ii) Parent Selection. Certain pairs of solutions (parents) are selected based on their fitness.
(iii) Crossover. Each pair of parents combines to produce one or two new solutions (offspring).
(iv) Mutation. Some of the offspring solutions are randomly modified.
(v) Population Selection. Based on their fitness, a new population is selected, replacing some or all of the original population by an identical number of offspring solutions.
Each of the five steps is discussed in more detail in the book.
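The five steps can be sketched for 0/1-encoded solutions. This is one of many possible instantiations (truncation selection, one-point crossover, elitist replacement); all names and parameter values are mine.

```python
import random

def genetic_algorithm(n, fitness, pop_size=20, generations=60, mut_rate=0.1, seed=1):
    """Minimal genetic-algorithm sketch over 0/1 strings of length n.

    One iteration = (i) evaluation, (ii) fitness-biased parent selection,
    (iii) one-point crossover, (iv) mutation, (v) elitist population selection."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)            # (i) evaluation
        parents = pop[:pop_size // 2]                  # (ii) parent selection
        offspring = []
        for _ in range(pop_size // 2):
            p1, p2 = rng.sample(parents, 2)
            cut = rng.randrange(1, n)                  # (iii) one-point crossover
            child = p1[:cut] + p2[cut:]
            for j in range(n):                         # (iv) mutation
                if rng.random() < mut_rate:
                    child[j] = 1 - child[j]
            offspring.append(child)
        # (v) population selection: keep the fittest of parents and offspring
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:pop_size]
    return max(pop, key=fitness)
```

On a toy maximization problem such as counting ones, this population-based search reliably climbs towards the optimum.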

Worst-case Analysis of (some) Heuristics: Integer Knapsack Heuristic
We consider the integer knapsack problem
z = max { Σ_{j=1}^n c_j x_j : Σ_{j=1}^n a_j x_j ≤ b, x ∈ Z₊ⁿ },
where a_j ∈ Z₊ for all j and b ∈ Z₊. We suppose that the variables are ordered so that c_j / a_j is non-increasing and, moreover, that a_j ≤ b for all j. We consider the following simple heuristic: take x^H_1 = ⌊b/a_1⌋ and x^H_j = 0 for j ≥ 2, yielding the value z^H = c_1 ⌊b/a_1⌋.
Theorem 1. z^H ≥ ½ z.
Proof: The solution of the linear relaxation is x_1 = b/a_1 and x_j = 0 for j ≥ 2. This provides the upper bound z^LP = c_1 b/a_1 ≥ z. As a_1 ≤ b, we have ⌊b/a_1⌋ ≥ 1. Writing b/a_1 = ⌊b/a_1⌋ + f with 0 ≤ f < 1, we have
z^H = c_1 ⌊b/a_1⌋ = (⌊b/a_1⌋ / (b/a_1)) z^LP = (⌊b/a_1⌋ / (⌊b/a_1⌋ + f)) z^LP ≥ (⌊b/a_1⌋ / (2⌊b/a_1⌋)) z^LP = ½ z^LP ≥ ½ z.
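The heuristic and its LP bound are a few lines of code; the function name and the example data are mine. The theorem's guarantee z^H ≥ ½ z^LP ≥ ½ z can be checked numerically on any instance.

```python
def knapsack_heuristic(c, a, b):
    """Heuristic of Theorem 1: among items with a_j <= b, pick the best ratio
    c_j/a_j and take it floor(b / a_j) times.

    Returns (z_H, z_LP); Theorem 1 guarantees z_H >= z_LP / 2 >= z / 2."""
    items = sorted((j for j in range(len(c)) if a[j] <= b),
                   key=lambda j: c[j] / a[j], reverse=True)
    j1 = items[0]               # the item with the largest ratio c_j / a_j
    z_h = c[j1] * (b // a[j1])  # heuristic value c_1 * floor(b / a_1)
    z_lp = c[j1] * b / a[j1]    # LP relaxation value c_1 * b / a_1
    return z_h, z_lp

# e.g. c = (10, 6), a = (7, 5), b = 20: item 0 has the best ratio,
# z_H = 10 * floor(20/7) = 20 while z_LP = 200/7, and indeed 20 >= (200/7)/2.
```

The worst case of the bound is approached when b/a_1 is just below 2, so that the heuristic packs the big item once while the relaxation packs it almost twice.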

Euclidean STSP
We say that an STSP in a graph G = (V, E) is Euclidean if for any three edges e, f and g forming a triangle, one has the so-called triangle inequality: c_e ≤ c_f + c_g.
Proposition 1. Let G be a complete graph whose edge lengths satisfy the triangle inequality, and let G contain a subgraph H = (V, Ē) which is Eulerian. Then G contains a Hamilton cycle of length at most c(H) = Σ_{e∈Ē} c_e.
Proof: Note that any Eulerian circuit in H passes exactly once through every edge in Ē, and hence has length Σ_{e∈Ē} c_e. Any such circuit C passes through all the nodes in V, possibly more than once through some or all of them. Using C we construct a Hamilton circuit as follows. We choose a node v_1 ∈ V, and starting at that node we walk along the edges of C (in one of the two possible directions). The first new node that we encounter after leaving v_1 is called v_2. After leaving v_2, the first node different from v_1 and v_2 is called v_3, and so on. Proceeding in this way we get a sequence v_1, v_2, …, v_n containing all the nodes in V. We claim that the tour T: v_1-v_2-…-v_n-v_1 has length at most Σ_{e∈Ē} c_e. For this two observations are necessary. First, G contains the edges {v_i, v_{i+1}} for i = 1 to n-1 and also the edge {v_n, v_1}, since G is a complete graph; so T is indeed a tour. Second, the nodes on T partition the Euler circuit C into n successive edge-disjoint pieces. Each edge on the tour T is a shortcut for the corresponding piece, due to the triangle inequality. Thus the length of the tour is at most equal to the length of C.

Geometric proof
We consider the piece of the Eulerian tour between the nodes v_i and v_{i+1} of the Hamilton tour. It will be convenient to call the nodes on the corresponding piece of the Eulerian tour 1, 2, …, k, so that 1 = v_i and k = v_{i+1}. The figure depicts the situation for k = 6; the red edges are part of the Eulerian tour. Using the triangle inequality we show that the shortcut 1-6 is not longer than the path 1-2-3-4-5-6:
c_16 ≤ c_15 + c_56 ≤ c_14 + c_45 + c_56 ≤ c_13 + c_34 + c_45 + c_56 ≤ c_12 + c_23 + c_34 + c_45 + c_56.

The Tree Heuristic for the Euclidean STSP
Step 1: Find a minimum-length spanning tree T, with edge set E_T and length z_T = Σ_{e∈E_T} c_e.
Step 2: Double each edge in T to form a connected Eulerian graph.
Step 3: Using the previous proposition, taking any Eulerian tour, construct a Hamilton circuit of length z^H.
Proposition 2. z^H ≤ 2z.
Proof: Since every tour consists of a spanning tree plus an edge, we have the lower bound z_T ≤ z. The length of the Eulerian tour is 2z_T, by construction. The construction of the Hamilton circuit guarantees that z^H ≤ 2z_T. Thus we have z^H ≤ 2z_T ≤ 2z.
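The three steps can be sketched compactly. One convenient shortcut (my implementation choice, with my own names): shortcutting an Euler tour of the doubled tree yields exactly a preorder walk of the tree, so the Euler tour need not be built explicitly.

```python
def tree_heuristic(d):
    """Tree heuristic sketch for a metric STSP: build an MST, double its edges,
    shortcut an Euler tour. The shortcut tour equals a preorder walk of the MST.
    Returns (tour, length); Proposition 2 gives length <= 2 * optimum."""
    n = len(d)
    # Step 1: Prim's algorithm for a minimum spanning tree rooted at node 0
    in_tree = [False] * n
    parent = [0] * n
    dist = [float("inf")] * n
    dist[0] = 0
    children = [[] for _ in range(n)]
    for _ in range(n):
        u = min((v for v in range(n) if not in_tree[v]), key=lambda v: dist[v])
        in_tree[u] = True
        if u != 0:
            children[parent[u]].append(u)
        for v in range(n):
            if not in_tree[v] and d[u][v] < dist[v]:
                dist[v], parent[v] = d[u][v], u
    # Steps 2 and 3: preorder walk = shortcut Euler tour of the doubled tree
    tour, stack = [], [0]
    while stack:
        u = stack.pop()
        tour.append(u)
        stack.extend(reversed(children[u]))
    length = sum(d[tour[i]][tour[(i + 1) % n]] for i in range(n))
    return tour, length
```

The factor-2 guarantee needs the triangle inequality; on non-metric instances the shortcut step can lengthen the walk arbitrarily.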

Example of the Tree Heuristic for the Euclidean STSP
Below the distances are given between the capitals of 11 of the 12 provinces in the Netherlands: Amsterdam, Arnhem, Assen, Den Haag, Groningen, Den Bosch, Leeuwarden, Maastricht, Middelburg, Utrecht, Zwolle.
[11×11 distance table]
We want to find a minimum-length Hamilton circuit along these cities.

Example of the Tree Heuristic for the Euclidean STSP (cont.)
We first find a minimum-weight spanning tree. The minimal-weight tree is shown in the graph; it has weight 674. By doubling each edge in the tree the graph becomes Eulerian. An Eulerian tour is, e.g., …, and from this we obtain a Hamilton circuit of length ….
N.B. When applying 2-exchanges to the above tour, the length reduces further. The farthest-neighbor insertion heuristic, when started at node 10, yields a tour of length …; when applying 2-exchanges this length reduces to 990, which is optimal. For the nearest-neighbor insertion heuristic, starting at node 8, the length is 1045, which can be reduced to 1032 by 2-exchanges.

The Tree/Matching Heuristic for the Euclidean STSP
Step 1: Find a minimum-length spanning tree T, with edge set E_T and length z_T = Σ_{e∈E_T} c_e.
Step 2: Let U be the set of odd-degree nodes in (V, E_T). Find a perfect matching M of minimum length z_M in the subgraph G' = (U, E') of G induced by the subset U; so E' contains all edges in G with both end points in U. By construction, (V, E_T ∪ M) is an Eulerian graph.
Step 3: From any Eulerian tour in (V, E_T ∪ M), construct a Hamilton circuit of length z^C.
Proposition 3 (Christofides, 1976). z^C ≤ (3/2) z.
Proof: As above, z_T ≤ z. Let the nodes be ordered such that 1-2-…-n-1 is an optimal tour (of length z). Let j_1 < … < j_{2k} be the nodes of U, and let e_i = {j_i, j_{i+1}}, with j_{2k+1} = j_1. Due to the triangle inequality, one has Σ_{i=1}^{2k} c_{e_i} ≤ z. The sets M_1 = {e_i : i odd} and M_2 = {e_i : i even} are matchings in G' = (U, E'). Hence z_M ≤ z_{M_1} and z_M ≤ z_{M_2}. Consequently, since z_{M_1} + z_{M_2} = Σ_{i=1}^{2k} c_{e_i} ≤ z, we get 2z_M ≤ z, i.e., z_M ≤ ½ z. Then (V, E_T ∪ M) is Eulerian and has weight z_T + z_M ≤ (3/2) z. As we have seen before, we can then construct a Hamilton circuit of length z^C ≤ (3/2) z.
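Steps 1-3 can be sketched end to end. The matching in the sketch below is computed by brute-force recursion, which is only sensible for small odd-node sets (a real implementation would use a polynomial minimum-weight perfect-matching algorithm); all names are mine.

```python
from itertools import combinations

def christofides(d):
    """Tree/matching heuristic sketch (Christofides). Brute-force matching,
    so only suitable for small metric instances. Returns (tour, length)."""
    n = len(d)
    # Step 1: Kruskal's MST with a tiny union-find
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    tree = []
    for u, v in sorted(combinations(range(n), 2), key=lambda e: d[e[0]][e[1]]):
        if find(u) != find(v):
            parent[find(u)] = find(v)
            tree.append((u, v))
    # Step 2: minimum-weight perfect matching on the odd-degree nodes
    deg = [0] * n
    for u, v in tree:
        deg[u] += 1
        deg[v] += 1
    odd = [v for v in range(n) if deg[v] % 2 == 1]
    def match(U):
        if not U:
            return 0, []
        u, rest = U[0], U[1:]
        best = (float("inf"), [])
        for i, v in enumerate(rest):
            cst, pairs = match(rest[:i] + rest[i + 1:])
            if d[u][v] + cst < best[0]:
                best = (d[u][v] + cst, [(u, v)] + pairs)
        return best
    _, matching = match(odd)
    # Step 3: Euler circuit of the multigraph tree + matching (Hierholzer) ...
    adj = {v: [] for v in range(n)}
    for u, v in tree + matching:
        adj[u].append(v)
        adj[v].append(u)
    stack, circuit = [0], []
    while stack:
        u = stack[-1]
        if adj[u]:
            v = adj[u].pop()
            adj[v].remove(u)
            stack.append(v)
        else:
            circuit.append(stack.pop())
    # ... shortcut repeated nodes, using the triangle inequality
    seen, tour = set(), []
    for v in circuit:
        if v not in seen:
            seen.add(v)
            tour.append(v)
    length = sum(d[tour[i]][tour[(i + 1) % n]] for i in range(n))
    return tour, length
```

Replacing the doubled tree by the minimum matching is exactly what improves the guarantee from 2z (tree heuristic) to (3/2)z.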

Geometric proof
We are given an optimal Hamilton circuit C: 1-2-…-n-1 of length z in the graph G = (V, E), and the set U of nodes having odd degree in a minimum-weight spanning tree T of G. The nodes in U partition C into pieces; we connect these nodes by the edges that are shortcuts of these pieces, as indicated in the figure below. These edges form a circuit with an even number of edges, because the number of nodes in U is even. We alternately assign the edges to the sets M_1 (red) and M_2 (green). Then M_1 and M_2 are matchings in the subgraph of G induced by U.
[figure: nodes u_1, …, u_6 on the circuit, with the shortcut edges alternately red and green]
Obviously, due to the triangle inequality, z_{M_1} + z_{M_2} ≤ z. Hence either z_{M_1} ≤ ½ z or z_{M_2} ≤ ½ z. By adding the shorter matching to T we get an Eulerian graph whose Eulerian circuits have length at most z_T + ½ z.

Example of the Tree/Matching Heuristic for the Euclidean STSP
We use the minimal-weight spanning tree that we found earlier. The odd-degree nodes in the tree form the set {7, 8, 9, 10}. The induced subgraph on these nodes is depicted below. The minimal-weight matching consists of the arcs {8,9} and {7,10}. Adding these arcs to the tree, the graph becomes Eulerian. An Eulerian tour is, e.g., …, and from this we obtain a Hamilton circuit of length …. This route is 2-optimal. Note that we know that a minimum-length tour cannot be shorter than ….

MIP-based Heuristics: Dive-and-Fix
Aim: find a feasible solution quickly, which gives a tight bound in the search tree. Suppose we have a mixed 0-1 problem in variables x_i ∈ R and y_j ∈ {0,1}. Given a solution (x*, y*) of the linear relaxation, let F = { j : y*_j ∉ {0,1} }.
Initialization: Take the solution (x*, y*) of the linear relaxation at some node in the search tree.
Basic Iteration: As long as F ≠ ∅, do the following:
Let i = argmin_{j∈F} min[ y*_j, 1 - y*_j ] (find the fractional variable closest to integer).
If y*_i < 0.5, fix y_i = 0 (if close to 0, fix to 0).
If y*_i ≥ 0.5, fix y_i = 1 (if close to 1, fix to 1).
Solve the resulting LO problem. If the LO problem is infeasible, stop (the heuristic has failed). Otherwise, let (x*, y*) be the new linear solution.
Termination: If F = ∅, then (x*, y*) is a feasible mixed-integer solution.
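The fixing loop can be illustrated without an LP solver by taking a pure 0-1 knapsack, whose LP relaxation is solved exactly by the greedy fractional rule. This stand-in relaxation, and all names below, are my own illustrative choices; in practice the LP would be re-solved by a simplex or interior-point code.

```python
def knapsack_lp(c, a, b, fixed):
    """LP relaxation of a 0-1 knapsack with some variables fixed to 0 or 1,
    solved exactly by the greedy fractional rule. Returns (value, y) or None."""
    n = len(c)
    y = [0.0] * n
    cap = b
    for j, v in fixed.items():
        y[j] = float(v)
        cap -= a[j] * v
    if cap < 0:
        return None  # the fixings are infeasible
    free = sorted((j for j in range(n) if j not in fixed),
                  key=lambda j: c[j] / a[j], reverse=True)
    for j in free:
        take = min(1.0, cap / a[j])
        y[j] = take
        cap -= a[j] * take
        if cap <= 0:
            break
    return sum(c[j] * y[j] for j in range(n)), y

def dive_and_fix(c, a, b):
    """Dive-and-fix sketch: repeatedly solve the relaxation and fix the fractional
    variable closest to an integer (to 0 if y* < 0.5, else to 1)."""
    fixed = {}
    while True:
        sol = knapsack_lp(c, a, b, fixed)
        if sol is None:
            return None  # the heuristic has failed
        _, y = sol
        frac = {j: y[j] for j in range(len(c)) if 1e-9 < y[j] < 1 - 1e-9}
        if not frac:
            return [round(v) for v in y]  # feasible 0-1 solution found
        i = min(frac, key=lambda j: min(frac[j], 1 - frac[j]))
        fixed[i] = 0 if frac[i] < 0.5 else 1
```

Each iteration fixes one more variable, so the dive terminates after at most n relaxation solves.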

Two other MIP-based Heuristics
We discuss two other heuristics for mixed-integer problems that are hard to solve. For simplicity we describe the heuristics for an IP of the following form:
z = max { c^{1T} x^1 + c^{2T} x^2 : A^1 x^1 + A^2 x^2 = b, x^1 ∈ Z₊^{n_1}, x^2 ∈ Z₊^{n_2} }.
It is supposed that the variables x^1_j for j ∈ N_1 are more important than the variables x^2_j for j ∈ N_2, where |N_i| = n_i for i = 1, 2. The idea is to solve two (or more) easier LO problems or MIPs: the first one allows us to fix or limit the range of the more important x^1 variables, whereas the second allows us to choose good values for the x^2 variables.

Relax-and-Fix
z = max { c^{1T} x^1 + c^{2T} x^2 : A^1 x^1 + A^2 x^2 = b, x^1 ∈ Z₊^{n_1}, x^2 ∈ Z₊^{n_2} }.
Relax: Solve the relaxation
z̄ = max { c^{1T} x^1 + c^{2T} x^2 : A^1 x^1 + A^2 x^2 = b, x^1 ∈ Z₊^{n_1}, x^2 ∈ R₊^{n_2} }.
Let (x̄^1, x̄^2) be a solution of this problem.
Fix: Fix the (important) x^1 variables to their values in x̄^1 and solve the restriction
z̲ = max { c^{1T} x^1 + c^{2T} x^2 : A^2 x^2 = b - A^1 x̄^1, x^1 = x̄^1, x^2 ∈ Z₊^{n_2} }.
If this problem is infeasible, the heuristic fails. Otherwise, let (x̄^1, x̃^2) be a solution of this problem.
Heuristic: Use as heuristic solution x^H = (x̄^1, x̃^2), whose value satisfies z̲ = c^T x^H ≤ z ≤ z̄.

Cut-and-Fix
z = max { c^{1T} x^1 + c^{2T} x^2 : A^1 x^1 + A^2 x^2 = b, x^1 ∈ Z₊^{n_1}, x^2 ∈ Z₊^{n_2} }.
We assume that a strong cutting-plane algorithm is available, which generates strong cuts C^1 x^1 + C^2 x^2 ≤ γ, so that after adding these cuts at least some of the integer variables take values close to integer, or close to their optimal values.
Cut: Using the strong cutting-plane algorithm, find a solution of the tight linear relaxation
z̄ = max { c^{1T} x^1 + c^{2T} x^2 : A^1 x^1 + A^2 x^2 = b, C^1 x^1 + C^2 x^2 ≤ γ, x^1 ∈ R₊^{n_1}, x^2 ∈ R₊^{n_2} }.
Let (x̄^1, x̄^2) be a solution of this problem. N.B. The cuts are generated in the course of the algorithm solving the linear relaxation of the given problem.
Fix or Bound: Choose ε. For j ∈ N_1, set l_j = ⌊x̄^1_j + ε⌋ and u_j = ⌈x̄^1_j - ε⌉. Solve the restriction
z̲ = max { c^{1T} x^1 + c^{2T} x^2 : A^1 x^1 + A^2 x^2 = b, C^1 x^1 + C^2 x^2 ≤ γ, l ≤ x^1 ≤ u, x^1 ∈ Z₊^{n_1}, x^2 ∈ Z₊^{n_2} }.
If this problem is infeasible, the heuristic fails; one may try again with ε decreased. Otherwise, let (x̃^1, x̃^2) be a solution of this problem.
Heuristic: Use as heuristic solution x^H = (x̃^1, x̃^2), whose value satisfies z̲ = c^T x^H ≤ z ≤ z̄.
Observe that if ε is small and positive, x^1 variables taking values within ε of an integer in the linear relaxation are fixed in the restricted problem, while the others are forced to either the value ⌊x̄^1_j⌋ or the value ⌈x̄^1_j⌉. On the other hand, if ε is negative, all the x^1 variables can still take (at least) two values in the restricted problem.

More Courses on Optimization
Code       Name                                             Docent
WI3 031    Niet-Lineaire Optimalisering                     C. Roos
WI4 051TU  Introduction to OR                               H. van Maaren
WI4 060    Optimization and Engineering                     C. Roos
WI4 062TU  Transportation, Routing and Scheduling Problems  C. Roos
WI4 063TU  Network Models and Algorithms                    J.B.M. Melissen
WI4 064    Discrete Optimization                            C. Roos
WI4 087TU  Optimization, Models and Algorithms              H. van Maaren
WI4 131    Discrete and Continuous Optimization             G.J. Olsder/C. Roos
IN4 082    Local (Heuristic) Search Methods                 H. van Maaren
IN4 077    Computational Logic and Satisfiability           H. van Maaren/C. Witteveen
IN4 081    Randomized Algorithms                            H. van Maaren


More information

Outline. 1 The matching problem. 2 The Chinese Postman Problem

Outline. 1 The matching problem. 2 The Chinese Postman Problem Outline The matching problem Maximum-cardinality matchings in bipartite graphs Maximum-cardinality matchings in bipartite graphs: The augmenting path algorithm 2 Let G = (V, E) be an undirected graph.

More information

CS261: A Second Course in Algorithms Lecture #16: The Traveling Salesman Problem

CS261: A Second Course in Algorithms Lecture #16: The Traveling Salesman Problem CS61: A Second Course in Algorithms Lecture #16: The Traveling Salesman Problem Tim Roughgarden February 5, 016 1 The Traveling Salesman Problem (TSP) In this lecture we study a famous computational problem,

More information

Combinatorial Optimization - Lecture 14 - TSP EPFL

Combinatorial Optimization - Lecture 14 - TSP EPFL Combinatorial Optimization - Lecture 14 - TSP EPFL 2012 Plan Simple heuristics Alternative approaches Best heuristics: local search Lower bounds from LP Moats Simple Heuristics Nearest Neighbor (NN) Greedy

More information

Graphs and Network Flows IE411. Lecture 21. Dr. Ted Ralphs

Graphs and Network Flows IE411. Lecture 21. Dr. Ted Ralphs Graphs and Network Flows IE411 Lecture 21 Dr. Ted Ralphs IE411 Lecture 21 1 Combinatorial Optimization and Network Flows In general, most combinatorial optimization and integer programming problems are

More information

Decision Problems. Observation: Many polynomial algorithms. Questions: Can we solve all problems in polynomial time? Answer: No, absolutely not.

Decision Problems. Observation: Many polynomial algorithms. Questions: Can we solve all problems in polynomial time? Answer: No, absolutely not. Decision Problems Observation: Many polynomial algorithms. Questions: Can we solve all problems in polynomial time? Answer: No, absolutely not. Definition: The class of problems that can be solved by polynomial-time

More information

Coloring 3-Colorable Graphs

Coloring 3-Colorable Graphs Coloring -Colorable Graphs Charles Jin April, 015 1 Introduction Graph coloring in general is an etremely easy-to-understand yet powerful tool. It has wide-ranging applications from register allocation

More information

Notes for Recitation 9

Notes for Recitation 9 6.042/18.062J Mathematics for Computer Science October 8, 2010 Tom Leighton and Marten van Dijk Notes for Recitation 9 1 Traveling Salesperson Problem Now we re going to talk about a famous optimization

More information

SLS Methods: An Overview

SLS Methods: An Overview HEURSTC OPTMZATON SLS Methods: An Overview adapted from slides for SLS:FA, Chapter 2 Outline 1. Constructive Heuristics (Revisited) 2. terative mprovement (Revisited) 3. Simple SLS Methods 4. Hybrid SLS

More information

Integer Programming Theory

Integer Programming Theory Integer Programming Theory Laura Galli October 24, 2016 In the following we assume all functions are linear, hence we often drop the term linear. In discrete optimization, we seek to find a solution x

More information

Technische Universität München, Zentrum Mathematik Lehrstuhl für Angewandte Geometrie und Diskrete Mathematik. Combinatorial Optimization (MA 4502)

Technische Universität München, Zentrum Mathematik Lehrstuhl für Angewandte Geometrie und Diskrete Mathematik. Combinatorial Optimization (MA 4502) Technische Universität München, Zentrum Mathematik Lehrstuhl für Angewandte Geometrie und Diskrete Mathematik Combinatorial Optimization (MA 4502) Dr. Michael Ritter Problem Sheet 4 Homework Problems Problem

More information

Notes for Lecture 24

Notes for Lecture 24 U.C. Berkeley CS170: Intro to CS Theory Handout N24 Professor Luca Trevisan December 4, 2001 Notes for Lecture 24 1 Some NP-complete Numerical Problems 1.1 Subset Sum The Subset Sum problem is defined

More information

CS599: Convex and Combinatorial Optimization Fall 2013 Lecture 14: Combinatorial Problems as Linear Programs I. Instructor: Shaddin Dughmi

CS599: Convex and Combinatorial Optimization Fall 2013 Lecture 14: Combinatorial Problems as Linear Programs I. Instructor: Shaddin Dughmi CS599: Convex and Combinatorial Optimization Fall 2013 Lecture 14: Combinatorial Problems as Linear Programs I Instructor: Shaddin Dughmi Announcements Posted solutions to HW1 Today: Combinatorial problems

More information

56:272 Integer Programming & Network Flows Final Exam -- December 16, 1997

56:272 Integer Programming & Network Flows Final Exam -- December 16, 1997 56:272 Integer Programming & Network Flows Final Exam -- December 16, 1997 Answer #1 and any five of the remaining six problems! possible score 1. Multiple Choice 25 2. Traveling Salesman Problem 15 3.

More information

Advanced Operations Research Prof. G. Srinivasan Department of Management Studies Indian Institute of Technology, Madras

Advanced Operations Research Prof. G. Srinivasan Department of Management Studies Indian Institute of Technology, Madras Advanced Operations Research Prof. G. Srinivasan Department of Management Studies Indian Institute of Technology, Madras Lecture 28 Chinese Postman Problem In this lecture we study the Chinese postman

More information

5.5 The Travelling Salesman Problem

5.5 The Travelling Salesman Problem 0 Matchings and Independent Sets 5.5 The Travelling Salesman Problem The Travelling Salesman Problem A travelling salesman, starting in his own town, has to visit each of towns where he should go to precisely

More information

An O(log n/ log log n)-approximation Algorithm for the Asymmetric Traveling Salesman Problem

An O(log n/ log log n)-approximation Algorithm for the Asymmetric Traveling Salesman Problem An O(log n/ log log n)-approximation Algorithm for the Asymmetric Traveling Salesman Problem and more recent developments CATS @ UMD April 22, 2016 The Asymmetric Traveling Salesman Problem (ATSP) Problem

More information

Small Survey on Perfect Graphs

Small Survey on Perfect Graphs Small Survey on Perfect Graphs Michele Alberti ENS Lyon December 8, 2010 Abstract This is a small survey on the exciting world of Perfect Graphs. We will see when a graph is perfect and which are families

More information

Slides on Approximation algorithms, part 2: Basic approximation algorithms

Slides on Approximation algorithms, part 2: Basic approximation algorithms Approximation slides Slides on Approximation algorithms, part : Basic approximation algorithms Guy Kortsarz Approximation slides Finding a lower bound; the TSP example The optimum TSP cycle P is an edge

More information

Algorithm Design (4) Metaheuristics

Algorithm Design (4) Metaheuristics Algorithm Design (4) Metaheuristics Takashi Chikayama School of Engineering The University of Tokyo Formalization of Constraint Optimization Minimize (or maximize) the objective function f(x 0,, x n )

More information

APPROXIMATION ALGORITHMS FOR GEOMETRIC PROBLEMS

APPROXIMATION ALGORITHMS FOR GEOMETRIC PROBLEMS APPROXIMATION ALGORITHMS FOR GEOMETRIC PROBLEMS Subhas C. Nandy (nandysc@isical.ac.in) Advanced Computing and Microelectronics Unit Indian Statistical Institute Kolkata 70010, India. Organization Introduction

More information

Polynomial Time Approximation Schemes for the Euclidean Traveling Salesman Problem

Polynomial Time Approximation Schemes for the Euclidean Traveling Salesman Problem PROJECT FOR CS388G: ALGORITHMS: TECHNIQUES/THEORY (FALL 2015) Polynomial Time Approximation Schemes for the Euclidean Traveling Salesman Problem Shanshan Wu Vatsal Shah October 20, 2015 Abstract In this

More information

Basic Approximation algorithms

Basic Approximation algorithms Approximation slides Basic Approximation algorithms Guy Kortsarz Approximation slides 2 A ρ approximation algorithm for problems that we can not solve exactly Given an NP-hard question finding the optimum

More information

Improved Approximations for Graph-TSP in Regular Graphs

Improved Approximations for Graph-TSP in Regular Graphs Improved Approximations for Graph-TSP in Regular Graphs R Ravi Carnegie Mellon University Joint work with Uriel Feige (Weizmann), Jeremy Karp (CMU) and Mohit Singh (MSR) 1 Graph TSP Given a connected unweighted

More information

Constraint Satisfaction Problems

Constraint Satisfaction Problems Constraint Satisfaction Problems Greedy Local Search Bernhard Nebel, Julien Hué, and Stefan Wölfl Albert-Ludwigs-Universität Freiburg June 19, 2007 Nebel, Hué and Wölfl (Universität Freiburg) Constraint

More information

Lecture 3: Totally Unimodularity and Network Flows

Lecture 3: Totally Unimodularity and Network Flows Lecture 3: Totally Unimodularity and Network Flows (3 units) Outline Properties of Easy Problems Totally Unimodular Matrix Minimum Cost Network Flows Dijkstra Algorithm for Shortest Path Problem Ford-Fulkerson

More information

Escaping Local Optima: Genetic Algorithm

Escaping Local Optima: Genetic Algorithm Artificial Intelligence Escaping Local Optima: Genetic Algorithm Dae-Won Kim School of Computer Science & Engineering Chung-Ang University We re trying to escape local optima To achieve this, we have learned

More information

Lecture 7: Asymmetric K-Center

Lecture 7: Asymmetric K-Center Advanced Approximation Algorithms (CMU 18-854B, Spring 008) Lecture 7: Asymmetric K-Center February 5, 007 Lecturer: Anupam Gupta Scribe: Jeremiah Blocki In this lecture, we will consider the K-center

More information

EXERCISES SHORTEST PATHS: APPLICATIONS, OPTIMIZATION, VARIATIONS, AND SOLVING THE CONSTRAINED SHORTEST PATH PROBLEM. 1 Applications and Modelling

EXERCISES SHORTEST PATHS: APPLICATIONS, OPTIMIZATION, VARIATIONS, AND SOLVING THE CONSTRAINED SHORTEST PATH PROBLEM. 1 Applications and Modelling SHORTEST PATHS: APPLICATIONS, OPTIMIZATION, VARIATIONS, AND SOLVING THE CONSTRAINED SHORTEST PATH PROBLEM EXERCISES Prepared by Natashia Boland 1 and Irina Dumitrescu 2 1 Applications and Modelling 1.1

More information

A Steady-State Genetic Algorithm for Traveling Salesman Problem with Pickup and Delivery

A Steady-State Genetic Algorithm for Traveling Salesman Problem with Pickup and Delivery A Steady-State Genetic Algorithm for Traveling Salesman Problem with Pickup and Delivery Monika Sharma 1, Deepak Sharma 2 1 Research Scholar Department of Computer Science and Engineering, NNSS SGI Samalkha,

More information

Local Search and Optimization Chapter 4. Mausam (Based on slides of Padhraic Smyth, Stuart Russell, Rao Kambhampati, Raj Rao, Dan Weld )

Local Search and Optimization Chapter 4. Mausam (Based on slides of Padhraic Smyth, Stuart Russell, Rao Kambhampati, Raj Rao, Dan Weld ) Local Search and Optimization Chapter 4 Mausam (Based on slides of Padhraic Smyth, Stuart Russell, Rao Kambhampati, Raj Rao, Dan Weld ) 1 2 Outline Local search techniques and optimization Hill-climbing

More information

Advanced Algorithms Class Notes for Monday, October 23, 2012 Min Ye, Mingfu Shao, and Bernard Moret

Advanced Algorithms Class Notes for Monday, October 23, 2012 Min Ye, Mingfu Shao, and Bernard Moret Advanced Algorithms Class Notes for Monday, October 23, 2012 Min Ye, Mingfu Shao, and Bernard Moret Greedy Algorithms (continued) The best known application where the greedy algorithm is optimal is surely

More information

Local Search and Optimization Chapter 4. Mausam (Based on slides of Padhraic Smyth, Stuart Russell, Rao Kambhampati, Raj Rao, Dan Weld )

Local Search and Optimization Chapter 4. Mausam (Based on slides of Padhraic Smyth, Stuart Russell, Rao Kambhampati, Raj Rao, Dan Weld ) Local Search and Optimization Chapter 4 Mausam (Based on slides of Padhraic Smyth, Stuart Russell, Rao Kambhampati, Raj Rao, Dan Weld ) 1 2 Outline Local search techniques and optimization Hill-climbing

More information

Finding a -regular Supergraph of Minimum Order

Finding a -regular Supergraph of Minimum Order Finding a -regular Supergraph of Minimum Order Hans L. Bodlaender a, Richard B. Tan a,b and Jan van Leeuwen a a Department of Computer Science Utrecht University Padualaan 14, 3584 CH Utrecht The Netherlands

More information

Part II. Graph Theory. Year

Part II. Graph Theory. Year Part II Year 2017 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2017 53 Paper 3, Section II 15H Define the Ramsey numbers R(s, t) for integers s, t 2. Show that R(s, t) exists for all s,

More information

In this lecture, we ll look at applications of duality to three problems:

In this lecture, we ll look at applications of duality to three problems: Lecture 7 Duality Applications (Part II) In this lecture, we ll look at applications of duality to three problems: 1. Finding maximum spanning trees (MST). We know that Kruskal s algorithm finds this,

More information

Approximation Algorithms: The Primal-Dual Method. My T. Thai

Approximation Algorithms: The Primal-Dual Method. My T. Thai Approximation Algorithms: The Primal-Dual Method My T. Thai 1 Overview of the Primal-Dual Method Consider the following primal program, called P: min st n c j x j j=1 n a ij x j b i j=1 x j 0 Then the

More information

Design and Analysis of Algorithms

Design and Analysis of Algorithms CSE 101, Winter 2018 Design and Analysis of Algorithms Lecture 17: Coping With Intractability Class URL: http://vlsicad.ucsd.edu/courses/cse101-w18/ Branch-and-Bound (B&B) Variant of backtrack with costs

More information

Chapter Design Techniques for Approximation Algorithms

Chapter Design Techniques for Approximation Algorithms Chapter 2 Design Techniques for Approximation Algorithms I N THE preceding chapter we observed that many relevant optimization problems are NP-hard, and that it is unlikely that we will ever be able to

More information

Fast and Simple Algorithms for Weighted Perfect Matching

Fast and Simple Algorithms for Weighted Perfect Matching Fast and Simple Algorithms for Weighted Perfect Matching Mirjam Wattenhofer, Roger Wattenhofer {mirjam.wattenhofer,wattenhofer}@inf.ethz.ch, Department of Computer Science, ETH Zurich, Switzerland Abstract

More information

IE 102 Spring Routing Through Networks - 1

IE 102 Spring Routing Through Networks - 1 IE 102 Spring 2017 Routing Through Networks - 1 The Bridges of Koenigsberg: Euler 1735 Graph Theory began in 1735 Leonard Eüler Visited Koenigsberg People wondered whether it is possible to take a walk,

More information

CS 580: Algorithm Design and Analysis. Jeremiah Blocki Purdue University Spring 2018

CS 580: Algorithm Design and Analysis. Jeremiah Blocki Purdue University Spring 2018 CS 580: Algorithm Design and Analysis Jeremiah Blocki Purdue University Spring 2018 Chapter 11 Approximation Algorithms Slides by Kevin Wayne. Copyright @ 2005 Pearson-Addison Wesley. All rights reserved.

More information

DIT411/TIN175, Artificial Intelligence. Peter Ljunglöf. 23 January, 2018

DIT411/TIN175, Artificial Intelligence. Peter Ljunglöf. 23 January, 2018 DIT411/TIN175, Artificial Intelligence Chapters 3 4: More search algorithms CHAPTERS 3 4: MORE SEARCH ALGORITHMS DIT411/TIN175, Artificial Intelligence Peter Ljunglöf 23 January, 2018 1 TABLE OF CONTENTS

More information

Approximation Algorithms

Approximation Algorithms Approximation Algorithms Given an NP-hard problem, what should be done? Theory says you're unlikely to find a poly-time algorithm. Must sacrifice one of three desired features. Solve problem to optimality.

More information

Approximation Algorithms

Approximation Algorithms Presentation for use with the textbook, Algorithm Design and Applications, by M. T. Goodrich and R. Tamassia, Wiley, 2015 Approximation Algorithms Tamassia Approximation Algorithms 1 Applications One of

More information

Copyright 2000, Kevin Wayne 1

Copyright 2000, Kevin Wayne 1 Guessing Game: NP-Complete? 1. LONGEST-PATH: Given a graph G = (V, E), does there exists a simple path of length at least k edges? YES. SHORTEST-PATH: Given a graph G = (V, E), does there exists a simple

More information

Some Upper Bounds for Signed Star Domination Number of Graphs. S. Akbari, A. Norouzi-Fard, A. Rezaei, R. Rotabi, S. Sabour.

Some Upper Bounds for Signed Star Domination Number of Graphs. S. Akbari, A. Norouzi-Fard, A. Rezaei, R. Rotabi, S. Sabour. Some Upper Bounds for Signed Star Domination Number of Graphs S. Akbari, A. Norouzi-Fard, A. Rezaei, R. Rotabi, S. Sabour Abstract Let G be a graph with the vertex set V (G) and edge set E(G). A function

More information

Optimization Techniques for Design Space Exploration

Optimization Techniques for Design Space Exploration 0-0-7 Optimization Techniques for Design Space Exploration Zebo Peng Embedded Systems Laboratory (ESLAB) Linköping University Outline Optimization problems in ERT system design Heuristic techniques Simulated

More information

CS 473: Algorithms. Ruta Mehta. Spring University of Illinois, Urbana-Champaign. Ruta (UIUC) CS473 1 Spring / 36

CS 473: Algorithms. Ruta Mehta. Spring University of Illinois, Urbana-Champaign. Ruta (UIUC) CS473 1 Spring / 36 CS 473: Algorithms Ruta Mehta University of Illinois, Urbana-Champaign Spring 2018 Ruta (UIUC) CS473 1 Spring 2018 1 / 36 CS 473: Algorithms, Spring 2018 LP Duality Lecture 20 April 3, 2018 Some of the

More information

Simple mechanisms for escaping from local optima:

Simple mechanisms for escaping from local optima: The methods we have seen so far are iterative improvement methods, that is, they get stuck in local optima. Simple mechanisms for escaping from local optima: I Restart: re-initialise search whenever a

More information

Approximation Algorithms

Approximation Algorithms 15-251: Great Ideas in Theoretical Computer Science Spring 2019, Lecture 14 March 5, 2019 Approximation Algorithms 1 2 SAT 3SAT Clique Hamiltonian- Cycle given a Boolean formula F, is it satisfiable? same,

More information

3 Euler Tours, Hamilton Cycles, and Their Applications

3 Euler Tours, Hamilton Cycles, and Their Applications 3 Euler Tours, Hamilton Cycles, and Their Applications 3.1 Euler Tours and Applications 3.1.1 Euler tours Carefully review the definition of (closed) walks, trails, and paths from Section 1... Definition

More information

1. Lecture notes on bipartite matching February 4th,

1. Lecture notes on bipartite matching February 4th, 1. Lecture notes on bipartite matching February 4th, 2015 6 1.1.1 Hall s Theorem Hall s theorem gives a necessary and sufficient condition for a bipartite graph to have a matching which saturates (or matches)

More information

Topic: Local Search: Max-Cut, Facility Location Date: 2/13/2007

Topic: Local Search: Max-Cut, Facility Location Date: 2/13/2007 CS880: Approximations Algorithms Scribe: Chi Man Liu Lecturer: Shuchi Chawla Topic: Local Search: Max-Cut, Facility Location Date: 2/3/2007 In previous lectures we saw how dynamic programming could be

More information

GRASP. Greedy Randomized Adaptive. Search Procedure

GRASP. Greedy Randomized Adaptive. Search Procedure GRASP Greedy Randomized Adaptive Search Procedure Type of problems Combinatorial optimization problem: Finite ensemble E = {1,2,... n } Subset of feasible solutions F 2 Objective function f : 2 Minimisation

More information

Algorithm Design and Analysis

Algorithm Design and Analysis Algorithm Design and Analysis LECTURE 29 Approximation Algorithms Load Balancing Weighted Vertex Cover Reminder: Fill out SRTEs online Don t forget to click submit Sofya Raskhodnikova 12/7/2016 Approximation

More information

Metaheuristic Development Methodology. Fall 2009 Instructor: Dr. Masoud Yaghini

Metaheuristic Development Methodology. Fall 2009 Instructor: Dr. Masoud Yaghini Metaheuristic Development Methodology Fall 2009 Instructor: Dr. Masoud Yaghini Phases and Steps Phases and Steps Phase 1: Understanding Problem Step 1: State the Problem Step 2: Review of Existing Solution

More information

3 INTEGER LINEAR PROGRAMMING

3 INTEGER LINEAR PROGRAMMING 3 INTEGER LINEAR PROGRAMMING PROBLEM DEFINITION Integer linear programming problem (ILP) of the decision variables x 1,..,x n : (ILP) subject to minimize c x j j n j= 1 a ij x j x j 0 x j integer n j=

More information

Algorithms for Integer Programming

Algorithms for Integer Programming Algorithms for Integer Programming Laura Galli November 9, 2016 Unlike linear programming problems, integer programming problems are very difficult to solve. In fact, no efficient general algorithm is

More information

Assignment 5: Solutions

Assignment 5: Solutions Algorithm Design Techniques Assignment 5: Solutions () Port Authority. [This problem is more commonly called the Bin Packing Problem.] (a) Suppose K = 3 and (w, w, w 3, w 4 ) = (,,, ). The optimal solution

More information

Integer Programming. Xi Chen. Department of Management Science and Engineering International Business School Beijing Foreign Studies University

Integer Programming. Xi Chen. Department of Management Science and Engineering International Business School Beijing Foreign Studies University Integer Programming Xi Chen Department of Management Science and Engineering International Business School Beijing Foreign Studies University Xi Chen (chenxi0109@bfsu.edu.cn) Integer Programming 1 / 42

More information

Comparison Study of Multiple Traveling Salesmen Problem using Genetic Algorithm

Comparison Study of Multiple Traveling Salesmen Problem using Genetic Algorithm IOSR Journal of Computer Engineering (IOSR-JCE) e-issn: 2278-661, p- ISSN: 2278-8727Volume 13, Issue 3 (Jul. - Aug. 213), PP 17-22 Comparison Study of Multiple Traveling Salesmen Problem using Genetic

More information

Pre-requisite Material for Course Heuristics and Approximation Algorithms

Pre-requisite Material for Course Heuristics and Approximation Algorithms Pre-requisite Material for Course Heuristics and Approximation Algorithms This document contains an overview of the basic concepts that are needed in preparation to participate in the course. In addition,

More information

Partha Sarathi Mandal

Partha Sarathi Mandal MA 515: Introduction to Algorithms & MA353 : Design and Analysis of Algorithms [3-0-0-6] Lecture 39 http://www.iitg.ernet.in/psm/indexing_ma353/y09/index.html Partha Sarathi Mandal psm@iitg.ernet.in Dept.

More information

Module 6 NP-Complete Problems and Heuristics

Module 6 NP-Complete Problems and Heuristics Module 6 NP-Complete Problems and Heuristics Dr. Natarajan Meghanathan Professor of Computer Science Jackson State University Jackson, MS 39217 E-mail: natarajan.meghanathan@jsums.edu P, NP-Problems Class

More information

MVE165/MMG631 Linear and integer optimization with applications Lecture 9 Discrete optimization: theory and algorithms

MVE165/MMG631 Linear and integer optimization with applications Lecture 9 Discrete optimization: theory and algorithms MVE165/MMG631 Linear and integer optimization with applications Lecture 9 Discrete optimization: theory and algorithms Ann-Brith Strömberg 2018 04 24 Lecture 9 Linear and integer optimization with applications

More information

Introduction to Graph Theory

Introduction to Graph Theory Introduction to Graph Theory Tandy Warnow January 20, 2017 Graphs Tandy Warnow Graphs A graph G = (V, E) is an object that contains a vertex set V and an edge set E. We also write V (G) to denote the vertex

More information

Mathematical Tools for Engineering and Management

Mathematical Tools for Engineering and Management Mathematical Tools for Engineering and Management Lecture 8 8 Dec 0 Overview Models, Data and Algorithms Linear Optimization Mathematical Background: Polyhedra, Simplex-Algorithm Sensitivity Analysis;

More information

Copyright 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin Introduction to the Design & Analysis of Algorithms, 2 nd ed., Ch.

Copyright 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin Introduction to the Design & Analysis of Algorithms, 2 nd ed., Ch. Iterative Improvement Algorithm design technique for solving optimization problems Start with a feasible solution Repeat the following step until no improvement can be found: change the current feasible

More information

15-854: Approximations Algorithms Lecturer: Anupam Gupta Topic: Direct Rounding of LP Relaxations Date: 10/31/2005 Scribe: Varun Gupta

15-854: Approximations Algorithms Lecturer: Anupam Gupta Topic: Direct Rounding of LP Relaxations Date: 10/31/2005 Scribe: Varun Gupta 15-854: Approximations Algorithms Lecturer: Anupam Gupta Topic: Direct Rounding of LP Relaxations Date: 10/31/2005 Scribe: Varun Gupta 15.1 Introduction In the last lecture we saw how to formulate optimization

More information

The Subtour LP for the Traveling Salesman Problem

The Subtour LP for the Traveling Salesman Problem The Subtour LP for the Traveling Salesman Problem David P. Williamson Cornell University November 22, 2011 Joint work with Jiawei Qian, Frans Schalekamp, and Anke van Zuylen The Traveling Salesman Problem

More information

(Refer Slide Time: 01:00)

(Refer Slide Time: 01:00) Advanced Operations Research Prof. G. Srinivasan Department of Management Studies Indian Institute of Technology, Madras Lecture minus 26 Heuristics for TSP In this lecture, we continue our discussion

More information

6 ROUTING PROBLEMS VEHICLE ROUTING PROBLEMS. Vehicle Routing Problem, VRP:

6 ROUTING PROBLEMS VEHICLE ROUTING PROBLEMS. Vehicle Routing Problem, VRP: 6 ROUTING PROBLEMS VEHICLE ROUTING PROBLEMS Vehicle Routing Problem, VRP: Customers i=1,...,n with demands of a product must be served using a fleet of vehicles for the deliveries. The vehicles, with given

More information

Greedy algorithms Or Do the right thing

Greedy algorithms Or Do the right thing Greedy algorithms Or Do the right thing March 1, 2005 1 Greedy Algorithm Basic idea: When solving a problem do locally the right thing. Problem: Usually does not work. VertexCover (Optimization Version)

More information

CME 305: Discrete Mathematics and Algorithms Instructor: Reza Zadeh HW#3 Due at the beginning of class Thursday 02/26/15

CME 305: Discrete Mathematics and Algorithms Instructor: Reza Zadeh HW#3 Due at the beginning of class Thursday 02/26/15 CME 305: Discrete Mathematics and Algorithms Instructor: Reza Zadeh (rezab@stanford.edu) HW#3 Due at the beginning of class Thursday 02/26/15 1. Consider a model of a nonbipartite undirected graph in which

More information

Adaptive Large Neighborhood Search

Adaptive Large Neighborhood Search Adaptive Large Neighborhood Search Heuristic algorithms Giovanni Righini University of Milan Department of Computer Science (Crema) VLSN and LNS By Very Large Scale Neighborhood (VLSN) local search, we

More information

7KH9HKLFOH5RXWLQJSUREOHP

7KH9HKLFOH5RXWLQJSUREOHP 7K9KO5RXWJSUREOP Given a set of vehicles with a certain capacity located at a depot and a set of customers with different demands at various locations, the vehicle routing problem (VRP) is how to satisfy

More information