Constructive and destructive algorithms


1 Constructive and destructive algorithms. Heuristic algorithms. Giovanni Righini, University of Milan, Department of Computer Science (Crema)

2 Constructive algorithms

In combinatorial optimization problems every solution x is a subset of the ground set E. A constructive heuristic iteratively updates a subset x^(t) as follows:
1. it starts from the empty subset x^(0) = ∅ (which is trivially a subset of an optimal solution);
2. at each iteration t, it selects an element i^(t) ∈ E as the best one among the admissible elements (trying to keep x^(t) a subset of a feasible and optimal solution);
3. it inserts i^(t) into the current subset: x^(t+1) := x^(t) ∪ {i^(t)} (no backtracking is allowed);
4. it goes back to step 2 until the solution is complete (i.e., if further enlarged, the solution would no longer be feasible).

3 Search space

We define the search space F_A of a constructive algorithm A as the set of all subsets x^(t) that the algorithm considers admissible. The search space always contains the empty set: ∅ ∈ F_A. The search space often contains all feasible solutions: X ⊆ F_A. If F_A included only subsets of feasible solutions, then A would never be misguided. Unfortunately, for some problems it may be difficult to state:
- whether feasible solutions exist;
- whether a given subset x is contained in any feasible solution;
- whether a given subset x is contained in any optimal solution.

4 Search space

We need to define F_A so that the test x ∈ F_A is easy. Examples:
- the search space F_A for the KP contains the subsets of feasible weight (F_A = X);
- ... for the MDP contains the subsets with no more than k elements (X ⊆ F_A);
- ... for the TSP contains the arc subsets with no cycles and no branchings.

5 Partial solutions

The elements of the search space are called partial solutions when the search space coincides with the set of all subsets of the feasible solutions. This is desirable because every element of F_A can then be enlarged until a feasible solution is obtained. However, with respect to the set of partial solutions, the search space F_A can be:
- smaller, because A considers only some partial solutions (e.g., Prim's algorithm for the MST);
- larger, because it is difficult to characterize partial solutions and a weaker characterization is used (i.e., a necessary but not sufficient condition is tested);
- generically different.
When F_A contains subsets that are not partial solutions, the test x^(t) ∪ {i^(t)} ∈ F does not guarantee that A will reach a feasible solution. In this case a constructive algorithm can fail.

6 Graphical representation

[Figure: the search space F within 2^E, containing the feasible set X and the optimal solutions X*.] The algorithm visits a sequence of subsets ∅ = x^(0) ⊂ ... ⊂ x^(k), stopping at an optimal solution x* ∈ X*, at a feasible but sub-optimal solution x' ∈ X, or at an infeasible subset x ∉ X. Example: the MST.

7 Graphical representation

[Figure: the search space F within 2^E, containing the feasible set X and the optimal solutions X*.] The algorithm visits a sequence of subsets ∅ = x^(0) ⊂ ... ⊂ x^(k), stopping at an optimal solution x* ∈ X*, at a feasible but sub-optimal solution x' ∈ X, or at an infeasible subset x ∉ X. Example: the KP, the MDP, etc.

8 Graphical representation

[Figure: the search space F within 2^E, containing the feasible set X and the optimal solutions X*.] The algorithm visits a sequence of subsets ∅ = x^(0) ⊂ ... ⊂ x^(k), stopping at an optimal solution x* ∈ X*, at a feasible but sub-optimal solution x' ∈ X, or at an infeasible subset x ∉ X. Example: the TSP on an incomplete graph.

9 Termination

A constructive heuristic terminates when the insertion of any further element into the current subset would not keep it within the search space. Define Ext(x) = {i ∈ E \ x : x ∪ {i} ∈ F}. The end test is: Ext(x) = ∅ ⇒ STOP. Sometimes all visited subsets are feasible solutions (e.g., the KP). Often only the last subset is a feasible solution (e.g., the MDP, the TSP). The final solution is the best one found during the execution: usually it is the last one.

10 Pseudo-code

A constructive algorithm (for minimization) can be described as follows.

Algorithm Greedy(I)
  x := ∅; x* := ∅; f* := +∞; { best incumbent solution }
  While Ext(x) ≠ ∅ do
    i := arg min_{i ∈ Ext(x)} φ(i, x);
    x := x ∪ {i};
    If x ∈ X and f(x) < f* then x* := x; f* := f(x);
  Return (x*, f*);

The sequence of subsets visited by the algorithm depends on:
- the search space F, i.e. the set Ext(x);
- the function φ : E × F → R used to select how to extend the current subset from x^(t) to x^(t+1).
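The pseudo-code above can be sketched in Python as follows (the function names and the toy instance in the usage note are illustrative, not from the slides):

```python
import math

def greedy(ext, phi, is_feasible, f):
    """Generic constructive heuristic (minimization), following the
    slide's pseudo-code.
    ext(x): admissible elements keeping x inside the search space F;
    phi(i, x): selection criterion (smaller is better);
    is_feasible(x): membership test for the solution space X;
    f(x): objective function."""
    x = set()
    best_x, best_f = None, math.inf
    while ext(x):
        i = min(ext(x), key=lambda e: phi(e, x))
        x.add(i)                      # no backtracking
        if is_feasible(x) and f(x) < best_f:
            best_x, best_f = set(x), f(x)
    return best_x, best_f
```

For instance, instantiated on "pick exactly 2 elements of minimum total cost" with costs {a: 3, b: 1, c: 2}, it returns ({b, c}, 3).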

11 Efficiency and effectiveness

A constructive heuristic finds the optimum when the current subset x^(t) at every iteration t is contained in an optimal solution. This property holds for x^(0) = ∅, but it is usually lost at some later iteration t. A constructive heuristic executes at most n = |E| iterations. The complexity of each iteration is determined by:
1. the computation of Ext(x);
2. the evaluation of φ(i, x) for each i ∈ Ext(x);
3. the selection of the minimum value and the corresponding element;
4. the update of x (and possibly of other data structures).
In general the resulting complexity, T(n) = n (γ_Ext(n) + γ_φ(n)), turns out to be a low-order polynomial.

12 General characteristics

Constructive heuristics:
1. are intuitive;
2. are rather simple to design, analyze and implement;
3. are very efficient;
4. can be very different in effectiveness according to the problem:
   - optimality is guaranteed;
   - approximation is guaranteed;
   - a feasible solution is guaranteed, but of unreliable quality (most cases);
   - not even a feasible solution is guaranteed.
Hence, it is necessary to study the problem before designing any algorithm.

13 Typical uses

Constructive algorithms are used:
1. when they provide the optimal solution;
2. when the available computing time is short (e.g., on-line optimization, on-demand systems, ...);
3. when the instances are very large or require complex calculations;
4. as components within other algorithms:
   - initialization step for local search algorithms;
   - base procedure for recombination algorithms.

14 A special case

A notable special case occurs when:
- the domain of the objective function is extended from X to F;
- the algorithm selects the admissible element that produces the best subset: φ(i, x) = f(x ∪ {i}).

Algorithm Look-Ahead(I)
  x := ∅; x* := ∅; f* := +∞; { best incumbent solution }
  While Ext(x) ≠ ∅ do
    i := arg min_{i ∈ Ext(x)} f(x ∪ {i});
    x := x ∪ {i};
    If x ∈ X and f(x) < f* then x* := x; f* := f(x);
  Return (x*, f*);

15 The KP with unit weights

Consider a KP where all items have the same volume v: the capacity constraint reduces to a cardinality constraint |x| ≤ ⌊V/v⌋.

Algorithm GreedyUKP(I)
  x := ∅; x* := ∅; f* := 0; { best incumbent solution }
  While |x| < ⌊V/v⌋ do
    i := arg max_{i ∈ E\x} φ_i;
    x := x ∪ {i};
    If x ∈ X and f(x) > f* then x* := x; f* := f(x);
  Return (x*, f*);

- Every subset x can be extended until |x| = ⌊V/v⌋.
- Any element of E \ x extends x in an admissible way.
- The objective function is additive, hence f(x ∪ {i}) = f(x) + φ_i and arg max_{i ∈ E\x} f(x ∪ {i}) = arg max_{i ∈ E\x} φ_i.
- Every visited subset is better than the previous one.

16 Example

Items E = {A, B, C, D, E, F} with values φ_i; v_i = 1 for each i ∈ E; V = 4. The algorithm does the following iterations:
1. x := ∅;
2. since |x| = 0 < 4, it selects i := A and updates x := {A};
3. since |x| = 1 < 4, it selects i := D and updates x := {A, D};
4. since |x| = 2 < 4, it selects i := C and updates x := {A, C, D};
5. since |x| = 3 < 4, it selects i := E and updates x := {A, C, D, E};
6. since |x| = 4, it stops.
This algorithm always finds the optimal solution.

17 The KP

From a set N of items of volume v_i, i ∈ N, select a maximum-value subset fitting in a knapsack of given capacity V.

Algorithm GreedyKP(I)
  x := ∅; x* := ∅; f* := 0; { best incumbent solution }
  While Ext(x) ≠ ∅ do
    i := arg max_{i ∈ Ext(x)} φ_i;
    x := x ∪ {i};
  Return (x, f(x));

- Only some elements of E \ x extend x in an admissible way: Ext(x) = {i ∈ E \ x : Σ_{j∈x} v_j + v_i ≤ V}.
- The objective function is additive, hence f(x ∪ {i}) = f(x) + φ_i and arg max_{i ∈ Ext(x)} f(x ∪ {i}) = arg max_{i ∈ Ext(x)} φ_i.
- Every visited subset is better than the previous ones.

18 Example

Items E = {A, B, C, D, E, F} with values φ_i and volumes v_i; V = 8. The algorithm does the following iterations:
1. x := ∅;
2. since Ext(x) ≠ ∅, it selects i := A and updates x := {A};
3. since Ext(x) ≠ ∅, it selects i := D and updates x := {A, D};
4. since Ext(x) = ∅, it stops.
This algorithm does not find the optimal solution x* = {A, C, E}.
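A Python sketch of GreedyKP; the numeric values of the slide's table did not survive the transcription, so the instance below is hypothetical, chosen only so that the value-greedy choice is sub-optimal:

```python
def greedy_kp(items, capacity):
    """Greedy for the knapsack problem: repeatedly insert the
    admissible (fitting) item of maximum value, with no backtracking.
    items: dict name -> (value, volume)."""
    x, weight = set(), 0
    while True:
        ext = [i for i in items
               if i not in x and weight + items[i][1] <= capacity]
        if not ext:                   # Ext(x) is empty: stop
            break
        i = max(ext, key=lambda i: items[i][0])
        x.add(i)
        weight += items[i][1]
    return x, sum(items[i][0] for i in x)

# Hypothetical instance: name -> (value, volume), capacity 8
items = {'A': (10, 4), 'B': (3, 7), 'C': (6, 2), 'D': (8, 4), 'E': (5, 2)}
```

Here the greedy takes A then D (value 18), while {A, C, E} fits with value 21, illustrating the slide's point.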

19 The TSP

The constructive algorithm selects the minimum-cost edge among those that:
- do not produce sub-tours;
- do not produce vertices with degree larger than 2.

Algorithm GreedyTSP(I)
  x := ∅; x* := ∅; f* := +∞;
  While Ext(x) ≠ ∅ do
    i := arg min_{i ∈ Ext(x)} c_i;
    x := x ∪ {i};
    If x ∈ X and f(x) < f* then x* := x; f* := f(x);
  Return (x*, f*);

20 Example

The algorithm does the following iterations:
1. x := ∅;
2. since Ext(x) ≠ ∅, it selects i := (3, 5) and updates x;
3. since Ext(x) ≠ ∅, it selects i := (2, 4) and updates x;
4. since Ext(x) ≠ ∅, it selects i := (2, 5) and updates x;
5. since Ext(x) ≠ ∅, it selects i := (1, 4) and updates x;
6. since Ext(x) = ∅, it stops.
The algorithm does not find a feasible solution.
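The greedy edge-selection scheme can be sketched as follows, with a union-find structure to detect sub-tours. On incomplete graphs the function may return fewer than n edges, i.e. fail, as in the slide's example; the instances in the usage test are illustrative:

```python
def greedy_tsp_edges(n, edges):
    """Greedy edge selection for the TSP: repeatedly take the cheapest
    edge creating neither a vertex of degree 3 nor a premature
    sub-tour. edges: dict (u, v) -> cost on nodes 0..n-1."""
    parent = list(range(n))

    def find(u):                      # union-find for sub-tour detection
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u

    degree = [0] * n
    tour = []
    for (u, v), c in sorted(edges.items(), key=lambda e: e[1]):
        if degree[u] == 2 or degree[v] == 2:
            continue                  # would create a branching
        if find(u) == find(v) and len(tour) < n - 1:
            continue                  # would close a premature sub-tour
        tour.append((u, v))
        degree[u] += 1
        degree[v] += 1
        parent[find(u)] = find(v)
        if len(tour) == n:
            break
    return tour                       # len(tour) < n means failure
```

On a complete 4-node graph it returns a tour of 4 edges with every degree equal to 2; on a path-shaped incomplete graph it stops with fewer edges, i.e. without a feasible solution.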

21 Example: the Maximum Diversity Problem

The algorithm inserts one element at a time, selecting the one that yields the maximum sum of distances to the current subset.

Algorithm GreedyMDP(I)
  x := ∅; x* := ∅; f* := 0; { best incumbent solution }
  While |x| < k do
    i := arg max_{i ∈ E\x} Σ_{j∈x} d_ij;
    x := x ∪ {i};
  x* := x; f* := f(x);
  Return (x*, f*);

- The subset x is extendable until |x| = k.
- Any element of E \ x extends x in a feasible way.
- The objective function is quadratic, hence f(x ∪ {i}) = f(x) + 2 Σ_{j∈x} d_ij + d_ii and arg max_{i ∈ E\x} f(x ∪ {i}) = arg max_{i ∈ E\x} Σ_{j∈x} d_ij.
- Every generated subset is better than the previous one.

22 Example: the Maximum Diversity Problem

The algorithm has some drawbacks:
1. at the first iteration all elements are equivalent (f({i}) = 0);
2. the final result may be sub-optimal even if the algorithm starts with the two most distant elements (e.g., (1, 7) in the figure).
A common remedy: the algorithm is restarted selecting each element in turn as the first one.
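A Python sketch of the MDP greedy together with the multistart remedy just mentioned; the distance data in the test are illustrative:

```python
from itertools import combinations

def mdp_greedy(d, k, first):
    """Greedy for the Maximum Diversity Problem: starting from `first`,
    repeatedly add the element maximizing the sum of distances to the
    current subset. d: symmetric dict-of-dicts of distances."""
    x = {first}
    while len(x) < k:
        i = max((i for i in d if i not in x),
                key=lambda i: sum(d[i][j] for j in x))
        x.add(i)
    return x

def mdp_multistart(d, k):
    """Restart the greedy from every element and keep the best subset
    (the slide's remedy for the arbitrary first choice)."""
    def f(x):                         # sum of pairwise distances
        return sum(d[i][j] for i, j in combinations(x, 2))
    return max((mdp_greedy(d, k, s) for s in d), key=f)
```

With three points on a line at positions 0, 1 and 10 and k = 2, the multistart version returns the two extreme points.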

23 Characterization

We would like to characterize the problems for which greedy constructive algorithms find the optimum, guarantee an approximation, guarantee a feasible solution, or guarantee nothing. No general characterization is known for the optimization problems that can be solved exactly by greedy constructive algorithms. However, partial characterizations exist.

24 The additive case

Assume that:
1. the objective function is additive: f(x) = Σ_{i∈x} φ_i;
2. the solutions are the maximal subsets (bases): X = B_F = {Y ∈ F : ∄ Y' ∈ F with Y ⊂ Y'}.
This is a frequent case (KP, MDP, TSP; but not the SCP). In this case the greedy constructive algorithm always finds an optimal solution if and only if (E, F) is a matroid embedding. The definition of matroid embedding is rather complex; we limit ourselves to the special case of matroids.

25 Matroids

A matroid is a system of sets (E, F) with F ⊆ 2^E such that:
- trivial axiom: ∅ ∈ F;
- inheritance axiom: if x ∈ F and y ⊆ x, then y ∈ F. (Any subset of the system can be constructed by inserting its elements in any order.)
- exchange axiom: ∀ x, y ∈ F with |x| = |y| + 1, ∃ i ∈ x \ y such that y ∪ {i} ∈ F. (Any subset of the system can be extended with a suitable element belonging to any other subset of larger cardinality.)
These conditions:
- hold for the KP with unit volumes, the MST, ...
- do not hold for the general KP, the TSP, ...
- hold for the MDP, but there the objective function is not additive.

26 The uniform matroid and the KP with unit volume

F = {x ⊆ E : |x| ≤ ⌊V/v⌋}
- Trivial axiom: the empty set is feasible.
- Inheritance axiom: if x satisfies the cardinality constraint, then all its subsets also do.
- Exchange axiom: if x and y satisfy the cardinality constraint and |x| = |y| + 1, any element in x and not in y can extend y without violating the constraint.
In the general KP the first two axioms hold, but not the third one. Example: if V = 6 and v = [ ], the subsets x = {3, 4, 5} and y = {1, 2} are in F, but no element of x can extend y.

27 Graphic matroid and minimum spanning tree

F = {x ⊆ E : x does not contain cycles}
- Trivial axiom: the empty set is acyclic;
- Inheritance axiom: if x is acyclic, all its subsets are;
- Exchange axiom: if x and y are acyclic and |x| = |y| + 1, one can always insert a suitable edge of x into y without producing cycles (not all edges of x are suitable).
[Figure: a graph with vertices A, ..., H.] x = {(A, D), (D, H), (E, F), (B, F), (C, G)}; y = {(A, E), (B, E), (E, F), (E, H)}.
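On the graphic matroid the additive greedy is exactly Kruskal's MST algorithm; a sketch where the independence (acyclicity) test is implemented with union-find:

```python
def kruskal(n, edges):
    """Greedy algorithm on the graphic matroid: scan edges by
    non-decreasing cost and keep each edge whose insertion leaves the
    current set acyclic, i.e. independent in the matroid. Returns an
    MST of a connected graph on nodes 0..n-1.
    edges: dict (u, v) -> cost."""
    parent = list(range(n))

    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u

    tree = []
    for (u, v), c in sorted(edges.items(), key=lambda e: e[1]):
        if find(u) != find(v):        # acyclicity = independence test
            tree.append((u, v))
            parent[find(u)] = find(v)
    return tree
```

Because (E, F) is a matroid and the cost is additive, this greedy is guaranteed to return an optimal basis (a minimum spanning tree).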

28 Greedoids

A greedoid is a system of sets (E, F) with F ⊆ 2^E such that:
- trivial axiom: ∅ ∈ F;
- accessibility axiom: if x ∈ F and x ≠ ∅, then ∃ i ∈ x : x \ {i} ∈ F. (One can obtain any subset in F by inserting its elements in a suitable order; this axiom is weaker than the inheritance axiom.)
- exchange axiom: ∀ x, y ∈ F with |x| = |y| + 1, ∃ i ∈ x \ y such that y ∪ {i} ∈ F.
In general the constructive algorithm does not guarantee optimality on greedoids. But it works for the minimum spanning tree (Prim's algorithm):
- E = edge set of a graph;
- F = set of the trees incident to a given vertex v_1.

29 Greedoids with strong exchange axiom

In the case of the minimum spanning tree the algorithm works because of the strong exchange axiom:

∀ x ∈ F, ∀ y ∈ B_F such that x ⊆ y, ∀ i ∈ E \ y such that x ∪ {i} ∈ F, ∃ j ∈ y \ x such that x ∪ {j} ∈ F and (y ∪ {i}) \ {j} ∈ F.

Given a basis and one of its subsets (from which the basis is accessible), if there is an element that leads the subset away from the basis, there is another one that keeps it on the right track, and the two elements can be exchanged in the basis.

Remark. The optimality of the constructive algorithm depends on:
- the properties of the problem (additive objective function, bases as feasible solutions);
- the search space F chosen for designing the algorithm.

30 Constructive algorithms for approximating the KP

In most cases the problem does not have a search space F with the properties of matroids. We must take the constraints of the problem into account:
1. in the definition of F;
2. in the definition of φ(i, x).
In the case of the KP we can sort the items according to their value/volume ratio: φ(i, x) = φ_i / v_i. The resulting constructive algorithm can produce arbitrarily bad solutions, but with a slight modification it becomes 2-approximate.

31 Example

Items E = {A, B, C, D, E, F} with values φ_i, volumes v_i and ratios φ_i/v_i; V = 8. The algorithm does the following iterations:
1. x := ∅;
2. it selects i := E and updates x := {E};
3. it selects i := C and updates x := {C, E};
4. it selects i := D and updates x := {C, D, E};
5. it selects i := F and updates x := {C, D, E, F} (A does not fit);
6. since Ext(x) = ∅, it stops.
The solution value is 14; the optimal solution is x* = {A, C, E} with value 15.

32 Example

There are bad instances:

E    A   B
φ   10  90
v    1  10
φ/v 10   9

V = 10. The algorithm does the following iterations:
1. x := ∅;
2. it selects i := A and updates x := {A};
3. since Ext(x) = ∅, it stops.
The solution has value 10, but the optimum is 90: there are instances with arbitrarily large error. This happens when the first discarded item has large volume and value.

33 Approximating the KP

This is a 2-approximation algorithm for the KP:
1. Start with the empty subset: x^(0) = ∅.
2. At each iteration t select the item i_t with maximum value/volume ratio in E \ x.
3. If it fits, insert it (x^(t) = x^(t−1) ∪ {i_t}) and go back to step 2.
4. Otherwise consider an alternative solution containing only the selected item: x' = {i_t}.
5. Take the best among x and x': f_A = max{f(x), f(x')}.
It is easy to prove that the sum of the values of the two solutions is an upper bound on the optimum:
f(x) + f(x') = Σ_{l=1}^{t} φ_{i_l} ≥ f*.
Hence the maximum of the two values is at least half of the optimum:
f_A = max{f(x), f(x')} ≥ (f(x) + f(x')) / 2 ≥ f* / 2.
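A sketch of the modified greedy; it assumes every item fits in the knapsack by itself (v_i ≤ V), and the usage test reuses the two-item bad instance of the previous slide (φ = 10, 90; v = 1, 10; V = 10):

```python
def kp_two_approx(items, capacity):
    """2-approximation for the KP: ratio greedy, stopped at the first
    item that does not fit; the answer is the better of the greedy set
    and the singleton made of that rejected item.
    items: dict name -> (value, volume); assumes volume <= capacity."""
    order = sorted(items, key=lambda i: items[i][0] / items[i][1],
                   reverse=True)
    x, weight = set(), 0
    for i in order:
        if weight + items[i][1] <= capacity:
            x.add(i)                  # step 3: it fits, keep going
            weight += items[i][1]
        else:
            alt = {i}                 # step 4: rejected item alone
            fx = sum(items[j][0] for j in x)
            return max((x, fx), (alt, items[i][0]),
                       key=lambda s: s[1])
    return x, sum(items[j][0] for j in x)
```

On the bad instance the plain greedy stops at value 10, while this variant returns the singleton {B} of value 90, the optimum.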

34 Constructive algorithms: pure and adaptive

A constructive algorithm is defined as:
- pure constructive if the function φ(·) depends only on the new element i;
- adaptive if the function φ(·) depends also on the current solution x.

Consider now the Bin Packing Problem: a set A of weighted items must be partitioned into a minimum number of subsets, corresponding to capacitated bins taken from a set B. The ground set E = A × B contains all item-bin assignments (i, j) complying with:
- the assignment constraints;
- the capacity constraints.
The search space F_A contains the partial solutions.

35 First-Fit Decreasing

Algorithm FFD:
1. Sort the bins arbitrarily.
2. Start with the empty subset: x^(0) = ∅.
3. Select the maximum-weight unassigned item: i = arg max{v_i}.
4. Select the first bin j that can receive item i (an empty bin, if necessary).
5. Insert the new assignment into the solution: x^(t) = x^(t−1) ∪ {(i, j)}; go back to step 3 until all items are assigned.
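A compact Python sketch of FFD, where list positions play the role of the arbitrarily sorted bins; the item volumes in the test are illustrative:

```python
def first_fit_decreasing(volumes, capacity):
    """First-Fit Decreasing for bin packing: scan items by
    non-increasing volume, placing each one in the first bin where it
    fits, opening a new bin when necessary. Returns the bins'
    contents as lists of volumes."""
    bins, loads = [], []
    for v in sorted(volumes, reverse=True):
        for j, load in enumerate(loads):
            if load + v <= capacity:  # first bin that can receive it
                bins[j].append(v)
                loads[j] += v
                break
        else:                         # no open bin fits: open a new one
            bins.append([v])
            loads.append(v)
    return bins
```

For example, volumes [5, 5, 4, 3, 3] with capacity 10 are packed into two full bins.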

36 First-Fit Decreasing The solution is not optimal in general.

37 First-Fit Decreasing

However, it is 2-approximate. Proof:
- At least z ≥ ⌈Σ_{i∈A} v_i / V⌉ bins are needed.
- Sort the used bins by non-increasing loaded volume.
- All used bins but the last one contain more than V/2 units of volume (otherwise one could merge them with the last one).
- The total volume is larger than that contained in the bins but the last one, i.e. in the first z_A − 1 bins: Σ_{i∈A} v_i > (z_A − 1) V/2.
- Therefore (z_A − 1) ≤ 2 Σ_{i∈A} v_i / V ≤ 2z, i.e. z_A ≤ 2z + 1.

38 First-Fit Decreasing

The approximation factor 2 holds when items are selected in any order. Intuitively, however, it is better to select the largest items first:
- every item in bin j has volume strictly larger than the residual capacity of all previous bins;
- by keeping the small items for the end, we obtain many bins with small residual capacity.
When items are sorted by non-increasing volume, an even better approximation bound can be proved: z_A ≤ (11/9) z + 1.

39 The Nearest Neighbour algorithm for the TSP

Consider the TSP on a complete graph G = (N, A), where F_A includes the paths leaving node 1:
- the constructive algorithm always finds a feasible solution;
- the solution can be arbitrarily bad.
The resulting algorithm is called Nearest Neighbour. Since F_A contains paths leaving node 1, Ext(x) contains the arcs outgoing from the last node of the path x.

40 The Nearest Neighbour algorithm for the TSP

1. One starts with an empty arc set x^(0) = ∅, representing a degenerate path outgoing from node 1 (the optimal solution certainly visits node 1).
2. One searches for the minimum-cost arc outgoing from the last node of x: (i, j) = arg min_{(h,k) ∈ Ext(x) : h = Last(x)} c_hk.
3. If j ≠ 1, go back to step 2; otherwise, stop (Ext(x) forces the return to node 1 only at the last iteration).
The algorithm is very intuitive and has complexity Θ(n²). If the triangle inequality holds, it is log n-approximate.
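A sketch of Nearest Neighbour on a cost-matrix representation (node 0 plays the role of node 1; the matrix in the test is illustrative):

```python
def nearest_neighbour(c, start=0):
    """Nearest Neighbour for the TSP on a complete graph: from the
    last node of the current path, always move to the cheapest
    unvisited node, closing the tour at the end.
    c: symmetric cost matrix as a list of lists."""
    n = len(c)
    tour, visited = [start], {start}
    while len(tour) < n:
        last = tour[-1]
        nxt = min((j for j in range(n) if j not in visited),
                  key=lambda j: c[last][j])
        tour.append(nxt)
        visited.add(nxt)
    cost = sum(c[tour[k]][tour[(k + 1) % n]] for k in range(n))
    return tour, cost
```

Each iteration scans the unvisited nodes once, giving the Θ(n²) overall complexity stated above.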

41 Example

[Figure: Nearest Neighbour tours on a complete graph, starting from different nodes.] The optimal solution is not found from any starting node.

42-45 A larger example

[Figures: the Nearest Neighbour tour being built on a larger instance.]

46 Extensions of the constructive algorithm

The constructive algorithm adds one element at a time to the solution. It is possible to generalize this with:
1. algorithms that insert several elements at each iteration: the selection function φ(I+, x) chooses subsets I+ to be inserted, instead of single elements;
2. algorithms that also delete elements, inserting more elements than they delete: the selection function φ(I+, I−, x) chooses subsets I+ to insert and I− to delete, with |I+| > |I−|.
The basic idea remains the same: visiting the search space without going back. The main problem is to define the subsets so that the optimization of the selection function remains a polynomial problem:
- subsets of bounded size (e.g., |I+| = 2 and |I−| = 1);
- subsets efficiently computable (shortest paths, ...).

47 Insertion algorithms for the TSP

Several heuristics for the TSP define the search space F as the set of all cycles:
- one cannot obtain a cycle from another one by adding an arc;
- one can do it by removing an arc and inserting two arcs.
1. One starts with an empty arc set x^(0) = ∅, representing a degenerate cycle from node 1 to itself;
2. one selects an arc (i, j) to be removed and a node k to be added;
3. if the cycle does not visit all nodes, go back to step 2; otherwise stop.
After n iterations, it provides a feasible solution.

48 Insertion algorithms for the TSP

The selection function φ(I+, I−, x) must find a node k (out of the current cycle) and an arc (in the current cycle); the possible choices are O(n²):
- n − |x| possible nodes to be inserted;
- |x| possible arcs to be removed.
The Cheapest Insertion algorithm uses as a selection criterion φ(I+, I−, x) = f(x ∪ I+ \ I−). Since f(x ∪ I+ \ I−) = f(x) + c_{s_i,k} + c_{k,s_{i+1}} − c_{s_i,s_{i+1}}, then
arg min_{I+, I−} φ(I+, I−, x) = arg min_{s_i, k} (c_{s_i,k} + c_{k,s_{i+1}} − c_{s_i,s_{i+1}}).
The computational cost of evaluating φ thus decreases from Θ(n) to Θ(1). The algorithm is 2-approximate if the triangle inequality holds.
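A sketch of Cheapest Insertion under these definitions. Starting from a two-node cycle (the start node and its nearest neighbour) is one common way to bootstrap the degenerate cycle, an assumption not fixed by the slides; the cost matrix in the test is illustrative:

```python
def cheapest_insertion(c, start=0):
    """Cheapest Insertion for the TSP: grow a cycle by inserting, at
    each iteration, the node/arc pair minimizing the cost increase
    c[i][k] + c[k][j] - c[i][j]. c: symmetric cost matrix."""
    n = len(c)
    # initial cycle: start node plus its nearest neighbour (assumption)
    first = min((j for j in range(n) if j != start),
                key=lambda j: c[start][j])
    cycle = [start, first]
    while len(cycle) < n:
        best = None                   # (delta, position, node)
        for k in range(n):
            if k in cycle:
                continue
            for pos in range(len(cycle)):
                i, j = cycle[pos], cycle[(pos + 1) % len(cycle)]
                delta = c[i][k] + c[k][j] - c[i][j]
                if best is None or delta < best[0]:
                    best = (delta, pos + 1, k)
        cycle.insert(best[1], best[2])
    cost = sum(c[cycle[k]][cycle[(k + 1) % n]] for k in range(n))
    return cycle, cost
```

The double loop over candidate nodes and arcs is exactly the O(n²) choice set described above, with each delta evaluated in Θ(1).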

49 Example One starts as with the Nearest Neighbour algorithm.

50 Example But a cycle is generated, instead of a path.

51 The Cheapest Insertion algorithm for the TSP

The algorithm executes n iterations:
- at each iteration t, it evaluates t(n − t) arc-node pairs;
- each evaluation requires constant time;
- each evaluation possibly updates the best move;
- at each iteration the best insertion is selected and the end test is performed.
The overall complexity is Θ(n³). It can be reduced to Θ(n² log n) by suitable data structures.

52 The Nearest Insertion algorithm for the TSP

The Cheapest Insertion algorithm tends to select nodes close to the cycle x: to minimize c_{s_i,k} + c_{k,s_{i+1}} − c_{s_i,s_{i+1}}, the costs c_{s_i,k} and c_{k,s_{i+1}} must be small. The Nearest Insertion algorithm is quite similar, but:
- selection criterion: it selects the node k closest to the cycle x: k = arg min_{l ∉ N(x)} min_{s_i ∈ x} c_{s_i,l};
- insertion criterion: it selects the arc (s_i, s_{i+1}) minimizing c_{s_i,k} + c_{k,s_{i+1}} − c_{s_i,s_{i+1}}.
It is 2-approximate if the triangle inequality holds.

53 Example One starts as in Nearest Neighbour and Cheapest Insertion

54 Example A cycle is formed as in Cheapest Insertion

55 Example But the cycle grows differently: every time the closest node is inserted.

56 Example When the cycle visits all nodes, the algorithm stops.

57 The Nearest Insertion algorithm for the TSP

The algorithm executes n iterations:
- at each iteration t, it evaluates the (n − t) nodes out of the cycle and finds the one closest to the cycle (Θ(t(n − t)) time);
- it then evaluates the t arcs of the cycle and finds the most convenient one to be removed;
- each evaluation possibly updates the best move;
- at each iteration it updates the cycle and checks the end test.
The overall complexity is Θ(n³). It can be reduced to Θ(n²) by keeping, for each node out of the cycle, its closest node in the cycle (in a suitable data structure, updated at each iteration).

58 The Farthest Insertion algorithm for the TSP

The Farthest Insertion algorithm is quite similar, but:
- selection criterion: it selects the node k farthest from the cycle x: k = arg max_{l ∉ N(x)} d(l, x), with d(l, x) = min_{i ∈ x} c_{i,l};
- insertion criterion: it selects the arc (s_i, s_{i+1}) minimizing c_{s_i,k} + c_{k,s_{i+1}} − c_{s_i,s_{i+1}}.
It is log n-approximate if the triangle inequality holds.

59 Example: the farthest node is selected.

60 Example: the farthest node is selected.

61 Example Each node is inserted in the best possible way.

62 Example The cycle quickly spans the whole region of the points.

63 Example When the cycle is complete, the algorithm stops.

64 The Farthest Insertion algorithm for the TSP

The algorithm executes n iterations:
- at each iteration t, it evaluates the (n − t) nodes out of the cycle and finds the one farthest from the cycle (Θ(t(n − t)) time);
- it then evaluates the t arcs of the cycle and finds the most convenient one to be removed;
- each evaluation possibly updates the best move;
- at each iteration it updates the cycle and checks the end test.
The overall complexity is Θ(n³). It can be reduced to Θ(n²) by keeping, for each node out of the cycle, its closest node in the cycle (updated at each iteration).

65 Distance Heuristic for the Steiner Tree Problem (STP)

Given a graph G = (V, E) with costs on the edges (c : E → N) and a subset of special vertices U ⊆ V, find a minimum-cost tree connecting the special vertices. As search space F we adopt the set of trees incident to the special vertex 1. A classical constructive algorithm, inserting one element at a time:
- yields solutions with redundant edges (sub-optimal);
- cannot easily detect whether the edges it inserts are redundant or not.
The idea is to add one special vertex at a time and to stop when all special vertices are connected:
- to connect a vertex, a whole path must be inserted, not a single edge;
- finding a shortest path between a chosen vertex and the current tree x is easy.
At each iteration one can efficiently compute the set I+ of new elements.
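A sketch of the distance heuristic, using a multi-source Dijkstra from the current tree at each iteration; the graph in the test is illustrative, not the slide's figure:

```python
import heapq

def distance_heuristic(adj, specials):
    """Distance heuristic for the Steiner Tree Problem: start from one
    special vertex and repeatedly attach the special vertex closest to
    the current tree, via a shortest path (multi-source Dijkstra from
    the tree). adj: dict u -> {v: cost}. Returns the tree's edges."""
    specials = list(specials)
    in_tree = {specials[0]}           # degenerate tree
    tree_edges = set()
    while not set(specials) <= in_tree:
        # shortest-path distances from the whole current tree
        dist = {u: 0 for u in in_tree}
        pred = {}
        heap = [(0, u) for u in in_tree]
        heapq.heapify(heap)
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist[u]:
                continue
            for v, c in adj[u].items():
                if v not in dist or d + c < dist[v]:
                    dist[v], pred[v] = d + c, u
                    heapq.heappush(heap, (d + c, v))
        # closest special vertex not yet connected
        k = min((s for s in specials if s not in in_tree),
                key=lambda s: dist[s])
        while k not in in_tree:       # insert the whole path I+
            tree_edges.add(frozenset((k, pred[k])))
            in_tree.add(k)
            k = pred[k]
    return tree_edges
```

On a star-shaped graph with a non-special center s and specials a, b, c (each at cost 1 from s, cost 3 from each other), the heuristic connects them through s, which here is the optimal Steiner tree.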

66 Example

1. We start with the special vertex a (degenerate tree);
2. the closest special vertex is b, through the path (a, e, d, b): x = {(a, e), (e, d), (d, b)};
3. the closest special vertex is g, through the path (g, h, d): x = {(a, e), (e, d), (d, b), (g, h), (h, d)};
4. all special vertices are in the solution: stop.


69 Example

In this example the optimal solution is found; in general, the algorithm is 2-approximate. It looks equivalent to searching for a minimum spanning tree on an auxiliary graph with:
- only the special vertices;
- edges corresponding to shortest paths;
but shortest paths may share some edges.

70 Destructive heuristics

Destructive algorithms follow the complementary approach:
1. Start with the complete ground set E.
2. Delete an element at each iteration, selecting it so that the search remains in the search space F and a suitable criterion φ(i, x) is optimized.
3. Stop when there is no way to remain in the search space.

Pseudo-code (minimization problem):

Algorithm Stingy(I)
  x := E; x* := ∅; z* := +∞; { best incumbent solution }
  While Red(x) ≠ ∅ do
    i := arg max_{i ∈ Red(x)} φ(i, x);
    x := x \ {i};
    If x ∈ X and z(x) < z* then x* := x; z* := z(x);
  Return (x*, z*);

where Red(x) = {i ∈ x : x \ {i} ∈ F}.

71 Destructive heuristics

A destructive heuristic:
- requires more iterations;
- has a larger probability of taking wrong decisions.
Also the evaluation of Red(x) and φ(i, x) is often more complex. However, it may be useful to combine a constructive heuristic with a destructive one:
- when the constructive heuristic provides redundant solutions;
- when the constructive heuristic visits several feasible solutions and the best one is not necessarily the last one.
The auxiliary destructive heuristic:
- starts from the solution x* of the constructive heuristic, instead of E;
- often has a search space coinciding with the solution space X: F = X, so Red(x) = {i ∈ x : x \ {i} ∈ X};
- usually uses the objective as selection criterion: φ(i, x) = z(x \ {i}).

72 Constructive/destructive heuristic for the SCP

[Table: costs c and coverage matrix A.]
1. The constructive heuristic selects columns 1, 2, 4 and 3 (each of them covers some uncovered rows).
2. The solution is redundant: column 2 can be deleted.
3. In this example a post-processing with a trivial destructive heuristic provides the optimal solution x* = {1, 3, 4}.
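A sketch combining a constructive SCP greedy with the destructive post-processing. The greedy criterion (cost per newly covered row) is an assumption, since the slide does not specify it, and the instance in the test is illustrative:

```python
def scp_constructive_destructive(costs, cover, rows):
    """Greedy SCP heuristic followed by destructive post-processing.
    Construction: repeatedly pick the column minimizing cost per newly
    covered row (assumed criterion). Destruction: drop redundant
    columns, most expensive first.
    costs: dict col -> cost; cover: dict col -> set of covered rows."""
    x, uncovered = set(), set(rows)
    while uncovered:
        j = min((j for j in costs if cover[j] & uncovered),
                key=lambda j: costs[j] / len(cover[j] & uncovered))
        x.add(j)
        uncovered -= cover[j]
    for j in sorted(x, key=lambda j: -costs[j]):   # destructive phase
        if all(any(r in cover[k] for k in x - {j}) for r in rows):
            x.remove(j)                            # j is redundant
    return x
```

On an instance where a cheap "diagonal" column is picked first and later becomes redundant, the destructive phase removes it and returns the two-column optimum.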


More information

NP Completeness. Andreas Klappenecker [partially based on slides by Jennifer Welch]

NP Completeness. Andreas Klappenecker [partially based on slides by Jennifer Welch] NP Completeness Andreas Klappenecker [partially based on slides by Jennifer Welch] Dealing with NP-Complete Problems Dealing with NP-Completeness Suppose the problem you need to solve is NP-complete. What

More information

Module 6 NP-Complete Problems and Heuristics

Module 6 NP-Complete Problems and Heuristics Module 6 NP-Complete Problems and Heuristics Dr. Natarajan Meghanathan Professor of Computer Science Jackson State University Jackson, MS 97 E-mail: natarajan.meghanathan@jsums.edu Optimization vs. Decision

More information

Approximation Algorithms

Approximation Algorithms Approximation Algorithms Subhash Suri June 5, 2018 1 Figure of Merit: Performance Ratio Suppose we are working on an optimization problem in which each potential solution has a positive cost, and we want

More information

Outline. CS38 Introduction to Algorithms. Approximation Algorithms. Optimization Problems. Set Cover. Set cover 5/29/2014. coping with intractibility

Outline. CS38 Introduction to Algorithms. Approximation Algorithms. Optimization Problems. Set Cover. Set cover 5/29/2014. coping with intractibility Outline CS38 Introduction to Algorithms Lecture 18 May 29, 2014 coping with intractibility approximation algorithms set cover TSP center selection randomness in algorithms May 29, 2014 CS38 Lecture 18

More information

Methods and Models for Combinatorial Optimization Exact methods for the Traveling Salesman Problem

Methods and Models for Combinatorial Optimization Exact methods for the Traveling Salesman Problem Methods and Models for Combinatorial Optimization Exact methods for the Traveling Salesman Problem L. De Giovanni M. Di Summa The Traveling Salesman Problem (TSP) is an optimization problem on a directed

More information

Minimum cost spanning tree

Minimum cost spanning tree Minimum cost spanning tree Doctoral course Optimization on graphs - Lecture 2.2 Giovanni Righini January 15 th, 2013 Definitions - 1 A graph G = (V,E) is a tree if and only if it is connected and acyclic.

More information

UNIT 3. Greedy Method. Design and Analysis of Algorithms GENERAL METHOD

UNIT 3. Greedy Method. Design and Analysis of Algorithms GENERAL METHOD UNIT 3 Greedy Method GENERAL METHOD Greedy is the most straight forward design technique. Most of the problems have n inputs and require us to obtain a subset that satisfies some constraints. Any subset

More information

11. APPROXIMATION ALGORITHMS

11. APPROXIMATION ALGORITHMS 11. APPROXIMATION ALGORITHMS load balancing center selection pricing method: vertex cover LP rounding: vertex cover generalized load balancing knapsack problem Lecture slides by Kevin Wayne Copyright 2005

More information

Steiner Trees and Forests

Steiner Trees and Forests Massachusetts Institute of Technology Lecturer: Adriana Lopez 18.434: Seminar in Theoretical Computer Science March 7, 2006 Steiner Trees and Forests 1 Steiner Tree Problem Given an undirected graph G

More information

Integer Programming. Xi Chen. Department of Management Science and Engineering International Business School Beijing Foreign Studies University

Integer Programming. Xi Chen. Department of Management Science and Engineering International Business School Beijing Foreign Studies University Integer Programming Xi Chen Department of Management Science and Engineering International Business School Beijing Foreign Studies University Xi Chen (chenxi0109@bfsu.edu.cn) Integer Programming 1 / 42

More information

22 Elementary Graph Algorithms. There are two standard ways to represent a

22 Elementary Graph Algorithms. There are two standard ways to represent a VI Graph Algorithms Elementary Graph Algorithms Minimum Spanning Trees Single-Source Shortest Paths All-Pairs Shortest Paths 22 Elementary Graph Algorithms There are two standard ways to represent a graph

More information

Approximation Algorithms

Approximation Algorithms Presentation for use with the textbook, Algorithm Design and Applications, by M. T. Goodrich and R. Tamassia, Wiley, 2015 Approximation Algorithms Tamassia Approximation Algorithms 1 Applications One of

More information

Technische Universität München, Zentrum Mathematik Lehrstuhl für Angewandte Geometrie und Diskrete Mathematik. Combinatorial Optimization (MA 4502)

Technische Universität München, Zentrum Mathematik Lehrstuhl für Angewandte Geometrie und Diskrete Mathematik. Combinatorial Optimization (MA 4502) Technische Universität München, Zentrum Mathematik Lehrstuhl für Angewandte Geometrie und Diskrete Mathematik Combinatorial Optimization (MA 4502) Dr. Michael Ritter Problem Sheet 4 Homework Problems Problem

More information

(Refer Slide Time: 01:00)

(Refer Slide Time: 01:00) Advanced Operations Research Prof. G. Srinivasan Department of Management Studies Indian Institute of Technology, Madras Lecture minus 26 Heuristics for TSP In this lecture, we continue our discussion

More information

CS 4407 Algorithms. Lecture 8: Circumventing Intractability, using Approximation and other Techniques

CS 4407 Algorithms. Lecture 8: Circumventing Intractability, using Approximation and other Techniques CS 4407 Algorithms Lecture 8: Circumventing Intractability, using Approximation and other Techniques Prof. Gregory Provan Department of Computer Science University College Cork CS 4010 1 Lecture Outline

More information

Module 6 NP-Complete Problems and Heuristics

Module 6 NP-Complete Problems and Heuristics Module 6 NP-Complete Problems and Heuristics Dr. Natarajan Meghanathan Professor of Computer Science Jackson State University Jackson, MS 39217 E-mail: natarajan.meghanathan@jsums.edu P, NP-Problems Class

More information

Models of distributed computing: port numbering and local algorithms

Models of distributed computing: port numbering and local algorithms Models of distributed computing: port numbering and local algorithms Jukka Suomela Adaptive Computing Group Helsinki Institute for Information Technology HIIT University of Helsinki FMT seminar, 26 February

More information

22 Elementary Graph Algorithms. There are two standard ways to represent a

22 Elementary Graph Algorithms. There are two standard ways to represent a VI Graph Algorithms Elementary Graph Algorithms Minimum Spanning Trees Single-Source Shortest Paths All-Pairs Shortest Paths 22 Elementary Graph Algorithms There are two standard ways to represent a graph

More information

CS 580: Algorithm Design and Analysis. Jeremiah Blocki Purdue University Spring 2018

CS 580: Algorithm Design and Analysis. Jeremiah Blocki Purdue University Spring 2018 CS 580: Algorithm Design and Analysis Jeremiah Blocki Purdue University Spring 2018 Chapter 11 Approximation Algorithms Slides by Kevin Wayne. Copyright @ 2005 Pearson-Addison Wesley. All rights reserved.

More information

Last topic: Summary; Heuristics and Approximation Algorithms Topics we studied so far:

Last topic: Summary; Heuristics and Approximation Algorithms Topics we studied so far: Last topic: Summary; Heuristics and Approximation Algorithms Topics we studied so far: I Strength of formulations; improving formulations by adding valid inequalities I Relaxations and dual problems; obtaining

More information

Approximation Algorithms: The Primal-Dual Method. My T. Thai

Approximation Algorithms: The Primal-Dual Method. My T. Thai Approximation Algorithms: The Primal-Dual Method My T. Thai 1 Overview of the Primal-Dual Method Consider the following primal program, called P: min st n c j x j j=1 n a ij x j b i j=1 x j 0 Then the

More information

1 Variations of the Traveling Salesman Problem

1 Variations of the Traveling Salesman Problem Stanford University CS26: Optimization Handout 3 Luca Trevisan January, 20 Lecture 3 In which we prove the equivalence of three versions of the Traveling Salesman Problem, we provide a 2-approximate algorithm,

More information

Decreasing a key FIB-HEAP-DECREASE-KEY(,, ) 3.. NIL. 2. error new key is greater than current key 6. CASCADING-CUT(, )

Decreasing a key FIB-HEAP-DECREASE-KEY(,, ) 3.. NIL. 2. error new key is greater than current key 6. CASCADING-CUT(, ) Decreasing a key FIB-HEAP-DECREASE-KEY(,, ) 1. if >. 2. error new key is greater than current key 3.. 4.. 5. if NIL and.

More information

Math 3012 Combinatorial Optimization Worksheet

Math 3012 Combinatorial Optimization Worksheet Math 3012 Combinatorial Optimization Worksheet Combinatorial Optimization is the way in which combinatorial thought is applied to real world optimization problems. Optimization entails achieving the sufficient

More information

Slides on Approximation algorithms, part 2: Basic approximation algorithms

Slides on Approximation algorithms, part 2: Basic approximation algorithms Approximation slides Slides on Approximation algorithms, part : Basic approximation algorithms Guy Kortsarz Approximation slides Finding a lower bound; the TSP example The optimum TSP cycle P is an edge

More information

Presentation for use with the textbook, Algorithm Design and Applications, by M. T. Goodrich and R. Tamassia, Wiley, Approximation Algorithms

Presentation for use with the textbook, Algorithm Design and Applications, by M. T. Goodrich and R. Tamassia, Wiley, Approximation Algorithms Presentation for use with the textbook, Algorithm Design and Applications, by M. T. Goodrich and R. Tamassia, Wiley, 2015 Approximation Algorithms 1 Bike Tour Suppose you decide to ride a bicycle around

More information

LECTURES 3 and 4: Flows and Matchings

LECTURES 3 and 4: Flows and Matchings LECTURES 3 and 4: Flows and Matchings 1 Max Flow MAX FLOW (SP). Instance: Directed graph N = (V,A), two nodes s,t V, and capacities on the arcs c : A R +. A flow is a set of numbers on the arcs such that

More information

Copyright 2000, Kevin Wayne 1

Copyright 2000, Kevin Wayne 1 Guessing Game: NP-Complete? 1. LONGEST-PATH: Given a graph G = (V, E), does there exists a simple path of length at least k edges? YES. SHORTEST-PATH: Given a graph G = (V, E), does there exists a simple

More information

Decision Problems. Observation: Many polynomial algorithms. Questions: Can we solve all problems in polynomial time? Answer: No, absolutely not.

Decision Problems. Observation: Many polynomial algorithms. Questions: Can we solve all problems in polynomial time? Answer: No, absolutely not. Decision Problems Observation: Many polynomial algorithms. Questions: Can we solve all problems in polynomial time? Answer: No, absolutely not. Definition: The class of problems that can be solved by polynomial-time

More information

ACO and other (meta)heuristics for CO

ACO and other (meta)heuristics for CO ACO and other (meta)heuristics for CO 32 33 Outline Notes on combinatorial optimization and algorithmic complexity Construction and modification metaheuristics: two complementary ways of searching a solution

More information

Sankalchand Patel College of Engineering - Visnagar Department of Computer Engineering and Information Technology. Assignment

Sankalchand Patel College of Engineering - Visnagar Department of Computer Engineering and Information Technology. Assignment Class: V - CE Sankalchand Patel College of Engineering - Visnagar Department of Computer Engineering and Information Technology Sub: Design and Analysis of Algorithms Analysis of Algorithm: Assignment

More information

3 No-Wait Job Shops with Variable Processing Times

3 No-Wait Job Shops with Variable Processing Times 3 No-Wait Job Shops with Variable Processing Times In this chapter we assume that, on top of the classical no-wait job shop setting, we are given a set of processing times for each operation. We may select

More information

Mathematical Tools for Engineering and Management

Mathematical Tools for Engineering and Management Mathematical Tools for Engineering and Management Lecture 8 8 Dec 0 Overview Models, Data and Algorithms Linear Optimization Mathematical Background: Polyhedra, Simplex-Algorithm Sensitivity Analysis;

More information

Advanced Algorithms Class Notes for Monday, October 23, 2012 Min Ye, Mingfu Shao, and Bernard Moret

Advanced Algorithms Class Notes for Monday, October 23, 2012 Min Ye, Mingfu Shao, and Bernard Moret Advanced Algorithms Class Notes for Monday, October 23, 2012 Min Ye, Mingfu Shao, and Bernard Moret Greedy Algorithms (continued) The best known application where the greedy algorithm is optimal is surely

More information

Module 6 P, NP, NP-Complete Problems and Approximation Algorithms

Module 6 P, NP, NP-Complete Problems and Approximation Algorithms Module 6 P, NP, NP-Complete Problems and Approximation Algorithms Dr. Natarajan Meghanathan Associate Professor of Computer Science Jackson State University Jackson, MS 39217 E-mail: natarajan.meghanathan@jsums.edu

More information

Basic Approximation algorithms

Basic Approximation algorithms Approximation slides Basic Approximation algorithms Guy Kortsarz Approximation slides 2 A ρ approximation algorithm for problems that we can not solve exactly Given an NP-hard question finding the optimum

More information

5. Lecture notes on matroid intersection

5. Lecture notes on matroid intersection Massachusetts Institute of Technology Handout 14 18.433: Combinatorial Optimization April 1st, 2009 Michel X. Goemans 5. Lecture notes on matroid intersection One nice feature about matroids is that a

More information

Introduction to Optimization

Introduction to Optimization Introduction to Optimization Greedy Algorithms October 5, 2015 École Centrale Paris, Châtenay-Malabry, France Dimo Brockhoff INRIA Lille Nord Europe Course Overview 2 Date Topic Mon, 21.9.2015 Introduction

More information

Chapter 9. Greedy Technique. Copyright 2007 Pearson Addison-Wesley. All rights reserved.

Chapter 9. Greedy Technique. Copyright 2007 Pearson Addison-Wesley. All rights reserved. Chapter 9 Greedy Technique Copyright 2007 Pearson Addison-Wesley. All rights reserved. Greedy Technique Constructs a solution to an optimization problem piece by piece through a sequence of choices that

More information

DESIGN AND ANALYSIS OF ALGORITHMS GREEDY METHOD

DESIGN AND ANALYSIS OF ALGORITHMS GREEDY METHOD 1 DESIGN AND ANALYSIS OF ALGORITHMS UNIT II Objectives GREEDY METHOD Explain and detail about greedy method Explain the concept of knapsack problem and solve the problems in knapsack Discuss the applications

More information

Solution for Homework set 3

Solution for Homework set 3 TTIC 300 and CMSC 37000 Algorithms Winter 07 Solution for Homework set 3 Question (0 points) We are given a directed graph G = (V, E), with two special vertices s and t, and non-negative integral capacities

More information

CS270 Combinatorial Algorithms & Data Structures Spring Lecture 19:

CS270 Combinatorial Algorithms & Data Structures Spring Lecture 19: CS270 Combinatorial Algorithms & Data Structures Spring 2003 Lecture 19: 4.1.03 Lecturer: Satish Rao Scribes: Kevin Lacker and Bill Kramer Disclaimer: These notes have not been subjected to the usual scrutiny

More information

CS261: Problem Set #2

CS261: Problem Set #2 CS261: Problem Set #2 Due by 11:59 PM on Tuesday, February 9, 2016 Instructions: (1) Form a group of 1-3 students. You should turn in only one write-up for your entire group. (2) Submission instructions:

More information

Rigidity, connectivity and graph decompositions

Rigidity, connectivity and graph decompositions First Prev Next Last Rigidity, connectivity and graph decompositions Brigitte Servatius Herman Servatius Worcester Polytechnic Institute Page 1 of 100 First Prev Next Last Page 2 of 100 We say that a framework

More information

Solutions for the Exam 6 January 2014

Solutions for the Exam 6 January 2014 Mastermath and LNMB Course: Discrete Optimization Solutions for the Exam 6 January 2014 Utrecht University, Educatorium, 13:30 16:30 The examination lasts 3 hours. Grading will be done before January 20,

More information

6. Lecture notes on matroid intersection

6. Lecture notes on matroid intersection Massachusetts Institute of Technology 18.453: Combinatorial Optimization Michel X. Goemans May 2, 2017 6. Lecture notes on matroid intersection One nice feature about matroids is that a simple greedy algorithm

More information

Coping with the Limitations of Algorithm Power Exact Solution Strategies Backtracking Backtracking : A Scenario

Coping with the Limitations of Algorithm Power Exact Solution Strategies Backtracking Backtracking : A Scenario Coping with the Limitations of Algorithm Power Tackling Difficult Combinatorial Problems There are two principal approaches to tackling difficult combinatorial problems (NP-hard problems): Use a strategy

More information

Design and Analysis of Algorithms

Design and Analysis of Algorithms CSE 101, Winter 018 D/Q Greed SP s DP LP, Flow B&B, Backtrack Metaheuristics P, NP Design and Analysis of Algorithms Lecture 8: Greed Class URL: http://vlsicad.ucsd.edu/courses/cse101-w18/ Optimization

More information

1. Lecture notes on bipartite matching

1. Lecture notes on bipartite matching Massachusetts Institute of Technology 18.453: Combinatorial Optimization Michel X. Goemans February 5, 2017 1. Lecture notes on bipartite matching Matching problems are among the fundamental problems in

More information

CMPSCI 311: Introduction to Algorithms Practice Final Exam

CMPSCI 311: Introduction to Algorithms Practice Final Exam CMPSCI 311: Introduction to Algorithms Practice Final Exam Name: ID: Instructions: Answer the questions directly on the exam pages. Show all your work for each question. Providing more detail including

More information

11. APPROXIMATION ALGORITHMS

11. APPROXIMATION ALGORITHMS Coping with NP-completeness 11. APPROXIMATION ALGORITHMS load balancing center selection pricing method: weighted vertex cover LP rounding: weighted vertex cover generalized load balancing knapsack problem

More information

Traveling Salesman Problem (TSP) Input: undirected graph G=(V,E), c: E R + Goal: find a tour (Hamiltonian cycle) of minimum cost

Traveling Salesman Problem (TSP) Input: undirected graph G=(V,E), c: E R + Goal: find a tour (Hamiltonian cycle) of minimum cost Traveling Salesman Problem (TSP) Input: undirected graph G=(V,E), c: E R + Goal: find a tour (Hamiltonian cycle) of minimum cost Traveling Salesman Problem (TSP) Input: undirected graph G=(V,E), c: E R

More information

Adaptive Large Neighborhood Search

Adaptive Large Neighborhood Search Adaptive Large Neighborhood Search Heuristic algorithms Giovanni Righini University of Milan Department of Computer Science (Crema) VLSN and LNS By Very Large Scale Neighborhood (VLSN) local search, we

More information

Partha Sarathi Mandal

Partha Sarathi Mandal MA 515: Introduction to Algorithms & MA353 : Design and Analysis of Algorithms [3-0-0-6] Lecture 39 http://www.iitg.ernet.in/psm/indexing_ma353/y09/index.html Partha Sarathi Mandal psm@iitg.ernet.in Dept.

More information

Pre-requisite Material for Course Heuristics and Approximation Algorithms

Pre-requisite Material for Course Heuristics and Approximation Algorithms Pre-requisite Material for Course Heuristics and Approximation Algorithms This document contains an overview of the basic concepts that are needed in preparation to participate in the course. In addition,

More information

Department of Computer Science and Engineering Analysis and Design of Algorithm (CS-4004) Subject Notes

Department of Computer Science and Engineering Analysis and Design of Algorithm (CS-4004) Subject Notes Page no: Department of Computer Science and Engineering Analysis and Design of Algorithm (CS-00) Subject Notes Unit- Greedy Technique. Introduction: Greedy is the most straight forward design technique.

More information

Conflict Graphs for Combinatorial Optimization Problems

Conflict Graphs for Combinatorial Optimization Problems Conflict Graphs for Combinatorial Optimization Problems Ulrich Pferschy joint work with Andreas Darmann and Joachim Schauer University of Graz, Austria Introduction Combinatorial Optimization Problem CO

More information

APPROXIMATION ALGORITHMS FOR GEOMETRIC PROBLEMS

APPROXIMATION ALGORITHMS FOR GEOMETRIC PROBLEMS APPROXIMATION ALGORITHMS FOR GEOMETRIC PROBLEMS Subhas C. Nandy (nandysc@isical.ac.in) Advanced Computing and Microelectronics Unit Indian Statistical Institute Kolkata 70010, India. Organization Introduction

More information

ALGORITHM CHEAPEST INSERTION

ALGORITHM CHEAPEST INSERTION Version for STSP ALGORITHM CHEAPEST INSERTION. Choose the two furthest vertices i and k as initial subtour (c ik = max {c hj : (h, j) A}); set V := V \ {i} \ {k} (set of the unvisited vertices).. For each

More information

Methods and Models for Combinatorial Optimization Heuristis for Combinatorial Optimization

Methods and Models for Combinatorial Optimization Heuristis for Combinatorial Optimization Methods and Models for Combinatorial Optimization Heuristis for Combinatorial Optimization L. De Giovanni 1 Introduction Solution methods for Combinatorial Optimization Problems (COPs) fall into two classes:

More information

Final. Name: TA: Section Time: Course Login: Person on Left: Person on Right: U.C. Berkeley CS170 : Algorithms, Fall 2013

Final. Name: TA: Section Time: Course Login: Person on Left: Person on Right: U.C. Berkeley CS170 : Algorithms, Fall 2013 U.C. Berkeley CS170 : Algorithms, Fall 2013 Final Professor: Satish Rao December 16, 2013 Name: Final TA: Section Time: Course Login: Person on Left: Person on Right: Answer all questions. Read them carefully

More information

An iteration of Branch and Bound One iteration of Branch and Bound consists of the following four steps: Some definitions. Branch and Bound.

An iteration of Branch and Bound One iteration of Branch and Bound consists of the following four steps: Some definitions. Branch and Bound. ranch and ound xamples and xtensions jesla@man.dtu.dk epartment of Management ngineering Technical University of enmark ounding ow do we get ourselves a bounding function? Relaxation. Leave out some constraints.

More information

5 MST and Greedy Algorithms

5 MST and Greedy Algorithms 5 MST and Greedy Algorithms One of the traditional and practically motivated problems of discrete optimization asks for a minimal interconnection of a given set of terminals (meaning that every pair will

More information

Lecture 8: The Traveling Salesman Problem

Lecture 8: The Traveling Salesman Problem Lecture 8: The Traveling Salesman Problem Let G = (V, E) be an undirected graph. A Hamiltonian cycle of G is a cycle that visits every vertex v V exactly once. Instead of Hamiltonian cycle, we sometimes

More information

Approximation Algorithms

Approximation Algorithms 18.433 Combinatorial Optimization Approximation Algorithms November 20,25 Lecturer: Santosh Vempala 1 Approximation Algorithms Any known algorithm that finds the solution to an NP-hard optimization problem

More information

Exact Algorithms for NP-hard problems

Exact Algorithms for NP-hard problems 24 mai 2012 1 Why do we need exponential algorithms? 2 3 Why the P-border? 1 Practical reasons (Jack Edmonds, 1965) For practical purposes the difference between algebraic and exponential order is more

More information

Greedy Algorithms 1. For large values of d, brute force search is not feasible because there are 2 d

Greedy Algorithms 1. For large values of d, brute force search is not feasible because there are 2 d Greedy Algorithms 1 Simple Knapsack Problem Greedy Algorithms form an important class of algorithmic techniques. We illustrate the idea by applying it to a simplified version of the Knapsack Problem. Informally,

More information

Chapter 9 Graph Algorithms

Chapter 9 Graph Algorithms Chapter 9 Graph Algorithms 2 Introduction graph theory useful in practice represent many real-life problems can be if not careful with data structures 3 Definitions an undirected graph G = (V, E) is a

More information

Lecture 4: Graph Algorithms

Lecture 4: Graph Algorithms Lecture 4: Graph Algorithms Definitions Undirected graph: G =(V, E) V finite set of vertices, E finite set of edges any edge e = (u,v) is an unordered pair Directed graph: edges are ordered pairs If e

More information

CSE 417 Branch & Bound (pt 4) Branch & Bound

CSE 417 Branch & Bound (pt 4) Branch & Bound CSE 417 Branch & Bound (pt 4) Branch & Bound Reminders > HW8 due today > HW9 will be posted tomorrow start early program will be slow, so debugging will be slow... Review of previous lectures > Complexity

More information

Stanford University CS261: Optimization Handout 1 Luca Trevisan January 4, 2011

Stanford University CS261: Optimization Handout 1 Luca Trevisan January 4, 2011 Stanford University CS261: Optimization Handout 1 Luca Trevisan January 4, 2011 Lecture 1 In which we describe what this course is about and give two simple examples of approximation algorithms 1 Overview

More information

5 MST and Greedy Algorithms

5 MST and Greedy Algorithms 5 MST and Greedy Algorithms One of the traditional and practically motivated problems of discrete optimization asks for a minimal interconnection of a given set of terminals (meaning that every pair will

More information

Design and Analysis of Algorithms

Design and Analysis of Algorithms CSE 101, Winter 2018 Design and Analysis of Algorithms Lecture 9: Minimum Spanning Trees Class URL: http://vlsicad.ucsd.edu/courses/cse101-w18/ Goal: MST cut and cycle properties Prim, Kruskal greedy algorithms

More information

Linear programming and duality theory

Linear programming and duality theory Linear programming and duality theory Complements of Operations Research Giovanni Righini Linear Programming (LP) A linear program is defined by linear constraints, a linear objective function. Its variables

More information

1 The Traveling Salesman Problem

1 The Traveling Salesman Problem Comp 260: Advanced Algorithms Tufts University, Spring 2018 Prof. Lenore Cowen Scribe: Duc Nguyen Lecture 3a: The Traveling Salesman Problem 1 The Traveling Salesman Problem The Traveling Salesman Problem

More information

1 Minimum Spanning Trees (MST) b 2 3 a. 10 e h. j m

1 Minimum Spanning Trees (MST) b 2 3 a. 10 e h. j m Minimum Spanning Trees (MST) 8 0 e 7 b 3 a 5 d 9 h i g c 8 7 6 3 f j 9 6 k l 5 m A graph H(U,F) is a subgraph of G(V,E) if U V and F E. A subgraph H(U,F) is called spanning if U = V. Let G be a graph with

More information

Solutions to Assignment# 4

Solutions to Assignment# 4 Solutions to Assignment# 4 Liana Yepremyan 1 Nov.12: Text p. 651 problem 1 Solution: (a) One example is the following. Consider the instance K = 2 and W = {1, 2, 1, 2}. The greedy algorithm would load

More information

/633 Introduction to Algorithms Lecturer: Michael Dinitz Topic: Approximation algorithms Date: 11/27/18

/633 Introduction to Algorithms Lecturer: Michael Dinitz Topic: Approximation algorithms Date: 11/27/18 601.433/633 Introduction to Algorithms Lecturer: Michael Dinitz Topic: Approximation algorithms Date: 11/27/18 22.1 Introduction We spent the last two lectures proving that for certain problems, we can

More information