Local search. Heuristic algorithms. Giovanni Righini. University of Milan Department of Computer Science (Crema)


2 Exchange algorithms. In combinatorial optimization every solution x is a subset of E. An exchange heuristic iteratively updates a subset x(t):
1. it starts from a feasible solution x(0) obtained in some way (often with a constructive heuristic);
2. it exchanges elements in the current solution with elements not in it, yielding other feasible solutions x_{A,D} = x ∪ A \ D, with A ⊆ E \ x and D ⊆ x;
3. at each step t, it selects which elements must be added and deleted according to a suitable criterion: (A*, D*) = arg min_{A,D} φ(x, A, D);
4. it generates the new current solution x(t+1) := x(t) ∪ A* \ D*;
5. when a suitable termination test is satisfied, it stops; otherwise, it goes back to step 2.

3 Neighborhood. An exchange heuristic is defined by:
- the sets of subsets A and D that can be used, i.e. the subset of solutions that can be generated with an exchange;
- the selection criterion φ(x, A, D).
The neighborhood N : X → 2^X is a function associating a subset of neighbor solutions N(x) ⊆ X with each feasible solution x ∈ X. One can define a search graph in which nodes represent feasible solutions and arcs link each solution x with the solutions in its neighborhood N(x). Given a search graph:
- a run of the algorithm corresponds to a path;
- the traversal of an arc is called a move, because it transforms a solution into another one by moving some elements.

4 Distance-based neighborhoods. Every solution x ∈ X can be represented by its incidence vector:
x_i = 1 if i ∈ x, x_i = 0 if i ∈ E \ x.
The Hamming distance between two incidence vectors x and x′ is the number of components in which they differ:
d_H(x, x′) = Σ_{i ∈ E} |x_i − x′_i|
With reference to the subsets, this means |x \ x′| + |x′ \ x|. The set of solutions whose Hamming distance from x is within a given threshold is a possible definition of a neighborhood (parameterized on the threshold, k):
N_Hk(x) = {x′ ∈ X : d_H(x, x′) ≤ k}
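The equivalence between the incidence-vector definition and the subset definition of the Hamming distance can be checked in a few lines. This is a minimal sketch with hypothetical example sets:

```python
# Hamming distance between two solutions represented as subsets of E.
# d_H(x, x') equals |x \ x'| + |x' \ x|, i.e. the size of the symmetric difference.

def hamming(x, xp):
    """Number of incidence-vector components in which x and x' differ."""
    return len(x ^ xp)  # symmetric difference = (x \ x') ∪ (x' \ x)

x = {1, 3, 4}
xp = {1, 2}
# The two definitions agree: 2 elements leave x, 1 enters.
assert hamming(x, xp) == len(x - xp) + len(xp - x)
print(hamming(x, xp))  # 3
```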

5 An example: the KP. The instance of the KP with E = {1, 2, 3, 4}, w = [ ] and W = 10 has the following solutions, where the subsets {1, 2, 3, 4}, {1, 2, 3} and {1, 2, 4} are not feasible. The solution x = {1, 3, 4} (in blue) has a neighborhood N_H2(x) of 7 elements (in pink). The subsets in black do not belong to the neighborhood, because their Hamming distance from x is larger than 2.

6 Operation-defined neighborhoods. Another common definition of neighborhood is operational. It is obtained by defining:
- a set O of operations that can be performed on the solutions of the problem;
- the set of solutions generated by applying the operations of O to x:
N_O(x) = {x′ ∈ X : ∃ o ∈ O : o(x) = x′}
For the KP, one can define O as:
- insertion of an element of E \ x in x;
- deletion of an element of x from x;
- exchange of an element in x with an element in E \ x.
The resulting neighborhood N_O is related to the distance-based neighborhoods, but it does not coincide with any of them: N_H1 ⊂ N_O ⊂ N_H2. These neighborhoods can be parameterized by executing sequences of k operations of O instead of one, as with distance-based neighborhoods.
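The three KP operations above can be enumerated directly. This is a sketch with hypothetical weights w and capacity W (not the instance of the earlier slide):

```python
# Operation-defined neighborhood N_O for the KP: insertions, deletions and
# exchanges, keeping only the feasible results.

def feasible(x, w, W):
    return sum(w[i] for i in x) <= W

def neighborhood_O(x, E, w, W):
    """All feasible solutions reachable from x by one insert/delete/exchange."""
    outside = E - x
    candidates = []
    candidates += [x | {j} for j in outside]                 # insertions
    candidates += [x - {i} for i in x]                       # deletions
    candidates += [x - {i} | {j} for i in x for j in outside]  # exchanges
    return [y for y in candidates if feasible(y, w, W)]

E = {1, 2, 3, 4}
w = {1: 2, 2: 3, 3: 4, 4: 4}   # hypothetical weights
x = {1, 3}
ns = neighborhood_O(x, E, w, W=10)
assert all(feasible(y, w, 10) for y in ns)
print(sorted(sorted(y) for y in ns))
```

Deletions produce solutions at Hamming distance 1, exchanges at distance 2, which is why N_O sits strictly between N_H1 and N_H2.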

7 Differences between neighborhoods. In general, operation-based neighborhoods produce solutions at different Hamming distances. For the TSP one can define a neighborhood N_S1 with the solutions that can be obtained by exchanging two vertices in the sequence of visits. The solution x = (3, 1, 4, 5, 2) has neighborhood:
N_S1(x) = {(1, 3, 4, 5, 2), (4, 1, 3, 5, 2), (5, 1, 4, 3, 2), (2, 1, 4, 5, 3), (3, 4, 1, 5, 2), (3, 5, 4, 1, 2), (3, 2, 4, 5, 1), (3, 1, 5, 4, 2), (3, 1, 2, 5, 4), (3, 1, 4, 2, 5)}
With respect to x, three arcs change if the two exchanged vertices are adjacent, four arcs change otherwise.

8 Relations between distance-based and operation-based neighborhoods. Sometimes the neighborhoods defined in the two ways coincide:
- for the MDP: N_H2, with solutions at distance 2, coincides with N_S1, defined by the exchange of an element;
- for the BPP: N_H2, with solutions at distance 2, coincides with N_T1, defined by moving an item to a different bin;
and many other examples are possible. This is typical of problems where the cardinality of the solutions is fixed: one runs a sequence of k exchanges; k elements enter and k elements leave the solution; the Hamming distance between the first and the last solution is 2k.

9 Different neighborhoods for a same problem: the CMST. A same problem may allow for different operation-based neighborhoods. In the CMST one can:
- exchange edges: (i, j) leaves, (i, n) enters;
- exchange vertices: n is moved from subtree 2 to subtree 1 (recomputing the edges to reconnect all subtrees at minimum cost).

10 Different neighborhoods for a same problem: the PMSP. For the PMSP one can define:
- a transfer neighborhood N_T1, based on the set T1 of job moves from a machine to another one;
- an exchange neighborhood N_S1, based on the set S1 of job exchanges between two different machines (one job for each machine).

11 Connectivity of the search space. An exchange heuristic can always find an optimal solution only if at least one optimal solution is reachable from any initial solution. One says that the search graph is weakly connected to the optimum when for each solution x ∈ X a path from x to x* exists. Since x* is unknown, a stronger condition is often used: the search graph is strongly connected when for each pair of solutions x, y ∈ X a path from x to y exists. An exchange heuristic should guarantee one of these conditions. This is not always possible:
- in the MDP, the neighborhood N_S1 allows one to connect any pair of solutions in at most k steps;
- in the KP and the SCP, no neighborhood N_Sk guarantees this, because the feasible solutions may have any cardinality;
- if we allow for deletions (in the KP) and insertions (in the SCP), then the search graph is connected.
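Connectivity can be verified empirically on small instances by breadth-first search on the search graph. This sketch uses hypothetical KP weights chosen so that exactly {1, 2, 3}, {1, 2, 4} and {1, 2, 3, 4} are infeasible, as in the earlier KP example (the original w vector was lost in transcription):

```python
from collections import deque
from itertools import combinations

# Empirical connectivity check for a small KP search graph where the moves
# are single-element insertions and deletions.

w, W = {1: 5, 2: 5, 3: 2, 4: 1}, 10   # hypothetical data
E = set(w)
feasible = [frozenset(s) for r in range(len(E) + 1)
            for s in combinations(E, r) if sum(w[i] for i in s) <= W]

def moves(x):
    """Insertions and deletions of a single element."""
    return [x | {j} for j in E - x] + [x - {i} for i in x]

def reachable(start):
    seen, queue = {start}, deque([start])
    while queue:
        x = queue.popleft()
        for y in moves(x):
            if y in feasible and y not in seen:
                seen.add(y)
                queue.append(y)
    return seen

# Strong connectivity: from any feasible solution we reach all the others
# (deletions never violate the capacity, so every solution connects to ∅).
assert all(reachable(x) == set(feasible) for x in feasible)
print(len(feasible), "feasible solutions, search graph strongly connected")
```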

12 Connectivity of the search space. If feasibility is defined in a sophisticated way, owing to the many constraints of the problem, then deletions, insertions and exchanges of elements may be insufficient: infeasible subsets may interrupt the paths between pairs of feasible solutions.
[Figure: three spanning trees rooted at r on vertices a–g; all vertex weights equal 1, except vertex b, whose weight is 2.]
Given W = 8, there are three feasible solutions, all with two subtrees of weight 4:
x = {(r, a), (a, b), (b, e), (r, d), (c, d), (d, g), (f, g)}
x′ = {(r, a), (a, e), (e, f), (f, g), (r, d), (c, d), (b, c)}
x″ = {(r, a), (a, b), (b, c), (r, d), (d, g), (f, g), (e, f)}
The three solutions can be reached from one another only by exchanging two edges at a time; exchanging one edge, only infeasible solutions are reached.

13 Steepest descent heuristic. The selection criterion φ(x) of the new solution in the neighborhood of the current solution is typically the objective function: at each step, the heuristic moves from the current solution to the best one in its neighborhood. To avoid cycling, one accepts only strictly improving moves.
Algorithm SteepestDescent(I, x(0))
x := x(0); Stop := false;
While Stop = false do
  x′ := arg min_{x″ ∈ N(x)} f(x″);
  If f(x′) ≥ f(x) then Stop := true;
  else x := x′;
EndWhile;
Return (x, f(x));
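The scheme above can be sketched as runnable code. The objective f and the insert/delete neighborhood below are hypothetical, chosen only to make the descent observable:

```python
# Steepest descent: move to the best neighbor while it strictly improves
# on the current solution; stop at the first local optimum.

def steepest_descent(x0, neighborhood, f):
    x = x0
    while True:
        best = min(neighborhood(x), key=f)
        if f(best) >= f(x):   # only strictly improving moves are accepted
            return x, f(x)
        x = best

# Hypothetical example: subsets of E = {0,...,4}, single insert/delete moves,
# objective minimized by any x with 0 in x and |x| = 3.
E = set(range(5))

def f(x):
    return (len(x) - 3) ** 2 + (0 not in x)

def nbhd(x):
    return [x | {j} for j in E - x] + [x - {i} for i in x]

x_star, z_star = steepest_descent(frozenset(), nbhd, f)
print(sorted(x_star), z_star)
```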

14 Local and global optimality. A steepest descent heuristic terminates when it finds a locally optimal solution, that is a solution x̄ ∈ X such that (assuming minimization) z(x̄) ≤ z(x) for each x ∈ N(x̄). A globally optimal solution is also locally optimal, but the vice versa is not true in general.
[Figure: the objective over the search space, showing a local optimum x̄, minimal only within its neighborhood N(x̄), and a global optimum x*.]

15 Exact neighborhood. An exact neighborhood is a neighborhood function N : X → 2^X such that every local optimum is also a global optimum. A trivial case occurs when the neighborhood of every solution coincides with the whole feasible region (N(x) = X for each x ∈ X), but this is useless: it is too large to explore. Nontrivial exact neighborhoods are extremely rare: the only relevant case is the exchange of basic and non-basic variables used by the simplex algorithm for linear programming problems. In general, a steepest descent heuristic finds a local optimum, not a global optimum. Its effectiveness depends on the properties of the search graph and of the objective.

16 Properties of the search graph. Some relevant properties are:
- the size of the search space X;
- the connectivity of the search space (or of the search graph);
- the diameter of the search graph, i.e. the number of arcs of the longest shortest path in it.
For instance, for the symmetric TSP on complete graphs:
- the search space contains |X| = (n − 1)! solutions;
- the vertex exchange neighborhood contains (n choose 2) = n(n − 1)/2 solutions;
- the diameter of the search graph is n − 2, because any solution can be transformed into any other by at most n − 2 exchanges. For instance, x = (1, 5, 4, 2, 3) becomes x′ = (1, 2, 3, 4, 5) in 3 steps:
x = (1, 5, 4, 2, 3) → (1, 2, 4, 5, 3) → (1, 2, 3, 5, 4) → (1, 2, 3, 4, 5) = x′

17 Properties of the search graph. Other relevant properties are:
- the density of globally optimal solutions (|X*|/|X|) and of locally optimal solutions (|X̄|/|X|): if local optima are many, it is difficult to find global optima;
- the quality of local optima compared with global optima, δ(x̄) = (z(x̄) − z(x*)) / z(x*), possibly described by an SQD diagram: if local optima are good, it may be less important to find global optima;
- the distribution of local optima in the search space: if local optima are close to one another, it is not necessary to explore the whole space.
The exact evaluation of these indicators would require an exhaustive exploration of the search space. In practice, we limit ourselves to probing it:
- this analysis may require a lot of time;
- it may provide misleading results.

18 Example: the TSP. Typical results with the TSP on complete graphs with Euclidean costs:
- the average Hamming distance between two local optima is about n: local optima are concentrated in a small sub-region of X;
- the average Hamming distance between two local optima is larger than between local and global optima: global optima are likely to lie in between local optima;
- the FDC (Fitness-Distance Correlation) diagram links the quality δ(x̄) with the distance from the global optima d_H(x̄, X*): better local optima are closer to global optima.

19 Fitness-Distance Correlation. If the correlation between quality and closeness to global optima is strong, it is more convenient to search for good initial solutions, because they guide the local search to good local optima: it is better to intensify than to diversify. On the contrary, if the correlation is weak:
- a good initialization is less important;
- it is better to diversify than to intensify.
This happens, for instance, with the Quadratic Assignment Problem (QAP).

20 Landscape. A landscape is a triple (X, N, z), where
- X is the search space, or the feasible region;
- N : X → 2^X is the neighborhood function;
- z : X → ℕ is the objective function.
One can see the landscape as the search graph weighted on the vertices by the objective. The effectiveness of exchange heuristics depends on the landscape: rugged landscapes imply many local optima and hence less effective heuristics.

21 Different types of landscapes There is a wide variety of landscapes.

22 Autocorrelation coefficient. The complexity of a landscape can be estimated empirically by:
1. performing a random walk on the search graph;
2. determining the sequence of objective values z(1), …, z(t_max);
3. computing their average value z̄ = (1/t_max) Σ_{t=1}^{t_max} z(t);
4. computing the empirical autocorrelation coefficient
r(i) = [Σ_{t=1}^{t_max−i} (z(t) − z̄)(z(t+i) − z̄) / (t_max − i)] / [Σ_{t=1}^{t_max} (z(t) − z̄)² / t_max]
It is a function of i that starts from r(0) = 1 and usually decreases.
If r(i) remains close to 1, the landscape is smooth: neighbor solutions have values similar to the current one, there are few local optima and the steepest descent heuristic is effective.
If r(i) decreases rapidly, the landscape is rugged: neighbor solutions have values quite different from the current one, there are many local optima and the steepest descent heuristic is less effective.
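The four steps above can be sketched directly. The random walk below (flipping one bit of a binary string and recording the number of ones) is a hypothetical, deliberately smooth landscape:

```python
import random

# Empirical autocorrelation r(i) of the objective along a random walk,
# following the formula on the slide.

def autocorrelation(z, i):
    t_max = len(z)
    mean = sum(z) / t_max
    num = sum((z[t] - mean) * (z[t + i] - mean)
              for t in range(t_max - i)) / (t_max - i)
    den = sum((v - mean) ** 2 for v in z) / t_max
    return num / den

random.seed(1)
x = [0] * 20
walk = []
for _ in range(2000):          # random walk: flip one random bit per step
    k = random.randrange(len(x))
    x[k] ^= 1
    walk.append(sum(x))        # objective: number of ones

print(autocorrelation(walk, 0))   # 1.0 by definition
print(autocorrelation(walk, 1))   # close to 1: a smooth landscape
```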

23 Plateaux. One can analyse the search graph by dividing it into objective levels: a plateau of value z is a subset of solutions of value z that are adjacent in the search graph. Large plateaux hamper the selection of the move, because they make it dependent on the order in which the neighbor solutions are visited. Hence a too smooth landscape is not an advantage! Example (PMSP): all transfers and exchanges between machines 1 and 3 leave the objective function value unchanged (the other moves worsen it).

24 Attraction basins. An alternative subdivision of the search graph is based on the concept of attraction basin of a local optimum x̄: it is the set of solutions x(0) ∈ X such that the steepest descent heuristic starting from x(0) terminates in x̄. The steepest descent heuristic is
- effective if attraction basins are few and large (especially when global optima have larger attraction basins);
- ineffective if attraction basins are many and small (especially if global optima have smaller attraction basins).

25 Complexity.
Algorithm SteepestDescent(I, x(0))
x := x(0); Stop := false;
While Stop = false do
  x′ := arg min_{x″ ∈ N(x)} z(x″);
  If z(x′) ≥ z(x) then Stop := true;
  else x := x′;
EndWhile;
Return (x, z(x));
The complexity of the steepest descent heuristic depends on:
1. the number of steps: this depends on the structure of the search graph (width of the attraction basins), which is difficult to estimate a priori;
2. the selection of a best solution in the neighborhood: this depends on how the search is done.

26 Exploring the neighborhood. Two main strategies are used:
1. exhaustive search: all neighbor solutions are evaluated; the complexity of each iteration is the product of the number of neighbor solutions (|N(x)|) and the cost of evaluating each of them (γ_N(|E|, x)). Sometimes it is not easy to evaluate only neighbor solutions: one visits a superset of the neighborhood; for each element the feasibility is checked; for the feasible elements the cost is evaluated;
2. efficient exploration of the neighborhood: instead of visiting the whole neighborhood, one finds the optimal neighbor solution by solving an auxiliary problem. Only some special neighborhoods allow for this.

27 Exhaustive exploration of the neighborhood.
Algorithm SteepestDescent(I, x(0))
x := x(0); Stop := false;
While Stop = false do
  x′ := x; { x′ := arg min_{x̄ ∈ N(x)} z(x̄) }
  For each x̄ ∈ N(x) do
    If z(x̄) < z(x′) then x′ := x̄;
  EndFor;
  If z(x′) ≥ z(x) then Stop := true;
  else x := x′;
EndWhile;
Return (x, z(x));
The complexity is the product of three terms:
1. the number of iterations t_max to reach the local optimum;
2. the number of solutions |N(x(t))| visited at each iteration;
3. the time γ_N(x(t), E) to evaluate the objective.
In general |N(x(t))| and γ_N(x(t), E) have a maximum which is independent of x(t).

28 Evaluating the objective: the additive case. The first expedient to accelerate an exchange algorithm is minimizing the time needed to evaluate the objective. If an exchange inserts or deletes a small number of elements, updating z(x) instead of recomputing it reduces the cost from γ_N(|E|) to O(1): it is enough
- to add φ_j for each element j inserted in x;
- to subtract φ_j for each element j deleted from x.
In the KP and the CMSTP one can define the neighborhood N_S1 generated by the exchange of an element i ∈ x with an element j ∈ E \ x. Moving from x to x′ = x \ {i} ∪ {j}, the objective varies by
δ(x, i, j) = z(x \ {i} ∪ {j}) − z(x) = φ(j) − φ(i)
Note that δ(x, i, j) does not depend on x.
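The additive case can be verified in a few lines, with hypothetical profits φ:

```python
# Additive objective: the exchange x -> x \ {i} ∪ {j} changes z by
# φ(j) − φ(i), independently of the current solution x.

phi = {1: 10, 2: 7, 3: 4, 4: 6}   # hypothetical profits

def z(x):
    return sum(phi[e] for e in x)

def delta(i, j):
    """O(1) evaluation of the exchange: i leaves, j enters."""
    return phi[j] - phi[i]

x = {1, 3}
i, j = 3, 2
# Recomputation from scratch and the O(1) formula agree: 7 - 4 = 3.
assert z(x - {i} | {j}) - z(x) == delta(i, j)
print(delta(i, j))  # 3
```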

29 Example: the symmetric TSP. The neighborhood N_R2 for the TSP:
- deletes two non-consecutive edges (s_i, s_{i+1}) and (s_j, s_{j+1});
- inserts the two edges (s_i, s_j) and (s_{i+1}, s_{j+1});
- reverses the direction of the path (s_{i+1}, …, s_j) (modifying O(n) edges).
If the cost function is symmetric, the variation of z(x) is
δ(x, i, j) = c_{s_i, s_j} + c_{s_{i+1}, s_{j+1}} − c_{s_i, s_{i+1}} − c_{s_j, s_{j+1}}
In many other cases, however, the function is not additive.
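The O(1) formula can be checked against full recomputation. The cost matrix below is a hypothetical symmetric instance:

```python
# 2-opt (neighborhood N_R2) for the symmetric TSP: the cost variation uses
# only the four edges touched by the move.

def tour_cost(tour, c):
    n = len(tour)
    return sum(c[tour[k]][tour[(k + 1) % n]] for k in range(n))

def delta_2opt(tour, c, i, j):
    """Delete (s_i, s_i+1), (s_j, s_j+1); insert (s_i, s_j), (s_i+1, s_j+1)."""
    n = len(tour)
    a, b = tour[i], tour[(i + 1) % n]
    d, e = tour[j], tour[(j + 1) % n]
    return c[a][d] + c[b][e] - c[a][b] - c[d][e]

def apply_2opt(tour, i, j):
    """Reverse the path between positions i+1 and j."""
    return tour[:i + 1] + tour[i + 1:j + 1][::-1] + tour[j + 1:]

c = [[0, 3, 4, 2, 7],      # hypothetical symmetric cost matrix
     [3, 0, 4, 6, 3],
     [4, 4, 0, 5, 8],
     [2, 6, 5, 0, 6],
     [7, 3, 8, 6, 0]]
t = [0, 1, 2, 3, 4]
i, j = 0, 2   # two non-consecutive edges
assert tour_cost(apply_2opt(t, i, j), c) - tour_cost(t, c) == delta_2opt(t, c, i, j)
print(delta_2opt(t, c, i, j))  # 2
```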

30 Quadratic functions. In the MDP the objective function is quadratic: if one uses the neighborhood N_S1, moving from x to x′ = x \ {i} ∪ {j} the objective varies by
δ(x, i, j) = z(x \ {i} ∪ {j}) − z(x) = (1/2) Σ_{h,k ∈ x\{i}∪{j}} d_hk − (1/2) Σ_{h,k ∈ x} d_hk
There are O(n) different terms in the two sums. There is a general expedient that works with symmetric quadratic objective functions:
δ(x, i, j) = Σ_{k ∈ x} d_jk − Σ_{k ∈ x} d_ik − d_ij = D_j(x) − D_i(x) − d_ij
If one knows D_l(x) = Σ_{k ∈ x} d_lk for each l ∈ E, the computation requires O(1) time.
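The O(1) formula and the O(1) update of the D_l values (illustrated in the MDP example slides that follow) can be checked against recomputation from scratch. The distance matrix is hypothetical:

```python
from itertools import combinations

# MDP exchange evaluated in O(1) with the aggregate values
# D_l(x) = Σ_{k in x} d_lk, then updated after the move.

d = {(h, k): abs(h - k) for h in range(6) for k in range(6)}  # hypothetical, symmetric

def z(x):
    return sum(d[h, k] for h, k in combinations(sorted(x), 2))

x = {0, 2, 5}
D = {l: sum(d[l, k] for k in x) for l in range(6)}  # aggregate information

i, j = 2, 4                      # i leaves x, j enters
delta = D[j] - D[i] - d[i, j]    # O(1) evaluation
assert z(x - {i} | {j}) - z(x) == delta

# After the move, each D_l is updated in O(1): D_l := D_l - d_li + d_lj.
for l in D:
    D[l] += d[l, j] - d[l, i]
x = x - {i} | {j}
assert all(D[l] == sum(d[l, k] for k in x) for l in range(6))
print(sorted(x), delta)
```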

31 Example: the MDP. We want to evaluate the exchange x → x′ = x \ {i} ∪ {j} with i ∈ x and j ∈ E \ x:
z′ = z − D_i + D_j − d_ij
- we lose the pairs including i (term −D_i);
- we gain the pairs including j (term +D_j);
- but the pair (i, j) is counted in excess (term −d_ij).


36 Example: the MDP. Update of the data structures after the move:
D_l := D_l − d_li + d_lj for each l ∈ E
Each element l sees d_li disappearing and d_lj appearing.


39 Use of auxiliary information. Also other non-linear functions can be updated by
- keeping aggregate information on the current solution;
- using this information to compute z efficiently;
- updating such information when moving to the next solution.
For the PMSP with the transfer neighborhood N_T1 and the exchange neighborhood N_S1, one can evaluate the objective in constant time by keeping and updating:
- the completion time of each machine;
- the indices of the machines with the two largest completion times.

40 Example: the PMSP. Consider the exchange o = (i, j) of jobs i and j (i on machine M_i, j on machine M_j):
- the new completion times can be computed in constant time: one of them increases and the other one decreases (or they remain unchanged);
- one can check in constant time whether one of them exceeds the maximum completion time;
- if the maximum completion time decreases, one can check in constant time whether the other modified machine or the second-largest one becomes the maximum.
Once the whole neighborhood has been visited and the move has been selected, it is necessary to update the completion times (in constant time: only two of them change).
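The exchange evaluation above can be sketched with hypothetical processing times and assignment. For brevity the "other machines" maximum is recomputed here with a scan; in a real implementation one would keep the two largest completion times instead, as the previous slide suggests:

```python
# PMSP: evaluate a job exchange between two different machines using the
# stored completion times C, without touching the other jobs.

p = {'a': 4, 'b': 6, 'c': 3, 'd': 5, 'e': 2}       # hypothetical processing times
assign = {'a': 0, 'b': 0, 'c': 1, 'd': 1, 'e': 2}  # hypothetical job -> machine
C = [0] * 3
for job, m in assign.items():
    C[m] += p[job]
# C = [10, 8, 2]; current makespan = 10

def makespan_after_exchange(i, j):
    """Makespan after exchanging jobs i and j (on different machines)."""
    mi, mj = assign[i], assign[j]
    Ci = C[mi] - p[i] + p[j]     # one completion time changes by p[j] - p[i]
    Cj = C[mj] - p[j] + p[i]     # the other by p[i] - p[j]
    others = max(C[m] for m in range(len(C)) if m not in (mi, mj))
    return max(Ci, Cj, others)

print(makespan_after_exchange('b', 'c'))  # 11: C1 becomes 8 - 3 + 6 = 11
print(makespan_after_exchange('b', 'e'))  # 8: machine 1 becomes the maximum
```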

41 Use of auxiliary information. The auxiliary information can be about
- the current solution x;
- the previous solution in the neighborhood, according to a suitable ordering.
Consider the neighborhood N_R2 for the symmetric TSP:
- the neighbor solutions differ from x by O(n) edges;
- the solutions in the neighborhood differ from one another by O(n) edges;
- if the edge pairs (s_i, s_{i+1}) and (s_j, s_{j+1}) follow the lexicographic order, the reversed path changes by only one edge from one neighbor to the next.
[Figure: the reversed path for the exchanges (i, j) and (i, j + 1).]

42 Example: the asymmetric TSP.
[Figure: the reversed path for the exchanges (i, j) and (i, j + 1).]
In general, the variation of z(x) is
δ(x, i, j) = c_{s_i, s_j} + c_{s_{i+1}, s_{j+1}} − c_{s_i, s_{i+1}} − c_{s_j, s_{j+1}} + c_{s_j … s_{i+1}} − c_{s_{i+1} … s_j}
where c_{s_j … s_{i+1}} and c_{s_{i+1} … s_j} are the costs of the reversed and of the original path. When we have considered exchange (i, j) and we consider exchange (i, j′) with j′ = j + 1:
- the first four terms change, but they are data;
- the last two terms can be updated in constant time:
c_{s_{j′} … s_{i+1}} = c_{s_j … s_{i+1}} + c_{s_{j+1}, s_j}
c_{s_{i+1} … s_{j′}} = c_{s_{i+1} … s_j} + c_{s_j, s_{j+1}}

43 Feasibility. Some operations done to explore the neighborhood may yield infeasible solutions:
Ñ_O(x) = {x′ ∈ 2^E : ∃ o ∈ O : o(x) = x′}    N_O(x) = Ñ_O(x) ∩ X
In this case, for each element of Ñ_O(x) one needs
- to check feasibility;
- if feasible, to evaluate the cost.
To check feasibility one can use the same techniques used for the objective.

44 Example: the CMSTP. Consider the neighborhood N_S1 that inserts an edge and deletes another:
- if the two edges are in the same branch, the solution remains feasible;
- if they belong to different branches, one branch loses weight and the other one gains it: the variation is equal to the weight of the transferred sub-tree.
If we keep the weight of the sub-tree rooted at each vertex, it is enough to compare such weight with the residual capacity of the branch that receives it. This piece of information must be updated once the move has been done: it takes O(n) time.

45 Refined heuristic. The use of additional information implies
1. the initialization of suitable
- local data structures, related to the exploration of each neighborhood;
- global data structures, related to the whole search process;
2. their update from one solution to another or from one iteration to another.
Algorithm SteepestDescent(I, x(0))
x := x(0); GD := InitializeGD(); Stop := false;
While Stop = false do
  x′ := x; δ* := 0; LD := InitializeLD();
  For each x̄ ∈ N(x) do
    If δ(x̄) < δ(x′) then x′ := x̄;
    LD := UpdateLD(LD);
  EndFor;
  If z(x′) ≥ z(x) then Stop := true;
  else x := x′; GD := UpdateGD(GD);
  EndIf
EndWhile;
Return (x, z(x));

46 Partial conservation of the neighborhood. When an operation o ∈ O is executed on a solution x, it often happens that the variation δ(x, o) of the objective function does not depend on x, or depends only on a part of x. Many operations o ∈ O executed on x′ = o*(x) produce δ(x′, o) = δ(x, o). In this case, it is convenient
1. to store all values of δ(x, o) as they are computed;
2. to perform the best move, generating x′;
3. to delete the values δ(x′, o) ≠ δ(x, o);
4. to recompute only the deleted values;
5. to go back to step 2.

47 Example: the CMST. Consider the neighborhood N_S1 for the CMST:
- insert an edge j ∈ E \ x;
- delete an edge i ∈ x.
The exchanges involving only the branches not affected by the move produce the same effect: δ(x′, i, j) = δ(x, i, j). Therefore it is possible
- to keep the set of the feasible exchanges;
- to delete from the list the exchanges involving one or both of the branches associated with the move;
- to recompute only the effect of those exchanges.

48 The efficiency-efficacy trade-off. The complexity depends on three factors:
1. the number of local search iterations;
2. the size of the neighborhood to be explored;
3. the complexity of evaluating each solution.
The former two are conflicting:
- a large neighborhood allows for few steps (or better solutions);
- a small neighborhood implies many steps.
The optimal trade-off is somewhere in between: we need a neighborhood
- large enough to allow reaching good solutions;
- small enough to allow for a quick selection of the move.
In general it is difficult to understand a priori what the best trade-off is.

49 Fine-tuning the neighborhoods. It is also possible to fine-tune the size of a given neighborhood N: one explores only a promising subset N′ ⊆ N. For instance, one can
- insert only elements j ∈ E \ x with cost φ(j) low enough;
- delete only elements i ∈ x with cost φ(i) high enough;
- terminate the search as soon as the best known solution is promising.
For instance, one applies the first-improvement strategy: the exploration of the neighborhood is stopped as soon as a solution is found which is better than the current one:
If z(x̄) < z(x) then Stop := true;
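The first-improvement strategy can be sketched on the same hypothetical subset example used earlier for steepest descent (the objective and neighborhood are illustrative, not from the slides):

```python
# First-improvement local search: accept the first strictly improving
# neighbor instead of scanning the whole neighborhood.

E = set(range(5))

def f(x):
    return (len(x) - 3) ** 2   # hypothetical objective: minimized at |x| = 3

def nbhd(x):
    # Sorted order makes the scan (and hence the result) deterministic.
    return [x | {j} for j in sorted(E - x)] + [x - {i} for i in sorted(x)]

def first_improve_step(x):
    for y in nbhd(x):
        if f(y) < f(x):
            return y           # stop scanning at the first improvement
    return None                # x is locally optimal

x = frozenset()
while (y := first_improve_step(x)) is not None:
    x = y
print(sorted(x), f(x))
```

Note that the result may differ from steepest descent: the move chosen depends on the visiting order, which is exactly why this strategy pays off on smooth landscapes and less so on rugged ones.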

50 Fine-tuning the neighborhoods. The effectiveness depends on the objective: if the cost of some elements heavily affects the objective, it may be worth fixing or forbidding them. It also depends on the neighborhood:
- if the landscape is smooth, the first improving neighbor solution is not likely to be much worse than the best improving one;
- if the landscape is rugged, the best solution in the neighborhood can be much better than the first improving solution.


Heuristis for Combinatorial Optimization Heuristis for Combinatorial Optimization Luigi De Giovanni Dipartimento di Matematica, Università di Padova Luigi De Giovanni Heuristic for Combinatorial Optimization 1 / 59 Exact and heuristic methods

More information

Module 4. Constraint satisfaction problems. Version 2 CSE IIT, Kharagpur

Module 4. Constraint satisfaction problems. Version 2 CSE IIT, Kharagpur Module 4 Constraint satisfaction problems Lesson 10 Constraint satisfaction problems - II 4.5 Variable and Value Ordering A search algorithm for constraint satisfaction requires the order in which variables

More information

Cluster Analysis. Angela Montanari and Laura Anderlucci

Cluster Analysis. Angela Montanari and Laura Anderlucci Cluster Analysis Angela Montanari and Laura Anderlucci 1 Introduction Clustering a set of n objects into k groups is usually moved by the aim of identifying internally homogenous groups according to a

More information

3 INTEGER LINEAR PROGRAMMING

3 INTEGER LINEAR PROGRAMMING 3 INTEGER LINEAR PROGRAMMING PROBLEM DEFINITION Integer linear programming problem (ILP) of the decision variables x 1,..,x n : (ILP) subject to minimize c x j j n j= 1 a ij x j x j 0 x j integer n j=

More information

Parallel Machine and Flow Shop Models

Parallel Machine and Flow Shop Models Outline DM87 SCHEDULING, TIMETABLING AND ROUTING 1. Resume and Extensions on Single Machine Models Lecture 10 Parallel Machine and Flow Shop Models 2. Parallel Machine Models Marco Chiarandini 3. Flow

More information

Coping with the Limitations of Algorithm Power Exact Solution Strategies Backtracking Backtracking : A Scenario

Coping with the Limitations of Algorithm Power Exact Solution Strategies Backtracking Backtracking : A Scenario Coping with the Limitations of Algorithm Power Tackling Difficult Combinatorial Problems There are two principal approaches to tackling difficult combinatorial problems (NP-hard problems): Use a strategy

More information

5.4 Pure Minimal Cost Flow

5.4 Pure Minimal Cost Flow Pure Minimal Cost Flow Problem. Pure Minimal Cost Flow Networks are especially convenient for modeling because of their simple nonmathematical structure that can be easily portrayed with a graph. This

More information

Outline. No Free Lunch Theorems SMTWTP. Outline DM812 METAHEURISTICS

Outline. No Free Lunch Theorems SMTWTP. Outline DM812 METAHEURISTICS DM812 METAHEURISTICS Outline Lecture 9 Marco Chiarandini 1. Department of Mathematics and Computer Science University of Southern Denmark, Odense, Denmark 2. Outline 1. 2. Linear permutations

More information

Chapter S:II. II. Search Space Representation

Chapter S:II. II. Search Space Representation Chapter S:II II. Search Space Representation Systematic Search Encoding of Problems State-Space Representation Problem-Reduction Representation Choosing a Representation S:II-1 Search Space Representation

More information

Effective probabilistic stopping rules for randomized metaheuristics: GRASP implementations

Effective probabilistic stopping rules for randomized metaheuristics: GRASP implementations Effective probabilistic stopping rules for randomized metaheuristics: GRASP implementations Celso C. Ribeiro Isabel Rosseti Reinaldo C. Souza Universidade Federal Fluminense, Brazil July 2012 1/45 Contents

More information

LECTURES 3 and 4: Flows and Matchings

LECTURES 3 and 4: Flows and Matchings LECTURES 3 and 4: Flows and Matchings 1 Max Flow MAX FLOW (SP). Instance: Directed graph N = (V,A), two nodes s,t V, and capacities on the arcs c : A R +. A flow is a set of numbers on the arcs such that

More information

Metaheuristics : from Design to Implementation

Metaheuristics : from Design to Implementation Metaheuristics : from Design to Implementation Chap 2 Single-solution based Metaheuristics Wiley, 2009 (596pp) ISBN: 978-0-470-27858-1 Single solution-based metaheuristics Improvement of a single solution

More information

Fast algorithms for max independent set

Fast algorithms for max independent set Fast algorithms for max independent set N. Bourgeois 1 B. Escoffier 1 V. Th. Paschos 1 J.M.M. van Rooij 2 1 LAMSADE, CNRS and Université Paris-Dauphine, France {bourgeois,escoffier,paschos}@lamsade.dauphine.fr

More information

Branch-and-bound: an example

Branch-and-bound: an example Branch-and-bound: an example Giovanni Righini Università degli Studi di Milano Operations Research Complements The Linear Ordering Problem The Linear Ordering Problem (LOP) is an N P-hard combinatorial

More information

MIT 801. Machine Learning I. [Presented by Anna Bosman] 16 February 2018

MIT 801. Machine Learning I. [Presented by Anna Bosman] 16 February 2018 MIT 801 [Presented by Anna Bosman] 16 February 2018 Machine Learning What is machine learning? Artificial Intelligence? Yes as we know it. What is intelligence? The ability to acquire and apply knowledge

More information

Geometric Steiner Trees

Geometric Steiner Trees Geometric Steiner Trees From the book: Optimal Interconnection Trees in the Plane By Marcus Brazil and Martin Zachariasen Part 2: Global properties of Euclidean Steiner Trees and GeoSteiner Marcus Brazil

More information

The PRAM model. A. V. Gerbessiotis CIS 485/Spring 1999 Handout 2 Week 2

The PRAM model. A. V. Gerbessiotis CIS 485/Spring 1999 Handout 2 Week 2 The PRAM model A. V. Gerbessiotis CIS 485/Spring 1999 Handout 2 Week 2 Introduction The Parallel Random Access Machine (PRAM) is one of the simplest ways to model a parallel computer. A PRAM consists of

More information

Material handling and Transportation in Logistics. Paolo Detti Dipartimento di Ingegneria dell Informazione e Scienze Matematiche Università di Siena

Material handling and Transportation in Logistics. Paolo Detti Dipartimento di Ingegneria dell Informazione e Scienze Matematiche Università di Siena Material handling and Transportation in Logistics Paolo Detti Dipartimento di Ingegneria dell Informazione e Scienze Matematiche Università di Siena Introduction to Graph Theory Graph Theory As Mathematical

More information

Graph. Vertex. edge. Directed Graph. Undirected Graph

Graph. Vertex. edge. Directed Graph. Undirected Graph Module : Graphs Dr. Natarajan Meghanathan Professor of Computer Science Jackson State University Jackson, MS E-mail: natarajan.meghanathan@jsums.edu Graph Graph is a data structure that is a collection

More information

Network Design and Optimization course

Network Design and Optimization course Effective maximum flow algorithms Modeling with flows Network Design and Optimization course Lecture 5 Alberto Ceselli alberto.ceselli@unimi.it Dipartimento di Tecnologie dell Informazione Università degli

More information

A Topography-Preserving Latent Variable Model with Learning Metrics

A Topography-Preserving Latent Variable Model with Learning Metrics A Topography-Preserving Latent Variable Model with Learning Metrics Samuel Kaski and Janne Sinkkonen Helsinki University of Technology Neural Networks Research Centre P.O. Box 5400, FIN-02015 HUT, Finland

More information

Evolutionary tree reconstruction (Chapter 10)

Evolutionary tree reconstruction (Chapter 10) Evolutionary tree reconstruction (Chapter 10) Early Evolutionary Studies Anatomical features were the dominant criteria used to derive evolutionary relationships between species since Darwin till early

More information

Copyright 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin Introduction to the Design & Analysis of Algorithms, 2 nd ed., Ch.

Copyright 2007 Pearson Addison-Wesley. All rights reserved. A. Levitin Introduction to the Design & Analysis of Algorithms, 2 nd ed., Ch. Iterative Improvement Algorithm design technique for solving optimization problems Start with a feasible solution Repeat the following step until no improvement can be found: change the current feasible

More information

EXERCISES SHORTEST PATHS: APPLICATIONS, OPTIMIZATION, VARIATIONS, AND SOLVING THE CONSTRAINED SHORTEST PATH PROBLEM. 1 Applications and Modelling

EXERCISES SHORTEST PATHS: APPLICATIONS, OPTIMIZATION, VARIATIONS, AND SOLVING THE CONSTRAINED SHORTEST PATH PROBLEM. 1 Applications and Modelling SHORTEST PATHS: APPLICATIONS, OPTIMIZATION, VARIATIONS, AND SOLVING THE CONSTRAINED SHORTEST PATH PROBLEM EXERCISES Prepared by Natashia Boland 1 and Irina Dumitrescu 2 1 Applications and Modelling 1.1

More information

Decreasing the Diameter of Bounded Degree Graphs

Decreasing the Diameter of Bounded Degree Graphs Decreasing the Diameter of Bounded Degree Graphs Noga Alon András Gyárfás Miklós Ruszinkó February, 00 To the memory of Paul Erdős Abstract Let f d (G) denote the minimum number of edges that have to be

More information

Mathematical and Algorithmic Foundations Linear Programming and Matchings

Mathematical and Algorithmic Foundations Linear Programming and Matchings Adavnced Algorithms Lectures Mathematical and Algorithmic Foundations Linear Programming and Matchings Paul G. Spirakis Department of Computer Science University of Patras and Liverpool Paul G. Spirakis

More information

GRASP. Greedy Randomized Adaptive. Search Procedure

GRASP. Greedy Randomized Adaptive. Search Procedure GRASP Greedy Randomized Adaptive Search Procedure Type of problems Combinatorial optimization problem: Finite ensemble E = {1,2,... n } Subset of feasible solutions F 2 Objective function f : 2 Minimisation

More information

Exact Algorithms for NP-hard problems

Exact Algorithms for NP-hard problems 24 mai 2012 1 Why do we need exponential algorithms? 2 3 Why the P-border? 1 Practical reasons (Jack Edmonds, 1965) For practical purposes the difference between algebraic and exponential order is more

More information

Trees. 3. (Minimally Connected) G is connected and deleting any of its edges gives rise to a disconnected graph.

Trees. 3. (Minimally Connected) G is connected and deleting any of its edges gives rise to a disconnected graph. Trees 1 Introduction Trees are very special kind of (undirected) graphs. Formally speaking, a tree is a connected graph that is acyclic. 1 This definition has some drawbacks: given a graph it is not trivial

More information

Computational problems. Lecture 2: Combinatorial search and optimisation problems. Computational problems. Examples. Example

Computational problems. Lecture 2: Combinatorial search and optimisation problems. Computational problems. Examples. Example Lecture 2: Combinatorial search and optimisation problems Different types of computational problems Examples of computational problems Relationships between problems Computational properties of different

More information

Introduction to Approximation Algorithms

Introduction to Approximation Algorithms Introduction to Approximation Algorithms Dr. Gautam K. Das Departmet of Mathematics Indian Institute of Technology Guwahati, India gkd@iitg.ernet.in February 19, 2016 Outline of the lecture Background

More information

Rollout Algorithms for Discrete Optimization: A Survey

Rollout Algorithms for Discrete Optimization: A Survey Rollout Algorithms for Discrete Optimization: A Survey by Dimitri P. Bertsekas Massachusetts Institute of Technology Cambridge, MA 02139 dimitrib@mit.edu August 2010 Abstract This chapter discusses rollout

More information

1. Lecture notes on bipartite matching February 4th,

1. Lecture notes on bipartite matching February 4th, 1. Lecture notes on bipartite matching February 4th, 2015 6 1.1.1 Hall s Theorem Hall s theorem gives a necessary and sufficient condition for a bipartite graph to have a matching which saturates (or matches)

More information

Chapter Design Techniques for Approximation Algorithms

Chapter Design Techniques for Approximation Algorithms Chapter 2 Design Techniques for Approximation Algorithms I N THE preceding chapter we observed that many relevant optimization problems are NP-hard, and that it is unlikely that we will ever be able to

More information

Mathematics of Networks II

Mathematics of Networks II Mathematics of Networks II 26.10.2016 1 / 30 Definition of a network Our definition (Newman): A network (graph) is a collection of vertices (nodes) joined by edges (links). More precise definition (Bollobàs):

More information

Data Mining Chapter 8: Search and Optimization Methods Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University

Data Mining Chapter 8: Search and Optimization Methods Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University Data Mining Chapter 8: Search and Optimization Methods Fall 2011 Ming Li Department of Computer Science and Technology Nanjing University Search & Optimization Search and Optimization method deals with

More information

6. Lecture notes on matroid intersection

6. Lecture notes on matroid intersection Massachusetts Institute of Technology 18.453: Combinatorial Optimization Michel X. Goemans May 2, 2017 6. Lecture notes on matroid intersection One nice feature about matroids is that a simple greedy algorithm

More information

COLUMN GENERATION IN LINEAR PROGRAMMING

COLUMN GENERATION IN LINEAR PROGRAMMING COLUMN GENERATION IN LINEAR PROGRAMMING EXAMPLE: THE CUTTING STOCK PROBLEM A certain material (e.g. lumber) is stocked in lengths of 9, 4, and 6 feet, with respective costs of $5, $9, and $. An order for

More information

Exercise set 2 Solutions

Exercise set 2 Solutions Exercise set 2 Solutions Let H and H be the two components of T e and let F E(T ) consist of the edges of T with one endpoint in V (H), the other in V (H ) Since T is connected, F Furthermore, since T

More information

Combinatorial Optimization Lab No. 10 Traveling Salesman Problem

Combinatorial Optimization Lab No. 10 Traveling Salesman Problem Combinatorial Optimization Lab No. 10 Traveling Salesman Problem Industrial Informatics Research Center http://industrialinformatics.cz/ May 29, 2018 Abstract In this lab we review various ways how to

More information

Conflict Graphs for Combinatorial Optimization Problems

Conflict Graphs for Combinatorial Optimization Problems Conflict Graphs for Combinatorial Optimization Problems Ulrich Pferschy joint work with Andreas Darmann and Joachim Schauer University of Graz, Austria Introduction Combinatorial Optimization Problem CO

More information

Branch-price-and-cut for vehicle routing. Guy Desaulniers

Branch-price-and-cut for vehicle routing. Guy Desaulniers Guy Desaulniers Professor, Polytechnique Montréal, Canada Director, GERAD, Canada VeRoLog PhD School 2018 Cagliari, Italy, June 2, 2018 Outline 1 VRPTW definition 2 Mathematical formulations Arc-flow formulation

More information

Hybridization EVOLUTIONARY COMPUTING. Reasons for Hybridization - 1. Naming. Reasons for Hybridization - 3. Reasons for Hybridization - 2

Hybridization EVOLUTIONARY COMPUTING. Reasons for Hybridization - 1. Naming. Reasons for Hybridization - 3. Reasons for Hybridization - 2 Hybridization EVOLUTIONARY COMPUTING Hybrid Evolutionary Algorithms hybridization of an EA with local search techniques (commonly called memetic algorithms) EA+LS=MA constructive heuristics exact methods

More information

Computer Science 385 Design and Analysis of Algorithms Siena College Spring Topic Notes: Brute-Force Algorithms

Computer Science 385 Design and Analysis of Algorithms Siena College Spring Topic Notes: Brute-Force Algorithms Computer Science 385 Design and Analysis of Algorithms Siena College Spring 2019 Topic Notes: Brute-Force Algorithms Our first category of algorithms are called brute-force algorithms. Levitin defines

More information

2.3 Optimal paths. Optimal (shortest or longest) paths have a wide range of applications:

2.3 Optimal paths. Optimal (shortest or longest) paths have a wide range of applications: . Optimal paths Optimal (shortest or longest) paths have a wide range of applications: Google maps, GPS navigators planning and management of transportation, electrical and telecommunication networks project

More information

AC64/AT64 DESIGN & ANALYSIS OF ALGORITHMS DEC 2014

AC64/AT64 DESIGN & ANALYSIS OF ALGORITHMS DEC 2014 AC64/AT64 DESIGN & ANALYSIS OF ALGORITHMS DEC 214 Q.2 a. Design an algorithm for computing gcd (m,n) using Euclid s algorithm. Apply the algorithm to find gcd (31415, 14142). ALGORITHM Euclid(m, n) //Computes

More information

TABU search and Iterated Local Search classical OR methods

TABU search and Iterated Local Search classical OR methods TABU search and Iterated Local Search classical OR methods tks@imm.dtu.dk Informatics and Mathematical Modeling Technical University of Denmark 1 Outline TSP optimization problem Tabu Search (TS) (most

More information

A Fast Taboo Search Algorithm for the Job Shop Scheduling Problem

A Fast Taboo Search Algorithm for the Job Shop Scheduling Problem A Fast Taboo Search Algorithm for the Job Shop Scheduling Problem Uffe Gram Christensen (uffe@diku.dk) Anders Bjerg Pedersen (andersbp@diku.dk) Kim Vejlin (vejlin@diku.dk) October 21, 2008 Abstract: In

More information

Module 6 NP-Complete Problems and Heuristics

Module 6 NP-Complete Problems and Heuristics Module 6 NP-Complete Problems and Heuristics Dr. Natarajan Meghanathan Professor of Computer Science Jackson State University Jackson, MS 97 E-mail: natarajan.meghanathan@jsums.edu Optimization vs. Decision

More information

Outline. TABU search and Iterated Local Search classical OR methods. Traveling Salesman Problem (TSP) 2-opt

Outline. TABU search and Iterated Local Search classical OR methods. Traveling Salesman Problem (TSP) 2-opt TABU search and Iterated Local Search classical OR methods Outline TSP optimization problem Tabu Search (TS) (most important) Iterated Local Search (ILS) tks@imm.dtu.dk Informatics and Mathematical Modeling

More information

Lecture notes on the simplex method September We will present an algorithm to solve linear programs of the form. maximize.

Lecture notes on the simplex method September We will present an algorithm to solve linear programs of the form. maximize. Cornell University, Fall 2017 CS 6820: Algorithms Lecture notes on the simplex method September 2017 1 The Simplex Method We will present an algorithm to solve linear programs of the form maximize subject

More information

New algorithm for analyzing performance of neighborhood strategies in solving job shop scheduling problems

New algorithm for analyzing performance of neighborhood strategies in solving job shop scheduling problems Journal of Scientific & Industrial Research ESWARAMURTHY: NEW ALGORITHM FOR ANALYZING PERFORMANCE OF NEIGHBORHOOD STRATEGIES 579 Vol. 67, August 2008, pp. 579-588 New algorithm for analyzing performance

More information

Interactive segmentation, Combinatorial optimization. Filip Malmberg

Interactive segmentation, Combinatorial optimization. Filip Malmberg Interactive segmentation, Combinatorial optimization Filip Malmberg But first... Implementing graph-based algorithms Even if we have formulated an algorithm on a general graphs, we do not neccesarily have

More information

1. Lecture notes on bipartite matching

1. Lecture notes on bipartite matching Massachusetts Institute of Technology 18.453: Combinatorial Optimization Michel X. Goemans February 5, 2017 1. Lecture notes on bipartite matching Matching problems are among the fundamental problems in

More information

EE 701 ROBOT VISION. Segmentation

EE 701 ROBOT VISION. Segmentation EE 701 ROBOT VISION Regions and Image Segmentation Histogram-based Segmentation Automatic Thresholding K-means Clustering Spatial Coherence Merging and Splitting Graph Theoretic Segmentation Region Growing

More information

Search Algorithms. IE 496 Lecture 17

Search Algorithms. IE 496 Lecture 17 Search Algorithms IE 496 Lecture 17 Reading for This Lecture Primary Horowitz and Sahni, Chapter 8 Basic Search Algorithms Search Algorithms Search algorithms are fundamental techniques applied to solve

More information

Canonical Forms and Algorithms for Steiner Trees in Uniform Orientation Metrics

Canonical Forms and Algorithms for Steiner Trees in Uniform Orientation Metrics Canonical Forms and Algorithms for Steiner Trees in Uniform Orientation Metrics M. Brazil D.A. Thomas J.F. Weng M. Zachariasen December 13, 2002 Abstract We present some fundamental structural properties

More information

An Improved Hybrid Genetic Algorithm for the Generalized Assignment Problem

An Improved Hybrid Genetic Algorithm for the Generalized Assignment Problem An Improved Hybrid Genetic Algorithm for the Generalized Assignment Problem Harald Feltl and Günther R. Raidl Institute of Computer Graphics and Algorithms Vienna University of Technology, Vienna, Austria

More information

x ji = s i, i N, (1.1)

x ji = s i, i N, (1.1) Dual Ascent Methods. DUAL ASCENT In this chapter we focus on the minimum cost flow problem minimize subject to (i,j) A {j (i,j) A} a ij x ij x ij {j (j,i) A} (MCF) x ji = s i, i N, (.) b ij x ij c ij,

More information

Lecture 25: Bezier Subdivision. And he took unto him all these, and divided them in the midst, and laid each piece one against another: Genesis 15:10

Lecture 25: Bezier Subdivision. And he took unto him all these, and divided them in the midst, and laid each piece one against another: Genesis 15:10 Lecture 25: Bezier Subdivision And he took unto him all these, and divided them in the midst, and laid each piece one against another: Genesis 15:10 1. Divide and Conquer If we are going to build useful

More information

Advanced Algorithms Class Notes for Monday, October 23, 2012 Min Ye, Mingfu Shao, and Bernard Moret

Advanced Algorithms Class Notes for Monday, October 23, 2012 Min Ye, Mingfu Shao, and Bernard Moret Advanced Algorithms Class Notes for Monday, October 23, 2012 Min Ye, Mingfu Shao, and Bernard Moret Greedy Algorithms (continued) The best known application where the greedy algorithm is optimal is surely

More information

Variable Neighborhood Search

Variable Neighborhood Search Variable Neighborhood Search Hansen and Mladenovic, Variable neighborhood search: Principles and applications, EJOR 43 (2001) 1 Basic notions of VNS Systematic change of the neighborhood in search Does

More information

Today. Golden section, discussion of error Newton s method. Newton s method, steepest descent, conjugate gradient

Today. Golden section, discussion of error Newton s method. Newton s method, steepest descent, conjugate gradient Optimization Last time Root finding: definition, motivation Algorithms: Bisection, false position, secant, Newton-Raphson Convergence & tradeoffs Example applications of Newton s method Root finding in

More information

(Stochastic) Local Search Algorithms

(Stochastic) Local Search Algorithms DM841 DISCRETE OPTIMIZATION Part 2 Heuristics (Stochastic) Marco Chiarandini Department of Mathematics & Computer Science University of Southern Denmark Outline 1. 2. 3. Components 2 Outline 1. 2. 3. Components

More information

Constraint Satisfaction Problems

Constraint Satisfaction Problems Constraint Satisfaction Problems CE417: Introduction to Artificial Intelligence Sharif University of Technology Spring 2013 Soleymani Course material: Artificial Intelligence: A Modern Approach, 3 rd Edition,

More information

NP Completeness. Andreas Klappenecker [partially based on slides by Jennifer Welch]

NP Completeness. Andreas Klappenecker [partially based on slides by Jennifer Welch] NP Completeness Andreas Klappenecker [partially based on slides by Jennifer Welch] Dealing with NP-Complete Problems Dealing with NP-Completeness Suppose the problem you need to solve is NP-complete. What

More information

Crash-Starting the Simplex Method

Crash-Starting the Simplex Method Crash-Starting the Simplex Method Ivet Galabova Julian Hall School of Mathematics, University of Edinburgh Optimization Methods and Software December 2017 Ivet Galabova, Julian Hall Crash-Starting Simplex

More information

Localized and Incremental Monitoring of Reverse Nearest Neighbor Queries in Wireless Sensor Networks 1

Localized and Incremental Monitoring of Reverse Nearest Neighbor Queries in Wireless Sensor Networks 1 Localized and Incremental Monitoring of Reverse Nearest Neighbor Queries in Wireless Sensor Networks 1 HAI THANH MAI AND MYOUNG HO KIM Department of Computer Science Korea Advanced Institute of Science

More information

Image representation. 1. Introduction

Image representation. 1. Introduction Image representation Introduction Representation schemes Chain codes Polygonal approximations The skeleton of a region Boundary descriptors Some simple descriptors Shape numbers Fourier descriptors Moments

More information

A Tabu Search Heuristic for the Generalized Traveling Salesman Problem

A Tabu Search Heuristic for the Generalized Traveling Salesman Problem A Tabu Search Heuristic for the Generalized Traveling Salesman Problem Jacques Renaud 1,2 Frédéric Semet 3,4 1. Université Laval 2. Centre de Recherche sur les Technologies de l Organisation Réseau 3.

More information

Programming, numerics and optimization

Programming, numerics and optimization Programming, numerics and optimization Lecture C-4: Constrained optimization Łukasz Jankowski ljank@ippt.pan.pl Institute of Fundamental Technological Research Room 4.32, Phone +22.8261281 ext. 428 June

More information

Efficient Edge-Swapping Heuristics for the Reload Cost Spanning Tree Problem

Efficient Edge-Swapping Heuristics for the Reload Cost Spanning Tree Problem Efficient Edge-Swapping Heuristics for the Reload Cost Spanning Tree Problem S. Raghavan and Mustafa Sahin Smith School of Business & Institute for Systems Research, University of Maryland, College Park,

More information