LEXICOGRAPHIC LOCAL SEARCH AND THE P-CENTER PROBLEM


Refael Hassin, Asaf Levin and Dana Morad

Abstract: We introduce a local search strategy that suits combinatorial optimization problems with a min-max (or max-min) objective. According to this approach, solutions are compared lexicographically rather than by their worst coordinate. We apply this approach to the p-center problem.

Keywords: Facility planning, Heuristics.

1 Introduction

The subject of this paper is the application of local search to the class of bottleneck problems. In these problems, each feasible solution, say X, is associated with a vector, say c^X ∈ IR^n, and the goal is to minimize max_i c^X_i (or maximize min_i c^X_i). For example, Hochbaum and Shmoys [11] considered a class of problems in which a weighted graph is given and we wish to find a subgraph satisfying some requirements such that the length of the longest edge included in the subgraph is minimized. Several problems from this class, emerging from routing, location, and communication network design, are described there in detail. We observe that bottleneck problems are quite insensitive to local changes in the solution. This is in contrast to the case where Σ_i c^X_i is minimized. We suggest dealing with this drawback by considering a more sensitive measure that ranks solutions which have the same objective value. Our approach is to sort the elements of the solution vectors and then compare them lexicographically. We call the resulting algorithm lexicographic local search. This approach can be incorporated in various local improvement approaches such as simulated annealing and tabu search. In recent years many heuristic approaches have been designed that include a local search procedure as a subroutine; such approaches include, for example, simulated annealing and tabu search. We suggest incorporating the proposed lexicographic local search in such heuristics when solving a bottleneck problem.
We demonstrate the power of using lexicographic local search in such circumstances by using it in simulated annealing.

Department of Statistics and Operations Research, School of Mathematical Sciences, Tel-Aviv University, Tel-Aviv 69978, Israel. {hassin,levinas}@post.tau.ac.il

Our goal in this paper is to examine the effect of modifying a local search heuristic by the use of lexicographic local search. We discuss lexicographic local search in the context of the p-center problem. We present some theoretical results which motivate our use of lexicographic local search and show the limitations of this approach. Then we concentrate on a computational study. Our results indicate that lexicographic local search is a useful tool: it improves (with respect to local search) the quality of the solutions without increasing the computational effort. In many cases we obtained a higher frequency of the best solution using lexicographic local search. This higher frequency means that if we ran the algorithms for a shorter time period, the chances of getting the better result would be higher.

An interesting bottleneck problem is that of minimizing the maximum completion time when scheduling jobs on m parallel machines. Here a solution is associated with an m-vector giving the completion time on each machine. Glass, Potts and Shade [7] applied local search to this problem. They defined as an improvement a step that either decreases the maximum completion time or keeps it unchanged but decreases the number of machines that achieve it. Lexicographic search can be considered a further step towards improving the sensitivity of the search to promising directions, by considering not only the number of occurrences of the maximum completion time but also the next highest completion times.

Fürer and Raghavachari [5] developed an approximation algorithm for minimizing the maximum node degree of a spanning tree. The local search they developed swaps pairs of edges when the maximum degree of the four nodes involved is reduced, even when this doesn't affect the maximum degree of the tree. A different variation appears in [6]. This approach can be viewed as a variation of lexicographic local search.
Finally, Khanna, Motwani, Sudan and Vazirani [12] developed a general paradigm, non-oblivious local search, in which local optimality is defined according to a function which does not just rank solutions that are identical with respect to the original function, as we do, but in fact ranks solutions in a different way from that dictated by the original function. They demonstrate that a clever choice of this function may improve the theoretical approximation ratio.

In the next sections we provide a brief description of several algorithms for the p-center problem, including the lexicographic local search. We then present a worst-case analysis of the lexicographic local search and the other algorithms for the p-center problem, and finally we present the results of

a computational study for the p-center problem and the conclusions we draw from it.

2 The p-center problem

In the p-center problem, a finite set V is given with a nonnegative distance matrix d = (d_ij), i, j ∈ V. For a subset X ⊆ V, let D_iX = min_{j∈X} d_ij be the distance of i ∈ V from the subset X. The problem is to find X ⊆ V such that |X| = p and F(X) = max_{v∈V} D_vX is minimized. We call F(X) the radius of X. In an illustrative description of the problem, V is a set of customers to be served by p service centers. The goal is to locate the centers so that the maximum travel time of a customer to its nearest center is minimized.

The p-center problem is NP-hard even when V is a set of points in the two-dimensional plane and d describes the Euclidean distances among the points [4]. Assuming that the distance matrix is symmetric and obeys the triangle inequality, the polynomial algorithms of Dyer and Frieze [3], Hochbaum and Shmoys [10] and Plesník [13] are fast and guarantee an approximation whose value is bounded by twice the optimal solution's value. These algorithms are designed to achieve the error bound, and they often produce solutions whose error is unacceptable for practical computations. There are other algorithms that usually perform well, although their running time may not be polynomially bounded and the solutions they provide cannot be bounded in the worst case by any constant. An important example is an algorithm of Drezner [2] (Cooper [1] earlier used a similar method for related problems).

3 Local search

The local search approach assumes that for each solution X a neighborhood N_X is defined. Then, in each iteration the neighborhood is searched for an improved solution. A natural definition of a neighborhood in the p-center problem is the set of solutions obtained by relocating at most k centers of the current solution, for some prespecified k. In our study we assume k = 1.

Algorithm LS:
1. Start with an arbitrary solution X. [This may be the output of some other heuristic.]
2. Let N_X be the set of solutions obtained from X by relocating a single center. Scan N_X for a better solution, X'. If none exists, stop and output X. Else, replace X by X' and repeat.
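As an illustration, Algorithm LS for the p-center problem can be sketched as follows. This is our own sketch, not the authors' Fortran implementation; the function and variable names are assumptions, and `dist` is assumed to be a full distance matrix.

```python
import random

def radius(dist, centers):
    """F(X): the maximum, over all nodes, of the distance to the nearest center."""
    return max(min(dist[i][j] for j in centers) for i in range(len(dist)))

def local_search(dist, p, seed=0):
    """Algorithm LS with k = 1: relocate a single center while the radius improves."""
    rng = random.Random(seed)
    n = len(dist)
    X = set(rng.sample(range(n), p))      # arbitrary initial solution
    improved = True
    while improved:
        improved = False
        for j in sorted(X):               # scan the single-relocation neighborhood N_X
            for v in range(n):
                if v in X:
                    continue
                Y = (X - {j}) | {v}
                if radius(dist, Y) < radius(dist, X):
                    X, improved = Y, True
                    break
            if improved:
                break
    return X
```

On the four collinear points 0, 1, 2, 4 used below in Theorem 3.1, starting from {0, 2} the search stops immediately, since no single relocation reduces the radius below 2.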

The search terminates in a local optimum, a solution which is best in its neighborhood. Clearly, the specific output depends on the choice of the initial solution, and a common practice is to repeat the search many times, starting from different initial solutions, and finally select the best outcome. We note that each iteration of the algorithm (i.e., reaching a local minimum from a given initial point) requires polynomial time. The reason is that the number of improvements is bounded by the number of distinct distance values, which is O(n^2).

In many applications d is symmetric and satisfies the triangle inequality. In the case of two centers we can use these properties to prove a performance guarantee for the algorithm:

Theorem 3.1 If d is symmetric and the triangle inequality is satisfied, then the radius of the 2-center solution produced by Algorithm LS is at most twice the radius of an optimal solution. This bound is tight.

Proof: Let {x_1, x_2} be an optimal solution with radius r. Let S_1, S_2 be a partition of V such that d_{v,x_i} ≤ r for every v ∈ S_i, i = 1, 2. For example, S_i may be the set of customers served by the center at x_i under the optimal solution. Then, it follows from the triangle inequality that any solution {y, z} such that y ∈ S_1, z ∈ S_2 satisfies d_vy ≤ d_{v,x_1} + d_{x_1,y} ≤ 2r for v ∈ S_1, and d_vz ≤ d_{v,x_2} + d_{x_2,z} ≤ 2r for v ∈ S_2. Hence, the radius of any such solution is at most 2r. It now follows that a solution whose radius is strictly greater than 2r cannot be a local minimum, since it must have its two centers located in the same subset S_i, and by relocating one center to the other subset the radius is improved to a value of at most 2r.

Consider now four points on the line with coordinates 0, 1, 2, 4. The optimal 2-center solution is {1, 4} with radius 1. But the solution {0, 2} is a local minimum and its radius is 2. This proves that the bound stated by the theorem is tight.
For p > 2, the ratio of the approximate solution to the optimal cannot be bounded by a constant. This will follow from a stronger claim presented in the next section.

4 Lexicographic local search

Consider a 2-center example: six points with coordinates {0, 10, 19, 21, 30, 40} are located on a line. Consider the solution X = {19, 21}. To improve it, both of its centers must be moved. Therefore it is a local minimum with respect to LS. Note however that by moving one of the centers of X

we can reduce the number of points whose distance from their nearest center is 19. Such a change, while not reducing the radius, is a good move towards reducing the radius in a following step. This example motivates a modification of the objective function that will make it more sensitive to promising changes in the solution.

Consider two vectors x, y ∈ IR^n. Let x', y' be the vectors obtained by permuting x, y respectively, so that x'_1 ≥ x'_2 ≥ ... ≥ x'_n and y'_1 ≥ y'_2 ≥ ... ≥ y'_n. We say that x is l-lexicographically equal to y if x'_k = y'_k for all k ∈ {1, ..., l}. We say that x is l-lexicographically smaller than y if for some k ∈ {1, ..., l}, x'_i = y'_i for i = 1, ..., k - 1 and x'_k < y'_k.

The algorithm which we propose in this paper is a local search variation that scans the neighborhood for an l-lexicographically smaller solution, where l is a predetermined parameter:

Algorithm lLEX:
1. Start with an arbitrary solution X.
2. Scan N_X for an l-lexicographically smaller solution, X' (first improvement). If none exists, stop and output X. Else, replace X by X' and repeat.

In the p-center problem, we associate with each solution a vector x ∈ IR^n, where x_i is the distance between node i and its nearest center in the solution. We apply the lexicographic local search to obtain a lexicographic local minimum with respect to this measure. We note that for any fixed l, each run of the algorithm (reaching an l-lexicographic local minimum from a given initial point) requires polynomial time. The reason is that the number of improvements is O(N^l), where N is the number of distinct values in the distance matrix d.

Clearly 1LEX is equivalent to LS, and 2LEX is sufficient to reach the optimal solution in the example that opened this section. One could generalize this example to justify higher l values: Let V = V_1 ∪ V_2 ∪ V_3, where the subsets are of equal size. Suppose that the distances between points within each of the subsets are zero, while between subsets the distances are one unit.
An optimal solution for p ≥ 3 locates at least one center in each of the three subsets, and its radius is zero. However, a solution that locates all the centers within one subset is lLEX optimal for all l ≤ n/3, since after moving one center to another subset, D_iX remains one for all points i in the third subset. We conclude that the worst-case ratio of the lLEX solution to the optimal one cannot be bounded by a constant.
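The l-lexicographic comparison above can be sketched as follows (our own illustrative code, not from the paper): sort each vector in non-increasing order, truncate to the l largest entries, and compare the truncations lexicographically.

```python
def lex_key(x, l):
    """The l largest entries of x, in non-increasing order."""
    return sorted(x, reverse=True)[:l]

def lex_smaller(x, y, l):
    """True iff x is l-lexicographically smaller than y."""
    return lex_key(x, l) < lex_key(y, l)   # Python compares lists lexicographically
```

In the six-point example {0, 10, 19, 21, 30, 40}, the solution {19, 21} has distance vector (19, 9, 0, 0, 9, 19) while {19, 40} has (19, 9, 0, 2, 10, 0); the two are 1-lexicographically equal, but the second is 2-lexicographically smaller, so 2LEX escapes the LS local minimum.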

[Figure 1: A bad example for nLEX]

Figure 1 illustrates a bad example for nLEX. The problem consists of 9 points in the plane. The optimal solution for p = 3 locates the centers at the points whose y coordinate is 1. The radius of this solution is 1. However, consider the solution consisting of the points whose x coordinate is M. Its radius is M. No relocation of a single center lexicographically improves this solution, and hence it is a lexicographic local minimum. The ratio of this approximation to the optimal radius can be made arbitrarily large by controlling the value of M.

The next example demonstrates that the ratio between a lexicographic local optimum and the global optimum cannot be bounded by a constant even when larger neighborhoods are considered. An example with similar properties, but for a generalized version of the p-center problem that allows node weights, can be found in [14]. The graph in Figure 2 has a star shape with p - 1 rays. It consists of p black nodes and 3(p - 1) white nodes (p = 7 in the figure). The distances between vertices are induced by the graph shown in the figure, which has long edges of length M and short edges of unit length. In an optimal solution, one center is located at the central black node, and there is a center at one of the three white nodes of each of the p - 1 rays. The radius is then equal to 1. However, the solution with radius M, in which all the centers are located at the black nodes, is locally optimal with respect to nLEX even under the k-change neighborhood for any k < p - 1. The reason is that unless we relocate the centers so that each ray has a white node with a center, any relocation will create a node whose closest center is at distance M + 1 from it.

[Figure 2: A bad example for k-change local search: a star with p - 1 rays, long edges of length M and short edges of unit length]

However, we are able to show that when the points are distinct points on a line (a 1-dimensional space), 2LEX provides a 2-approximation algorithm for the 3-center problem, whereas LS doesn't provide any constant error ratio, as can be seen from the following example: Let V = {0, M, M + 1, M + 2, 2M + 2}. The optimal 3-center solution is {0, M + 1, 2M + 2} with unit radius. However, {M, M + 1, M + 2} is a local optimum for LS with radius M.

Theorem 4.1 2LEX is a 2-approximation algorithm for the 3-center problem when all the points are distinct points on a line.

Proof: Every solution divides the line into 3 segments such that all the points on a segment are served by the same center. Consider the partition induced by an optimal solution. Any solution that locates one center in every segment has a radius of at most twice the optimal. Consider a 2LEX local optimum. If there is one center in each segment, it is a 2-approximation. If there is a segment with two centers and another one with one center, moving one of the two

centers to the third segment will guarantee a 2-approximation. The algorithm decided not to move to this solution, and therefore the current radius is at most twice the optimum. The remaining case is that there is a segment with 3 centers. For this solution to have a radius which is more than twice the optimum, the maximum distance must not be achieved in the segment of the 3 centers. If there is only one point whose distance from its nearest center equals the maximum distance, then moving a center to the segment which contains this point will reduce the radius. Therefore, there are exactly 2 points, a and b, whose distance from their nearest center equals the maximum distance (as all the points are distinct). By moving a center (the middle center) to another segment (the one that contains a) we reach a new solution in which one of the two maximum distances of the current solution is reduced (the distance from a). The distances which were increased are at most twice the optimum (these are distances in the segment where the three centers were located). As the current solution is a 2LEX local optimum (by assumption), this case is not possible and the radius is at most twice the optimal value.

The bound of the above theorem cannot be improved, as shown by the following example: Consider the p-center problem on the line with points {1, 2, 3, 4, 5, 6, ..., 3p - 2, 3p - 1, 3p}. The optimal solution is {2, 5, ..., 3p - 1} and it has radius 1. However, the solution {1, 4, ..., 3p - 2} is nLEX optimal but has radius 2 (for p ≥ 2). The distance vector of this solution consists of one entry of 2, 2p - 1 entries of 1, and the rest are 0. To improve lexicographically one must change all the centers of the current solution.

The result of Theorem 4.1 cannot be extended to higher-dimensional Euclidean spaces; in particular, the example in Figure 1 shows that in IR^2, nLEX doesn't guarantee any constant error ratio when p = 3.
5 Fast algorithms for the p-center problem

The algorithm of Dyer and Frieze [3] runs in O(pn) time, where n = |V|. It is a 2-approximation algorithm for the problem when d satisfies the triangle inequality.

Algorithm DF:
1. Locate the first center at an arbitrary point v ∈ V.
2. Given a set of centers X ⊆ V: If |X| = p, stop and output X. Else, locate the next center at a point v ∈ V satisfying D_vX = max_{u∈V} D_uX.
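Algorithm DF (the farthest-point greedy rule) can be sketched as follows; this is our own illustrative code under the assumption that dist is a full symmetric distance matrix.

```python
def dyer_frieze(dist, p, start=0):
    """Algorithm DF: repeatedly place the next center at a point that is
    farthest from the centers chosen so far."""
    n = len(dist)
    X = [start]                                       # first center is arbitrary
    while len(X) < p:
        # D_uX = distance to the nearest chosen center; pick the u maximizing it
        u = max(range(n), key=lambda u: min(dist[u][j] for j in X))
        X.append(u)
    return X
```

On the six collinear points {1, 2, 3, 4, 5, 6} with p = 2, every starting point leads DF to place a center at an extreme point (1 or 6), giving radius 2 against the optimal radius 1, matching the bad example discussed below.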

In our computational study, we executed n runs starting with all possible initial choices. Our computational study indicates that this implementation is practically useful; however, it does not improve the error bound of 2 even in the (polynomially solvable) case of p = 2, as demonstrated by the following example. Consider 6 points on a line with coordinates {1, 2, 3, 4, 5, 6} and let p = 2. The optimal solution locates the centers at 2 and 5 and has radius 1. No matter what the initial point is, DF will locate the second center at one of the extreme locations (1 or 6) and the radius obtained will be 2.

Drezner's algorithm is a heuristic designed to solve the p-center problem in practice. No polynomial bound on its running time is known, and we will show below that its error ratio cannot be bounded by a constant. However, it often produces good solutions in a short time. To simplify the presentation of the algorithm we assume that the elements of the distance matrix d are all distinct. Consider a subset X ⊆ V. For each j ∈ X let I_j(X) = {i ∈ V : d_ij = min_{k∈X} d_ik}. Thus, I_j(X) is the set of customers that will be served by the center at j, given the solution X. For j ∈ X let k(j) be the optimal center location in the 1-center problem in which the customers are located in the subset I_j(X) and the center can be located at any point of V. Note that under our simplifying assumption k(j) is unique.

Algorithm DR:
1. Arbitrarily select X ⊆ V such that |X| = p.
2. If k(j) = j for all j ∈ X, stop and output X. Else, for each j ∈ X such that k(j) ≠ j, insert k(j) into X, delete j from X, and repeat this step.

Assuming that the elements of d are distinct, the algorithm is deterministic except for the choice of the initial solution. One can apply the algorithm several times, starting with different initial solutions, and finally select the best of the solutions obtained. Such a strategy is desirable since otherwise the algorithm may easily be trapped in bad solutions.
For example, consider the graph in Figure 1 for the 3-center problem (one can perturb the distances by infinitesimal amounts so that they are all distinct). If we start with a solution consisting of the points whose x coordinate is M, then this solution cannot be improved by DR.
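Under the same assumptions (a full distance matrix, ideally with distinct entries), Algorithm DR can be sketched as follows. This is our own sketch; ties in the customer assignment are broken arbitrarily here, which the paper's distinctness assumption rules out.

```python
def one_center(dist, cluster, n):
    """Best location in V for a single center serving `cluster` (the 1-center problem)."""
    return min(range(n), key=lambda k: max(dist[i][k] for i in cluster))

def drezner(dist, X):
    """Algorithm DR: re-optimize each center for its own set of customers I_j(X)
    until every center j satisfies k(j) = j."""
    n = len(dist)
    X = list(X)
    changed = True
    while changed:
        changed = False
        for idx, j in enumerate(X):
            # I_j(X): customers whose nearest center in X is j
            cluster = [i for i in range(n)
                       if dist[i][j] == min(dist[i][k] for k in X)]
            if not cluster:
                continue
            kj = one_center(dist, cluster, n)
            if kj != j and kj not in X:
                X[idx] = kj
                changed = True
    return X
```

Starting from the centers {0, 10} on the collinear points {0, 1, 2, 10, 11, 12}, DR moves each center to the 1-center of its own cluster and stops at {1, 11}, the optimal 2-center solution.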

6 The computational study

We compare l-lexicographic local search for l = 1, 2, 3, 4 on several p-center instances satisfying the triangle inequality, and we compare the performance of l-lexicographic local search with that of ordinary local search (LS). We produced initial solutions using various heuristics, and continued either by LS with neighborhoods allowing a single relocation of a center, or by simulated annealing with the same type of neighborhood. The algorithms were programmed in Fortran 77 and ran on Sun4m computers running SunOS U1. We used different machines for different instances, but all the algorithms that we tested on any given instance were executed on the same machine.

We tested each instance with several p values. We applied each algorithm n times, where n is the number of nodes in the graph. The reason for executing n runs is that we often used DF to form initial solutions; DF is deterministic up to the initial choice, under the assumption that the distances are distinct. In each run we applied DF with a different starting point.

The detailed test results appear in the appendix. For each instance and each algorithm, they give the best solution value obtained in n runs, the frequency of this value, the average value, the worst value, and the total execution time of the n runs (where s, m, h denote seconds, minutes, and hours, respectively). The row denoted DF refers to the application of Dyer and Frieze's algorithm. DFDR is the combined algorithm that first runs DF and then uses its output as an initial solution for DR. A DFDR prefix for LEX means that the lexicographic local search started from the solution obtained by DFDR. A SIM prefix means that simulated annealing was used with the lexicographic criterion (with the output of DFDR giving the initial solution). To apply simulated annealing we concatenated each solution's l highest elements.
For example, when l = 2, the vector (122, 25, 32, 6) is represented by the concatenation of its two highest elements, 122 and 32. Applying LS with this representation is equivalent to applying 2LEX to the original data. This representation enabled us to test lLEX with different values of l using the same simulated annealing code. Since our goal is to test the lexicographic local search against ordinary local search, the details of the simulated annealing are not important and are not described here. Note that it was not essential to develop an efficient simulated annealing program.

In our computational study, we changed the solution whenever a lexicographic improvement was detected. On small problems, with up to 50 nodes and p = 10, the use of lexicographic search was not justified, since the optimal solutions (which were found separately using an integer programming solver) were also often found by the faster DFDR algorithm or by LS.

We will present a summary of the results according to different criteria. We say that algorithm A performed better than algorithm B on a given instance if: the best solution returned by A is better than the one returned by B; or the best solutions have the same value but A returned the best solution more frequently; or both A and B returned the same best solution the same number of times but the average value returned by A was better. Lastly, A is better than B if both algorithms performed equally with respect to all of the above criteria and A was faster than B.

In our study we found that DFDRlLEX is better than lLEX:
DFDR1LEX had better results on 30 instances (in 14 instances it produced a better best solution), whereas 1LEX had better results on 6 instances (in one of them it produced a better best solution).
DFDR2LEX had better results on 22 instances (in 8 instances it produced a better best solution), whereas 2LEX had better results on 14 instances (in 4 instances it produced a better best solution).
DFDR3LEX had better results on 21 instances (in 3 instances it produced a better best solution), whereas 3LEX had better results on 15 instances (in 4 instances it produced a better best solution).
DFDR4LEX had better results on 19 instances (in one instance it produced a better best solution), whereas 4LEX had better results on 17 instances (in one instance it produced a better best solution).
The following is a summary of the results comparing lLEX to (l + 1)LEX:
2LEX had better results on 35 instances (in 17 instances it produced a better best solution), whereas 1LEX had better results on one instance (it never produced a better best result).
3LEX had better results on 30 instances (in 7 instances it produced a better best solution), whereas 2LEX had better results on 4 instances (it never produced a better best result).

4LEX had better results on 28 instances (in 3 instances it produced a better best solution), whereas 3LEX had better results on 6 instances (in one instance it produced a better best solution).

The following is a summary of the results comparing DFDRlLEX to DFDR(l + 1)LEX:
DFDR2LEX had better results on 30 instances (in 3 instances it produced a better best solution), whereas DFDR1LEX had better results on 3 instances (it never produced a better best result).
DFDR3LEX had better results on 22 instances (in one instance it produced a better best solution), whereas DFDR2LEX had better results on 9 instances (it never produced a better best result).
DFDR4LEX had better results on 21 instances (in one instance it produced a better best solution), whereas DFDR3LEX had better results on 12 instances (in one instance it produced a better best solution).

The following is a summary of the results comparing SIMlLEX to SIM(l + 1)LEX:
SIM2LEX had better results on 20 instances (in one instance it produced a better best solution), whereas SIM1LEX had better results on 9 instances (it never produced a better best result).
SIM3LEX had better results on 13 instances (in one instance it produced a better best solution), whereas SIM2LEX had better results on 10 instances (it never produced a better best result).
SIM4LEX had better results on 11 instances (it never produced a better best result), whereas SIM3LEX had better results on 15 instances (it never produced a better best result).

The following properties could be observed in the study:

DF is very fast (it produces solutions within a few seconds even for large instances). It produces solutions of varying quality, some very far from the optimum (for example, in FRI26 with p = 10). However, the best among the n values is often a good approximation, mainly when p is relatively large.

The combination DFDR, in which the output of DF provides the initial solutions for DR, is a reasonable approximation algorithm. It is still very fast, but sometimes even its best solution is inferior to those produced by 1LEX.

When applying the lexicographic rule in local search, from random points or from the results of DFDR, the quality of the solutions usually improved as l grew from 1 to 4. The main improvement occurs when increasing l from 1 to 2.

When applying the lexicographic rule in simulated annealing, the quality of the solution usually improves when l grows from 1 to 2. However, when l grows further, it sometimes improves and sometimes it does not.

The change in computation time as a result of applying the lexicographic search instead of local search is insignificant.

7 Conclusion

The use of lexicographic local search was motivated by both worst-case analysis and numerical results. The advantages and limitations of the lexicographic local search were emphasized by bad examples. The lexicographic local search proved superior to ordinary local search; this superiority was demonstrated by a worst-case analysis and by a computational study.

Appendix

In the appendix we show the detailed numerical results for the following instances:

CT24: An instance with 24 nodes from [8].
FRI26: An instance with 26 nodes from [15].
CT30: An instance with 30 nodes from [9].
DANTZIG42: An instance with 42 nodes from [15].
SWISS42: An instance with 42 nodes from [15].
CT48: An instance with 48 nodes from [8].
EIL51: An instance with 51 nodes from [15].
BERLIN52: An instance with 52 nodes from [15].
BRASIL58: An instance with 58 nodes from [15].
ST70: An instance with 70 nodes from [15].
EIL76: An instance with 76 nodes from [15].
EIL101: An instance with 101 nodes from [15].
CT120: An instance with 120 nodes from [8].
SI535: An instance with 535 nodes from [15].

We tested several p values in each case.

[Table 1: Results for CT24]

[Table 2: Results for FRI26]

[Table 3: Results for CT30]

[Table 4: Results for DANTZIG42]

[Table 5: Results for SWISS42]

[Table 6: Results for CT48]

[Table 7: Results for EIL51]

[Table 8: Results for BERLIN52]

[Table 9: Results for BRASIL58]

[Table 10: Results for ST70]

[Table 11: Results for EIL76]

[Table 12: Results for EIL101]

[Table 13: Results for CT120]

[Table 14: Results for SI535]

DFDRLEX is extremely slow for p =

References

[1] L. Cooper, Location-allocation problems, Operations Research 11.
[2] Z. Drezner, The p-center problem - heuristics and optimal algorithms, Journal of the Operational Research Society 35.
[3] M. E. Dyer and A. M. Frieze, A simple heuristic for the p-center problem, Operations Research Letters 3.
[4] R. J. Fowler, M. S. Paterson and S. L. Tanimoto, Optimal packing and covering in the plane are NP-complete, Information Processing Letters 12.
[5] M. Fürer and B. Raghavachari, Approximating the minimum degree spanning tree to within one from the optimal degree, Proceedings of the Third Annual ACM-SIAM Symposium on Discrete Algorithms.
[6] M. Fürer and B. Raghavachari, Approximating the minimum degree Steiner tree to within one of optimal, Journal of Algorithms 17.
[7] C. A. Glass, C. N. Potts and P. Shade, Unrelated parallel machine scheduling using local search, Mathematical and Computer Modeling 20, 41-52.
[8] M. Grötschel and O. Holland, Solution of large-scale symmetric travelling salesman problems, Mathematical Programming 51.

[9] G. Y. Handler and P. Mirchandani, Location on Networks: Theory and Algorithms, M.I.T. Press, Cambridge, Massachusetts.

[10] D. S. Hochbaum and D. B. Shmoys, A best possible heuristic for the k-center problem, Mathematics of Operations Research 10.

[11] D. S. Hochbaum and D. B. Shmoys, A unified approach to approximation algorithms for bottleneck problems, Journal of the Association for Computing Machinery 33.

[12] S. Khanna, R. Motwani, M. Sudan and U. Vazirani, On syntactic versus computational views of approximability, Proceedings of the 35th Annual IEEE Symposium on Foundations of Computer Science.

[13] J. Plesník, A heuristic for the p-center problem in graphs, Discrete Applied Mathematics 17.

[14] J. Plesník, On the interchange heuristic for locating centers and medians in a graph, Math. Slovaca 37.

[15] G. Reinelt, TSPLIB, ftp://elib.zib-berlin.de/pub/mp-testdata/tsp/index.html.


Greedy Algorithms 1 {K(S) K(S) C} For large values of d, brute force search is not feasible because there are 2 d {1,..., d}. Greedy Algorithms 1 Simple Knapsack Problem Greedy Algorithms form an important class of algorithmic techniques. We illustrate the idea by applying it to a simplified version of the Knapsack Problem. Informally,

More information

High Dimensional Indexing by Clustering

High Dimensional Indexing by Clustering Yufei Tao ITEE University of Queensland Recall that, our discussion so far has assumed that the dimensionality d is moderately high, such that it can be regarded as a constant. This means that d should

More information