11. APPROXIMATION ALGORITHMS

load balancing, center selection, pricing method: weighted vertex cover, LP rounding: weighted vertex cover, generalized load balancing, knapsack problem
Lecture slides by Kevin Wayne. Copyright 2005 Pearson-Addison Wesley.

Coping with NP-completeness

Q. Suppose I need to solve an NP-hard problem. What should I do?
A. Sacrifice one of three desired features.
i. Solve arbitrary instances of the problem.
ii. Solve problem to optimality.
iii. Solve problem in polynomial time.

ρ-approximation algorithm.
Guaranteed to run in poly-time.
Guaranteed to solve arbitrary instances of the problem.
Guaranteed to find a solution within ratio ρ of the true optimum.

Challenge. Need to prove a solution's value is close to optimum, without even knowing what the optimum value is.

Load balancing (SECTION 11.1)

Input. m identical machines; n jobs, job j has processing time t_j.
Job j must run contiguously on one machine.
A machine can process at most one job at a time.

Def. Let S[i] be the subset of jobs assigned to machine i. The load of machine i is L[i] = Σ_{j ∈ S[i]} t_j.

Def. The makespan is the maximum load on any machine, L = max_i L[i].

Load balancing. Assign each job to a machine to minimize makespan.
[figure: jobs a-g assigned to two machines, with loads L[1] and L[2] shown on a time axis]

Load balancing on 2 machines is NP-hard

Claim. Load balancing is hard even if m = 2 machines.
Pf. PARTITION ≤_P LOAD-BALANCE. (PARTITION is NP-complete by Exercise 8.26.)

Load balancing: list scheduling

List-scheduling algorithm.
Consider n jobs in some fixed order.
Assign job j to the machine i whose load is smallest so far.

LIST-SCHEDULING (m, n, t_1, t_2, ..., t_n)
  FOR i = 1 TO m
    L[i] ← 0.        (load on machine i)
    S[i] ← ∅.        (jobs assigned to machine i)
  FOR j = 1 TO n
    i ← argmin_k L[k].        (machine i has smallest load)
    S[i] ← S[i] ∪ { j }.      (assign job j to machine i)
    L[i] ← L[i] + t_j.        (update load of machine i)
  RETURN S[1], S[2], ..., S[m].

Implementation. O(n log m) using a priority queue for the loads L[k].
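As a concrete companion to the LIST-SCHEDULING pseudocode above, here is a small Python sketch (the function name and interface are my own, not from the slides) that uses a heap as the priority queue to get the O(n log m) bound.

import heapq

def list_scheduling(m, t):
    """Greedy list scheduling: assign each job, in the given order, to the
    machine whose current load is smallest. Returns (makespan, assignment),
    where assignment[i] is the list of job indices placed on machine i.
    A heap of (load, machine) pairs makes each step O(log m)."""
    loads = [(0.0, i) for i in range(m)]
    heapq.heapify(loads)
    assignment = [[] for _ in range(m)]
    for j, tj in enumerate(t):
        load, i = heapq.heappop(loads)          # machine with smallest load
        assignment[i].append(j)                 # assign job j to machine i
        heapq.heappush(loads, (load + tj, i))   # update load of machine i
    makespan = max(load for load, _ in loads)
    return makespan, assignment

For example, list_scheduling(2, [2, 3, 4, 6]) returns makespan 9 for this instance, while the optimum is 8 ({2, 6} and {3, 4}), consistent with the factor-2 guarantee proved next.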

Load balancing: list scheduling analysis

Theorem. [Graham 1966] Greedy algorithm is a 2-approximation.
First worst-case analysis of an approximation algorithm.
Need to compare the resulting solution with the optimal makespan L*.

Lemma 1. The optimal makespan L* ≥ max_j t_j.
Pf. Some machine must process the most time-consuming job.

Lemma 2. The optimal makespan L* ≥ (1/m) Σ_j t_j.
Pf. The total processing time is Σ_j t_j. One of the m machines must do at least a 1/m fraction of the total work.

Theorem. Greedy algorithm is a 2-approximation.
Pf. Consider the load L[i] of the bottleneck machine i (the machine that ends up with the highest load).
Let j be the last job scheduled on machine i.
When job j was assigned to machine i, machine i had the smallest load. Its load before the assignment was L[i] - t_j, so L[i] - t_j ≤ L[k] for all 1 ≤ k ≤ m.
Sum these inequalities over all k and divide by m:
  L[i] - t_j ≤ (1/m) Σ_k L[k] = (1/m) Σ_k t_k ≤ L*.   (Lemma 2)
Now, using Lemma 1 for the second term,
  L = L[i] = (L[i] - t_j) + t_j ≤ L* + L* = 2 L*.  ∎

Q. Is our analysis tight?
A. Essentially yes.
Ex: m machines, m(m - 1) jobs of length 1, and one job of length m.
[figure, m = 10: list scheduling spreads the unit jobs evenly, then the long job arrives and machines 2 through 10 sit idle]
  list scheduling makespan = 19 = 2m - 1
  optimal makespan = 10 = m

Load balancing: LPT rule

Longest processing time (LPT). Sort the n jobs in decreasing order of processing times; then run the list scheduling algorithm.

LPT-LIST-SCHEDULING (m, n, t_1, t_2, ..., t_n)
  SORT jobs and renumber so that t_1 ≥ t_2 ≥ ... ≥ t_n.
  FOR i = 1 TO m
    L[i] ← 0.        (load on machine i)
    S[i] ← ∅.        (jobs assigned to machine i)
  FOR j = 1 TO n
    i ← argmin_k L[k].        (machine i has smallest load)
    S[i] ← S[i] ∪ { j }.      (assign job j to machine i)
    L[i] ← L[i] + t_j.        (update load of machine i)
  RETURN S[1], S[2], ..., S[m].
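The LPT rule is just list scheduling on a sorted job order; a minimal sketch that reuses the list_scheduling helper from the earlier example (both names are mine, not from the slides):

def lpt_scheduling(m, t):
    """Longest-processing-time rule: sort jobs in decreasing order of
    processing time, then run greedy list scheduling on that order."""
    order = sorted(range(len(t)), key=lambda j: -t[j])
    makespan, assignment = list_scheduling(m, [t[j] for j in order])
    # map positions in the sorted order back to original job indices
    assignment = [[order[p] for p in jobs] for jobs in assignment]
    return makespan, assignment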

Load balancing: LPT rule analysis

Observation. If the bottleneck machine i has only 1 job, then the solution is optimal.
Pf. Any solution must schedule that job.

Lemma 3. If there are more than m jobs, L* ≥ 2 t_{m+1}.
Pf. Consider the processing times of the first m + 1 jobs, t_1 ≥ t_2 ≥ ... ≥ t_{m+1}. Each takes at least t_{m+1} time. There are m + 1 jobs and m machines, so by the pigeonhole principle, at least one machine gets two of them.

Theorem. LPT rule is a 3/2-approximation algorithm.
Pf. [similar to the proof for list scheduling]
Consider the load L[i] of the bottleneck machine i.
Let j be the last job scheduled on machine i.
Assuming machine i has at least 2 jobs, we have j ≥ m + 1, so t_j ≤ t_{m+1} ≤ ½ L* by Lemma 3. Hence
  L = L[i] = (L[i] - t_j) + t_j ≤ L* + ½ L* = (3/2) L*.  ∎

Q. Is our 3/2 analysis tight?
A. No.

Theorem. [Graham 1969] LPT rule is a 4/3-approximation.
Pf. More sophisticated analysis of the same algorithm.

Q. Is Graham's 4/3 analysis tight?
A. Essentially yes.
Ex: m machines, n = 2m + 1 jobs: 2 jobs each of length m, m + 1, ..., 2m - 1, and one more job of length m. Then L / L* = (4m - 1) / (3m).

11. APPROXIMATION ALGORITHMS: center selection (SECTION 11.2)

Center selection problem

Input. Set of n sites s_1, ..., s_n and an integer k > 0.

Center selection problem. Select a set of k centers C so that the maximum distance r(C) from a site to its nearest center is minimized.

Notation.
dist(x, y) = distance between sites x and y.
dist(s_i, C) = min_{c ∈ C} dist(s_i, c) = distance from s_i to the closest center.
r(C) = max_i dist(s_i, C) = smallest covering radius.

Goal. Find a set of centers C that minimizes r(C), subject to |C| = k.

Distance function properties.
dist(x, x) = 0                              (identity)
dist(x, y) = dist(y, x)                     (symmetry)
dist(x, y) ≤ dist(x, z) + dist(z, y)        (triangle inequality)

Center selection example

Ex: each site is a point in the plane, a center can be any point in the plane, dist(x, y) = Euclidean distance.
Remark: the search space can be infinite!

Greedy algorithm: a false start

Greedy algorithm. Put the first center at the best possible location for a single center, and then keep adding centers so as to reduce the covering radius each time by as much as possible.
Remark: arbitrarily bad!
[figure, k = 2 centers: two well-separated clusters of sites; the first greedy center lands between the clusters, so the result is far from optimal]

Center selection: greedy algorithm

Repeatedly choose the next center to be the site farthest from any existing center.

GREEDY-CENTER-SELECTION (k, n, s_1, s_2, ..., s_n)
  C ← ∅.
  REPEAT k times
    Select a site s_i with maximum distance dist(s_i, C).   (site farthest from any center)
    C ← C ∪ { s_i }.
  RETURN C.

Property. Upon termination, all centers in C are pairwise at least r(C) apart.
Pf. By construction of the algorithm.

Center selection: analysis of greedy algorithm

Lemma. Let C* be an optimal set of centers. Then r(C) ≤ 2 r(C*).
Pf. [by contradiction] Assume r(C*) < ½ r(C).
For each site c_i ∈ C, consider the ball of radius ½ r(C) around it.
Exactly one c_i* ∈ C* lies in each ball; let c_i be the site paired with c_i*.
Consider any site s and its closest center c_i* ∈ C*. Then
  dist(s, C) ≤ dist(s, c_i) ≤ dist(s, c_i*) + dist(c_i*, c_i) ≤ 2 r(C*),
using the triangle inequality and the fact that c_i* is the closest center to s.
Thus, r(C) ≤ 2 r(C*).  ∎

Theorem. Greedy algorithm is a 2-approximation for the center selection problem.

Remark. The greedy algorithm always places centers at sites, but is still within a factor of 2 of the best solution that is allowed to place centers anywhere (e.g., points in the plane).

Question. Is there hope of a 3/2-approximation? 4/3?

Dominating set reduces to center selection

Theorem. Unless P = NP, there is no ρ-approximation for the center selection problem for any ρ < 2.
Pf. We show how we could use a (2 - ε)-approximation algorithm for CENTER-SELECTION to solve DOMINATING-SET in poly-time.
Let G = (V, E), k be an instance of DOMINATING-SET.
Construct an instance G′ of CENTER-SELECTION with sites V and distances
  dist(u, v) = 1 if (u, v) ∈ E
  dist(u, v) = 2 if (u, v) ∉ E.
Note that G′ satisfies the triangle inequality.
G has a dominating set of size k iff there exist k centers C* with r(C*) = 1.
Thus, if G has a dominating set of size k, a (2 - ε)-approximation algorithm for CENTER-SELECTION would find a solution C* with r(C*) = 1, since it cannot use any edge of distance 2.  ∎
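A minimal Python sketch of the GREEDY-CENTER-SELECTION pseudocode above (the function name is mine; since dist(s, ∅) is taken to be infinite, the first center may be an arbitrary site):

def greedy_center_selection(k, sites, dist):
    """Greedy 2-approximation for center selection. `sites` is a list of
    hashable sites, `dist` any metric on them. Returns (centers, r), where
    r is the max distance from a site to its nearest chosen center."""
    centers = [sites[0]]                         # first center: arbitrary site
    d = {s: dist(s, sites[0]) for s in sites}    # d[s] = dist(s, C)
    while len(centers) < k:
        farthest = max(sites, key=lambda s: d[s])    # site farthest from C
        centers.append(farthest)
        for s in sites:                              # update dist(s, C)
            d[s] = min(d[s], dist(s, farthest))
    return centers, max(d.values())

With points in the plane represented as coordinate tuples, dist can simply be math.dist; by the lemma above, the returned radius is at most 2 r(C*).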

11. APPROXIMATION ALGORITHMS: pricing method: weighted vertex cover (SECTION 11.4)

Weighted vertex cover

Definition. Given a graph G = (V, E), a vertex cover is a set S ⊆ V such that each edge in E has at least one end in S.

Weighted vertex cover. Given a graph G with vertex weights, find a vertex cover of minimum weight.
[figure: two vertex covers of the same example graph with different total weights]

Pricing method

Pricing method. Each edge must be covered by some vertex.
Edge e = (i, j) pays a price p_e ≥ 0 to use both vertex i and vertex j.
Set prices and find a vertex cover simultaneously.

Fairness. Edges incident to vertex i should pay ≤ w_i in total:
  for each vertex i:  Σ_{e = (i, j)} p_e ≤ w_i.
Vertex i is called tight when this holds with equality.

Fairness lemma. For any vertex cover S and any fair prices p_e:  Σ_e p_e ≤ w(S).
Pf. Each edge e is covered by at least one node in S; summing the fairness inequalities over the nodes in S,
  Σ_{e ∈ E} p_e ≤ Σ_{i ∈ S} Σ_{e = (i, j)} p_e ≤ Σ_{i ∈ S} w_i = w(S).  ∎

WEIGHTED-VERTEX-COVER (G, w)
  S ← ∅.
  FOREACH e ∈ E : p_e ← 0.
  WHILE (there exists an edge (i, j) such that neither i nor j is tight)
    Select such an edge e = (i, j).
    Increase p_e as much as possible until i or j becomes tight.
  S ← set of all tight nodes.
  RETURN S.
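A Python sketch of the pricing method (function name and edge order are mine). Processing each edge once is one valid schedule for the while loop above: after an edge is handled, at least one of its endpoints is tight, and tightness never goes away.

def pricing_vertex_cover(nodes, edges, w):
    """Pricing-method 2-approximation for weighted vertex cover.
    nodes: iterable of vertices; edges: list of (i, j) pairs;
    w: dict vertex -> nonnegative weight. Returns (cover, edge prices)."""
    remaining = {v: w[v] for v in nodes}     # w_i minus prices paid so far
    price = {}
    for (i, j) in edges:
        if remaining[i] == 0 or remaining[j] == 0:
            price[(i, j)] = 0.0              # an endpoint is already tight
            continue
        p = min(remaining[i], remaining[j])  # raise p_e until i or j is tight
        price[(i, j)] = p
        remaining[i] -= p
        remaining[j] -= p
    cover = {v for v in nodes if remaining[v] == 0}   # all tight nodes
    return cover, price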

Pricing method example

[figure: a sequence of snapshots showing vertex weights and edge prices as the algorithm raises prices until vertices become tight]

Pricing method: analysis

Theorem. The pricing method is a 2-approximation for WEIGHTED-VERTEX-COVER.
Pf. The algorithm terminates since at least one new node becomes tight after each iteration of the while loop.

Let S = set of all tight nodes upon termination of the algorithm.
S is a vertex cover: if some edge (i, j) were uncovered, then neither i nor j would be tight. But then the while loop would not have terminated.

Let S* be an optimal vertex cover. We show w(S) ≤ 2 w(S*):
  w(S) = Σ_{i ∈ S} w_i
       = Σ_{i ∈ S} Σ_{e = (i, j)} p_e          (all nodes in S are tight)
       ≤ Σ_{i ∈ V} Σ_{e = (i, j)} p_e          (S ⊆ V, prices ≥ 0)
       = 2 Σ_{e ∈ E} p_e                        (each edge counted twice)
       ≤ 2 w(S*).                               (fairness lemma)  ∎

11. APPROXIMATION ALGORITHMS: LP rounding: weighted vertex cover (SECTION 11.6)

Weighted vertex cover

Given a graph G = (V, E) with vertex weights w_i ≥ 0, find a min-weight subset of vertices S ⊆ V such that every edge is incident to at least one vertex in S.
[figure: an example graph and a vertex cover of total weight 57]

Weighted vertex cover: ILP formulation

Given a graph G = (V, E) with vertex weights w_i ≥ 0, find a min-weight subset of vertices S ⊆ V such that every edge is incident to at least one vertex in S.

Integer linear programming formulation.
Model inclusion of each vertex i using a 0/1 variable x_i:
  x_i = 0 if vertex i is not in the vertex cover,
  x_i = 1 if vertex i is in the vertex cover.
Vertex covers are in 1-1 correspondence with 0/1 assignments: S = { i ∈ V : x_i = 1 }.
Objective function: minimize Σ_i w_i x_i.
For every edge (i, j), we must take either vertex i or j (or both): x_i + x_j ≥ 1.

(ILP)  min  Σ_{i ∈ V} w_i x_i
       s.t. x_i + x_j ≥ 1    for all (i, j) ∈ E
            x_i ∈ {0, 1}     for all i ∈ V

Observation. If x* is an optimal solution to the ILP, then S = { i ∈ V : x_i* = 1 } is a min-weight vertex cover.

Integer linear programming

Given integers a_ij, b_i, and c_j, find integers x_j that satisfy:
  min c^T x,  s.t. Ax ≥ b, x ≥ 0, x integral;
equivalently,
  min Σ_{j=1}^n c_j x_j
  s.t. Σ_{j=1}^n a_ij x_j ≥ b_i   for 1 ≤ i ≤ m
       x_j ≥ 0                     for 1 ≤ j ≤ n
       x_j integral                for 1 ≤ j ≤ n

Observation. The vertex cover formulation proves that INTEGER-PROGRAMMING is an NP-hard search problem.

Linear programming

Given integers a_ij, b_i, and c_j, find real numbers x_j that satisfy:
  min c^T x,  s.t. Ax ≥ b, x ≥ 0;
equivalently,
  min Σ_{j=1}^n c_j x_j
  s.t. Σ_{j=1}^n a_ij x_j ≥ b_i   for 1 ≤ i ≤ m
       x_j ≥ 0                     for 1 ≤ j ≤ n

Linear. No x², xy, arccos(x), x(1 - x), etc.

Simplex algorithm. [Dantzig 1947] Can solve LP in practice.
Ellipsoid algorithm. [Khachiyan 1979] Can solve LP in poly-time.
Interior point algorithms. [Karmarkar 1984, Renegar 1988, ...] Can solve LP both in poly-time and in practice.

LP feasible region

LP geometry in 2D.
[figure: the feasible region of a small two-variable LP, bounded by the constraint lines x_1 = 0, x_2 = 0, and two further constraints]

Weighted vertex cover: LP relaxation

Linear programming relaxation.

(LP)  min  Σ_{i ∈ V} w_i x_i
      s.t. x_i + x_j ≥ 1    for all (i, j) ∈ E
           x_i ≥ 0          for all i ∈ V

Observation. The optimal value of the LP is ≤ the optimal value of the ILP.
Pf. The LP has fewer constraints.

Note. The LP is not equivalent to weighted vertex cover, even if all weights are 1. (Ex: on a triangle, the LP optimum sets x_i = ½ at every vertex.)

Q. How can solving the LP help us find a low-weight vertex cover?
A. Solve the LP and round the fractional values.

Weighted vertex cover: LP rounding algorithm

Lemma. If x* is an optimal solution to the LP, then S = { i ∈ V : x_i* ≥ ½ } is a vertex cover whose weight is at most twice the minimum possible weight.
Pf. [S is a vertex cover] Consider an edge (i, j) ∈ E. Since x_i* + x_j* ≥ 1, either x_i* ≥ ½ or x_j* ≥ ½ (or both), so (i, j) is covered.
Pf. [S has desired cost] Let S* be an optimal vertex cover. Then
  Σ_{i ∈ S*} w_i ≥ Σ_{i ∈ V} w_i x_i* ≥ ½ Σ_{i ∈ S} w_i,
where the first inequality holds because the LP is a relaxation and the second because x_i* ≥ ½ for all i ∈ S.  ∎

Theorem. The rounding algorithm is a 2-approximation algorithm.
Pf. Lemma + the fact that the LP can be solved in poly-time.

Weighted vertex cover inapproximability

Theorem. [Dinur-Safra 2004] If P ≠ NP, then there is no ρ-approximation for WEIGHTED-VERTEX-COVER for any ρ < 1.3606 (even if all weights are 1).

  On the Hardness of Approximating Minimum Vertex Cover. Irit Dinur, Samuel Safra.
  Abstract. We prove the Minimum Vertex Cover problem to be NP-hard to approximate to within a factor of 1.3606, extending on previous PCP and hardness of approximation techniques.

Open research problem. Close the gap.
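The LP relaxation can be handed to any LP solver; the sketch below (my own helper, assuming SciPy is installed) builds the relaxation for scipy.optimize.linprog and rounds at ½ as in the lemma above.

from scipy.optimize import linprog   # assumes SciPy is available

def lp_rounding_vertex_cover(nodes, edges, w):
    """Weighted vertex cover via LP rounding: solve
    min sum_i w_i x_i  s.t.  x_i + x_j >= 1 on every edge, 0 <= x_i <= 1,
    then keep every vertex with x_i* >= 1/2 (a 2-approximation)."""
    idx = {v: k for k, v in enumerate(nodes)}
    c = [w[v] for v in nodes]
    # x_i + x_j >= 1  is written as  -x_i - x_j <= -1  for linprog
    A_ub, b_ub = [], []
    for (i, j) in edges:
        row = [0.0] * len(nodes)
        row[idx[i]] = -1.0
        row[idx[j]] = -1.0
        A_ub.append(row)
        b_ub.append(-1.0)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * len(nodes))
    return {v for v in nodes if res.x[idx[v]] >= 0.5}

On a triangle with unit weights the unique LP optimum is x_i* = ½ everywhere, so the rounding keeps all three vertices (weight 3) against an optimum cover of weight 2, matching the factor-2 bound.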

11. APPROXIMATION ALGORITHMS: generalized load balancing (SECTION 11.7)

Generalized load balancing

Input. Set of m machines M; set of n jobs J.
Job j ∈ J must run contiguously on an authorized machine in M_j ⊆ M.
Job j ∈ J has processing time t_j.
Each machine can process at most one job at a time.

Def. Let J_i be the subset of jobs assigned to machine i. The load of machine i is L_i = Σ_{j ∈ J_i} t_j.

Def. The makespan is the maximum load on any machine: L = max_i L_i.

Generalized load balancing. Assign each job to an authorized machine to minimize makespan.

Generalized load balancing: integer linear program and relaxation

ILP formulation. x_ij = time machine i spends processing job j.

(IP)  min  L
      s.t. Σ_i x_ij = t_j       for all j ∈ J
           Σ_j x_ij ≤ L         for all i ∈ M
           x_ij ∈ {0, t_j}      for all j ∈ J and i ∈ M_j
           x_ij = 0             for all j ∈ J and i ∉ M_j

LP relaxation.

(LP)  min  L
      s.t. Σ_i x_ij = t_j       for all j ∈ J
           Σ_j x_ij ≤ L         for all i ∈ M
           x_ij ≥ 0             for all j ∈ J and i ∈ M_j
           x_ij = 0             for all j ∈ J and i ∉ M_j

Generalized load balancing: lower bounds

Lemma 1. The optimal makespan L* ≥ max_j t_j.
Pf. Some machine must process the most time-consuming job.

Lemma 2. Let L be the optimal value of the LP. Then the optimal makespan L* ≥ L.
Pf. The LP has fewer constraints than the ILP formulation.
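The (LP) above is straightforward to hand to a generic solver; here is a sketch (again assuming SciPy is available, with a variable packing of my own choosing) that returns the fractional assignment x_ij and the LP value L:

from scipy.optimize import linprog   # assumes SciPy is available

def generalized_lb_lp(t, authorized, m):
    """LP relaxation for generalized load balancing. t[j] = processing time,
    authorized[j] = set of machines allowed for job j (machines 0..m-1).
    One variable x_ij per authorized pair, plus one variable L; minimize L.
    Returns (L, x) with x[(i, j)] the time machine i spends on job j."""
    n = len(t)
    pairs = [(i, j) for j in range(n) for i in authorized[j]]
    col = {p: k for k, p in enumerate(pairs)}
    L_col = len(pairs)                            # last column is L
    c = [0.0] * len(pairs) + [1.0]                # objective: minimize L

    A_eq, b_eq = [], []
    for j in range(n):                            # sum_i x_ij = t_j
        row = [0.0] * (len(pairs) + 1)
        for i in authorized[j]:
            row[col[(i, j)]] = 1.0
        A_eq.append(row)
        b_eq.append(t[j])

    A_ub, b_ub = [], []
    for i in range(m):                            # sum_j x_ij - L <= 0
        row = [0.0] * (len(pairs) + 1)
        for (ii, j) in pairs:
            if ii == i:
                row[col[(ii, j)]] = 1.0
        row[L_col] = -1.0
        A_ub.append(row)
        b_ub.append(0.0)

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)
    x = {p: res.x[col[p]] for p in pairs}
    return res.x[L_col], x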

Generalized load balancing: structure of LP solution

Lemma 3. Let x be a solution to the LP. Let G(x) be the graph with an edge between machine i and job j if x_ij > 0. Then we can assume G(x) is acyclic.
Pf. (deferred: we can transform x into another LP solution for which G(x) is acyclic if the LP solver doesn't return such an x.)

Generalized load balancing: rounding

Rounded solution. Find an LP solution x where G(x) is a forest. Root the forest G(x) at some arbitrary machine node r.
If job j is a leaf node, assign j to its parent machine i.
If job j is not a leaf node, assign j to any one of its children.

Lemma 4. The rounded solution only assigns jobs to authorized machines.
Pf. If job j is assigned to machine i, then x_ij > 0. The LP solution can only assign positive value to authorized machines.

Generalized load balancing: analysis

Lemma 5. If job j is a leaf node and machine i = parent(j), then x_ij = t_j.
Pf. Since j is a leaf, x_kj = 0 for all machines k ≠ parent(j). The LP constraint guarantees Σ_i x_ij = t_j.

Lemma 6. At most one non-leaf job is assigned to a machine.
Pf. The only possible non-leaf job assigned to machine i is parent(i).

Theorem. The rounded solution is a 2-approximation.
Pf. Let J(i) be the jobs assigned to machine i. By Lemma 6, the load L_i on machine i has two components:
leaf nodes:
  Σ_{j ∈ J(i), j a leaf} t_j = Σ_{j ∈ J(i), j a leaf} x_ij ≤ Σ_{j ∈ J} x_ij ≤ L ≤ L*
  (Lemma 5; the LP machine-load constraint; L = optimal value of the LP; Lemma 2, since the LP is a relaxation)
parent:
  t_{parent(i)} ≤ L*    (Lemma 1)
Thus, the overall load L_i ≤ 2 L*.  ∎
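Given a fractional solution whose support graph is already a forest (Lemma 3), the rounding step is a simple tree traversal. A sketch with my own naming, assuming the input x has positive values only on forest edges:

from collections import defaultdict

def round_forest_solution(x, m, n):
    """Round an acyclic LP solution for generalized load balancing.
    x: dict (machine i, job j) -> fractional time, whose support forms a
    forest. Machines are 0..m-1, jobs 0..n-1. Returns assign[j] = machine
    receiving job j: leaf jobs go to their parent machine, non-leaf jobs
    to one of their child machines."""
    adj = defaultdict(list)                  # bipartite forest G(x)
    for (i, j), v in x.items():
        if v > 0:
            adj[('M', i)].append(('J', j))
            adj[('J', j)].append(('M', i))

    parent, seen = {}, set()
    for i in range(m):                       # root each tree at a machine node
        root = ('M', i)
        if root in seen:
            continue
        seen.add(root)
        stack = [root]
        while stack:
            u = stack.pop()
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    parent[v] = u
                    stack.append(v)

    children = defaultdict(list)
    for v, p in parent.items():
        children[p].append(v)

    assign = {}
    for j in range(n):
        job = ('J', j)
        if children[job]:                    # non-leaf job: any child machine
            assign[j] = children[job][0][1]
        else:                                # leaf job: its parent machine
            assign[j] = parent[job][1]
    return assign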

Generalized load balancing: flow formulation

Flow formulation of the LP.
  Σ_i x_ij = t_j     for all j ∈ J
  Σ_j x_ij ≤ L       for all i ∈ M
  x_ij ≥ 0           for all j ∈ J and i ∈ M_j
  x_ij = 0           for all j ∈ J and i ∉ M_j

Observation. Solutions to the feasible flow problem with value L are in 1-to-1 correspondence with LP solutions of value L.

Generalized load balancing: structure of solution

Lemma 3. Let (x, L) be a solution to the LP. Let G(x) be the graph with an edge from machine i to job j if x_ij > 0. We can find another solution (x′, L) such that G(x′) is acyclic.
Pf. Let C be a cycle in G(x). Augment flow along the cycle C (flow conservation is maintained). At least one edge of C is removed (and none are added). Repeat until G(x′) is acyclic.
[figure: augmenting flow along a cycle in G(x) zeroes out one of its edges, yielding G(x′)]

Conclusions

Running time. The bottleneck operation in our 2-approximation is solving one LP with m n + 1 variables.
Remark. Can solve the LP using flow techniques on a graph with m + n + 1 nodes: given L, find a feasible flow if it exists; binary search to find L*.

Extensions: unrelated parallel machines. [Lenstra-Shmoys-Tardos 1990]
Job j takes t_ij time if processed on machine i.
2-approximation algorithm via LP rounding.
If P ≠ NP, then no ρ-approximation exists for any ρ < 3/2.

  Jan Karel Lenstra, David B. Shmoys, and Éva Tardos. Approximation algorithms for scheduling unrelated parallel machines. Mathematical Programming 46 (1990).

11. APPROXIMATION ALGORITHMS: knapsack problem (SECTION 11.8)

Polynomial-time approximation scheme

PTAS. (1 + ε)-approximation algorithm for any constant ε > 0.
Load balancing. [Hochbaum-Shmoys 1987]
Euclidean TSP. [Arora, Mitchell 1996]
Consequence. A PTAS produces an arbitrarily high quality solution, but trades off accuracy for time.
This section. PTAS for the knapsack problem via rounding and scaling.

Knapsack problem

Knapsack problem.
Given n objects and a knapsack.
Item i has value v_i > 0 and weighs w_i > 0.  (we assume w_i ≤ W for each i)
Knapsack has weight limit W.
Goal: fill the knapsack so as to maximize total value.
Ex: { 3, 4 } has value 40.
[table: original instance with W = 11, listing each item's value and weight]

Knapsack is NP-complete

KNAPSACK. Given a set X, weights w_i ≥ 0, values v_i ≥ 0, a weight limit W, and a target value V, is there a subset S ⊆ X such that
  Σ_{i ∈ S} w_i ≤ W  and  Σ_{i ∈ S} v_i ≥ V ?

SUBSET-SUM. Given a set X, values u_i ≥ 0, and an integer U, is there a subset S ⊆ X whose elements sum to exactly U?

Theorem. SUBSET-SUM ≤_P KNAPSACK.
Pf. Given an instance (u_1, ..., u_n, U) of SUBSET-SUM, create the KNAPSACK instance
  v_i = w_i = u_i,  V = W = U,
so that the constraints become Σ_{i ∈ S} u_i ≤ U and Σ_{i ∈ S} u_i ≥ U.

Knapsack problem: dynamic programming I

Def. OPT(i, w) = max value subset of items 1, ..., i with weight limit w.

Case 1. OPT does not select item i. OPT selects the best of 1, ..., i - 1 using up to weight limit w.
Case 2. OPT selects item i. New weight limit = w - w_i. OPT selects the best of 1, ..., i - 1 using up to weight limit w - w_i.

  OPT(i, w) = 0                                                     if i = 0
            = OPT(i - 1, w)                                         if w_i > w
            = max { OPT(i - 1, w),  v_i + OPT(i - 1, w - w_i) }     otherwise

Theorem. Computes the optimal value in O(n W) time.
Not polynomial in input size.
Polynomial in input size if weights are small integers.
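A direct Python sketch of dynamic program I (my own function name; assumes integer weights so the table can be indexed by weight):

def knapsack_dp_weight(values, weights, W):
    """OPT(i, w) = max value achievable with items 1..i and weight limit w.
    Fills the table bottom-up in O(n W) time (pseudo-polynomial)."""
    n = len(values)
    OPT = [[0] * (W + 1) for _ in range(n + 1)]    # OPT[0][w] = 0
    for i in range(1, n + 1):
        vi, wi = values[i - 1], weights[i - 1]
        for w in range(W + 1):
            if wi > w:
                OPT[i][w] = OPT[i - 1][w]                 # item i does not fit
            else:
                OPT[i][w] = max(OPT[i - 1][w],            # skip item i
                                vi + OPT[i - 1][w - wi])  # take item i
    return OPT[n][W]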

Knapsack problem: dynamic programming II

Def. OPT(i, v) = min weight of a knapsack for which we can obtain a solution of value ≥ v using a subset of items 1, ..., i.
Note. The optimal value is the largest value v such that OPT(n, v) ≤ W.

Case 1. OPT does not select item i. OPT selects the best of 1, ..., i - 1 that achieves value ≥ v.
Case 2. OPT selects item i. This consumes weight w_i; we still need to achieve value ≥ v - v_i. OPT selects the best of 1, ..., i - 1 that achieves value ≥ v - v_i.

  OPT(i, v) = 0                                                       if v ≤ 0
            = ∞                                                       if i = 0 and v > 0
            = min { OPT(i - 1, v),  w_i + OPT(i - 1, v - v_i) }       otherwise

Theorem. Dynamic programming algorithm II computes the optimal value in O(n² v_max) time, where v_max is the maximum of any value.
Pf. The optimal value V* ≤ n v_max. There is one subproblem for each item and for each value v ≤ V*. It takes O(1) time per subproblem.
Remark 1. Not polynomial in input size!
Remark 2. Polynomial time if values are small integers.

Knapsack problem: polynomial-time approximation scheme

Intuition for the approximation algorithm.
Round all values up to lie in a smaller range.
Run dynamic programming algorithm II on the rounded/scaled instance.
Return the optimal items for the rounded instance.
[tables: original instance and rounded instance, both with W = 11, listing each item's value and weight]

Round up all values:
  0 < ε ≤ 1 = precision parameter,
  v_max = largest value in the original instance,
  θ = scaling factor = ε v_max / (2n),
  v̄_i = ⌈v_i / θ⌉ θ,   v̂_i = ⌈v_i / θ⌉.

Observation. Optimal solutions to the problem with values v̄ are equivalent to optimal solutions to the problem with values v̂.
Intuition. v̄ is close to v, so an optimal solution using v̄ is nearly optimal; v̂ is small and integral, so dynamic programming algorithm II is fast.
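Dynamic program II, indexed by value instead of weight, can be sketched the same way (my own function name; assumes integer values). It also recovers one optimal item set, which is what the PTAS needs:

def knapsack_dp_value(values, weights, W):
    """OPT(i, v) = min weight needed to reach value >= v with items 1..i.
    The optimum is the largest v with OPT(n, v) <= W; runs in O(n^2 v_max).
    Returns (optimal value, list of chosen item indices)."""
    n, V = len(values), sum(values)              # V <= n * v_max
    INF = float('inf')
    OPT = [[0] + [INF] * V] + [[0] * (V + 1) for _ in range(n)]
    for i in range(1, n + 1):
        vi, wi = values[i - 1], weights[i - 1]
        for v in range(1, V + 1):
            OPT[i][v] = min(OPT[i - 1][v],                        # skip item i
                            wi + OPT[i - 1][max(v - vi, 0)])      # take item i
    best = max(v for v in range(V + 1) if OPT[n][v] <= W)
    chosen, v = [], best                         # trace back one optimal subset
    for i in range(n, 0, -1):
        if OPT[i][v] != OPT[i - 1][v]:           # reaching value v required item i
            chosen.append(i - 1)
            v = max(v - values[i - 1], 0)
    return best, chosen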

Knapsack problem: polynomial-time approximation scheme

Theorem. If S is the solution found by the rounding algorithm and S* is any other feasible solution, then
  Σ_{i ∈ S*} v_i ≤ (1 + ε) Σ_{i ∈ S} v_i.
Pf. Let S* be any feasible solution satisfying the weight constraint.
  Σ_{i ∈ S*} v_i ≤ Σ_{i ∈ S*} v̄_i              (always round up)
               ≤ Σ_{i ∈ S} v̄_i                 (S solves the rounded instance optimally)
               ≤ Σ_{i ∈ S} (v_i + θ)            (never round up by more than θ)
               ≤ Σ_{i ∈ S} v_i + n θ            (|S| ≤ n)
               =  Σ_{i ∈ S} v_i + ½ ε v_max     (θ = ε v_max / (2n))
               ≤ (1 + ε) Σ_{i ∈ S} v_i.         (v_max ≤ 2 Σ_{i ∈ S} v_i, obtained by applying the chain above with S* = the subset containing only the item of largest value)  ∎

Theorem. For any ε > 0, the rounding algorithm computes a feasible solution whose value is within a (1 + ε) factor of the optimum in O(n³ / ε) time.
Pf. We have already proved the accuracy bound.
Dynamic program II's running time is O(n² v̂_max), where
  v̂_max = ⌈v_max / θ⌉ = ⌈2n / ε⌉,
so the total running time is O(n³ / ε).  ∎
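Putting the pieces together, here is a sketch of the full rounding-and-scaling PTAS (my own names; it reuses knapsack_dp_value from above and assumes v_max > 0):

import math

def knapsack_ptas(values, weights, W, eps):
    """Knapsack PTAS: scale by theta = eps * v_max / (2n), round values up to
    integers v_hat_i = ceil(v_i / theta), solve the rounded instance exactly
    with dynamic program II, and return those items. The selected set is
    feasible, its true value is within a (1 + eps) factor of optimal, and the
    total time is O(n^3 / eps)."""
    n = len(values)
    v_max = max(values)
    theta = eps * v_max / (2 * n)                    # scaling factor
    v_hat = [math.ceil(v / theta) for v in values]   # small integral values
    _, chosen = knapsack_dp_value(v_hat, weights, W) # optimal on rounded values
    return chosen, sum(values[i] for i in chosen)

For instance, with ε = 1/2 every rounded value v̂_i is at most 4n, so the dynamic programming table stays small even when the original values are huge.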
