
Greedy Technique

The greedy technique (also known as the greedy strategy) is applicable to solving optimization problems; an optimization problem calls for finding a solution that achieves the optimal (minimum or maximum) value of some objective function while satisfying some constraints. The greedy technique constructs a solution to an optimization problem through a sequence of steps, each expanding the partially constructed solution obtained thus far, until a complete solution is found. At each step, a greedy algorithm examines all possible options at that point and decides on a locally optimal option; once a decision is made, it is never retracted later. This technique does produce a globally optimal solution for certain problems, and the end result is an efficient algorithm. However, there are countless other problems where the greedy technique fails to produce an optimal solution, because the combination of locally optimal decisions may not converge to a global optimum. In this chapter, we consider some of the interesting problems for which the greedy technique works, but we start with a few simple examples.

Coin Change

As a first example of using the greedy technique, we consider applying it to the coin-change problem: give change for a specific amount n using the least number of coins of some given denominations d1 > d2 > ... > dm. A good strategy for reducing the number of coins is to always try to use the largest denomination first. For example, to dispense an amount n = 67 cents using quarters (d1 = 25), dimes (d2 = 10), and cents (d3 = 1), we give 2 quarters, 1 dime, and 7 cents. This solution uses a total of 10 coins. Is this solution optimal? It is, if there is no other solution that uses fewer than 10 coins. It can be proven that the solution given by this strategy, for any amount n and the given denominations, is always optimal. At the same time, there are examples of unusual coin denominations where the above strategy fails to produce the optimal solution. For example, for d1 = 17, d2 = 15, d3 = 3 and n = 63, the greedy strategy will use three 17-cent coins and four 3-cent coins (seven coins), but the optimal solution is to use four 15-cent coins and one 3-cent coin (five coins). This is the reason why solving the coin-change problem using dynamic programming, which guarantees finding the optimal solution, is considered a worthy exercise.

A Greedy Algorithm for the 0/1-Knapsack Problem

As another example of using a greedy strategy, we solve the 0/1-knapsack problem. Given n items with values v1, v2, ..., vn and weights w1, w2, ..., wn, the goal is to pack a sack of capacity C so that the sum of the weights of the items put in the sack is at most C and the sum of their values is maximized. Our greedy algorithm is the following. First, sort the items in decreasing order of the ratios ri = vi/wi (i.e., ri is the profit per unit weight for the i-th item). Then consider adding items to the sack in that order, adding an item whenever the remaining sack capacity permits. Essentially, the greedy strategy favors items that have large values and small weights at the same time.
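A minimal sketch of this ratio-based greedy rule is shown below (the method name, the item representation, and the sample data are our own illustrative choices, not taken from the text); it returns the indices of the items that are packed.

using System;
using System.Collections.Generic;
using System.Linq;

class GreedyKnapsack
{
    // Greedy 0/1 knapsack: consider items in decreasing order of value/weight ratio
    // and add each item that still fits. This is a heuristic; it is not always optimal.
    static List<int> Pack(double capacity, double[] values, double[] weights)
    {
        var order = Enumerable.Range(0, values.Length)
                              .OrderByDescending(i => values[i] / weights[i]);
        var chosen = new List<int>();
        foreach (int i in order)
            if (weights[i] <= capacity)
            {
                chosen.Add(i);            // item i fits: put it in the sack
                capacity -= weights[i];
            }
        return chosen;                    // indices of the packed items
    }

    static void Main()
    {
        // Illustrative data (not the instance from the table in the text).
        double[] values  = { 60, 100, 120 };
        double[] weights = { 10, 20, 30 };
        Console.WriteLine(string.Join(",", Pack(50, values, weights)));   // prints 0,1
    }
}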

Table: Item data (item values, item weights, and value-to-weight ratios) for an instance of the 0/1-knapsack problem.

Example. Consider using the preceding algorithm to solve the instance whose item data is given in the table above. We first list the items in decreasing order of their ratios ri. Then, for the given sack capacity C, we scan this list from the beginning: each item whose weight does not exceed the remaining capacity is added to the sack, and each item that would exceed the remaining capacity is skipped.

It can be shown that the preceding greedy strategy does not always yield the optimal solution for the 0/1-knapsack problem. However, the strategy does yield the optimal solution for the fractional knapsack problem. In the fractional knapsack problem, we are allowed to take a fractional part of an item (which contributes the corresponding fraction of its value). The idea is that if the next item considered does not fit into the sack, we use as big a fraction of it as possible. For the preceding example, after the last whole item is added, the unused capacity is filled with the largest fraction of the next item on the list that fits, and the corresponding fraction of that item's value is added to the total.

Exercise. Show by example that the preceding greedy strategy for the 0/1-knapsack problem does not always produce the optimal solution. Hint: consider a situation where the greedy solution leaves part of the sack capacity unused.

The greedy strategy is often described as nearest-neighbor search, because when we apply it to an optimization problem, we always move in the direction that appears nearest to the target. It is also known as hill climbing, because it resembles the way a climber heads for the top of a hill. In particular, to find the maximum value attained by a function f(x), we do the following:

Choose x0 at random from the domain of f;
i = 0;
while (true) {
    let x be a point in the vicinity of xi such that f(x) > f(xi);
    if such an x is found then { i = i+1; xi = x; }
    else break;
}
return f(xi);

The figure below shows how to apply this strategy. Let d be a small step. At xi, we must decide whether to move in the direction of xi + d or xi - d. We simply compute f(xi + d) and f(xi - d), compare each to f(xi), and move in the direction that gives the higher value for the function (since we are maximizing f). We can clearly see the problem with this strategy: because it is short-sighted, we get trapped at a local maximum x_max, whereas the global maximum z_max lies elsewhere.
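A minimal runnable sketch of this one-dimensional hill-climbing idea follows (the function f, the step size, and the starting points are illustrative assumptions of ours, not from the text).

using System;

class HillClimbing
{
    // One-dimensional hill climbing: from x, move by +delta or -delta
    // as long as the move increases f; stop at a (possibly local) maximum.
    static double Climb(Func<double, double> f, double x0, double delta)
    {
        double x = x0;
        while (true)
        {
            if (f(x + delta) > f(x)) x += delta;
            else if (f(x - delta) > f(x)) x -= delta;
            else break;                    // neither neighbor improves f: a local maximum
        }
        return x;
    }

    static void Main()
    {
        // An illustrative function with several local maxima.
        Func<double, double> f = x => Math.Sin(x) + 0.5 * Math.Sin(3 * x);
        // Different starting points may end up at different (local) maxima.
        Console.WriteLine(Climb(f, 0.0, 0.01));
        Console.WriteLine(Climb(f, 3.0, 0.01));
    }
}

Restarting the climb from several random starting points is the usual practical remedy for the local-maximum trap described above.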

Figure: A function f(x) with a local maximum x_max, reached by hill climbing from the starting point x0, and a global maximum z_max lying elsewhere. The greedy strategy is known by other names: hill climbing, nearest-neighbor search.

The greedy technique is considered a general design technique despite the fact that it is applicable to optimization problems only. Using the greedy technique, the solution is constructed incrementally via a series of decisions until a complete solution is found. At any particular state (i.e., a partial solution), a greedy algorithm examines the available choices and decides on a particular choice according to the following rules:
- Feasible: the choice made must satisfy the problem's constraints.
- Locally optimal: the choice made must be the best among all feasible choices at that state.
- Irrevocable: once made, the choice cannot be changed later.
- Quick: we should not spend too much time deciding.

Greedy algorithms are by their nature fast, but the important question is whether a greedy strategy works. For most practical optimization problems, a greedy strategy will not work; however, there are a few problems for which a greedy strategy works and always produces an optimal solution. Even in situations where it fails to produce an optimal solution, a greedy algorithm can still be useful if we are satisfied with an approximate solution (because other algorithms that guarantee the optimal solution are slow), especially if we can argue that the greedy solution cannot be too far from an optimal solution; this is discussed further in the later section on Approximation Algorithms and Bounds. In the rest of this chapter, we examine several classic greedy algorithms.

Note: For a proper understanding of the material presented in the next two sections, the reader must be familiar with graphs and their data structures (see the earlier section on graph representations).

Dijkstra's Single-Source Shortest-Paths Algorithm

The shortest-paths problem (and its variations) is a good example of a practical problem for which the greedy technique works well.

The Single-Source Shortest-Paths (SSSP) Problem. Given a directed weighted graph, find the shortest paths from a given start vertex (the source) to every other vertex.

A famous greedy algorithm for this problem was conceived by Dijkstra in 1959 [Dij59]. Dijkstra's algorithm finds the shortest paths in the order of their lengths. The shortest of the shortest paths that start from a given source vertex sv is the path from sv to sv having no edges, and thus it has length zero. The second shortest path must use exactly one edge outgoing from sv, because if the path used more than one edge then, assuming that edges have positive weights, we could remove all edges except the first edge and obtain a shorter path. The third shortest path from sv may consist of a single edge outgoing from sv, or it may use two edges, the first of which must be the edge of the second shortest path. This observation generalizes to the following. If the k-th shortest path (from sv) is to vertex v and uses the edge (u,v), then we have: (a) the part of this path from sv to u must be the shortest path to u (by the principle of optimality), and (b) the part of this path from sv to u must be shorter than the k-th shortest path (why?). Conditions (a) and (b) imply that the k-th shortest path is some i-th shortest path (for i < k) plus an edge to vertex v, chosen so that the path length is minimum over all such (i,v) possibilities. This last conclusion is the essence of Dijkstra's algorithm.

We now consider the implementation details. For a given graph G=(V,E) where |V| = n, we maintain the vertices of the graph as two disjoint sets S and V-S, where the vertices in S are those for which the shortest paths have already been determined. Also, we define a vector Dist[1..n], where Dist[u] is the length of the shortest path from sv to u whose intermediate vertices belong to S. When the algorithm is run to completion, Dist[w] gives the length of the shortest path from sv to w for every vertex w. The vertices of the graph are added to S one vertex at a time in accordance with the following pseudocode.

// Initialization; add sv to S since
// the shortest of the shortest paths (from sv) is the path to sv itself
S = {sv};
for i = 1 to n
    Dist[i] = cost of edge (sv, i);    // taken as infinity if the edge does not exist

// Main loop
while |S| < n {
    u = the vertex x not in S for which Dist[x] is minimum;
    S = S ∪ {u};                       // add vertex u to S
    // Update Dist[w] for the vertices w in V-S
    for each edge (u,w) where w is not in S {
        t = Dist[u] + cost of edge (u,w);
        if t < Dist[w] then Dist[w] = t;
    }
}

At any time during Dijkstra's algorithm, the vertices of the graph are split into the two disjoint sets S and V-S. This information can be stored as a Boolean (or 0/1) array S[1..n], where vertex i belongs to S iff S[i] = 1. The algorithm's main loop does n-1 iterations; in each iteration we select (and remove) a vertex from V-S and add it to S. The listing below gives program code for Dijkstra's algorithm, assuming the input is given by the adjacency cost-matrix.

Recovering the Shortest Paths

In order to recover the shortest paths, we define the vector Pred[1..n] (Pred is short for predecessor), where Pred[v] is the vertex immediately preceding vertex v on the shortest path from the source vertex sv to v. Dijkstra's algorithm is easily modified to compute this vector. In the initialization phase, because initially the shortest paths are simply the edges (sv,i), we set Pred[i] = sv for i = 1, 2, ..., n. In the algorithm's main loop, we update Pred[w] whenever Dist[w] is updated.

Example. The figure and table that follow illustrate the workings of Dijkstra's algorithm. For this example, we have assumed vertex 1 to be the source vertex. The table shows the Dist and Pred arrays as the algorithm progresses through the main loop. The figure gives more or less the same information, but the values of Dist and Pred are given as vertex labels. In both the figure and the table, the underlined values represent final values.

Input: Source vertex sv; adjacency cost-matrix C[1..n, 1..n]
Output: The array Dist[1..n] stores the lengths of the shortest paths; the array Pred[1..n] stores the predecessors on those paths

void Dijkstra_shortestpaths(int sv, int[,] C, ref int[] Dist, ref int[] Pred)
{
    // n is the number of vertices of the graph (available to this routine)
    Dist = new int[n+1];                  // Allocate output Dist array
    Pred = new int[n+1];                  // Allocate output Pred array
    int[] S = new int[n+1];               // Allocate a working array S

    // Initialization
    for (int i = 1; i <= n; i++)
    {
        S[i] = 0;                         // S[i]=1 iff vertex i belongs to S
        Dist[i] = C[sv,i];
        Pred[i] = sv;
    }
    S[sv] = 1;                            // The shortest of the shortest paths (from sv) is to sv

    // Main loop
    for (int i = 1; i <= n-1; i++)        // Find n-1 shortest paths
    {
        int u = 0;                        // u = the x with minimum Dist[x] among vertices not in S
        for (int x = 1; x <= n; x++)
            if (S[x] == 0 && (u == 0 || Dist[x] < Dist[u])) u = x;
        S[u] = 1;                         // Add vertex u to S

        // Update Dist[w] for w in V-S
        for (int w = 1; w <= n; w++)      // for each edge (u,w) where S[w] = 0
            if (S[w] == 0 && Dist[u] + C[u,w] < Dist[w])
            {
                Dist[w] = Dist[u] + C[u,w];
                Pred[w] = u;
            }
    }
}

Listing: An O(n²) implementation of Dijkstra's SSSP algorithm using the adjacency cost-matrix of the input graph.
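As a small usage sketch (not from the text), the following shows how the Pred array produced by the listing can be turned into an explicit path; the sample Pred values in Main are hypothetical.

using System;
using System.Collections.Generic;

class PathRecovery
{
    // Rebuild the shortest path from the source sv to vertex v by following Pred
    // backwards; Pred is 1-indexed, as in the listing above.
    static List<int> PathTo(int sv, int v, int[] Pred)
    {
        var path = new List<int> { v };
        while (v != sv)
        {
            v = Pred[v];
            path.Add(v);
        }
        path.Reverse();                    // now runs from sv to v
        return path;
    }

    static void Main()
    {
        // Hypothetical Pred array for a 5-vertex graph with source vertex 1:
        // Pred[v] is the vertex immediately preceding v on the shortest path from 1.
        int[] Pred = { 0, 1, 1, 2, 3, 3 };
        Console.WriteLine(string.Join(" -> ", PathTo(1, 5, Pred)));   // prints 1 -> 2 -> 3 -> 5
    }
}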

Figure: Illustration of Dijkstra's single-source shortest-paths algorithm through the labeling of vertices; the label of vertex v is Dist[v]/Pred[v]. The (tree of) shortest paths corresponds to the solid edges.

Table: A trace of Dijkstra's single-source shortest-paths algorithm showing the key program variables: the Dist[v]/Pred[v] values after initialization and after each iteration of the main loop, where the vertex u added to S in each iteration is the vertex in V-S with the smallest Dist (in reference to the previous row).

Running-Time Analysis for Dijkstra's Algorithm

For a graph with n vertices, Dijkstra's algorithm (as given above) has O(n²) running time. This can be easily seen by examining the algorithm's code given in the listing. The processing is dominated by the main loop, which does n-1 iterations, where each iteration consists of two O(n) loop blocks: one loop to find the vertex u (this loop scans the array Dist[1..n]) and another loop to update Dist (this loop scans one row of the adjacency matrix).

An O(|E| log |V|) Implementation of Dijkstra's Algorithm

Dijkstra's algorithm can be implemented to run in O(|E| log |V|) time, provided that the graph is input using adjacency lists and the set V-S is maintained as a heap (see below). For dense graphs (a graph for which |E| is close to |V|² is a dense graph, whereas a graph for which |E| is much smaller than |V|² is a sparse graph, because most edges are missing), |E| log |V| is larger than |V|²; therefore, the proposed modification might not be worthwhile for dense graphs. The key to the modified implementation is how we maintain the set V-S and its Dist array entries. If the set V-S is maintained as a min-heap, where for any w in V-S the priority of w is Dist[w], then the algorithm can be modified to use the heap DeleteMin and ChangeKey operations as shown below.

// Main loop
while |S| < n {
    u = DeleteMin();
    S = S ∪ {u};                       // add vertex u to S
    // Update Dist[w] for the vertices w in V-S
    for each edge (u,w) where w is not in S {
        t = Dist[u] + cost of edge (u,w);
        if t < Dist[w] then ChangeKey(w, t);   // set the priority of key w to t
    }
}

First, we note that a DeleteMin (and likewise a ChangeKey) operation is O(log n) for a heap of size n. Here a DeleteMin operation is executed a total of |V| times; each time it takes O(log |V|), because the heap has at most |V| elements. This gives O(|V| log |V|) running time. On the other hand, for a given iteration of the main loop, a ChangeKey operation is executed at most as many times as the length of the adjacency list of vertex u. Thus, in total over all iterations, this operation is executed at most |E| times, for a total running time of O(|E| log |V|). Therefore, the order of the running time is O(|V| log |V|) + O(|E| log |V|) = O(|E| log |V|), assuming |E| ≥ |V|.

Finally, we should note that the ChangeKey(k, val) operation requires maintaining an index that tells the location in the heap where a key k is stored. The implementation of the various priority-queue operations has to be extended to include the necessary updates to this index. All of this introduces extra overhead that makes this implementation worthwhile only if, besides the graph being sparse, |V| is very large.

Question: In the preceding algorithm, can we do away with the call to ChangeKey?

Note: For the preceding algorithm, the keys are simply vertex IDs, which are integers in [1, n]. Thus, the index can be maintained as an array A[1..n], where A[i] gives the location in the heap (assuming an array-based implementation) of key i.

Exercise. In Dijkstra's algorithm, as an optimization measure, can the test for S[w] = 0 be omitted in the statement "for each edge (u,w) where S[w] = 0"? Justify your answer.
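The following is a sketch of this heap-based idea (assuming adjacency lists and the PriorityQueue type available in recent .NET versions; the graph representation and the sample data are our own). It also suggests one possible answer to the ChangeKey question above: instead of changing a key, a vertex can simply be re-inserted with its improved distance, and stale heap entries are skipped when they are dequeued.

using System;
using System.Collections.Generic;

class HeapDijkstra
{
    // adj[u] holds (w, cost) pairs for the edges (u,w); vertices are numbered 1..n.
    static int[] ShortestPaths(int sv, List<(int to, int cost)>[] adj, int n)
    {
        const int INF = int.MaxValue;
        var dist = new int[n + 1];
        var done = new bool[n + 1];                  // done[u] = true once u is in S
        for (int i = 1; i <= n; i++) dist[i] = INF;
        dist[sv] = 0;

        // Min-heap keyed on the tentative distance; duplicates take the place of ChangeKey.
        var pq = new PriorityQueue<int, int>();
        pq.Enqueue(sv, 0);
        while (pq.Count > 0)
        {
            int u = pq.Dequeue();
            if (done[u]) continue;                   // stale entry: u was settled earlier
            done[u] = true;
            foreach (var (w, cost) in adj[u])
                if (!done[w] && dist[u] + cost < dist[w])
                {
                    dist[w] = dist[u] + cost;
                    pq.Enqueue(w, dist[w]);          // re-insert with the improved priority
                }
        }
        return dist;
    }

    static void Main()
    {
        int n = 4;
        var adj = new List<(int, int)>[n + 1];
        for (int i = 1; i <= n; i++) adj[i] = new List<(int, int)>();
        adj[1].Add((2, 5)); adj[1].Add((3, 9)); adj[2].Add((3, 2)); adj[3].Add((4, 1));
        Console.WriteLine(string.Join(" ", ShortestPaths(1, adj, n)[1..]));   // 0 5 7 8
    }
}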

Minimum Spanning Trees

In the context of undirected graphs, a tree is any graph that is connected (i.e., between any two vertices there is a path) and has no cycles. For a given connected undirected graph G, any subgraph of G that is a tree and includes all the vertices of G is by definition a spanning tree. If a graph G has n vertices, then a spanning tree of G has n vertices and n-1 edges. Also, adding any edge to a spanning tree creates a cycle. A connected undirected graph can have many spanning trees. For example, the cycle graph on n vertices (for n ≥ 3) has n spanning trees, because we can remove exactly one of its edges and get a spanning tree. Normally, if the input graph is weighted (i.e., each edge is assigned some numeric value), then out of all the possible spanning trees we may be interested in the spanning tree having minimum cost, where the cost of a tree is simply the sum of the weights of its edges. Such a tree is known as a minimum spanning tree (MST). As shown in the figure below, the minimum spanning tree is not necessarily unique; however, all minimum spanning trees of a given graph have the same cost.

Figure: A graph G and two minimum spanning trees of G, T1 and T2; both trees have the same cost.

The minimum spanning tree gives the cheapest way to link all the vertices of a graph by edges. In a practical application of this concept, the minimum spanning tree represents the cheapest way to connect power distribution centers by power lines so that they form a single connected network. However, in a minimum spanning tree, the removal of any one edge severs the tree into two or more parts. Therefore, if, in addition to minimizing cost, reliability is sought, the MST can be augmented with additional edges.

There are two classical algorithms for the minimum spanning tree problem: Prim's algorithm [Pri57] and Kruskal's algorithm [Kru56]. Both of these algorithms are greedy and always produce the optimal solution. Both algorithms rely on the so-called MST property for their proof of correctness.

The MST Property

If the vertices of a graph G=(V,E) are split arbitrarily into two disjoint subsets W and V-W, then the minimum-cost edge (u,v) that joins a vertex u in W to a vertex v in V-W can belong to a minimum spanning tree.

Proof:

Figure: An MST for G that uses, between W and V-W, an edge (u',v') other than the minimum-cost edge (u,v).

The proof is made clear by the figure. Suppose that the MST (the solid edges in the figure) does not use the edge (u,v). Add (u,v) to the MST; this creates a cycle. Tracing around this cycle, it must cross between W and V-W at least once more, along some edge (u',v') with u' in W and v' in V-W; otherwise the cycle could not return to u. Since (u,v) is the minimum-cost edge that goes between W and V-W, cost(u,v) ≤ cost(u',v'). Therefore, if we delete the edge (u',v') and keep the edge (u,v), the result is still a spanning tree whose cost cannot exceed the cost of the MST, so (u,v) belongs to some MST.

We consider Prim's algorithm first, and then consider Kruskal's algorithm in the section after it.

Prim's MST Algorithm

Prim's algorithm exhibits a structure that is very similar to that of Dijkstra's algorithm presented in the previous section. At any time during Prim's algorithm, the vertices of the input graph G=(V,E) are split into two disjoint sets, S and V-S, where the vertices in S are those that have been included in the MST. Similar to Dijkstra's algorithm, which grows the set of shortest paths one vertex at a time, Prim's algorithm grows the MST one vertex at a time. The start vertex for the MST can be chosen arbitrarily but, for simplicity, we choose it to be vertex 1. For the implementation of Prim's algorithm, we define a vector Cost[1..n], where Cost[u] is the cost of adding vertex u to the MST via an edge (u,v) where v is a vertex in S (i.e., v is already included in the MST). The vertices of the graph are added to the MST one vertex at a time in accordance with the following pseudocode.

// Initialization
S = {1};                               // initially, only vertex 1 is in the MST
for i = 2 to n
    Cost[i] = cost of edge (1, i);     // taken as infinity if the edge does not exist

// Main loop
while |S| < n {
    u = the vertex x not in S for which Cost[x] is minimum;
    S = S ∪ {u};                       // add vertex u to S
    // Update Cost[w] for the vertices w in V-S
    for each edge (u,w) where w is not in S {
        t = cost of edge (u,w);
        if t < Cost[w] then Cost[w] = t;
    }
}

The main loop does n-1 iterations; in each iteration we select (and remove) a vertex u from V-S (u is the vertex with minimum Cost) and add it to S. Once vertex u is added to S, we check whether Cost[w] for any of the vertices w in V-S needs to be updated. The listing below gives program code for Prim's algorithm assuming the input is given by the adjacency cost-matrix.

In order to recover the edges of the MST, we define the vector Closest[1..n], where Closest[v] is the MST vertex that is closest to v. Prim's algorithm is easily modified to compute this vector. In the initialization part, we set Closest[i] = 1 for i = 2, ..., n, since initially the MST consists of vertex 1 only. In the algorithm's main loop, we update Closest[w] whenever Cost[w] is updated. When the algorithm is run to completion, the edges of the MST can be recovered from the Closest array (i.e., the edges are (i, Closest[i]) for i = 2 to n).

Example. The figure and table that follow illustrate the workings of Prim's algorithm. The table shows the Cost and Closest arrays as the algorithm progresses through the main loop. The figure gives more or less the same information, but the values of Cost/Closest are given as vertex labels. In both, the underlined values represent final values.

Proof of Correctness of Prim's Algorithm

Proof by contradiction. Suppose the algorithm fails. Consider the first edge e chosen by the algorithm that is not consistent with any MST. Let P be the partial tree built just before adding e, and let M be an MST consistent with P (i.e., P is part of M). Now suppose we add e to M. This creates a cycle. If we trace around this cycle, we must eventually reach an edge e' that goes back into P, because e has one endpoint in P and one outside of P. Also, we have length(e') ≥ length(e), by the definition of the algorithm (the algorithm picked e as a cheapest edge leaving P). So if we add e to M and remove e', we get a spanning tree that is no larger (cost-wise) than M, contradicting our definition of e.

Input: The adjacency cost-matrix C[1..n, 1..n]
Output: The array Closest[1..n] stores the MST; (i, Closest[i]) is an MST edge, for i = 2 to n

void Prim_MST(int[,] C, ref int[] Closest)
{
    // n is the number of vertices of the graph (available to this routine)
    Closest = new int[n+1];               // Allocate output array
    int[] Cost = new int[n+1];            // Allocate a working array
    int[] S = new int[n+1];               // Allocate a working array

    // Initialization
    S[1] = 1;                             // Initially S contains vertex 1
    for (int i = 2; i <= n; i++)
    {
        S[i] = 0;                         // S[i]=1 iff vertex i belongs to S
        Cost[i] = C[1,i];
        Closest[i] = 1;
    }

    // Main loop
    for (int i = 1; i <= n-1; i++)        // Find n-1 edges for the MST
    {
        int u = 0;                        // u = the x with minimum Cost[x] among vertices not in S
        for (int x = 2; x <= n; x++)
            if (S[x] == 0 && (u == 0 || Cost[x] < Cost[u])) u = x;
        S[u] = 1;                         // Add vertex u to S

        // Update Cost[w] for w in V-S
        for (int w = 2; w <= n; w++)      // for each edge (u,w) where S[w] = 0
            if (S[w] == 0 && C[u,w] < Cost[w])
            {
                Cost[w] = C[u,w];
                Closest[w] = u;
            }
    }
}

Listing: An O(n²) implementation of Prim's MST algorithm using the adjacency cost-matrix of the input graph.

Figure: Illustration of Prim's MST algorithm through the labeling of vertices; the label of vertex v is Cost[v]/Closest[v]. The MST is given by the solid edges.

Table: A trace of Prim's MST algorithm showing the key program variables: the Cost[v]/Closest[v] values after initialization (S = {1}) and after each iteration of the main loop. In each iteration, the vertex u added to S is the vertex in V-S with the smallest Cost (in reference to the previous row), and (u, Closest[u]) is the corresponding MST edge; the cost of the MST is the sum of the costs of these edges.

Comparing Prim's Algorithm to Dijkstra's Algorithm

The close similarity of these algorithms is quite apparent from the preceding descriptions. The Cost and Closest vectors in Prim's algorithm play roles analogous to those of the Dist and Pred vectors, respectively, in Dijkstra's algorithm. The key difference between the algorithms is the way the Cost and Dist vectors are updated each time a vertex u is added to S. In Prim's algorithm, this update is based on the cost of the edge (u,w), whereas in Dijkstra's algorithm it is based on Dist[u] plus the cost of the directed edge (u,w). There are a few other differences between the two algorithms. The input to the MST problem is an undirected weighted graph, while the input to the single-source shortest-paths problem is a directed weighted graph. The output for the first problem is a non-rooted undirected tree; the output for the second problem is a directed tree rooted at the source vertex.

It is an interesting exercise to come up with a 4-edge graph for which the minimum spanning tree is different from the tree (viewed as undirected) of shortest paths. The figure below shows such a graph.

Figure: A graph G, a minimum spanning tree for G, and the tree of shortest paths (from the source vertex) for G, showing that the minimum spanning tree and the tree of shortest paths are different trees.

Running-Time Analysis for Prim's MST Algorithm

For a graph with n vertices, Prim's algorithm (as given above) has O(n²) running time. This can be easily seen by following an analysis similar to the one we used for Dijkstra's algorithm. As was done for Dijkstra's algorithm, Prim's algorithm can be implemented to run in O(|E| log |V|) provided that the graph is input using adjacency lists and the set V-S is maintained as a heap; however, such an implementation is only useful for sparse graphs, where |E| is much smaller than |V|².
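A sketch of such a heap-based version of Prim's algorithm is shown below, mirroring the heap-based Dijkstra sketch given earlier (adjacency lists, the .NET PriorityQueue type, and the sample graph are our own assumptions); again, stale duplicate heap entries are skipped instead of using an explicit ChangeKey.

using System;
using System.Collections.Generic;

class HeapPrim
{
    // adj[u] holds (w, cost) pairs for the undirected edges {u,w}; vertices are 1..n.
    // Returns the total cost of the MST, assuming the graph is connected.
    static long MstCost(List<(int to, int cost)>[] adj, int n)
    {
        var inTree = new bool[n + 1];
        var pq = new PriorityQueue<(int vertex, int cost), int>();
        pq.Enqueue((1, 0), 0);                       // grow the tree starting from vertex 1
        long total = 0;
        int added = 0;
        while (added < n && pq.Count > 0)
        {
            var (u, cost) = pq.Dequeue();
            if (inTree[u]) continue;                 // stale entry: u is already in the tree
            inTree[u] = true;
            total += cost;
            added++;
            foreach (var (w, c) in adj[u])
                if (!inTree[w]) pq.Enqueue((w, c), c);   // candidate edge into the tree
        }
        return total;
    }

    static void Main()
    {
        int n = 4;
        var adj = new List<(int, int)>[n + 1];
        for (int i = 1; i <= n; i++) adj[i] = new List<(int, int)>();
        void Edge(int a, int b, int c) { adj[a].Add((b, c)); adj[b].Add((a, c)); }
        Edge(1, 2, 4); Edge(1, 3, 1); Edge(2, 3, 2); Edge(3, 4, 5);
        Console.WriteLine(MstCost(adj, n));          // MST edges {1,3},{2,3},{3,4}: cost 8
    }
}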

Kruskal's MST Algorithm

The other classical algorithm for finding a minimum spanning tree is due to Kruskal [Kru56]. In contrast to Prim's algorithm, which grows the MST one vertex at a time, Kruskal's algorithm grows the MST one edge at a time. For an input graph G=(V,E), the algorithm consists of a preprocessing step that builds a list of the edges of G sorted by edge weight. Then, starting with an empty graph T, it scans this sorted list, adding the next edge on the list to T provided that the edge does not create a cycle; otherwise, the edge is skipped. The process terminates after |V|-1 edges have been added. The algorithm's pseudocode is given by the following.

L = a list of the edges sorted in nondecreasing order by edge weight;
T = {};                                // the set of MST edges
// Main loop
while |T| < |V|-1 {
    (u,v) = next edge from L;
    if adding edge (u,v) does not create a cycle in T then
        T = T ∪ {(u,v)};
}

Example. The figure that follows, together with its commentary, shows how Kruskal's algorithm works. First, we build a list of the edges sorted in nondecreasing order by cost. The initial graph for the MST can be viewed as a forest (a collection of trees) having |V| trees, where each tree Tv is simply vertex v by itself. The edges are then considered in order of cost: each edge that does not cause a cycle is added to the MST, and each edge that would cause a cycle is skipped. We finish when the number of edges in our MST reaches |V|-1. Note that every time we add an edge (u,v), where u belongs to tree Tu and v belongs to tree Tv, the two trees Tu and Tv are merged into one tree.

How can we tell whether adding an edge (u,v) will create a cycle? Simple: a cycle will not be created if u and v happen to be in different trees, and a cycle will be created only if u and v are already in the same tree. If we think of each tree as a set of vertices, then what we need is a fast algorithm for set membership. Furthermore, the sets we are dealing with are disjoint, and the merging of two trees should be mapped to a union operation on sets. This suggests that we use union-find data structures to implement Kruskal's algorithm. This implementation is given in the listing below. The output MST (as a set of n-1 edges) is stored in the array MST_Edges[0..n-2, 0..1].

Question: Why can we not use a single-dimensional array as we did in Prim's algorithm?

Figure: Illustration of Kruskal's MST algorithm, showing the input graph, the initial set of trees (a forest of isolated vertices), and the final MST, together with the list of edges in nondecreasing order of cost. The edges are added in order of cost; every time an edge is added, two trees are merged into one tree. The dashed edges are the ones that would cause cycles and are therefore skipped.

Input: A weighted connected undirected graph G
Output: The array MST_Edges[0..n-2, 0..1] stores the MST edges

void Kruskal_MST(Graph G, ref int[,] MST_Edges)
{
    int n = number of vertices of G;
    MST_Edges = new int[n-1, 2];          // output array: one row per MST edge
    EdgeList L = list of the edges of G sorted in nondecreasing order by edge weight;
    UnionFind uf = new UnionFind(n);      // Use an instance of a Union-Find ADT
    int EdgeCount = 0;
    while (EdgeCount < n-1)
    {
        (u,v) = next edge from L;
        if (uf.Find(u) != uf.Find(v))
        {
            uf.Union(u, v);
            MST_Edges[EdgeCount, 0] = u;  // Add edge (u,v) to the MST
            MST_Edges[EdgeCount, 1] = v;
            EdgeCount++;
        }
    }
}

Listing: Implementation of Kruskal's MST algorithm using union-find data structures.

Running-Time Analysis for Kruskal's Algorithm

We present an argument that Kruskal's algorithm is O(|E| log |E|). The initialization step of sorting the edges by their weights is O(|E| log |E|), since there are |E| elements to be sorted. Next, consider the main loop. In the worst case, the list of edges needs to be examined down to its last entry, which leads to a total of |E| iterations of the main loop. Each iteration consists of a constant number of find and union operations on an n-element universe (n is the number of vertices). For union-find, we recall that using union by rank, the union and find operations are done in O(log n) time (note that this time can be expected to be almost constant if, in addition, path compression is used). Thus, the main loop is O(|E| log n). The combination of the initialization step and the main loop gives O(|E| log |E|) + O(|E| log n). This expression is dominated by O(|E| log |E|). This is because the input graph for the MST problem must be connected and normally has many more edges than vertices; note that the graph cannot be connected if |E| < n-1.
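The listing above assumes a UnionFind type providing Find and Union. A minimal sketch of such a type, using union by rank and path compression as mentioned in the running-time discussion, is shown below; the member names simply follow the calls in the listing.

// A minimal union-find (disjoint-set) sketch with union by rank and path compression,
// sufficient for the uf.Find / uf.Union calls in the Kruskal listing above.
class UnionFind
{
    private readonly int[] parent;
    private readonly int[] rank;

    public UnionFind(int n)                          // elements are the vertex IDs 1..n (index 0 unused)
    {
        parent = new int[n + 1];
        rank = new int[n + 1];
        for (int i = 0; i <= n; i++) parent[i] = i;  // each element starts in its own set
    }

    public int Find(int x)
    {
        if (parent[x] != x) parent[x] = Find(parent[x]);   // path compression
        return parent[x];
    }

    public void Union(int x, int y)
    {
        int rx = Find(x), ry = Find(y);
        if (rx == ry) return;                        // already in the same set (same tree)
        if (rank[rx] < rank[ry]) (rx, ry) = (ry, rx);      // attach the shorter tree under the taller one
        parent[ry] = rx;
        if (rank[rx] == rank[ry]) rank[rx]++;
    }
}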


More information

Algorithm Design (8) Graph Algorithms 1/2

Algorithm Design (8) Graph Algorithms 1/2 Graph Algorithm Design (8) Graph Algorithms / Graph:, : A finite set of vertices (or nodes) : A finite set of edges (or arcs or branches) each of which connect two vertices Takashi Chikayama School of

More information

Optimization I : Brute force and Greedy strategy

Optimization I : Brute force and Greedy strategy Chapter 3 Optimization I : Brute force and Greedy strategy A generic definition of an optimization problem involves a set of constraints that defines a subset in some underlying space (like the Euclidean

More information

Minimum Spanning Tree

Minimum Spanning Tree Minimum Spanning Tree 1 Minimum Spanning Tree G=(V,E) is an undirected graph, where V is a set of nodes and E is a set of possible interconnections between pairs of nodes. For each edge (u,v) in E, we

More information

COMP 182: Algorithmic Thinking Prim and Dijkstra: Efficiency and Correctness

COMP 182: Algorithmic Thinking Prim and Dijkstra: Efficiency and Correctness Prim and Dijkstra: Efficiency and Correctness Luay Nakhleh 1 Prim s Algorithm In class we saw Prim s algorithm for computing a minimum spanning tree (MST) of a weighted, undirected graph g. The pseudo-code

More information

Greedy Approach: Intro

Greedy Approach: Intro Greedy Approach: Intro Applies to optimization problems only Problem solving consists of a series of actions/steps Each action must be 1. Feasible 2. Locally optimal 3. Irrevocable Motivation: If always

More information

Lecture Notes for Chapter 23: Minimum Spanning Trees

Lecture Notes for Chapter 23: Minimum Spanning Trees Lecture Notes for Chapter 23: Minimum Spanning Trees Chapter 23 overview Problem A town has a set of houses and a set of roads. A road connects 2 and only 2 houses. A road connecting houses u and v has

More information

Partha Sarathi Manal

Partha Sarathi Manal MA 515: Introduction to Algorithms & MA5 : Design and Analysis of Algorithms [-0-0-6] Lecture 20 & 21 http://www.iitg.ernet.in/psm/indexing_ma5/y09/index.html Partha Sarathi Manal psm@iitg.ernet.in Dept.

More information

Lecture 25 Spanning Trees

Lecture 25 Spanning Trees Lecture 25 Spanning Trees 15-122: Principles of Imperative Computation (Fall 2018) Frank Pfenning, Iliano Cervesato The following is a simple example of a connected, undirected graph with 5 vertices (A,

More information

Chapter 4. Greedy Algorithms. Slides by Kevin Wayne. Copyright 2005 Pearson-Addison Wesley. All rights reserved.

Chapter 4. Greedy Algorithms. Slides by Kevin Wayne. Copyright 2005 Pearson-Addison Wesley. All rights reserved. Chapter 4 Greedy Algorithms Slides by Kevin Wayne. Copyright 2005 Pearson-Addison Wesley. All rights reserved. 1 4.5 Minimum Spanning Tree Minimum Spanning Tree Minimum spanning tree. Given a connected

More information

Lecture 6 Basic Graph Algorithms

Lecture 6 Basic Graph Algorithms CS 491 CAP Intro to Competitive Algorithmic Programming Lecture 6 Basic Graph Algorithms Uttam Thakore University of Illinois at Urbana-Champaign September 30, 2015 Updates ICPC Regionals teams will be

More information

CSE332: Data Abstractions Lecture 25: Minimum Spanning Trees. Ruth Anderson via Conrad Nied Winter 2015

CSE332: Data Abstractions Lecture 25: Minimum Spanning Trees. Ruth Anderson via Conrad Nied Winter 2015 CSE33: Data Abstractions Lecture 5: Minimum Spanning Trees Ruth Anderson via Conrad Nied Winter 05 A quick note about Gradescope 3/06/05 Today s XKCD 3/06/05 3 You guys are awesome 3/06/05 4 Do you still

More information

tree follows. Game Trees

tree follows. Game Trees CPSC-320: Intermediate Algorithm Design and Analysis 113 On a graph that is simply a linear list, or a graph consisting of a root node v that is connected to all other nodes, but such that no other edges

More information

Elementary Graph Algorithms: Summary. Algorithms. CmSc250 Intro to Algorithms

Elementary Graph Algorithms: Summary. Algorithms. CmSc250 Intro to Algorithms Elementary Graph Algorithms: Summary CmSc250 Intro to Algorithms Definition: A graph is a collection (nonempty set) of vertices and edges A path from vertex x to vertex y : a list of vertices in which

More information

Data Structures and Algorithms

Data Structures and Algorithms Data Structures and Algorithms CS245-2015S-18 Spanning Trees David Galles Department of Computer Science University of San Francisco 18-0: Spanning Trees Given a connected, undirected graph G A subgraph

More information