CHAPTER 23. Minimum Spanning Trees


In the design of electronic circuitry, it is often necessary to make the pins of several components electrically equivalent by wiring them together. To interconnect a set of n pins, we can use an arrangement of n - 1 wires, each connecting two pins. Of all such arrangements, the one that uses the least amount of wire is usually the most desirable. We can model this wiring problem with a connected, undirected graph G = (V, E), where V is the set of pins, E is the set of possible interconnections between pairs of pins, and for each edge (u, v) ∈ E, we have a weight w(u, v) specifying the cost (amount of wire needed) to connect u and v. We then wish to find an acyclic subset T ⊆ E that connects all of the vertices and whose total weight w(T) = Σ_{(u,v)∈T} w(u, v) is minimized. Since T is acyclic and connects all of the vertices, it must form a tree, which we call a spanning tree since it spans the graph G. Definition (Minimum Spanning Tree): a tree that connects all the vertices of a graph with the lowest possible sum of edge weights.
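As a small illustration of these definitions, the sketch below models a weighted, undirected graph as a plain edge list and computes w(T) for one candidate spanning tree; both the graph and the tree T are invented for this example, not data from the text.

```python
# Illustrative data only: a tiny connected, undirected graph G = (V, E) with
# edge weights, and one acyclic subset T that happens to span all six vertices.
edges = [                      # (u, v, w(u, v))
    ("a", "b", 4), ("b", "c", 8), ("c", "d", 7), ("c", "f", 4),
    ("d", "f", 14), ("e", "f", 10), ("d", "e", 9),
]

T = [("a", "b", 4), ("b", "c", 8), ("c", "f", 4), ("c", "d", 7), ("d", "e", 9)]

def total_weight(tree_edges):
    """w(T) = sum of w(u, v) over the edges (u, v) in T."""
    return sum(w for _u, _v, w in tree_edges)

print(total_weight(T))   # 32 for this particular spanning tree
```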


23.2 The algorithms of Kruskal and Prim

Kruskal's algorithm

Kruskal's algorithm is based directly on the generic minimum-spanning-tree algorithm given in Section 23.1. It finds a safe edge to add to the growing forest by finding, of all the edges that connect any two trees in the forest, an edge (u, v) of least weight. Let C1 and C2 denote the two trees that are connected by (u, v). Since (u, v) must be a light edge connecting C1 to some other tree, Corollary 23.2 implies that (u, v) is a safe edge for C1. Kruskal's algorithm is a greedy algorithm, because at each step it adds to the forest an edge of least possible weight. Our implementation of Kruskal's algorithm is like the algorithm to compute connected components. It uses a disjoint-set data structure to maintain several disjoint sets of elements. Each set contains the vertices in a tree of the current forest.

The operation FIND-SET(u) returns a representative element from the set that contains u. Thus, we can determine whether two vertices u and v belong to the same tree by testing whether FIND-SET(u) equals FIND-SET(v). The combining of trees is accomplished by the UNION procedure.

MST-KRUSKAL(G, w)
1   A ← ∅
2   for each vertex v ∈ V[G]
3       do MAKE-SET(v)
4   sort the edges of E into nondecreasing order by weight w
5   for each edge (u, v) ∈ E, taken in nondecreasing order by weight
6       do if FIND-SET(u) ≠ FIND-SET(v)
7              then A ← A ∪ {(u, v)}
8                   UNION(u, v)
9   return A


Lines 1-3 initialize the set A to the empty set and create |V| trees, one containing each vertex. The edges in E are sorted into nondecreasing order by weight in line 4. The for loop in lines 5-8 checks, for each edge (u, v), whether the endpoints u and v belong to the same tree. If they do, then the edge (u, v) cannot be added to the forest without creating a cycle, and the edge is discarded. Otherwise, the two vertices belong to different trees. In this case, the edge (u, v) is added to A in line 7, and the vertices in the two trees are merged in line 8.

The running time of Kruskal's algorithm for a graph G = (V, E) depends on the implementation of the disjoint-set data structure. We shall assume the disjoint-set-forest implementation with the union-by-rank and path-compression heuristics, since it is the asymptotically fastest implementation known. Initializing the set A in line 1 takes O(1) time, and the time to sort the edges in line 4 is O(E lg E). (We will account for the cost of the |V| MAKE-SET operations in the for loop of lines 2-3 in a moment.)

The for loop of lines 5-8 performs O(E) FIND-SET and UNION operations on the disjoint-set forest. Along with the |V| MAKE-SET operations, these take a total of O((V + E) α(V)) time, where α is the very slowly growing function defined in Section 21.4. Because G is assumed to be connected, we have |E| ≥ |V| - 1, and so the disjoint-set operations take O(E α(V)) time. Moreover, since α(|V|) = O(lg V) = O(lg E), the total running time of Kruskal's algorithm is O(E lg E). Observing that |E| < |V|², we have lg |E| = O(lg V), and so we can restate the running time of Kruskal's algorithm as O(E lg V). Remember: |V| is the number of vertices, sometimes written simply as V. We prefer stating the bound with lg V rather than lg E, since lg V is in general smaller than lg E.
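To make MST-KRUSKAL concrete, here is a hedged, self-contained Python sketch; the DisjointSet class, the mst_kruskal name, and the (u, v, w) edge-list format are assumptions of this sketch rather than anything from the text, but the union-by-rank and path-compression heuristics are the ones discussed above.

```python
class DisjointSet:
    """Disjoint-set forest with union by rank and path compression."""
    def __init__(self, vertices):
        self.parent = {v: v for v in vertices}   # MAKE-SET for every vertex
        self.rank = {v: 0 for v in vertices}

    def find_set(self, x):
        if self.parent[x] != x:
            self.parent[x] = self.find_set(self.parent[x])   # path compression
        return self.parent[x]

    def union(self, x, y):
        rx, ry = self.find_set(x), self.find_set(y)
        if self.rank[rx] < self.rank[ry]:                    # union by rank
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1

def mst_kruskal(vertices, edges):
    """Return a list of MST edges, mirroring MST-KRUSKAL line by line."""
    A = []                                               # line 1
    sets = DisjointSet(vertices)                         # lines 2-3
    for u, v, w in sorted(edges, key=lambda e: e[2]):    # lines 4-5
        if sets.find_set(u) != sets.find_set(v):         # line 6
            A.append((u, v, w))                          # line 7
            sets.union(u, v)                             # line 8
    return A                                             # line 9
```

The sort dominates this sketch, which is exactly where the O(E lg E) = O(E lg V) bound derived above comes from.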

Prim's algorithm

Like Kruskal's algorithm, Prim's algorithm is a special case of the generic minimum-spanning-tree algorithm. Prim's algorithm operates much like Dijkstra's algorithm for finding shortest paths in a graph, which we shall see in Section 24.3. Prim's algorithm has the property that the edges in the set A always form a single tree.

The tree starts from an arbitrary root vertex r and grows until it spans all the vertices in V. At each step, a light edge is added to the tree A that connects A to an isolated vertex of G_A = (V, A). This rule adds only edges that are safe for A; therefore, when the algorithm terminates, the edges in A form a minimum spanning tree. This strategy is greedy since the tree is augmented at each step with an edge that contributes the minimum amount possible to the tree's weight.

The key to implementing Prim's algorithm efficiently is to make it easy to select a new edge to be added to the tree formed by the edges in A. In the pseudocode below, the connected graph G and the root r of the minimum spanning tree to be grown are inputs to the algorithm. During execution of the algorithm, all vertices that are not in the tree reside in a min-priority queue Q based on a key field. For each vertex v, key[v] is the minimum weight of any edge connecting v to a vertex in the tree; by convention, key[v] = ∞ if there is no such edge. The field π[v] names the parent of v in the tree.

During the algorithm, the set A from GENERIC-MST is kept implicitly as A = {(v, π[v]) : v ∈ V - {r} - Q}. When the algorithm terminates, the min-priority queue Q is empty; the minimum spanning tree A for G is thus A = {(v, π[v]) : v ∈ V - {r}}.

MST-PRIM(G, w, r)
1    for each u ∈ V[G]
2        do key[u] ← ∞
3           π[u] ← NIL
4    key[r] ← 0
5    Q ← V[G]
6    while Q ≠ ∅
7        do u ← EXTRACT-MIN(Q)
8           for each v ∈ Adj[u]
9               do if v ∈ Q and w(u, v) < key[v]
10                     then π[v] ← u
11                          key[v] ← w(u, v)

Prim's algorithm works as follows. Lines 1-5 set the key of each vertex to ∞ (except for the root r, whose key is set to 0 so that it will be the first vertex processed), set the parent of each vertex to NIL, and initialize the min-priority queue Q to contain all the vertices. The algorithm maintains the following three-part loop invariant: prior to each iteration of the while loop of lines 6-11,

1. A = {(v, π[v]) : v ∈ V - {r} - Q}.
2. The vertices already placed into the minimum spanning tree are those in V - Q.
3. For all vertices v ∈ Q, if π[v] ≠ NIL, then key[v] < ∞ and key[v] is the weight of a light edge (v, π[v]) connecting v to some vertex already placed into the minimum spanning tree.

Line 7 identifies a vertex u ∈ Q incident on a light edge crossing the cut (V - Q, Q) (with the exception of the first iteration, in which u = r due to line 4). Removing u from the set Q adds it to the set V - Q of vertices in the tree, thus adding (u, π[u]) to A.
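The Python sketch below mirrors MST-PRIM, with two assumptions worth flagging: the graph is an adjacency dictionary {vertex: {neighbor: weight}} with orderable vertex labels, and, because Python's heapq module has no DECREASE-KEY, the sketch seeds the heap with the root only and pushes fresh entries instead of decreasing keys, skipping stale entries when they surface. For a connected graph this computes a minimum spanning tree just the same.

```python
import heapq
import math

def mst_prim(graph, r):
    """Lazy-deletion variant of MST-PRIM; graph = {u: {v: w(u, v)}}, assumed connected."""
    key = {u: math.inf for u in graph}            # lines 1-2
    parent = {u: None for u in graph}             # line 3 (None plays the role of NIL)
    key[r] = 0                                    # line 4
    in_tree = set()                               # the set V - Q of extracted vertices
    heap = [(0, r)]                               # line 5, seeded with the root only
    while heap:                                   # line 6
        k, u = heapq.heappop(heap)                # line 7: EXTRACT-MIN
        if u in in_tree:
            continue                              # stale entry left behind by a later update
        in_tree.add(u)
        for v, w in graph[u].items():             # line 8
            if v not in in_tree and w < key[v]:   # line 9
                parent[v] = u                     # line 10
                key[v] = w                        # line 11, realized as a fresh heap entry
                heapq.heappush(heap, (w, v))
    return [(parent[v], v, key[v]) for v in graph if parent[v] is not None]
```

Each adjacency-list entry pushes at most one heap item, so the heap holds O(E) entries and the total running time is still O(E lg V), matching the binary-heap bound derived in the next paragraphs.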

The for loop of lines 8-11 updates the key and π fields of every vertex v adjacent to u but not in the tree. The updating maintains the third part of the loop invariant.

The performance of Prim's algorithm depends on how we implement the min-priority queue Q. If Q is implemented as a binary min-heap (see Chapter 6), we can use the BUILD-MIN-HEAP procedure to perform the initialization in lines 1-5 in O(V) time. The body of the while loop is executed |V| times, and since each EXTRACT-MIN operation takes O(lg V) time, the total time for all calls to EXTRACT-MIN is O(V lg V). The for loop in lines 8-11 is executed O(E) times altogether, since the sum of the lengths of all adjacency lists is 2|E|. Within the for loop, the test for membership in Q in line 9 can be implemented in constant time by keeping a bit for each vertex that tells whether or not it is in Q, and updating the bit when the vertex is removed from Q. The assignment in line 11 involves an implicit DECREASE-KEY operation on the min-heap, which a binary min-heap supports in O(lg V) time.
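As a hedged illustration of the preceding paragraph, the sketch below shows one way a binary min-heap can supply both the O(1) membership test and an O(lg V) DECREASE-KEY: each vertex's position in the heap array is tracked in a dictionary, which stands in for the per-vertex bit. The class and method names are this sketch's own, not from the text.

```python
class IndexedMinHeap:
    """Binary min-heap of (key, vertex) pairs with position tracking for DECREASE-KEY."""
    def __init__(self):
        self.heap = []    # array of (key, vertex) pairs
        self.pos = {}     # vertex -> index in self.heap; doubles as the "is it in Q?" record

    def push(self, vertex, key):
        self.heap.append((key, vertex))
        self.pos[vertex] = len(self.heap) - 1
        self._sift_up(len(self.heap) - 1)

    def extract_min(self):                      # O(lg V)
        key, vertex = self.heap[0]
        last = self.heap.pop()
        del self.pos[vertex]
        if self.heap:
            self.heap[0] = last
            self.pos[last[1]] = 0
            self._sift_down(0)
        return vertex, key

    def decrease_key(self, vertex, new_key):    # O(lg V): overwrite the key, then sift up
        i = self.pos[vertex]
        self.heap[i] = (new_key, vertex)
        self._sift_up(i)

    def __contains__(self, vertex):             # O(1) membership test for line 9
        return vertex in self.pos

    def __len__(self):
        return len(self.heap)

    def _swap(self, i, j):
        self.heap[i], self.heap[j] = self.heap[j], self.heap[i]
        self.pos[self.heap[i][1]] = i
        self.pos[self.heap[j][1]] = j

    def _sift_up(self, i):
        while i > 0 and self.heap[i][0] < self.heap[(i - 1) // 2][0]:
            self._swap(i, (i - 1) // 2)
            i = (i - 1) // 2

    def _sift_down(self, i):
        n = len(self.heap)
        while True:
            smallest = i
            for child in (2 * i + 1, 2 * i + 2):
                if child < n and self.heap[child][0] < self.heap[smallest][0]:
                    smallest = child
            if smallest == i:
                return
            self._swap(i, smallest)
            i = smallest
```

With this queue, line 9's membership test becomes `v in Q` and line 11's implicit DECREASE-KEY becomes `Q.decrease_key(v, w)`.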

Thus, the total time for Prim's algorithm is O(V lg V + E lg V) = O(E lg V), which is asymptotically the same as for our implementation of Kruskal's algorithm. The asymptotic running time of Prim's algorithm can be improved, however, by using Fibonacci heaps. Chapter 20 shows that if |V| elements are organized into a Fibonacci heap, we can perform an EXTRACT-MIN operation in O(lg V) amortized time and a DECREASE-KEY operation (to implement line 11) in O(1) amortized time. Therefore, if we use a Fibonacci heap to implement the min-priority queue Q, the running time of Prim's algorithm improves to O(E + V lg V).

Exercise 23.2-1

Kruskal's algorithm can return different spanning trees for the same input graph G, depending on how ties are broken when the edges are sorted into order. Show that for each minimum spanning tree T of G, there is a way to sort the edges of G in Kruskal's algorithm so that the algorithm returns T.

Solution

For each edge e in T, simply make sure that it precedes any other edge not in T that has the same weight w(e).
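A small hedged sketch of that ordering, reusing the mst_kruskal sketch above: weight is the primary sort key, and a secondary key of 0 for edges of T versus 1 for all other edges guarantees that, among edges of equal weight, the tree edges are examined first. The names are illustrative; T is assumed to be given as a set of frozenset vertex pairs.

```python
def order_favoring_T(edges, T):
    """Sort (u, v, w) edges by weight, breaking ties so that edges of T come first."""
    return sorted(edges, key=lambda e: (e[2], frozenset((e[0], e[1])) not in T))

# Usage sketch: mst_kruskal(vertices, order_favoring_T(edges, T))
# Python's sort is stable, so mst_kruskal's own sort-by-weight keeps this tie-break.
```

With this order, whenever a tree edge of some weight is considered, no equal-weight non-tree edge has been examined before it, which is the idea behind the one-line solution above.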

Exercise 23.2-3

Is the Fibonacci-heap implementation of Prim's algorithm asymptotically faster than the binary-heap implementation for a sparse graph G = (V, E), where |E| = Θ(V)? What about for a dense graph, where |E| = Θ(V²)? How must |E| and |V| be related for the Fibonacci-heap implementation to be asymptotically faster than the binary-heap implementation?

Solution

Consider the running time of Prim's algorithm implemented with either a binary heap or a Fibonacci heap. Suppose |E| = Θ(V); then the running times are:

Binary: O(E lg V) = O(V lg V).
Fibonacci: O(E + V lg V) = O(V lg V).

If |E| = Θ(V²), then:

Binary: O(E lg V) = O(V² lg V).
Fibonacci: O(E + V lg V) = O(V²).

The Fibonacci-heap implementation beats the binary-heap implementation of Prim's algorithm when E = ω(V), since O(E + V lg V) = O(V lg V) for E = O(V lg V), but O(E lg V) = ω(V lg V) for E = ω(V). For E = ω(V lg V) the Fibonacci-heap version clearly has a better running time than the binary-heap version.

Exercise 23.2-4

Suppose that all edge weights in a graph are integers in the range from 1 to |V|. How fast can you make Kruskal's algorithm run? What if the edge weights are integers in the range from 1 to W for some constant W?

Solution

We know that Kruskal's algorithm takes O(V) time for initialization, O(E lg E) time to sort the edges, and O(E α(V)) time for the disjoint-set operations, for a total running time of O(V + E lg E + E α(V)) = O(E lg E). If we knew that all of the edge weights in the graph were integers in the range from 1 to |V|, then we could sort the edges in O(V + E) time using counting sort. Since the graph is connected, V = O(E), and so the sorting time is reduced to O(E). This would yield a total running time of O(V + E + E α(V)) = O(E α(V)), again since V = O(E), and since E = O(E α(V)). The time to process the edges, not the time to sort them, is now the dominant term. If the edge weights were integers in the range from 1 to W, then we could again use counting sort to sort the edges more quickly. Sorting would take O(E + W) = O(E) time, since W is a constant, and we again get a total running time of O(E α(V)).
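A hedged sketch of that counting-sort step: since the weights are integers in a small range, the edges can be distributed into per-weight buckets in O(E + W) time (or O(E + V) when the range is 1..|V|) and handed to the Kruskal loop already in nondecreasing order. The function name and edge format are this sketch's own.

```python
def sort_edges_counting(edges, max_weight):
    """Counting-sort the (u, v, w) edges by integer weight in 1..max_weight."""
    buckets = [[] for _ in range(max_weight + 1)]    # bucket index = weight
    for u, v, w in edges:
        buckets[w].append((u, v, w))
    return [e for bucket in buckets for e in bucket]

# Usage sketch: in mst_kruskal, replace sorted(edges, key=...) with
# sort_edges_counting(edges, max_weight).
```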

Exercise 23.2-5

Suppose that all edge weights in a graph are integers in the range from 1 to |V|. How fast can you make Prim's algorithm run? What if the edge weights are integers in the range from 1 to W for some constant W?

Solution

The running time of Prim's algorithm is composed of: O(V) for initialization, O(V · time for EXTRACT-MIN), and O(E · time for DECREASE-KEY). If the edge weights are integers in the range 1, ..., |V|, then an integer priority queue such as a van Emde Boas tree can speed up EXTRACT-MIN and DECREASE-KEY to O(lg lg V), thus yielding a total running time of O(V lg lg V + E lg lg V) = O(E lg lg V). If the edge weights are integers in the range from 1 to W, we can implement the queue as an array [1 .. W + 1], where the ith slot holds a doubly linked list of the vertices whose current key is i. The (W + 1)st slot holds the vertices with key ∞. EXTRACT-MIN now runs in O(W) = O(1) time, since we can simply scan for the first nonempty slot and return the first element of that list. DECREASE-KEY runs in O(1) time as well, since it can be implemented by moving an element from one slot to another.
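A hedged sketch of the bucket array described above, for integer keys in 1..W plus an "infinity" slot. Python sets stand in for the doubly linked lists (both give O(1) insertion and deletion), and all names are this sketch's own.

```python
class BucketQueue:
    """Min-priority queue for vertices whose keys are integers in 1..W (or infinity)."""
    def __init__(self, W):
        self.W = W
        self.buckets = [set() for _ in range(W + 2)]   # slots 1..W, slot W+1 for infinity
        self.slot = {}                                  # vertex -> slot it currently occupies

    def insert(self, v, key=None):
        s = self.W + 1 if key is None else key          # key=None plays the role of infinity
        self.buckets[s].add(v)
        self.slot[v] = s

    def decrease_key(self, v, new_key):                 # O(1): move v to the smaller slot
        self.buckets[self.slot[v]].discard(v)
        self.buckets[new_key].add(v)
        self.slot[v] = new_key

    def extract_min(self):                              # O(W) = O(1) scan for constant W
        for s in range(1, self.W + 2):
            if self.buckets[s]:
                v = self.buckets[s].pop()
                del self.slot[v]
                return v, s
        raise IndexError("extract_min from an empty queue")

    def __contains__(self, v):                          # the line-9 test "is v still in Q?"
        return v in self.slot
```

Plugging this queue into MST-PRIM gives |V| EXTRACT-MIN calls and O(E) DECREASE-KEY calls at O(1) each, for O(V + E) = O(E) total time when W is a constant.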

Exercise 23.2-7

Suppose that a graph G has a minimum spanning tree already computed. How quickly can the minimum spanning tree be updated if a new vertex and incident edges are added to G?

Solution

Let v be the newly added vertex and T be the original MST, rooted at some vertex s. Add the lightest edge incident on v to T, and we get a new spanning tree T1. For each remaining new edge, say (u, v), add it to the current tree, which must create a cycle, then remove the edge with the maximum weight on that cycle. Repeat this until all of the new edges have been processed. To find the maximum-weight edge on a cycle, first find the path P from v to s and the path P1 from u to s. Walking down from the root s, the two paths coincide up to some vertex x and then diverge; x is the least common ancestor of u and v. Furthermore, the paths from x to u and from x to v, together with the edge (u, v), form a cycle. The maximum-weight edge on this cycle is the heaviest of the edges on those two paths and the edge (u, v). This runs in O(eV) time, where e is the number of new edges.
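The sketch below walks through the update just described, under these assumptions: the current MST is an adjacency dictionary {vertex: {neighbor: weight}}, the new vertex arrives with a list of (neighbor, weight) edges, and a plain BFS (rather than explicit root paths and the least common ancestor) recovers the unique tree path whose heaviest edge is compared against each new edge. All names are illustrative.

```python
from collections import deque

def tree_path(tree, s, t):
    """BFS in the tree to recover the unique path from s to t as a list of vertices."""
    parent = {s: None}
    queue = deque([s])
    while queue:
        x = queue.popleft()
        if x == t:
            break
        for y in tree[x]:
            if y not in parent:
                parent[y] = x
                queue.append(y)
    path, x = [], t
    while x is not None:
        path.append(x)
        x = parent[x]
    return path[::-1]

def update_mst(tree, v, new_edges):
    """tree: {vertex: {neighbor: weight}} MST; v: new vertex; new_edges: [(neighbor, weight)]."""
    new_edges = sorted(new_edges, key=lambda e: e[1])
    u0, w0 = new_edges[0]                  # step 1: attach v by its lightest incident edge
    tree[v] = {u0: w0}
    tree[u0][v] = w0
    for u, w in new_edges[1:]:             # step 2: every other new edge closes one cycle
        path = tree_path(tree, u, v)       # the unique tree path u ~> v
        a, b = max(zip(path, path[1:]), key=lambda e: tree[e[0]][e[1]])
        if tree[a][b] > w:                 # heaviest cycle edge is a tree edge: swap it out
            del tree[a][b]
            del tree[b][a]
            tree[u][v] = w
            tree[v][u] = w
    return tree
```

Each new edge triggers one O(V) path search plus O(V) work on the cycle, matching the O(eV) bound stated above.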