Introduction: (Edge-)Weighted Graph

Introduction: (Edge-)Weighted Graph

[Figure: an example weighted graph on vertices a, b, c, d, e, f, g, h, i; the edge weights are not reproduced in this transcription.]

These are computers and costs of direct connections. What is a cheapest way to network them?

(Edge-)Weighted Graph

Many useful graphs have numbers assigned to edges. Think of: each edge has a price tag. (Usually ≥ 0. Some cases have < 0.)

A weighted (edge-weighted) graph consists of:
    a set of vertices
    a set of edges
    weights: a map from edges to numbers
    if undirected graph: {u, v} = {v, u}, same weight
    if directed graph: (u, v) and (v, u) may have different weights

Notation: w(u, v) or weight(u, v).

Storing a Weighted Graph

[Figure: a small example graph on vertices A, B, C, D, E with edges {A,B}, {A,C}, {B,C}, {B,D}; E is isolated. Only the weight of {B,D} (5) is legible in this transcription.]

Adjacency matrix: one row and one column per vertex; the (u, v) entry holds weight(u, v), e.g. the (B, D) and (D, B) entries are 5. (Most entries are not legible in this transcription.)

Adjacency lists: for each vertex, a list of (neighbour, weight) pairs, e.g.
    A: (B, _), (C, _)
    B: (A, _), (C, _), (D, 5)
    C: (A, _), (B, _)
    D: (B, 5)
    E: (empty)
(The weights marked "_" are not legible in this transcription.)
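
As a concrete illustration of the two representations, here is a minimal runnable Python sketch. All names in it are made up, and the weights other than the surviving {B,D} weight of 5 are hypothetical, since the slide's numbers are not recoverable from this transcription.

    # A small undirected weighted graph stored both ways (hypothetical
    # weights, except for B-D = 5 which survives in the transcription).
    INF = float("inf")

    vertices = ["A", "B", "C", "D", "E"]
    edges = [("A", "B", 2), ("A", "C", 3), ("B", "C", 1), ("B", "D", 5)]  # E stays isolated

    # Adjacency matrix: matrix[i][j] = weight of the edge between vertex i
    # and vertex j, INF if there is no edge, 0 on the diagonal.
    index = {v: i for i, v in enumerate(vertices)}
    matrix = [[INF] * len(vertices) for _ in vertices]
    for i in range(len(vertices)):
        matrix[i][i] = 0
    for u, v, w in edges:
        matrix[index[u]][index[v]] = w
        matrix[index[v]][index[u]] = w  # undirected: {u, v} = {v, u}, same weight

    # Adjacency lists: for each vertex, a list of (neighbour, weight) pairs.
    adj = {v: [] for v in vertices}
    for u, v, w in edges:
        adj[u].append((v, w))
        adj[v].append((u, w))

    print(matrix[index["B"]][index["D"]])  # 5
    print(adj["B"])                        # [('A', 2), ('C', 1), ('D', 5)]
    print(adj["E"])                        # []  (isolated vertex)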

Common Task on Weighted Graphs

Minimum spanning tree: Find a spanning tree. Minimize the sum of the weights of the edges used.

[Figure: a small example graph on A, B, C, D; not reproduced in this transcription.]

Usually just for undirected, connected graphs.

Kruskal's Algorithm: Idea

Kruskal's algorithm finds a minimum spanning tree by successive mergers.

1. At first, each vertex is its own small cluster (tree/set in the textbook).
2. Find an edge of minimum weight, use it to merge two clusters into one.
3. Do it again...

In general, find an edge of minimum weight that crosses two clusters; merge them into one. So each iteration you find a cheapest way to merge two trees. (Not a correctness proof, but a good hint.)

Kruskal's Algorithm: A Few Example Steps

[Six slides step through Kruskal's algorithm on the example graph from the introduction; the figures are not reproduced in this transcription.]

Kruskal's Algorithm

A := new container()   (for chosen edges)
L := list of edges sorted in increasing weight
for each vertex v:
    v.cluster := {v}
for each {u, v} in L in order:
    if u.cluster ≠ v.cluster:
        A.add({u, v})
        merge u.cluster and v.cluster
return A

Storing Clusters: Easy Way

Each cluster is a linked list; v.cluster is a pointer to v's owning linked list.

Testing u.cluster ≠ v.cluster is pointer equality, Θ(1) time.

Merging two clusters is merging two linked lists, BUT: a lot of vertices need their v.cluster pointers updated.

Luckily, if you move the smaller list into the larger one, then:
    whenever v.cluster needs an update, v's cluster size at least doubles
    so each v.cluster is updated at most lg |V| times, ever

A much faster way will appear later in this course.
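
To make this concrete, here is a minimal runnable Python sketch of Kruskal's algorithm as given in the pseudocode above, using exactly this "easy way" cluster storage: each cluster is a plain list, every vertex keeps a reference to its owning list, and a merge always moves the smaller list into the larger one. The function name kruskal and the example graph at the bottom (vertices, edges, weights) are hypothetical, not taken from the slides.

    # Kruskal with "easy way" clusters: cluster[v] points to v's owning list,
    # and merges always move the smaller list into the larger one.
    def kruskal(vertices, edges):
        """edges: list of (u, v, weight) triples of an undirected graph."""
        A = []                                   # chosen edges
        L = sorted(edges, key=lambda e: e[2])    # edges in increasing weight
        cluster = {v: [v] for v in vertices}     # each vertex starts alone
        for u, v, w in L:
            if cluster[u] is not cluster[v]:     # pointer (identity) comparison
                A.append((u, v, w))
                small, large = cluster[u], cluster[v]
                if len(small) > len(large):
                    small, large = large, small
                for x in small:                  # update each member's pointer;
                    cluster[x] = large           # x's cluster size at least doubles
                large.extend(small)
        return A

    # Hypothetical example (not the slides' graph or weights).
    vertices = ["a", "b", "c", "d"]
    edges = [("a", "b", 1), ("b", "c", 4), ("a", "c", 3), ("c", "d", 2)]
    print(kruskal(vertices, edges))   # [('a', 'b', 1), ('c', 'd', 2), ('a', 'c', 3)]

A pointer update happens only when a vertex sits on the smaller side of a merge, which is why each vertex's cluster pointer changes at most lg |V| times in total.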

Kruskal's Algorithm: Time

Collecting and sorting edges: Θ(|E| lg |E|).
v.cluster updates: O(lg |V|) per vertex.
The rest is Θ(1) per vertex or edge.

Total O(|V| lg |V| + |E| lg |E|) time worst case. Note lg |E| ∈ O(lg |V|), so O((|V| + |E|) lg |V|) time. Faster with a faster cluster implementation.

Prim's Algorithm: Idea

Prim's algorithm finds a minimum spanning tree by growing the tree around the starting vertex, successively adding the next cheapest edge that links up one more vertex.

This is like how breadth-first search grows a breadth-first tree, but with a twist: the queue is changed to a min priority queue.

Priority of vertex v = smallest seen edge weight between v and the tree so far. (∞ if no such edge.)

So every time you extract-min, you get a cheapest edge to add to the tree. (Not a correctness proof, but a good hint.)

Prim's Algorithm: A Few Example Steps

[Six slides step through Prim's algorithm on the example graph, showing after each step the priority and predecessor of every vertex still in the queue; the figures and tables are not reproduced in this transcription.]

Prim's Algorithm

A := new container()   (for chosen edges)
Q := new min-heap()
start := pick a vertex
Q.insert(start, 0)
for each vertex v ≠ start:
    Q.insert(v, ∞)
while Q not empty:
    u := Q.extract-min()
    if u ≠ start:
        A.add({u.pred, u})
    for each z in u's adjacency list:
        if z in Q && weight(u, z) < priority of z:
            Q.decrease-priority(z, weight(u, z))
            z.pred := u
return A
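
For comparison with the pseudocode, here is a minimal runnable Python sketch of Prim's algorithm. Python's heapq module has no decrease-priority operation, so the sketch emulates it by pushing a fresh (priority, vertex) entry and skipping stale entries on extraction. The function name prim and the example graph at the bottom (vertices and weights) are hypothetical, not taken from the slides.

    import heapq

    def prim(adj, start):
        """Return MST edges of the connected component containing start.

        adj maps each vertex to a list of (neighbour, weight) pairs.
        """
        best = {v: float("inf") for v in adj}  # current priority of each vertex
        pred = {v: None for v in adj}          # tree neighbour realising that priority
        best[start] = 0
        heap = [(0, start)]                    # (priority, vertex) entries
        finished = set()
        A = []                                 # chosen edges
        while heap:
            prio, u = heapq.heappop(heap)
            if u in finished or prio > best[u]:
                continue                       # stale entry from an earlier push
            finished.add(u)
            if pred[u] is not None:            # the start vertex has no predecessor
                A.append((pred[u], u, best[u]))
            for z, w in adj[u]:
                if z not in finished and w < best[z]:
                    best[z] = w                # "decrease-priority": push a new entry
                    pred[z] = u
                    heapq.heappush(heap, (w, z))
        return A

    # Hypothetical example graph (not the slides' graph or weights).
    adj = {
        "a": [("b", 4), ("c", 2)],
        "b": [("a", 4), ("c", 1), ("d", 5)],
        "c": [("a", 2), ("b", 1), ("d", 8)],
        "d": [("b", 5), ("c", 8)],
    }
    print(prim(adj, "a"))  # [('a', 'c', 2), ('c', 'b', 1), ('b', 'd', 5)]

With this trick the heap holds O(|E|) entries, so the total work is O(|E| lg |E|) = O((|V| + |E|) lg |V|), matching the bound on the next slide.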

Prim's Algorithm: Time

Every vertex enters and leaves the min-heap once: Θ(lg |V|) each.
Every edge may trigger a change of priority: O(lg |V|) each.
The rest can be done in Θ(1) per vertex or per edge (needs clever coding).

Total O((|V| + |E|) lg |V|) time worst case.

Prim's Correctness Sketch

Loop invariant:
1. A ⊆ some MST T.
2. A spans the finished vertices (those outside Q).
3. What I said about vertex priorities.

Why the loop body maintains #2 and #3: exercise.

Why the loop body maintains #1: Let p = u.pred. The loop body adds {p, u} to A. Let A' = A ∪ {{p, u}}. Then A' ⊆ some MST T', because:
    If T has {p, u}, then choose T' = T.
    Else (next slide): we can replace some edge of T by {p, u}.

Prim's Correctness Sketch

If T does not have {p, u}:
T has a unique simple path from p (finished) to u (∈ Q), via some edge {x, y} with x finished and y ∈ Q.
T without {x, y} would break apart; {p, u} would reconnect it.
{p, u} is as cheap as {x, y}, because u is fresh off Q.extract-min:
    weight(p, u) = priority(u) ≤ priority(y) ≤ weight(x, y)
Choose T' = T − {{x, y}} ∪ {{p, u}}: still an MST, and A' ⊆ T'.

Kruskal's Correctness Sketch

Loop invariant:
1. A ⊆ some MST T.
2. Clusters correspond to the trees in A.

Why the loop body maintains #1: {u, v} is a cheapest edge that can glue two clusters. The loop body adds {u, v} to A. Let A' = A ∪ {{u, v}}. Then A' ⊆ some MST T', because:
    If T has {u, v}, then choose T' = T.
    Else (next slide): we can replace some edge of T by {u, v}.

Kruskal's Correctness Sketch

If T does not have {u, v}:
Partition V into S and V − S such that:
    no edge in A goes across (so no cluster goes across);
    u's cluster is on the S side, v's cluster is on the V − S side.
T has a unique simple path from u (∈ S) to v (∈ V − S), via some edge {x, y} with x ∈ S and y ∈ V − S.
T without {x, y} would break apart; {u, v} would reconnect it.
{x, y} can glue two clusters. {u, v} is a cheapest edge that can glue two clusters. So {u, v} is as cheap as {x, y}.
Choose T' = T − {{x, y}} ∪ {{u, v}}: still an MST, and A' ⊆ T'.

General Theorem

Suppose:
    A ⊆ some MST T;
    we can partition V into S and V − S such that:
        no edge in A goes across;
        {u, v} is a cheapest edge going across.
Then A ∪ {{u, v}} ⊆ some MST T'.

Proof sketch: If T does not have {u, v}:
T has a unique simple path from u to v, via some edge {x, y} going across from S to V − S.
T without {x, y} would break apart; {u, v} would reconnect it.
{u, v} is as cheap as {x, y}.
Choose T' = T − {{x, y}} ∪ {{u, v}}.