Introduction: (Edge-)Weighted Graph

[Figure: a weighted graph on vertices a through i; each edge is labeled with a cost.]

These are computers and costs of direct connections. What is a cheapest way to network them?
(Edge-)Weighted Graph

Many useful graphs have numbers assigned to edges. Think of: each edge has a price tag. (Usually ≥ 0; some cases have < 0.)

A weighted (edge-weighted) graph consists of:
- a set of vertices
- a set of edges
- weights: a map from edges to numbers
  - if undirected graph: {u, v} = {v, u}, same weight
  - if directed graph: (u, v) and (v, u) may have different weights

Notation: w(u, v) or weight(u, v).
Storing a Weighted Graph

[Figure: an example graph on vertices A through E; most edge labels were lost in extraction, but edge {B, D} has weight 5.]

Adjacency matrix: cell (u, v) stores w(u, v); empty if there is no edge.

Adjacency lists: each vertex stores a list of (neighbor, weight) pairs, e.g. D's list contains (B, 5).
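Both storage schemes can be sketched in a few lines of Python. This is a minimal illustration, not the slide's exact example: the vertex names follow the slide, but most of its edge weights were unreadable, so the weights below (other than {B, D} = 5) are made up.

```python
# A small undirected weighted graph, stored two ways.
vertices = ["A", "B", "C", "D", "E"]
edges = {("A", "B"): 2, ("A", "C"): 1, ("B", "C"): 4, ("B", "D"): 5}

# Adjacency matrix: matrix[u][v] holds w(u, v), or None if no edge.
matrix = {u: {v: None for v in vertices} for u in vertices}
for (u, v), w in edges.items():
    matrix[u][v] = w       # undirected: {u, v} = {v, u}, same weight
    matrix[v][u] = w

# Adjacency lists: each vertex maps to a list of (neighbor, weight) pairs.
adj = {u: [] for u in vertices}
for (u, v), w in edges.items():
    adj[u].append((v, w))
    adj[v].append((u, w))

print(matrix["B"]["D"])    # 5
print(sorted(adj["B"]))    # [('A', 2), ('C', 4), ('D', 5)]
```

The matrix answers "what is w(u, v)?" in Θ(1) but costs Θ(|V|²) space; the lists cost Θ(|V| + |E|) space, which is why the algorithms below iterate over adjacency lists.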
Common Task #1 on Weighted Graphs

Minimum spanning tree: Find a spanning tree that minimizes the sum of the weights of the edges used.

[Figure: the example graph from the previous slide with a minimum spanning tree highlighted.]

Usually just for undirected, connected graphs.
Kruskal's Algorithm: Idea

Kruskal's algorithm finds a minimum spanning tree by successive mergers:

1. At first, each vertex is its own small cluster (tree/set in the textbook).
2. Find an edge of minimum weight; use it to merge two clusters into one.
3. Do it again... In general, find an edge of minimum weight that crosses two clusters; merge them into one.

So each iteration you find a cheapest way to merge two trees. (Not a correctness proof, but a good hint.)
Kruskal's Algorithm: A Few Example Steps

[Figure: six snapshots of Kruskal's algorithm on the example graph, merging clusters one cheapest crossing edge at a time.]
Kruskal's Algorithm

    A := new container() for chosen edges
    L := list of edges sorted in increasing weight
    for each vertex v:
        v.cluster := {v}
    for each {u, v} in L in order:
        if u.cluster ≠ v.cluster:
            A.add({u, v})
            merge u.cluster and v.cluster
    return A
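The pseudocode above can be sketched as runnable Python. This assumes edges arrive as (weight, u, v) triples, and stores each cluster as a Python set shared by all of its members (the "easy way" of the next slide: merge the smaller set into the larger one).

```python
def kruskal(vertices, edges):
    """Kruskal's algorithm: edges is an iterable of (weight, u, v)
    triples for an undirected graph; returns the chosen edges."""
    A = []                                   # chosen edges
    cluster = {v: {v} for v in vertices}     # each vertex starts alone
    for w, u, v in sorted(edges):            # increasing weight
        if cluster[u] is not cluster[v]:     # edge crosses two clusters
            A.append((u, v))
            small, big = sorted((cluster[u], cluster[v]), key=len)
            big |= small                     # merge smaller into larger
            for x in small:
                cluster[x] = big             # update ownership pointers
    return A

mst = kruskal(["a", "b", "c", "d"],
              [(1, "a", "b"), (2, "b", "c"), (3, "a", "c"), (4, "c", "d")])
print(mst)  # [('a', 'b'), ('b', 'c'), ('c', 'd')] -- edge {a, c} is skipped
```

The `is not` test is the Θ(1) pointer comparison from the next slide: two vertices are in the same cluster exactly when their entries point at the same set object.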
Storing Clusters: Easy Way

Each cluster is a linked list; v.cluster is a pointer to v's owning linked list.

- u.cluster ≠ v.cluster is pointer equality, Θ(1) time
- merging two clusters is merging two linked lists, BUT: a lot of vertices need their v.cluster pointers updated

Luckily, if you move the smaller list to the larger one, then:

- whenever v.cluster needs an update, v's cluster size at least doubles
- so each v.cluster is updated at most lg |V| times, ever

A much faster way will appear later in this course.
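The doubling argument above can be demonstrated directly. This is a minimal sketch of the "easy way" with Python lists standing in for linked lists; the `updates` counter is extra instrumentation (not part of the data structure) that records how often each vertex's pointer is rewritten.

```python
class Clusters:
    """Clusters as lists with per-vertex ownership pointers."""
    def __init__(self, vertices):
        self.cluster = {v: [v] for v in vertices}   # each vertex alone
        self.updates = {v: 0 for v in vertices}     # instrumentation

    def same(self, u, v):
        return self.cluster[u] is self.cluster[v]   # pointer test, O(1)

    def merge(self, u, v):
        small, big = self.cluster[u], self.cluster[v]
        if small is big:
            return                      # already one cluster
        if len(small) > len(big):
            small, big = big, small     # always move the smaller list
        big.extend(small)
        for x in small:                 # only the smaller side's pointers
            self.cluster[x] = big       # change, so x's cluster size at
            self.updates[x] += 1        # least doubles on every update

# Merge 1024 singletons pairwise until one cluster remains:
n = 1024
c = Clusters(range(n))
size = 1
while size < n:
    for i in range(0, n, 2 * size):
        c.merge(i, i + size)
    size *= 2
print(max(c.updates.values()))  # 10 = lg 1024: no pointer moved more often
```

Even in this worst-ish case of repeated equal-size merges, no vertex's pointer is rewritten more than lg |V| times, matching the bound on the slide.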
Kruskal's Algorithm Time

- Collecting and sorting edges: Θ(|E| lg |E|).
- v.cluster updates: O(lg |V|) per vertex.
- The rest is Θ(1) per vertex or edge.

Total O(|V| lg |V| + |E| lg |E|) time worst case.

Note lg |E| ∈ O(lg |V|) (since |E| ≤ |V|²), so this is O((|V| + |E|) lg |V|) time.

Faster with a faster cluster implementation.
Prim's Algorithm: Idea

Prim's algorithm finds a minimum spanning tree by growing the tree around the starting vertex, successively adding the next cheapest edge that links up one more vertex.

This is like how breadth-first search grows a breadth-first tree, but with a twist: the queue is changed to a min priority queue. Priority of vertex v = smallest seen edge weight between v and the tree so far (∞ if no such edge).

So every time you extract-min, you get a cheapest edge to add to the tree. (Not a correctness proof, but a good hint.)
Prim's Algorithm: A Few Example Steps

[Figure: six snapshots of Prim's algorithm on the example graph, starting from vertex a with priority 0; each snapshot shows the remaining queue vertices with their current priorities and predecessors.]
Prim's Algorithm

    A := new container() for chosen edges
    Q := new min-heap()
    start := pick a vertex
    Q.insert(start, 0)
    for each vertex v ≠ start:
        Q.insert(v, ∞)
    while Q not empty:
        u := Q.extract-min()
        if u ≠ start: A.add({u.pred, u})
        for each z in u's adjacency list:
            if z in Q && weight(u, z) < priority of z:
                Q.decrease-priority(z, weight(u, z))
                z.pred := u
    return A
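The pseudocode translates to Python with one caveat: the standard-library `heapq` module has no decrease-priority operation, so this sketch uses the common workaround of pushing a fresh entry on every improvement and skipping stale entries when they are popped. It assumes the graph is given as an adjacency map of (neighbor, weight) lists.

```python
import heapq

def prim(adj, start):
    """Prim's algorithm over a connected undirected graph given as
    adj = {u: [(v, weight), ...], ...}; returns the chosen edges."""
    A = []                                   # chosen edges
    best = {v: float("inf") for v in adj}    # current priority of v
    best[start] = 0
    pred = {}
    in_tree = set()                          # finished vertices
    heap = [(0, start)]
    while heap:
        prio, u = heapq.heappop(heap)
        if u in in_tree or prio > best[u]:
            continue                         # stale entry: skip it
        in_tree.add(u)
        if u != start:
            A.append((pred[u], u))
        for z, w in adj[u]:
            if z not in in_tree and w < best[z]:
                best[z] = w                  # decrease-priority, lazily
                pred[z] = u
                heapq.heappush(heap, (w, z))
    return A

adj = {"a": [("b", 1), ("c", 3)], "b": [("a", 1), ("c", 2)],
       "c": [("a", 3), ("b", 2), ("d", 4)], "d": [("c", 4)]}
print(prim(adj, "a"))  # [('a', 'b'), ('b', 'c'), ('c', 'd')]
```

Lazy deletion keeps the asymptotics of the next slide: the heap holds at most one entry per edge, so each push and pop still costs O(lg |V|) (since lg |E| ∈ O(lg |V|)).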
Prim's Algorithm Time

- Every vertex enters and leaves the min-heap once: Θ(lg |V|) each.
- Every edge may trigger a change of priority: O(lg |V|) each.
- The rest can be done in Θ(1) per vertex or per edge (needs clever coding).

Total O((|V| + |E|) lg |V|) time worst case.
Prim's Correctness Sketch

Loop invariant:

1. A ⊆ some MST T.
2. A spans the finished vertices (those outside Q).
3. What I said about vertex priorities.

Why the loop body maintains #2 and #3: exercise.

Why the loop body maintains #1: Let p = u.pred. The loop body adds {p, u} to A. Let A′ = A ∪ {{p, u}}. A′ ⊆ some MST T′ because:

- If T has {p, u}, then choose T′ = T.
- Else, next slide: we can replace some edge by {p, u}.
Prim's Correctness Sketch

If T does not have {p, u}:

- T has a unique simple path from p (finished) to u (∈ Q), via some edge {x, y} with x finished and y ∈ Q.
- T without {x, y} would break apart; {p, u} would reconnect it.
- {p, u} is as cheap as {x, y}, because u is fresh off Q.extract-min:
  weight(p, u) = priority(u) ≤ priority(y) ≤ weight(x, y)
- Choose T′ = T − {{x, y}} ∪ {{p, u}}: still an MST, and A′ ⊆ T′.
Kruskal's Correctness Sketch

Loop invariant:

1. A ⊆ some MST T.
2. Clusters correspond to trees in A.

Why the loop body maintains #1: {u, v} is a cheapest edge that can glue two clusters. The loop body adds {u, v} to A. Let A′ = A ∪ {{u, v}}. A′ ⊆ some MST T′ because:

- If T has {u, v}, then choose T′ = T.
- Else, next slide: we can replace some edge by {u, v}.
Kruskal's Correctness Sketch

If T does not have {u, v}:

- Partition V into S and V − S such that:
  - no edge in A goes across (so no cluster goes across)
  - u's cluster is on the S side, v's cluster on the V − S side
- T has a unique simple path from u (∈ S) to v (∉ S), via some edge {x, y} with x ∈ S and y ∉ S.
- T without {x, y} would break apart; {u, v} would reconnect it.
- {x, y} can glue two clusters, and {u, v} is a cheapest edge that can glue two clusters, so {u, v} is as cheap as {x, y}.
- Choose T′ = T − {{x, y}} ∪ {{u, v}}: still an MST, and A′ ⊆ T′.
General Theorem

Suppose:

- A ⊆ some MST T.
- V can be partitioned into S and V − S such that:
  - no edge in A goes across
  - {u, v} is a cheapest edge going across

Then A ∪ {{u, v}} ⊆ some MST T′.

Proof sketch: If T does not have {u, v}:

- T has a unique simple path from u to v, via some edge {x, y} going across from S to V − S.
- T without {x, y} would break apart; {u, v} would reconnect it.
- {u, v} is as cheap as {x, y}.
- Choose T′ = T − {{x, y}} ∪ {{u, v}}.