Graph Algorithms: shortest paths, minimum spanning trees, etc.


Graph Algorithms: shortest paths, minimum spanning trees, etc.
Nancy Amato, Parasol Lab, Dept. of CSE, Texas A&M University
Acknowledgement: These slides are adapted from slides provided with Data Structures and Algorithms in C++, Goodrich, Tamassia, and Mount (Wiley, 2004).
http://parasol.tamu.edu
[Figure: weighted flight-route graph on SFO, ORD, LAX, DFW.]

Minimum Spanning Trees
[Figure: weighted flight-route graph on BOS, PVD, ORD, JFK, BWI, SFO, LAX, DFW, MIA with mileage edge weights; this graph is reused throughout the Kruskal example.]

Outline and Reading
- Minimum Spanning Trees (§ 13.6): definitions, a crucial fact
- The Prim-Jarnik Algorithm (§ 13.6.2)
- Kruskal's Algorithm (§ 13.6.1)

Reminder: Weighted Graphs
- In a weighted graph, each edge has an associated numerical value, called the weight of the edge.
- Edge weights may represent distances, costs, etc.
- Example: in a flight-route graph, the weight of an edge represents the distance in miles between the endpoint airports.
[Figure: flight-route graph on SFO, ORD, LGA, PVD, HNL, LAX, DFW, MIA.]

Minimum Spanning Tree
- Spanning subgraph: subgraph of a graph G containing all the vertices of G.
- Spanning tree: spanning subgraph that is itself a (free) tree.
- Minimum spanning tree (MST): spanning tree of a weighted graph with minimum total edge weight.
- Applications: communications networks, transportation networks.
[Figure: weighted graph on DEN, DFW, ORD, STL, ATL, PIT, DCA with an MST highlighted.]

Exercise: MST
Show a minimum spanning forest (MSF) of the following graph.
[Figure: flight-route graph on SFO, ORD, LGA, PVD, HNL, LAX, DFW, MIA.]

Cycle Property
Cycle Property: Let T be a minimum spanning tree of a weighted graph G, let e be an edge of G that is not in T, and let C be the cycle formed by e with T. Then for every edge f of C, weight(f) ≤ weight(e).
Proof (by contradiction): If weight(f) > weight(e), we can get a spanning tree of smaller weight by replacing f with e.
[Figure: a cycle C containing e and f; replacing f with e yields a better spanning tree.]

Partition Property
Partition Property: Consider a partition of the vertices of G into subsets U and V, and let e be an edge of minimum weight across the partition. Then there is a minimum spanning tree of G containing edge e.
Proof: Let T be an MST of G. If T does not contain e, consider the cycle C formed by e with T, and let f be an edge of C across the partition. By the cycle property, weight(f) ≤ weight(e); since e has minimum weight across the partition, weight(f) = weight(e). We obtain another MST by replacing f with e.
[Figure: partition of the vertices into U and V with edges e and f crossing it; replacing f with e yields another MST.]

Prim-Jarnik's Algorithm
- Similar to Dijkstra's algorithm; assumes a connected graph.
- We pick an arbitrary vertex s and grow the MST as a cloud of vertices, starting from s.
- We store with each vertex v a label d(v) = the smallest weight of an edge connecting v to a vertex in the cloud.
- At each step:
  - We add to the cloud the vertex u outside the cloud with the smallest distance label.
  - We update the labels of the vertices adjacent to u.

Prim-Jarnik's Algorithm (cont.)
A priority queue stores the vertices outside the cloud (key: distance; element: vertex). Locator-based methods: insert(k,e) returns a locator; replaceKey(l,k) changes the key of an item. We store three labels with each vertex: distance, parent edge in the MST, and locator in the priority queue.

Algorithm PrimJarnikMST(G)
  Q ← new heap-based priority queue
  s ← a vertex of G
  for all v in G.vertices()
    if v = s
      setDistance(v, 0)
    else
      setDistance(v, ∞)
    setParent(v, ∅)
    l ← Q.insert(getDistance(v), v)
    setLocator(v, l)
  while ¬Q.isEmpty()
    u ← Q.removeMin()
    for all e in G.incidentEdges(u)
      z ← G.opposite(u, e)
      r ← weight(e)
      if r < getDistance(z)
        setDistance(z, r)
        setParent(z, e)
        Q.replaceKey(getLocator(z), r)
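The pseudocode above translates into a minimal runnable sketch in Python (an assumption; the slides give pseudocode only), representing the graph as a dictionary mapping each vertex to a dictionary of neighbor-to-weight entries. Instead of locator-based replaceKey it re-inserts vertices and skips stale heap entries ("lazy deletion"), which keeps the same O(m log n) bound.

import heapq

def prim_jarnik(graph, s):
    # graph: dict mapping vertex -> {neighbor: weight}; assumed connected and undirected
    dist = {v: float('inf') for v in graph}   # d(v): cheapest known edge into the cloud
    parent = {v: None for v in graph}         # parent edge in the MST
    dist[s] = 0
    in_cloud = set()
    pq = [(0, s)]                             # (distance label, vertex)
    while pq:
        d, u = heapq.heappop(pq)
        if u in in_cloud:                     # stale entry (lazy deletion)
            continue
        in_cloud.add(u)
        for z, w in graph[u].items():
            if z not in in_cloud and w < dist[z]:
                dist[z] = w
                parent[z] = (u, z)
                heapq.heappush(pq, (w, z))    # re-insert instead of replaceKey
    return [e for e in parent.values() if e is not None]

# Example with hypothetical weights:
# g = {'A': {'B': 2, 'C': 5}, 'B': {'A': 2, 'C': 1}, 'C': {'A': 5, 'B': 1}}
# prim_jarnik(g, 'A')  ->  [('A', 'B'), ('B', 'C')]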

Example
[Figure: four snapshots of the Prim-Jarnik algorithm growing the MST cloud on a weighted graph with vertices A-F, starting from A (d(A) = 0); distance labels are updated as each vertex joins the cloud.]

Example (cont.)
[Figure: final two snapshots; the MST cloud now covers all of A-F.]

Exercise: Prim's MST algorithm
Show how Prim's MST algorithm works on the following graph, assuming you start with SFO, i.e., s = SFO. Show how the MST evolves in each iteration (a separate figure for each iteration).
[Figure: flight-route graph on SFO, ORD, LGA, PVD, HNL, LAX, DFW, MIA.]

Analysis
- Graph operations: method incidentEdges is called once for each vertex.
- Label operations: we set/get the distance, parent, and locator labels of vertex z O(deg(z)) times; setting/getting a label takes O(1) time.
- Priority queue operations: each vertex is inserted into and removed from the priority queue once, where each insertion or removal takes O(log n) time; the key of a vertex w in the priority queue is modified at most deg(w) times, where each key change takes O(log n) time.
- Prim-Jarnik's algorithm runs in O((n + m) log n) time provided the graph is represented by the adjacency list structure. Recall that Σ_v deg(v) = 2m.
- The running time is O(m log n) since the graph is connected.

Kruskal's Algorithm
A priority queue stores the edges outside the cloud (key: weight; element: edge). At the end of the algorithm we are left with one cloud that encompasses the MST, and a tree T which is our MST.

Algorithm KruskalMST(G)
  for each vertex v in G do
    define a Cloud(v) of {v}
  let Q be a priority queue; insert all edges into Q using their weights as the key
  T ← ∅
  while T has fewer than n-1 edges do
    e ← Q.removeMin()
    let u, v be the endpoints of e
    if Cloud(u) ≠ Cloud(v) then
      add edge e to T
      merge Cloud(u) and Cloud(v)
  return T

Data Structure for Kruskal's Algorithm
- The algorithm maintains a forest of trees.
- An edge is accepted if it connects distinct trees.
- We need a data structure that maintains a partition, i.e., a collection of disjoint sets, with the operations:
  - find(u): return the set storing u
  - union(u,v): replace the sets storing u and v with their union

Representation of a Partition
- Each set is stored in a sequence, and each element has a reference back to its set.
- Operation find(u) takes O(1) time and returns the set of which u is a member.
- In operation union(u,v), we move the elements of the smaller set into the sequence of the larger set and update their references; the time for operation union(u,v) is O(min(n_u, n_v)), where n_u and n_v are the sizes of the sets storing u and v.
- Whenever an element is moved, it goes into a set of at least double the size, hence each element is moved at most log n times.
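A small Python sketch of this sequence-based partition (the class and method names are hypothetical; only find and union follow the slide). Each set is a Python list, and the smaller list is always merged into the larger one, so an element is moved at most O(log n) times.

class Partition:
    def __init__(self, elements):
        # each element starts in its own singleton set
        self._set_of = {x: [x] for x in elements}

    def find(self, x):
        # O(1): return the list (set) currently storing x
        return self._set_of[x]

    def union(self, x, y):
        a, b = self._set_of[x], self._set_of[y]
        if a is b:
            return                       # already in the same set
        if len(a) < len(b):              # move the smaller set into the larger
            a, b = b, a
        for elem in b:
            a.append(elem)
            self._set_of[elem] = a

# p = Partition('ABCD'); p.union('A', 'B'); p.find('A') is p.find('B')  ->  True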

Partition-Based Implementation
A partition-based version of Kruskal's algorithm performs cloud merges as unions and cloud tests as finds.

Algorithm Kruskal(G):
  Input: a weighted graph G
  Output: an MST T for G
  let P be a partition of the vertices of G, where each vertex forms a separate set
  let Q be a priority queue storing the edges of G, sorted by their weights
  let T be an initially-empty tree
  while Q is not empty do
    (u,v) ← Q.removeMinElement()
    if P.find(u) ≠ P.find(v) then
      add (u,v) to T
      P.union(u,v)
  return T

Running time: O((n + m) log n)
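For comparison, a self-contained Python sketch of the partition-based Kruskal, assuming the input edges are (weight, u, v) triples; sorting the edge list plays the role of the priority queue, and the size-based merge from the previous slide is inlined.

def kruskal(vertices, edges):
    # edges: iterable of (weight, u, v); returns a list of MST edges (u, v, weight)
    set_of = {v: [v] for v in vertices}          # partition: vertex -> its cloud (a list)
    mst = []
    for w, u, v in sorted(edges):                # sorting replaces the priority queue
        cu, cv = set_of[u], set_of[v]
        if cu is cv:                             # same cloud: edge would create a cycle
            continue
        mst.append((u, v, w))
        if len(cu) < len(cv):                    # merge the smaller cloud into the larger
            cu, cv = cv, cu
        for x in cv:
            cu.append(x)
            set_of[x] = cu
        if len(mst) == len(vertices) - 1:        # MST complete for a connected graph
            break
    return mst

# Example with hypothetical weights:
# kruskal('ABC', [(2, 'A', 'B'), (5, 'A', 'C'), (1, 'B', 'C')])
# ->  [('B', 'C', 1), ('A', 'B', 2)]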

Kruskal Example
[Figure sequence: successive snapshots of Kruskal's algorithm on the flight-route graph (BOS, PVD, ORD, JFK, BWI, SFO, LAX, DFW, MIA), adding edges in order of increasing weight and skipping edges whose endpoints are already in the same cloud, until the MST is complete.]

Exercise: Kruskal's MST algorithm
Show how Kruskal's MST algorithm works on the following graph. Show how the MST evolves in each iteration (a separate figure for each iteration).
[Figure: flight-route graph on SFO, ORD, LGA, PVD, HNL, LAX, DFW, MIA.]

Shortest Paths
[Figure: weighted graph on vertices A-F with distance labels from A (d(A) = 0).]

Outline and Reading
- Weighted graphs (§ 13.5.1): shortest path problem, shortest path properties
- Dijkstra's algorithm (§ 13.5.2): algorithm, edge relaxation
- The Bellman-Ford algorithm
- Shortest paths in DAGs
- All-pairs shortest paths

Weighted Graphs
- In a weighted graph, each edge has an associated numerical value, called the weight of the edge.
- Edge weights may represent distances, costs, etc.
- Example: in a flight-route graph, the weight of an edge represents the distance in miles between the endpoint airports.
[Figure: flight-route graph on SFO, ORD, LGA, PVD, HNL, LAX, DFW, MIA.]

Shortest Path Problem
- Given a weighted graph and two vertices u and v, we want to find a path of minimum total weight between u and v.
- The length of a path is the sum of the weights of its edges.
- Example: shortest path between Providence and Honolulu.
- Applications: Internet packet routing, flight reservations, driving directions.
[Figure: flight-route graph on HNL, SFO, LAX, ORD, DFW, LGA, PVD, MIA.]

Shortest Path Problem
- If there is no path from v to u, we denote the distance between them by d(v, u) = +∞.
- What if there is a negative-weight cycle in the graph?
[Figure: flight-route graph on SFO, ORD, LGA, PVD, HNL, LAX, DFW, MIA.]

Shortest Path Properties
- Property 1: A subpath of a shortest path is itself a shortest path.
- Property 2: There is a tree of shortest paths from a start vertex to all the other vertices.
- Example: tree of shortest paths from Providence.
[Figure: flight-route graph on SFO, ORD, LGA, PVD, HNL, LAX, DFW, MIA with the shortest-path tree from PVD highlighted.]

Dijkstra's Algorithm
- The distance of a vertex v from a vertex s is the length of a shortest path between s and v.
- Dijkstra's algorithm computes the distances of all the vertices from a given start vertex s (single-source shortest paths).
- Assumptions: the graph is connected, the edges are undirected, and the edge weights are nonnegative.
- We grow a cloud of vertices, beginning with s and eventually covering all the vertices.
- We store with each vertex v a label D[v] representing the distance of v from s in the subgraph consisting of the cloud and its adjacent vertices. The label D[v] is initialized to positive infinity.
- At each step:
  - We add to the cloud the vertex u outside the cloud with the smallest distance label D[u].
  - We update the labels of the vertices adjacent to u (i.e., edge relaxation).

Edge Relaxation
Consider an edge e = (u,z) such that u is the vertex most recently added to the cloud and z is not in the cloud. The relaxation of edge e updates distance D[z] as follows:
  D[z] ← min{ D[z], D[u] + weight(e) }
[Figure: before and after relaxing e, with D[u] = 50 and D[y] = 20; relaxation lowers D[z] to 60.]

Example
1. Pull in one of the vertices with red labels.
2. The relaxation of edges updates the labels shown in larger font.
[Figure: four snapshots of Dijkstra's algorithm on a weighted graph with vertices A-F, starting from A (D[A] = 0); distance labels decrease as edges are relaxed.]

Example (cont.)
[Figure: final two snapshots; all vertices are in the cloud with their shortest distances from A.]

Exercise: Dijkstra's algorithm
Show how Dijkstra's algorithm works on the following graph, assuming you start with SFO, i.e., s = SFO. Show how the labels are updated in each iteration (a separate figure for each iteration).
[Figure: flight-route graph on SFO, ORD, LGA, PVD, HNL, LAX, DFW, MIA.]

Dijkstra's Algorithm
A priority queue stores the vertices outside the cloud (key: distance; element: vertex). Locator-based methods: insert(k,e) returns a locator; replaceKey(l,k) changes the key of an item. We store two labels with each vertex: distance (the D[v] label) and locator in the priority queue.

Algorithm DijkstraDistances(G, s)
  Q ← new heap-based priority queue
  for all v in G.vertices()                  { O(n) iterations, O(log n) each }
    if v = s
      setDistance(v, 0)
    else
      setDistance(v, ∞)
    l ← Q.insert(getDistance(v), v)
    setLocator(v, l)
  while ¬Q.isEmpty()                         { O(n) iterations, O(log n) each }
    { pull a new vertex u into the cloud }
    u ← Q.removeMin()
    for all e in G.incidentEdges(u)          { Σ_u deg(u) iterations, O(log n) each }
      { relax edge e }
      z ← G.opposite(u, e)
      r ← getDistance(u) + weight(e)
      if r < getDistance(z)
        setDistance(z, r)
        Q.replaceKey(getLocator(z), r)
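A minimal Python sketch of DijkstraDistances under the same assumptions as the earlier Prim sketch (adjacency-dictionary graph, nonnegative weights); lazy deletion again stands in for the locator-based replaceKey, so the asymptotic bound is unchanged.

import heapq

def dijkstra_distances(graph, s):
    # graph: dict mapping vertex -> {neighbor: weight}, all weights >= 0
    D = {v: float('inf') for v in graph}
    D[s] = 0
    pq = [(0, s)]                       # (distance label, vertex)
    cloud = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in cloud:
            continue                    # stale entry (lazy deletion)
        cloud.add(u)                    # pull u into the cloud
        for z, w in graph[u].items():   # relax each edge (u, z)
            r = d + w
            if r < D[z]:
                D[z] = r
                heapq.heappush(pq, (r, z))
    return D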

Analysis
- Graph operations: method incidentEdges is called once for each vertex.
- Label operations: we set/get the distance and locator labels of vertex z O(deg(z)) times; setting/getting a label takes O(1) time.
- Priority queue operations: each vertex is inserted into and removed from the priority queue once, where each insertion or removal takes O(log n) time; the key of a vertex w in the priority queue is modified at most deg(w) times, where each key change takes O(log n) time.
- Dijkstra's algorithm runs in O((n + m) log n) time provided the graph is represented by the adjacency list structure. Recall that Σ_v deg(v) = 2m.
- The running time can also be expressed as O(m log n) since the graph is connected.
- Expressed as a function of n alone, the running time is O(n² log n).

Exercise: Analysis
Analyze the running time of the following variant, which uses an unsorted-sequence-based priority queue and an adjacency list.

Algorithm DijkstraDistances(G, s)
  Q ← new unsorted-sequence-based priority queue
  for all v in G.vertices()
    if v = s
      setDistance(v, 0)
    else
      setDistance(v, ∞)
    l ← Q.insert(getDistance(v), v)          { O(1) }
    setLocator(v, l)
  while ¬Q.isEmpty()
    { pull a new vertex u into the cloud }
    u ← Q.removeMin()
    for all e in G.incidentEdges(u)          { use adjacency list }
      { relax edge e }
      z ← G.opposite(u, e)
      r ← getDistance(u) + weight(e)
      if r < getDistance(z)
        setDistance(z, r)
        Q.replaceKey(getLocator(z), r)

Extension
Using the template method pattern, we can extend Dijkstra's algorithm to return a tree of shortest paths from the start vertex to all other vertices. We store with each vertex a third label: the parent edge in the shortest path tree. In the edge relaxation step, we update the parent label.

Algorithm DijkstraShortestPathsTree(G, s)
  { as DijkstraDistances, with these additions }
  for all v in G.vertices()
    setParent(v, ∅)
  ...
  for all e in G.incidentEdges(u)
    { relax edge e }
    z ← G.opposite(u, e)
    r ← getDistance(u) + weight(e)
    if r < getDistance(z)
      setDistance(z, r)
      setParent(z, e)
      Q.replaceKey(getLocator(z), r)

Why Dijkstra's Algorithm Works
- Dijkstra's algorithm is based on the greedy method: it adds vertices by increasing distance.
- Suppose it did not find all shortest distances, and let F be the first wrong vertex the algorithm processed.
- When the previous node D on the true shortest path was considered, its distance was correct.
- But the edge (D,F) was relaxed at that time! Thus, as long as D[F] > D[D], F's distance cannot be wrong. That is, there is no wrong vertex.
[Figure: weighted graph with source s and vertices B-F illustrating the argument.]

Why It Doesn't Work for Negative-Weight Edges
- Dijkstra's algorithm is based on the greedy method: it adds vertices by increasing distance.
- If a node with a negative incident edge were to be added late to the cloud, it could mess up distances for vertices already in the cloud.
[Figure: two snapshots of a graph on vertices A-F with a negative-weight edge (weight -3) into C; C's true distance is 1, but it is already in the cloud with D[C] = 2!]

Bellman-Ford Algorithm
- Works even with negative-weight edges.
- Must assume directed edges (otherwise we would have negative-weight cycles).
- Iteration i finds all shortest paths that use i edges.
- Running time: O(nm).
- Can be extended to detect a negative-weight cycle if it exists. How?

Algorithm BellmanFord(G, s)
  for all v in G.vertices()
    if v = s
      setDistance(v, 0)
    else
      setDistance(v, ∞)
  for i ← 1 to n-1 do
    for each e in G.edges()
      { relax edge e }
      u ← G.origin(e)
      z ← G.opposite(u, e)
      r ← getDistance(u) + weight(e)
      if r < getDistance(z)
        setDistance(z, r)
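A direct Python sketch of the pseudocode, assuming the digraph is given as a list of (u, v, weight) triples. The extra pass at the end is one possible answer to the "How?" above: any edge that can still be relaxed after n-1 rounds witnesses a negative-weight cycle reachable from s.

def bellman_ford(vertices, edges, s):
    # edges: list of directed (u, v, weight) triples
    D = {v: float('inf') for v in vertices}
    D[s] = 0
    for _ in range(len(vertices) - 1):       # n-1 rounds of relaxing every edge
        for u, v, w in edges:
            if D[u] + w < D[v]:
                D[v] = D[u] + w
    # one more pass: any further improvement implies a negative-weight cycle
    has_negative_cycle = any(D[u] + w < D[v] for u, v, w in edges)
    return D, has_negative_cycle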

Bellman-Ford Example
[Figure: snapshots of Bellman-Ford on a directed graph with some negative edge weights; nodes are labeled with their d(v) values after the first, second, and third rounds.]

Exercise: Bellman-Ford's algorithm
Show how the Bellman-Ford algorithm works on the following graph, assuming you start with the top node. Show how the labels are updated in each iteration (a separate figure for each iteration).
[Figure: directed graph with some negative edge weights.]

DAG-based Algorithm
- Works even with negative-weight edges.
- Uses a topological order.
- Is much faster than Dijkstra's algorithm.
- Running time: O(n + m).

Algorithm DagDistances(G, s)
  for all v in G.vertices()
    if v = s
      setDistance(v, 0)
    else
      setDistance(v, ∞)
  perform a topological sort of the vertices
  for u ← 1 to n do    { in topological order }
    for each e in G.outEdges(u)
      { relax edge e }
      z ← G.opposite(u, e)
      r ← getDistance(u) + weight(e)
      if r < getDistance(z)
        setDistance(z, r)
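A Python sketch under the assumption that the DAG is an adjacency dictionary in which every vertex (including sinks) appears as a key; the topological order is computed by an iterative depth-first search, and each edge is then relaxed exactly once, for O(n + m) total work.

def dag_distances(graph, s):
    # graph: dict mapping vertex -> {neighbor: weight}; must be a DAG
    # 1. topological sort via iterative DFS (reversed post-order)
    order, visited = [], set()
    for start in graph:
        if start in visited:
            continue
        visited.add(start)
        stack = [(start, iter(graph[start]))]
        while stack:
            v, it = stack[-1]
            nxt = next(it, None)
            if nxt is None:                 # all out-neighbors done: post-order
                order.append(v)
                stack.pop()
            elif nxt not in visited:
                visited.add(nxt)
                stack.append((nxt, iter(graph[nxt])))
    order.reverse()
    # 2. relax outgoing edges in topological order
    D = {v: float('inf') for v in graph}
    D[s] = 0
    for u in order:
        for z, w in graph[u].items():
            if D[u] + w < D[z]:
                D[z] = D[u] + w
    return D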

DAG Example
[Figure: snapshots of the DAG-based algorithm on a directed acyclic graph with some negative edge weights; vertices are processed in topological order, nodes are labeled with their d(v) values, and the final snapshot combines two steps.]

Exercise: DAG-based Algorithm
Show how the DAG-based algorithm works on the following graph, assuming you start with the second rightmost node. Show how the labels are updated in each iteration (a separate figure for each iteration).
[Figure: directed acyclic graph with some negative edge weights.]

Summary of Shortest-Path Algorithms
- Breadth-first search
- Dijkstra's algorithm (§ 13.5.2): algorithm, edge relaxation
- The Bellman-Ford algorithm
- Shortest paths in DAGs

All-Pairs Shortest Paths
- Find the distance between every pair of vertices in a weighted directed graph G.
- We can make n calls to Dijkstra's algorithm (if there are no negative edges), which takes O(nm log n) time.
- Likewise, n calls to Bellman-Ford would take O(n²m) time.
- We can achieve O(n³) time using dynamic programming (similar to the Floyd-Warshall algorithm), where D_k[i,j] is the shortest distance from i to j using only intermediate vertices numbered 1,...,k.

Algorithm AllPair(G)    { assumes vertices 1,...,n }
  for all vertex pairs (i,j)
    if i = j
      D_0[i,i] ← 0
    else if (i,j) is an edge in G
      D_0[i,j] ← weight of edge (i,j)
    else
      D_0[i,j] ← +∞
  for k ← 1 to n do
    for i ← 1 to n do
      for j ← 1 to n do
        D_k[i,j] ← min{ D_{k-1}[i,j], D_{k-1}[i,k] + D_{k-1}[k,j] }
  return D_n
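A Python sketch of the recurrence, assuming vertices are numbered 0..n-1, edge weights are supplied as a dictionary keyed by (i, j), and there are no negative-weight cycles; it reuses a single matrix in place of D_0..D_n, the usual space optimization of the same O(n³) computation.

def all_pairs_shortest_paths(n, weight):
    # weight: dict mapping (i, j) -> edge weight for each directed edge
    INF = float('inf')
    # D[i][j] starts as the direct edge weight (0 on the diagonal, INF if no edge)
    D = [[0 if i == j else weight.get((i, j), INF) for j in range(n)]
         for i in range(n)]
    for k in range(n):                  # allow k as an intermediate vertex
        for i in range(n):
            for j in range(n):
                if D[i][k] + D[k][j] < D[i][j]:
                    D[i][j] = D[i][k] + D[k][j]
    return D

# Example with hypothetical weights:
# all_pairs_shortest_paths(3, {(0, 1): 4, (1, 2): 1, (0, 2): 7})[0][2]  ->  5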