Last week: Breadth-First Search


1 Last week: Breadth-First Search
  Set L_i = [] for i = 1, …, n
  L_0 = {w}, where w is the start node
  For i = 0, …, n−1:
    For u in L_i:
      For each v which is a neighbor of u:
        If v isn't yet visited: mark v as visited, and put it in L_{i+1}
Think about this: how come BFS finds the shortest path? There are exponentially many candidates to check. (Figure: the layers L_0, L_1, L_2, L_3.)

Lecture 10 Dynamic Programming and Floyd-Warshall!

3 Last week: Graphs! DFS, topological sorting, strongly connected components. BFS: shortest paths from u in unweighted graphs, i.e., all edges were identical.

4 Last week: Breadth-First Search. Exploring the world with a bird's-eye view. (Figure legend: not been there yet; can reach there in zero steps from the start; can reach there in one step; in two steps; in three steps.)

9 Last week: Breadth-First Search. Exploring the world with pseudocode. L_i is the set of nodes we can reach in i steps from w.
  Set L_i = [] for i = 1, …, n
  L_0 = {w}, where w is the start node
  For i = 0, …, n−1:
    For u in L_i:
      For each v which is a neighbor of u:
        If v isn't yet visited: mark v as visited, and put it in L_{i+1}
Go through all the nodes in L_i and add their unvisited neighbors to L_{i+1}. (Figure: the layers L_0, L_1, L_2, L_3.)
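The layered pseudocode above can be written as runnable Python. This is a minimal sketch, assuming the graph is given as an adjacency-list dict; the function and variable names are my own:

```python
def bfs_layers(adj, w):
    """Return layers[i] = the nodes reachable in exactly i steps from w."""
    n = len(adj)
    layers = [[] for _ in range(n)]
    layers[0] = [w]
    visited = {w}
    for i in range(n - 1):
        for u in layers[i]:
            for v in adj[u]:
                if v not in visited:        # first time we see v: it belongs to layer i+1
                    visited.add(v)
                    layers[i + 1].append(v)
    return layers

# Example: the path graph 0 - 1 - 2 - 3
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(bfs_layers(adj, 0))  # [[0], [1], [2], [3]]
```

Each node is put into exactly one layer, which is why BFS finds shortest paths without enumerating exponentially many candidates.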

10 Today: Dynamic programming. All-pairs shortest paths (APSP). What if the graphs are weighted? The Floyd-Warshall algorithm!

11 Example: Fibonacci numbers. Definition: F(n) = F(n−1) + F(n−2), with F(0) = F(1) = 1. The first several are: 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, … Question: given n, what is F(n)?

12 Candidate algorithm:
  def Fibonacci(n):
    if n == 0 or n == 1:
      return 1
    return Fibonacci(n-1) + Fibonacci(n-2)
Running time? T(n) = T(n−1) + T(n−2) + O(1), and T(n) ≥ T(n−1) + T(n−2) for n ≥ 2. So T(n) grows at least as fast as the Fibonacci numbers themselves. Fun fact: that's like φⁿ, where φ = (1+√5)/2 is the golden ratio. Aka, EXPONENTIALLY QUICKLY. (See CLRS Problem 4-4 for a walkthrough of how fast the Fibonacci numbers grow!)
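Here is the candidate algorithm as runnable Python, with a call counter added (the counter is my addition, not on the slide) to make the exponential blow-up visible:

```python
calls = 0  # counts recursive calls (added for illustration only)

def fibonacci(n):
    global calls
    calls += 1
    if n == 0 or n == 1:
        return 1
    return fibonacci(n - 1) + fibonacci(n - 2)

print(fibonacci(8))  # 34
print(calls)         # 67 calls just to compute F(8)
```

The call count C(n) satisfies the same kind of recurrence, C(n) = C(n−1) + C(n−2) + 1, so it too grows like φⁿ.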

13 What's going on? Consider Fib(8). (Figure: the recursion tree for Fib(8); the same subproblems Fib(6), Fib(5), …, Fib(0) appear over and over.) That's a lot of repeated computation!

14 Maybe this would be better:
  def fasterFibonacci(n):
    F = [1, 1, None, None, …, None]  # F has length n+1
    for i = 2, …, n:
      F[i] = F[i-1] + F[i-2]
    return F[n]
Much better running time! (Figure: the values F(0), F(1), …, F(8) are each computed once, from the bottom up.)
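A runnable version of fasterFibonacci, as a sketch: the list is sized n+1 so that F[n] exists, and Python list syntax replaces the slide's ellipsis.

```python
def faster_fibonacci(n):
    F = [1, 1] + [None] * (n - 1)   # F[0..n], so length n+1
    for i in range(2, n + 1):
        F[i] = F[i - 1] + F[i - 2]  # each entry computed once, from the table
    return F[n]

print(faster_fibonacci(8))  # 34, using O(n) additions instead of exponential time
```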

15 This was an example of…

16 What is dynamic programming? It is an algorithm design paradigm, like divide-and-conquer is an algorithm design paradigm. Usually it is for solving optimization problems, e.g., shortest path. (Fibonacci numbers aren't an optimization problem, but they are a good example.)

17 Elements of dynamic programming. 1. Optimal sub-structure: big problems break up into sub-problems (Fibonacci: F(i) for i ≤ n). The solution to a problem can be expressed in terms of solutions to smaller sub-problems (Fibonacci: F(i+1) = F(i) + F(i−1)).

18 Elements of dynamic programming 2. Overlapping sub-problems: The sub-problems overlap a lot. Fibonacci: Lots of different F[j] will use F[i]. This means that we can save time by solving a sub-problem just once and storing the answer.

19 Elements of dynamic programming. Optimal substructure: optimal solutions to sub-problems are sub-solutions to the optimal solution of the original problem. Overlapping subproblems: the subproblems show up again and again. Using these properties, we can design a dynamic programming algorithm: keep a table of solutions to the smaller problems; use the solutions in the table to solve bigger problems; at the end we can use the information we collected along the way to find the solution to the whole thing.

20 Two ways to think about and/or implement DP algorithms: top-down and bottom-up.

21 Bottom-up approach: what we just saw. For Fibonacci: solve the small problems first (fill in F[0], F[1]), then bigger problems (fill in F[2], …, F[n−1]), then finally solve the real problem (fill in F[n]).

22 Top-down approach: think of it like a recursive algorithm. To solve the big problem, recurse to solve smaller problems; those recurse to solve smaller problems; etc. The difference from divide and conquer: memo-ization. Keep track of what small problems you've already solved, to prevent re-solving the same problem twice.

23 Example of top-down Fibonacci:
  define a global list F = [1, 1, None, None, …, None]
  def Fibonacci(n):
    if F[n] != None:
      return F[n]
    else:
      F[n] = Fibonacci(n-1) + Fibonacci(n-2)
      return F[n]
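The slide's top-down code, made runnable as a sketch. The cap N_MAX is my assumption, since the slide elides the list's length:

```python
N_MAX = 100                        # assumed cap on n (not specified on the slide)
F = [1, 1] + [None] * (N_MAX - 1)  # global memo table F[0..N_MAX]

def fibonacci(n):
    if F[n] is not None:           # memo hit: this sub-problem is already solved
        return F[n]
    F[n] = fibonacci(n - 1) + fibonacci(n - 2)
    return F[n]

print(fibonacci(8))  # 34
```

Each F[i] is filled in at most once, so the total work is O(n) despite the recursive shape.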

24 Memo-ization visualization. (Figure: the recursion tree for Fib(8) from before.)

25 Memo-ization visualization, ctd. (Same code as the previous slide; now each of the values Fib(0), …, Fib(8) is computed only once, and the repeated subtrees are never explored.)

26 What have we learned? Dynamic programming is a paradigm in algorithm design. It uses optimal substructure and overlapping subproblems, and can be implemented bottom-up or top-down. It's a fancy name for a pretty common-sense idea:

27 Wait, what about BFS? Is it also dynamic programming? Think-Pair-Share!
  Set L_i = [] for i = 1, …, n
  L_0 = {w}, where w is the start node
  For i = 0, …, n−1:
    For u in L_i:
      For each v which is a neighbor of u:
        If v isn't yet visited: mark v as visited, and put it in L_{i+1}
(Figure: the layers L_0, L_1, L_2, L_3.)

28 Why "dynamic programming"? "Programming" refers to finding the optimal program: a shortest route is a plan, aka a program. "Dynamic" refers to the fact that it's multi-stage. But also, it's just a fancy-sounding name. Manipulating computer code in an action movie?

29 Why "dynamic programming"? Richard Bellman invented the name in the 1950s. At the time, he was working for the RAND Corporation, which was basically working for the Air Force, and government projects needed flashy names to get funded. From Bellman's autobiography: "It's impossible to use the word dynamic in a pejorative sense… I thought dynamic programming was a good name. It was something not even a Congressman could object to."

30 Today: Dynamic programming. All-pairs shortest paths (APSP). What if the graphs are weighted? The Floyd-Warshall algorithm!

31 All-Pairs Shortest Paths (APSP). That is, I want to know the shortest path from u to v for ALL pairs u, v of vertices in the graph, not just from a special single source s. (Figure: a four-vertex example graph on s, u, v, t, with its source-by-destination table of pairwise distances, e.g. d(s,u) = 1, d(s,t) = 2.)

32 All-Pairs Shortest Paths (APSP), ctd. Candidate algorithm: run BFS from every node! Complexity: O(n² + nm). Can we do better? In theory, yes: there are faster algorithms that use fast matrix multiplication (which you saw in Section 1). Learn faster APSP algorithms in CS367.

33 All-Pairs Shortest Paths (APSP), ctd. Candidate algorithm: run BFS from every node! Complexity: O(n² + nm). Can we do better? Instead, today we'll ask: can we do more?
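The candidate algorithm ("run BFS from every node") as a runnable sketch, assuming an unweighted adjacency-list graph; the names are my own:

```python
from collections import deque

def bfs_distances(adj, s):
    """Unweighted single-source shortest distances from s, in O(n + m)."""
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def apsp_unweighted(adj):
    """All-pairs distances by running BFS from every node: O(n^2 + nm) total."""
    return {s: bfs_distances(adj, s) for s in adj}

adj = {'s': ['u', 'v'], 'u': ['s', 'v'], 'v': ['s', 'u', 't'], 't': ['v']}
print(apsp_unweighted(adj)['s']['t'])  # 2 (s -> v -> t)
```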

34 Today: Dynamic programming. All-pairs shortest paths (APSP). What if the graphs are weighted? The Floyd-Warshall algorithm!

35 (Figure: a campus map with Caltrain, the Hospital, the Stadium, Packard, Gates, STLC, the Union, and the Dish; "YOU ARE HERE" marks Gates.)

36 Just the graph. How do I get from Gates to the Union? (Figure: the map as a graph with weighted edges between neighboring locations.) Run BFS: "I should go to the Dish and then back to the Union!" That doesn't make sense if I label the edges by walking time.

37 Just the graph. How do I get from Gates to the Union? This is a weighted graph: w(u,v) = the weight of the edge between u and v. (Figure: the same graph, with the edge weights labeled.) If I pay attention to the weights, I should go to STLC, then the Union.

38 Shortest path problem. What is the shortest path between u and v in a weighted graph? The cost of a path is the sum of the weights along that path, and the shortest path is the one with the minimum cost. The distance d(u,v) between two vertices u and v is the cost of the shortest path between u and v. For this lecture all graphs are directed, but to save on notation I'm just going to draw undirected edges. (Figure: a graph with two paths from s to t, one of cost 25 and a shorter one of cost 5.)

39 Recall: a weighted directed graph. Weights on edges represent costs; the cost of a path is the sum of the weights along that path. A shortest path from s to t is a directed path from s to t with the smallest cost. (Figure: an example graph with a path from s to t of cost 22, and a path of cost 10 which is the shortest path from s to t.)

40 Weighted All-Pairs Shortest Paths (APSP). That is, I want to know the shortest path from u to v for ALL pairs u, v of vertices in the graph, not just from a special single source s. (Figure: a four-vertex weighted example graph on s, u, v, t, with its source-by-destination table of pairwise distances; note some edge weights are negative, e.g. −2.)

41 Today: Dynamic programming. All-pairs shortest paths (APSP). What if the graphs are weighted? The Floyd-Warshall algorithm!

42 [Intuition slide 1/3] Weighted APSP: how can we generalize BFS to weighted graphs? Think-Pair-Share!
1. What are the sub-problems of BFS? P(u,v,k): is distance(u,v) ≤ k? Recursion: P(u,v,k) = max over edges (w,v) ∈ E of P(u,w,k−1).
2. How can you re-word your answer to #1 so that it makes sense for weighted graphs? D(u,v,k): the shortest path from u to v using ≤ k vertices? Recursion: D(u,v,k) = min over edges (w,v) ∈ E of D(u,w,k−1) + cost(w,v).

43 [Intuition slide 2/3] Weighted APSP: how can we generalize BFS to weighted graphs?
1. What are the sub-problems of BFS? L(u,v,i): is distance(u,v) ≤ i? Recursion: L(u,v,i) = max over edges (w,v) ∈ E of L(u,w,i−1).
2. How can you re-word your answer to #1 so that it makes sense for weighted graphs? D(u,v,k): the shortest path from u to v using ≤ k vertices? Recursion: D(u,v,k) = min over edges (w,v) ∈ E of D(u,w,k−1) + cost(w,v).

44 [Intuition slide 3/3] Is this a good idea? Recall: D(u,v,k) is the shortest path from u to v using ≤ k vertices, with recursion D(u,v,k) = min over edges (w,v) ∈ E of D(u,w,k−1) + cost(w,v). It's actually not a bad starting point (next week we'll see it's called "Bellman-Ford"). But for (u,v,k) we're only using (u,w,k−1) sub-problems, even though we also know (w,v,k−1); and enumerating over deg(v) sub-problems is expensive. Complexity: O(n²m). Goals for the next slide: 1. O(n³) time! 2. Consider only O(1) sub-problems. 3. Use different-source sub-problems.

45 [Actual algorithm] Optimal substructure. Label the vertices 1, 2, …, n. Sub-problem(k−1): for all pairs u, v, find the cost of the shortest path from u to v such that all the internal vertices on that path are in {1, …, k−1}. Let D^(k−1)[u,v] be the solution to Sub-problem(k−1). (Figure: the shortest path from u to v through the set {1, …, k−1}; it has length D^(k−1)[u,v]. Some edges are omitted in the picture.)

46 Optimal substructure, ctd. (Same setup as the previous slide.) Question: how can we find D^(k)[u,v] using D^(k−1)?

47 How can we find D^(k)[u,v] using D^(k−1)? D^(k)[u,v] is the cost of the shortest path from u to v so that all internal vertices on that path are in {1, …, k}.

48 How can we find D^(k)[u,v] using D^(k−1)? Case 1: we don't need vertex k. Then D^(k)[u,v] = D^(k−1)[u,v].

49 How can we find D^(k)[u,v] using D^(k−1)? Case 2: we need vertex k.

50 Case 2, continued. Suppose there are no negative cycles. Then WLOG the shortest path from u to v through {1, …, k} is simple. If that path passes through k, it must look like this: the shortest path from u to k through {1, …, k−1}, followed by the shortest path from k to v through {1, …, k−1} (sub-paths of shortest paths are shortest paths). So D^(k)[u,v] = D^(k−1)[u,k] + D^(k−1)[k,v].

51 How can we find D^(k)[u,v] using D^(k−1)? D^(k)[u,v] = min{ D^(k−1)[u,v], D^(k−1)[u,k] + D^(k−1)[k,v] }. Case 1: the cost of the shortest path through {1, …, k−1}. Case 2: the cost of the shortest path from u to k and then from k to v, through {1, …, k−1}. Optimal substructure: we can solve the big problem using smaller problems. Overlapping sub-problems: D^(k−1)[k,v] can be used to help compute D^(k)[u,v] for lots of different u's.

52 How can we find D^(k)[u,v] using D^(k−1)? D^(k)[u,v] = min{ D^(k−1)[u,v], D^(k−1)[u,k] + D^(k−1)[k,v] }. Using our paradigm, this immediately gives us an algorithm!

53 Floyd-Warshall algorithm:
  Initialize n-by-n arrays D^(k) for k = 0, …, n
    D^(k)[u,u] = 0 for all u, for all k
    D^(k)[u,v] = ∞ for all u ≠ v, for all k
    D^(0)[u,v] = weight(u,v) for all (u,v) in E
  For k = 1, …, n:
    For all pairs (u,v) in V²:
      D^(k)[u,v] = min{ D^(k−1)[u,v], D^(k−1)[u,k] + D^(k−1)[k,v] }
  Return D^(n)
The base case checks out: the only paths through zero other vertices are edges directly from u to v. This is a bottom-up algorithm.
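The pseudocode above as a runnable Python sketch. As the next slide notes, we don't need to keep all n+1 arrays, so this version keeps only the current D; the graph encoding and names are my own:

```python
INF = float('inf')

def floyd_warshall(n, weight):
    """weight maps each directed edge (u, v) to its cost; vertices are 0..n-1."""
    # Base case D^(0): direct edges only (0 on the diagonal, infinity otherwise).
    D = [[0 if u == v else weight.get((u, v), INF) for v in range(n)]
         for u in range(n)]
    for k in range(n):
        # D^(k) from D^(k-1): either skip vertex k, or go u -> k -> v.
        D = [[min(D[u][v], D[u][k] + D[k][v]) for v in range(n)]
             for u in range(n)]
    return D

# Example: edges 0->1 (cost 4), 0->2 (cost 1), 2->1 (cost 2).
D = floyd_warshall(3, {(0, 1): 4, (0, 2): 1, (2, 1): 2})
print(D[0][1])  # 3: going through vertex 2 beats the direct edge of cost 4
```

Building each new D from the previous one in a fresh list comprehension mirrors the slide's D^(k−1) → D^(k) update while using only O(n²) space.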

54 We've basically just shown. Theorem: if there are no negative cycles in a weighted directed graph G, then the Floyd-Warshall algorithm, running on G, returns a matrix D^(n) so that D^(n)[u,v] = the distance between u and v in G. Running time: O(n³); for dense graphs (m = Θ(n²)), that's as good as running BFS from every vertex. Storage: we need to store two n-by-n arrays and the original graph; we don't really need to store all n of the D^(k). Work out the details of the proof! (Or see the lecture notes for a few more details.)

55 What if there are negative cycles? Floyd-Warshall can detect negative cycles: there is a negative cycle if and only if there is some v with a path from v to v, whose internal vertices may be any of the n vertices, with cost < 0; equivalently, if and only if D^(n)[v,v] < 0 for some v. Algorithm: run Floyd-Warshall as before; if there is some v so that D^(n)[v,v] < 0, return "negative cycle".
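The detection rule as a runnable sketch: the same Floyd-Warshall loop as before, followed by a check of the diagonal (graph encoding and names are my own):

```python
INF = float('inf')

def has_negative_cycle(n, weight):
    """True iff the directed graph on vertices 0..n-1 contains a negative cycle."""
    D = [[0 if u == v else weight.get((u, v), INF) for v in range(n)]
         for u in range(n)]
    for k in range(n):
        D = [[min(D[u][v], D[u][k] + D[k][v]) for v in range(n)]
             for u in range(n)]
    # A negative cycle exists iff some vertex can reach itself with cost < 0.
    return any(D[v][v] < 0 for v in range(n))

print(has_negative_cycle(2, {(0, 1): 1, (1, 0): -2}))  # True: 0 -> 1 -> 0 costs -1
print(has_negative_cycle(2, {(0, 1): 1, (1, 0): 1}))   # False
```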

56 What have we learned? The Floyd-Warshall algorithm is another example of dynamic programming. It computes all-pairs shortest paths in a directed weighted graph in time O(n³).

57 Another example of DP? Longest simple path (say all edge weights are 1). (Figure: a small graph on s, a, b, t.) What is the longest simple path from s to t?

58 This is an optimization problem. Can we use dynamic programming? Optimal substructure? Is the longest path from s to t = the longest path from s to a + the longest path from a to t? NOPE!

59 This doesn't give optimal sub-structure. Optimal solutions to subproblems don't give us an optimal solution to the big problem (at least if we try to do it this way). The subproblems we came up with aren't independent: once we've chosen the longest path from a to t, which uses b, our longest path from s to a shouldn't be allowed to use b, since b was already used. Actually, the longest simple path problem is NP-complete: we don't know of any polynomial-time algorithms for it, DP or otherwise!

60 Recap. Floyd-Warshall for weighted all-pairs shortest paths. Dynamic programming! This is a fancy name for: break up an optimization problem into smaller sub-problems; the optimal solutions to the sub-problems should be sub-solutions to the original problem; build the optimal solution iteratively by filling in a table of sub-solutions; take advantage of overlapping sub-problems!

61 Next time: more examples of dynamic programming! We will stop bullets with our action-packed coding skills, and also maybe find longest common subsequences. Before next time: go to section! No HW this week!!