Algorithms and Data Structures 2014 Exercises and Solutions Week 9

November 26, 2014

1 Directed acyclic graphs

We are given a sequence (array) of numbers, and we would like to find the longest increasing subsequence (abbreviated to LIS). A LIS is a subsequence in which the elements are in sorted order, lowest to highest, and in which the number of elements is maximal. It is not allowed to change the order of elements; the subsequence itself, however, is not necessarily contiguous. For example, a LIS of 5, 2, 8, 6, 3, 6, 9, 7 is 2, 3, 6, 9. Note that a LIS is not always unique: in our example the subsequence 2, 3, 6, 7 is also a LIS.

A (very) naive algorithm would examine all 2^N possible subsequences. A smarter solution can be obtained by using a DAG (directed acyclic graph). In this DAG we consider the array indices as vertices, and add an edge (s, d) to the graph if the element at index s is less than the element at index d and s occurs before d in the sequence. The resulting DAG for our example sequence is shown in the following figure.

[Figure: the DAG for 5, 2, 8, 6, 3, 6, 9, 7, with an edge from each element to every strictly larger element that occurs later in the sequence.]

In this figure we have replaced the indices by the corresponding elements.

1. Give a representation of a DAG in your favourite programming language. It should contain an operation to construct a DAG for a given sequence. Use the adjacency list method to store edges.

2. Reformulate the original problem in terms of a DAG property, and give a recursive solution.

3. Give an efficient implementation for your solution.

Solution

1. Using the adjacency list method, a graph is a tuple (V, Adj), where V is a set of vertices and Adj[v] assigns to a vertex v ∈ V the set (or a list) of vertices that are adjacent to v.

Given an array A of n numbers, we define the index set [n] = {1, 2, ..., n} and construct the DAG D_A = ([n], Adj_A) by setting, for all i ∈ [n],

    Adj_A[i] = { j ∈ [n] | j > i and A[j] > A[i] }.

2. The increasing subsequences in A are precisely the paths in D_A. We can find a longest path in an arbitrary DAG (V, Adj) using a DFS. The main task is to find the longest path starting at a given vertex, which is achieved by the following pseudocode.

    function LongestPath(Adj, v)
        l ← [ ]
        for w ∈ Adj[v] do
            s ← LongestPath(Adj, w)
            if length(s) > length(l) then
                l ← s
        return [v : l]

A path is represented as the list of vertices that it passes through. We write [ ] for the empty list and [v : l] for the list that results from prepending the vertex v to the list l. The function length provides the length of a given list. To solve the full problem, we call LongestPath(Adj, v) for all v ∈ V and determine the longest of the resulting paths.

3. To make the implementation more efficient, we memoize our function.

    function LongestPath(Adj, v)
        if M[v] is defined then
            return M[v]
        l ← [ ]
        for w ∈ Adj[v] do
            s ← LongestPath(Adj, w)
            if length(s) > length(l) then
                l ← s
        M[v] ← [v : l]
        return M[v]

We may assume that lists keep track of their lengths, so that the length function executes in constant time; the time required for the full algorithm is then linear in the number of vertices and edges, and the original problem is now solved in O(n^2). A concrete Python sketch of this solution is given below.

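The following Python sketch combines parts 1 and 3: it builds the adjacency-list DAG for a given sequence and then finds a longest path by memoized DFS. It is only one possible realisation; the 0-based indices, the dict-of-lists representation of Adj, and the names build_lis_dag and longest_increasing_subsequence are our own choices, since the exercise does not prescribe a language.

    # Part 1: the DAG of the exercise for a sequence a, with 0-based indices
    # and a dict of adjacency lists as the representation.
    def build_lis_dag(a):
        n = len(a)
        return {i: [j for j in range(i + 1, n) if a[j] > a[i]] for i in range(n)}

    # Part 3: longest increasing subsequence via a memoized longest-path search.
    def longest_increasing_subsequence(a):
        adj = build_lis_dag(a)
        memo = {}                           # memo[v] = longest path starting at v

        def longest_path(v):
            if v in memo:
                return memo[v]
            best = []
            for w in adj[v]:
                s = longest_path(w)
                if len(s) > len(best):
                    best = s
            memo[v] = [v] + best
            return memo[v]

        # Try every vertex as a starting point and keep the longest path found.
        best = max((longest_path(v) for v in range(len(a))), key=len, default=[])
        return [a[i] for i in best]

    # Example from the exercise; prints [2, 3, 6, 9].
    print(longest_increasing_subsequence([5, 2, 8, 6, 3, 6, 9, 7]))

Building the DAG takes Θ(n^2) time in the worst case, and the memoized search touches every vertex and edge at most once, so the overall running time matches the O(n^2) bound stated above.
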
2 Strongly connected components

A directed graph is said to be strongly connected if every vertex is reachable from every other vertex. Any directed graph G can be divided into strongly connected components, i.e. maximal subgraphs of G that are strongly connected. Here maximal means that such a subgraph cannot be extended by adding vertices without breaking the property of being strongly connected. Obviously, if the whole graph G is strongly connected, then it contains only one strongly connected component, namely G itself. The following figure shows an example.

[Figure: an example directed graph partitioned into its strongly connected components.]

1. Give an algorithm that divides a graph G into strongly connected components. The result of this algorithm should again be a graph, say G_S, in which each vertex represents a strongly connected component. There is an edge (s, d) in G_S if the original graph G contains an edge (u, v) such that u occurs in component s, and v in component d. Is it possible that G_S contains cycles?

Hint: Perform a DFS-traversal to visit all the nodes. Besides the discovery time, you should maintain a list dl which contains all nodes that have been discovered, but not yet finished. Add new nodes at the head of this list (you do not need the finishing time in your algorithm). Once you have detected a maximal component, the vertices of this component appear at the front of the list. Try to figure out what condition should be met in order to decide whether you can collect these nodes (i.e. remove the nodes from the list and store them in a new vertex of G_S).

Solution

We construct G_S = (V_S, Adj_S) by keeping track of the set Q of vertices that still need to be discovered, the list dl of discovered vertices that are not yet assigned to a component, the current discovery time t, and the next component number i. Furthermore, d[v] and c[v] are, respectively, the discovery time and the component number of the vertex v, whenever these are defined (but we will manipulate d[v] after finishing v). The main part of the algorithm is as follows.

    function FindComponents(V, Adj)
        V_S ← ∅
        dl ← [ ]
        t ← 0
        i ← 0
        Q ← V
        while there is some v ∈ Q do
            Discover(v)
        return (V_S, Adj_S)

We say that a DFS finishes a component C at the moment that the last vertex in C is finished. The only cycles in the graph G_S will be self-loops, because the components are maximal, and we obtain topological sorts of G_S by ignoring those self-loops. Observe that these are precisely the reverses of the orders in which DFSs on G can finish the components.

Suppose we are processing a vertex v. The DFS starting from v will discover a vertex adjacent to a vertex discovered before v, except if v is the first vertex discovered in its component; in that case it will, however, discover a vertex adjacent to v. After finishing v, we therefore determine the minimum m of d[v] and of d[u] for the vertices u adjacent to v; the vertices discovered starting from v then form a component if and only if d[v] = m. This follows inductively from the fact that we update d[v] to m if this is not the case (we will set the discovery times of vertices in already known components to infinity).

    function Discover(v)
        Q ← Q \ {v}
        dl ← [v : dl]
        d[v] ← t
        t ← t + 1
        for u ∈ (Adj[v] ∩ Q) do
            Discover(u)
        m ← min{ d[u] | u ∈ ({v} ∪ Adj[v]) }
        if d[v] = m then
            AddComponent(v)
        else
            d[v] ← m

The call AddComponent(v) removes the vertices up to and including v from the list dl and turns them into a new vertex of V_S, represented by the current value of i. It remains to specify how this is done. For all vertices u in the component, we set c[u] to the component number i. Note that, after doing this, we know that all vertices w adjacent to the vertices in the component also have a value assigned to c[w], because of the order in which components are added. We use this to immediately update Adj_S for the new component: in the code below, the set A is constructed to contain all vertices adjacent to a vertex in the new component, and at the end this set is transformed into a set of adjacent component numbers by means of c.

    function AddComponent(v)
        A ← ∅
        repeat
            u ← head(dl)
            dl ← tail(dl)
            d[u] ← ∞
            c[u] ← i
            A ← A ∪ Adj[u]
        until u = v
        V_S ← V_S ∪ {i}
        Adj_S[i] ← { c[w] | w ∈ A }
        i ← i + 1

The function head gives the first element of a list; tail yields the remaining list.

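For completeness, here is a Python sketch that follows the pseudocode above almost line by line: a recursive DFS, the list dl realised as a deque, and d[v] doubling as the smallest discovery time reachable from v. The function names and the dict-based graph representation are our own choices, and no attempt is made to work around Python's recursion limit on very deep graphs.

    import math
    from collections import deque

    def find_components(vertices, adj):
        """Strongly connected components of the graph (vertices, adj).

        adj maps every vertex to an iterable of its successors.  Returns a pair
        (comp_of, adj_S): the component number of each vertex, and the adjacency
        sets of the component graph G_S (self-loops included)."""
        d = {}                      # discovery time, later lowered to the minimum m
        comp_of = {}                # c[v] in the pseudocode
        adj_S = {}
        dl = deque()                # discovered but not yet assigned, newest first
        q = set(vertices)           # vertices that still need to be discovered
        t = 0                       # current discovery time
        i = 0                       # next component number

        def add_component(v):
            nonlocal i
            reached = set()
            while True:             # remove vertices up to and including v from dl
                u = dl.popleft()
                d[u] = math.inf
                comp_of[u] = i
                reached.update(adj[u])
                if u == v:
                    break
            adj_S[i] = {comp_of[w] for w in reached}
            i += 1

        def discover(v):
            nonlocal t
            q.discard(v)
            dl.appendleft(v)
            d[v] = t
            t += 1
            for u in adj[v]:
                if u in q:
                    discover(u)
            m = min([d[v]] + [d[u] for u in adj[v]])
            if d[v] == m:           # v is the first vertex discovered in its component
                add_component(v)
            else:
                d[v] = m

        while q:
            discover(next(iter(q)))
        return comp_of, adj_S

    # Example: {a, b, c} is one component and {d} another; in G_S the
    # {a, b, c}-component has a self-loop and an edge to the {d}-component.
    adj = {'a': ['b'], 'b': ['c'], 'c': ['a', 'd'], 'd': []}
    print(find_components(adj.keys(), adj))

On this small example the two components are found regardless of the starting vertex, and apart from the self-loop the resulting G_S is acyclic, in line with the answer to the question above.
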
3 Dijkstra's shortest path algorithm

The running time of Dijkstra's shortest path algorithm depends on the implementation of the priority queue used for maintaining the vertices that are not yet finished. During the main loop of Dijkstra's algorithm two queue operations are used: (1) extract the element with the least d value, and (2) update the d value of an element already present in the queue. We will call these operations deletemin and decreasekey, respectively. Suppose we use an array to store the queued vertices. The time complexity of deletemin and decreasekey depends on how the elements of the array are organized.

1. A straightforward representation simply uses the vertices as indices into the array. What is the time complexity of deletemin? And of decreasekey? So, what is the running time of Dijkstra's algorithm?

2. The elements of the array could also be organized as a min-heap; see the exercises of week 7. In this case, the d value is taken as the key. Again, what is the complexity of deletemin and decreasekey?

3. In order to implement decreasekey efficiently, it is necessary that, for each vertex, the element in the queue that corresponds to that vertex is directly accessible. How would you implement decreasekey? Give a complete implementation of Dijkstra's algorithm using a heap-based priority queue, including an implementation of the priority queue itself.

Solution

1. To implement deletemin, we need to find an element with a minimal key, which means that we have to traverse the entire array. Each element should contain a boolean that specifies whether the corresponding vertex is in the queue, so the actual deletion can be performed in constant time. Thus, deletemin is in O(|V|). We can immediately access the element for a specific vertex and change its key without having to deal with complications, so decreasekey is in O(1). The complexity of Dijkstra's algorithm in general is O(|V| · m + |E| · k), where m is the complexity of deletemin and k the complexity of decreasekey (this can be seen from the code in the solution for the last part of this exercise). The formula instantiates to O(|V|^2 + |E|) = O(|V|^2) for the current implementation.

2. We already know that the minimal element can be removed from the heap in logarithmic time. Once we have found the element representing a given vertex, decreasekey is also logarithmic, because we only have to check whether we need to swap the changed element with its parent, after which we do the same for that parent, and repeat this until we reach the root of the tree. However, searching the heap for the element representing the vertex requires a linear amount of time. Therefore, Dijkstra's algorithm is now in O(|V| lg |V| + |E| · |V|), which is usually dominated by O(|E| · |V|).

3. If we can find the element in the heap for a certain vertex in constant time, we know from the previous answer that decreasekey becomes logarithmic, and this gives the algorithm a time complexity of O((|E| + |V|) lg |V|), which reduces to O(|E| lg |V|) for the most interesting graphs. In the case of sparse graphs this is an improvement over the original implementation: we now have a complexity of O(|V| lg |V|). The accessibility problem can be solved by using an index array that records, for each vertex, the index of its element in the underlying array of the heap. It remains to specify how the index array is updated for the operations on the heap, but first we give pseudocode for the main algorithm itself.

    function Dijkstra(V, Adj, w, s)
        for v ∈ V do
            d[v] ← ∞
        init(Q, V, ∞)
        d[s] ← 0
        decreasekey(Q, s, 0)
        while not empty(Q) do
            u ← deletemin(Q)
            for v ∈ Adj[u] do
                if d[u] + w(u, v) < d[v] then
                    p[v] ← u
                    d[v] ← d[u] + w(u, v)
                    decreasekey(Q, v, d[v])

Recall that for each vertex v, d[v] is the shortest distance from the source vertex s to v; this value is used as the key of v in the queue Q. Furthermore, p[v] refers to the predecessor of v on the shortest path from s to v. The algorithm starts by initializing Q to contain all vertices V with a key of infinity. While filling the heap in this way, the heap property remains trivially satisfied, and we can immediately fill the index array according to the positions to which the vertices are assigned in the heap array. In the implementation of the heap methods, the heap is subject to two basic transformations: the last element may be removed, and two elements may be swapped. These operations can simply be reflected in the index array; performing them on two arrays instead of one will not impair any complexities.

After calling Dijkstra for a source vertex s, there exists a path from s to a vertex g if and only if either p[g] is defined or g = s. If there is a path, we can construct a minimal one by using the predecessor references. For the sake of completeness, we give an algorithm to do this.

    function ShortestPath(g)
        l ← [ ]
        v ← g
        while v ≠ s do
            l ← [v : l]
            v ← p[v]
        return [s : l]
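To round off part 3, here is one possible Python realisation of the requested implementation: a binary min-heap that maintains a position map (the index array described above), so that decreasekey runs in O(lg |V|), together with the Dijkstra and ShortestPath routines from the pseudocode. The class and function names and the dict-based graph representation are our own choices, and edge weights are assumed to be nonnegative.

    import math

    class IndexedMinHeap:
        """Binary min-heap of vertices keyed by their d-values; pos maps each
        vertex to its position in the heap array, so decreasekey is O(lg n)."""

        def __init__(self, vertices, key):
            # All vertices enter the queue with the same initial key.
            self.heap = list(vertices)                  # heap[i] is a vertex
            self.key = {v: key for v in self.heap}
            self.pos = {v: i for i, v in enumerate(self.heap)}

        def _swap(self, i, j):
            h, p = self.heap, self.pos
            h[i], h[j] = h[j], h[i]
            p[h[i]], p[h[j]] = i, j

        def _sift_up(self, i):
            while i > 0:
                parent = (i - 1) // 2
                if self.key[self.heap[i]] >= self.key[self.heap[parent]]:
                    break
                self._swap(i, parent)
                i = parent

        def _sift_down(self, i):
            n = len(self.heap)
            while True:
                smallest = i
                for c in (2 * i + 1, 2 * i + 2):
                    if c < n and self.key[self.heap[c]] < self.key[self.heap[smallest]]:
                        smallest = c
                if smallest == i:
                    return
                self._swap(i, smallest)
                i = smallest

        def empty(self):
            return not self.heap

        def decreasekey(self, v, key):
            self.key[v] = key
            self._sift_up(self.pos[v])

        def deletemin(self):
            v = self.heap[0]
            last = self.heap.pop()              # remove the last element ...
            del self.pos[v]
            if self.heap:
                self.heap[0] = last             # ... move it to the root ...
                self.pos[last] = 0
                self._sift_down(0)              # ... and restore the heap property
            return v

    def dijkstra(vertices, adj, w, s):
        """adj[u] lists the successors of u; w[(u, v)] is a nonnegative weight."""
        d = {v: math.inf for v in vertices}
        p = {}
        q = IndexedMinHeap(vertices, math.inf)
        d[s] = 0
        q.decreasekey(s, 0)
        while not q.empty():
            u = q.deletemin()
            for v in adj[u]:
                if d[u] + w[(u, v)] < d[v]:     # relax the edge (u, v);
                    p[v] = u                    # v is still queued here because
                    d[v] = d[u] + w[(u, v)]     # the weights are nonnegative
                    q.decreasekey(v, d[v])
        return d, p

    def shortest_path(p, s, g):
        """Rebuild the shortest s-to-g path from the predecessor references."""
        path = []
        v = g
        while v != s:
            path.insert(0, v)
            v = p[v]
        return [s] + path

For example:

    # Tiny example graph with nonnegative weights.
    vertices = ['s', 'a', 'b', 't']
    adj = {'s': ['a', 'b'], 'a': ['t'], 'b': ['a', 't'], 't': []}
    w = {('s', 'a'): 4, ('s', 'b'): 1, ('b', 'a'): 2, ('a', 't'): 1, ('b', 't'): 6}
    d, p = dijkstra(vertices, adj, w, 's')
    print(d['t'], shortest_path(p, 's', 't'))   # prints: 4 ['s', 'b', 'a', 't']

All heap transformations go through _swap and the removal of the last element, and both are mirrored in the position map, which is exactly the bookkeeping described above; keeping the two structures in sync does not change any of the asymptotic bounds.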