CS Final - Review material

CS4800 Algorithms and Data
Professor Fell
Fall 2009
October 28, 2009

CS 4800 - Final - Review material

Old stuff

Big-O notation: Though you won't be quizzed directly on Big-O notation, you should be able to apply it to analyzing algorithms, e.g. ones that you produce as problem solutions.

Recurrence relations: As with Big-O.

Basic arithmetic: Most of you still remember the algorithms that you learned in elementary school. You should now have an idea of how they work on a computer and how and when they can be sped up.

Modular arithmetic: Never forget this.

Binary search: Never forget this.

Merge sort, partition in quicksort, insertion sort: You really should be able to tell me about these ten years from now, even if you don't remember all the details.

New stuff

You should know these algorithms and how to analyze their running times.

Chapter 3 - Decomposition of Graphs
  Adjacency list and adjacency matrix representations
  Depth-first search in undirected graphs
  Depth-first search in directed graphs
  Topological sort
  Strongly connected components

Chapter 4 - Paths in Graphs
  Distances
  Breadth-first search
  Lengths on edges (weighted graphs)
  Dijkstra's algorithm
  Priority queue implementations: array and binary heap
  Heap sort
  NO: shortest paths in the presence of negative edges
  Shortest paths in dags

Chapter 5 - Greedy algorithms
  Minimum spanning trees
  Huffman encoding
  Horn formulas
  Set cover

Chapter 6 - Dynamic programming
  Shortest paths in dags, revisited
  Longest increasing subsequences
  Edit distance
  Knapsack

Chapter 7 - NO: Linear programming (NO reductions)
  An introduction to linear programming
  The simplex algorithm

Graphs

An undirected graph G = (V, E) is one in which each edge e ∈ E is two-way and is represented by a set {u, v} with u, v ∈ V. We also write (u, v) when we are talking about following the edge from u to v. A directed graph G = (V, E) is one in which each edge e ∈ E is one-way and is represented by an ordered pair (u, v) with u, v ∈ V; the edge (u, v) goes from u to v.

A graph can be represented by an n × n adjacency matrix, where n = |V|, whose (i, j)th entry is

  a_ij = 1 if there is an edge from v_i to v_j, and 0 otherwise,

or by an adjacency list that consists of |V| linked lists, one per vertex. The linked list for vertex u holds the names of the vertices to which u has an outgoing edge, i.e. the vertices v for which (u, v) ∈ E.

Depth-first search

You should understand the procedures explore(G, v) on page 84 and dfs(G) on page 85, using previsit and postvisit on page 87. You should be able to perform depth-first search in a directed or undirected graph, label the vertices with pre and post numbers, and label the edges: tree or back in an undirected graph; tree, forward, back, or cross in a directed graph.

Tree edges are actually part of the DFS forest. Forward edges lead from a node to a non-child descendant in the DFS tree. Back edges lead to an ancestor in the DFS tree.

Cross edges lead to neither descendant nor ancestor; they therefore lead to a node that has already been completely explored (that is, already postvisited).

Strongly connected components - the algorithm

1. Run DFS on G^R, the reverse graph.
2. Run DFS on G starting from the vertex with the highest post number from step 1; the vertices it reaches form one strongly connected component. Remove those vertices from G (or their component from the dag of components) and repeat, starting from the remaining vertex with the highest post number from step 1.

Breadth-first search

BFS(G, s)
  for all u ∈ V:
    dist(u) = ∞
  dist(s) = 0
  Q = [s]  (queue containing just s)
  while Q is not empty:
    u = eject(Q)
    for all edges (u, v) ∈ E:
      if dist(v) = ∞:
        inject(Q, v)
        dist(v) = dist(u) + 1

Shortest paths

dijkstra(G, s)
  initialize dist(s) = 0 and all other dist(·) values to ∞
  R = { }  (the "known region")
  while R ≠ V:
    pick the node v ∉ R with smallest dist(·)
    add v to R
    for all edges (v, z) ∈ E:
      if dist(z) > dist(v) + l(v, z):
        dist(z) = dist(v) + l(v, z)

Priority queue operations:

Insert: Add a new element to the set.
Decrease-key: Accommodate the decrease in key value of a particular element. (Notifies the queue that a certain key value has been decreased.)
Delete-min: Return the element with the smallest key, and remove it from the set.
Make-queue: Build a priority queue out of the given elements, with the given key values. (In many implementations, this is significantly faster than inserting the elements one by one.)
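As a concrete sketch of dijkstra backed by a heap, here is a hypothetical Python version using the standard-library heapq module. Since heapq has no decrease-key operation, this sketch pushes a fresh (distance, vertex) entry on every improvement and skips stale entries when they are popped; that workaround is this example's choice, not part of the pseudocode above.

```python
import heapq

def dijkstra(graph, s):
    """Single-source shortest paths; graph maps u -> list of (v, length)."""
    dist = {u: float("inf") for u in graph}
    dist[s] = 0
    heap = [(0, s)]                     # (distance, vertex) pairs
    while heap:
        d, u = heapq.heappop(heap)      # delete-min
        if d > dist[u]:
            continue                    # stale entry: skip instead of decrease-key
        for v, length in graph[u]:
            if dist[v] > d + length:    # found a shorter path to v
                dist[v] = d + length
                heapq.heappush(heap, (dist[v], v))
    return dist

# A small example graph (vertex names are illustrative).
g = {"s": [("a", 4), ("b", 1)], "a": [("c", 1)], "b": [("a", 2)], "c": []}
dist = dijkstra(g, "s")   # dist["c"] == 4, via s -> b -> a -> c
```

Each edge is relaxed at most once per improvement, so the running time matches the binary-heap bound, O((|V| + |E|) log |V|).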

Priority queue implementations

A binary heap wins over the array implementation as soon as |E| < |V|²/log|V|. A d-ary heap, with d ≈ |E|/|V|, is good for both sparse and dense graphs:
  sparse, |E| = O(|V|): running time is O(|V| log |V|)
  dense, |E| = O(|V|²): running time is O(|V|²)
  intermediate, |E| = O(|V|^(1+δ)): running time is O(|E|) - linear!
A Fibonacci heap is asymptotically best but hard to implement; insert and decrease-key take amortized O(1) time.

Max-Heapify(A, i, n)
  l = Left(i)
  r = Right(i)
  if l ≤ n and A[l] > A[i]:
    largest = l
  else:
    largest = i
  if r ≤ n and A[r] > A[largest]:
    largest = r
  if largest ≠ i:
    exchange A[i] with A[largest]
    Max-Heapify(A, largest, n)

Building a heap

Build-Max-Heap(A, n)
  for i = ⌊n/2⌋ downto 1:
    Max-Heapify(A, i, n)

The heapsort algorithm

Given an input array, the heapsort algorithm acts as follows:
1. Build a max-heap from the array.
2. The root is the maximum element, so swap it with the element in the last position in the array.
3. Decrease the heap size by 1, and call Max-Heapify on the new (possibly incorrectly placed) root.
4. Repeat until only one node (the smallest element) remains, and is therefore in the correct place in the array.

Analysis

Build-Max-Heap: O(n)
for loop: n - 1 times

swap elements: O(1)
Max-Heapify: O(lg n) per call
Total time: O(n lg n)

Minimum spanning trees

Kruskal's algorithm: Repeatedly add the next lightest edge that doesn't produce a cycle.

kruskal(G, w)
Input: A connected undirected graph G = (V, E) with edge weights w_e
Output: A minimum spanning tree defined by the edges X

  for all u ∈ V:
    makeset(u)
  X = { }
  sort the edges E by weight
  for all edges {u, v} ∈ E, in increasing order of weight:
    if find(u) ≠ find(v):
      add edge {u, v} to X
      union(u, v)

Data structure for disjoint sets

Union by rank: store a set as a directed tree. Nodes of the tree are elements of the set, in no particular order, and each has a parent pointer that eventually leads up to the root of the tree. This root element is a representative, or name, for the set. It is distinguished from the other elements by the fact that its parent pointer is a self-loop.

makeset(x)
  π(x) = x
  rank(x) = 0

find(x)   (with path compression)
  if x ≠ π(x):
    π(x) = find(π(x))
  return π(x)

union(x, y)
  r_x = find(x)
  r_y = find(y)
  if r_x = r_y: return
  if rank(r_x) > rank(r_y):
    π(r_y) = r_x
  else:
    π(r_x) = r_y
    if rank(r_x) = rank(r_y):
      rank(r_y) = rank(r_y) + 1
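Putting kruskal together with makeset/find/union, here is a minimal Python sketch (the vertex numbering and the (weight, u, v) edge format are this example's own conventions):

```python
def kruskal(n, edges):
    """MST of a connected graph on vertices 0..n-1; edges are (w, u, v) triples."""
    parent = list(range(n))            # π: parent pointers; roots are self-loops
    rank = [0] * n

    def find(x):                       # find with path compression
        if parent[x] != x:
            parent[x] = find(parent[x])
        return parent[x]

    def union(x, y):                   # union by rank
        rx, ry = find(x), find(y)
        if rx == ry:
            return
        if rank[rx] > rank[ry]:
            parent[ry] = rx
        else:
            parent[rx] = ry
            if rank[rx] == rank[ry]:
                rank[ry] += 1

    tree = []
    for w, u, v in sorted(edges):      # lightest edge first
        if find(u) != find(v):         # adding it creates no cycle
            tree.append((w, u, v))
            union(u, v)
    return tree

edges = [(1, 0, 1), (4, 0, 2), (2, 1, 2), (5, 2, 3)]
mst = kruskal(4, edges)   # picks edges of weight 1, 2, 5; skips weight 4 (cycle)
```

Sorting dominates, so the running time is O(|E| log |E|) plus the near-constant amortized cost of the union-find operations.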

With path compression, the amortized cost of a sequence of n union and find operations, starting from an empty data structure, averages O(1) per operation, down from O(log n).

Prim's algorithm for MST

The intermediate set of edges X always forms a subtree, and S is chosen to be the set of this tree's vertices. On each iteration, the subtree defined by X grows by one edge, namely the lightest edge between a vertex in S and a vertex outside S.

Huffman encoding

Huffman encoding is a variable-length encoding in which more frequent symbols get shorter codewords; just one bit may be used for the most frequently occurring symbol.

Huffman(f)
Input: An array f[1..n] of frequencies
Output: An encoding tree with n leaves

  let H be a priority queue of integers, ordered by f
  for i = 1 to n:
    insert(H, i)
  for k = n + 1 to 2n - 1:
    i = deletemin(H), j = deletemin(H)
    create a node numbered k with children i, j
    f[k] = f[i] + f[j]
    insert(H, k)

This algorithm can be described in terms of priority queue operations and takes O(n log n) time if a binary heap is used.

Horn formulas

Knowledge about variables is represented by two kinds of clauses:
1. implications, whose left-hand side is an AND of zero or more variables, e.g.
     (z ∧ w) ⇒ u and (u ∧ z) ⇒ w
2. pure negative clauses, an OR of zero or more negated variables, e.g.
     (¬x ∨ ¬u ∨ ¬y)

Horn formula satisfiability

Input: a Horn formula
Output: a satisfying assignment, if one exists

  set all variables to false
  while there is an implication that is not satisfied:
    set the right-hand variable of the implication to true
  if all pure negative clauses are satisfied:
    return the assignment
  else:
    return "formula is not satisfiable"

Set cover - greedy algorithm

Input: A set of elements B; sets S_1, ..., S_m ⊆ B
Output: A selection of the S_i whose union is B
Cost: Number of sets picked

  Repeat until all elements of B are covered:
    pick the set S_i with the largest number of uncovered elements

Claim: Suppose B contains n elements and that the optimal cover consists of k sets. Then the greedy algorithm will use at most k ln n sets.

Dynamic programming

Dynamic programming is a very powerful algorithmic paradigm in which a problem is solved by identifying a collection of subproblems and tackling them one by one, smallest first, using the answers to small problems to help figure out larger ones, until the whole lot of them is solved.

Shortest paths in dags

  initialize all dist(·) values to ∞
  dist(s) = 0
  for each v ∈ V \ {s}, in linearized order:
    dist(v) = min over edges (u, v) ∈ E of { dist(u) + l(u, v) }

Edit distance

  for i = 0, 1, 2, ..., m:
    E(i, 0) = i
  for j = 0, 1, 2, ..., n:
    E(0, j) = j
  for i = 1, 2, ..., m:
    for j = 1, 2, ..., n:
      E(i, j) = min{ 1 + E(i-1, j), 1 + E(i, j-1), diff(i, j) + E(i-1, j-1) }
  return E(m, n)
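The edit-distance recurrence above translates almost line for line into Python; this is an illustrative sketch, with diff(i, j) computed inline:

```python
def edit_distance(x, y):
    """Minimum number of insertions, deletions, and substitutions turning x into y."""
    m, n = len(x), len(y)
    E = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        E[i][0] = i                           # delete all of x[:i]
    for j in range(n + 1):
        E[0][j] = j                           # insert all of y[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            diff = 0 if x[i - 1] == y[j - 1] else 1
            E[i][j] = min(1 + E[i - 1][j],         # delete x[i]
                          1 + E[i][j - 1],         # insert y[j]
                          diff + E[i - 1][j - 1])  # match or substitute
    return E[m][n]

edit_distance("snowy", "sunny")   # the familiar SNOWY/SUNNY example: distance 3
```

The table has (m + 1)(n + 1) entries and each takes constant time, so the running time is O(mn).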

Longest common subsequence

LCS-Length(X, Y, m, n)
  let b[1..m, 1..n] and c[0..m, 0..n] be new tables
  for i = 1 to m:
    c[i, 0] = 0
  for j = 0 to n:
    c[0, j] = 0
  for i = 1 to m:
    for j = 1 to n:
      if x_i == y_j:
        c[i, j] = c[i-1, j-1] + 1
        b[i, j] = "↖"
      elseif c[i-1, j] ≥ c[i, j-1]:
        c[i, j] = c[i-1, j]
        b[i, j] = "↑"
      else:
        c[i, j] = c[i, j-1]
        b[i, j] = "←"
  return c and b

Print-LCS(b, X, i, j)
  if i == 0 or j == 0:
    return
  if b[i, j] == "↖":
    Print-LCS(b, X, i-1, j-1)
    print x_i
  elseif b[i, j] == "↑":
    Print-LCS(b, X, i-1, j)
  else:
    Print-LCS(b, X, i, j-1)

Knapsack

Fractional-Knapsack(v, w, W)   (items sorted by value per unit weight, highest first)
  load = 0
  i = 1
  while load < W and i ≤ n:
    if w_i ≤ W - load:
      take all of item i
    else:
      take (W - load)/w_i of item i
    add what was taken to load
    i = i + 1

Time: O(n lg n) to sort, O(n) thereafter.

Knapsack with repetition

Define K(w) = maximum value achievable with a knapsack of capacity w.

  K(0) = 0
  for w = 1 to W:
    K(w) = max{ K(w - w_i) + v_i : w_i ≤ w }
  return K(W)

This fills in a one-dimensional table of length W + 1, from left to right. Each entry can take O(n) time to compute, so the running time is O(nW). This is EXPONENTIAL in the input size, since the input specifies W with only O(log W) bits!

Knapsack without repetition

Add a second parameter, 0 ≤ j ≤ n:

  K(w, j) = maximum value achievable using a knapsack of capacity w and items 1, ..., j.

We seek K(W, n).

  initialize all K(0, j) = 0 and K(w, 0) = 0
  for j = 1 to n:
    for w = 1 to W:
      if w_j > w:
        K(w, j) = K(w, j-1)
      else:
        K(w, j) = max{ K(w - w_j, j-1) + v_j, K(w, j-1) }
  return K(W, n)
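Both knapsack variants can be sketched in Python as follows (the list-of-(weight, value) item format is this example's assumption). On a capacity-10 instance with items of weight/value 6/$30, 3/$14, 4/$16, 2/$9, repetition allows one weight-6 item plus two weight-2 items for value 48, while single use caps out at the weight-6 plus weight-4 items for 46:

```python
def knapsack_with_repetition(W, items):
    """items: list of (weight, value); each item may be used any number of times."""
    K = [0] * (W + 1)                          # K[w] = best value at capacity w
    for w in range(1, W + 1):
        K[w] = max([K[w - wi] + vi for wi, vi in items if wi <= w],
                   default=0)                  # no item fits: value 0
    return K[W]

def knapsack_without_repetition(W, items):
    """items: list of (weight, value); each item may be used at most once."""
    n = len(items)
    K = [[0] * (n + 1) for _ in range(W + 1)]  # K[w][j]: capacity w, items 1..j
    for j in range(1, n + 1):
        wj, vj = items[j - 1]
        for w in range(1, W + 1):
            if wj > w:
                K[w][j] = K[w][j - 1]                 # item j does not fit
            else:
                K[w][j] = max(K[w - wj][j - 1] + vj,  # take item j
                              K[w][j - 1])            # leave item j
    return K[W][n]

items = [(6, 30), (3, 14), (4, 16), (2, 9)]
knapsack_with_repetition(10, items)     # -> 48
knapsack_without_repetition(10, items)  # -> 46
```

Both run in O(nW) time; the second uses O(nW) space as written, though a single row per item suffices if w is scanned from high to low.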