CS 350 - Algorithms and Complexity Dynamic Programming Sean Anderson 2/20/18 Portland State University
Table of contents 1. Homework 3 Solutions 2. Dynamic Programming 3. Problem of the Day 4. Application to Hard Problems
Homework 3 Solutions
Problem 1 - Single cycle detection Easiest way: BFS or DFS tracking visited vertices. If a vertex is revisited (other than the one we arrived from), there is a cycle. Time Complexity: if there is no cycle, we visit every vertex and expand every edge; an acyclic connected graph has |V| - 1 edges. If there is a cycle, in the worst case we discover it at the very end, after visiting every vertex, and stop exploring then. So we never expand more than |V| edges. Either way, O(|V|) time. Space Complexity: tracking visited vertices is O(|V|)
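A minimal Python sketch of the DFS approach, assuming the graph is given as a dict mapping each vertex to its neighbor list (that representation is an assumption, not part of the problem statement). Tracking the parent avoids counting the edge we just arrived on as a cycle:

```python
def has_cycle(adj):
    """Detect a cycle in a connected undirected graph via DFS.

    adj: dict mapping vertex -> list of neighbors (assumed input format).
    Revisiting any vertex other than the one we came from means a cycle.
    """
    start = next(iter(adj))
    visited = {start}
    stack = [(start, None)]  # (vertex, parent we arrived from)
    while stack:
        v, parent = stack.pop()
        for w in adj[v]:
            if w == parent:
                continue  # the edge we arrived on is not a cycle
            if w in visited:
                return True  # revisited vertex -> cycle found
            visited.add(w)
            stack.append((w, v))
    return False
```

The `visited` set is the O(|V|) space from the analysis; the loop expands each edge at most once, giving the O(|V|) time bound.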
Problem 2 - Bipartite Graphs Part 1: a tree is acyclic, so we can two-color it by assigning one color to a node, the other color to its neighbors, and so on - no two adjacent nodes share a color Part 2: in a rooted tree, all nodes of even depth share one color, and all nodes of odd depth share the other
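The depth-parity coloring from Part 2 can be sketched with a BFS, again assuming a dict-of-neighbor-lists representation for the tree:

```python
from collections import deque

def two_color(adj, root):
    """Two-color a tree: even depth gets color 0, odd depth gets color 1."""
    color = {root: 0}
    q = deque([root])
    while q:
        v = q.popleft()
        for w in adj[v]:
            if w not in color:          # unvisited child
                color[w] = 1 - color[v] # opposite color from parent
                q.append(w)
    return color
```

Because a tree has no cycles, no vertex is ever reached by two different-parity paths, so no edge ends up with matching endpoint colors.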
Dynamic Programming
Recursive Fibonacci The Fibonacci sequence is recursively defined: F_i = F_{i-1} + F_{i-2} We can compute it recursively: F_i = F_{i-1} + F_{i-2} = (F_{i-2} + F_{i-3}) + F_{i-2} = ... Pseudocode: Fibonacci(i): if i <= 2: return 1 return Fibonacci(i - 1) + Fibonacci(i - 2)
Recursive Fibonacci [Figure: recursion tree for F_i, branching into F_{i-1} and F_{i-2}; subproblems such as F_{i-2}, F_{i-3}, and F_{i-4} appear repeatedly across branches]
Recursive Fibonacci Complexity This is exponential growth! But we don't need to compute subproblems repeatedly. If we compute and store F_1, F_2, ..., F_{i-1}, we have everything we need to compute F_i That's O(n) time and space complexity! If we discard values more than two below the current index, that becomes constant space complexity
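The constant-space version can be sketched by keeping only the last two values as we build up:

```python
def fib(i):
    """F_1 = F_2 = 1; keep only the last two values -> O(i) time, O(1) space."""
    a, b = 1, 1  # F_1, F_2
    for _ in range(i - 2):
        a, b = b, a + b  # slide the two-value window forward
    return b if i >= 2 else a
```

Each loop iteration discards the value that is now more than two below the current index, which is exactly the trick described above.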
Memoization and Tabulation Dynamic Programming is analogous to Divide-and-Conquer, but with overlapping subproblems. Building a table of computed subproblems recursively (top-down) is called Memoization Starting at the bottom and building up a table of computed subproblems is called Tabulation
Is There a Difference? The key difference is that tabulation will typically compute all subproblems, whether or not they would come up in a recursive memoization algorithm. On the other hand, depending on the problem, either top-down or bottom-up techniques could be more intuitive or perform differently.
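Both styles can be sketched on Fibonacci; `functools.lru_cache` is one standard way to memoize in Python (using it here is a choice, not part of the slides):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(i):
    # Top-down: only subproblems actually reached by the recursion are cached.
    if i <= 2:
        return 1
    return fib_memo(i - 1) + fib_memo(i - 2)

def fib_tab(n):
    # Bottom-up: fills the whole table F_1..F_n unconditionally.
    table = [0, 1, 1] + [0] * (n - 2)
    for i in range(3, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]
```

For Fibonacci every subproblem is needed either way, so the two agree; for sparser recursions the memoized version may touch far fewer entries.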
Coin-Path Problem Suppose we have an m × n grid with coins scattered throughout it. Each tile (i, j) has c_{ij} coins. We want to send a robot from (1, 1) to (m, n) along a shortest path that also allows it to collect the maximum number of coins. Notes: Assuming (1, 1) is the top-left corner, we're trying to reach the bottom-right, so we will always move either down or right Any path will have m - 1 down moves and n - 1 right moves
Recursive Pseudocode CoinPath(i, j): if i = m and j = n: return c_{ij} down ← -∞; right ← -∞ if i < m: down ← CoinPath(i + 1, j) if j < n: right ← CoinPath(i, j + 1) return c_{ij} + max(down, right)
Memoization Pseudocode T ← m × n array of NULL CoinPath(i, j): if T[i, j] ≠ NULL: return T[i, j] if i = m and j = n: T[i, j] ← c_{ij}; return T[i, j] dn ← -∞; rt ← -∞ if i < m: dn ← CoinPath(i + 1, j) if j < n: rt ← CoinPath(i, j + 1) T[i, j] ← c_{ij} + max(dn, rt) return T[i, j]
Tabulation Pseudocode CoinPath(): T ← (m + 1) × (n + 1) array of -∞ T[m][n + 1] ← 0 for i ← m..1: for j ← n..1: T[i][j] ← c_{ij} + max(T[i + 1][j], T[i][j + 1]) return T[1][1]
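The tabulation above can be sketched in Python (0-indexed; the extra row and column of -∞ keep out-of-bounds moves from ever being chosen):

```python
def coin_path(c):
    """Max coins collected moving only down/right from top-left to bottom-right.

    c: m x n list of lists of coin counts.
    T[i][j] = best total collectable from tile (i, j); filled bottom-up.
    """
    m, n = len(c), len(c[0])
    NEG = float("-inf")
    T = [[NEG] * (n + 1) for _ in range(m + 1)]  # sentinel row/column
    T[m - 1][n] = 0  # lets the goal tile evaluate to c[m-1][n-1]
    for i in range(m - 1, -1, -1):
        for j in range(n - 1, -1, -1):
            T[i][j] = c[i][j] + max(T[i + 1][j], T[i][j + 1])
    return T[0][0]
```

Each cell is filled once from two already-filled neighbors, which is where the O(mn) time bound comes from.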
Analysis The recursive version acts like the tabulation once the recursion is unrolled. Either way, O(mn) time and space.
String Edit Distance Measure of how different two strings are: the minimum number of operations to transform string A into B, using three operations: Insertion - bother → brother by adding r Deletion - brother → bother by removing r Modification - sitting → fitting by changing s → f
Wagner-Fischer Algorithm for String Edit Distance For two strings A and B, let d_{ij} be the distance between their length-i and length-j suffixes So, for kitten and sitting, d_{6,7} is the distance between kitten and sitting d_{5,6} is the distance between itten and itting
Wagner-Fischer Recurrence Then we can define our distances, for 0 ≤ i ≤ |A| and 0 ≤ j ≤ |B|: d_{i,0} = i d_{0,j} = j d_{i,j} = d_{i-1,j-1} if a_i = b_j else d_{i,j} = min of: d_{i-1,j} + 1 (delete) d_{i,j-1} + 1 (add) d_{i-1,j-1} + 1 (modify)
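A tabulated sketch of the recurrence (written over prefixes rather than the suffixes used above; the recurrence is symmetric, so the result is the same):

```python
def edit_distance(a, b):
    """Wagner-Fischer: d[i][j] = edit distance between a[:i] and b[:j]."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i  # delete all i characters of a[:i]
    for j in range(n + 1):
        d[0][j] = j  # insert all j characters of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                d[i][j] = d[i - 1][j - 1]  # characters match: no cost
            else:
                d[i][j] = 1 + min(d[i - 1][j],      # delete
                                  d[i][j - 1],      # add
                                  d[i - 1][j - 1])  # modify
    return d[m][n]
```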
Analysis Let m = |A|, n = |B| The plain recursive approach has runtime O(3^min(m,n)) on some inputs (imagine aaaaaaaa and bbbbbbbb) With dynamic programming, O(mn) time and space
Problem of the Day
Plank Cutting Suppose we have a plank of wood n units long, and an array of prices p_i for boards of each integer length 1 ≤ i ≤ n. Give an algorithm to find the optimal combination of boards to cut our plank into.
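One possible tabulated answer (spoiler for the problem above; this sketch assumes prices are given as a list with p[0] = 0):

```python
def best_cut(p):
    """Plank cutting: p[i] is the price of a board of length i (p[0] = 0).

    best[k] = best total price for a plank of length k, taking the max
    over all choices of first cut i: p[i] + best[k - i].
    """
    n = len(p) - 1
    best = [0] * (n + 1)
    for k in range(1, n + 1):
        best[k] = max(p[i] + best[k - i] for i in range(1, k + 1))
    return best[n]
```

Each length k is solved once from smaller lengths, so the table fills in O(n^2) time and O(n) space.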
Application to Hard Problems
{0,1}-Knapsack Recursive algorithm idea for knapsack: for each item i, recurse on the subproblem with items < i and with weight capacity W - w_i This gives us an n × W table
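A tabulated sketch of that n × W table, assuming weights and values are given as parallel lists:

```python
def knapsack(weights, values, W):
    """0/1 knapsack: T[i][w] = best value using the first i items, capacity w."""
    n = len(weights)
    T = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for w in range(W + 1):
            T[i][w] = T[i - 1][w]  # option 1: skip item i
            if weights[i - 1] <= w:
                # option 2: take item i, recursing on capacity w - w_i
                T[i][w] = max(T[i][w],
                              T[i - 1][w - weights[i - 1]] + values[i - 1])
    return T[n][W]
```

The table has n · (W + 1) entries and each is filled in constant time, matching the O(nW) bound discussed next.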
Polynomial and Pseudopolynomial We've said a lot about problems being hard, but not necessarily what that means A polynomial time algorithm has runtime in O(n^c) for some constant c We make a distinction between that and exponential, factorial, etc. If our knapsack dynamic programming is O(nW), that's not quite polynomial, because W = 2^{lg W} is exponential in the number of bits needed to write W down - but it's a bit different from O(2^n). It's called pseudopolynomial.