CS 473: Fundamental Algorithms, Spring 2011
Dynamic Programming
Lecture 8, February 15, 2011
Sariel (UIUC)

Part I: Longest Increasing Subsequence

Sequences

Definition: A sequence is an ordered list a_1, a_2, ..., a_n. The length of a sequence is the number of elements in the list.

Definition: a_{i_1}, ..., a_{i_k} is a subsequence of a_1, ..., a_n if 1 ≤ i_1 < ... < i_k ≤ n.

Definition: A sequence is increasing if a_1 < a_2 < ... < a_n. It is non-decreasing if a_1 ≤ a_2 ≤ ... ≤ a_n. Decreasing and non-increasing are defined similarly.

Example
Sequence: 6, 3, 5, 2, 7, 8, 1
Subsequence: 5, 2, 1
Increasing sequence: 3, 5, 9
Increasing subsequence: 2, 7, 8
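
To make the definitions concrete, here is a minimal Python check of them (the helper names are hypothetical, not part of the lecture):

    # Check whether B is a subsequence of A: scan A left to right,
    # consuming elements of the iterator as each element of B is found.
    def is_subsequence(B, A):
        it = iter(A)
        return all(b in it for b in B)

    def is_increasing(B):
        return all(B[i] < B[i + 1] for i in range(len(B) - 1))

    assert is_subsequence([5, 2, 1], [6, 3, 5, 2, 7, 8, 1])
    assert is_increasing([2, 7, 8])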

Longest Increasing Subsequence Problem

Input: A sequence of numbers a_1, a_2, ..., a_n.
Goal: Find an increasing subsequence a_{i_1}, a_{i_2}, ..., a_{i_k} of maximum length.

Example
Sequence: 6, 3, 5, 2, 7, 8, 1
Increasing subsequences: 6, 7, 8 and 3, 5, 7, 8 and 2, 7, etc.
Longest increasing subsequence: 3, 5, 7, 8

Naïve Enumeration

Assume a_1, a_2, ..., a_n is stored in an array A.

    algLISNaive(A[1..n]):
        max = 0
        for each subsequence B of A do
            if B is increasing and |B| > max then
                max = |B|
        Output max

Running time: O(n 2^n). There are 2^n subsequences of a sequence of length n, and it takes O(n) time to check whether a given sequence is increasing.
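
A direct Python transcription of the brute-force enumeration, using bitmasks to enumerate all 2^n subsequences (a sketch, practical only for small n):

    def lis_naive(A):
        n, best = len(A), 0
        for mask in range(1 << n):           # each mask encodes one subsequence
            B = [A[i] for i in range(n) if mask >> i & 1]
            if all(B[i] < B[i + 1] for i in range(len(B) - 1)):
                best = max(best, len(B))
        return best

    print(lis_naive([6, 3, 5, 2, 7, 8, 1]))  # 4, e.g. 3, 5, 7, 8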

Recursive Approach: Take 1

LIS: longest increasing subsequence.

Can we find a recursive algorithm for LIS(A[1..n])?

Case 1: the LIS does not contain a_n, in which case LIS(A[1..n]) = LIS(A[1..(n-1)]).
Case 2: the LIS contains a_n, in which case LIS(A[1..n]) is not so clear.

Observation: if a_n is in the longest increasing subsequence, then all the elements before it must be smaller.

    algLIS(A[1..n]):
        if (n = 0) then return 0
        m = algLIS(A[1..(n-1)])
        B = subsequence of A[1..(n-1)] containing only elements less than a_n
            (* let h be the size of B, h ≤ n-1 *)
        m = max(m, 1 + algLIS(B[1..h]))
        Output m

Recursion for running time: T(n) ≤ 2T(n-1) + O(n). It is easy to see that T(n) is O(n 2^n).

Recursive Approach: Take 2

LIS(A[1..n]):
Case 1: the LIS does not contain a_n, in which case LIS(A[1..n]) = LIS(A[1..(n-1)]).
Case 2: the LIS contains a_n, in which case LIS(A[1..n]) is not so clear.

Observation: For the second case we want to find a subsequence in A[1..(n-1)] that is restricted to numbers less than a_n. This suggests a more general problem, LIS_smaller(A[1..n], x): the length of the longest increasing subsequence in A[1..n] with all numbers in the subsequence less than x.

    LIS_smaller(A[1..n], x):
        if (n = 0) then return 0
        m = LIS_smaller(A[1..(n-1)], x)
        if (A[n] < x) then
            m = max(m, 1 + LIS_smaller(A[1..(n-1)], A[n]))
        Output m

    LIS(A[1..n]):
        return LIS_smaller(A[1..n], ∞)

Recursion for running time: T(n) ≤ 2T(n-1) + O(1).

Question: Is there any advantage?
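
For concreteness, the take-2 recursion in Python, with float('inf') playing the role of the ∞ sentinel (still exponential as written; the advantage appears once we memoize):

    def lis_smaller(A, n, x):
        # Longest increasing subsequence of A[0..n-1] with all values < x.
        if n == 0:
            return 0
        m = lis_smaller(A, n - 1, x)                       # case: A[n-1] not used
        if A[n - 1] < x:                                   # case: A[n-1] ends the LIS
            m = max(m, 1 + lis_smaller(A, n - 1, A[n - 1]))
        return m

    def lis(A):
        return lis_smaller(A, len(A), float('inf'))

    print(lis([6, 3, 5, 2, 7, 8, 1]))  # 4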

Recursive Algorithm: Take 2

Observation: The number of distinct subproblems generated by LIS_smaller(A[1..n], x) is O(n^2). Memoizing the recursive algorithm therefore leads to an O(n^2) running time!

Question: What are the recursive subproblems generated by LIS_smaller(A[1..n], x)? For 0 ≤ i < n: LIS_smaller(A[1..i], y), where y is either x or one of A[i+1], ..., A[n].

Observation: the previous recursion also generates only O(n^2) subproblems. (Slightly harder to see.)

Recursive Algorithm: Take 3

LISEnding(A[1..n]): length of the longest increasing subsequence that ends in A[n].

Question: can we obtain a recursive expression?

    LISEnding(A[1..n]) = max_{i : A[i] < A[n]} ( 1 + LISEnding(A[1..i]) )

Recursive Algorithm: Take 3

    LIS_ending(A[1..n]):
        if (n = 0) return 0
        m = 1
        for i = 1 to n-1 do
            if (A[i] < A[n]) then
                m = max(m, 1 + LIS_ending(A[1..i]))
        return m

    LIS(A[1..n]):
        return max_{i=1..n} LIS_ending(A[1..i])

Question: How many distinct subproblems are generated by LISEnding(A[1..n])? Only n.

Iterative Algorithm via Memoization

Compute the values LISEnding(A[1..i]) iteratively in a bottom-up fashion.

    LIS_ending(A[1..n]):
        Array L[1..n]  (* L[i] = value of LIS_ending(A[1..i]) *)
        for i = 1 to n do
            L[i] = 1
            for j = 1 to i-1 do
                if (A[j] < A[i]) then
                    L[i] = max(L[i], 1 + L[j])
        return L

    LIS(A[1..n]):
        L = LIS_ending(A[1..n])
        return the maximum value in L

Iterative Algorithm via Memoization

Simplifying:

    LIS(A[1..n]):
        Array L[1..n]  (* L[i] stores the value LIS_ending(A[1..i]) *)
        m = 0
        for i = 1 to n do
            L[i] = 1
            for j = 1 to i-1 do
                if (A[j] < A[i]) then
                    L[i] = max(L[i], 1 + L[j])
            m = max(m, L[i])
        return m

Correctness: via induction, following the recursion.
Running time: O(n^2). Space: Θ(n).

Example
Sequence: 6, 3, 5, 2, 7, 8, 1
Longest increasing subsequence: 3, 5, 7, 8
L[i] is the value of the longest increasing subsequence ending in A[i].
The recursive algorithm computes L[i] from L[1], ..., L[i-1].
The iterative algorithm builds up the values from L[1] to L[n].
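
The simplified iterative algorithm in Python (0-indexed; a direct transcription of the pseudocode above):

    def lis_iterative(A):
        L = [1] * len(A)                  # L[i] = length of the LIS ending at A[i]
        for i in range(len(A)):
            for j in range(i):
                if A[j] < A[i]:
                    L[i] = max(L[i], 1 + L[j])
        return max(L, default=0)

    print(lis_iterative([6, 3, 5, 2, 7, 8, 1]))  # 4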

Memoizing LIS_smaller

    LIS(A[1..n]):
        A[n+1] = ∞  (* add a sentinel at the end *)
        Array L[(n+1), (n+1)]  (* two-dimensional array *)
        (* L[i, j] for j ≥ i stores the value LIS_smaller(A[1..i], A[j]) *)
        for j = 1 to n+1 do
            L[0, j] = 0
        for i = 1 to n+1 do
            for j = i to n+1 do
                L[i, j] = L[i-1, j]
                if (A[i] < A[j]) then
                    L[i, j] = max(L[i, j], 1 + L[i-1, i])
        return L[n, n+1]

Correctness: via induction, following the recursion (take 2).
Running time: O(n^2). Space: Θ(n^2).

Longest Increasing Subsequence: Another Way to Get a Quadratic-Time Algorithm

Let G = ({s, 1, ..., n}, E) be a directed graph:
For all i, j: if i < j and A[i] < A[j], add the edge i → j to G.
For all i: add the edge s → i.
The graph G is a DAG. An LIS corresponds to a longest path in G starting at s. We know how to compute this in O(|V(G)| + |E(G)|) = O(n^2) time.

Comment: One can compute an LIS in O(n log n) time with a bit more work.
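
The O(n log n) algorithm alluded to in the comment is usually done with a "tails" array and binary search; the slides do not spell it out, so the following patience-sorting-style sketch is an assumption:

    from bisect import bisect_left

    def lis_fast(A):
        tails = []                        # tails[k] = smallest tail of an
        for a in A:                       # increasing subsequence of length k+1
            k = bisect_left(tails, a)     # first position with tails[k] >= a
            if k == len(tails):
                tails.append(a)           # a extends the longest subsequence
            else:
                tails[k] = a              # a gives a smaller tail for length k+1
        return len(tails)

    print(lis_fast([6, 3, 5, 2, 7, 8, 1]))  # 4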

Dynamic Programming

Find a smart recursion for the problem in which the number of distinct subproblems is small: polynomial in the original problem size.

Eliminate recursion and find an iterative algorithm that computes the subproblems bottom-up, storing the intermediate values in an appropriate data structure; one needs to find the right order in which to evaluate the subproblems.

Estimate the number of subproblems, the time to evaluate each subproblem, and the space needed to store the values. Evaluate the total running time.

Optimize the resulting algorithm further.

Part II: Weighted Interval Scheduling

Weighted Interval Scheduling

Input: A set of jobs with start times, finish times, and weights (or profits).
Goal: Schedule jobs so that the total weight of scheduled jobs is maximized. Two jobs with overlapping intervals cannot both be scheduled!

(Figure: an instance with overlapping weighted intervals.)

Interval Scheduling

Input: A set of jobs with start and finish times, to be scheduled on a resource; this is the special case where all jobs have weight 1.
Goal: Schedule as many jobs as possible.

The greedy strategy of considering jobs in order of finish time produces an optimal schedule (to be seen later).

Greedy Strategies

Earliest finish time first.
Largest weight/profit first.
Largest weight-to-length ratio first.
Shortest length first.
...

None of these strategies leads to an optimal solution. Moral: greedy strategies often don't work!

Reduction to the Max Weight Independent Set Problem

Given a weighted interval scheduling instance I, create an instance of max weight independent set on a graph G(I) as follows: for each interval i create a vertex v_i with weight w_i, and add an edge between v_i and v_j whenever intervals i and j overlap.

Claim: the max weight independent set in G(I) has weight equal to the max weight of a set of pairwise non-overlapping intervals in I.

We do not know an efficient (polynomial-time) algorithm for independent set! Can we take advantage of the interval structure to find an efficient algorithm?

Conventions

Definition: Let the requests be sorted according to finish time, i.e., i < j implies f_i ≤ f_j. Define p(j) to be the largest i < j such that job i and job j are not in conflict (p(j) = 0 if there is no such i).

Example

    job j:   1  2  3  4  5  6
    v_j:     2  4  4  7  2  1
    p(j):    0  0  1  0  3  3

Towards a Recursive Solution

Observation: Consider an optimal schedule O.

Case n ∈ O: None of the jobs between p(n) and n can be scheduled. Moreover, O must contain an optimal schedule for the first p(n) jobs.
Case n ∉ O: O is an optimal schedule for the first n-1 jobs.
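
Returning to the conventions above: since the jobs are sorted by finish time, all p(j) values can be computed in O(n log n) total by binary searching the finish times. A sketch (the function name and list representation are assumptions, not from the slides):

    from bisect import bisect_right

    def compute_p(start, finish):
        # start[j-1], finish[j-1] are job j's times; finish is sorted.
        # p(j) = number of jobs finishing no later than job j starts,
        # which equals the largest i with f_i <= s_j (0 if none),
        # assuming every job starts strictly before it finishes.
        return [bisect_right(finish, s) for s in start]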

A Recursive Algorithm

Let O_i be the value of an optimal schedule for the first i jobs.

    Schedule(n):
        if n = 0 then return 0
        if n = 1 then return w(v_1)
        O_{p(n)} = Schedule(p(n))
        O_{n-1} = Schedule(n-1)
        if (O_{p(n)} + w(v_n) < O_{n-1}) then
            O_n = O_{n-1}
        else
            O_n = O_{p(n)} + w(v_n)
        return O_n

Time Analysis: the running time satisfies T(n) = T(p(n)) + T(n-1) + O(1), which is ...

Bad Example

(Figure: a bad instance for the recursive algorithm, one where p(n) = n-2 for every job, as the recurrence below indicates.)

The running time on this instance is T(n) = T(n-1) + T(n-2) + O(1) = Θ(φ^n), where φ ≈ 1.618 is the golden ratio.

Analysis of the Problem

(Figure: the tree of subproblems; the label of each node indicates the size of its subproblem. The root n has children n-1 and n-2, each of which branches similarly.)

The tree of subproblems grows very quickly.

Memo(r)ization

Observation: The number of distinct subproblems in the recursive algorithm is O(n); they are O_1, O_2, ..., O_{n-1}. The exponential running time is due to recomputation of solutions to subproblems.

Solution: Store the optimal solutions to the different subproblems, and perform a recursive call only if the value has not already been computed.

Recursive Solution with Memoization

    schdIMem(j):
        if j = 0 then return 0
        if M[j] is defined then  (* subproblem already solved *)
            return M[j]
        M[j] = max( w(v_j) + schdIMem(p(j)), schdIMem(j-1) )
        return M[j]

Time Analysis: Each invocation takes O(1) time, plus: either it returns an already-computed value, or it generates two recursive calls and fills one entry of M[]. Initially no entry of M[] is filled; at the end all entries of M[] are filled. So the total time is O(n).

Automatic Memoization

Fact: Many functional languages (like LISP) automatically do memoization for recursive function calls!
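
Python offers the same convenience through functools.lru_cache, which memoizes a plain recursive function automatically; a sketch, assuming w and p are given as 1-indexed lists with a dummy entry at index 0:

    from functools import lru_cache

    def schedule_value(w, p):
        @lru_cache(maxsize=None)
        def M(j):
            if j == 0:
                return 0
            return max(w[j] + M(p[j]), M(j - 1))   # include job j, or skip it
        return M(len(w) - 1)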

Back to Weighted Interval Scheduling: Iterative Solution

    M[0] = 0
    for i = 1 to n do
        M[i] = max( w(v_i) + M[p(i)], M[i-1] )

M is the table of subproblems. Implicitly, dynamic programming fills in the values of M; the recursion determines the order in which the table is filled up. Think of decomposing the problem first (the recursion) and then worry about setting up the table: this comes naturally from the recursion.

Example

(Figure: five weighted intervals, with weights 30 (job 1), 20 (job 2), 70 (job 3), 80 (job 4), and 10 (job 5).)

p(5) = 2, p(4) = 1, p(3) = 1, p(2) = 0, p(1) = 0
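
The iterative table fill in Python, run on the weights and p-values read off the example above (the exact interval endpoints are not recoverable from the transcript, so the weight-to-job assignment is a best-effort reading of the figure):

    def schedule_values(w, p):
        # w[1..n]: weights, p[1..n]: p-values; index 0 is a dummy entry.
        n = len(w) - 1
        M = [0] * (n + 1)
        for i in range(1, n + 1):
            M[i] = max(w[i] + M[p[i]], M[i - 1])
        return M

    w = [0, 30, 20, 70, 80, 10]
    p = [0, 0, 0, 1, 1, 2]
    print(schedule_values(w, p))   # [0, 30, 30, 100, 110, 110]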

Computing Solutions: First Attempt

Memoization + recursion/iteration allows one to compute the optimal value. What about the actual schedule?

    M[0] = 0
    S[0] = empty schedule
    for i = 1 to n do
        M[i] = max( w(v_i) + M[p(i)], M[i-1] )
        if w(v_i) + M[p(i)] < M[i-1] then
            S[i] = S[i-1]
        else
            S[i] = S[p(i)] ∪ {i}

Naïvely updating S[] takes O(n) time per step, so the total running time is O(n^2). Using pointers, the running time can be improved to O(n).

Computing Implicit Solutions

Observation: The solution can be obtained from M[] in O(n) time, without any additional information.

    findSolution(j):
        if (j = 0) then return empty schedule
        if (w(v_j) + M[p(j)] > M[j-1]) then
            return findSolution(p(j)) ∪ {j}
        else
            return findSolution(j-1)

findSolution makes O(n) recursive calls, so it runs in O(n) time.

Computing Implicit Solutions

A generic strategy for computing solutions in dynamic programming: keep track of the decision made in computing the optimum value of each subproblem. The decision space depends on the recursion. Once the optimum values are computed, go back and use the decision values to compute an optimum solution.

Question: What is the decision in computing M[i]?
Answer: whether to include i or not.

    M[0] = 0
    for i = 1 to n do
        M[i] = max( w(v_i) + M[p(i)], M[i-1] )
        if (w(v_i) + M[p(i)] > M[i-1]) then
            Decision[i] = 1   (* 1: i included in solution M[i] *)
        else
            Decision[i] = 0   (* 0: i not included in solution M[i] *)
    S = ∅, i = n
    while (i > 0) do
        if (Decision[i] = 1) then
            S = S ∪ {i}
            i = p(i)
        else
            i = i - 1
    return S
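
The same algorithm with the decision array, in Python (a sketch continuing the conventions of the previous snippets):

    def schedule_with_solution(w, p):
        n = len(w) - 1
        M = [0] * (n + 1)
        decision = [0] * (n + 1)        # 1 iff job i is in the solution for M[i]
        for i in range(1, n + 1):
            if w[i] + M[p[i]] > M[i - 1]:
                M[i], decision[i] = w[i] + M[p[i]], 1
            else:
                M[i] = M[i - 1]
        S, i = set(), n                 # trace the decisions backwards
        while i > 0:
            if decision[i]:
                S.add(i)
                i = p[i]
            else:
                i -= 1
        return M[n], S

    print(schedule_with_solution([0, 30, 20, 70, 80, 10], [0, 0, 0, 1, 1, 2]))
    # (110, {1, 4})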