Dynamic Programming. Nothing to do with dynamic and nothing to do with programming.


Dynamic Programming

Deliverables: Dynamic Programming basics, Binomial Coefficients, Weighted Interval Scheduling, Matrix Multiplication, 0/1 Knapsack, Longest Common Subsequence

6/12/2012 6:56 PM copyright @ gdeepak.com

Dynamic Programming: nothing to do with "dynamic" and nothing to do with "programming". A more descriptive name would be "bottom-up approach without repeated calculations".

Dynamic Programming paradigm. The difference between Dynamic Programming and Divide and Conquer is that the subproblems in D&C are disjoint and distinct, while in DP they overlap. Dynamic programming always gives an optimal solution (when the problem has optimal substructure), whereas the greedy paradigm fails on some instances. Solve subproblems starting from the trivial case and save their solutions in memory; in the end we obtain the solution of the whole problem. Optimal substructure: the solution to the problem must be a composition of subproblem solutions.

Dynamic Programming paradigm. Divide and Conquer is top-down while DP is bottom-up. DP works when the number of subproblems is polynomial in the size of the input. Plain recursion grows a call stack, while iterative DP fills its table in a fixed order. In many cases we start with the base case, and upper-level subproblems are built after doing some work on lower levels.
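The top-down/bottom-up contrast can be sketched with the Fibonacci recurrence F(n) = F(n-1) + F(n-2) that reappears in the analysis below. A minimal sketch; the function names are illustrative:

```python
from functools import lru_cache

# Top-down: plain recursion plus memoization; each subproblem is
# computed once and cached, later calls hit the cache.
@lru_cache(maxsize=None)
def fib_memo(n):
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

# Bottom-up: start from the base cases F(0), F(1) and iterate upward;
# no recursion stack is needed, and only two values are kept.
def fib_bottom_up(n):
    if n < 2:
        return n
    prev, curr = 0, 1
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr
```

Both run in linear time; the naive recursion without the cache would be exponential.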

Pascal triangle or Binomial Coefficients (figures: Pascal's triangle built row by row)

Binomial Coefficients. C(n, m), where n is the row of Pascal's triangle and m is the column, satisfies C(n, m) = C(n-1, m) + C(n-1, m-1). Recursive algorithm: B(n, m) { if m = 0 or m = n then return 1 else return B(n-1, m) + B(n-1, m-1) }

Analysis. The recurrence is T(size) = T(size-1) + T(size-2), which looks like the Fibonacci equation. The nth Fibonacci number grows exponentially, so the recursive algorithm performs an exponential number of calculations. To calculate B(6,3) it will calculate B(3,2) 3 times, B(3,1) 3 times, B(2,1) 6 times, B(1,1) 6 times and B(1,0) 6 times. The coefficient can also be computed iteratively from factorials in linear time.
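The repeated calculations disappear if we build Pascal's triangle bottom-up, one row at a time. A minimal sketch:

```python
# Bottom-up Pascal's triangle: row n is built from row n-1, so each
# entry B(n, m) is computed exactly once instead of exponentially often.
def binomial(n, m):
    row = [1]  # row 0 of Pascal's triangle
    for _ in range(n):
        # each interior entry is the sum of the two entries above it
        row = [1] + [row[i] + row[i + 1] for i in range(len(row) - 1)] + [1]
    return row[m]
```

For B(6,3) this touches each of the 28 entries in rows 0..6 once, versus the exponential call tree of the naive recursion.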

Weighted Interval Scheduling. The greedy algorithm works if all weights are 1: consider jobs in ascending order of finish time and add a job to the subset if it is compatible with the previously chosen jobs. Greedy fails once arbitrary weights are allowed: a job a of weight 1 can block a conflicting job b of weight 999.

Weighted Interval Scheduling. Label jobs by finish time: f_1 <= f_2 <= ... <= f_n. Define p(j) = largest index i < j such that job i is compatible with j. In the figure (eight jobs on a timeline from 0 to 11): p(8) = 5, p(7) = 3, p(2) = 0.

Dynamic Programming: Key Technique. OPT(j) = value of the optimal solution to the problem consisting of job requests 1, 2, ..., j. Case 1: OPT selects job j; it cannot use the incompatible jobs {p(j)+1, p(j)+2, ..., j-1} and must include an optimal solution to the problem on the remaining compatible jobs 1, 2, ..., p(j). Case 2: OPT does not select job j; it must include an optimal solution to the problem on jobs 1, 2, ..., j-1. Hence OPT(j) = 0 if j = 0, and OPT(j) = max(v_j + OPT(p(j)), OPT(j-1)) otherwise.

WIS Brute Force. Input: n, s_1..s_n, f_1..f_n, v_1..v_n. Sort jobs by finish times so that f_1 <= f_2 <= ... <= f_n. Compute p(1), p(2), ..., p(n). Compute-opt(j) { if j = 0 then return 0 else return max(v_j + Compute-opt(p(j)), Compute-opt(j-1)) }

Example of Weighted Interval Scheduling (figure)

Exponential Growth (figure)

Worst Case Instance (figure)

WIS Algorithm. Input: n, s_1..s_n, f_1..f_n, v_1..v_n. Sort jobs by finish times so that f_1 <= f_2 <= ... <= f_n. Compute p(1), p(2), ..., p(n); initialize all M[j] to empty. Memo-WIS(j) { if j = 0 then return 0 else if M[j] is not empty then return M[j] else M[j] = max(v_j + Memo-WIS(p(j)), Memo-WIS(j-1)); return M[j] }

WIS Improved. Input: n, s_1..s_n, f_1..f_n, v_1..v_n. Sort jobs by finish times so that f_1 <= f_2 <= ... <= f_n. Compute p(1), p(2), ..., p(n). Iterative-WIS { M[0] = 0; for j = 1, 2, ..., n: M[j] = max(v_j + M[p(j)], M[j-1]) }
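The three steps (sort by finish time, compute p, fill M left to right) can be sketched in runnable form. The binary search for p(j) and the example jobs in the usage below are illustrative, not from the slides:

```python
import bisect

def weighted_interval_scheduling(jobs):
    """jobs: list of (start, finish, value) triples; returns optimal value."""
    jobs = sorted(jobs, key=lambda j: j[1])        # sort by finish time
    finishes = [f for _, f, _ in jobs]
    n = len(jobs)
    # p[j] = largest index i < j (1-indexed) with finish_i <= start_j,
    # found by binary search over the sorted finish times
    p = [0] * (n + 1)
    for j in range(1, n + 1):
        start_j = jobs[j - 1][0]
        p[j] = bisect.bisect_right(finishes, start_j, 0, j - 1)
    # bottom-up table: M[j] = optimal value using jobs 1..j
    M = [0] * (n + 1)
    for j in range(1, n + 1):
        v = jobs[j - 1][2]
        M[j] = max(v + M[p[j]], M[j - 1])
    return M[n]
```

On the greedy counterexample (a long job of weight 1 overlapping a short job of weight 999) this correctly returns 999.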

Example with Iterations (figure)

Complexity Analysis. Running time is O(n log n); if we exclude the sort by finish time, the remaining work is only O(n).

Multiplying two matrices. To multiply two matrices A and B, the number of columns of A must equal the number of rows of B. If A is a p x q matrix and B is a q x r matrix, the resulting matrix C is p x r, and the time to compute C is dominated by the number of scalar multiplications, which is pqr. Compute A = A_0 * A_1 * ... * A_{n-1}, where A_i is d_i x d_{i+1}. Problem: how to parenthesize? Example: B is 3 x 100, C is 100 x 5, D is 5 x 5. (B*C)*D takes 1500 + 75 = 1575 ops, while B*(C*D) takes 1500 + 2500 = 4000 ops.

Matrix Multiplication. We want to multiply matrices A x B x C x D x E. We could parenthesize in many ways, e.g. (A x (B x (C x (D x E)))) or ((((A x B) x C) x D) x E), and each way costs a different number of scalar multiplications. Calculate in advance the cost of each adjacent pair AB, BC, CD, DE; use those to find the cheapest way to form ABC, BCD, CDE, then ABCD and BCDE, and from those derive the best way to form ABCDE.

Example. Let the matrices have dimensions 1x4, 4x5, 5x3, 3x6, 6x7, 7x1.

Example (figure: cost table for the chain, filled diagonal by diagonal)

Time complexity. In the 0th (base) row of the table there are no calculations. In the 1st row: 1*n; in the 2nd row: 2*(n-1); in the 3rd row: 3*(n-2); ...; in the nth row: n*1. Adding all of these gives the sum over k = 1..n of k(n-k+1), which comes out to Theta(n^3) with constant 1/6.
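The Theta(n^3) tabulation above can be sketched as follows. The dimensions in the test are the B, C, D example from the earlier slide (assumed to be 3x100, 100x5, 5x5, which reproduces the 1575 vs. 4000 operation counts):

```python
# Matrix-chain order by DP: cost[i][j] = minimum scalar multiplications
# to compute A_i ... A_j, where A_k has dimensions dims[k] x dims[k+1].
def matrix_chain_cost(dims):
    n = len(dims) - 1                      # number of matrices in the chain
    cost = [[0] * n for _ in range(n)]     # chains of length 1 cost 0
    for length in range(2, n + 1):         # chain length = row of the table
        for i in range(n - length + 1):
            j = i + length - 1
            cost[i][j] = min(
                cost[i][k] + cost[k + 1][j] + dims[i] * dims[k + 1] * dims[j + 1]
                for k in range(i, j)       # try every split point k
            )
    return cost[0][n - 1]
```

Three nested loops over lengths, start positions, and split points give the Theta(n^3) bound derived above.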

Keeping track of the solution. We need to record the split point of each subproblem and then walk backward from the final solution; recovering the actual parenthesization is naturally recursive. For example: (M1 M2) * (M3 M4 M5) -> ((M1)(M2)) * ((M3)(M4 M5)) -> ((M1)(M2)) * ((M3)((M4)(M5))).

0/1 knapsack: Recursive Formulation. Let S_k be the set of items numbered 1 to k, and define B[k, w] = value of the best selection from S_k with total weight at most w. This does have subproblem optimality: B[k, w] = B[k-1, w] if w_k > w; otherwise B[k, w] = max(B[k-1, w], B[k-1, w-w_k] + b_k).

Running Time. Running time is O(nW). This is not a polynomial-time algorithm if W is large; it is a pseudo-polynomial time algorithm. Algorithm 01Knapsack(S, W): Input: set S of n items with benefit b_i and weight w_i; maximum weight W. Output: benefit of the best subset with weight at most W. for w <- 0 to W do B[w] <- 0; for k <- 1 to n do for w <- W downto w_k do if B[w-w_k] + b_k > B[w] then B[w] <- B[w-w_k] + b_k
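A runnable version of the one-dimensional table. The items in the test are a best-effort reading of the worked example on the following slides (extraction dropped digits; the (weight, profit) pairs are assumed to be (4,20), (2,3), (2,6), (6,25), (2,80) with W = 9):

```python
# 0/1 knapsack with a 1-D table. The inner loop runs right-to-left
# (W downto w_k) so that each item is used at most once per row update.
def knapsack(items, W):
    """items: list of (weight, benefit) pairs; returns best total benefit."""
    B = [0] * (W + 1)
    for wk, bk in items:
        for w in range(W, wk - 1, -1):
            if B[w - wk] + bk > B[w]:
                B[w] = B[w - wk] + bk
    return B[W]
```

With the assumed items the optimum is 106 (items 1, 3 and 5: weight 4+2+2 = 8 <= 9, profit 20+6+80 = 106), matching the final table.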

Initial Setting of the 0/1 knapsack

w:    0  1  2  3  4  5  6  7  8  9
B[w]: 0  0  0  0  0  0  0  0  0  0

Items (weight, profit): 1: (4,20), 2: (2,3), 3: (2,6), 4: (6,25), 5: (2,80); W = 9

Considering Item 1 (weight 4, profit 20)

w:    0  1  2  3  4  5  6  7  8  9
B[w]: 0  0  0  0 20 20 20 20 20 20

Considering Item 2 (weight 2, profit 3)

w:    0  1  2  3  4  5  6  7  8  9
B[w]: 0  0  3  3 20 20 23 23 23 23

Considering Item 3 (weight 2, profit 6)

w:    0  1  2  3  4  5  6  7  8  9
B[w]: 0  0  6  6 20 20 26 26 29 29

Considering Item 4 (weight 6, profit 25)

w:    0  1  2  3  4  5  6  7  8  9
B[w]: 0  0  6  6 20 20 26 26 31 31

Considering Item 5 (weight 2, profit 80)

w:    0  1  2  3  4  5  6  7  8   9
B[w]: 0  0 80 80 86 86 100 100 106 106

Time Complexity. Complexity is Theta(nM), where M is the capacity of the knapsack. Since M is encoded in s bits, this can be written as Theta(n * 2^s) in terms of input size; the algorithm is pseudo-polynomial, i.e., exponential in the size of the input, not polynomial.

Longest common subsequence (LCS). For a sequence X = x_1, x_2, ..., x_n, a subsequence is a subset of the sequence defined by a set of increasing indices (i_1, i_2, ..., i_k) where 1 <= i_1 < i_2 < ... < i_k <= n. For X = A B A C D A B A B: is ABA a subsequence? Yes. ACA? Yes. DCA? No: the only C precedes the only D. AADAA? Yes.

LCS problem. Given two sequences X and Y, a common subsequence is a subsequence that occurs in both X and Y. Given X = x_1, x_2, ..., x_n and Y = y_1, y_2, ..., y_m, what is the longest common subsequence? For X = A B C B D A B and Y = B D C A B A, one LCS is B C B A, of length 4.

Step 1: Define the problem with respect to subproblems. X = A B C B D A B, Y = B D C A B A. Is the last character of each sequence part of the LCS? Two cases: either the last characters are the same or they are different.

Step 1 (continued). If the last characters are the same (here X and Y both end in A), they are part of the LCS: LCS(X, Y) = LCS(X_{1..n-1}, Y_{1..m-1}) + x_n.

Step 1 (continued). If the last characters are different, the LCS may drop the last character of X: LCS(X, Y) = LCS(X_{1..n-1}, Y).

Step 1 (continued). Alternatively, if the last characters are different, the LCS may drop the last character of Y: LCS(X, Y) = LCS(X, Y_{1..m-1}).

Step 1 (continued). When the last characters are different, we take the better of the two options: dropping the last character of X or dropping the last character of Y.

Step 1 (summary). The recurrence for the LCS length:

LCS(X, Y) = 1 + LCS(X_{1..n-1}, Y_{1..m-1})                if x_n = y_m
LCS(X, Y) = max(LCS(X_{1..n-1}, Y), LCS(X, Y_{1..m-1}))    otherwise

Step 2: Build the solution from the bottom up. What types of subproblem solutions do we need to store? LCS(X_{1..j}, Y_{1..k}), indexed by two different indices, so we store a two-dimensional table:

LCS[i, j] = 1 + LCS[i-1, j-1]                  if x_i = y_j
LCS[i, j] = max(LCS[i-1, j], LCS[i, j-1])      otherwise

Filling the table for X = A B C B D A B (rows) and Y = B D C A B A (columns), using LCS[i, j] = 1 + LCS[i-1, j-1] if x_i = y_j, and max(LCS[i-1, j], LCS[i, j-1]) otherwise. Row 0 and column 0 (the empty prefixes) are all 0. Sample entries along the way: LCS(A, B) = 0; LCS(A, BDCA) = 1; LCS(ABCB, BDCAB) = 3. The completed table:

        B  D  C  A  B  A
     0  0  0  0  0  0  0
  A  0  0  0  0  1  1  1
  B  0  1  1  1  1  2  2
  C  0  1  1  2  2  2  2
  B  0  1  1  2  2  3  3
  D  0  1  2  2  2  3  3
  A  0  1  2  2  3  3  4
  B  0  1  2  2  3  4  4

Where's the final answer? In the bottom-right corner: LCS[7, 6] = 4.
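The table-filling procedure above is short in runnable form; a minimal sketch:

```python
# Bottom-up LCS length: one extra row and column of zeros represent
# the empty prefixes, then fill row by row with the recurrence.
def lcs_length(X, Y):
    n, m = len(X), len(Y)
    L = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if X[i - 1] == Y[j - 1]:
                L[i][j] = 1 + L[i - 1][j - 1]          # characters match
            else:
                L[i][j] = max(L[i - 1][j], L[i][j - 1])  # drop one character
    return L[n][m]
```

The answer sits in the bottom-right cell; the running time is O(nm).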

Questions, Comments and Suggestions

Question 1. Any dynamic programming algorithm with n subproblems will run in O(n) time — true or false?

Question 2. Memoization is the basis for a top-down alternative to the usual bottom-up version of dynamic programming — true or false?

Question 3. The Floyd-Warshall algorithm solves the all-pairs shortest-paths problem using dynamic programming — true or false?