Algorithms: Dynamic Programming


Algorithms: Dynamic Programming Amotz Bar-Noy CUNY Spring 2012 Amotz Bar-Noy (CUNY) Dynamic Programming Spring 2012 1 / 58

Dynamic Programming. General strategy: solve the problem top-down recursively, based on solutions to sub-problems; then find a bottom-up order in which to compute all the sub-problems. Time complexity: if there is a polynomial number of sub-problems, and each sub-problem can be computed in polynomial time, then the solution can be found in polynomial time. Remark: Greedy also applies a top-down strategy, but usually to a single sub-problem, so the order of computation is clear.

Fibonacci Numbers. The sequence: 0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, ... Recursive definition: F_k = 0 for k = 0; F_k = 1 for k = 1; F_k = F_{k-1} + F_{k-2} for k >= 2.

Fibonacci Numbers, Some Facts. F_k = (φ^k - φ̂^k)/√5. φ = (1+√5)/2 ≈ 1.618 (the golden ratio). φ̂ = (1-√5)/2 ≈ -0.618. φ and φ̂ are the roots of x² = x + 1. |φ̂| < 1 implies that F_k ≈ φ^k/√5. So k ≈ log_φ F_k = Θ(log F_k).
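As a quick sanity check (not part of the slides), the closed form can be verified numerically in Python; fib below is an iterative helper, not the lecture's pseudocode:

```python
from math import sqrt

# Numeric check of Binet's formula F_k = (phi^k - phihat^k) / sqrt(5),
# using the slide's definitions of phi and phihat.
phi = (1 + sqrt(5)) / 2
phihat = (1 - sqrt(5)) / 2

def fib(k):
    a, b = 0, 1
    for _ in range(k):
        a, b = b, a + b
    return a

for k in range(20):
    assert round((phi ** k - phihat ** k) / sqrt(5)) == fib(k)
```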

Top-Down Recursive Computation

Fibo(k)
  if k = 0 then return(0)
  if k = 1 then return(1)
  return(Fibo(k-1) + Fibo(k-2))

Memory complexity: Θ(k), because k is the maximum depth of the recursion.

The Recursion Tree for Fibo(5): F(5) calls F(4) and F(3); F(4) calls F(3) and F(2); each F(3) calls F(2) and F(1); each F(2) calls F(1) and F(0). The subtrees for F(3) and F(2) are recomputed multiple times.

Time Complexity I. Notation: T(k) = number of calls to Fibo excluding Fibo(0) and Fibo(1) = number of internal nodes in the recursion tree. Initial values: T(0) = T(1) = 0. Recursive formula: T(k) = 1 + T(k-1) + T(k-2). The sequence: 0, 0, 1, 2, 4, 7, 12, 20, ...

Time Complexity I. Guess: T(k) = F_{k+1} - 1 for k >= 0. Why? The recursive formula for T(k) is very similar to the formula for F_k. Proof by induction: T(0) = F_1 - 1 = 0; T(1) = F_2 - 1 = 0; T(k) = 1 + T(k-1) + T(k-2) = 1 + (F_k - 1) + (F_{k-1} - 1) = F_{k+1} - 1.
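The guess can also be checked mechanically; a small Python sketch (not from the slides) counts the internal calls of the naive recursion and compares against F_{k+1} - 1:

```python
# Count calls to the naive recursion, excluding the base cases,
# and compare with F_{k+1} - 1.
def fib(k):
    return k if k < 2 else fib(k - 1) + fib(k - 2)

def count_internal(k):
    """Calls to Fibo excluding Fibo(0), Fibo(1): the internal nodes."""
    if k < 2:
        return 0
    return 1 + count_internal(k - 1) + count_internal(k - 2)

for k in range(12):
    assert count_internal(k) == fib(k + 1) - 1
```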

Time Complexity II. Notation: T(k) = number of calls to Fibo including Fibo(0) and Fibo(1) = number of all nodes in the recursion tree. Initial values: T(0) = T(1) = 1. Recursive formula: T(k) = 1 + T(k-1) + T(k-2). The sequence: 1, 1, 3, 5, 9, 15, 25, 41, ...

Time Complexity II. Solution: T(k) = 2F_{k+1} - 1 for k >= 0. Why? The number of leaves in a binary tree is one more than the number of internal nodes. Proof by induction: T(0) = 2F_1 - 1 = 1; T(1) = 2F_2 - 1 = 1; T(k) = 1 + T(k-1) + T(k-2) = 1 + (2F_k - 1) + (2F_{k-1} - 1) = 2F_{k+1} - 1.

Bottom-Up Dynamic Programming

Fibo(k)
  F(0) = 0
  F(1) = 1
  for i = 2 to k do
    F(i) = F(i-1) + F(i-2)
  return(F(k))

Time and memory complexity: Θ(k) = Θ(log F_k).

Bottom-Up Dynamic Programming: the recursion tree collapses, since each of F(0), F(1), F(2), F(3), F(4), F(5) is now computed only once.

Saving Memory

Fibo(k)
  if k = 0 then return(0)
  if k = 1 then return(1)
  x = 0
  y = 1
  for i = 2 to k do
    z = x + y
    x = y
    y = z
  return(z)

Time complexity: Θ(k) = Θ(log F_k). Memory complexity: Θ(1).
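In Python, the same constant-memory loop might look like this (a sketch, not the lecture's code):

```python
# O(1)-memory Fibonacci: keep only the last two values.
def fibo(k):
    x, y = 0, 1  # F(0), F(1)
    for _ in range(2, k + 1):
        x, y = y, x + y  # slide the window: (F(i-2), F(i-1)) -> (F(i-1), F(i))
    return x if k == 0 else y
```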

Top-Down Dynamic Programming

Fibo(k)
  Fibo(0) = 0
  Fibo(1) = 1
  for i = 2 to k do
    Fibo(i) = undefined
  return(F(k))   (* the recursion *)

F(k)
  if Fibo(k) is defined then return(Fibo(k))
  Fibo(k) = F(k-1) + F(k-2)
  return(Fibo(k))

Top-Down Dynamic Programming. Time and memory complexity: Θ(k) = Θ(log F_k). Advantage: it is easier to imitate the recursive solution. Disadvantage: the same asymptotic complexity, but less efficient in both time and memory than the bottom-up loop.
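A Python sketch of the top-down memoized variant, using a dict as the table (names are illustrative, not the lecture's):

```python
# Top-down DP: store each Fibo(i) the first time it is computed.
def fibo_memo(k, table=None):
    if table is None:
        table = {0: 0, 1: 1}  # base cases pre-filled
    if k not in table:
        table[k] = fibo_memo(k - 1, table) + fibo_memo(k - 2, table)
    return table[k]
```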

Binomial Coefficients. Notation: C(n, k) = the number of size-k subsets of the set {1, 2, ..., n} = the coefficient of x^k in the expansion of (1 + x)^n. Recursive definition: C(n, k) = 1 for k = 0; C(n, k) = 1 for k = n; C(n, k) = C(n-1, k-1) + C(n-1, k) for 0 < k < n; C(n, k) = 0 otherwise.

The Pascal Triangle

1
1 1
1 2 1
1 3 3 1
1 4 6 4 1
1 5 10 10 5 1
1 6 15 20 15 6 1

Binomial Coefficients, Some Facts. C(n, k) = n! / (k!(n-k)!) = [n(n-1)···(n-k+1)] / [k(k-1)···1]. C(n, k) = C(n, n-k). (n/k)^k <= (n/k)·((n-1)/(k-1))···((n-k+1)/1) <= (n-k+1)^k, so (n/k)^k <= C(n, k) <= (n-k+1)^k. C(n, k) = Θ(n^k) for a fixed k.
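The bounds can be spot-checked in Python (a quick check, not from the slides):

```python
from math import comb

# (n/k)^k <= C(n, k) <= (n-k+1)^k for 1 <= k <= n.
for n in range(1, 13):
    for k in range(1, n + 1):
        assert (n / k) ** k <= comb(n, k) <= (n - k + 1) ** k
```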

Top-Down Recursive Computation

C(n, k)   (* integers 0 <= k <= n *)
  if k = 0 then return(1)
  if k = n then return(1)
  return(C(n-1, k-1) + C(n-1, k))

Memory complexity: Θ(n), because there exists a path of length n - 1 in the recursion tree (RR...RLL...L).

The Recursion Tree for C(5, 2): (5,2) calls (4,1) and (4,2); (4,1) calls (3,0) and (3,1); (4,2) calls (3,1) and (3,2); and so on down to the leaves of the form (i, 0) and (i, i). The sub-problem (3,1) appears twice and (2,1) three times.

Time Complexity I. Notation: T(n, k) = number of calls to C(i, j) excluding C(i, i) and C(i, 0) = number of internal nodes in the recursion tree. Initial values: T(i, i) = T(i, 0) = 0 for 0 <= i <= n. Recursive formula: T(n, k) = 1 + T(n-1, k-1) + T(n-1, k).

Time Complexity I - the Triangle

0
0 0
0 1 0
0 2 2 0
0 3 5 3 0
0 4 9 9 4 0
0 5 14 19 14 5 0

Time Complexity I. Guess: T(n, k) = C(n, k) - 1. Why? The recursive formula for T(n, k) is very similar to the formula for C(n, k). Proof by induction: T(i, i) = C(i, i) - 1 = 0; T(i, 0) = C(i, 0) - 1 = 0; T(n, k) = 1 + T(n-1, k-1) + T(n-1, k) = 1 + (C(n-1, k-1) - 1) + (C(n-1, k) - 1) = C(n, k) - 1.

Time Complexity II. Notation: T(n, k) = number of calls to C(i, j) including C(i, i) and C(i, 0) = number of all nodes in the recursion tree. Initial values: T(i, i) = T(i, 0) = 1 for 0 <= i <= n. Recursive formula: T(n, k) = 1 + T(n-1, k-1) + T(n-1, k).

Time Complexity II - the Triangle

1
1 1
1 3 1
1 5 5 1
1 7 11 7 1
1 9 19 19 9 1
1 11 29 39 29 11 1

Time Complexity II. Solution: T(n, k) = 2C(n, k) - 1. Why? The number of leaves in a binary tree is one more than the number of internal nodes. Proof by induction: T(i, i) = 2C(i, i) - 1 = 1; T(i, 0) = 2C(i, 0) - 1 = 1; T(n, k) = 1 + T(n-1, k-1) + T(n-1, k) = 1 + (2C(n-1, k-1) - 1) + (2C(n-1, k) - 1) = 2C(n, k) - 1.
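Again this is easy to check mechanically; a Python sketch (not from the slides) counting all calls of the naive recursion:

```python
from math import comb

# All calls to C(i, j), including the base cases C(i, 0) and C(i, i).
def count_calls(n, k):
    if k == 0 or k == n:
        return 1
    return 1 + count_calls(n - 1, k - 1) + count_calls(n - 1, k)

for n in range(9):
    for k in range(n + 1):
        assert count_calls(n, k) == 2 * comb(n, k) - 1
```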

Bottom-Up Dynamic Programming

C(n, k)   (* integers 0 <= k <= n *)
  for i = 0 to n do
    c(i, i) = 1
    c(i, 0) = 1
  for i = 2 to n do
    for j = 1 to min{i-1, k} do
      c(i, j) = c(i-1, j-1) + c(i-1, j)
  return(c(n, k))

Time and memory complexity: Θ(kn). Possible with only Θ(n) memory.
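The Θ(n)-memory remark can be realized by keeping only the current row of the Pascal triangle; a Python sketch (names are illustrative, not the lecture's):

```python
# Build the Pascal triangle one row at a time: row i is derived from
# row i-1 alone, so only Theta(n) memory is ever live.
def binom(n, k):
    if not (0 <= k <= n):
        return 0
    row = [1]  # row 0
    for i in range(1, n + 1):
        row = [1] + [row[j - 1] + row[j] for j in range(1, i)] + [1]
    return row[k]
```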

Bottom-Up Dynamic Programming: the recursion tree collapses into a DAG in which each sub-problem appears once: (5,2); (4,1), (4,2); (3,0), (3,1), (3,2); (2,0), (2,1), (2,2); (1,0), (1,1).

Top-Down Dynamic Programming

C(n, k)   (* integers 0 <= k <= n *)
  for i = 0 to n do
    c(i, i) = 1
    c(i, 0) = 1
  for i = 2 to n do
    for j = 1 to min{i-1, k} do
      c(i, j) = undefined
  return(CC(n, k))   (* the recursion *)

CC(n, k)
  if c(n, k) is undefined then c(n, k) = CC(n-1, k-1) + CC(n-1, k)
  return(c(n, k))

Time and memory complexity: Θ(kn).

The 0-1 Knapsack Problem. Input: n items I_1, ..., I_n, where the value of item I_i is v_i and its weight is w_i; a maximum weight W. Output: a set of items whose total weight is at most W and whose total value is maximum; the set either includes all of item I_i or none of it. Assumption: all the weights and W are positive integers.

Dynamic Programming. Sub-problems: v(i, w) for 0 <= i <= n and 0 <= w <= W, where v(i, w) is the maximum value achievable with items {I_1, ..., I_i} and maximum weight w. The final solution is v(n, W).

Dynamic Programming. Initial values: v(i, 0) = 0 for 0 <= i <= n; v(0, w) = 0 for 0 <= w <= W. Recursive formula: v(i, w) = v(i-1, w) if w_i > w; v(i, w) = v(i-1, w) if v(i-1, w) >= v_i + v(i-1, w - w_i); v(i, w) = v_i + v(i-1, w - w_i) if v(i-1, w) < v_i + v(i-1, w - w_i).

A Top-Down Recursive Implementation

Rec-Knapsack(i, w)
  if i = 0 then return(0)
  if w = 0 then return(0)
  if w < w_i then return(Rec-Knapsack(i-1, w))
  x = Rec-Knapsack(i-1, w)
  y = v_i + Rec-Knapsack(i-1, w - w_i)
  return(max{x, y})

A Top-Down Recursive Implementation. Initial call: Rec-Knapsack(n, W). Time complexity: could be exponential in n. Memory complexity: Θ(n), because the depth of the recursion is at most n.

A Bottom-Up Implementation

Knapsack(W, (v_1, w_1), ..., (v_n, w_n))
  for i = 0 to n do v(i, 0) = 0
  for w = 0 to W do v(0, w) = 0
  for i = 1 to n do
    for w = 1 to W do
      if w >= w_i and v_i + v(i-1, w - w_i) > v(i-1, w)
        then v(i, w) = v_i + v(i-1, w - w_i)
        else v(i, w) = v(i-1, w)
  return(v(n, W))

A Bottom-Up Implementation. Time and memory complexity: Θ(nW). Remark: this is a weakly polynomial algorithm, since the complexity depends on the value of the input (W) and not only on the size of the input (n).

Finding the Optimal Set of Items. Definition: B(i, w) is true iff item I_i is included in the optimal set for the sub-problem v(i, w). Implementation: if w >= w_i and v_i + v(i-1, w - w_i) > v(i-1, w) then v(i, w) = v_i + v(i-1, w - w_i) and B(i, w) = true; else v(i, w) = v(i-1, w) and B(i, w) = false. Complexity: the same as for computing v(i, w).

Finding the Optimal Set of Items

Find-S(B(·,·), W, w_1, ..., w_n)
  S = ∅
  w = W
  for i = n downto 1 do
    if B(i, w) is true then
      S = S ∪ {I_i}
      w = w - w_i
  return(S)

Output: S, the optimal set of items. Time complexity: Θ(n).
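Putting the table and the traceback together, a self-contained Python sketch (the items and names are illustrative, not the lecture's). Instead of storing B(i, w) explicitly, item I_i is recognized as taken whenever v(i, w) differs from v(i-1, w), which is equivalent:

```python
# Bottom-up 0-1 knapsack plus reconstruction of the optimal set.
def knapsack(items, W):
    """items: list of (value, weight); returns (best value, chosen indices 1..n)."""
    n = len(items)
    v = [[0] * (W + 1) for _ in range(n + 1)]
    for i, (vi, wi) in enumerate(items, start=1):
        for w in range(1, W + 1):
            if w >= wi and vi + v[i - 1][w - wi] > v[i - 1][w]:
                v[i][w] = vi + v[i - 1][w - wi]
            else:
                v[i][w] = v[i - 1][w]
    # Traceback: item I_i was taken iff it changed the table entry.
    S, w = [], W
    for i in range(n, 0, -1):
        if v[i][w] != v[i - 1][w]:
            S.append(i)
            w -= items[i - 1][1]
    return v[n][W], sorted(S)
```

For example, knapsack([(60, 1), (100, 2), (120, 3)], 5) picks items 2 and 3, for total value 220 and total weight 5.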

Multiplying n Numbers

Objective: find C(n), the number of ways to parenthesize the product x_1 · x_2 ··· x_n.

n = 2: (x1 x2)
n = 3: (x1 (x2 x3)), ((x1 x2) x3)
n = 4: (x1 (x2 (x3 x4))), (x1 ((x2 x3) x4)), ((x1 x2) (x3 x4)), ((x1 (x2 x3)) x4), (((x1 x2) x3) x4)

Multiplying n Numbers, Small n

n:    1  2  3  4  5   6   7
C(n): 1  1  2  5  14  42  132

Multiplying n Numbers. Recursive equation (where is the last multiplication?): C(n) = Σ_{k=1}^{n-1} C(k) · C(n-k). Catalan numbers: C(n) = (1/n) · C(2n-2, n-1). Asymptotic value: C(n) ≈ 4^n / n^{3/2} (up to a constant factor), and C(n)/C(n-1) → 4 as n → ∞.
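A quick Python check (not from the slides) that the recurrence and the Catalan closed form agree:

```python
from math import comb

# C(n): number of ways to parenthesize x_1 ... x_n; C(1) = C(2) = 1.
def ways(n):
    if n <= 2:
        return 1
    return sum(ways(k) * ways(n - k) for k in range(1, n))

for n in range(1, 10):
    assert ways(n) == comb(2 * n - 2, n - 1) // n
```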

Multiplying Two Matrices. Objective: compute C[p × r] = A[p × q] · B[q × r]. Validity condition: Columns(A) = Rows(B).

Multiplying Two Matrices

Matrix-Multiply(A[p × q], B[q × r])
  for i = 1 to p
    for j = 1 to r
      C[i, j] = 0
      for k = 1 to q
        C[i, j] = C[i, j] + A[i, k] · B[k, j]
  return(C[p × r])

Complexity: the number of multiplications equals the number of additions: pqr. Total number of operations: Θ(pqr).

Multiplying n Matrices. Objectives: compute efficiently A[p_0 × p_n] = A_1[p_0 × p_1] · A_2[p_1 × p_2] ··· A_n[p_{n-1} × p_n], and find m(A), the minimal number of scalar multiplications needed to compute A. Claim: the order matters! Problem: it is impossible to check all of the ≈ 4^n / n^{3/2} ways.

Example

A[10 × 50] = A_1[10 × 100] · A_2[100 × 5] · A_3[5 × 50]

A = ((A_1 A_2) A_3): m(A) = 5000 + m(A_12[10 × 5] · A_3[5 × 50]) = 5000 + 2500 = 7500.
A = (A_1 (A_2 A_3)): m(A) = 25000 + m(A_1[10 × 100] · A_23[100 × 50]) = 25000 + 50000 = 75000.

The Recursive Solution. Notation: A_{i,j}[p_{i-1} × p_j] = A_i[p_{i-1} × p_i] ··· A_j[p_{j-1} × p_j]; m[i, j] = minimal number of operations to compute A_{i,j}. Initial values: A_{i,i} = A_i and m[i, i] = 0. Final solution: A_{1,n} and m[1, n].

The Recursive Solution. Key observation: if the last multiplication is A_{i,j}[p_{i-1} × p_j] = A_{i,k}[p_{i-1} × p_k] · A_{k+1,j}[p_k × p_j], then A_{i,k} and A_{k+1,j} should themselves be computed optimally. The cost of this last multiplication is m[i, k] + m[k+1, j] + p_{i-1} p_k p_j. The recursive formula: m[i, j] = min over i <= k < j of { m[i, k] + m[k+1, j] + p_{i-1} p_k p_j }.

A Bottom-Up Computation. There are n + C(n, 2) = Θ(n²) sub-problems. The computation order is by the difference j - i. S[i, j] = the last multiplication in computing m[i, j]. Input: the matrix dimensions P = p_0, p_1, ..., p_n. Complexity: Θ(n) per sub-problem, Θ(n³) overall.

A Bottom-Up Computation

Matrix-Chain-Order(P)
  for i = 1 to n do m[i, i] = 0
  for d = 1 to n-1 do
    for i = 1 to n-d do
      j = i + d
      m[i, j] = ∞
      for k = i to j-1 do
        q = m[i, k] + m[k+1, j] + p_{i-1} p_k p_j
        if q < m[i, j] then
          m[i, j] = q
          S[i, j] = k
  return(m, S)

Example

The m-table for P = p_0, p_1, ..., p_6 = 30, 35, 15, 5, 10, 20, 25, by difference d = j - i (with m[i, i] = 0 for all i):

d = 1: m[1,2] = 15750   m[2,3] = 2625   m[3,4] = 750   m[4,5] = 1000   m[5,6] = 5000
d = 2: m[1,3] = 7875    m[2,4] = 4375   m[3,5] = 2500  m[4,6] = 3500
d = 3: m[1,4] = 9375    m[2,5] = 7125   m[3,6] = 5375
d = 4: m[1,5] = 11875   m[2,6] = 10500
d = 5: m[1,6] = 15125

Example

m[1, 3] = min{ m[1, 1] + m[2, 3] + p_0 p_1 p_3 = 0 + 2625 + 5250 = 7875,
               m[1, 2] + m[3, 3] + p_0 p_2 p_3 = 15750 + 0 + 2250 = 18000 } = 7875, so S[1, 3] = 1.

m[2, 5] = min{ m[2, 2] + m[3, 5] + p_1 p_2 p_5 = 0 + 2500 + 10500 = 13000,
               m[2, 3] + m[4, 5] + p_1 p_3 p_5 = 2625 + 1000 + 3500 = 7125,
               m[2, 4] + m[5, 5] + p_1 p_4 p_5 = 4375 + 0 + 7000 = 11375 } = 7125, so S[2, 5] = 3.
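The whole computation fits in a short Python sketch (not the lecture's code); running it on the example's dimensions reproduces the table above, and paren() reads the optimal parenthesization out of S:

```python
from math import inf

# Matrix-Chain-Order: p[i-1] x p[i] are the dimensions of A_i.
def matrix_chain_order(p):
    n = len(p) - 1
    m = [[0] * (n + 1) for _ in range(n + 1)]
    s = [[0] * (n + 1) for _ in range(n + 1)]
    for d in range(1, n):            # computation order: difference j - i
        for i in range(1, n - d + 1):
            j = i + d
            m[i][j] = inf
            for k in range(i, j):    # position of the last multiplication
                q = m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                if q < m[i][j]:
                    m[i][j], s[i][j] = q, k
    return m, s

def paren(s, i, j):
    """Optimal parenthesization of A_i ... A_j, read off the S table."""
    if i == j:
        return "A%d" % i
    k = s[i][j]
    return "(%s%s)" % (paren(s, i, k), paren(s, k + 1, j))
```

For P = 30, 35, 15, 5, 10, 20, 25 this gives m[1][6] = 15125 and paren(s, 1, 6) = ((A1(A2A3))((A4A5)A6)).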

How to Multiply?

How to compute A_{1,n}? Compute all the A_{i,j} needed to compute A_{1,n}.

Matrix-Chain-Multiply(i, j)
  if j > i then
    k = S[i, j]
    X = Matrix-Chain-Multiply(i, k)
    Y = Matrix-Chain-Multiply(k+1, j)
    return(Matrix-Multiply(X, Y))
  else return(A_i)

Complexity: m[i, j] operations.

Example

The S-table for P = 30, 35, 15, 5, 10, 20, 25, by difference d = j - i:

d = 1: S[1,2] = 1   S[2,3] = 2   S[3,4] = 3   S[4,5] = 4   S[5,6] = 5
d = 2: S[1,3] = 1   S[2,4] = 3   S[3,5] = 3   S[4,6] = 5
d = 3: S[1,4] = 3   S[2,5] = 3   S[3,6] = 3
d = 4: S[1,5] = 3   S[2,6] = 3
d = 5: S[1,6] = 3

Initially: (A_1 A_2 A_3 A_4 A_5 A_6)
S[1, 6] = 3: ((A_1 A_2 A_3) (A_4 A_5 A_6))
S[1, 3] = 1: ((A_1 (A_2 A_3)) (A_4 A_5 A_6))
S[4, 6] = 5: ((A_1 (A_2 A_3)) ((A_4 A_5) A_6))

The Recursive Top-Down Computation

Recursive-Matrix-Chain(i, j)
  if i < j then
    m = ∞
    for k = i to j-1 do
      q = Recursive-Matrix-Chain(i, k) + Recursive-Matrix-Chain(k+1, j) + p_{i-1} p_k p_j
      if q < m then m = q
    return(m)
  else return(0)

Initial call: Recursive-Matrix-Chain(1, n). Complexity: checks all possibilities, Ω(4^n / n^{3/2}).

Where to Save? The recursion tree for computing 1..4 recomputes the same sub-problems many times: for example, 3..4, 2..3, 1..2, and the single matrices 1..1, 2..2, 3..3, 4..4 each appear in several subtrees. Saving each sub-problem's answer the first time it is computed avoids all the recomputation.

A Top-Down Dynamic Programming

Initial values and first call:

Memoized-Matrix-Chain(P)
  for i = 1 to n do
    m[i, i] = 0
    for j = i+1 to n do
      m[i, j] = ∞
  return(LookUp-Chain(1, n))

A Top-Down Dynamic Programming

The recursive procedure:

LookUp-Chain(i, j)
  if m[i, j] = ∞ then
    for k = i to j-1 do
      q = LookUp-Chain(i, k) + LookUp-Chain(k+1, j) + p_{i-1} p_k p_j
      if q < m[i, j] then m[i, j] = q
  return(m[i, j])

A Top-Down Dynamic Programming, Complexity. The number of recursive calls is Θ(n²); each call takes Θ(n) time; all together, Θ(n³) time.
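The memoized version translates naturally to Python, with functools.lru_cache standing in for the explicit m[i, j] table (a sketch, not the lecture's code):

```python
from functools import lru_cache

# Top-down matrix-chain order with memoization.
def memoized_chain(p):
    @lru_cache(maxsize=None)  # the cache plays the role of the m[i, j] table
    def m(i, j):
        if i == j:
            return 0  # a single matrix needs no multiplications
        return min(m(i, k) + m(k + 1, j) + p[i - 1] * p[k] * p[j]
                   for k in range(i, j))
    return m(1, len(p) - 1)
```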