Special Topics on Algorithms Fall 2017 Dynamic Programming. Vangelis Markakis, Ioannis Milis and George Zois


1 Special Topics on Algorithms Fall 2017 Dynamic Programming Vangelis Markakis, Ioannis Milis and George Zois

2 Basic Algorithmic Techniques Content: Dynamic Programming: Introduction, Back to Fibonacci numbers, Weighted Interval Scheduling, Coin Changing, Maximum Sub-array, Edit Distance, All-Pairs Shortest Paths

3 Richard Bellman (1953) Dynamic Programming Etymology (at that time Bellman was studying multi-stage decision processes): Dynamic: relating to time. Programming: what to do and when to do it. Dynamic Programming: planning over time. Bellman chose an impressive name so that the work would be accepted by the Secretary of Defense (Wilson), who didn't like math research.

4 Dynamic Programming
Define sub-problems with the same structure as the original:
overlapping sub-problems;
optimal substructure: the optimal solution includes/can be constructed from the optimal solutions to its sub-problems.
Solve the (sub)problems starting from trivial ones: write a recursive formula for the optimal solution. This gives an order on the subproblems such that each one can be solved given the answers to smaller ones (appearing earlier in this order). Attention: we are not going to solve the problem via recursion.
Translate the recursive formula into an iterative algorithm; use a table to save intermediate results for later use.
Get the value of the optimal solution; then find the solution itself.


6 Algorithm design methods
DIVIDE AND CONQUER: non-overlapping sub-problems; recursion can be used; subproblem structure: a Tree.
GREEDY: a sub-problem defines the next one; a single (greedy) choice; subproblem structure: a Chain.
DYNAMIC PROGRAMMING: overlapping sub-problems; recursion is forbidden; many choices for a sub-problem; subproblem structure: a DAG.
All three rely on OPTIMAL SUB-STRUCTURE.

7 Fibonacci numbers Recall the Fibonacci sequence: F(0) = 0; F(1) = 1; F(n) = F(n-1) + F(n-2) for n ≥ 2.
Direct implementation of the recursion:
Algorithm fib1(n)
if n < 2 then return n
else return fib1(n-1) + fib1(n-2)

8 Fibonacci numbers Recursion tree for fib1 (figure): the calls form a tree in which the same subproblems are recomputed over and over; the figure numbers the calls (1st, 2nd, 3rd, ...) in the order they are made. Recursion? No, thanks! T(n) > 2^(n/2)

9 Fibonacci numbers Iterative version (non-recursive): use a table to store intermediate values.
Algorithm fib2(n) // remember already computed values
f[0] := 0; f[1] := 1
for i := 2 to n do f[i] := f[i-1] + f[i-2]
return f[n]
Note however: Time complexity: O(n), which is NOT polynomial in the input size |I| = O(log n) (the input is just the number n). Space complexity: also O(n) (but we could do it with 3 memory cells, i.e., O(1), as we saw before). Dynamic programming does not always yield polynomial-time algorithms.
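Both versions in Python, as a sketch: fib2 mirrors the slide's table-based algorithm, and the second variant keeps only the last two values, matching the O(1)-space remark above (the function names are illustrative):

import sys

def fib2(n):
    # Table-based iterative Fibonacci: O(n) time, O(n) space.
    if n < 2:
        return n
    f = [0] * (n + 1)
    f[1] = 1
    for i in range(2, n + 1):
        f[i] = f[i - 1] + f[i - 2]
    return f[n]

def fib2_constant_space(n):
    # Same recurrence, but only the last two values are kept: O(1) space.
    a, b = 0, 1          # F(0), F(1)
    for _ in range(n):
        a, b = b, a + b  # slide the window one step
    return a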

10 (Interval Scheduling) Suppose we want to schedule some jobs (e.g., courses) that need to use a common resource (e.g., a classroom). No 2 jobs can be scheduled at the same time.
Interval Scheduling
I: A set A of n jobs, each with a start time s_i and a finish time f_i
Q: Find a feasible schedule with the maximum possible number of jobs (maximum throughput)
Comment: There can be many optimal solutions, each scheduling different jobs; we do not care here which optimal solution we find. For an instance I, we let OPT denote the optimal schedule.

11 (Interval Scheduling) Example (figure): an instance whose optimal schedule contains OPT = 3 jobs.

12 Some Algorithm Ideas
ALG_1: In each round, choose the task that starts at the earliest feasible time. Bad instance (figure): ALG_1 = 1, OPT = 4.
ALG_2: In each round, choose the shortest feasible task among the remaining ones. Bad instance (figure): ALG_2 = 1, OPT = 2.
Other ideas?

13 A greedy algorithm
Rename the jobs so that f_1 ≤ f_2 ≤ f_3 ≤ ... ≤ f_n.
Choose first the job with the earliest finish time, f_1.
Remove those that overlap with job 1.
Continue in the same manner, choosing the earliest finish time among the remaining ones.
Clearly a polynomial-time algorithm. Proof of optimality by induction or by contradiction.
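A minimal Python sketch of this greedy rule (jobs given as (start, finish) pairs; the function name is illustrative):

def greedy_interval_scheduling(jobs):
    # jobs: list of (start, finish) pairs.
    # Repeatedly pick the compatible job with the earliest finish time.
    schedule = []
    last_finish = float("-inf")
    for s, f in sorted(jobs, key=lambda job: job[1]):  # sort by finish time
        if s >= last_finish:       # compatible with everything picked so far
            schedule.append((s, f))
            last_finish = f
    return schedule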

14 Correctness How do we prove that a greedy algorithm is optimal? Usually by contradiction or by induction. But the crucial property for a greedy algorithm to be optimal is that the problem satisfies the optimal substructure property.
Optimal substructure in general: a problem satisfies optimal substructure if an optimal solution to the problem contains within it optimal solutions to subproblems.
Optimal substructure for Activity Selection: an optimal solution that contains the task with the earliest finish time also contains the optimal solution for the tasks A' = { i ∈ A : s_i ≥ f_1 } (why?)

15 Correctness Theorem: Choosing in every round the job with the earliest finish time produces an optimal solution for the Activity Selection problem.


17 Weighted Interval Scheduling Weighted version: each job now has a weight, which may correspond to the value or profit we derive from executing it. Same constraints as before (no 2 jobs can be scheduled at the same time).
Weighted Interval Scheduling
I: A set A of n jobs, each with a start time s_i, a finish time f_i, and a weight w_i
Q: Find a feasible schedule with the maximum possible total weight

18 Weighted Interval Scheduling The greedy algorithm we saw before does not work any more: it may be better to select a single job of high weight than to maximize the number of non-overlapping jobs. In fact, no greedy approach is known for this problem.
Dynamic Programming approach: we need to check whether optimal substructure occurs. A problem satisfies the optimal substructure property if an optimal solution to the problem contains within it optimal solutions to subproblems.
Warmup: reorder the jobs so that f_1 ≤ f_2 ≤ ... ≤ f_n. Let O_j = the optimal schedule if we had only the jobs {1, 2, ..., j}, and let OPT(j) = the total weight of the optimal solution O_j, i.e., OPT(j) = Σ_{i ∈ O_j} w_i.

19 Weighted Interval Scheduling Idea: try to find a recursive formula. We need to relate OPT(j) to the optimal values for smaller instances.
Definition: let p(j) = the largest index i, with i < j, such that jobs i and j do not overlap; i.e., the jobs p(j)+1, p(j)+2, ..., j-1 all overlap with j.
Why is this useful? Consider an optimal solution O_j for {1, 2, ..., j}. Two observations:
If j is included in O_j, then O_j also contains an optimal solution for {1, 2, ..., p(j)}, since none of those jobs overlaps with j.
If j is not included in O_j, then OPT(j) = OPT(j-1).
Hence: OPT(j) = max{ w_j + OPT(p(j)), OPT(j-1) } for every j ≥ 1, with OPT(0) = 0.

20 Weighted Interval Scheduling This directly yields a recursive algorithm:
Algorithm WIS1(n) // suppose we have pre-computed the values p(j) for every j
if n = 0 return 0
else return max(w_n + WIS1(p(n)), WIS1(n-1))
To be more precise, the input to the algorithm consists of the vectors s = (s_1, s_2, ..., s_n), the start times; f = (f_1, f_2, ..., f_n), the finish times (assume they are sorted); and w = (w_1, w_2, ..., w_n), the weights. WIS1(j) means the execution of the algorithm on the first j jobs.
Complexity: the recursion tree grows exponentially, the same problem as with the recursive algorithm for Fibonacci.

21 Weighted Interval Scheduling Memoization: use an array to remember already computed values; loop through the array to compute the optimal values of all subproblems.
Algorithm WIS2(n)
Set M[0] = 0
Compute the values p(j) for every j
for j = 1 to n do M[j] = max(w_j + M[p(j)], M[j-1])
return M[n]
Complexity: each iteration needs only O(1) time, hence O(n) total.

22 Weighted Interval Scheduling The algorithm only computes the value of the optimal solution. What if we want to find the schedule as well? We could use a different array S, so that S[i] maintains the optimal solution for {1, ..., i}, but this causes some blowup. We can instead recover the solution from M (why?); see the sketch below.
Summarizing, Theorem: We can solve the Weighted Interval Scheduling problem in time O(n), or O(n log n) if the finish times of the jobs are not sorted.
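A self-contained Python sketch of WIS2, including the computation of p(j) by binary search (which, together with sorting, gives the O(n log n) bound) and the recovery of the schedule from M by walking backwards:

import bisect

def weighted_interval_scheduling(jobs):
    # jobs: list of (start, finish, weight). Returns (OPT value, chosen jobs).
    jobs = sorted(jobs, key=lambda j: j[1])          # sort by finish time
    starts = [s for s, f, w in jobs]
    finishes = [f for s, f, w in jobs]
    n = len(jobs)
    # p[j] = number of jobs that finish no later than job j starts,
    # found by binary search over the sorted finish times.
    p = [bisect.bisect_right(finishes, starts[j]) for j in range(n)]
    M = [0] * (n + 1)
    for j in range(1, n + 1):
        s, f, w = jobs[j - 1]
        M[j] = max(w + M[p[j - 1]], M[j - 1])
    # Recover the schedule: job j belongs to some optimal solution
    # exactly when taking it is at least as good as skipping it.
    chosen, j = [], n
    while j > 0:
        s, f, w = jobs[j - 1]
        if w + M[p[j - 1]] >= M[j - 1]:
            chosen.append(jobs[j - 1])
            j = p[j - 1]
        else:
            j -= 1
    return M[n], list(reversed(chosen))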

23 Coin Changing Input: a value E and a set of n coins of values V = {d_1, d_2, d_3, ..., d_n}. Question: find the minimum number of coins that make change for the value E.
Note that:
1. We study inputs where the minimum value of a coin is 1; then there is always a feasible (hence an optimal) solution. Otherwise there may be none: e.g., E = 13, V = {10, 6, 2} has no solution.
2. Optimal substructure: if an optimal solution for value E uses a coin of value d_i, then it must also contain an optimal solution for value E - d_i.
3. Greedy algorithm: always choose the coin with the maximum possible value. This does not always give the optimal solution: for E = 40, V = {25, 20, 10, 5, 1}, greedy gives 40 = 25 + 10 + 5, i.e., c = 3 coins, while 40 = 2 × 20 gives OPT = 2.
Other ideas? DYNAMIC PROGRAMMING.

24 Coin Changing Sub-problem? c[j]: the minimum number of coins for value j; c[E] is then the initial problem. If the optimal solution for value j uses a coin of value d_i, then c[j] = 1 + c[j - d_i]; obviously c[0] = 0. Since we do not know which coin an optimal solution uses, we try all n values in V = {d_1, d_2, d_3, ..., d_n}: c[j] = 1 + min over i with d_i ≤ j of c[j - d_i]. (We study inputs whose minimum coin value equals 1, so every c[j] is well defined.)

25 Coin Changing
Change(E, d[1..n])
c[0] = 0
for j = 1 to E:
    c[j] = ∞
    for i = 1 to n:
        if j ≥ d_i and 1 + c[j - d_i] < c[j]:
            c[j] = 1 + c[j - d_i]
return c
We avoid examining c[j] for j < 0 by ensuring that j ≥ d_i.
Complexity: O(nE). Not polynomial in |I|!

26 Coin Changing Example: E = 10, V = {1, 3, 4} (terms c[j - d_i] with a negative index are skipped)
c[0] = 0
c[1] = 1 + min { c[1-1] } = 1, denom[1] = 1
c[2] = 1 + min { c[2-1] } = 2, denom[2] = 1
c[3] = 1 + min { c[3-1], c[3-3] } = 1, denom[3] = 3
c[4] = 1 + min { c[4-1], c[4-3], c[4-4] } = 1, denom[4] = 4
c[5] = 1 + min { c[5-1], c[5-3], c[5-4] } = 2, denom[5] = 1
c[6] = 1 + min { c[6-1], c[6-3], c[6-4] } = 2, denom[6] = 3
c[7] = 1 + min { c[7-1], c[7-3], c[7-4] } = 2, denom[7] = 3
c[8] = 1 + min { c[8-1], c[8-3], c[8-4] } = 2, denom[8] = 4
c[9] = 1 + min { c[9-1], c[9-3], c[9-4] } = 3, denom[9] = 1
c[10] = 1 + min { c[10-1], c[10-3], c[10-4] } = 3, denom[10] = 3

27 Coin Changing Which coins? denom[1..E]: denom[j] is the value of the coin that gave the optimal solution for value j.
Change(E, d[1..n])
c[0] = 0
for j = 1 to E:
    c[j] = ∞
    for i = 1 to n:
        if j ≥ d_i and 1 + c[j - d_i] < c[j]:
            c[j] = 1 + c[j - d_i]
            denom[j] = d_i
j = E
while j > 0:
    write denom[j]
    j = j - denom[j]
Complexity: Θ(nE). Not polynomial in |I|!
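A runnable Python version of the table-filling and coin recovery above (math.inf plays the role of the ∞ initialization):

import math

def coin_change(E, denominations):
    # Minimum number of coins summing to E; assumes a coin of value 1 exists.
    # Returns (c[E], list of coins used).
    c = [0] + [math.inf] * E          # c[j] = min #coins for value j
    denom = [0] * (E + 1)             # denom[j] = coin used at value j
    for j in range(1, E + 1):
        for d in denominations:
            if d <= j and 1 + c[j - d] < c[j]:
                c[j] = 1 + c[j - d]
                denom[j] = d
    coins, j = [], E
    while j > 0:                      # walk back through the choices
        coins.append(denom[j])
        j -= denom[j]
    return c[E], coins

# Example from the slides: E = 10, V = {1, 3, 4} gives 3 coins, e.g. [3, 3, 4]
print(coin_change(10, [1, 3, 4]))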

28 Maximum Sub-array (MSA)
Maximum Sub-Array (MSA)
I: An array of numbers A[1..n]
Q: Find a sub-array A[p..q] with a maximum sum of its elements
Hence, we are looking for indices p, q such that the sub-array A[p..q] maximizes the quantity V(p, q) = Σ_{i=p}^{q} A[i].
Example: a profit history per year (table); we want the period of years with the greatest total profit: V(5, 8) = 9.

29 Maximum Sub-array (MSA): Dynamic Programming
We need to find a recurrence. Let E(i) be the value of the maximum-sum sub-array ending at position i.
Observation: the MSA value is one of the E(i)'s, that is, Vmax = max_i { E(i) }. The problem is then reduced to the calculation of the E(i)'s.
DP: find E(i) based on E(i-1) (exploit optimal substructure). E(i) has to contain A[i], so there are two cases: either it contains only A[i], giving E(i) = A[i]; or it extends the optimal solution E(i-1), giving E(i) = E(i-1) + A[i].
Hence: E(i) = max { E(i-1) + A[i], A[i] }, with E(1) = A[1].

30 Maximum Sub-array (MSA) Example (table): E(i) = max { E(i-1)+A[i], A[i] }, E(1) = A[1]. The table lists A[1..n], below it E[1..n] (the maximum sum of a sub-array ending at the i-th location), and Vmax, the maximum seen so far.

31 Maximum Sub-array (MSA) E(i) = max { E(i-1)+A[i], A[i] }, E(1) = A[1]
MSA(A[1..n])
E(1) = A[1]; Vmax = A[1]
for i = 2 to n do
    E(i) = E(i-1) + A(i)
    if E(i) < A(i) then E(i) = A(i)
    if E(i) > Vmax then Vmax = E(i)
return Vmax
Time complexity: O(n). Space complexity: O(n) for the array E, or O(1) if we keep only the previous value instead of the whole array.

32 Maximum Sub-array (MSA) What about the indices p, q of the optimal solution? Let P(i) be the start index of the sub-array achieving E(i).
MSA(A[1..n])
E(1) = A[1]; Vmax = A[1]; P(1) = 1
for i = 2 to n do
    E(i) = E(i-1) + A(i); P(i) = P(i-1)
    if E(i) < A(i) then
        E(i) = A(i); P(i) = i
    if E(i) > Vmax then
        Vmax = E(i); p = P(i); q = i
return (Vmax, p, q)
Time complexity: O(n). Space complexity: O(n) for the arrays E, P, or O(1) if we do not store the full arrays.
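The same algorithm in Python (a sketch; 0-indexed, so the returned p and q are list indices):

def max_subarray(A):
    # Kadane's algorithm: returns (best sum, p, q) with A[p..q] optimal.
    best, e = A[0], A[0]      # Vmax and E(i), the best sum ending at i
    p = q = start = 0         # start plays the role of P(i)
    for i in range(1, len(A)):
        if e + A[i] < A[i]:   # better to start fresh at position i
            e, start = A[i], i
        else:                 # extend the previous best suffix
            e += A[i]
        if e > best:
            best, p, q = e, start, i
    return best, p, q

# Illustrative example: best sub-array is [4, -1, 2, 1] with sum 6
print(max_subarray([-2, 1, -3, 4, -1, 2, 1, -5, 4]))  # (6, 3, 6)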

33 Edit distance Edit distance between two strings = the minimum number of edits (insertions, deletions, substitutions) needed to transform the first string into the second.
Alignment: writing the strings one above the other, possibly using spaces (-); # of edits = # of columns in which the characters of the strings differ.
Edit distance: the minimum cost over all possible alignments.

34 Edit distance Example (figure): two alignments of the same pair of strings, one with # of edits = 3 (insert U, insert S, substitute O with N) and one with # of edits = 5 (substitute S with U, delete W, delete O, delete W, insert N). Too many possible alignments between two strings! How can we find the best alignment?

35 Edit distance Dynamic Programming
Strings X[1..m] and Y[1..n]; X[1..i], i ≤ m-1, is a prefix of X, and Y[1..j], j ≤ n-1, is a prefix of Y.
Subproblem: E[i, j] = minimum edit distance between X[1..i] and Y[1..j].
Optimal substructure: consider the rightmost column of an optimal alignment. In each of the cases (i) to (iv) below, the edit distance of the involved prefixes must be minimum:
(i) Match: E(i, j) = E(i-1, j-1)
(ii) Substitute: E(i, j) = E(i-1, j-1) + 1
(iii) Insert: E(i, j) = E(i-1, j) + 1
(iv) Delete: E(i, j) = E(i, j-1) + 1
Edit distance between X and Y = E[m, n]. First we'll find the edit distance, then the edits themselves.

36 Edit distance Three choices for the rightmost column of an alignment:
1 + E(i-1, j)
1 + E(i, j-1)
diff(i, j) + E(i-1, j-1), where diff(i, j) = 1 if x[i] ≠ y[j] and diff(i, j) = 0 if x[i] = y[j]
Hence E(i, j) = min { 1 + E(i-1, j), 1 + E(i, j-1), diff(i, j) + E(i-1, j-1) }.
Base cases: E(0, j) = j (alignment of the empty string and Y[1..j]); E(i, 0) = i (alignment of X[1..i] and the empty string).

37 Edit distance The subproblem E(7, 5) (figure): the rightmost column is either N over N (diff = 0), N over -, or - over N, so E(7, 5) = min { 1 + E(6, 5), 1 + E(7, 4), 0 + E(6, 4) }.

38 Edit distance Fill a two-dimensional array by solving the subproblems row by row or column by column; each cell is computed from its three neighbors (match or substitute: the diagonal neighbor; delete and insert: the other two). Example (table): the filled array gives an alignment with 6 edits.

39 Edit distance Complexity? O(mn): we fill an (m+1) × (n+1) table with O(1) work per cell. Extend the algorithm to find the actual edits (trace the choices back from E(m, n)); see the sketch below.
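A compact Python sketch of the table-filling together with the traceback that recovers the actual edits; the SNOWY/SUNNY test strings are an assumption based on the edit operations listed on slide 34:

def edit_distance(x, y):
    # Returns (E[m][n], list of edits transforming x into y).
    m, n = len(x), len(y)
    E = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        E[i][0] = i                      # base case: delete all of x[1..i]
    for j in range(n + 1):
        E[0][j] = j                      # base case: insert all of y[1..j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            diff = 0 if x[i - 1] == y[j - 1] else 1
            E[i][j] = min(1 + E[i - 1][j],         # delete x[i]
                          1 + E[i][j - 1],         # insert y[j]
                          diff + E[i - 1][j - 1])  # match / substitute
    # Trace back from E[m][n] to recover one optimal sequence of edits.
    edits, i, j = [], m, n
    while i > 0 or j > 0:
        diff = 0 if i > 0 and j > 0 and x[i - 1] == y[j - 1] else 1
        if i > 0 and j > 0 and E[i][j] == diff + E[i - 1][j - 1]:
            if diff:
                edits.append(f"substitute {x[i-1]} with {y[j-1]}")
            i, j = i - 1, j - 1
        elif i > 0 and E[i][j] == 1 + E[i - 1][j]:
            edits.append(f"delete {x[i-1]}")
            i -= 1
        else:
            edits.append(f"insert {y[j-1]}")
            j -= 1
    return E[m][n], list(reversed(edits))

print(edit_distance("SNOWY", "SUNNY"))  # distance 3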

40 Shortest paths in DAGs
DAG G = (V, E); topological sorting of G. There are no cycles at all in such graphs, so in particular no negative-weight cycles. We apply a dynamic programming approach after we find a topological sorting of the vertices.
Optimal substructure and recursive formula: d(u) = min_{v ∈ Γ⁻(u)} { w(v, u) + d(v) }, with d(s) = 0, where Γ⁻(u) denotes the in-neighbors of u.

41 Shortest paths in DAGs d(u) = min_{v ∈ Γ⁻(u)} { w(v, u) + d(v) }, d(s) = 0
for each u ∈ V do { d(u) := ∞; pred(u) := null }
d(s) := 0
Find a topological sorting of G
for each u ∈ V - {s} in topological order do
    for each v ∈ Γ⁻(u) do
        if w(v, u) + d(v) < d(u) then { d(u) := w(v, u) + d(v); pred(u) := v }
Complexity? O(n + m). Negative weights? No problem!
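A Python sketch of this scheme, assuming the graph is given as lists of incoming edges (in_edges[u] = list of (v, w) pairs for edges v -> u); the DFS-based topological sort is one standard choice:

import math

def dag_shortest_paths(n, in_edges, s):
    # n vertices 0..n-1; in_edges[u] = [(v, w), ...] for edges v -> u.
    # Returns (d, pred) for shortest paths from s.
    out = [[] for _ in range(n)]          # outgoing adjacency for the DFS
    for u in range(n):
        for v, w in in_edges[u]:
            out[v].append(u)
    order, seen = [], [False] * n
    def dfs(u):
        seen[u] = True
        for x in out[u]:
            if not seen[x]:
                dfs(x)
        order.append(u)                   # post-order; reversed = topological
    for u in range(n):
        if not seen[u]:
            dfs(u)
    order.reverse()
    d = [math.inf] * n
    pred = [None] * n
    d[s] = 0
    for u in order:                       # relax incoming edges in topological order
        for v, w in in_edges[u]:
            if d[v] + w < d[u]:
                d[u] = d[v] + w
                pred[u] = v
    return d, pred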

42 All pairs shortest paths
All-pairs shortest paths
I: A weighted graph G = (V, E, w)
Q: A shortest path for every pair of nodes
Run Bellman-Ford n times: O(n²m). Can we do better?
First, we extend the weight function to all pairs:
w(u, v) = 0 if u = v; w(u, v) = w(e) if u ≠ v and e = (u, v) ∈ E; w(u, v) = ∞ if u ≠ v and (u, v) ∉ E

43 All pairs shortest paths We will again use a dynamic programming approach, but a different one from before. Suppose we name the vertices 1, 2, ..., n.
d(u, v, k) := length of a shortest path from node u to node v using only nodes {1, 2, ..., k} as intermediates. (Figure: such a path either avoids k, with length d(u, v, k-1), or passes through k once, splitting into pieces of lengths d(u, k, k-1) and d(k, v, k-1).)
d(u, v, 0) = w(u, v)
d(u, v, k) = min{ d(u, v, k-1), d(u, k, k-1) + d(k, v, k-1) }

44 All pairs shortest paths d(u, v, 0) = w(u, v); d(u, v, k) = min{ d(u, v, k-1), d(u, k, k-1) + d(k, v, k-1) }
We will gradually find d(u, v, 0), d(u, v, 1), d(u, v, 2), ..., d(u, v, n) for all u, v.
Idea of the algorithm implementation: use d(u, v) as the current estimate, initially set to w(u, v); think of it as filling an n × n array for each k.
Iteration 1: update d(u, v) to equal the shortest path length using node 1 as intermediate. ... Iteration k: update d(u, v) to equal the shortest u-v path length with {1, 2, ..., k} as intermediates. Continue until iteration n.
Update rule: d(u, v) = min{ d(u, v), d(u, k) + d(k, v) }

45 All pairs shortest paths
Algorithm Floyd-Warshall(G)
for u := 1 to n do
    for v := 1 to n do { d(u, v) := w(u, v); pred(u, v) := null }
for k := 1 to n do
    for u := 1 to n do
        for v := 1 to n do
            if d(u, k) + d(k, v) < d(u, v) then
                { d(u, v) := d(u, k) + d(k, v); pred(u, v) := k }
pred(u, v): to be used for extracting the u-v shortest path
Complexity: O(n³)

46 All pairs shortest paths (figure)

47 All pairs shortest paths Extract the u-v shortest path:
Path(u, v)
{ if pred(u, v) = null then output (u, v)
  else { Path(u, pred(u, v)); Path(pred(u, v), v) } }
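A Python sketch putting the two pieces together: the Floyd-Warshall tables and the recursive path extraction (the dict-based edge representation and function names are illustrative):

import math

def floyd_warshall(n, edges):
    # n vertices 0..n-1; edges = {(u, v): weight}. Returns the d and pred tables.
    d = [[0 if u == v else edges.get((u, v), math.inf) for v in range(n)]
         for u in range(n)]
    pred = [[None] * n for _ in range(n)]
    for k in range(n):
        for u in range(n):
            for v in range(n):
                if d[u][k] + d[k][v] < d[u][v]:
                    d[u][v] = d[u][k] + d[k][v]
                    pred[u][v] = k
    return d, pred

def path(u, v, pred, out):
    # Append the edges of a shortest u-v path to out, recursing through pred.
    k = pred[u][v]
    if k is None:
        out.append((u, v))             # no intermediate vertex: direct hop
    else:
        path(u, k, pred, out)
        path(k, v, pred, out)

# Small usage example (illustrative graph):
d, pred = floyd_warshall(3, {(0, 1): 4, (1, 2): 1, (0, 2): 7})
out = []
path(0, 2, pred, out)                  # out = [(0, 1), (1, 2)]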
