Design and Analysis of Algorithms 演算法設計與分析. Lecture 7 April 6, 2016 洪國寶


1 Design and Analysis of Algorithms 演算法設計與分析 Lecture 7 April 6, 2016 洪國寶 1

2 Course information (5/5) Grading (Tentative) Homework 25% (You may collaborate when solving the homework; however, when writing up the solutions you must do so on your own. No typed or printed assignments.) Midterm exam 25% (Open book and notes) April 20, 2016 Final exam 25% (Open book and notes) Class participation 25% 2

3 Outline of the course (1/1) Introduction (1-4) Midterm exam Data structures (10-14) Apr. 20, 2016 Dynamic programming (15) Greedy methods (16) Amortized analysis (17) Advanced data structures (6, 19-21) Graph algorithms (22-25) NP-completeness (34-35) Other topics (5, 31) 3

4 Outline Review Dynamic programming (Cont.) Greedy method Activity-selection algorithm Basic elements of greedy approach Examples where greedy approach does not work Optimal merge pattern 4

5 Review: Development of A Dynamic-Programming Algorithm 1. Characterize the structure of an optimal solution 2. Recursively define the value of an optimal solution 3. Compute the value of an optimal solution in a bottom-up fashion 4. Construct an optimal solution from computed information 5

6 Matrix chain multiplication Given a sequence (chain) <A_1, A_2, …, A_n> of n matrices to be multiplied, compute the product A_1 A_2 … A_n in a way that minimizes the number of scalar multiplications. Every way of multiplying the matrices corresponds to a parenthesization. It is impractical to check all possible parenthesizations, since their number P(n) satisfies
P(n) = 1 if n = 1
P(n) = Σ_{k=1}^{n-1} P(k) P(n-k) if n ≥ 2
6
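As an illustration (not part of the original slides), the recurrence for P(n) can be evaluated directly; P(n) is the (n−1)-st Catalan number, which grows exponentially. The function name below is my own:

```python
def num_parenthesizations(n):
    """P(n): number of ways to fully parenthesize a chain of n matrices.
    P(1) = 1; P(n) = sum over k of P(k) * P(n - k) for n >= 2."""
    P = [0] * (n + 1)
    P[1] = 1
    for m in range(2, n + 1):
        # try every position k of the outermost split
        P[m] = sum(P[k] * P[m - k] for k in range(1, m))
    return P[n]
```

For example, P(4) = 5 and P(10) = 4862; the growth is Ω(4^n / n^{3/2}), so brute-force enumeration is hopeless.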

7 Review: Elements of Dynamic Programming Optimal substructure: an optimal solution to the problem contains within it optimal solutions to subproblems (for DP to be applicable) Overlapping subproblems: the space of subproblems must be small (for algorithm to be efficient) 7

8 Common Pattern in Discovering Optimal Substructure Show that a solution to the problem consists of making a choice. Making the choice leaves one or more subproblems to be solved. Suppose that for a given problem, the choice that leads to an optimal solution is available. Given this optimal choice, determine which subproblems ensue and how to best characterize the resulting space of subproblems. Show that the solutions to the subproblems used within the optimal solution must themselves be optimal, using a cut-and-paste argument and proof by contradiction. 8

9 Review: Memoization A variation of dynamic programming that often offers the efficiency of the usual dynamic-programming approach while maintaining a top-down strategy Memoize the natural, but inefficient, recursive algorithm Maintain a table with subproblem solutions, but the control structure for filling in the table is more like the recursive algorithm 9

10 Review: LCS Algorithm If |X| = m and |Y| = n, then there are 2^m subsequences of X; we must compare each with Y (n comparisons) So the running time of the brute-force algorithm is O(n 2^m) Notice that the LCS problem has optimal substructure: solutions of subproblems are parts of the final solution. Subproblems: find the LCS of pairs of prefixes of X and Y 10

11 Review: LCS Algorithm Running Time The LCS algorithm calculates the value of each entry of the array c[m,n] The running time is O(mn) since each c[i,j] is calculated in constant time, and there are mn elements in the array 11
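A minimal Python sketch of the O(mn) table-filling just described (the function name is illustrative):

```python
def lcs_length(X, Y):
    """c[i][j] = length of an LCS of the prefixes X[:i] and Y[:j]."""
    m, n = len(X), len(Y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if X[i - 1] == Y[j - 1]:
                # matching last characters extend the LCS of the shorter prefixes
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                # otherwise drop one character from X or from Y
                c[i][j] = max(c[i - 1][j], c[i][j - 1])
    return c[m][n]
```

Each entry is filled in O(1) time, so the whole table costs O(mn), as the slide states.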

12 Review: Optimal Polygon Triangulation Input: a convex polygon P=(v 0, v 1,, v n-1 ) Output: an optimal triangulation 12

13 Optimal Polygon Triangulation vs. Matrix-chain multiplication
Optimal polygon triangulation:
t[i,j] = 0 if i = j
t[i,j] = min_{i ≤ k < j} { t[i,k] + t[k+1,j] + w(v_{i-1} v_k v_j) } if i < j
Matrix-chain multiplication:
m[i,j] = 0 if i = j
m[i,j] = min_{i ≤ k < j} { m[i,k] + m[k+1,j] + p_{i-1} p_k p_j } if i < j
13

14 Dynamic Programming (TSP) Traveling salesperson problem (revisit)
- Optimal substructure? (subproblems)
g(i,S) = min_{j ∈ S} { c_{ij} + g(j, S − {j}) }
= the length of a shortest path starting at vertex i, going through all vertices in S, and terminating at vertex 1
- Overlapping subproblems? The number of subproblems g(i,S) is N = (n−1) 2^{n−2}
14
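A sketch of this recurrence in Python (the Held-Karp scheme), using vertex 0 as the start/end vertex instead of vertex 1 and memoizing g with `lru_cache`; the function name and cost-matrix format are assumptions for illustration:

```python
from functools import lru_cache

def tsp(cost):
    """Held-Karp DP: cost is an n x n matrix; the tour starts and ends at vertex 0."""
    n = len(cost)

    @lru_cache(maxsize=None)
    def g(i, S):
        # shortest path from i through all vertices in S, ending back at 0
        if not S:
            return cost[i][0]
        return min(cost[i][j] + g(j, S - frozenset([j])) for j in S)

    return g(0, frozenset(range(1, n)))
```

There are (n−1)·2^{n−2}-ish distinct g(i,S) values, so memoization gives O(n^2 2^n) time instead of the O(n!) of brute force.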

15 Outline Review Dynamic programming (Cont.) Greedy method Activity-selection algorithm Basic elements of greedy approach Examples where greedy approach does not work Optimal merge pattern 15

16 Algorithm design techniques So far, we ve looked at the following design techniques: - induction (incremental approach) - divide and conquer - augmenting data structures - dynamic programming Coming up: greedy method 16

17 Dynamic Programming VS. Greedy Algorithms Dynamic programming uses optimal substructure in a bottom-up fashion: first find optimal solutions to subproblems and, having solved the subproblems, find an optimal solution to the problem. Greedy algorithms use optimal substructure in a top-down fashion: first make a choice (the choice that looks best at the time) and then solve the resulting subproblem. 17

18 Greedy Algorithms A greedy algorithm always makes the choice that looks best at the moment The hope: a locally optimal choice will lead to a globally optimal solution For some problems, it works Dynamic programming can be overkill; greedy algorithms tend to be easier to code 18

19 Example: Activity-Selection Formally: Given a set S of n activities, s_i = start time of activity i, f_i = finish time of activity i. Find a max-size subset A of compatible activities: for all i, j ∈ A, [s_i, f_i) and [s_j, f_j) do not overlap

20 Activity Selection: Optimal Substructure Let k be the minimum activity in A (i.e., the one with the earliest finish time). Then A − {k} is an optimal solution to S' = {i ∈ S : s_i ≥ f_k} In words: once activity k is selected, the problem reduces to finding an optimal solution for activity selection over the activities in S compatible with k Proof: if we could find an optimal solution B to S' with |B| > |A − {k}|, then B ∪ {k} would be compatible and |B ∪ {k}| > |A|, contradicting the optimality of A 20

21 Activity Selection: Greedy Choice Property Dynamic programming? Memoize? The activity selection problem also exhibits the greedy choice property: a locally optimal choice leads to a globally optimal solution Thm 16.1: if S is an activity selection problem sorted by finish time, then there is an optimal solution A ⊆ S such that activity 1 ∈ A Sketch of proof: if an optimal solution B does not contain activity 1, we can always replace the first activity in B with activity 1 (Why?). Same number of activities, thus optimal. 21

22 Activity Selection: A Greedy Algorithm Intuition is simple: always pick the activity with the earliest finish time available at the time
GAS(S)
1 if S = NIL then return NIL
2 else return {k} ∪ GAS(S'), where k is the activity in S with smallest f, and S' = {i ∈ S : s_i ≥ f_k}
22

23 Activity Selection: A Greedy Algorithm
GAS(S)
1 if S = NIL then return NIL
2 else return {k} ∪ GAS(S'), where k is the activity in S with smallest f, and S' = {i ∈ S : s_i ≥ f_k}
Proof of correctness: use blackboard 23

24 Activity Selection: A LISP program
GAS(S)
1 if S = NIL then return NIL
2 else return {k} ∪ GAS(S'), where k is the activity in S with smallest f, and S' = {i ∈ S : s_i ≥ f_k}

(defun gas (l)
  (print l)
  (cond ((null l) nil)
        (T (cons (car l)
                 (gas (filter (nth 1 (car l)) (cdr l)))))))

(defun filter (s l)
  (cond ((null l) nil)
        ((> s (nth 0 (car l))) (filter s (cdr l)))
        (T (cons (car l) (filter s (cdr l))))))
24

25 Activity Selection: A LISP program (cont.)
[3]> (defvar *activity* '((1 4) (3 5) (0 6) (5 7) (3 8) (5 9) (6 10) (8 11) (8 12) (2 13) (12 14)))
*activity*
[4]> (gas *activity*)
((1 4) (3 5) (0 6) (5 7) (3 8) (5 9) (6 10) (8 11) (8 12) (2 13) (12 14))
((5 7) (5 9) (6 10) (8 11) (8 12) (12 14))
((8 11) (8 12) (12 14))
((12 14))
NIL
((1 4) (5 7) (8 11) (12 14))
[5]> (dribble)
The lists above are produced by (print l): the remaining set S' at each recursive call; the last line is the selected set. 25

26 Activity Selection: A Greedy Algorithm If we sort the activities by finish time, then the algorithm is simple (iterative instead of recursive): Sort the activities by finish time. Schedule the first activity. Then schedule the next activity in the sorted list that starts after the previous activity finishes. Repeat until there are no more activities. Note: we don't have to construct S' explicitly. Why? 26
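The iterative algorithm above can be sketched in Python (names are illustrative), reproducing the answer of the LISP run on the previous slides:

```python
def greedy_activity_selector(activities):
    """activities: list of (start, finish) pairs.
    Returns a maximum-size set of mutually compatible activities."""
    A = []
    last_finish = float('-inf')          # finish time of the last chosen activity
    for s, f in sorted(activities, key=lambda a: a[1]):  # sort by finish time
        if s >= last_finish:             # compatible with everything chosen so far
            A.append((s, f))
            last_finish = f
    return A
```

No S' is ever materialized: incompatible activities are simply skipped as the sorted list is scanned, so after the O(n log n) sort the scan is O(n).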

27 Greedy-Activity-Selector Assume (w.l.o.g.) that f_1 ≤ f_2 ≤ … ≤ f_n Time complexity: O(n) Exercise: Proof of correctness by loop invariant. 27


29 Activity Selection 29

30 Other greedy choices Greedy-Activity-Selector always picks the activity with the earliest finish time available at the time (smallest f) Some other greedy choices: Largest f Largest/smallest s Largest/smallest (f − s) Fewest overlaps Exercise: Which of these criteria result in optimal solutions? 30

31 A Variation of the Problem Instead of maximizing the number of activities we want to schedule, we want to maximize the total time the resource is in use. None of the obvious greedy choices would work: Choose the activity that starts earliest/latest Choose the activity that finishes earliest/latest Choose the longest activity Exercise: Design an efficient algorithm for this variation of activity selection problem. 31

32 Outline Review Dynamic programming (Cont.) Greedy method Activity-selection algorithm Basic elements of greedy approach Examples where greedy approach does not work Optimal merge pattern 32

33 Elements of Greedy Strategy Greedy choice property: An optimal solution can be obtained by making choices that seem best at the time, without considering their implications for solutions to subproblems. Optimal substructure: An optimal solution can be obtained by augmenting the partial solution constructed so far with an optimal solution of the remaining subproblem. 33

34 Steps in designing greedy algorithms 1. Cast the optimization problem as one in which we make a choice and are left with one subproblem to solve. 2. Prove the greedy choice property. 3. Demonstrate optimal substructure. 34

35 Outline Review Dynamic programming (Cont.) Longest Common Subsequence (LCS) Greedy method Activity-selection algorithm Basic elements of greedy approach Examples where greedy approach does not work Optimal merge pattern 35

36 Examples where greedy approach does not work Traveling salesman problem: nearest neighbor, closest pair Matrix-chain multiplication: multiply the two matrices with lowest cost first Knapsack problem: largest values, largest value per unit weight 36

37 Traveling salesman problem Correctness is not obvious (2/7) (nearest neighbor) 37

38 Traveling salesman problem Correctness is not obvious (3/7) 38

39 Matrix-chain multiplication Greedy choice: multiply the two matrices with lowest cost first Example 1: <A 1, A 2, A 3 > (10*100, 100*5, 5*50) Example 2: <A 1, A 2, A 3, A 4, A 5, A 6 > (30*35, 35*15, 15*5, 5*10, 10*20, 20*25) 39

40 Matrix-chain multiplication Example 1: <A_1, A_2, A_3> (10×100, 100×5, 5×50)
((A_1 A_2) A_3): 10·100·5 + 10·5·50 = 5000 + 2500 = 7500
(A_1 (A_2 A_3)): 100·5·50 + 10·100·50 = 25000 + 50000 = 75000
40
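For comparison with the greedy choice, the full dynamic-programming solution can be sketched as follows (an illustrative transcription of the m[i,j] recurrence; the function name is my own). It reproduces the 7500 of Example 1:

```python
def matrix_chain_order(p):
    """Minimum scalar multiplications to compute A_1 ... A_n,
    where matrix A_i has dimensions p[i-1] x p[i]."""
    n = len(p) - 1
    m = [[0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):              # chain length
        for i in range(1, n - length + 2):
            j = i + length - 1
            # try every split point k between i and j
            m[i][j] = min(m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                          for k in range(i, j))
    return m[1][n]
```

On Example 2's dimensions (30, 35, 15, 5, 10, 20, 25) this yields 15125, achieved by the parenthesization shown on the next slide, whereas the greedy "cheapest multiplication first" rule does not.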

41 Matrix-chain multiplication Example 2: <A_1, A_2, A_3, A_4, A_5, A_6> (30×35, 35×15, 15×5, 5×10, 10×20, 20×25)
Greedy first multiplies A_3 A_4: 15·5·10 = 750
Optimal parenthesization (Figure 15.3): ((A_1 (A_2 A_3)) ((A_4 A_5) A_6))
41


43 Knapsack problem Given some items, pack the knapsack to get the maximum total value. Each item has some weight and some value. Total weight that we can carry is no more than some fixed number W. So we must consider weights of items as well as their value. Item # Weight Value

44 Knapsack Problem A thief breaks into a jewelry store carrying a knapsack. Given n items S = {item 1,,item n }, each having a weight w i and value v i, which items should the thief put into the knapsack with capacity W to obtain the maximum value? 44

45 Knapsack problem There are two versions of the problem: (1) 0-1 knapsack problem and (2) Fractional knapsack problem (1) Items are indivisible; you either take an item or not. Solved with dynamic programming (2) Items are divisible: you can take any fraction of an item. Solved with a greedy algorithm. 45

46 0-1 Knapsack Problem This problem requires a subset A of S to be determined such that Σ_{item_i ∈ A} v_i is maximized subject to Σ_{item_i ∈ A} w_i ≤ W. The naïve approach is to generate all possible subsets of S and find the maximum value, requiring 2^n time. 46

47 Does Greedy Approach Work? Strategy 1: Steal the items with the largest values. E.g. w=[25, 10, 10], v=[$10, $9, $9], W=30. Value is 10, although the optimal value is 18. w 1 =25 w 2 =10 w 3 =10 v 1 = $10 v 2 = $9 v 3 = $9 47

48 Does Greedy Approach Work? Strategy 2: Steal the items with the largest value per unit weight. E.g. w=[5, 20,10], v=[$50, $140, $60], W=30. Value is 190, although optimal would be 200. w 1 =5 w 2 =20 w 3 =10 v 1 =$50 v 2 =$140 v 3 =$60 48

49 NP-hard Problems (Knapsack, Traveling Salesman,...) The two approaches above cannot yield the optimal result for 0-1 knapsack. This problem is NP-hard even when all items are the same kind, that is, item_i is described by w_i only (v_i = w_i). Observation: The greedy approach to the fractional knapsack problem yields the optimal solution. E.g. (see 2nd example) 50 + 140 + (5/10)·60 = 220, where 5 is the remaining capacity of the knapsack 49

50 Dynamic Programming Approach for the Knapsack Problem We can find an optimal solution for the 0-1 knapsack problem by DP, but not in polynomial time. Let A be an optimal subset with items from {item_1, …, item_i} only. There are two cases: 1) A contains item_i. Then the total value of A is equal to v_i plus the optimal value obtained from the first i−1 items, where the total weight cannot exceed W − w_i 2) A does not contain item_i. Then the total value of A is equal to that of the optimal subset chosen from the first i−1 items (with total weight not exceeding W). Q: What are the subproblems? 50

51 Dynamic Programming Approach for the Knapsack Problem
A[i][w] = max(A[i-1][w], v_i + A[i-1][w-w_i]) if w ≥ w_i
A[i][w] = A[i-1][w] if w < w_i
The maximum value is A[n][W] Running time is O(nW), not polynomial in n Q: WHY? 51
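A direct transcription of this recurrence into Python (a sketch; the function name is illustrative), which also confirms the optimal values 18 and 200 from the earlier counterexamples:

```python
def knapsack_01(weights, values, W):
    """A[i][w] = best value achievable with the first i items and capacity w."""
    n = len(weights)
    A = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        wi, vi = weights[i - 1], values[i - 1]
        for w in range(W + 1):
            if w >= wi:
                # either skip item i or take it (reducing remaining capacity by wi)
                A[i][w] = max(A[i - 1][w], vi + A[i - 1][w - wi])
            else:
                A[i][w] = A[i - 1][w]
    return A[n][W]
```

The table has (n+1)(W+1) entries, each filled in O(1), so the running time is O(nW): pseudo-polynomial, since W is exponential in its own bit length.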

52 Fractional knapsack problem Greedy approach: taking items in order of greatest value per pound Optimal for the fractional version (why?), but not for the 0-1 version 52
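A sketch of the greedy strategy for the fractional version (illustrative names), reproducing the value 220 from the earlier example:

```python
def fractional_knapsack(weights, values, W):
    """Greedy by value per unit weight; items are divisible."""
    items = sorted(zip(weights, values),
                   key=lambda it: it[1] / it[0], reverse=True)
    total = 0.0
    for w, v in items:
        if W <= 0:
            break
        take = min(w, W)          # take as much of this item as still fits
        total += v * take / w
        W -= take
    return total
```

Intuitively the greedy choice is safe here because any optimal solution that uses less of the best-ratio item can be improved by swapping in more of it, which is exactly the argument that fails for the indivisible 0-1 version.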

53 Outline Review Dynamic programming (Cont.) Longest Common Subsequence (LCS) Greedy method Activity-selection algorithm Basic elements of greedy approach Examples where greedy approach does not work Optimal merge pattern 53

54 Optimal merge pattern 54

55 Optimal merge pattern 55

56 Huffman codes Compact Text Encoding Code that can be decoded Huffman encoding 56

57 Compact Text Encoding The goal is to develop a code that represents a given text as compactly as possible. A standard encoding is ASCII, which represents every character using 7 bits. Encoding the 19-character phrase "An English sentence" character by character thus requires 19 × 7 = 133 bits ≈ 17 bytes 57

58 Compact Text Encoding (Cont.) Of course, this is wasteful because we can encode each of the 12 distinct characters in 4 bits: space = 0000 A = 0001 E = 0010 c = 0011 e = 0100 g = 0101 h = 0110 i = 0111 l = 1000 n = 1001 s = 1010 t = 1011 Then we encode the phrase as 0001 (A) 1001 (n) 0000 ( ) 0010 (E) 1001 (n) 0101 (g) 1000 (l) 0111 (i) 1010 (s) 0110 (h) 0000 ( ) 1010 (s) 0100 (e) 1001 (n) 1011 (t) 0100 (e) 1001 (n) 0011 (c) 0100 (e) This requires 19 × 4 = 76 bits ≈ 10 bytes 58

59 Compact Text Encoding (Cont.) An even better code is given by the following encoding: space = 000 A = 0010 E = 0011 s = 010 c = 0110 g = 0111 h = 1000 i = 1001 l = 1010 t = 1011 e = 110 n = 111 Then we encode the phrase as 0010 (A) 111 (n) 000 ( ) 0011 (E) 111 (n) 0111 (g) 1010 (l) 1001 (i) 010 (s) 1000 (h) 000 ( ) 010 (s) 110 (e) 111 (n) 1011 (t) 110 (e) 111 (n) 0110 (c) 110 (e) This requires 65 bits ≈ 9 bytes 59

60 Code that can be decoded Fixed-length codes: Every character is encoded using the same number of bits. To determine the boundaries between characters, we form groups of w bits, where w is the length of a character. Examples: ASCII Our first improved code Prefix codes: No character is the prefix of another character. Examples: Fixed-length codes Huffman codes 60

61 Why Prefix Codes? Consider a code that is not a prefix code: a = 01 m = 10 n = 111 o = 0 r = 11 s = 1 t = 0011 Now you send a fan letter to your favorite movie star. One of the sentences is "You are a star." You encode star as 1 0011 01 11 = 100110111. Your idol receives the letter and decodes the text using your coding table: 100110111 = 10 0 11 0 111 = moron Prefix codes are unambiguous. (See next slide) 61
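A small brute-force check (not in the slides; names are my own) that enumerates every possible parse of a bit string confirms the ambiguity of this non-prefix code:

```python
def decodings(bits, code):
    """All ways to parse a bit string under a (possibly non-prefix) code."""
    inv = {v: k for k, v in code.items()}   # codeword -> character
    results = []

    def walk(i, acc):
        if i == len(bits):
            results.append(acc)
            return
        # try every codeword that matches at position i
        for j in range(i + 1, len(bits) + 1):
            if bits[i:j] in inv:
                walk(j, acc + inv[bits[i:j]])

    walk(0, "")
    return results

code = {'a': '01', 'm': '10', 'n': '111', 'o': '0', 'r': '11', 's': '1', 't': '0011'}
```

Both "star" (1 0011 01 11) and "moron" (10 0 11 0 111) appear among the parses of 100110111, along with many others; a prefix code would yield exactly one.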

62 Why Are Prefix Codes Unambiguous? It suffices to show that the first character can be decoded unambiguously. We then remove this character and are left with the problem of decoding the first character of the remaining text, and so on until the whole text has been decoded. Assume that there are two characters c and c' that could potentially be the first character in the text, with encodings x_0 x_1 … x_k and y_0 y_1 … y_l. Assume further that k ≤ l. Since both c and c' can occur at the beginning of the text, we have x_i = y_i for 0 ≤ i ≤ k; that is, x_0 x_1 … x_k is a prefix of y_0 y_1 … y_l, a contradiction. 62

63 Representing a Prefix-Code Dictionary In the example: space = 000 A = 0010 E = 0011 s = 010 c = 0110 g = 0111 h = 1000 i = 1001 l = 1010 t = 1011 e = 110 n = 111 The dictionary is represented as a binary tree: each left edge is labeled 0, each right edge 1, and each character sits at the leaf reached by reading its codeword as a root-to-leaf path. 63

64 Figure 16.3 / Figure 16.4 64


66 Huffman code 66

67 The Cost of Huffman Code Let C be the set of characters in the text to be encoded, and let f(c) be the frequency of character c. Let d_T(c) be the depth of node (character) c in the tree representing the code. Then
B(T) = Σ_{c ∈ C} f(c) d_T(c)
is the number of bits required to encode the text using the code represented by tree T. We call B(T) the cost of tree T. Observation: In a tree T representing an optimal prefix code, every internal node has two children. 67

68 Greedy Choice Lemma: There exists an optimal prefix code such that the two characters with smallest frequency are siblings and have maximal depth in T. 68

69 Greedy Choice Proof: Let x and y be two such characters, and let T be a tree representing an optimal prefix code. Let a and b be two sibling leaves of maximal depth in T. Assume w.l.o.g. that f(x) ≤ f(y) and f(a) ≤ f(b). This implies that f(x) ≤ f(a) and f(y) ≤ f(b). Let T' be the tree obtained by exchanging a with x and b with y. (Continued in the next slide.) 69

70 Greedy Choice The cost difference between trees T and T' is
B(T) − B(T') = (f(a) − f(x))(d_T(a) − d_T(x)) + (f(b) − f(y))(d_T(b) − d_T(y)) ≥ 0,
since f(a) ≥ f(x), f(b) ≥ f(y), and a and b have maximal depth in T. Hence T' is also optimal, and in T' the two minimum-frequency characters x and y are sibling leaves of maximal depth. 70

71 Optimal Substructure After joining two nodes x and y by making them children of a new node z, the algorithm treats z as a leaf with frequency f(z) = f(x) + f(y). Let C' be the character set in which x and y are replaced by the single character z with frequency f(z) = f(x) + f(y), and let T' be an optimal tree for C'. Let T be the tree obtained from T' by making x and y children of z. We observe the following relationship between B(T) and B(T'): B(T) = B(T') + f(x) + f(y) (Continued in the next slide.) 71


73 Lemma: If T' is optimal for C', then T is optimal for C. Assume the contrary. Then there exists a better tree T'' for C. Also, there exists a tree T''' at least as good as T'' for C in which x and y are sibling leaves of maximal depth. The removal of x and y from T''' turns their parent into a leaf; we can associate this leaf with z. The cost of the resulting tree is B(T''') − f(x) − f(y) < B(T) − f(x) − f(y) = B(T'). This contradicts the optimality of B(T'). Hence, T must be optimal for C. 73


75 Huffman code (Remarks) Assume that the string is generated by a memoryless source: regardless of the past, the next character in the string is c with probability f(c). Then Huffman coding is optimal. Can we do better? 75

76 Huffman code (Remarks) Huffman encodes fixed-length blocks. What if we vary them? Huffman uses a single code throughout the entire text. What if the characteristics change over time? What if the data has structure? For example: raster images, video Huffman is lossless. Necessary? LZW, MPEG, etc. 76

77 Huffman code (Implementation) Time complexity and data structure: Let S be the set of n weights (nodes). Constructing a Huffman code based on the greedy strategy can be described as follows:
Repeat until |S| = 1:
Find the two minimum nodes x and y in S and remove them from S
Construct a new node z with weight w(z) = w(x) + w(y) and insert z into S
77

78 Huffman code (Implementation) Why are data structures important? An algorithm for constructing a Huffman code (tree):
Repeat until |S| = 1:
Find the two minimum nodes x and y in S and remove them from S
Construct a new node z with weight w(z) = w(x) + w(y) and insert z into S
The time complexity of the algorithm depends on how S is implemented.
Data structure for S | find two min | insert z  | total
linked list          | O(n)         | O(1)      | O(n^2)
sorted array         | O(1)         | O(n)      | O(n^2)
?                    | O(log n)     | O(log n)  | O(n log n)
78
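With a binary min-heap (Python's `heapq`), each iteration costs O(log n), giving the O(n log n) bound in the last row of the table. The sketch below (my own, for illustration) computes only the cost B(T) of the resulting tree, using the fact that each merge of weights x and y adds x + y to the total:

```python
import heapq

def huffman_cost(freqs):
    """B(T): total bits needed to encode a text whose character
    frequencies are given, using an optimal Huffman code."""
    heap = list(freqs)
    heapq.heapify(heap)               # O(n)
    cost = 0
    while len(heap) > 1:              # n - 1 merges
        x = heapq.heappop(heap)       # O(log n)
        y = heapq.heappop(heap)       # O(log n)
        cost += x + y                 # every leaf under z gains one level
        heapq.heappush(heap, x + y)   # O(log n)
    return cost
```

For the CLRS frequencies (a:45, b:13, c:12, d:16, e:9, f:5, per 100 characters) this yields a cost of 224 bits.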

79 We will cover heap in Lecture 9 79

80 Greedy method: Recap Greedy algorithms are efficient algorithms for optimization problems that exhibit two properties: Greedy choice property: An optimal solution can be obtained by making locally optimal choices. Optimal substructure: An optimal solution contains within it optimal solutions to smaller subproblems. If only optimal substructure is present, dynamic programming may be a viable approach; the greedy choice property is what allows us to obtain faster algorithms than dynamic programming. 80

81 Questions? 81


Greedy Algorithms. Informal Definition A greedy algorithm makes its next step based only on the current state and simple calculations on the input. Greedy Algorithms Informal Definition A greedy algorithm makes its next step based only on the current state and simple calculations on the input. easy to design not always correct challenge is to identify

More information

So far... Finished looking at lower bounds and linear sorts.

So far... Finished looking at lower bounds and linear sorts. So far... Finished looking at lower bounds and linear sorts. Next: Memoization -- Optimization problems - Dynamic programming A scheduling problem Matrix multiplication optimization Longest Common Subsequence

More information

Tutorial 6-7. Dynamic Programming and Greedy

Tutorial 6-7. Dynamic Programming and Greedy Tutorial 6-7 Dynamic Programming and Greedy Dynamic Programming Why DP? Natural Recursion may be expensive. For example, the Fibonacci: F(n)=F(n-1)+F(n-2) Recursive implementation memoryless : time= 1

More information

/463 Algorithms - Fall 2013 Solution to Assignment 3

/463 Algorithms - Fall 2013 Solution to Assignment 3 600.363/463 Algorithms - Fall 2013 Solution to Assignment 3 (120 points) I (30 points) (Hint: This problem is similar to parenthesization in matrix-chain multiplication, except the special treatment on

More information

CSE 431/531: Algorithm Analysis and Design (Spring 2018) Greedy Algorithms. Lecturer: Shi Li

CSE 431/531: Algorithm Analysis and Design (Spring 2018) Greedy Algorithms. Lecturer: Shi Li CSE 431/531: Algorithm Analysis and Design (Spring 2018) Greedy Algorithms Lecturer: Shi Li Department of Computer Science and Engineering University at Buffalo Main Goal of Algorithm Design Design fast

More information

Dynamic Programming Group Exercises

Dynamic Programming Group Exercises Name: Name: Name: Dynamic Programming Group Exercises Adapted from material by Cole Frederick Please work the following problems in groups of 2 or 3. Use additional paper as needed, and staple the sheets

More information

CS473-Algorithms I. Lecture 10. Dynamic Programming. Cevdet Aykanat - Bilkent University Computer Engineering Department

CS473-Algorithms I. Lecture 10. Dynamic Programming. Cevdet Aykanat - Bilkent University Computer Engineering Department CS473-Algorithms I Lecture 1 Dynamic Programming 1 Introduction An algorithm design paradigm like divide-and-conquer Programming : A tabular method (not writing computer code) Divide-and-Conquer (DAC):

More information

Huffman Codes (data compression)

Huffman Codes (data compression) Huffman Codes (data compression) Data compression is an important technique for saving storage Given a file, We can consider it as a string of characters We want to find a compressed file The compressed

More information

Dynamic Programming. Design and Analysis of Algorithms. Entwurf und Analyse von Algorithmen. Irene Parada. Design and Analysis of Algorithms

Dynamic Programming. Design and Analysis of Algorithms. Entwurf und Analyse von Algorithmen. Irene Parada. Design and Analysis of Algorithms Entwurf und Analyse von Algorithmen Dynamic Programming Overview Introduction Example 1 When and how to apply this method Example 2 Final remarks Introduction: when recursion is inefficient Example: Calculation

More information

Greedy Algorithms. Textbook reading. Chapter 4 Chapter 5. CSci 3110 Greedy Algorithms 1/63

Greedy Algorithms. Textbook reading. Chapter 4 Chapter 5. CSci 3110 Greedy Algorithms 1/63 CSci 3110 Greedy Algorithms 1/63 Greedy Algorithms Textbook reading Chapter 4 Chapter 5 CSci 3110 Greedy Algorithms 2/63 Overview Design principle: Make progress towards a solution based on local criteria

More information

CSE 417 Dynamic Programming (pt 4) Sub-problems on Trees

CSE 417 Dynamic Programming (pt 4) Sub-problems on Trees CSE 417 Dynamic Programming (pt 4) Sub-problems on Trees Reminders > HW4 is due today > HW5 will be posted shortly Dynamic Programming Review > Apply the steps... 1. Describe solution in terms of solution

More information

Dynamic Programming. Lecture Overview Introduction

Dynamic Programming. Lecture Overview Introduction Lecture 12 Dynamic Programming 12.1 Overview Dynamic Programming is a powerful technique that allows one to solve many different types of problems in time O(n 2 ) or O(n 3 ) for which a naive approach

More information

Greedy Algorithms CLRS Laura Toma, csci2200, Bowdoin College

Greedy Algorithms CLRS Laura Toma, csci2200, Bowdoin College Greedy Algorithms CLRS 16.1-16.2 Laura Toma, csci2200, Bowdoin College Overview. Sometimes we can solve optimization problems with a technique called greedy. A greedy algorithm picks the option that looks

More information

Analysis of Algorithms Prof. Karen Daniels

Analysis of Algorithms Prof. Karen Daniels UMass Lowell Computer Science 91.503 Analysis of Algorithms Prof. Karen Daniels Spring, 2010 Lecture 2 Tuesday, 2/2/10 Design Patterns for Optimization Problems Greedy Algorithms Algorithmic Paradigm Context

More information

Chapter 6. Dynamic Programming

Chapter 6. Dynamic Programming Chapter 6 Dynamic Programming CS 573: Algorithms, Fall 203 September 2, 203 6. Maximum Weighted Independent Set in Trees 6..0. Maximum Weight Independent Set Problem Input Graph G = (V, E) and weights

More information

CSC 373 Lecture # 3 Instructor: Milad Eftekhar

CSC 373 Lecture # 3 Instructor: Milad Eftekhar Huffman encoding: Assume a context is available (a document, a signal, etc.). These contexts are formed by some symbols (words in a document, discrete samples from a signal, etc). Each symbols s i is occurred

More information

Problem Strategies. 320 Greedy Strategies 6

Problem Strategies. 320 Greedy Strategies 6 Problem Strategies Weighted interval scheduling: 2 subproblems (include the interval or don t) Have to check out all the possibilities in either case, so lots of subproblem overlap dynamic programming:

More information

Efficient Sequential Algorithms, Comp309. Motivation. Longest Common Subsequence. Part 3. String Algorithms

Efficient Sequential Algorithms, Comp309. Motivation. Longest Common Subsequence. Part 3. String Algorithms Efficient Sequential Algorithms, Comp39 Part 3. String Algorithms University of Liverpool References: T. H. Cormen, C. E. Leiserson, R. L. Rivest Introduction to Algorithms, Second Edition. MIT Press (21).

More information

CS 231: Algorithmic Problem Solving

CS 231: Algorithmic Problem Solving CS 231: Algorithmic Problem Solving Naomi Nishimura Module 5 Date of this version: June 14, 2018 WARNING: Drafts of slides are made available prior to lecture for your convenience. After lecture, slides

More information

CSE541 Class 2. Jeremy Buhler. September 1, 2016

CSE541 Class 2. Jeremy Buhler. September 1, 2016 CSE541 Class 2 Jeremy Buhler September 1, 2016 1 A Classic Problem and a Greedy Approach A classic problem for which one might want to apply a greedy algo is knapsack. Given: a knapsack of capacity M,

More information

February 24, :52 World Scientific Book - 9in x 6in soltys alg. Chapter 3. Greedy Algorithms

February 24, :52 World Scientific Book - 9in x 6in soltys alg. Chapter 3. Greedy Algorithms Chapter 3 Greedy Algorithms Greedy algorithms are algorithms prone to instant gratification. Without looking too far ahead, at each step they make a locally optimum choice, with the hope that it will lead

More information

We augment RBTs to support operations on dynamic sets of intervals A closed interval is an ordered pair of real

We augment RBTs to support operations on dynamic sets of intervals A closed interval is an ordered pair of real 14.3 Interval trees We augment RBTs to support operations on dynamic sets of intervals A closed interval is an ordered pair of real numbers ], with Interval ]represents the set Open and half-open intervals

More information

COMP251: Greedy algorithms

COMP251: Greedy algorithms COMP251: Greedy algorithms Jérôme Waldispühl School of Computer Science McGill University Based on (Cormen et al., 2002) Based on slides from D. Plaisted (UNC) & (goodrich & Tamassia, 2009) Disjoint sets

More information

4 Dynamic Programming

4 Dynamic Programming 4 Dynamic Programming Dynamic Programming is a form of recursion. In Computer Science, you have probably heard the tradeoff between Time and Space. There is a trade off between the space complexity and

More information

Lecture: Analysis of Algorithms (CS )

Lecture: Analysis of Algorithms (CS ) Lecture: Analysis of Algorithms (CS483-001) Amarda Shehu Spring 2017 1 The Fractional Knapsack Problem Huffman Coding 2 Sample Problems to Illustrate The Fractional Knapsack Problem Variable-length (Huffman)

More information

Lecture 3, Review of Algorithms. What is Algorithm?

Lecture 3, Review of Algorithms. What is Algorithm? BINF 336, Introduction to Computational Biology Lecture 3, Review of Algorithms Young-Rae Cho Associate Professor Department of Computer Science Baylor University What is Algorithm? Definition A process

More information

Main approach: always make the choice that looks best at the moment.

Main approach: always make the choice that looks best at the moment. Greedy algorithms Main approach: always make the choice that looks best at the moment. - More efficient than dynamic programming - Always make the choice that looks best at the moment (just one choice;

More information

DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING QUESTION BANK UNIT-III. SUB NAME: DESIGN AND ANALYSIS OF ALGORITHMS SEM/YEAR: III/ II PART A (2 Marks)

DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING QUESTION BANK UNIT-III. SUB NAME: DESIGN AND ANALYSIS OF ALGORITHMS SEM/YEAR: III/ II PART A (2 Marks) DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING QUESTION BANK UNIT-III SUB CODE: CS2251 DEPT: CSE SUB NAME: DESIGN AND ANALYSIS OF ALGORITHMS SEM/YEAR: III/ II PART A (2 Marks) 1. Write any four examples

More information

Algorithm Design Techniques part I

Algorithm Design Techniques part I Algorithm Design Techniques part I Divide-and-Conquer. Dynamic Programming DSA - lecture 8 - T.U.Cluj-Napoca - M. Joldos 1 Some Algorithm Design Techniques Top-Down Algorithms: Divide-and-Conquer Bottom-Up

More information

Computer Sciences Department 1

Computer Sciences Department 1 1 Advanced Design and Analysis Techniques (15.1, 15.2, 15.3, 15.4 and 15.5) 3 Objectives Problem Formulation Examples The Basic Problem Principle of optimality Important techniques: dynamic programming

More information

Homework3: Dynamic Programming - Answers

Homework3: Dynamic Programming - Answers Most Exercises are from your textbook: Homework3: Dynamic Programming - Answers 1. For the Rod Cutting problem (covered in lecture) modify the given top-down memoized algorithm (includes two procedures)

More information

CMPS 2200 Fall Dynamic Programming. Carola Wenk. Slides courtesy of Charles Leiserson with changes and additions by Carola Wenk

CMPS 2200 Fall Dynamic Programming. Carola Wenk. Slides courtesy of Charles Leiserson with changes and additions by Carola Wenk CMPS 00 Fall 04 Dynamic Programming Carola Wenk Slides courtesy of Charles Leiserson with changes and additions by Carola Wenk 9/30/4 CMPS 00 Intro. to Algorithms Dynamic programming Algorithm design technique

More information

Greedy algorithms is another useful way for solving optimization problems.

Greedy algorithms is another useful way for solving optimization problems. Greedy Algorithms Greedy algorithms is another useful way for solving optimization problems. Optimization Problems For the given input, we are seeking solutions that must satisfy certain conditions. These

More information

Text Compression through Huffman Coding. Terminology

Text Compression through Huffman Coding. Terminology Text Compression through Huffman Coding Huffman codes represent a very effective technique for compressing data; they usually produce savings between 20% 90% Preliminary example We are given a 100,000-character

More information

Main approach: always make the choice that looks best at the moment. - Doesn t always result in globally optimal solution, but for many problems does

Main approach: always make the choice that looks best at the moment. - Doesn t always result in globally optimal solution, but for many problems does Greedy algorithms Main approach: always make the choice that looks best at the moment. - More efficient than dynamic programming - Doesn t always result in globally optimal solution, but for many problems

More information

Algorithms for Euclidean TSP

Algorithms for Euclidean TSP This week, paper [2] by Arora. See the slides for figures. See also http://www.cs.princeton.edu/~arora/pubs/arorageo.ps Algorithms for Introduction This lecture is about the polynomial time approximation

More information

1 i n (p i + r n i ) (Note that by allowing i to be n, we handle the case where the rod is not cut at all.)

1 i n (p i + r n i ) (Note that by allowing i to be n, we handle the case where the rod is not cut at all.) Dynamic programming is a problem solving method that is applicable to many different types of problems. I think it is best learned by example, so we will mostly do examples today. 1 Rod cutting Suppose

More information

10/24/ Rotations. 2. // s left subtree s right subtree 3. if // link s parent to elseif == else 11. // put x on s left

10/24/ Rotations. 2. // s left subtree s right subtree 3. if // link s parent to elseif == else 11. // put x on s left 13.2 Rotations MAT-72006 AA+DS, Fall 2013 24-Oct-13 368 LEFT-ROTATE(, ) 1. // set 2. // s left subtree s right subtree 3. if 4. 5. // link s parent to 6. if == 7. 8. elseif == 9. 10. else 11. // put x

More information

Unit-5 Dynamic Programming 2016

Unit-5 Dynamic Programming 2016 5 Dynamic programming Overview, Applications - shortest path in graph, matrix multiplication, travelling salesman problem, Fibonacci Series. 20% 12 Origin: Richard Bellman, 1957 Programming referred to

More information

Algorithms IV. Dynamic Programming. Guoqiang Li. School of Software, Shanghai Jiao Tong University

Algorithms IV. Dynamic Programming. Guoqiang Li. School of Software, Shanghai Jiao Tong University Algorithms IV Dynamic Programming Guoqiang Li School of Software, Shanghai Jiao Tong University Dynamic Programming Shortest Paths in Dags, Revisited Shortest Paths in Dags, Revisited The special distinguishing

More information

15-451/651: Design & Analysis of Algorithms January 26, 2015 Dynamic Programming I last changed: January 28, 2015

15-451/651: Design & Analysis of Algorithms January 26, 2015 Dynamic Programming I last changed: January 28, 2015 15-451/651: Design & Analysis of Algorithms January 26, 2015 Dynamic Programming I last changed: January 28, 2015 Dynamic Programming is a powerful technique that allows one to solve many different types

More information

Sankalchand Patel College of Engineering - Visnagar Department of Computer Engineering and Information Technology. Assignment

Sankalchand Patel College of Engineering - Visnagar Department of Computer Engineering and Information Technology. Assignment Class: V - CE Sankalchand Patel College of Engineering - Visnagar Department of Computer Engineering and Information Technology Sub: Design and Analysis of Algorithms Analysis of Algorithm: Assignment

More information

Longest Common Subsequence, Knapsack, Independent Set Scribe: Wilbur Yang (2016), Mary Wootters (2017) Date: November 6, 2017

Longest Common Subsequence, Knapsack, Independent Set Scribe: Wilbur Yang (2016), Mary Wootters (2017) Date: November 6, 2017 CS161 Lecture 13 Longest Common Subsequence, Knapsack, Independent Set Scribe: Wilbur Yang (2016), Mary Wootters (2017) Date: November 6, 2017 1 Overview Last lecture, we talked about dynamic programming

More information

CSE 417 Branch & Bound (pt 4) Branch & Bound

CSE 417 Branch & Bound (pt 4) Branch & Bound CSE 417 Branch & Bound (pt 4) Branch & Bound Reminders > HW8 due today > HW9 will be posted tomorrow start early program will be slow, so debugging will be slow... Review of previous lectures > Complexity

More information

CSE 101, Winter Design and Analysis of Algorithms. Lecture 11: Dynamic Programming, Part 2

CSE 101, Winter Design and Analysis of Algorithms. Lecture 11: Dynamic Programming, Part 2 CSE 101, Winter 2018 Design and Analysis of Algorithms Lecture 11: Dynamic Programming, Part 2 Class URL: http://vlsicad.ucsd.edu/courses/cse101-w18/ Goal: continue with DP (Knapsack, All-Pairs SPs, )

More information

Solving NP-hard Problems on Special Instances

Solving NP-hard Problems on Special Instances Solving NP-hard Problems on Special Instances Solve it in poly- time I can t You can assume the input is xxxxx No Problem, here is a poly-time algorithm 1 Solving NP-hard Problems on Special Instances

More information

Design and Analysis of Algorithms 演算法設計與分析. Lecture 13 December 18, 2013 洪國寶

Design and Analysis of Algorithms 演算法設計與分析. Lecture 13 December 18, 2013 洪國寶 Design and Analysis of Algorithms 演算法設計與分析 Lecture 13 December 18, 2013 洪國寶 1 Homework #10 1. 24.1-1 (p. 591 / p. 654) 2. 24.1-6 (p. 592 / p. 655) 3. 24.3-2 (p. 600 / p. 663) 4. 24.3-8 (p. 601) / 24.3-10

More information

Data Structures and Algorithms Week 8

Data Structures and Algorithms Week 8 Data Structures and Algorithms Week 8 Dynamic programming Fibonacci numbers Optimization problems Matrix multiplication optimization Principles of dynamic programming Longest Common Subsequence Algorithm

More information

Dijkstra s algorithm for shortest paths when no edges have negative weight.

Dijkstra s algorithm for shortest paths when no edges have negative weight. Lecture 14 Graph Algorithms II 14.1 Overview In this lecture we begin with one more algorithm for the shortest path problem, Dijkstra s algorithm. We then will see how the basic approach of this algorithm

More information

Lecture 5: Dynamic Programming II

Lecture 5: Dynamic Programming II Lecture 5: Dynamic Programming II Scribe: Weiyao Wang September 12, 2017 1 Lecture Overview Today s lecture continued to discuss dynamic programming techniques, and contained three parts. First, we will

More information

Chapter 9. Greedy Technique. Copyright 2007 Pearson Addison-Wesley. All rights reserved.

Chapter 9. Greedy Technique. Copyright 2007 Pearson Addison-Wesley. All rights reserved. Chapter 9 Greedy Technique Copyright 2007 Pearson Addison-Wesley. All rights reserved. Greedy Technique Constructs a solution to an optimization problem piece by piece through a sequence of choices that

More information

MCS-375: Algorithms: Analysis and Design Handout #G2 San Skulrattanakulchai Gustavus Adolphus College Oct 21, Huffman Codes

MCS-375: Algorithms: Analysis and Design Handout #G2 San Skulrattanakulchai Gustavus Adolphus College Oct 21, Huffman Codes MCS-375: Algorithms: Analysis and Design Handout #G2 San Skulrattanakulchai Gustavus Adolphus College Oct 21, 2016 Huffman Codes CLRS: Ch 16.3 Ziv-Lempel is the most popular compression algorithm today.

More information

Lecture 13: Chain Matrix Multiplication

Lecture 13: Chain Matrix Multiplication Lecture 3: Chain Matrix Multiplication CLRS Section 5.2 Revised April 7, 2003 Outline of this Lecture Recalling matrix multiplication. The chain matrix multiplication problem. A dynamic programming algorithm

More information

Greedy Algorithms and Huffman Coding

Greedy Algorithms and Huffman Coding Greedy Algorithms and Huffman Coding Henry Z. Lo June 10, 2014 1 Greedy Algorithms 1.1 Change making problem Problem 1. You have quarters, dimes, nickels, and pennies. amount, n, provide the least number

More information

PSD1A. DESIGN AND ANALYSIS OF ALGORITHMS Unit : I-V

PSD1A. DESIGN AND ANALYSIS OF ALGORITHMS Unit : I-V PSD1A DESIGN AND ANALYSIS OF ALGORITHMS Unit : I-V UNIT I -- Introduction -- Definition of Algorithm -- Pseudocode conventions -- Recursive algorithms -- Time and space complexity -- Big- o notation --

More information

Virtual University of Pakistan

Virtual University of Pakistan Virtual University of Pakistan Department of Computer Science Course Outline Course Instructor Dr. Sohail Aslam E mail Course Code Course Title Credit Hours 3 Prerequisites Objectives Learning Outcomes

More information

Greedy Algorithms 1. For large values of d, brute force search is not feasible because there are 2 d

Greedy Algorithms 1. For large values of d, brute force search is not feasible because there are 2 d Greedy Algorithms 1 Simple Knapsack Problem Greedy Algorithms form an important class of algorithmic techniques. We illustrate the idea by applying it to a simplified version of the Knapsack Problem. Informally,

More information

Chapter 16. Greedy Algorithms

Chapter 16. Greedy Algorithms Chapter 16. Greedy Algorithms Algorithms for optimization problems (minimization or maximization problems) typically go through a sequence of steps, with a set of choices at each step. A greedy algorithm

More information

CS Algorithms and Complexity

CS Algorithms and Complexity CS 350 - Algorithms and Complexity Dynamic Programming Sean Anderson 2/20/18 Portland State University Table of contents 1. Homework 3 Solutions 2. Dynamic Programming 3. Problem of the Day 4. Application

More information

Introduction to Algorithms / Algorithms I Lecturer: Michael Dinitz Topic: Shortest Paths Date: 10/13/15

Introduction to Algorithms / Algorithms I Lecturer: Michael Dinitz Topic: Shortest Paths Date: 10/13/15 600.363 Introduction to Algorithms / 600.463 Algorithms I Lecturer: Michael Dinitz Topic: Shortest Paths Date: 10/13/15 14.1 Introduction Today we re going to talk about algorithms for computing shortest

More information

CSC 421: Algorithm Design & Analysis. Spring 2015

CSC 421: Algorithm Design & Analysis. Spring 2015 CSC 421: Algorithm Design & Analysis Spring 2015 Greedy algorithms greedy algorithms examples: optimal change, job scheduling Prim's algorithm (minimal spanning tree) Dijkstra's algorithm (shortest path)

More information

Dynamic Programming part 2

Dynamic Programming part 2 Dynamic Programming part 2 Week 7 Objectives More dynamic programming examples - Matrix Multiplication Parenthesis - Longest Common Subsequence Subproblem Optimal structure Defining the dynamic recurrence

More information

Welcome Back to Fundamentals of Multimedia (MR412) Fall, 2012 Lecture 10 (Chapter 7) ZHU Yongxin, Winson

Welcome Back to Fundamentals of Multimedia (MR412) Fall, 2012 Lecture 10 (Chapter 7) ZHU Yongxin, Winson Welcome Back to Fundamentals of Multimedia (MR412) Fall, 2012 Lecture 10 (Chapter 7) ZHU Yongxin, Winson zhuyongxin@sjtu.edu.cn 2 Lossless Compression Algorithms 7.1 Introduction 7.2 Basics of Information

More information