Design and Analysis of Algorithms 演算法設計與分析. Lecture 7 April 15, 2015 洪國寶


1 Design and Analysis of Algorithms 演算法設計與分析 Lecture 7 April 15, 2015 洪國寶 1

2 Course information (5/5) Grading (Tentative) Homework 25% (You may collaborate when solving the homework; however, when writing up the solutions you must do so on your own. No typed or printed assignments.) Midterm exam 25% (Open book and notes) April 22, 2015 Final exam 25% (Open book and notes) Class participation 25%

3 Outline of the course (1/1) Introduction (1-4) Midterm exam Data structures (10-14) Apr. 22, 2015 Dynamic programming (15) Greedy methods (16) Amortized analysis (17) Advanced data structures (6, 19-21) Graph algorithms (22-25) NP-completeness (34-35) Other topics (5, 31) 3

4 Outline Review Dynamic programming (Cont.) Longest Common Subsequence (LCS) Greedy method Activity-selection algorithm Basic elements of greedy approach Examples where greedy approach does not work Discuss homework 4

5 Review: Elements of Dynamic Programming Optimal substructure: an optimal solution to the problem contains within it optimal solutions to subproblems (for DP to be applicable) Overlapping subproblems: the space of subproblems must be small (for algorithm to be efficient) 5

6 Optimal Polygon Triangulation Def: A polygon P is convex if, when you connect any two points p, q in the polygon (interior + boundary), the entire line segment pq lies in the polygon. (Figures: one example polygon that is convex, one that is not convex.)

7 Optimal Polygon Triangulation A triangulation of a polygon P = {v_0, v_1, ..., v_{n-1}} is a set of non-intersecting diagonals that partitions the polygon into triangles. An edge v_i v_j is called a diagonal (or chord) if v_i and v_j are not adjacent vertices. The weight of a triangulation is the total weight of all its triangles, with w(△ v_i v_j v_k) defined as: w(△ v_i v_j v_k) = |v_i v_j| + |v_j v_k| + |v_i v_k|. Weight = boundary + 2 * (total length of all diagonals). Can we define w(△ v_i v_j v_k) to be the area of △ v_i v_j v_k?

8 Optimal Polygon Triangulation Input: a convex polygon P = (v_0, v_1, ..., v_{n-1}) Output: an optimal (minimum-weight) triangulation

9 Optimal Polygon Triangulation Triangulation of a convex polygon: pick any diagonal, then recurse on the reduced convex polygons. (Figure: one of the resulting sub-problems.)

10 Optimal Polygon Triangulation Let t[i,j] be the weight of an optimal triangulation of the polygon (v_{i-1}, v_i, ..., v_j):
t[i,j] = 0                                                               if i = j
t[i,j] = min_{i ≤ k < j} { t[i,k] + t[k+1,j] + w(△ v_{i-1} v_k v_j) }    if i < j
The problem becomes two sub-problems of the same form plus one triangle.

11 Optimal Polygon Triangulation vs. Matrix-chain multiplication
Optimal polygon triangulation:
t[i,j] = 0                                                               if i = j
t[i,j] = min_{i ≤ k < j} { t[i,k] + t[k+1,j] + w(△ v_{i-1} v_k v_j) }    if i < j
Matrix-chain multiplication:
m[i,j] = 0                                                               if i = j
m[i,j] = min_{i ≤ k < j} { m[i,k] + m[k+1,j] + p_{i-1} p_k p_j }         if i < j

12 Optimal Polygon Triangulation Algorithm Similar to matrix-chain multiplication; running time is O(n^3). Recall from Lecture 01: why abstract problems? One algorithm, many applications: matrix-chain multiplication and optimal polygon triangulation; optimal merge pattern and Huffman code.
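To make the O(n^3) table-filling concrete, here is a minimal Common Lisp sketch of the recurrence on slide 10. The names optimal-triangulation and triangle-weight are illustrative, the vertices are assumed to be given as a vector of (x y) pairs, and the weight of a triangle is taken to be its perimeter, as defined on slide 7.

(defun triangle-weight (verts a b c)
  ;; perimeter of the triangle on vertex indices a, b, c of VERTS
  (flet ((dist (p q) (sqrt (+ (expt (- (first p) (first q)) 2)
                              (expt (- (second p) (second q)) 2)))))
    (let ((pa (aref verts a)) (pb (aref verts b)) (pc (aref verts c)))
      (+ (dist pa pb) (dist pb pc) (dist pa pc)))))

(defun optimal-triangulation (verts)
  ;; VERTS: vector #((x0 y0) (x1 y1) ...) listing a convex polygon v_0 .. v_{n-1}
  (let* ((n (length verts))
         (tab (make-array (list n n) :initial-element 0)))
    ;; tab[i][j] = weight of an optimal triangulation of (v_{i-1}, ..., v_j)
    (loop for len from 2 to (- n 1) do
      (loop for i from 1 to (- n len) do
        (let ((j (+ i len -1)))
          (setf (aref tab i j) most-positive-double-float)
          (loop for k from i below j do          ; try every triangle (v_{i-1}, v_k, v_j)
            (let ((cost (+ (aref tab i k) (aref tab (1+ k) j)
                           (triangle-weight verts (- i 1) k j))))
              (when (< cost (aref tab i j))
                (setf (aref tab i j) cost)))))))
    (aref tab 1 (- n 1))))

For example, (optimal-triangulation #((0 0) (1 0) (1 1) (0 1))) on the unit square gives 4 + 2*sqrt(2): the boundary plus twice the single diagonal, whichever diagonal is chosen.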

13 Dynamic Programming (Remarks) Matrix-chain multiplication: O(n log n) Longest common subsequence: - O(mn/log n), O((m+n)/log (m+n)) Other problems Rod cutting / Assembly-line scheduling Optimal binary search trees Some real world applications - Speech recognition (Viterbi algorithm) - Image processing (Morphing) 13

14 Outline Greedy method Activity-selection algorithm Basic elements of greedy approach Examples where greedy approach does not work 14

15 Algorithm design techniques So far, we've looked at the following design techniques: - induction (incremental approach) - divide and conquer - augmenting data structures - dynamic programming Coming up: the greedy method

16 Dynamic Programming VS. Greedy Algorithms Dynamic programming uses optimal substructure in a bottom-up fashion: first find optimal solutions to subproblems and, having solved the subproblems, find an optimal solution to the problem. Greedy algorithms use optimal substructure in a top-down fashion: first make a choice (the choice that looks best at the time) and then solve the resulting subproblem.

17 Greedy Algorithms A greedy algorithm always makes the choice that looks best at the moment The hope: a locally optimal choice will lead to a globally optimal solution For some problems, it works Dynamic programming can be overkill; greedy algorithms tend to be easier to code 17

18 Example: Activity-Selection Formally: Given a set S of n activities, with s_i = start time of activity i and f_i = finish time of activity i. Find a max-size subset A of compatible activities: for all i, j ∈ A, the intervals [s_i, f_i) and [s_j, f_j) do not overlap.

19 Activity Selection: Optimal Substructure Let k be the minimum activity in A (i.e., the one with the earliest finish time). Then A - {k} is an optimal solution to S' = {i ∈ S : s_i ≥ f_k}. In words: once activity k is selected, the problem reduces to finding an optimal solution for activity selection over the activities in S compatible with k. Proof: if we could find an optimal solution B to S' with |B| > |A - {k}|, then B ∪ {k} would be compatible and |B ∪ {k}| > |A|, contradicting the optimality of A.

20 Activity Selection: Greedy Choice Property Dynamic programming? Memoize? The activity selection problem also exhibits the greedy choice property: a locally optimal choice leads to a globally optimal solution. Thm 16.1: if S is an activity selection problem sorted by finish time, then there is an optimal solution A ⊆ S such that {1} ⊆ A. Sketch of proof: if there is an optimal solution B that does not contain {1}, we can always replace the first activity in B with {1} (Why?). Same number of activities, thus still optimal.

21 Activity Selection: A Greedy Algorithm Intuition is simple: always pick the activity with the earliest finish time available at the time. GAS(S) 1 if S = NIL then return NIL 2 else return {k} ∪ GAS(S'), where k is the activity in S with smallest f, and S' = {i ∈ S : s_i ≥ f_k}

22 Activity Selection: A Greedy Algorithm GAS(S) 1 if S = NIL then return NIL 2 else return {k} ∪ GAS(S'), where k is the activity in S with smallest f, and S' = {i ∈ S : s_i ≥ f_k} Proof of correctness: on the blackboard.

23 Activity Selection: A LISP program GAS(S) 1 if S = NIL then return NIL 2 else return {k} ∪ GAS(S'), where k is the activity in S with smallest f, and S' = {i ∈ S : s_i ≥ f_k}

(defun gas (l)
  ;; L is a list of (start finish) activities sorted by finish time
  (print l)                                    ; show the current S' at each call
  (cond ((null l) nil)
        (T (cons (car l)                       ; keep the earliest-finishing activity
                 (gas (filter (nth 1 (car l)) (cdr l)))))))

(defun filter (s l)
  ;; keep only the activities in L whose start time is at least S
  (cond ((null l) nil)
        ((> s (nth 0 (car l))) (filter s (cdr l)))
        (T (cons (car l) (filter s (cdr l))))))

24 Activity Selection: A LISP program (cont.)
(defvar *activity* '((1 4) (3 5) (0 6) (5 7) (3 8) (5 9) (6 10) (8 11) (8 12) (2 13) (12 14)))
[3]> *activity*
((1 4) (3 5) (0 6) (5 7) (3 8) (5 9) (6 10) (8 11) (8 12) (2 13) (12 14))
[4]> (gas *activity*)
((1 4) (3 5) (0 6) (5 7) (3 8) (5 9) (6 10) (8 11) (8 12) (2 13) (12 14))
((5 7) (5 9) (6 10) (8 11) (8 12) (12 14))
((8 11) (8 12) (12 14))
((12 14))
NIL
((1 4) (5 7) (8 11) (12 14))
[5]> (dribble)
(The lists printed before the final value are the S' for each recursive call, from (print l).)

25 Activity Selection: A Greedy Algorithm If we sort the activities by finish time, then the algorithm is simple (iterative instead of recursive): Sort the activities by finish time. Schedule the first activity. Then schedule the next activity in the sorted list which starts after the previous activity finishes. Repeat until no more activities remain. Note: we don't have to construct S' explicitly. Why?

26 Greedy-Activity-Selector Assume (wlog) that f_1 ≤ f_2 ≤ ... ≤ f_n. Time complexity: O(n). Exercise 1: Proof of correctness by loop invariant.
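The Greedy-Activity-Selector pseudocode itself is not reproduced in this transcription; as a stand-in, here is a minimal iterative sketch in Common Lisp under the same assumptions (activities given as (start finish) pairs already sorted by finish time; the name greedy-activity-selector is illustrative).

(defun greedy-activity-selector (acts)
  ;; ACTS: list of (start finish) pairs sorted by non-decreasing finish time
  (let ((selected '())
        (last-finish nil))
    (dolist (a acts (nreverse selected))
      (when (or (null last-finish) (>= (first a) last-finish))
        (push a selected)                    ; compatible with everything chosen so far
        (setf last-finish (second a))))))

On the *activity* list from slide 24, (greedy-activity-selector *activity*) returns ((1 4) (5 7) (8 11) (12 14)), the same answer as the recursive gas.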


28 Activity Selection 28

29 Other greedy choices Greedy-Activity-Selector always picks the activity with the earliest finish time available at the time (smallest f). Some other greedy choices: largest f, largest/smallest s, largest/smallest (f - s), fewest overlapping. Exercise 2: Which of these criteria result in optimal solutions?

30 A Variation of the Problem Instead of maximizing the number of activities we want to schedule, we want to maximize the total time the resource is in use. None of the obvious greedy choices would work: Choose the activity that starts earliest/latest Choose the activity that finishes earliest/latest Choose the longest activity Exercise 3: Design an efficient algorithm for this variation of activity selection problem. 30

31 Elements of Greedy Strategy Greedy choice property: An optimal solution can be obtained by making choices that seem best at the time, without considering their implications for solutions to subproblems. Optimal substructure: An optimal solution can be obtained by augmenting the partial solution constructed so far with an optimal solution of the remaining subproblem. 31

32 Steps in designing greedy algorithms 1. Cast the optimization problem as one in which we make a choice and are left with one subproblem to solve. 2. Prove the greedy choice property. 3. Demonstrate optimal substructure. 32

33 Examples where greedy approach does not work Traveling salesman problem: nearest neighbor, closest pair Matrix-chain multiplication: multiply the two matrices with lowest cost first Knapsack problem: largest values, largest value per unit weight 33

34 Traveling salesman problem Correctness is not obvious (2/7) (nearest neighbor) 34

35 Traveling salesman problem Correctness is not obvious (3/7) 35

36 Matrix-chain multiplication Greedy choice: multiply the two matrices with the lowest cost first. Example 1: <A_1, A_2, A_3> with dimensions (10*100, 100*5, 5*50). Example 2: <A_1, A_2, A_3, A_4, A_5, A_6> with dimensions (30*35, 35*15, 15*5, 5*10, 10*20, 20*25).

37 Matrix-chain multiplication Example 1: <A_1, A_2, A_3> with dimensions (10*100, 100*5, 5*50). ((A_1 A_2) A_3): 10*100*5 + 10*5*50 = 5000 + 2500 = 7500. (A_1 (A_2 A_3)): 100*5*50 + 10*100*50 = 25000 + 50000 = 75000.

38 Matrix-chain multiplication Example 2: <A_1, A_2, A_3, A_4, A_5, A_6> with dimensions (30*35, 35*15, 15*5, 5*10, 10*20, 20*25). The greedy choice multiplies A_3 A_4 first (cost 15*5*10 = 750), but the optimal parenthesization (Figure 15.3 in CLRS) is ((A_1 (A_2 A_3)) ((A_4 A_5) A_6)), which never multiplies A_3 by A_4.
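The optimal cost for Example 2 can be checked with the standard matrix-chain DP from slide 11. Here is a minimal Common Lisp sketch, assuming p is the dimension list (p_0, p_1, ..., p_n) so that A_i is p_{i-1} x p_i; it returns only the optimal number of scalar multiplications.

(defun matrix-chain-order (p)
  (let* ((n (1- (length p)))                       ; number of matrices
         (dims (coerce p 'vector))
         (m (make-array (list (1+ n) (1+ n)) :initial-element 0)))
    (loop for len from 2 to n do                   ; chain length
      (loop for i from 1 to (+ (- n len) 1) do
        (let ((j (+ i len -1)))
          (setf (aref m i j) most-positive-fixnum)
          (loop for k from i below j do            ; split point
            (let ((q (+ (aref m i k) (aref m (1+ k) j)
                        (* (aref dims (1- i)) (aref dims k) (aref dims j)))))
              (when (< q (aref m i j))
                (setf (aref m i j) q)))))))
    (aref m 1 n)))

Here (matrix-chain-order '(30 35 15 5 10 20 25)) evaluates to 15125, the cost of ((A_1 (A_2 A_3)) ((A_4 A_5) A_6)).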


40 Knapsack problem Given some items, pack the knapsack to get the maximum total value. Each item has some weight and some value. The total weight that we can carry is no more than some fixed number W, so we must consider the weights of items as well as their values. (Table of example items with columns: Item #, Weight, Value.)

41 Knapsack Problem A thief breaks into a jewelry store carrying a knapsack. Given n items S = {item_1, ..., item_n}, each having a weight w_i and a value v_i, which items should the thief put into the knapsack of capacity W to obtain the maximum value?

42 Knapsack problem There are two versions of the problem: (1) the 0-1 knapsack problem: items are indivisible; you either take an item or not. Solved with dynamic programming. (2) the fractional knapsack problem: items are divisible; you can take any fraction of an item. Solved with a greedy algorithm.

43 0-1 Knapsack Problem This problem requires determining a subset A of S such that Σ_{item_i ∈ A} v_i is maximized subject to Σ_{item_i ∈ A} w_i ≤ W. The naïve approach is to generate all possible subsets of S and find the maximum value, requiring 2^n time.
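As a reference point for the 2^n bound, here is a minimal brute-force sketch in Common Lisp (the name knapsack-brute is illustrative; weights and values are parallel lists). For each item it tries both leaving it out and, if it still fits, putting it in.

(defun knapsack-brute (weights values capacity)
  ;; best total value using any subset of the remaining items
  (if (null weights)
      0
      (let ((skip (knapsack-brute (cdr weights) (cdr values) capacity)))
        (if (<= (car weights) capacity)
            (max skip (+ (car values)
                         (knapsack-brute (cdr weights) (cdr values)
                                         (- capacity (car weights)))))
            skip))))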

44 Does Greedy Approach Work? Strategy 1: Steal the items with the largest values. E.g. w = [25, 10, 10], v = [$10, $9, $9], W = 30. The greedy value is 10 (item 1 alone, which leaves no room for the others), although the optimal value is 18 (items 2 and 3).

45 Does Greedy Approach Work? Strategy 2: Steal the items with the largest value per unit weight. E.g. w = [5, 20, 10], v = [$50, $140, $60], W = 30. The greedy value is 190 (items 1 and 2), although the optimal would be 200 (items 2 and 3).

46 NP-hard Problems (Knapsack, Traveling Salesman, ...) The two approaches above cannot yield the optimal result for 0-1 knapsack. This problem is NP-hard even when all items are of the same kind, that is, item i is described by w_i only (v_i = w_i). Observation: the greedy approach to the fractional knapsack problem yields the optimal solution. E.g. (see the 2nd example above): 190 + (5/10)*60 = 220, where 5 is the remaining capacity of the knapsack.

47 Dynamic Programming Approach for the Knapsack Problem We can find an optimal solution for the 0-1 knapsack problem by DP, but not in polynomial time. Let A be an optimal subset with items from {item_1, ..., item_i} only. There are two cases: 1) A contains item_i: then the total value of A equals v_i plus the optimal value obtainable from the first i-1 items, where the total weight cannot exceed W - w_i. 2) A does not contain item_i: then the total value of A equals that of the optimal subset chosen from the first i-1 items (with total weight not exceeding W). Q: What are the subproblems?

48 Dynamic Programming Approach for the Knapsack Problem
A[i][w] = max(A[i-1][w], v_i + A[i-1][w - w_i])   if w ≥ w_i
A[i][w] = A[i-1][w]                               if w < w_i
The maximum value is A[n][W]. Running time is O(nW), which is not polynomial in n. Q: Why? (Figure: the n-by-W DP table.)
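A minimal Common Lisp sketch of this table-filling recurrence, assuming integer weights, values and capacity (the name knapsack-dp is illustrative).

(defun knapsack-dp (weights values capacity)
  ;; WEIGHTS, VALUES: parallel lists of non-negative integers; CAPACITY: the integer W
  (let* ((n (length weights))
         (w (coerce weights 'vector))
         (v (coerce values 'vector))
         (a (make-array (list (1+ n) (1+ capacity)) :initial-element 0)))
    (loop for i from 1 to n do
      (loop for cap from 0 to capacity do
        (setf (aref a i cap)
              (if (>= cap (aref w (1- i)))
                  ;; either leave item i out, or take it and fill capacity cap - w_i
                  (max (aref a (1- i) cap)
                       (+ (aref v (1- i))
                          (aref a (1- i) (- cap (aref w (1- i))))))
                  (aref a (1- i) cap)))))
    (aref a n capacity)))                          ; A[n][W]

On the two counterexamples above, (knapsack-dp '(25 10 10) '(10 9 9) 30) gives 18 and (knapsack-dp '(5 20 10) '(50 140 60) 30) gives 200, the optimal values that the greedy strategies missed.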

49 Fractional knapsack problem Greedy approach: take items in order of greatest value per pound. Optimal for the fractional version (why?), but not for the 0-1 version.
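A minimal Common Lisp sketch of this greedy, assuming items are (weight value) pairs with positive weights (the name fractional-knapsack is illustrative); it takes items in decreasing order of value per unit weight and splits only the last item taken.

(defun fractional-knapsack (items capacity)
  (let ((sorted (sort (copy-list items) #'>
                      :key (lambda (it) (/ (second it) (first it)))))  ; value per unit weight
        (total 0)
        (remaining capacity))
    (dolist (it sorted total)
      (let ((take (min (first it) remaining)))     ; as much of this item as still fits
        (incf total (* take (/ (second it) (first it))))
        (decf remaining take)))))

On the second counterexample, (fractional-knapsack '((5 50) (20 140) (10 60)) 30) gives 220, matching the calculation on slide 46.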

50 Questions? 50
