Design and Analysis of Algorithms


Design and Analysis of Algorithms CSE 5311 Lecture 16 Greedy algorithms Junzhou Huang, Ph.D. Department of Computer Science and Engineering CSE5311 Design and Analysis of Algorithms 1

Overview
A greedy algorithm always makes the choice that looks best at the moment: it makes a locally optimal choice in the hope of reaching a globally optimal solution. Everyday analogues: playing cards, investing in stocks, etc. Greedy algorithms do not always yield optimal solutions, but they do for some problems with optimal substructure (as in dynamic programming), and they are usually easier to code than DP.

An Activity-Selection Problem
Input: a set S of n activities a_1, a_2, ..., a_n. Activity a_i starts at time s_i and finishes at time f_i. Two activities are compatible if their intervals do not overlap.
Output: a maximum-size subset of mutually compatible activities.
Assume the activities are sorted by finish time: f_1 ≤ f_2 ≤ ... ≤ f_n.
Example:
  i    1  2  3  4  5  6  7  8  9 10 11
  s_i  1  3  0  5  3  5  6  8  8  2 12
  f_i  4  5  6  8  8  9 10 11 12 13 14
Possible sets of mutually compatible activities: {a_3, a_9, a_11}, {a_1, a_4, a_8, a_11}, {a_2, a_4, a_9, a_11}; the last two are largest.

An Activity-Selection Problem (figure omitted)

Optimal Substructure
Suppose an optimal solution includes activity a_k. This splits the problem into two subproblems:
1. From a_1, ..., a_{k-1}, select compatible activities that finish before a_k starts.
2. From a_{k+1}, ..., a_n, select compatible activities that start after a_k finishes.
The solutions to the two subproblems must themselves be optimal (proved using a cut-and-paste argument).

Cut-and-Paste
The "cut and paste" technique is a way to prove that a problem has the optimal-substructure property. In particular, you want to show that an optimal solution to a problem necessarily uses optimal solutions to its constituent subproblems. The proof is by contradiction. Suppose you came up with an optimal solution that uses a suboptimal solution to some subproblem. Then, replacing ("cutting out") that suboptimal subproblem solution and "pasting in" an optimal one would improve your overall solution. But your solution was optimal by assumption, a contradiction.

Recursive Solution
S_ij: the set of activities that start after a_i finishes and finish before a_j starts.
Subproblems: find c[i, j], the maximum number of mutually compatible activities in S_ij.
Recurrence:
  c[i, j] = 0 if S_ij = ∅
  c[i, j] = max { c[i, k] + c[k, j] + 1 : a_k ∈ S_ij } otherwise

Early Finish Greedy
1. while activities remain
2.   select the activity with the earliest finish time
3.   remove the activities that are not compatible with it
4. end while
Greedy in the sense that it leaves as much opportunity as possible for the remaining activities to be scheduled: it maximizes the amount of unscheduled time remaining.
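The loop above can be sketched in Python (the function name `select_activities` is my own; as on the slides, it assumes the activities arrive sorted by finish time):

```python
def select_activities(start, finish):
    """Earliest-finish greedy; activities must be sorted by finish time."""
    selected = []
    last_finish = 0  # finish time of the most recently selected activity
    for i in range(len(start)):
        if start[i] >= last_finish:   # compatible with everything chosen so far
            selected.append(i + 1)    # report 1-based activity indices
            last_finish = finish[i]
    return selected

# The 11-activity example from the slides:
s = [1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12]
f = [4, 5, 6, 8, 8, 9, 10, 11, 12, 13, 14]
print(select_activities(s, f))  # [1, 4, 8, 11]
```

This recovers the largest subset {a_1, a_4, a_8, a_11} from the example. After sorting, the scan touches each activity once, so the selection itself is Θ(n).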

Example (slides 9-13 step through the greedy selection on the activity set above; figures omitted)

Greedy Choice Property
A locally optimal choice yields a globally optimal solution.
Theorem 16.1: if S is a non-empty activity-selection subproblem, then there exists an optimal solution A to S that contains a_s1, the activity in S with the earliest finish time.
Sketch of proof: if some optimal solution B does not contain a_s1, we can always replace the first (earliest-finishing) activity in B with a_s1; since a_s1 finishes no later, the set remains compatible. It has the same number of activities and is thus also optimal.

Recursive Algorithm
Initial call: Recursive-Activity-Selector(s, f, 0, n) (as in CLRS). Complexity: Θ(n), since each activity is examined exactly once, assuming the input is already sorted by finish time. Straightforward to convert to an iterative algorithm.

Iterative Algorithm
Initial call: Greedy-Activity-Selector(s, f) (as in CLRS). Complexity: Θ(n).

Elements of the Greedy Strategy
1. Determine the optimal substructure.
2. Develop a recursive solution.
3. Prove that if we make the greedy choice, only one subproblem remains.
4. Prove that it is safe to make the greedy choice.
5. Develop a recursive algorithm that implements the greedy strategy.
6. Convert the recursive algorithm to an iterative one.

Not All Greedy are Equal
Earliest finish time: select the activity with the earliest finish time. (Optimal)
Earliest start time: select the activity with the earliest start time.
Shortest duration: select the activity with the shortest duration d_i = f_i − s_i.
Fewest conflicts: select first the activity that conflicts with the fewest other activities.
Last start time: select the activity with the latest start time.

Not All Greedy are Equal: counterexamples for the earliest-start-time and shortest-duration (d_i = f_i − s_i) strategies (figures omitted)

Not All Greedy are Equal: counterexample for the fewest-conflicts strategy (figure omitted)
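Since the counterexample figures are not reproduced here, the failures can be checked directly by comparing each greedy rule against a brute-force optimum on tiny instances. A sketch (helper names `optimum` and `greedy` are mine; the two instances are standard textbook-style counterexamples, and the fewest-conflicts rule needs a larger instance, so it is omitted):

```python
from itertools import combinations

def compatible(acts):
    """True if no two intervals [s, f) in acts overlap."""
    acts = sorted(acts)
    return all(acts[i][1] <= acts[i + 1][0] for i in range(len(acts) - 1))

def optimum(acts):
    """Brute-force maximum number of mutually compatible activities."""
    return max(k for k in range(len(acts) + 1)
               for c in combinations(acts, k) if compatible(c))

def greedy(acts, key):
    """Generic greedy: repeatedly take the best remaining compatible activity."""
    chosen = []
    for a in sorted(acts, key=key):
        if compatible(chosen + [a]):
            chosen.append(a)
    return len(chosen)

# Earliest start time fails: it grabs the long interval (0, 100) first.
acts = [(0, 100), (1, 2), (3, 4)]
assert greedy(acts, key=lambda a: a[0]) == 1 and optimum(acts) == 2

# Shortest duration fails: (9, 11) blocks both long intervals.
acts = [(0, 10), (10, 20), (9, 11)]
assert greedy(acts, key=lambda a: a[1] - a[0]) == 1 and optimum(acts) == 2

# Earliest finish time matches the optimum on both instances.
for acts in [[(0, 100), (1, 2), (3, 4)], [(0, 10), (10, 20), (9, 11)]]:
    assert greedy(acts, key=lambda a: a[1]) == optimum(acts)
```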

Dynamic Programming vs. Greedy Algorithms
Both address optimization problems. Dynamic programming applies broadly but is sometimes overkill. A greedy algorithm is greedy for local optimization in the hope that it leads to a globally optimal solution; this does not always work, but in many situations it does.

Example: An Activity-Selection Problem
Suppose a set of activities S = {a_1, a_2, ..., a_n} compete for a resource, such as a lecture hall, that can host one lecture at a time. Each a_i has a start time s_i and a finish time f_i, with 0 ≤ s_i < f_i < ∞. Activities a_i and a_j are compatible if the intervals [s_i, f_i) and [s_j, f_j) do not overlap.
Goal: select a maximum-size subset of mutually compatible activities.
We start from a dynamic-programming solution, then derive the greedy algorithm, to see the relation between the two.

DP solution, step 1: optimal substructure of the activity-selection problem
Assume that f_1 ≤ ... ≤ f_n.
Define S_ij = {a_k : f_i ≤ s_k < f_k ≤ s_j}, i.e., all activities that start after a_i finishes and finish before a_j starts.
Define two fictitious activities: a_0 with f_0 = 0 and a_{n+1} with s_{n+1} = ∞, so that f_0 ≤ f_1 ≤ ... ≤ f_{n+1}.
Then an optimal solution to S_ij that includes a_k contains within it optimal solutions to S_ik and S_kj.

DP solution, step 2: a recursive solution
Let c[i, j], for 0 ≤ i, j ≤ n+1, be the number of activities in a maximum-size subset of mutually compatible activities of S_ij. The answer to the full problem is c[0, n+1], the size of an optimal solution to S_{0,n+1}.
Recurrence:
  c[i, j] = 0 if S_ij = ∅
  c[i, j] = max { c[i, k] + c[k, j] + 1 : i < k < j and a_k ∈ S_ij } otherwise
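The recurrence can be implemented directly, using the fictitious activities a_0 and a_{n+1} and filling the table in order of increasing interval length. A sketch (function name is mine), run on the 11-activity example from earlier:

```python
def dp_activity_selection(s, f):
    """Tabulate c[i][j] per the recurrence; activities sorted by finish time."""
    n = len(s)
    # Fictitious activities a_0 (f_0 = 0) and a_{n+1} (s_{n+1} = infinity).
    s = [0] + list(s) + [float("inf")]
    f = [0] + list(f) + [float("inf")]
    c = [[0] * (n + 2) for _ in range(n + 2)]
    # Fill by increasing span j - i, so c[i][k] and c[k][j] are ready.
    for length in range(2, n + 2):
        for i in range(0, n + 2 - length):
            j = i + length
            for k in range(i + 1, j):
                if f[i] <= s[k] and f[k] <= s[j]:  # a_k is in S_ij
                    c[i][j] = max(c[i][j], c[i][k] + c[k][j] + 1)
    return c[0][n + 1]

s = [1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12]
f = [4, 5, 6, 8, 8, 9, 10, 11, 12, 13, 14]
print(dp_activity_selection(s, f))  # 4
```

The answer 4 agrees with the largest subsets found earlier (e.g., {a_1, a_4, a_8, a_11}), at cubic cost; the greedy algorithm gets the same answer in linear time after sorting.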

Converting the DP Solution to a Greedy Solution
Theorem 16.1: consider any nonempty subproblem S_ij, and let a_m be the activity in S_ij with the earliest finish time, f_m = min{f_k : a_k ∈ S_ij}. Then:
1. Activity a_m is used in some maximum-size subset of mutually compatible activities of S_ij.
2. The subproblem S_im is empty, so choosing a_m leaves S_mj as the only subproblem that may be nonempty.
Proof idea: an exchange argument, as in the sketch above; replace the earliest-finishing activity of any maximum-size subset of S_ij with a_m, which keeps the set compatible and the same size.

Top-Down Rather Than Bottom-Up
To solve S_ij, choose the activity a_m in S_ij with the earliest finish time, then solve S_mj (S_im is empty). An optimal solution to S_mj is certainly part of an optimal solution to S_ij, so there is no need to solve other subproblems before making the choice for S_ij. Only subproblems of the pattern S_{i,n+1} ever need to be solved.

Optimal Solution Properties
In the DP formulation, an optimal solution depends on: how many subproblems the problem divides into (2 subproblems), and how many choices there are for which subproblem to use (j − i − 1 choices). Theorem 16.1 reduces both significantly: one subproblem (the other is guaranteed to be empty) and one choice (the activity with the earliest finish time in S_ij). Moreover, we can solve top-down rather than bottom-up as in DP: there is a pattern to the subproblems we solve (S_{m,n+1} after S_ij) and a pattern to the activities we choose (the one with the earliest finish time). This locally optimal choice is in fact globally optimal.

Elements of the Greedy Strategy
1. Determine the optimal substructure.
2. Develop a recursive solution.
3. Prove that the greedy choice is one of the optimal choices, i.e., that it is safe.
4. Show that all but one of the subproblems are empty after the greedy choice.
5. Develop a recursive algorithm that implements the greedy strategy.
6. Convert the recursive algorithm to an iterative one.

Greedy vs. DP
Knapsack problem: a thief robbing a store finds n items I_1(v_1, w_1), I_2(v_2, w_2), ..., I_n(v_n, w_n); the i-th item is worth v_i dollars and weighs w_i pounds. Given a maximum weight W the thief can carry, which items should he take to obtain the maximum amount of money?
Fractional knapsack: fractions of items can be taken. Greedy algorithm, O(n log n).
0/1 knapsack: each item is taken whole or not at all. DP, O(nW). (Question: 0/1 knapsack is an NP-complete problem, so why is there an O(nW) algorithm?)
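The O(nW) table for the 0/1 case can be sketched as follows (one-dimensional table variant; the function name and the three-item instance are mine, chosen to match the counterexample discussed later):

```python
def knapsack_01(values, weights, W):
    """Classic O(nW) DP: best[w] = max value achievable with capacity w."""
    best = [0] * (W + 1)
    for v, wt in zip(values, weights):
        for w in range(W, wt - 1, -1):  # descending: each item used at most once
            best[w] = max(best[w], best[w - wt] + v)
    return best[W]

# Items worth 60, 100, 120 dollars weighing 10, 20, 30 pounds; capacity 50.
print(knapsack_01([60, 100, 120], [10, 20, 30], 50))  # 220
```

On the question posed above: O(nW) is pseudo-polynomial, since W can be exponential in the size of its binary encoding, so this algorithm does not contradict the NP-completeness of 0/1 knapsack.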

Knapsack Problem
Both variants exhibit the optimal-substructure property.
0/1: if item j is removed from an optimal packing, the remaining packing is an optimal packing of weight at most W − w_j chosen from the other n − 1 items.
Fractional: if w pounds of item j are removed from an optimal packing, the remaining packing is an optimal packing of weight at most W − w chosen from the other n − 1 items plus w_j − w pounds of item j.

Fractional Knapsack Problem
Solvable by the greedy strategy:
Compute the value per pound v_j / w_j for each item.
Take as much as possible of the item with the greatest value per pound. If that item is exhausted and there is still room, take as much as possible of the item with the next-greatest value per pound, and so forth, until there is no more room.
O(n lg n): we need to sort the items by value per pound.
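The strategy above, as a sketch (function name is mine; the instance reuses the three items from the DP example):

```python
def fractional_knapsack(values, weights, W):
    """Greedy by value per pound; the sort makes it O(n log n)."""
    items = sorted(zip(values, weights),
                   key=lambda it: it[0] / it[1], reverse=True)
    total = 0.0
    for v, w in items:
        if W <= 0:
            break
        take = min(w, W)          # whole item if it fits, else the remaining room
        total += v * take / w     # proportional value for a partial item
        W -= take
    return total

print(fractional_knapsack([60, 100, 120], [10, 20, 30], 50))  # 240.0
```

Here the greedy choice is safe precisely because a partial item can fill the knapsack exactly, so no capacity is ever wasted.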

0/1 Knapsack Problem
Much harder: it cannot be solved by the greedy strategy (counterexample on the next slide). We must compare the solution to the subproblem in which the item is included with the solution to the subproblem in which it is excluded before we can make the choice. Solved by dynamic programming (see previous lectures).

Counter Example (figure omitted)
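The counterexample figure is not reproduced here, but the standard instance can be checked directly: with W = 50 and items worth 60, 100, 120 dollars weighing 10, 20, 30 pounds, greedy by value per pound takes items 1 and 2 for 160 dollars, while the optimum is items 2 and 3 for 220. A sketch (function name is mine):

```python
def greedy_01_by_ratio(values, weights, W):
    """Take whole items in decreasing value-per-pound order (no fractions)."""
    items = sorted(zip(values, weights),
                   key=lambda it: it[0] / it[1], reverse=True)
    total = 0
    for v, w in items:
        if w <= W:
            total += v
            W -= w
    return total

values, weights, W = [60, 100, 120], [10, 20, 30], 50
print(greedy_01_by_ratio(values, weights, W))  # 160, but the optimum is 220
```

Greedy fills 30 of the 50 pounds with the two highest-ratio items and then cannot fit the third, wasting 20 pounds of capacity, which is exactly what the fractional variant avoids.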