Design and Analysis of Algorithms (演算法設計與分析)
Lecture 7, April 15, 2015
洪國寶

Course information (5/5)
Grading (tentative):
- Homework 25% (you may collaborate when solving the homework; however, you must write up the solutions on your own; no typed or printed assignments)
- Midterm exam 25% (open book and notes), April 22, 2015
- Final exam 25% (open book and notes)
- Class participation 25%

Outline of the course (1/1)
- Introduction (1-4)
- Data structures (10-14)
- Dynamic programming (15)
- Greedy methods (16)
- Amortized analysis (17)
- Advanced data structures (6, 19-21)
- Graph algorithms (22-25)
- NP-completeness (34-35)
- Other topics (5, 31)
Midterm exam: Apr. 22, 2015

Outline
- Review
- Dynamic programming (cont.)
  - Longest Common Subsequence (LCS)
- Greedy method
  - Activity-selection algorithm
  - Basic elements of the greedy approach
  - Examples where the greedy approach does not work
- Discuss homework

Review: Elements of Dynamic Programming
- Optimal substructure: an optimal solution to the problem contains within it optimal solutions to subproblems (needed for DP to be applicable).
- Overlapping subproblems: the space of subproblems must be small (needed for the algorithm to be efficient).

Optimal Polygon Triangulation
Def: a polygon P is convex if, for any two points p, q in the polygon (interior + boundary), the entire line segment pq lies in the polygon.
[Figure: an example of a convex polygon and of a non-convex polygon]

Optimal Polygon Triangulation
A triangulation of a polygon P = (v_0, v_1, ..., v_{n-1}) is a set of non-intersecting diagonals that partitions the polygon into triangles.
- An edge v_i v_j is called a diagonal (or chord) if v_i and v_j are not adjacent vertices.
- The weight of a triangulation is the total weight of its triangles, with w(triangle v_i v_j v_k) = |v_i v_j| + |v_j v_k| + |v_i v_k|.
- Thus weight = length of the boundary + 2 * (total length of all diagonals), since each diagonal is shared by two triangles.
- Can we define w(triangle v_i v_j v_k) to be the area of triangle v_i v_j v_k?

Optimal Polygon Triangulation
Input: a convex polygon P = (v_0, v_1, ..., v_{n-1})
Output: an optimal (minimum-weight) triangulation

Optimal Polygon Triangulation
Triangulation of a convex polygon: pick any diagonal, then recurse on the reduced convex polygons it creates. Each sub-problem is again a triangulation problem on a smaller convex polygon:
[Figure: a diagonal splitting the polygon into two sub-polygons]

Optimal Polygon Triangulation
Let t[i,j] be the weight of an optimal triangulation of the polygon (v_{i-1}, v_i, ..., v_j):

t[i,j] = 0                                                                    if i = j
t[i,j] = min over i <= k < j of { t[i,k] + t[k+1,j] + w(triangle v_{i-1} v_k v_j) }   if i < j

Choosing the triangle (v_{i-1}, v_k, v_j) leaves two sub-problems of the same form plus one triangle.

Optimal Polygon Triangulation vs. Matrix-chain multiplication
Optimal polygon triangulation:
t[i,j] = 0 if i = j; otherwise min over i <= k < j of { t[i,k] + t[k+1,j] + w(triangle v_{i-1} v_k v_j) }
Matrix-chain multiplication:
m[i,j] = 0 if i = j; otherwise min over i <= k < j of { m[i,k] + m[k+1,j] + p_{i-1} p_k p_j }
The two recurrences have identical structure.

Optimal Polygon Triangulation Algorithm
- Similar to matrix-chain multiplication; running time is O(n^3).
- Recall Lecture 01: why abstract problems? One algorithm, many applications: matrix-chain multiplication and optimal polygon triangulation; optimal merge patterns and Huffman codes.
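The slide leaves the algorithm itself to the matrix-chain analogy. Below is a minimal bottom-up sketch in Python (illustrative, not the lecture's code), assuming vertices are given as coordinate pairs and using the perimeter weight w defined above:

```python
import math

def triangle_weight(p, q, r):
    """Weight of a triangle = its perimeter, as defined on the earlier slide."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    return dist(p, q) + dist(q, r) + dist(r, p)

def optimal_triangulation(v):
    """O(n^3) DP over a convex polygon v[0..N-1], N >= 3.
    t[i][j] = weight of an optimal triangulation of (v[i-1], v[i], ..., v[j])."""
    n = len(v) - 1                      # subproblems indexed 1..n, as in t[i,j]
    t = [[0.0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):      # size of the subchain j - i + 1
        for i in range(1, n - length + 2):
            j = i + length - 1
            t[i][j] = min(t[i][k] + t[k + 1][j]
                          + triangle_weight(v[i - 1], v[k], v[j])
                          for k in range(i, j))
    return t[1][n]
```

As with matrix-chain multiplication, recording the minimizing k for each (i, j) would let us reconstruct the actual diagonals, not just the optimal weight.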

Dynamic Programming (Remarks)
- Matrix-chain multiplication: solvable in O(n log n) time.
- Longest common subsequence: O(mn / log n), O((m+n) / log(m+n)).
- Other problems: 15.1 rod cutting / assembly-line scheduling; 15.5 optimal binary search trees.
- Some real-world applications: speech recognition (the Viterbi algorithm); image processing (morphing).

Outline
- Greedy method
  - Activity-selection algorithm
  - Basic elements of the greedy approach
  - Examples where the greedy approach does not work

Algorithm design techniques
So far, we've looked at the following design techniques:
- induction (incremental approach)
- divide and conquer
- augmenting data structures
- dynamic programming
Coming up: the greedy method.

Dynamic Programming vs. Greedy Algorithms
- Dynamic programming uses optimal substructure in a bottom-up fashion: first find optimal solutions to subproblems and, having solved the subproblems, find an optimal solution to the problem.
- Greedy algorithms use optimal substructure in a top-down fashion: first make a choice (the choice that looks best at the time) and then solve the resulting subproblem.

Greedy Algorithms
- A greedy algorithm always makes the choice that looks best at the moment.
- The hope: a locally optimal choice will lead to a globally optimal solution.
- For some problems, it works.
- Dynamic programming can be overkill; greedy algorithms tend to be easier to code.

Example: Activity Selection
Formally: given a set S of n activities, where
  s_i = start time of activity i
  f_i = finish time of activity i,
find a maximum-size subset A of compatible activities: for all i, j in A, the intervals [s_i, f_i) and [s_j, f_j) do not overlap.
[Figure: six example activities on a timeline]

Activity Selection: Optimal Substructure
Let k be the minimum activity in an optimal solution A (i.e., the one with the earliest finish time). Then A - {k} is an optimal solution to S' = {i in S : s_i >= f_k}.
In words: once activity k is selected, the problem reduces to finding an optimal solution for activity selection over the activities in S compatible with k.
Proof: if we could find a solution B to S' with |B| > |A - {k}|, then B U {k} would be compatible and |B U {k}| > |A|, contradicting the optimality of A.

Activity Selection: Greedy Choice Property
Dynamic programming? Memoize? The activity-selection problem also exhibits the greedy choice property: a locally optimal choice leads to a globally optimal solution.
Thm 16.1: if S is an activity-selection problem sorted by finish time, then there exists an optimal solution A of S such that activity 1 is in A.
Sketch of proof: if there is an optimal solution B that does not contain activity 1, we can always replace the first activity in B with activity 1 (why? activity 1 finishes no later, so the rest of B stays compatible). Same number of activities, thus optimal.

Activity Selection: A Greedy Algorithm
The intuition is simple: always pick the activity with the earliest finish time among those still available.
GAS(S)
1  if S = NIL then return NIL
2  else return {k} U GAS(S'), where k is the activity in S with the smallest f_k, and S' = {i in S : s_i >= f_k}

Activity Selection: A Greedy Algorithm
GAS(S)
1  if S = NIL then return NIL
2  else return {k} U GAS(S'), where k is the activity in S with the smallest f_k, and S' = {i in S : s_i >= f_k}
Proof of correctness: on the blackboard.

Activity Selection: A Lisp Program
GAS(S)
1  if S = NIL then return NIL
2  else return {k} U GAS(S'), where k is the activity in S with the smallest f_k, and S' = {i in S : s_i >= f_k}

;; Activities are (start finish) pairs, pre-sorted by finish time.
(defun gas (l)
  (print l)                                   ; show the current S
  (cond ((null l) nil)
        (T (cons (car l)                      ; take the earliest-finishing activity
                 (gas (filter (nth 1 (car l)) (cdr l)))))))

;; Keep only the activities whose start time is at least s.
(defun filter (s l)
  (cond ((null l) nil)
        ((> s (nth 0 (car l))) (filter s (cdr l)))  ; starts before s: drop it
        (T (cons (car l) (filter s (cdr l))))))

Activity Selection: A Lisp Program (cont.)
(defvar *activity*
  '((1 4) (3 5) (0 6) (5 7) (3 8) (5 9) (6 10) (8 11) (8 12) (2 13) (12 14)))

[3]> *activity*
((1 4) (3 5) (0 6) (5 7) (3 8) (5 9) (6 10) (8 11) (8 12) (2 13) (12 14))
[4]> (gas *activity*)
((1 4) (3 5) (0 6) (5 7) (3 8) (5 9) (6 10) (8 11) (8 12) (2 13) (12 14))
((5 7) (5 9) (6 10) (8 11) (8 12) (12 14))
((8 11) (8 12) (12 14))
((12 14))
NIL
((1 4) (5 7) (8 11) (12 14))
[5]> (dribble)

The intermediate lines show the S passed to each recursive call, echoed by (print l); the final line is the selected set.

Activity Selection: A Greedy Algorithm
If we sort the activities by finish time, the algorithm is simple (iterative instead of recursive):
1. Sort the activities by finish time.
2. Schedule the first activity.
3. Schedule the next activity in the sorted list that starts at or after the finish of the previously scheduled activity.
4. Repeat until no activities remain.
Note: we don't have to construct S' explicitly. Why?

Greedy-Activity-Selector
Assume (w.l.o.g.) that f_1 <= f_2 <= ... <= f_n.
Time complexity: O(n).
Exercise 1: prove correctness with a loop invariant.
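The Greedy-Activity-Selector pseudocode did not survive the transcription; the following Python sketch (illustrative, not the lecture's code) implements the iterative algorithm of the previous slide:

```python
def greedy_activity_selector(activities):
    """Iterative greedy activity selection.
    activities: list of (start, finish) pairs. Sorting is done here for
    safety; the scan itself is O(n) once the input is ordered by finish."""
    activities = sorted(activities, key=lambda a: a[1])   # by finish time
    selected = []
    last_finish = float("-inf")
    for start, finish in activities:
        if start >= last_finish:        # compatible with the last chosen one
            selected.append((start, finish))
            last_finish = finish
    return selected

# The Lisp example's input gives the same answer as the trace above:
acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 8), (5, 9), (6, 10),
        (8, 11), (8, 12), (2, 13), (12, 14)]
print(greedy_activity_selector(acts))   # [(1, 4), (5, 7), (8, 11), (12, 14)]
```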


Activity Selection
[Figure-only slide; not transcribed]

Other greedy choices
Greedy-Activity-Selector always picks the available activity with the earliest finish time (smallest f). Some other conceivable greedy choices:
- largest f
- largest/smallest s
- largest/smallest duration (f - s)
- fewest overlaps with other activities
Exercise 2: which of these criteria result in optimal solutions?

A Variation of the Problem
Instead of maximizing the number of activities we schedule, we want to maximize the total time the resource is in use. None of the obvious greedy choices works:
- choose the activity that starts earliest/latest
- choose the activity that finishes earliest/latest
- choose the longest activity
Exercise 3: design an efficient algorithm for this variation of the activity-selection problem.

Elements of the Greedy Strategy
- Greedy choice property: an optimal solution can be obtained by making choices that seem best at the time, without considering their implications for solutions to subproblems.
- Optimal substructure: an optimal solution can be obtained by augmenting the partial solution constructed so far with an optimal solution of the remaining subproblem.

Steps in designing greedy algorithms
1. Cast the optimization problem as one in which we make a choice and are left with one subproblem to solve.
2. Prove the greedy choice property.
3. Demonstrate optimal substructure.

Examples where the greedy approach does not work
- Traveling salesman problem: nearest neighbor; closest pair
- Matrix-chain multiplication: multiply the two matrices with lowest cost first
- Knapsack problem: largest value; largest value per unit weight

Traveling salesman problem (nearest neighbor): correctness is not obvious (2/7)
[Figure not transcribed]

Traveling salesman problem (nearest neighbor): correctness is not obvious (3/7)
[Figure not transcribed]
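The figures are lost, but nearest neighbor's failure is easy to reproduce. Here is a small self-contained check on a hypothetical instance of four cities on a line (the instance and all names are mine, not the lecture's):

```python
import itertools, math

def tour_length(points, order):
    """Total length of the closed tour visiting points in the given order."""
    return sum(math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def nearest_neighbor(points, start=0):
    """Greedy tour: repeatedly move to the closest unvisited point."""
    order, unvisited = [start], set(range(len(points))) - {start}
    while unvisited:
        order.append(min(unvisited,
                         key=lambda j: math.dist(points[order[-1]], points[j])))
        unvisited.remove(order[-1])
    return order

points = [(0, 0), (2, 0), (3, 0), (7, 0)]        # four cities on a line
greedy = nearest_neighbor(points, start=1)        # start at (2, 0)
best = min(itertools.permutations(range(4)),
           key=lambda o: tour_length(points, o))
print(tour_length(points, greedy))   # 16.0: x = 2 -> 3 -> 0 -> 7 -> 2
print(tour_length(points, best))     # 14.0: visit the cities in sorted order
```

Starting from x = 2, nearest neighbor greedily hops to x = 3, is then forced into the long detours 3 -> 0 and 0 -> 7, and loses to the sorted-order tour.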

Matrix-chain multiplication
Greedy choice: multiply the two matrices with the lowest cost first.
Example 1: <A_1, A_2, A_3> with dimensions (10x100, 100x5, 5x50)
Example 2: <A_1, ..., A_6> with dimensions (30x35, 35x15, 15x5, 5x10, 10x20, 20x25)

Matrix-chain multiplication
Example 1: <A_1, A_2, A_3> with dimensions (10x100, 100x5, 5x50)
((A_1 A_2) A_3): 10*100*5 + 10*5*50 = 5000 + 2500 = 7500
(A_1 (A_2 A_3)): 100*5*50 + 10*100*50 = 25000 + 50000 = 75000
Here the greedy choice (the cheapest multiplication, A_1 A_2) happens to give the optimal parenthesization.

Matrix-chain multiplication
Example 2: <A_1, ..., A_6> with dimensions (30x35, 35x15, 15x5, 5x10, 10x20, 20x25)
The greedy choice multiplies A_3 A_4 first (cost 15*5*10 = 750), but the optimal parenthesization (Figure 15.3) is ((A_1 (A_2 A_3)) ((A_4 A_5) A_6)), which never multiplies A_3 by A_4 directly. The greedy first choice is therefore not part of any optimal solution.
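To make the comparison concrete, here is a sketch (mine, not from the slides) pitting the standard O(n^3) matrix-chain DP against the "cheapest multiplication first" greedy rule on Example 2:

```python
def matrix_chain_cost(p):
    """Standard O(n^3) DP; A_i has dimensions p[i-1] x p[i]."""
    n = len(p) - 1
    m = [[0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):
        for i in range(1, n - length + 2):
            j = i + length - 1
            m[i][j] = min(m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                          for k in range(i, j))
    return m[1][n]

def greedy_chain_cost(p):
    """Greedy: repeatedly perform the cheapest adjacent multiplication."""
    p, total = list(p), 0
    while len(p) > 2:
        k = min(range(1, len(p) - 1), key=lambda i: p[i - 1] * p[i] * p[i + 1])
        total += p[k - 1] * p[k] * p[k + 1]
        del p[k]                 # the product matrix replaces the two factors
    return total

dims = [30, 35, 15, 5, 10, 20, 25]      # Example 2
print(matrix_chain_cost(dims))          # 15125 (the DP optimum)
print(greedy_chain_cost(dims))          # 50625: greedy starts with A3*A4 and loses
```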


Knapsack problem
Given some items, pack the knapsack to get the maximum total value. Each item has a weight and a value. The total weight we can carry is no more than some fixed number W, so we must consider items' weights as well as their values.

Item #   Weight   Value
  1        1        8
  2        3        6
  3        5        5

Knapsack Problem
A thief breaks into a jewelry store carrying a knapsack. Given n items S = {item_1, ..., item_n}, each having a weight w_i and a value v_i, which items should the thief put into the knapsack of capacity W to obtain the maximum value?

Knapsack problem
There are two versions of the problem:
(1) 0-1 knapsack problem: items are indivisible; you either take an item or you don't. Solved with dynamic programming.
(2) Fractional knapsack problem: items are divisible; you can take any fraction of an item. Solved with a greedy algorithm.

0-1 Knapsack Problem
This problem requires finding a subset A of S such that the total value, sum of v_i over item_i in A, is maximized subject to the constraint that the total weight, sum of w_i over item_i in A, is at most W.
The naive approach is to generate all possible subsets of S and take the one of maximum value, requiring 2^n time.

Does the Greedy Approach Work?
Strategy 1: steal the items with the largest values. E.g., w = [25, 10, 10], v = [$10, $9, $9], W = 30. Greedy takes item 1 for a value of 10, although the optimal value is 18.

Does the Greedy Approach Work?
Strategy 2: steal the items with the largest value per unit weight. E.g., w = [5, 20, 10], v = [$50, $140, $60], W = 30. Greedy takes items 1 and 2 for a value of 190, although the optimum (items 2 and 3) is 200.

NP-hard Problems (Knapsack, Traveling Salesman, ...)
The two strategies above cannot guarantee the optimal result for 0-1 knapsack. The problem is NP-hard even when all items are of the same kind, i.e., item_i is described by w_i alone (v_i = w_i).
Observation: the greedy approach to the fractional knapsack problem does yield the optimal solution. E.g. (see the 2nd example): 50 + 140 + (5/10)*60 = 220, where 5 is the capacity remaining in the knapsack after taking items 1 and 2.

Dynamic Programming Approach for the Knapsack Problem
We can find an optimal solution to the 0-1 knapsack problem by DP, but not in polynomial time. Let A be an optimal subset using items from {item_1, ..., item_i} only. There are two cases:
1) A contains item_i. Then the total value of A equals v_i plus the optimal value obtainable from the first i-1 items, where the total weight cannot exceed W - w_i.
2) A does not contain item_i. Then the total value of A equals that of the optimal subset chosen from the first i-1 items (with total weight not exceeding W).
Q: What are the subproblems?

Dynamic Programming Approach for the Knapsack Problem
A[i][w] = max(A[i-1][w], v_i + A[i-1][w - w_i])   if w_i <= w
A[i][w] = A[i-1][w]                               if w_i > w
The maximum value is A[n][W]. The running time is O(nW), which is not polynomial in n. Q: Why?
[Figure: the n x W table of subproblems]
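A direct transcription of the recurrence into bottom-up Python (a sketch; the names are mine) shows DP fixing the Strategy-2 counterexample:

```python
def knapsack_01(weights, values, W):
    """Bottom-up 0-1 knapsack; A[i][w] exactly as in the recurrence above."""
    n = len(weights)
    A = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        wi, vi = weights[i - 1], values[i - 1]
        for w in range(W + 1):
            if wi <= w:
                A[i][w] = max(A[i - 1][w], vi + A[i - 1][w - wi])
            else:
                A[i][w] = A[i - 1][w]
    return A[n][W]

# Strategy 2's instance: greedy-by-ratio got 190, DP finds the optimum.
print(knapsack_01([5, 20, 10], [50, 140, 60], 30))   # 200
```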

Fractional knapsack problem
Greedy approach: take items in order of greatest value per pound. This is optimal for the fractional version (why?), but not for the 0-1 version.
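For contrast, a minimal greedy sketch for the fractional version (again illustrative, not the lecture's code), reproducing the 220 computed two slides back:

```python
def fractional_knapsack(weights, values, W):
    """Greedy by value per unit weight; optimal for the fractional problem."""
    items = sorted(zip(weights, values), key=lambda it: it[1] / it[0],
                   reverse=True)
    total, capacity = 0.0, W
    for w, v in items:
        take = min(w, capacity)       # whole item if it fits, else a fraction
        total += v * take / w
        capacity -= take
        if capacity == 0:
            break
    return total

# The slide's computation: 50 + 140 + (5/10)*60 = 220
print(fractional_knapsack([5, 20, 10], [50, 140, 60], 30))   # 220.0
```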

Questions?