Design and Analysis of Algorithms

CSE 101, Winter 2018. Design and Analysis of Algorithms. Lecture 8: Greed. Course roadmap: D/Q, Greed, SPs, DP, LP/Flow, B&B/Backtrack, Metaheuristics, P/NP. Class URL: http://vlsicad.ucsd.edu/courses/cse101-w18/

Optimization and Decision (Lecture 1, Slide 7). Decision question TSP(G, k): Is there a traveling salesperson tour of length at most k in the graph G? Optimization question TSP(G): What is the minimum length of any traveling salesperson tour in the graph G?

Greedy Algorithms. A greedy algorithm always makes the choice that looks best at the moment. Put another way: greed makes a locally optimal choice in the hope that this choice will lead to a globally optimal solution. Greedy algorithms do not always yield optimal solutions, but for some problems they do. See cse101-greedy-handout.pdf in Piazza Resources, announced in Piazza @535.

Greedy Algorithm. When do we use greedy algorithms? When we need a heuristic for a hard problem, or when the problem itself is greedy. Examples of greedy problems: Minimum Spanning Tree (Prim's and Kruskal's algorithms follow from the cut property and cycle property, which we'll see next time); Minimum-Weight Prefix Codes (Huffman coding); making change with U.S. coin denominations.

Properties of Greedy Problems. Greedy-choice property: a globally optimal solution can be arrived at by making a locally optimal (greedy) choice. Key task: prove that the greedy method is optimal; there are "greedy stays ahead", exchange-argument, etc. strategies for such proofs. Optimal substructure property: an optimal solution to the problem contains within it optimal solutions to subproblems. Optimal substructure is a key ingredient of both DP and Greed (we saw it with SPs, shortest paths). DP: can cache and reuse the solutions to subproblems. Greed: can iteratively make correct decisions as to how to grow a solution.

Examples of Greedy Approaches. In each case we make a series of decisions (choices) to build up a solution. Traveling Salesperson Problem: tour all cities with minimum cost; what is a greedy approach? Knapsack Problem: given item types with weights and values, achieve maximum value within a weight limit; what is a greedy approach? Coin Changing Problem: make change with the fewest number of coins; what is a greedy approach? Graph Coloring, Vertex Cover, K-Center: minimum number of colors needed so that no edge has same-color endpoints; minimum number of vertices needed to have at least one endpoint of each edge; pick k centers out of n points to minimize the maximum point-to-center distance. Vertex Cover is covering edges by vertices, i.e., minimum vertex cover (of edges); HW3 Exercise 1 covers vertices by edges.

Examples of Greedy Approaches. Traveling Salesperson Problem: what is a greedy approach? Start somewhere and always go to the nearest unvisited city; nearest-neighbor is roughly 25% suboptimal for random Euclidean planar pointsets. Knapsack Problem: what is a greedy approach? Use as much as possible of the highest value/weight-ratio item; this is optimal for the Fractional Knapsack Problem. Coin Changing Problem: what is a greedy approach? Use as much as possible of the largest denominations first; this is optimal for U.S. currency. Graph Coloring, Vertex Cover, K-Center: use the lowest unused color; add the highest-degree vertex; add a new center that is farthest from all existing centers. A sketch of two of these greedy rules appears below.
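
The slide states two of these greedy rules in words; here is a small Python sketch of both, for coin changing and for the fractional knapsack. The function names, the default U.S. denomination tuple, and the (value, weight) item format are illustrative choices, not from the slides.

```python
def greedy_change(amount_cents, denominations=(25, 10, 5, 1)):
    """Make change using as many coins of the largest denominations as possible.
    Optimal for U.S. denominations, but not for arbitrary denomination sets."""
    coins = []
    for d in sorted(denominations, reverse=True):
        count, amount_cents = divmod(amount_cents, d)
        coins.extend([d] * count)
    return coins                      # e.g., greedy_change(63) -> [25, 25, 10, 1, 1, 1]

def fractional_knapsack(items, capacity):
    """items: list of (value, weight) pairs. Take items in decreasing value/weight
    ratio, splitting the last one; optimal for the fractional (not 0/1) problem."""
    total = 0.0
    for value, weight in sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True):
        take = min(weight, capacity)  # take as much of this item as still fits
        total += value * (take / weight)
        capacity -= take
        if capacity == 0:
            break
    return total
```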

Greedy Algorithm. When do we use greedy algorithms? When we need a heuristic for a hard problem, or when the problem itself is greedy. Examples of greedy problems: Minimum Spanning Tree (Prim's and Kruskal's algorithms follow from the cut property and cycle property, which we'll see next time); Minimum-Weight Prefix Codes (Huffman coding); making change with U.S. coin denominations; interval scheduling, interval partitioning, and HW3.

Problem: Interval Scheduling. Treatment adapted from the book by Kleinberg and Tardos. Interval scheduling: job j starts at s_j and finishes at f_j. Two jobs are compatible if they don't overlap. Goal: find a maximum subset of mutually compatible jobs. [Figure: jobs a through h drawn as intervals on a time axis from 0 to 11.]

Interval Scheduling: Greedy Approaches. Greedy template: consider jobs in some order; take each job provided that it is compatible with the ones already taken. [Earliest start time] Consider jobs in ascending order of start time s_j. [Earliest finish time] Consider jobs in ascending order of finish time f_j. [Shortest interval] Consider jobs in ascending order of length f_j - s_j. [Fewest conflicts] For each job, count the number of conflicting jobs c_j; schedule in ascending order of conflicts c_j.

Interval Scheduling: Greedy Algorithms. Greedy template: consider jobs in some order; take each job provided that it is compatible with the ones already taken. There are counterexamples for earliest start time, for shortest interval, and for fewest conflicts.

Interval Scheduling: Greedy Algorithm. Consider jobs in increasing order of finish time; take each job provided that it is compatible with the ones already taken (i.e., it does not overlap or conflict with any previously scheduled job).

Sort jobs by finish times so that f_1 <= f_2 <= ... <= f_n.
A <- {}                         // set of selected jobs
for j = 1 to n {
    if (job j compatible with A)
        A <- A U {j}
}
return A
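
A compact Python rendering of this earliest-finish-time rule may help; the (start, finish) tuple representation and the function name are assumptions for illustration, not from the slides.

```python
def interval_schedule(jobs):
    """jobs: list of (start, finish) pairs.
    Return a maximum-size subset of mutually compatible (non-overlapping) jobs."""
    selected = []
    last_finish = float("-inf")
    for start, finish in sorted(jobs, key=lambda j: j[1]):  # ascending finish time
        if start >= last_finish:       # compatible with every job taken so far
            selected.append((start, finish))
            last_finish = finish
    return selected

# Example: interval_schedule([(0, 3), (2, 5), (4, 7)]) returns [(0, 3), (4, 7)].
```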

Interval Scheduling: Example. [Figure sequence: the jobs B, C, A, E, D, F, G, H from the earlier instance are considered in increasing order of finish time. The algorithm keeps B, rejects C and A as incompatible, keeps E, rejects D, F, and G, and finally keeps H, ending with the schedule {B, E, H}.]

Interval Scheduling: Analysis. Theorem: the greedy algorithm is optimal. Proof (by contradiction): assume greedy is not optimal, and let's see what happens. Let i_1, i_2, ..., i_k denote the jobs selected by greedy. Let j_1, j_2, ..., j_m denote the jobs selected in an optimal solution, with i_1 = j_1, i_2 = j_2, ..., i_r = j_r for the largest possible value of r. Job i_{r+1} finishes before j_{r+1}. Greedy: i_1, i_2, ..., i_r, i_{r+1}, ...; OPT: j_1, j_2, ..., j_r, j_{r+1}, ... Why not replace job j_{r+1} with job i_{r+1}?

Interval Scheduling: Analysis. Theorem: the greedy algorithm is optimal. Proof (by contradiction): assume greedy is not optimal, and let's see what happens. Let i_1, i_2, ..., i_k denote the jobs selected by greedy. Let j_1, j_2, ..., j_m denote the jobs selected in an optimal solution, with i_1 = j_1, i_2 = j_2, ..., i_r = j_r for the largest possible value of r. Job i_{r+1} finishes before j_{r+1}. This is an example of "modify the solution" ("greedy stays ahead"). Greedy: i_1, i_2, ..., i_r, i_{r+1}, ...; OPT (modified): j_1, j_2, ..., j_r, i_{r+1}, ... The modified solution is still feasible and optimal, but this contradicts the maximality of r.

Interval Partitioning. Interval partitioning: lecture j starts at s_j and finishes at f_j. Goal: find the minimum number of classrooms to schedule all lectures so that no two occur at the same time in the same room. [Figure: a schedule that uses 4 classrooms for 10 lectures a through j, on a time axis from 9:00 to 4:30.]

Interval Partitioning. Interval partitioning: lecture j starts at s_j and finishes at f_j. Goal: find the minimum number of classrooms to schedule all lectures so that no two occur at the same time in the same room. [Figure: a schedule that uses only 3 classrooms for the same 10 lectures.]

Interval Partitioning: Lower Bound on Optimal Solution. Definition: the depth of a set of open intervals is the maximum number that contain any given time. Key observation: the number of classrooms needed is at least the depth (e.g., a, b, c all contain 9:30). In the example above, the depth of the intervals is 3, so the 3-classroom schedule is optimal. Q: Does there always exist a schedule using a number of classrooms equal to the depth of the intervals? A small sweep that computes depth is sketched below.
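
As an aside, the depth itself is easy to compute with a sweep over interval endpoints; the sketch below is an illustrative helper, not part of the slides.

```python
def depth(intervals):
    """Maximum number of open intervals that contain any single point in time."""
    events = []
    for start, finish in intervals:
        events.append((start, 1))      # an interval opens
        events.append((finish, -1))    # an interval closes
    events.sort()                      # ties put closings (-1) before openings (+1),
                                       # so back-to-back lectures do not count as overlapping
    best = current = 0
    for _, delta in events:
        current += delta
        best = max(best, current)
    return best
```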

Interval Partitioning: Greedy Algorithm. Greedy algorithm: consider lectures in increasing order of start time; assign each lecture to any compatible classroom.

Sort intervals by starting time so that s_1 <= s_2 <= ... <= s_n.
d <- 0                          // number of allocated classrooms
for j = 1 to n {
    if (lecture j is compatible with some classroom k)
        schedule lecture j in classroom k
    else {
        allocate a new classroom d + 1
        schedule lecture j in classroom d + 1
        d <- d + 1
    }
}

Implementation: O(n log n). For each classroom k, maintain the finish time of the last job added; keep the classrooms in a priority queue.
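
One way to realize the O(n log n) priority-queue implementation mentioned above is sketched here in Python with heapq; the (start, finish) lecture format and the return value (one list of lectures per classroom) are assumptions for illustration.

```python
import heapq

def interval_partition(lectures):
    """lectures: list of (start, finish) pairs.
    Assign every lecture to a classroom, opening new classrooms only when needed."""
    classrooms = []       # classrooms[k] = lectures assigned to room k
    heap = []             # entries: (finish time of room's last lecture, room index)
    for start, finish in sorted(lectures):            # increasing start time
        if heap and heap[0][0] <= start:              # some room is free by `start`
            _, room = heapq.heappop(heap)
        else:                                         # otherwise open a new classroom
            room = len(classrooms)
            classrooms.append([])
        classrooms[room].append((start, finish))
        heapq.heappush(heap, (finish, room))
    return classrooms     # len(classrooms) is the number of rooms used
```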

Interval Partitioning: Greedy Analysis. Observation: the greedy algorithm never schedules two incompatible lectures in the same classroom. Theorem: the greedy algorithm is optimal. Proof (an example of "achieves the bound"): let d = number of classrooms that the greedy algorithm allocates. Classroom d is opened because we needed to schedule a job, say j, that is incompatible with all d - 1 other classrooms. Since we sorted by start time, all these incompatibilities are caused by lectures that start no later than s_j. Thus, we have d lectures overlapping at time s_j + epsilon. By the key observation, all schedules use at least d classrooms.

Trees (page 11 of PDF; page 19 of hardcopy: sidebar). A tree is connected and acyclic. A tree on n nodes has n - 1 edges. Any connected, undirected graph with |V| - 1 edges is a tree. An undirected graph is a tree if and only if there is a unique path between any pair of nodes. Fact about trees: any two of the following properties imply the third: connected; acyclic; has |V| - 1 edges.

Minimum Spanning Tree. Given a connected graph G = (V, E) with real-valued edge weights c_e, an MST is a subset of the edges T, T a subset of E, such that T is a spanning tree whose sum of edge weights is minimized. [Figure: a weighted graph G = (V, E) and a spanning tree T with sum of edge weights over e in T of c_e = 50.] Cayley's Theorem: there are n^(n-2) spanning trees of K_n, so we can't solve by brute force. Source: slides by Kevin Wayne for Kleinberg-Tardos, Algorithm Design, Pearson Addison-Wesley, 2005.

Greedy MST Algorithms. Kruskal's algorithm: start with T = {}; consider edges in ascending order of cost; insert edge e into T unless doing so would create a cycle. Reverse-Delete algorithm: start with T = E; consider edges in descending order of cost; delete edge e from T unless doing so would disconnect T. Prim's algorithm: start with some root node s and greedily grow a tree T from s outward; at each step, add the cheapest edge e to T that has exactly one endpoint in T. Source: slides by Kevin Wayne for Kleinberg-Tardos, Algorithm Design, Pearson Addison-Wesley, 2005. A union-find-based sketch of Kruskal's algorithm follows.
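
For concreteness, here is a short sketch of Kruskal's algorithm backed by a simple union-find structure; the (cost, u, v) edge format, 0-indexed vertices, and the use of path halving without union by rank are implementation choices assumed here, not prescribed by the slides.

```python
def kruskal(num_vertices, edges):
    """edges: iterable of (cost, u, v) with vertices numbered 0..num_vertices-1.
    Return the list of edges in a minimum spanning tree (or forest)."""
    parent = list(range(num_vertices))

    def find(x):                       # root of x's component, with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for cost, u, v in sorted(edges):   # consider edges in ascending order of cost
        ru, rv = find(u), find(v)
        if ru != rv:                   # adding (u, v) does not create a cycle
            parent[ru] = rv            # merge the two components
            tree.append((cost, u, v))
    return tree
```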

Huffman Codes. How to transmit English text using a binary code? 26 letters + space = an alphabet of 27 characters, so 5 bits per character suffices. Observation #1: not all characters occur with the same frequency (Sherlock Holmes, "The Adventure of the Dancing Men": ETAOIN SHRDLU). This suggests variable-length encoding: minimize the sum of W_i L_i, where W_i = frequency of character i and L_i = length of the codeword for character i. Observation #2: a variable-length code should have the prefix property: one code word per input symbol, and no code word is a prefix of any other code word. This simplifies the decoding process.

Greedy Algorithm: Huffman Coding. Huffman coding is based on the probability (frequency) with which symbols appear in a message. The goal is to minimize the expected coded message length. How it works: create a tree root node for each nonzero symbol frequency, with the frequency as the value of the node. REPEAT: find the two root nodes with smallest value; create a new root node with these two nodes as children, and value equal to the sum of the values of the two children (until there is only one root node remaining). A minimal code sketch appears below.
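
A minimal Python sketch of this merge loop, using heapq for the set of root nodes. The nested-tuple tree representation, the tie-breaking counter, and the function name are illustrative choices assumed here, not from the slides.

```python
import heapq
from itertools import count

def huffman_codes(freq):
    """freq: dict mapping symbol -> frequency. Return dict symbol -> bit string."""
    tiebreak = count()                 # unique counter so the heap never compares trees
    heap = [(f, next(tiebreak), sym) for sym, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:               # repeatedly merge the two smallest roots
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tiebreak), (left, right)))

    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):    # internal node: 0 to the left child, 1 to the right
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                          # leaf: record the accumulated code word
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes

# Example: huffman_codes({"a": 5, "b": 2, "c": 1}) gives {"c": "00", "b": "01", "a": "1"}.
```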

Why Huffman is Optimal. Suppose we have an optimal tree T in which a, b are siblings at the deepest level, while x, y are the two roots merged by Huffman's algorithm. Swap x with a and y with b. Neither swap can increase the cost: x and y have minimum weight, and the locations of a and b are at the deepest level. [Figure: tree T with leaves x, y and deepest-level siblings a, b.]

Huffman Coding Example. [Table: symbols A, E, G, I, M, N, O, R, S, T, U, V, Y, and Blank, each with its frequency.] Generic implementation: place the elements into a min-heap keyed by frequency; remove the first two elements from the heap; combine these two elements into one; insert the new element back into the heap.

Huffman Coding Example (continued). [Figures: Steps 1 through 9 of the construction, repeatedly removing the two lowest-frequency roots and merging them into a new root whose value is the sum of its children, until a single tree remains.]

Huffman Coding Example. [Figure: the final Huffman tree, with edges labeled 0 and 1; e.g., the code word for Y is 10101.] The sum of the internal node values equals the total weighted path length of the tree, i.e., the sum of W_i L_i = 90, versus the much larger sum of W_i L_i under the naive 5-bit-per-symbol code.