G205 Fundamentals of Computer Engineering. Class 21, Mon. Nov. 22, 2004. Stefano Basagni. Fall 2004, M-W, 1:30pm-3:10pm


Greedy Algorithms, 1
- Algorithms for optimization problems: a sequence of steps, with choices at each step
- With Dynamic Programming, finding the best choice can be expensive
- Simpler, more efficient algorithms will often do

Greedy Algorithms, 2
- A Greedy Algorithm always makes the choice that looks best at the moment
- It makes a locally optimal choice in the hope that it will lead to a globally optimal solution
- Optimal solutions are greedily achieved by GAs for many problems (not for all)

Greedy Properties
Properties an optimization problem (OP) should exhibit to admit a greedy solution:
1. Greedy-choice property: a global solution via local greedy choices
2. Optimal substructure: an optimal solution is obtained from optimal solutions to sub-problems

Greedy-choice Property
- A globally optimal solution can be arrived at by making a locally optimal (greedy) choice
- The choice that looks best is made independently of results from sub-problems
- Main difference from Dynamic Programming, whose choices depend on results from sub-problems

GA vs. DP
- DP: solutions proceed bottom-up, progressing from smaller sub-problems to larger ones
- GA: the best choice is made, then the corresponding sub-problem is solved
- A greedy strategy proceeds top-down, one greedy choice after another, reducing a problem instance to a smaller one

Optimal Substructure
- A problem exhibits optimal substructure if an optimal solution to the problem contains optimal solutions to sub-problems
- Common with DP
- Cleverness is required to show that a greedy choice at each step yields a globally optimal solution

GA vs. DP: An Example
- Optimal substructure is common to GA and DP
- This can lead to using DP when a GA suffices, or to using a GA when DP is needed
- The subtleties of the difference can be illustrated by two variants of a classical optimization problem: the Knapsack Problem

0-1 Knapsack
- A thief robbing a store finds n items
- Item i is worth v_i dollars and weighs w_i pounds (v_i and w_i are integers)
- The thief can carry only up to W pounds in his/her knapsack
- Which items should the thief take to maximize the load?
- (0-1: each item is either taken whole or left behind)

Fractional Knapsack
- A thief robbing a store finds n items
- The thief can carry only up to W pounds in his/her knapsack
- The thief can take fractions of items instead of making binary choices (as in 0-1)

Knapsack Optimal Substructure
Both variants have optimal substructure:
- 0-1: consider the most valuable load weighing at most W and remove item j; the remaining load must be the most valuable load weighing at most W − w_j that can be taken from the n−1 remaining items
- Fractional: if we remove a weight w of an item j from the most valuable load, the remaining load must be the most valuable load weighing at most W − w that can be taken from the n−1 other items plus w_j − w pounds of item j

Solvability Issues
- 0-1 Knapsack is not solvable by a GA; Fractional is
- Compute the value per pound v_i / w_i of each item
- Greedy strategy: take the items with the greatest value per pound first, until the knapsack is full
- By sorting the items by v_i / w_i, the whole process requires O(n log n)
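The greedy strategy above can be sketched as follows; the item representation (a list of value/weight pairs) is an assumption for illustration, not from the slides.

```python
# Greedy solution to Fractional Knapsack: sort by value per pound,
# then take as much as fits of each item in turn.

def fractional_knapsack(items, W):
    """items: list of (value, weight) pairs; W: knapsack capacity.
    Returns the maximum total value attainable taking fractions of items."""
    # Greedy choice: highest value per pound first -- O(n log n) for the sort.
    items = sorted(items, key=lambda it: it[0] / it[1], reverse=True)
    total = 0.0
    for value, weight in items:
        if W <= 0:
            break                      # knapsack is full
        take = min(weight, W)          # take as much of this item as fits
        total += value * (take / weight)
        W -= take
    return total
```

On the classic instance with items (60, 10), (100, 20), (120, 30) and W = 50, this takes the first two items whole and 20/30 of the third, for a total value of 240.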

Steps of the Greedy Design
Greedy algorithms are designed according to a series of simple steps:
1. Describe the OP so that a choice leads to a single sub-problem to be solved
2. Prove that there is always an optimal solution that makes the greedy choice (the greedy choice is safe)
3. Demonstrate that greedy choice + optimal solution to the sub-problem = optimal solution to the problem

Example: Dijkstra's Algorithm for Shortest Paths
INPUT:
- A directed graph G = (V,E)
- A source s
- A weight function w: E → R+, with w(u,v) ≥ 0 for all (u,v) ∈ E
Maintain a set S ⊆ V of vertices whose final shortest-path weights from s have been determined

Dijkstra's Algorithm

Dijkstra(G,w,s)
  Initialize-Single-Source(G,s)
  S = ∅
  Q = V
  while Q ≠ ∅ do
    u = Extract-Min(Q)   // GREEDY CHOICE HERE
    S = S ∪ {u}
    for each vertex v ∈ Adj[u] do
      Relax(u,v,w)
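A runnable sketch of the same idea, with a binary heap standing in for Q; the adjacency-list dictionary format is an assumption. Instead of a Decrease-Key operation, it pushes duplicate heap entries and skips vertices already in S, a common workaround with Python's heapq.

```python
# Dijkstra's algorithm: repeatedly make the greedy choice of the
# unfinished vertex closest to the source, then relax its outgoing edges.
import heapq

def dijkstra(adj, s):
    """adj: {u: [(v, w(u,v)), ...]} with nonnegative weights; s: source.
    Returns a dict of shortest-path weights from s to reachable vertices."""
    dist = {s: 0}
    done = set()                       # the set S of finished vertices
    pq = [(0, s)]
    while pq:
        d, u = heapq.heappop(pq)       # greedy choice: closest vertex
        if u in done:
            continue                   # stale duplicate entry, skip
        done.add(u)
        for v, w in adj.get(u, []):
            if v not in done and d + w < dist.get(v, float("inf")):
                dist[v] = d + w        # Relax(u, v, w)
                heapq.heappush(pq, (dist[v], v))
    return dist
```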

Building MSTs
- We will build a set A of edges; initially, A has no edges
- As we add edges to A, maintain a loop invariant: A is a subset of some MST
- Add only edges that maintain the invariant: if A is a subset of some MST, an edge (u,v) is safe for A if and only if A ∪ {(u,v)} is also a subset of some MST (add only safe edges)

Generic MST Algorithm

GENERIC-MST(G,w)
  A = ∅
  while A is not a spanning tree do
    find an edge (u,v) that is safe for A
    A = A ∪ {(u,v)}
  return A

Correctness
We use the loop invariant:
- Initialization: the empty set trivially satisfies the loop invariant
- Maintenance: since we add only safe edges, A remains a subset of some MST
- Termination: all edges added to A are in an MST, so when we stop, A is a spanning tree that is also an MST

Finding a Safe Edge
- A cut (S, V − S) of an undirected graph G is a partition of V
- An edge (u,v) crosses the cut (S, V − S) if one of its endpoints is in S and the other in V − S
- A cut respects a set of edges A if no edge in A crosses the cut
- An edge is a light edge crossing the cut if its weight is the minimum among all the edges that cross the cut

Recognizing Safe Edges
Theorem: Let G = (V,E) be a connected, undirected graph with weight function w: E → R. Let A ⊆ E be included in some MST of G, let (S, V − S) be any cut of G that respects A, and let (u,v) be a light edge crossing (S, V − S). Then edge (u,v) is safe for A

Analysis of GENERIC-MST
- A is a forest containing connected components; initially, each component is a single vertex
- Any safe edge merges two of these components into one; each component is a tree
- Since an MST has exactly |V| − 1 edges, the while loop iterates |V| − 1 times
- Equivalently, after adding |V| − 1 safe edges, we are down to just one component
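The component-merging view of GENERIC-MST is what Kruskal's algorithm (not covered on these slides) implements directly. A sketch, assuming an edge-list input, using a simple union-find structure to detect when an edge is safe:

```python
# Kruskal-style instance of GENERIC-MST: consider edges in order of weight,
# so each accepted edge is a light edge merging two components.

def kruskal_mst(n, edges):
    """n: number of vertices 0..n-1; edges: list of (weight, u, v).
    Returns (total MST weight, list of MST edges)."""
    parent = list(range(n))            # union-find: one component per vertex

    def find(x):                       # representative of x's component
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    A, total = [], 0
    for w, u, v in sorted(edges):      # lightest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                   # crosses a cut respecting A: safe edge
            parent[ru] = rv            # merge the two components into one
            A.append((u, v))
            total += w
    return total, A
```

After |V| − 1 merges the loop has built a single component, exactly as in the analysis above.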

Prim's Algorithm for MST
- Builds one tree, so A is always a tree
- Starts from an arbitrary root r
- At each step, finds a light edge crossing the cut (V_A, V − V_A), where V_A = the set of vertices that A is incident on
- Adds this edge to A

Selecting Edges Efficiently
Use a priority queue Q:
- Each object is a vertex in V − V_A
- The key of v is the minimum weight of any edge (u,v) with u ∈ V_A
- The vertex returned by Extract-Min is the v such that there exists u ∈ V_A and (u,v) is a light edge crossing (V_A, V − V_A)
- The key of v is ∞ if v is not adjacent to any vertex in V_A

Prim's MST
- The edges of A form a rooted tree with root r: r is given as an input to the algorithm, but it can be any vertex
- Each vertex knows its parent in the tree through the attribute π[v] = parent of v; π[v] = NIL if v = r or v has no parent
- As the algorithm progresses, A = {(v, π[v]) : v ∈ V − {r} − Q}
- At termination, V_A = V and Q = ∅, so the MST is A = {(v, π[v]) : v ∈ V − {r}}

Prim, the Algorithm

PRIM(G,w,r)
  for each u ∈ V do
    key[u] = ∞; π[u] = NIL
  key[r] = 0
  Q = V
  while Q ≠ ∅ do
    u = Extract-Min(Q)   // GREEDY CHOICE!
    for each v ∈ Adj[u] do
      if v ∈ Q and w(u,v) < key[v] then
        π[v] = u
        key[v] = w(u,v)
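A heap-based sketch of the same algorithm; the adjacency-list dictionary format is an assumption. Since Python's heapq has no Decrease-Key, it pushes duplicate entries and discards stale ones on extraction.

```python
# Prim's algorithm: grow one tree from root r, always adding the light
# edge crossing the cut between the tree and the rest of the graph.
import heapq

def prim_mst(adj, r):
    """adj: {u: [(v, w), ...]} for an undirected graph; r: arbitrary root.
    Returns (total MST weight, parent map pi)."""
    pi = {r: None}                     # pi[v] = parent of v in the tree
    in_tree = set()                    # the vertex set V_A
    pq = [(0, r, None)]                # entries are (key, vertex, parent)
    total = 0
    while pq:
        key, u, parent = heapq.heappop(pq)   # greedy choice: light edge
        if u in in_tree:
            continue                   # stale entry for a finished vertex
        in_tree.add(u)
        pi[u] = parent
        total += key
        for v, w in adj.get(u, []):
            if v not in in_tree:       # v is still in Q
                heapq.heappush(pq, (w, v, u))
    return total, pi
```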

Huffman Codes
- An effective technique for compressing data (20-90% savings)
- Data = a sequence of characters
- Uses a table of frequencies to build an optimal way of representing the data as a binary string
- Design a binary character code in which each character is represented by a unique binary string
- Fixed-length codes vs. variable-length codes

Prefix Codes
- Prefix codes are codes in which no codeword is a prefix of some other codeword
- No loss of generality
- Prefix codes simplify decoding: the codeword that begins an encoded file is unambiguous

Constructing a Huffman Code
The Huffman Code is an optimal prefix code. Assumptions:
- C is a set of n characters
- Every c ∈ C has a frequency f[c]
The tree corresponding to the optimal prefix code is built bottom-up: it begins with |C| leaves and performs |C| − 1 merging operations to create the tree

Huffman, the Algorithm

Huffman(C)
  n = |C|
  Q = C
  for i = 1 to n−1 do
    allocate a new node z
    left[z] = x = Extract-Min(Q)
    right[z] = y = Extract-Min(Q)
    f[z] = f[x] + f[y]
    Insert(Q,z)
  return Extract-Min(Q)
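A runnable sketch of Huffman(C) with heapq as the min-priority queue Q; the frequency-dictionary input, the nested-tuple tree representation, and the tie-breaking counter are assumptions made so the heap never compares trees directly.

```python
# Huffman's algorithm: repeatedly merge the two lowest-frequency trees,
# then read the codewords off the final tree (left = 0, right = 1).
import heapq
import itertools

def huffman(freq):
    """freq: {character: frequency}. Returns {character: codeword}."""
    tie = itertools.count()            # breaks frequency ties deterministically
    # Q starts with one leaf per character, keyed on frequency f[c].
    q = [(f, next(tie), (c,)) for c, f in freq.items()]
    heapq.heapify(q)
    for _ in range(len(freq) - 1):     # |C| - 1 merging operations
        fx, _, x = heapq.heappop(q)    # the two lowest-frequency trees
        fy, _, y = heapq.heappop(q)
        heapq.heappush(q, (fx + fy, next(tie), (x, y)))  # new node z
    codes = {}
    def walk(node, prefix):            # read codewords off the tree
        if len(node) == 1:             # leaf: a 1-tuple holding a character
            codes[node[0]] = prefix or "0"
            return
        walk(node[0], prefix + "0")
        walk(node[1], prefix + "1")
    walk(q[0][2], "")
    return codes
```

On the textbook frequencies a:45, b:13, c:12, d:16, e:9, f:5, any Huffman tree encodes the data in 224 bits.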

Analysis
- Q is implemented as a binary min-heap; initialization of Q is O(n)
- The loop contributes O(n log n): it is executed O(n) times, and each iteration requires O(log n) heap operations
- Total running time for n characters: O(n log n)

Correctness
Lemma 1: Let C be an alphabet in which each character c ∈ C has frequency f[c]. Let x, y ∈ C be two characters with the lowest frequencies. Then there exists an optimal prefix code for C in which the codewords for x and y have the same length and differ only in the last bit

Greediness
- Lemma 1 says that merging the two characters with the lowest frequencies is a greedy choice that leads to an optimal tree
- It is greedy: of all possible mergers at each step, Huffman chooses the one with minimal cost
- There is a Lemma 2 showing optimal substructure

Assignments
- Textbook, Chapter 16, pages 370-392
- Updated information on the class web page: www.ece.neu.edu/courses/eceg205/2004fa

Happy Thanksgiving!