Analysis of Algorithms - Greedy algorithms -


Andreas Ermedahl, MRTC (Mälardalens Real-Time Research Center)
andreas.ermedahl@mdh.se, Autumn 2003

Greedy Algorithms
Another paradigm for designing algorithms, often used in optimization problems and useful for many types of problems.
Basic idea: always select the solution to the next subproblem that seems best at the moment.
Greedy algorithms are often straightforward to design and easy to understand.
Example: the Greedy Capitalist Algorithm - shortsightedly run all your factories at maximum, without maintaining them, to get the most profit out of them.

Greedy Algorithms
Greedy algorithms do not always give optimal solutions:
For some problems they do.
For some problems greedy algorithms provide heuristics that work well in practice.
For some problems greedy algorithms do not work well at all.
We will investigate what types of problems greedy algorithms solve optimally.
Intuition: a greedy algorithm never goes downwards, so it can get stuck in a local maximum, and the global optimum can be better than that local maximum.

Example 1: The Activity Selection Problem
A scheduling problem with a greedy solution.
Given: one resource (e.g. a CPU or a lecture hall) and a set S of n activities (e.g. tasks or lectures), where
  s_i = start time of activity i
  f_i = finish time of activity i
Problem: find a maximum-size subset A of mutually compatible activities (e.g. maximize the number of lectures). Two activities i and j whose intervals [s_i, f_i) and [s_j, f_j) overlap cannot run together.

Activity selection problem
Assume f_1 <= f_2 <= ... <= f_n, i.e. the activities are sorted by increasing finishing time.
Greedy approach: select the first activity to finish that is compatible with all previously selected activities, i.e. the activity whose start time is at least the finishing time of the latest scheduled activity. On the small example in the slides, the greedy run gives {1, 2, 5}.

Activity selection algorithm
All activities have been sorted by finishing time: f_1 <= f_2 <= ... <= f_n. Here i is the most recently selected activity, so f_i is the maximum finishing time of any activity in A. The greedy algorithm always selects the next activity m with start time s_m >= f_i. A run of the algorithm on the book's example gives A = {1, 4, 8, 11}; a sketch in code follows below.
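The greedy rule above can be sketched in a few lines of Python. This is a minimal sketch, not code from the slides: the function name and the (start, finish) pair representation are assumptions, and the sample data follow the book's eleven-activity example.

```python
# Minimal sketch of greedy activity selection: repeatedly take the
# compatible activity that finishes first.

def select_activities(activities):
    """Return a maximum-size set of mutually compatible (start, finish) activities."""
    chosen = []
    last_finish = float("-inf")            # finish time of the latest chosen activity
    for start, finish in sorted(activities, key=lambda a: a[1]):  # sort by finish time
        if start >= last_finish:           # compatible with everything chosen so far
            chosen.append((start, finish))
            last_finish = finish
    return chosen

if __name__ == "__main__":
    # Activities 1..11 of the book's example, numbered by increasing finish time.
    acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9),
            (6, 10), (8, 11), (8, 12), (2, 14), (12, 16)]
    print(select_activities(acts))  # picks activities 1, 4, 8, 11
```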

Optimal solution
An optimal algorithm produces a solution that contains a maximal number of compatible activities. We might have several optimal solutions to the same problem. Example from the book: maximal subsets of mutually compatible activities are {1,4,8,11} or {2,4,9,11}.
Question: how do we know that a greedy algorithm always finds the optimum for a given problem?
Answer: there is no general way. However, problems that can be solved greedily typically have:
1. the greedy-choice property
2. optimal substructure

Greedy-choice property
A problem has the greedy-choice property if a globally optimal solution can be arrived at by making a locally optimal (greedy) choice at each step: always make the choice that looks best at the moment. We must somehow prove that a greedy choice at each step yields a globally optimal solution. Typical pattern - show that:
any optimal solution can be turned into another optimal solution starting with a greedy choice, and
the remaining problem is of the same type as the original problem, but smaller.

Optimal substructure
A problem has the optimal substructure property if an optimal solution can be computed from optimal solutions to subproblems: an optimal solution contains within itself optimal solutions to subproblems. The optimal substructure property also appears in problems suitable for dynamic programming. Examples: fibcall, assembly-line scheduling, max-sum.

Proof of optimal solution
How do we prove that an algorithm produces an optimal solution? Use simple induction.
Base case: show there is an optimal solution containing activity 1, and show that once 1 is selected, the rest of the problem is solved by making greedy choices among the activities compatible with 1.
Inductive step: assume that we have arrived at a subproblem by having made greedy choices in the original problem, and argue that an optimal solution to the subproblem, together with the greedy choices already made, yields an optimal solution to the original problem.

Greedy-choice property
Show that there is an optimal solution containing activity 1. Assume all activities in S are sorted by finishing time: f_1 <= f_2 <= ...
Let A ⊆ S be one optimal solution (containing a maximum number of mutually compatible activities).
Statement: if the first activity in A is k != 1 (not the greedy choice), then there is another optimal solution B that begins with 1.
Let B = (A - {k}) ∪ {1}. Because f_1 <= f_k, activity 1 is still compatible with the rest of A, and |B| = |A|, so B is also optimal.
Conclusion: there is an optimal solution containing activity 1. E.g. if the first activity in A is 2, then B is still optimal if we replace 2 by 1.

Optimal substructure
Show that once 1 is selected, the rest of the problem is solved by making greedy choices among the activities compatible with 1.
Once we make the first greedy choice, the problem left is S' = {i ∈ S : s_i >= f_1}, with solution A' = A - {1}. For A to be an optimal selection with respect to S, A' must be an optimal selection with respect to S'. The same reasoning can be applied recursively. Therefore, the greedy-choice algorithm for the activity-selection problem yields an optimal solution.
E.g. if A = {1,4,8,11}, then A' = A - {1} must be optimal with respect to S' = {4,6,7,8,9,11}: if A' started with some activity other than 4, we could always pick 4 instead.
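For reference, the exchange step above can be written compactly. This is only a sketch in the slide's notation, not the full proof from the book.

```latex
% Exchange argument (greedy-choice property for activity selection).
% A is an optimal solution whose first activity is k, with k \neq 1.
\[
  B = \bigl(A \setminus \{k\}\bigr) \cup \{1\}
\]
% Since activities are sorted by finish time, f_1 \le f_k, so activity 1 is
% compatible with every remaining activity of A, and |B| = |A|.
% Hence B is also optimal and begins with the greedy choice, activity 1.
```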

Greedy vs. dynamic programming
Dynamic programming also relies on optimal substructure, but it may need solutions to many subproblems (e.g. the max-sum problem). Greedy algorithms just select a locally optimal solution to a subproblem and then go on with the rest.
Subtle differences in problem formulation can determine whether greedy algorithms always yield optimal solutions or not. As an example, consider two similar versions of the knapsack problem: the 0-1 knapsack problem and the fractional knapsack problem.

Example 2: The Knapsack Problem
Background: a thief robbing a store finds n different items, represented by a set A. Each item i ∈ A has two properties: a weight w_i (e.g. 10 pounds) and a worth v_i (e.g. $60). The thief can carry items of total weight at most W in his knapsack (e.g. 50 pounds).
Problem: how do we maximize the value of the items that can be carried in the given knapsack?

The two knapsack problems
The 0-1 knapsack problem: each w_i and v_i is an integer. Select a set of (whole) items A' ⊆ A such that the value of the items is maximized and the sum of their weights is at most W. An item is either selected or not (think of a gold ingot).
The fractional knapsack problem: somewhat different - the thief can take fractions of items (think of gold dust).

The knapsack problem
Both knapsack problems have the optimal substructure property: assume some optimal solution for capacity W and remove some item j; the remaining solution is then an optimal solution for capacity W - w_j.
The fractional knapsack problem can be solved by a greedy algorithm: sort the items by descending value density d_i = v_i / w_i, pack whole items in that order until some item j does not fit, and then pack a fraction of j so the knapsack is filled. This yields an optimal solution.
The 0-1 knapsack problem can NOT be solved by this greedy strategy - see the counterexample below. It can however be solved by dynamic programming.

The knapsack problem (counterexample figure)
The book's figure illustrates the difference between the 0-1 and fractional problems: three items with values $60, $100 and $120 and weights 10, 20 and 30 pounds, and a knapsack of capacity 50 pounds. Item 1 has the greatest value per pound and item 3 the smallest. Picking items greedily by value per pound gives $160 for the 0-1 problem, while the optimal 0-1 solution (items 2 and 3) is worth $220 - so the greedy algorithm does not work there. For the fractional problem the same greedy strategy does work and fills the knapsack for a total value of $240.

Example 3: Huffman coding
An efficient technique for compressing data; savings of 20% to 90% are typical. The goal is an optimal binary coding of characters: we want to minimize the total number of bits needed. It is wasteful to use the same number of bits to encode characters that do not appear with the same frequency; e.g. in Swedish the character e appears much more frequently than the character z.
Idea: use a variable-length code, with a different number of bits for different characters. More frequent characters should be given shorter codes.
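A minimal Python sketch of the fractional-knapsack greedy rule. The (value, weight) representation and function name are assumptions, and the sample data follow the counterexample as reconstructed above.

```python
# Sketch of the fractional-knapsack greedy strategy: sort items by value
# density v_i / w_i and fill the knapsack, taking a fraction of the first
# item that does not fit whole.

def fractional_knapsack(items, capacity):
    """items: list of (value, weight) pairs; returns the maximum total value."""
    total = 0.0
    remaining = capacity
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if remaining <= 0:
            break
        take = min(weight, remaining)      # whole item, or the fraction that still fits
        total += value * take / weight
        remaining -= take
    return total

if __name__ == "__main__":
    items = [(60, 10), (100, 20), (120, 30)]    # the counterexample items
    print(fractional_knapsack(items, 50))       # 240.0, optimal for the fractional problem
    # For the 0-1 problem the same greedy order yields only $160, while the
    # optimum (items 2 and 3) is $220 -- greedy fails there.
```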

Huffman codes: Example
A data file contains only the characters {a, b, c, d, e, f}. The file contains 100,000 characters in total, and the characters appear with the frequencies shown in the table (in thousands): a:45, b:13, c:12, d:16, e:9, f:5.
Fixed-size coding (three bits per character) gives: 3 * (45 + 13 + 12 + 16 + 9 + 5) = 3 * 100 = 300 thousand bits.
Variable-size coding gives: 1*45 + 3*13 + 3*12 + 3*16 + 4*9 + 4*5 = 224 thousand bits.
The saving is about 25%.

Huffman codes
Huffman codes are variable-length codes. Each encoded string can be decoded from left to right; e.g. the string abc is represented by the concatenation of the codewords for a, b and c.
Huffman codes are prefix codes: no codeword is also a prefix of another codeword.
Huffman codes are optimal prefix codes: given the frequencies (numbers of occurrences) of the different characters in a string, the Huffman encoding minimizes the total length of the encoded string.

Prefix codes
Prefix codes can be represented by binary trees: each left branch is labelled 0 and each right branch 1, the characters are the leaves, and the code for a character is given by the sequence of 0s and 1s on the path from the root to its leaf. An optimal code must be represented by a full binary tree (why?).
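As a sketch of left-to-right decoding with such a tree: the nested-tuple tree representation below is an assumption, and the example tree is one optimal code for the frequencies above (a gets 1 bit, e and f get 4 bits), not necessarily the exact tree drawn on the slides.

```python
# Left-to-right decoding with a prefix-code tree. A node is either a single
# character (leaf) or a (left, right) pair; a left edge reads as bit 0 and a
# right edge as bit 1.

def decode(bits, tree):
    """Decode a bit string produced by a prefix code by walking the tree."""
    out = []
    node = tree
    for bit in bits:
        node = node[0] if bit == "0" else node[1]   # follow 0 = left, 1 = right
        if isinstance(node, str):                   # reached a leaf
            out.append(node)
            node = tree                             # restart at the root
    return "".join(out)

if __name__ == "__main__":
    # One optimal tree for the example: a=0, c=100, b=101, f=1100, e=1101, d=111.
    tree = ("a", (("c", "b"), (("f", "e"), "d")))
    print(decode("0101100", tree))   # -> abc
```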

Binary trees for coding schemes
Fixed-length code tree: a full binary tree where all characters (a:45, b:13, c:12, d:16, e:9, f:5) are on the same level.
Optimal prefix code tree: characters with high frequency appear higher up in the tree (closer to the root).

Optimal prefix codes
Given a set of characters C, a frequency f(c) for each character c ∈ C, and a tree T where d_T(c) is the depth of character c in T, the number of bits required to encode the file is
  B(T) = Σ_{c ∈ C} f(c) d_T(c)
B(T) is the cost of tree T. Optimization problem: finding an optimal prefix code is equal to finding the tree T that minimizes B(T).

Huffman coding
A greedy algorithm that finds an optimal tree. It works bottom up, starting with the set of characters as a set of one-node subtrees (all leaves), and then successively merges subtrees into larger trees (merging = adding a new node with the two subtrees as children). The two cheapest subtrees (lowest frequencies) are merged in each step; all leaves are merged into a single tree in |C| - 1 steps.
Basic idea: it is good to merge subtrees for infrequent characters early, since they then end up deep in the tree and get long codes, while frequent characters end up high in the tree and get short codes.
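A small sketch showing how B(T) is computed when the depth of a character equals the length of its codeword; the frequencies and depths are those of the example above.

```python
# Cost of a code tree: B(T) = sum over c in C of f(c) * d_T(c), where the
# depth d_T(c) equals the codeword length of character c.

freq  = {"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5}   # f(c), in thousands
depth = {"a": 1,  "b": 3,  "c": 3,  "d": 3,  "e": 4, "f": 4}   # d_T(c) in the optimal tree

def cost(freq, depth):
    """Number of bits (here: thousands of bits) needed to encode the file."""
    return sum(freq[c] * depth[c] for c in freq)

print(cost(freq, depth))        # 224 thousand bits (the variable-length code)
print(3 * sum(freq.values()))   # 300 thousand bits (the fixed-length code)
```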

Huffman coding
Algorithm for building an optimal prefix tree: f[i] is the frequency assigned to node i and Q is a min-priority queue keyed on the frequency f. Each step removes the two nodes with smallest frequency from Q, creates a new node whose frequency is their sum and whose children they are, and inserts the new node into Q. The last node left in Q is the root of the created tree.

Huffman's algorithm run
1. All characters with their frequencies are inserted into Q: f:5, e:9, c:12, b:13, d:16, a:45.
2. Characters f and e are selected and merged into a node with frequency 14.
3. Characters c and b are selected and merged into a node with frequency 25.

Huffman's algorithm run (continued)
4. Character d and node 14 are selected and merged into a node with frequency 30.
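A minimal Python sketch of this construction, using heapq as the min-priority queue Q. The tuple-based tree representation and the tie-breaking counter are assumptions, not from the slides.

```python
import heapq

# Huffman's algorithm with heapq as the min-priority queue Q. A tree is a
# character (leaf) or a (left, right) pair of subtrees.

def huffman(freq):
    """Build an optimal prefix-code tree from a {character: frequency} dict."""
    # Heap entries are (frequency, tie_breaker, tree); the unique tie breaker
    # keeps tuple comparison from ever comparing two trees.
    heap = [(f, i, c) for i, (c, f) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:                    # |C| - 1 merge steps
        f1, _, t1 = heapq.heappop(heap)     # the two cheapest subtrees
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, counter, (t1, t2)))
        counter += 1
    return heap[0][2]                       # the last node is the root

def codes(tree, prefix=""):
    """Read off the codewords: left branch = 0, right branch = 1."""
    if isinstance(tree, str):
        return {tree: prefix or "0"}
    left, right = tree
    return {**codes(left, prefix + "0"), **codes(right, prefix + "1")}

if __name__ == "__main__":
    freq = {"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5}
    print(codes(huffman(freq)))   # a gets a 1-bit code; e and f get 4-bit codes
```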

Huffman's algorithm run (continued)
5. Nodes 25 and 30 are selected and merged into a node with frequency 55.
6. Character a and node 55 are selected and merged into a node with frequency 100. Only one node is now left in Q, and it becomes the root of the tree.

Running time analysis
Assume we use a binary heap as the priority queue Q. Both extract-min and insert are O(lg n) for a binary heap, and the loop performs n - 1 steps, so the running time is O(n lg n).

Proving the correctness
To prove the correctness of the algorithm we must prove that the greedy-choice property and the optimal substructure property hold, i.e. that it is always safe to merge the two characters with the lowest frequencies.
Proof of the greedy-choice property:
Lemma: Let C be a set of characters. If x and y are the characters with the lowest frequencies, then there is an optimal prefix code for C in which the codewords of x and y have the same length and differ only in the last bit.
Proof idea: consider the tree T of any optimal prefix code, and show that it can be modified into another tree T' whose prefix code is also optimal and in which x and y are sibling leaves of maximum depth. The proof works by exchanging x and y with the two deepest characters in T, and then showing that the resulting tree T' has the same cost as T. See the book for the exact details of the proof.

Proving the correctness
Proof of the optimal substructure property (i.e. that an optimal solution can be computed from optimal solutions to subproblems):
Lemma: Let T be a tree for an optimal prefix code for C. Let x and y be sibling leaves in T with parent z, and set f[z] = f[x] + f[y]. Then T' = T \ {x, y} represents an optimal prefix code for (C \ {x, y}) ∪ {z}.
Proof idea: express B(T) in terms of B(T'). From the result (and the minimality of B(T) for C), conclude that B(T') is minimal for (C \ {x, y}) ∪ {z}. See the book for the exact details of the proof.

From these two lemmas we obtain:
Theorem: Huffman produces an optimal prefix code.
The End!
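The "express B(T) in terms of B(T')" step rests on one identity; the following is a sketch of it in the notation above (as in the book), not the full proof.

```latex
% Since x and y are children of z in T, and z is a leaf of T',
% d_T(x) = d_T(y) = d_{T'}(z) + 1, so
\begin{align*}
  f(x)\,d_T(x) + f(y)\,d_T(y)
    &= \bigl(f(x) + f(y)\bigr)\bigl(d_{T'}(z) + 1\bigr) \\
    &= f(z)\,d_{T'}(z) + f(x) + f(y),
\end{align*}
% and summing over all characters gives
\[
  B(T) = B(T') + f(x) + f(y).
\]
```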
