

Lecture: Analysis of Algorithms (CS483-001)
Amarda Shehu, Spring 2017

Outline of Today's Class:
1. The Fractional Knapsack Problem
2. Huffman Coding

Sample problems to illustrate the greedy technique: the Fractional Knapsack Problem and variable-length (Huffman) coding.

The Fractional Knapsack Problem
Given n objects:
- Each object i has an integer profit p_i.
- Each object i has a weight w_i, and you can take fractions of an object.
- You have a knapsack with weight capacity M, where M is not necessarily an integer.
Problem: Fill the knapsack with objects (taking fractions of them if needed) so that the total profit is maximized.

An Optimal Greedy Solution to the Fractional Knapsack Problem
Sort the objects by descending p_i/w_i ratio (profit per unit of weight). Examine each object i in {1, ..., n} in this order: if the object fits entirely in the knapsack, take all of it; otherwise, take the fraction that fills the remaining capacity and stop.
What is the time complexity? (Sorting dominates, so O(n lg n).)
Why does this greedy approach find the optimal solution to the Fractional Knapsack Problem?
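The greedy procedure above can be sketched in Python. This is a minimal illustration, not part of the lecture; the function name and the item representation as (profit, weight) pairs are my own choices.

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack.

    items: list of (profit, weight) pairs; capacity: knapsack weight limit M.
    Sort by profit/weight ratio descending, take each object whole while it
    fits, then take the fraction of the next object that fills the knapsack.
    """
    total = 0.0
    for profit, weight in sorted(items, key=lambda pw: pw[0] / pw[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)        # whole object, or the fraction that fits
        total += profit * (take / weight)   # proportional profit for a fraction
        capacity -= take
    return total
```

For example, with objects (60, 10), (100, 20), (120, 30) and M = 50, the greedy order by ratio is 6, 5, 4; the first two objects are taken whole and 2/3 of the third, for a total profit of 240.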

Proof of Correctness
Let X be an optimal selection of items (and fractions of items). Consider the item j with associated (p_j, w_j) that has the highest p_j/w_j ratio. If j is not fully used in X, then X is not optimal: we can remove portions of other items with a total weight of w_j from X and add j instead; since j has the best profit-per-weight ratio, the total profit does not decrease. Repeating this exchange over all items in ratio order transforms X into the greedy solution without ever decreasing the total value of X, so the greedy solution is optimal.

Another Problem Addressed with a Greedy Algorithm: Huffman (Variable-Length) Coding

The Coding Problem
Consider a message consisting of characters with known frequencies. We want to encode this message in binary; that is, we want to assign a fixed number of bits, d, to each letter (here d = 3):

Letter                    a    b    c    d    e    f
Frequency (x 10^3)       45   13   12   16    9    5
Fixed-length encoding   000  001  010  011  100  101

With this fixed-length encoding, a message consisting of 100,000 a-f characters requires 300,000 bits of storage!

How about Variable-Length Encoding? We could assign a variable-length encoding instead:

Letter                      a    b    c    d     e     f
Frequency (x 10^3)         45   13   12   16     9     5
Fixed-length encoding     000  001  010  011   100   101
Variable-length encoding    0  101  100  111  1101  1100

A message like 001011101 parses uniquely, as 0|0|101|1101, i.e., aabe; that is, one can decode this cipher unambiguously. This is based on the fact that no codeword is a prefix of another (a prefix code). The example message uses only 9 bits instead of the 12 a fixed-length code would need; over the whole 100,000-character message, this code uses 45*1 + 13*3 + 12*3 + 16*3 + 9*4 + 5*4 = 224 thousand bits instead of 300,000.
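The unique-parsing claim can be checked with a short left-to-right scan: because no codeword is a prefix of another, the first codeword that matches the scanned bits is always the right one. This sketch (function name is mine) decodes the example message with the variable-length code from the table:

```python
def decode_prefix(bits, code):
    """Decode a bit string encoded with a prefix-free code.

    bits: string of '0'/'1'; code: dict mapping character -> codeword.
    Scan left to right, emitting a character as soon as the accumulated
    bits form a codeword; prefix-freeness guarantees this is unambiguous.
    """
    inverse = {cw: ch for ch, cw in code.items()}
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in inverse:          # a complete codeword has been read
            out.append(inverse[cur])
            cur = ""
    return "".join(out)
```

Running it on "001011101" with the code {a: 0, b: 101, c: 100, d: 111, e: 1101, f: 1100} recovers the message aabe.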

Optimum Source Coding Problem
Problem: Given an alphabet A = {a_1, ..., a_n} with frequency distribution f(a_i), find a binary prefix code C for A that minimizes the number of bits

    B(C) = sum_{i=1}^{n} f(a_i) * L(c(a_i))

needed to encode a message of sum_{i=1}^{n} f(a_i) characters, where c(a_i) is the codeword for encoding a_i, and L(c(a_i)) is the length of this codeword.
Solution: Huffman developed a greedy algorithm for producing a minimum-cost prefix code. The code that is produced is called a Huffman Code.
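The cost B(C) is straightforward to evaluate for the two codes in the earlier table; this small sketch (function name is mine) just computes the sum in the definition:

```python
def code_cost(freq, code):
    """B(C) = sum over the alphabet of f(a_i) * L(c(a_i)).

    freq: dict character -> frequency; code: dict character -> codeword.
    """
    return sum(f * len(code[ch]) for ch, f in freq.items())

freq = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}   # in thousands
fixed = dict(zip("abcdef", ["000", "001", "010", "011", "100", "101"]))
var = dict(zip("abcdef", ["0", "101", "100", "111", "1101", "1100"]))
```

With the frequencies in thousands, the fixed-length code costs 300 (i.e., 300,000 bits) and the variable-length code costs 224 (224,000 bits).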

Basic Idea Behind Huffman Coding
The code is represented by a binary tree:
- There is a 1-1 correspondence between the leaves and the characters.
- Each leaf is labeled with the frequency of its character.
- Left edges are labeled 0; right edges are labeled 1.
- The path from the root to a leaf spells out the codeword of the character at that leaf.

Basic Idea Behind Huffman Coding
Step 1. Pick the two letters x, y from alphabet A with the smallest frequencies and create a subtree that has these two characters as leaves. (This is the greedy idea.) Label the root of this subtree z.
Step 2. Set frequency f(z) = f(x) + f(y). Remove x and y and add z, creating a new alphabet A' = (A U {z}) - {x, y}. Note that |A'| = |A| - 1.
Repeat this procedure, called a merge, with the new alphabet until only one symbol is left. The resulting tree is the Huffman code.

Huffman Code Algorithm
HuffmanCoding(A)
    n <- |A|
    Q <- A    (a min-priority queue keyed on the frequencies f)
    for i = 1 to n - 1 do
        allocate a new node z
        left[z] <- x <- EXTRACT-MIN(Q)
        right[z] <- y <- EXTRACT-MIN(Q)
        f[z] <- f[x] + f[y]
        INSERT(Q, z)
    return EXTRACT-MIN(Q)
Can you see why the time complexity of this algorithm is O(n lg n)? (The loop runs n - 1 times, and each priority-queue operation on a binary heap costs O(lg n).)
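The merge loop above can be sketched in Python with the standard-library heap as the priority queue. This version (function name and representation are my own) tracks only the depth of each symbol in the tree, i.e., the length of its codeword, which is enough to compute B(C):

```python
import heapq

def huffman_code_lengths(freq):
    """Greedy Huffman construction; returns codeword length per symbol.

    freq: dict mapping symbol -> frequency.
    Repeatedly merges the two least-frequent subtrees (the greedy step);
    every symbol in a merged subtree sinks one level deeper in the tree.
    """
    # Heap entries: (frequency, tie-breaker, symbols in this subtree).
    heap = [(f, i, [sym]) for i, (sym, f) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    depth = {sym: 0 for sym in freq}
    counter = len(heap)                      # tie-breaker for equal frequencies
    while len(heap) > 1:
        f1, _, syms1 = heapq.heappop(heap)   # smallest frequency (EXTRACT-MIN)
        f2, _, syms2 = heapq.heappop(heap)   # second smallest (EXTRACT-MIN)
        for s in syms1 + syms2:
            depth[s] += 1                    # one level deeper under the new root z
        heapq.heappush(heap, (f1 + f2, counter, syms1 + syms2))  # INSERT(Q, z)
        counter += 1
    return depth
```

On the lecture's alphabet, huffman_code_lengths({'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}) yields lengths 1, 3, 3, 3, 4, 4 for a-f, matching the variable-length code in the table, with total cost 224.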

Greedy Algorithms (for Optimization Problems)
A greedy algorithm builds a solution one step at a time. Unlike DP, at each step a greedy algorithm makes the currently best choice from a small number of choices; this is also referred to as the locally optimal choice. Greedy algorithms are similar to DP algorithms in that the solution is efficient if the problem exhibits optimal substructure. BUT: the greedy solution may not be optimal even if the problem exhibits optimal substructure (DP is needed in such cases).

When to Design Greedy Algorithms for Near-Optimal Solutions
Greedy-choice property: a locally optimal choice leads to a globally optimal solution. Applying the greedy approach to problems that do not have this property can yield suboptimal solutions. BUT: suboptimal solutions may be good enough approximations of the optimal solution in some applications, for instance when the globally optimal solution is too expensive to compute.