CSC 373: Algorithm Design and Analysis Lecture 4


CSC 373: Algorithm Design and Analysis, Lecture 4. Allan Borodin. January 14, 2013.

Lecture 4: Outline (for this lecture and the next)

- Some concluding comments on the optimality of EST
- Greedy interval colouring algorithm
- Interval graphs
- Graph MIS and graph colouring
- Kruskal's MST
- Huffman coding
- Greedy algorithms for the makespan problem

Comments on the proof technique in graph colouring

The proof technique used to prove the optimality of the EST greedy algorithm for interval colouring is also often used for proving approximation guarantees. The idea is to find some bound (or bounds) that any solution must satisfy, and then relate that bound to the algorithm's solution. In this case, consider the maximum number of intervals in the input set that intersect at any single point.

Observation: the number of colours must be at least this large.

For the interval colouring proof, it then just remained to show that the greedy algorithm never uses more than this number of colours.

Why doesn't the greedy colouring algorithm exceed this intrinsic bound?

Recall that we have sorted the intervals by non-decreasing starting time (i.e. earliest start time first). Let k be the maximum number of intervals in the input set that intersect at any single point. Suppose for a contradiction that the algorithm used more than k colours. Consider the first time (say on some interval I_l) that the greedy algorithm would have used a (k+1)-st colour. Then there must be k intervals intersecting I_l, each already assigned one of the colours 1, ..., k. Let s_l be the starting time of I_l. Why must all k intersecting intervals contain s_l? Because they were processed earlier, they start no later than s_l, and since they overlap I_l they must extend past s_l. Hence there are k+1 intervals intersecting at the point s_l, contradicting the definition of k.
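The argument above is about this greedy colouring rule; a minimal Python sketch (the interval representation and function name are illustrative, and intervals are taken as half-open (start, finish) pairs):

```python
def greedy_interval_colouring(intervals):
    """Sort intervals by start time; give each the smallest colour
    not used by any earlier-started interval that overlaps it."""
    order = sorted(range(len(intervals)), key=lambda i: intervals[i][0])
    colour = {}
    for i in order:
        s_i, f_i = intervals[i]
        # colours taken by already-coloured intervals overlapping interval i
        used = {colour[j] for j in colour
                if intervals[j][0] < f_i and intervals[j][1] > s_i}
        c = 1
        while c in used:
            c += 1
        colour[i] = c
    return colour

# Three intervals all overlapping at time 2 force 3 colours:
print(max(greedy_interval_colouring([(0, 3), (1, 4), (2, 5)]).values()))  # 3
```

The quadratic overlap check keeps the sketch short; an efficient implementation would recycle colours freed by finished intervals.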

Interval graphs

As we remarked last class, there is a natural way to view the interval scheduling and colouring problems as graph problems. Let I be a set of intervals. We can construct the intersection graph G(I) = (V, E) where:

- V = I
- (u, v) is an edge in E iff the intervals corresponding to u and v intersect.

Any graph that is the intersection graph of some set of intervals is called an interval graph.
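The construction of G(I) can be written directly from the definition; a small Python sketch over closed intervals (names are illustrative):

```python
def interval_graph(intervals):
    """Intersection graph of a list of closed intervals.
    Vertices are interval indices; {i, j} is an edge iff they intersect."""
    n = len(intervals)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            (a, b), (c, d) = intervals[i], intervals[j]
            if a <= d and c <= b:  # closed intervals [a,b], [c,d] intersect
                edges.add((i, j))
    return edges

print(interval_graph([(0, 2), (1, 3), (4, 5)]))  # {(0, 1)}
```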

Graph MIS and colouring

Let G = (V, E) be a graph. The following two problems are known to be NP-hard to approximate (to within a factor of n^(1-ε) for any ε > 0) on arbitrary graphs:

Graph MIS: A subset U of V is an independent set (aka stable set) in G if for all u, v in U, (u, v) is not an edge in E. The maximum independent set (MIS) problem is to find a maximum-size independent set U.

Graph colouring: A function c mapping vertices to {1, 2, ..., k} is a valid colouring of G if c(u) ≠ c(v) for all (u, v) in E. The graph colouring problem is to find a valid colouring that minimizes the number of colours k.

Efficient algorithms for interval graphs

Given a set I of intervals, it is easy to construct its intersection graph G(I).

Note: given any graph G, there is a linear-time algorithm to decide whether G is an interval graph and, if so, to construct an interval representation.

The interval scheduling (resp. interval colouring) problem becomes the graph MIS (resp. colouring) problem on the intersection graph, and hence these problems are efficiently solved for interval graphs.

Question: is there a graph-theoretic explanation? Yes: interval graphs are chordal graphs. The minimum colouring number (chromatic number) of a graph is always at least the size of a maximum clique (the clique number). The greedy interval colouring proof shows that for interval graphs (and chordal graphs) the chromatic number equals the clique number; such graphs are perfect graphs. However, Mycielski's theorem shows that there exist triangle-free graphs with arbitrarily high chromatic number, so this equality fails badly in general.

Greedy algorithms for the MST problem

We will start with Kruskal's algorithm. The presentation in DPV also considers appropriate data structures for implementing the algorithm. In terms of its basic structure, the algorithm is very similar to the EFT algorithm for interval scheduling.

Kruskal's algorithm:

    Order the edges so that w(e_1) <= w(e_2) <= ... <= w(e_m)
    T := {}                  (T is the current forest, to be extended to an MST)
    For i := 1, ..., m:
        If e_i connects two distinct components of T:
            T := T + {e_i}
    End For

Claim: the same style of inductive proof (using the cut property to show that each partial forest T_i is promising) can be used to show that Kruskal's algorithm is optimal.

Code for Kruskal's algorithm as in the DPV text

DPV Figure 5.4: Kruskal's minimum spanning tree algorithm.

    Procedure Kruskal(G, w)
    Input: a connected undirected graph G = (V, E) with edge weights w_e
    Output: a minimum spanning tree defined by the edges X

    For all u in V: makeset(u)
    X := {}
    Sort the edges E by (non-decreasing) weight
    For all edges {u, v} in E, in increasing order of weight:
        If find(u) != find(v):
            add edge {u, v} to X
            union(u, v)

Comment: the inductive proof of optimality for Kruskal's (or Prim's) MST algorithm shows that when all edge weights are distinct, the MST is unique.
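The DPV pseudocode above can be fleshed out with a concrete union-find; a minimal Python sketch (the edge format and names are illustrative) using path compression and union by rank:

```python
def kruskal(n, edges):
    """Kruskal's MST over vertices 0..n-1.
    `edges` is a list of (weight, u, v) triples; returns the chosen edges."""
    parent = list(range(n))
    rank = [0] * n

    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]  # path compression
            u = parent[u]
        return u

    def union(u, v):
        ru, rv = find(u), find(v)
        if rank[ru] < rank[rv]:
            ru, rv = rv, ru
        parent[rv] = ru                    # union by rank
        if rank[ru] == rank[rv]:
            rank[ru] += 1

    X = []
    for w, u, v in sorted(edges):          # non-decreasing weight
        if find(u) != find(v):             # joins two components
            X.append((u, v, w))
            union(u, v)
    return X

mst = kruskal(4, [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 2, 3)])
print(sum(w for _, _, w in mst))  # 7
```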

Prim's MST and Dijkstra's least-cost paths

Prim's algorithm (and the proof of its optimality) for the MST problem is very similar, but now the next edge is chosen adaptively to be the smallest-weight edge leaving the current component.

The style of Prim's MST algorithm is very similar to Dijkstra's algorithm for computing least-cost paths from a single source node s to all other nodes in a directed graph with non-negative edge costs. We can view the single-source least-cost problem as computing a least-cost tree with root s. At each iteration i, having computed the tree for the nodes in some set S_i, the next edge (or node) to be chosen is the one that minimizes the cost to a node not in S_i, reached by adding an edge leaving S_i.

It can be shown (in some precise model for greedy algorithms) that an adaptive order for choosing edges is necessary; that is, a fixed order will not work.
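The adaptive choice of the cheapest edge leaving the current component is naturally implemented with a priority queue; a minimal Python sketch of Prim's algorithm (adjacency-list format and names are illustrative, graph assumed connected):

```python
import heapq

def prim(adj, s=0):
    """Prim's MST: grow a tree from s by repeatedly adding the
    cheapest edge leaving it. adj[u] is a list of (weight, v) pairs.
    Returns the total MST weight."""
    n = len(adj)
    in_tree = [False] * n
    heap = [(0, s)]          # (cost to reach, vertex)
    total = 0
    while heap:
        w, u = heapq.heappop(heap)
        if in_tree[u]:
            continue         # stale entry: u was already added
        in_tree[u] = True
        total += w
        for wv, v in adj[u]:
            if not in_tree[v]:
                heapq.heappush(heap, (wv, v))
    return total

adj = [[(1, 1), (3, 2)], [(1, 0), (2, 2)],
       [(2, 1), (3, 0), (4, 3)], [(4, 2)]]
print(prim(adj))  # 7
```

Replacing the edge weight wv by the path cost w + wv in the push turns this into exactly Dijkstra's algorithm, which is the structural similarity noted above.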

Huffman (prefix-free) binary encoding

Consider a set of symbols Γ = {s_1, s_2, ..., s_n}. These symbols appear in some context (e.g. words in a document, discrete samples from a signal, etc.). We want to encode each symbol s_i as a binary string σ_i. We assume that the symbols occur with different frequencies, with symbol s_i having frequency f_i. Clearly, if a symbol s occurs very often (resp. infrequently), we want to use a relatively short (resp. long) string to represent it. To simplify decoding, a nice property is that the encodings {σ_i} satisfy the prefix-free property: no codeword σ_i is a prefix of another codeword σ_j.

Prefix binary codes as binary trees

Such an encoding is equivalent to a full ordered binary tree T, that is, a rooted binary tree where:

- every non-leaf has exactly two children,
- the left edge of each node is labelled 0 and the right edge is labelled 1, and
- every leaf is labelled by a symbol in Γ.

The edge labels along the path from the root to a leaf define the string encoding the symbol at that leaf. The goal is to construct such a tree T so as to minimize

    cost(T) = sum_i f_i * (depth of s_i in T).

Equivalently, we are minimizing the expected codeword length, namely

    E_{s in Γ}[|σ(s)|] = sum_i p_i * (depth of s_i in T),

where p_i = f_i / sum_j f_j is the probability of s_i. The intuitive idea is to greedily combine the two lowest-frequency symbols s_1 and s_2 into a new symbol with frequency f_1 + f_2.

An example of Huffman coding in DPV

DPV Figure 5.10 shows a prefix-free encoding as a code tree; the symbol frequencies (in millions) are shown in square brackets: A [70], B [3], C [20], D [37]. The resulting codewords are:

    Symbol  Codeword
    A       0
    B       100
    C       101
    D       11

For this toy example, under the codes of Figure 5.10 the total size of the binary encoding drops to 213 megabits, a 17% improvement.

The Huffman algorithm as in the DPV text

Code for Huffman coding:

    Procedure Huffman(f)
    Input: an array f[1 ... n] of frequencies f_1, f_2, ..., f_n
    Output: an encoding tree with n leaves

    Let H be a priority queue of integers, ordered by f
    For i := 1, ..., n: insert(H, i)
    For k := n + 1, ..., 2n - 1:
        i := deletemin(H); j := deletemin(H)
        create a node numbered k with children i, j
        f[k] := f[i] + f[j]
        insert(H, k)
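A compact way to see the greedy merging at work is to compute cost(T) directly: each merge of weights a and b adds a + b to the total, since every merge pushes the merged subtrees one level deeper. A minimal Python sketch of this cost computation (it returns only cost(T), not the tree itself):

```python
import heapq

def huffman_cost(freqs):
    """Total weighted codeword length of an optimal Huffman tree:
    repeatedly merge the two smallest frequencies, summing the merges."""
    heap = list(freqs)
    heapq.heapify(heap)
    cost = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)
        b = heapq.heappop(heap)
        cost += a + b            # merged pair sits one level deeper
        heapq.heappush(heap, a + b)
    return cost

# Frequencies from the DPV example (in millions): A=70, B=3, C=20, D=37
print(huffman_cost([70, 3, 20, 37]))  # 213
```

On the DPV frequencies this reproduces the 213 megabits quoted in the example slide.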

What are chordal graphs?

The following two slides were not discussed in class, but I leave them here for anyone who is interested.

PEO: There are many equivalent ways to define chordal graphs. For our purposes, let us define a chordal graph G = (V, E) as one having a perfect elimination ordering (PEO) of the vertices: an ordering v(1), v(2), ..., v(n) such that for all i, Neighbourhood(v(i)) ∩ {v(i+1), ..., v(n)} is a clique (i.e. the MIS of the graph induced by this "inductive neighbourhood" has size 1).

Note: ordering intervals by earliest finishing time provides a PEO for the intersection graph of the intervals; hence interval graphs are chordal.
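The PEO condition can be checked mechanically from the definition; a small Python sketch (adjacency-dict format and names are illustrative; a linear-time check exists, but this naive version mirrors the definition directly):

```python
def is_peo(order, adj):
    """Check whether `order` is a perfect elimination ordering of the
    graph given as a dict mapping each vertex to its set of neighbours:
    for each v, its later neighbours must form a clique."""
    pos = {v: i for i, v in enumerate(order)}
    for i, v in enumerate(order):
        later = [u for u in adj[v] if pos[u] > i]
        for a in range(len(later)):
            for b in range(a + 1, len(later)):
                if later[b] not in adj[later[a]]:
                    return False   # two later neighbours are non-adjacent
    return True

path = {'a': {'b'}, 'b': {'a', 'c'}, 'c': {'b'}}
print(is_peo(['a', 'b', 'c'], path))  # True: trees are chordal
```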

More on chordal graphs

We can abstract the arguments used for interval selection to show the optimality of greedy algorithms on any chordal graph, using a PEO ordering. Likewise, we can abstract the arguments used for interval colouring to show the optimality of greedy colouring on any chordal graph, using a reverse PEO ordering.

An equivalent (and the original) definition of chordal graphs: graphs which do not have any k-cycle (for k > 3) as an induced subgraph.

What are and are not chordal graphs? For example, a 4-cycle cannot be an interval graph. Trees are chordal graphs. One can generalize chordal graphs by relaxing PEO orderings so that the induced neighbourhoods have MIS = k for some small k.