Greedy Algorithms and Matroids. Andreas Klappenecker



Giving Change

Coin Changing Suppose we have n types of coins with values v[1] > v[2] > ... > v[n] > 0. Given an amount C, a positive integer, the following algorithm tries to give change for C:

m[1] := 0; m[2] := 0; ...; m[n] := 0; // multiplicities of coins
for i = 1 to n do
    while C >= v[i] do
        C := C - v[i]; m[i]++;
    od;
od;
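The pseudocode above can be transcribed directly into Python (a sketch; names follow the slides):

```python
def greedy_change(values, C):
    """Greedy change-making; values must be sorted in decreasing order."""
    m = [0] * len(values)          # multiplicities of coins
    for i, v in enumerate(values):
        while C >= v:              # take this coin as long as it fits
            C -= v
            m[i] += 1
    return m

# Example from the next slide: values 5, 2, 1 and C = 14.
print(greedy_change([5, 2, 1], 14))  # [2, 2, 0]
```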

Coin Changing Suppose that v[1] = 5; v[2] = 2; v[3] = 1. Let C = 14. Then the algorithm calculates m[1] = 2 and m[2] = 2. This gives the correct amount of change: C = v[1]*m[1] + v[2]*m[2] = 5*2 + 2*2 = 14.

Correctness Suppose we have n types of coins with values v[1] > v[2] > ... > v[n] > 0. Input: A positive integer C.

m[1] := 0; m[2] := 0; ...; m[n] := 0; // multiplicities of coins
for i = 1 to n do
    while C >= v[i] do
        C := C - v[i]; m[i]++;
    od;
od;
// When is the algorithm correct for all inputs C?

Optimality Suppose we have n types of coins with values v[1] > v[2] > ... > v[n] = 1. Input: A positive integer C.

m[1] := 0; m[2] := 0; ...; m[n] := 0; // multiplicities of coins
for i = 1 to n do
    while C >= v[i] do
        C := C - v[i]; m[i]++;
    od;
od;
// Does the algorithm give the least number of coins?

Example v[1] = 5; v[2] = 2; v[3] = 1. If C < 2, then the algorithm gives a single coin, which is optimal. If C < 5, then the algorithm gives at most 2 coins: C = 4 = 2*2 // 2 coins; C = 3 = 2+1 // 2 coins; C = 2 = 2 // 1 coin. In each case this is optimal. If C >= 5, then the algorithm uses as many coins of value 5 as possible and then gives optimal change for the remaining value < 5. One cannot improve upon this solution. Let us see why.

Example Any optimal solution giving change for C, with C = v[1]*m[1] + v[2]*m[2] + v[3]*m[3] and v[1] = 5; v[2] = 2; v[3] = 1, must satisfy a) m[3] <= 1 // if m[3] >= 2, replace two 1's with a 2; b) 2*m[2] + m[3] < 5 // otherwise use a 5; c) if C >= 5*k, then m[1] >= k // otherwise the solution violates b). Thus, any optimal solution greedily chooses the maximal number of 5s. The remaining value must then be covered by the maximal number of 2s, so any optimal solution is the greedy solution!

Example 2 Let v[1] = 4; v[2] = 3; v[3] = 1. The algorithm yields 3 coins of change for C = 6, namely 6 = 4+1+1. Is this optimal?

Example 2 Any optimal solution giving change for C, with C = v[1]*m[1] + v[2]*m[2] + v[3]*m[3] and v[1] = 4; v[2] = 3; v[3] = 1, must satisfy a) m[3] <= 2 // if m[3] > 2, replace three 1's with a 3; b) we cannot have both m[3] > 0 and m[2] > 0, for otherwise we could use a 4 to reduce the coin count; c) m[2] < 4 // otherwise use 4s to reduce the number of coins. Greedy will fail in general. One can still find an optimal solution efficiently, but the solution is not unique. Why? 9 = 3+3+3 = 4+4+1.
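To see the failure concretely, one can compare the greedy coin count with the true optimum computed by dynamic programming (a quick sketch; the function names are mine, not from the slides):

```python
def greedy_count(values, C):
    """Coins used by greedy; values must be in decreasing order."""
    count = 0
    for v in values:
        count += C // v    # take as many of this coin as possible
        C %= v
    return count

def optimal_count(values, C):
    """Minimum number of coins, by dynamic programming over amounts 0..C."""
    INF = float('inf')
    best = [0] + [INF] * C
    for c in range(1, C + 1):
        best[c] = min((best[c - v] + 1 for v in values if v <= c), default=INF)
    return best[C]

# Greedy gives 6 = 4+1+1 (3 coins), but 6 = 3+3 needs only 2.
print(greedy_count([4, 3, 1], 6), optimal_count([4, 3, 1], 6))  # 3 2
```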

How Likely is it Optimal? Let N be a positive integer. Let us choose integers a and b uniformly at random subject to N > b > a > 1. Then the greedy coin-changing algorithm with v[1] = b; v[2] = a; v[3] = 1 always gives optimal change with probability (8/3)N^(-1/2) + O(1/N), as Thane Plambeck has shown [AMM 96(4), April 1989, p. 357].

Greedy Algorithms

Greedy Algorithms The development of a greedy algorithm can be separated into the following steps:
1. Cast the optimization problem as one in which we make a choice and are left with one subproblem to solve.
2. Prove that there is always an optimal solution to the original problem that makes the greedy choice, so that the greedy choice is always safe.
3. Demonstrate that, having made the greedy choice, what remains is a subproblem with the property that if we combine an optimal solution to the subproblem with the greedy choice we have made, we arrive at an optimal solution to the original problem.

Greedy-Choice Property The greedy-choice property is that a globally optimal solution can be arrived at by making a locally optimal (= greedy) choice.

Optimal Substructure A problem exhibits optimal substructure if and only if an optimal solution to the problem contains within it optimal solutions to subproblems.

Greedy Algorithms Greedy algorithms are easy to design, but their correctness is harder to show. We will look at some general principles that allow one to prove that a greedy algorithm is correct.

Matroids

Matroid Let S be a finite set, and F a nonempty family of subsets of S, that is, F ⊆ P(S). We call (S,F) a matroid if and only if
M1) If B ∈ F and A ⊆ B, then A ∈ F. [The family F is called hereditary.]
M2) If A, B ∈ F and |A| < |B|, then there exists x in B \ A such that A ∪ {x} ∈ F. [This is called the exchange property.]
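For a small ground set, the two axioms can be verified by brute force. A sketch (the uniform matroid below is my own example, not from the slides):

```python
from itertools import combinations

def is_matroid(F):
    """Brute-force check of M1 (hereditary) and M2 (exchange) for a small family F."""
    F = {frozenset(A) for A in F}
    if not F:
        return False
    # M1: every subset of a member of F must again be in F.
    for B in F:
        for r in range(len(B) + 1):
            if any(frozenset(A) not in F for A in combinations(B, r)):
                return False
    # M2: whenever |A| < |B|, some x in B \ A extends A inside F.
    for A in F:
        for B in F:
            if len(A) < len(B) and not any(A | {x} in F for x in B - A):
                return False
    return True

# The uniform matroid U_{2,4}: all subsets of {1,2,3,4} with at most two elements.
U24 = [c for r in range(3) for c in combinations({1, 2, 3, 4}, r)]
print(is_matroid(U24))  # True
```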

Example 1 (Matric Matroids) Let M be a matrix. Let S be the set of rows of M and F = { A | A ⊆ S, A is linearly independent }. Claim: (S,F) is a matroid. Clearly, F is not empty (it contains every row of M).
M1) If B is a set of linearly independent rows of M, then any subset A of B is linearly independent. Thus, F is hereditary.
M2) If A, B are sets of linearly independent rows of M, and |A| < |B|, then dim span A < dim span B. Choose a row x in B that is not contained in span A. Then A ∪ {x} is a linearly independent subset of rows of M. Therefore, F satisfies the exchange property.

Undirected Graphs Let V be a finite set, E a subset of { e | e ⊆ V, |e| = 2 }. Then (V,E) is called an undirected graph. We call V the set of vertices and E the set of edges of the graph. Example: V = {a,b,c,d}, E = { {a,b}, {a,c}, {a,d}, {b,d}, {c,d} }.

Induced Subgraphs Let G=(V,E) be a graph. We call a graph (V,E′) an induced subgraph of G if and only if its edge set E′ is a subset of E.

Spanning Trees Given a connected graph G, a spanning tree of G is an induced subgraph of G that happens to be a tree and connects all vertices of G. If the edges are weighted, then a spanning tree of G with minimum weight is called a minimum spanning tree. [Figure: a weighted graph on vertices a, b, c, d with edge weights 1, 2, 3, 4, 5.]

Example 2 (Graphic Matroids) Let G=(V,E) be an undirected graph. Choose S = E and F = { A | H = (V,A) is an induced subgraph of G such that H is a forest }. Claim: (S,F) is a matroid.
M1) F is a nonempty hereditary set system.
M2) Let A and B be in F with |A| < |B|. Then (V,B) has fewer trees than (V,A). Therefore, (V,B) must contain a tree T whose vertices lie in different trees of the forest (V,A). One can add an edge x of T connecting two such trees to A and obtain another forest (V, A ∪ {x}).

Weight Functions A matroid (S,F) is called weighted if it is equipped with a weight function w: S → R+, i.e., all weights are positive real numbers. If A is a subset of S, then w(A) := Σ_{a in A} w(a). Weight functions of this form are sometimes called linear weight functions.

Greedy Algorithm for Matroids

Greedy(M=(S,F), w) // maximizing version
    A := ∅;
    sort S into monotonically decreasing order by weight w;
    for each x in S, taken in monotonically decreasing order do
        if A ∪ {x} ∈ F then A := A ∪ {x}; fi;
    od;
    return A;
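In Python the algorithm is a few lines once membership in F is given as an oracle (a sketch; `independent` is my name for the F-membership test):

```python
def matroid_greedy(S, independent, w):
    """Greedy for a weighted matroid: scan elements in decreasing weight,
    keep x whenever A ∪ {x} stays in F (tested by `independent`)."""
    A = set()
    for x in sorted(S, key=w, reverse=True):
        if independent(A | {x}):
            A.add(x)
    return A

# Toy example: the uniform matroid of rank 2 (any set of size <= 2 is in F).
weights = {'a': 3, 'b': 5, 'c': 1}
result = matroid_greedy(weights, lambda A: len(A) <= 2, weights.get)
print(sorted(result))  # ['a', 'b'] -- the two heaviest elements
```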

Correctness Theorem: Let M = (S,F) be a weighted matroid with weight function w. Then Greedy(M,w) returns a set in F of maximal weight. [Thus, even though greedy algorithms in general do not produce optimal results, the greedy algorithm for matroids does! This algorithm is applicable to a wide class of problems. Yet, the correctness proof for Greedy is not more difficult than the correctness proof for special instances such as Huffman coding. This is economy of thought!]

Correctness Seeking a contradiction, suppose that Greedy returns C in F, but there exists B in F such that w(B) > w(C). Since w is positive, we can assume that B and C are bases; say B = {b_1,...,b_n} and C = {c_1,...,c_n}, each listed in decreasing order of weight. Let i be the smallest index such that w(b_1) <= w(c_1), ..., w(b_{i-1}) <= w(c_{i-1}), and w(b_i) > w(c_i). Greedy would have picked b_i in the i-th step. Indeed, {b_1,...,b_i} and {c_1,...,c_{i-1}} are in F, hence by the exchange axiom some b_j with j <= i satisfies {c_1,...,c_{i-1}, b_j} ∈ F; since w(b_j) >= w(b_i) > w(c_i), Greedy would have chosen it over c_i. Hence no such i exists, so w(B) <= w(C), a contradiction!

Complexity Let n = |S| be the number of elements of S. Sorting S: O(n log n). The for-loop iterates n times. In the body of the loop one needs to check whether A ∪ {x} is in F. If each check takes f(n) time, then the loop takes O(n f(n)) time. Thus, Greedy takes O(n log n + n f(n)) time.

Minimizing or Maximizing? Let M=(S,F) be a matroid. The algorithm Greedy(M,w) returns a set A in F maximizing the weight w(A). If we would like to find a set A in F with minimal weight, then we can use Greedy with the weight function w′(a) = m − w(a) for a in S, where m is a real number such that m > max_{s in S} w(s).

Matric Matroids

Matric Matroids Let M be a matrix. Let S be the set of rows of the matrix M and F = { A | A ⊆ S, A is linearly independent }. Weight function: w(A) = |A|. What does Greedy((S,F), w) compute? The algorithm yields a basis of the vector space spanned by the rows of the matrix M.
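A sketch of this computation in pure Python, using exact rational Gaussian elimination as the independence oracle (the helper names are mine):

```python
from fractions import Fraction

def independent(rows):
    """Check linear independence of a list of rational row vectors:
    Gaussian elimination, independent iff rank == number of rows."""
    M = [[Fraction(x) for x in row] for row in rows]
    rank = 0
    for col in range(len(M[0]) if M else 0):
        pivot = next((r for r in range(rank, len(M)) if M[r][col] != 0), None)
        if pivot is None:
            continue
        M[rank], M[pivot] = M[pivot], M[rank]
        for r in range(len(M)):
            if r != rank and M[r][col] != 0:
                f = M[r][col] / M[rank][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[rank])]
        rank += 1
    return rank == len(M)

# Greedy over the rows of a matrix with w(A) = |A| (all element weights equal):
rows = [(1, 0, 0), (0, 1, 0), (1, 1, 0), (0, 0, 0)]
basis = []
for row in rows:
    if independent(basis + [row]):   # keep the row iff it stays independent
        basis.append(row)
print(basis)  # [(1, 0, 0), (0, 1, 0)] -- a basis of the row space
```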

Graphic Matroids Let G=(V,E) be an undirected connected graph. Let S = E and F = { A | H = (V,A) is an induced subgraph of G such that H is a forest }. Let w be a weight function on E. Define w′(a) = m − w(a), where m > w(a) for all a in E. Greedy((S,F), w′) returns a minimum spanning tree of G. This algorithm is known as Kruskal's algorithm.
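A compact implementation uses a union-find forest as the cycle test (a sketch; the example graph and its weights are assumptions, loosely following the earlier spanning-tree figure):

```python
def kruskal(n, edges):
    """Kruskal's MST: edges are (weight, u, v) with vertices 0..n-1.
    Scan edges in increasing weight; add one iff it joins two different trees."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    mst = []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                        # no cycle: different trees
            parent[ru] = rv
            mst.append((w, u, v))
    return mst

# Hypothetical weighted graph on 4 vertices (a=0, b=1, c=2, d=3):
edges = [(1, 0, 1), (2, 0, 2), (3, 0, 3), (5, 1, 3), (4, 2, 3)]
tree = kruskal(4, edges)
print(sum(w for w, _, _ in tree))  # total weight 1 + 2 + 3 = 6
```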

Kruskal's MST algorithm [Figure: animation on an example graph with edge weights 2 through 18.] Consider the edges in increasing order of weight; add an edge iff it does not cause a cycle. [Animation taken from Prof. Welch's lecture notes]


Conclusion Matroids characterize a class of problems for which the greedy algorithm yields an optimal solution. Kruskal's minimum spanning tree algorithm fits nicely into this framework.