Lecture Notes on Karger's Min-Cut Algorithm. Eric Vigoda, Georgia Institute of Technology. Last updated for 4540 - Advanced Algorithms, Fall 2013.


Today's topic: Karger's min-cut algorithm [3].

Problem Definition

Let G = (V, E) be an undirected graph. Let n = |V| denote the number of vertices and m = |E| denote the number of edges. For S ⊂ V, let δ_G(S) = {(u, v) ∈ E : u ∈ S, v ∉ S} denote the set of edges crossing between S and S̄ = V \ S. The set δ_G(S) is called a cut, since removing these edges from G disconnects G into more than one component. We often write δ(S) instead of δ_G(S) when it is clear which graph G we are referring to.

Goal: Find a cut of minimum size. In other words, we want to find the fewest edges whose removal disconnects the graph (we might as well assume G is initially connected; otherwise there is nothing to do).

Previous Work

Closely related is the minimum st-cut problem. In this problem we are also given two vertices s and t as input, and we only consider cuts that separate s and t into different components; that is, we restrict attention to cuts δ(S) where s ∈ S and t ∉ S. Traditionally, the min-cut problem was solved by solving n - 1 minimum st-cut problems. The size of the minimum st-cut equals the value of the maximum st-flow, and the fastest algorithm for maximum st-flow runs in O(nm log(n²/m)) time [1]. In fact, all n - 1 maximum st-flow computations can be done simultaneously within the same time bound [2].

Karger [3] devised a simple, clever algorithm that solves the min-cut problem (without the st-condition) without using any max-flow computations. A refinement of Karger's algorithm, due to Karger and Stein [4], is also faster than the above approach based on max-st-flow computations. We will present Karger's algorithm first, followed by the refinement.
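To make the cut notation concrete, here is a tiny worked example; the example graph and the helper name delta are our own illustration.

```python
# A 4-cycle 1-2-3-4 with the chord (1, 3); the graph and the helper name
# `delta` are our own illustration of the cut notation delta(S).
edges = [(1, 2), (2, 3), (3, 4), (4, 1), (1, 3)]

def delta(S, edges):
    """Return the edges with exactly one endpoint in S."""
    return [(u, v) for (u, v) in edges if (u in S) != (v in S)]
```

Here delta({1}, edges) is [(1, 2), (4, 1), (1, 3)], a cut of size 3, while delta({2}, edges) has size 2, which is the minimum for this graph (the 4-cycle keeps every cut at size at least 2).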
Algorithm Definition

The basic operation in the algorithm is to contract edges. This is defined formally below, but a rough description is: for an edge e = (v, w), to contract e we merge v and w into one vertex. The main idea is that if the min cut is δ(S), and both v and w are in S or both are in S̄ (and thus e is not in the cut δ(S)), then by contracting the edge e we have not changed δ(S): it looks the same in the new graph.

The algorithm starts with a simple graph as input, but it will then need to work with multigraphs. In a multigraph there may be multiple edges between a pair of vertices. We do not allow edges of the form (v, v) in our multigraphs; we refer to such edges as self-loops.

Definition 1. Let G = (V, E) be a multigraph without self-loops. For e = {v, w} ∈ E, the contraction with respect to e, denoted G/e, is formed by the following steps:

1. Replace vertices v and w with a new vertex, z.
2. Replace every edge (v, x) or (w, x) with an edge (z, x).
3. Remove self-loops at z.
4. The resulting multigraph is G/e.

As pointed out above, the key observation is that if we contract an edge (v, w), then we preserve those cuts where v and w are both in S or both in S̄. This is formalized in the following statement.

Observation 2. Let e = (v, w) ∈ E. There is a one-to-one correspondence between cuts in G which do not contain the edge (v, w) and cuts in G/e. In fact, for S ⊂ V such that v, w ∈ S, we have δ_G(S) = δ_{G/e}(S) (with z substituted for v and w).

Thus, starting from G, if we contract some edges to obtain a multigraph G′, then all of the cuts in G′ correspond to cuts in G, but some cuts in G are no longer represented in G′. Let k denote the size of the minimum cut in G. A key point that we will use later is that the size of the minimum cut in G′ is at least k. Why? If it were smaller, then this small cut in G′ would correspond to a cut in G that is also smaller than k, contradicting our assumption that the minimum cut in G has size k.

The idea of the algorithm is to contract n - 2 edges, after which two vertices remain. These two remaining vertices correspond to sets of vertices in the original graph; hence they correspond to a partition (S, S̄) of the original graph, and the edges remaining in this two-vertex graph correspond to δ(S) in the original input graph. So we output this cut δ(S) as our guess for the minimum cut of the original input graph. We will see momentarily why this is a good guess. But first we need to specify the algorithm more precisely; namely, we need to decide: which edge do we contract in each of the n - 2 steps?

Consider a minimum cut δ(S*) (if there are multiple min-cuts, just choose one arbitrarily). If we never contract an edge of this cut δ(S*), then, by Observation 2, this is the cut the algorithm ends up with.
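Definition 1 translates directly into a few lines of code for a multigraph stored as a list of vertex pairs. This is a minimal sketch; the representation and the name contract are our own choices, and the merged vertex reuses the name v instead of a fresh name z.

```python
# A minimal sketch of Definition 1 on a multigraph stored as a list of
# vertex pairs: merge w into v, redirect w's edges, drop self-loops.
def contract(edges, v, w):
    new_edges = []
    for (a, b) in edges:
        a = v if a == w else a       # steps 1-2: redirect endpoints of w to v
        b = v if b == w else b
        if a != b:                   # step 3: remove self-loops at the merged vertex
            new_edges.append((a, b))
    return new_edges
```

Note that contracting one copy of a parallel edge (v, w) removes all the parallel copies between v and w, since they all become self-loops.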
Since δs ) is of minimum size, it has relatively few edges, so if we contract a random edge intuitively we are unlikely to choose an edge of this min-cut. It turns out that by contracting random edges we will have a reasonable probability of preserving this min-cut δs ). Here is the formal algorithm. Karger s min-cut algorithm: Starting from the input graph G = V, E), repeat the following process until only two vertices remain: 1. Choose an edge e = v, w) uniformly at random from E. 2. Set G = G/e. These final two vertices correspond to sets S, S of vertices in the original graph, and the edges remaining in the final graph correspond to the edges in δs), a cut of the original graph. Note, in the algorithm we are considering a multigraph. Suppose we have a multigraph G with m edges, and a pair of vertices v and w have 5 edges between them in G. Then these 5 edges are distinct, and the chance of choosing to contract v, w) is 5/m. Analysis of Algorithm s Error Probability We need to show that Karger s min-cut algorithm has a reasonable chance of ending at a minimum cut of the original graph: 2

Lemma 3. Let δs ) be a cut of minimum size of the graph G = V, E). Pr Karger s algorithm ends with the cut δs ) ) 1 n. Proof. Fix some minimum cut δs ) in the original graph. Let δs) = k denote the size of this cut. Denote the edges we contract in the algorithm as {e 0, e 1,..., e n 3 }. Let G j denote the multigraph we have after j contractions. Hence, G 0 is the original input graph G, and G n 2 is the final graph. For j > 0 we have that: G j = G j 1 /e j. The algorithms succeeds if none of the contracted edges are in δs ), in other words, Karger s algorithm outputs δs ) e 0, e 1,..., e n 3 / δs ). Let us first consider the probability that the algorithm succeeds in the first contraction, i.e., that e 0 / δs ). Note, Pr e 0 / δs ) ) = 1 δs ) = 1 k E m. We need to lower bound this probability, and hence we need a lower bound on m. To be useful we need to lower bound m in terms of k. Since k is the size of the minimum cut, notice that every vertex must have degree at least k. Why? If some vertex x has degree < k then the set of edges incident to x is a cut smaller than k, which contradicts our initial assumption that the size of the minimum cut is k. Since G has minimum degree k, it has at least nk/2 edges since the total degrees is nk and this counts each edge twice). Hence, Pr e 0 / δs ) ) = 1 k m 1 k nk/2 = 1 2 n. 1) This same approach works for analyzing later stages of the algorithm. Recall, by Observation 2, every cut in an intermediate multigraph corresponds to a cut of the original graph. Hence, for all of the multigraphs considered by the algorithm the min-cut size is k. Otherwise, we have a vertex z in this multigraph with degree < k, and the edges incident this vertex correspond to a cut in this multigraph and also a cut in the original graph by Observation of size < k. So now let s compute the probability that the first two contractions were successful, which is Pr e 1, e 2 / δs ) ). 
Recall Bayes' formula for conditional probabilities, which says that:

  Pr(e_1 ∉ δ(S*) | e_0 ∉ δ(S*)) = Pr(e_1 ∉ δ(S*) and e_0 ∉ δ(S*)) / Pr(e_0 ∉ δ(S*)).

Rearranging, we have:

  Pr(e_0, e_1 ∉ δ(S*)) = Pr(e_0 ∉ δ(S*)) · Pr(e_1 ∉ δ(S*) | e_0 ∉ δ(S*)).    (2)

We bounded Pr(e_0 ∉ δ(S*)) in (1), and we can now bound Pr(e_1 ∉ δ(S*) | e_0 ∉ δ(S*)). The graph G_1 has n - 1 vertices, and we just argued that it has minimum degree ≥ k. Therefore it has at least (n-1)k/2 edges, and hence,

  Pr(e_1 ∉ δ(S*) | e_0 ∉ δ(S*)) ≥ 1 - k/((n-1)k/2) = 1 - 2/(n-1).    (3)

Now plugging (1) and (3) back into the product formula, we have:

  Pr(e_0, e_1 ∉ δ(S*)) ≥ (1 - 2/n)(1 - 2/(n-1)) = ((n-2)/n) · ((n-3)/(n-1)).

Now we can do the general computation. Recall that G_j denotes the multigraph after j contractions. The multigraph G_j contains n - j vertices, since we lose one vertex per contraction. Since we observed that G_j has minimum degree ≥ k, we know that G_j has at least (n-j)k/2 edges. We can now compute the probability that the algorithm successfully finds our specific minimum cut δ(S*). To do so, all of the contracted edges must avoid δ(S*):

  Pr(final graph = δ(S*))
    = Pr(e_0, e_1, ..., e_{n-3} ∉ δ(S*))
    = Pr(e_0 ∉ δ(S*)) · ∏_{j=1}^{n-3} Pr(e_j ∉ δ(S*) | e_0, ..., e_{j-1} ∉ δ(S*))
    ≥ ∏_{j=0}^{n-3} (1 - k/((n-j)k/2))
    = ∏_{j=0}^{n-3} (1 - 2/(n-j))
    = (n-2)/n · (n-3)/(n-1) · (n-4)/(n-2) · (n-5)/(n-3) ⋯ 3/5 · 2/4 · 1/3
    = 2/(n(n-1))
    = 1/(n choose 2).

This proves Lemma 3.

Boosting the Success Probability

We would like to boost the probability of success, so that the algorithm succeeds with high probability, say with probability at least 1 - 1/n^10. In fact, we can achieve any inverse-polynomial error probability by simply changing the constants in the following scheme. To boost the success probability, we run the above algorithm 10 n² ln(n) times and output the smallest of the cuts we see.

Before analyzing this scheme, a simple fact: recall the Taylor expansion exp(-x) = 1 - x + x²/2 ∓ ⋯ for x ≥ 0. Hence, for 0 ≤ x ≤ 1 we have exp(-x) ≥ 1 - x.

Fix a minimum cut δ(S*). Since 1/(n choose 2) ≥ 1/n², the probability that all 10 n² ln(n) runs of the algorithm fail to output δ(S*) is at most

  (1 - 1/n²)^{10 n² ln(n)} ≤ exp(-(1/n²) · 10 n² ln(n)) = exp(-10 ln(n)) = n^{-10}.

Therefore, the probability that we find a minimum cut is at least 1 - n^{-10}, as desired.
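The boosting bound can also be sanity-checked numerically (the particular test values of n are our own choice); the loop below verifies (1 - 1/n²)^{10 n² ln n} ≤ n^{-10} directly.

```python
import math

# Numeric sanity check of the boosting bound: if each run succeeds with
# probability at least 1/n^2, then after 10 n^2 ln(n) independent runs the
# failure probability is at most n^(-10), by the inequality 1 - x <= exp(-x).
for n in (5, 20, 100):
    p = 1.0 / n**2
    runs = math.ceil(10 * n**2 * math.log(n))
    assert (1 - p) ** runs <= n ** (-10.0)
```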

Running Time

It is easy to implement Karger's algorithm so that one run takes O(n²) time; that will be your first homework problem. Therefore, we have an O(n⁴ log n) time randomized algorithm with error probability 1/poly(n).

A faster version of this algorithm was devised by Karger and Stein [4]. The key idea comes from looking at the telescoping product: in the initial contractions it is very unlikely that we contract an edge of the minimum cut, but towards the end of the algorithm the probability of contracting an edge of the minimum cut grows. From the earlier analysis we have the following: for a fixed minimum cut δ(S*), the probability that this cut survives down to ℓ vertices is at least (ℓ choose 2)/(n choose 2). Thus, for ℓ = n/√2 the cut survives with probability roughly 1/2, so in expectation two trials should suffice.

Improved algorithm: Given a multigraph G with at least 2 vertices, repeat the following twice:

1. Run the original algorithm down to n/√2 vertices.
2. Recurse on the resulting graph.

Return the minimum of the cuts found in the two recursive calls.

We can easily bound the running time via the following recurrence (which is straightforward to solve, e.g., by the standard Master theorem):

  T(n) = 2 T(n/√2) + O(n²) = O(n² log n).

Since each trial succeeds down to n/√2 vertices with probability ≥ 1/2, we have the following recurrence for the probability of success, denoted P(n):

  P(n) ≥ 1 - (1 - (1/2) P(n/√2))².

This solves to P(n) = Ω(1/log n). By a boosting argument similar to the one used for the original algorithm, with O(log² n) runs of the improved algorithm the success probability is at least 1 - 1/poly(n). Therefore, in O(n² log³ n) total time, we can find the minimum cut with probability at least 1 - 1/poly(n).

Number of Minimum Cuts

Before finishing, we observe an interesting corollary of Karger's original algorithm.

Corollary 4. Any graph has at most O(n²) minimum cuts.

Proof. Consider the collection of all cuts of minimum size, and denote this collection by S_1, ..., S_j.
For each 1 ≤ i ≤ j, the probability that one run of Karger's algorithm outputs δ(S_i) is at least 1/(n choose 2), by Lemma 3. Since each run of Karger's algorithm outputs only one cut, these j events are disjoint, so their probabilities sum to at most 1, and hence j ≤ (n choose 2) = O(n²).

Note that we can also enumerate all of these cuts using the boosted version of the algorithm above.
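The Karger-Stein improvement described under Running Time can be sketched recursively as follows. This is a sketch under our own edge-list representation and helper names, assuming the input multigraph is connected; the base-case cutoff of 6 vertices is an arbitrary small constant.

```python
import math
import random

def contract_to(edges, target):
    """Contract uniformly random edges until `target` vertices remain
    (assumes the multigraph, given as a list of vertex pairs, is connected)."""
    edges = list(edges)
    vertices = {u for e in edges for u in e}
    while len(vertices) > target:
        v, w = random.choice(edges)
        edges = [(v if a == w else a, v if b == w else b) for (a, b) in edges]
        edges = [e for e in edges if e[0] != e[1]]   # drop self-loops
        vertices.discard(w)
    return edges

def karger_stein(edges):
    """Contract down to about n/sqrt(2) vertices, recurse twice,
    and return the smaller of the two cuts found."""
    n = len({u for e in edges for u in e})
    if n <= 6:                          # small base case: plain contraction
        return contract_to(edges, 2)
    target = math.ceil(1 + n / math.sqrt(2))
    return min((karger_stein(contract_to(edges, target)) for _ in range(2)),
               key=len)
```

On two copies of K4 joined by a single bridge (min cut 1), a modest number of independent runs finds the bridge with overwhelming probability, matching the Ω(1/log n) per-run success bound.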

References

[1] A. V. Goldberg and R. E. Tarjan. A new approach to the maximum-flow problem. Journal of the ACM, 35(4):921-940, 1988.

[2] J. Hao and J. B. Orlin. A faster algorithm for finding the minimum cut in a graph. In Proceedings of the 3rd Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), pages 165-174, 1992.

[3] D. R. Karger. Global min-cuts in RNC, and other ramifications of a simple min-cut algorithm. In Proceedings of the 4th Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), pages 21-30, 1993.

[4] D. R. Karger and C. Stein. A new approach to the minimum cut problem. Journal of the ACM, 43(4):601-640, 1996.