Bipartite Matching & the Hungarian Method


Last Revised: 15/9/7

These notes follow the formulation originally developed by Subhash Suri in http://www.cs.ucsb.edu/~suri/cs/matching.pdf

We previously saw how to use the Ford-Fulkerson Max-Flow algorithm to find Maximum-Size matchings in bipartite graphs. In this section we discuss how to find Maximum-Weight matchings in bipartite graphs, a situation in which Max-Flow is no longer applicable. The O(|V|³) algorithm presented is the Hungarian Algorithm due to Kuhn & Munkres.

Outline:
Review of Max-Bipartite Matching (seen earlier in the Max-Flow section)
Augmenting Paths
Feasible Labelings and Equality Graphs
The Hungarian Algorithm for Max-Weighted Bipartite Matching

Application: Max Bipartite Matching

A graph G = (V, E) is bipartite if there exists a partition V = X ∪ Y with X ∩ Y = ∅ and E ⊆ X × Y.

A Matching is a subset M ⊆ E such that for every v ∈ V at most one edge in M is incident upon v. The size of a matching is |M|, the number of edges in M.

A Maximum Matching is a matching M such that every other matching M' satisfies |M'| ≤ |M|.

Problem: Given a bipartite graph G, find a maximum matching.
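As a quick sanity check on the definition, here is a minimal sketch (not part of the original notes; the function name and edge representation are illustrative assumptions) that tests whether a set of edges is a matching.

```python
# Hypothetical helper: edges are (x, y) pairs, and M is a matching
# iff no vertex appears in more than one edge of M.
def is_matching(edges):
    seen = set()
    for x, y in edges:
        if x in seen or y in seen:
            return False          # some vertex is covered twice
        seen.add(x)
        seen.add(y)
    return True

# Example: {("x1","y2"), ("x2","y1")} is a matching; adding ("x1","y1") breaks it.
print(is_matching({("x1", "y2"), ("x2", "y1")}))                 # True
print(is_matching({("x1", "y2"), ("x2", "y1"), ("x1", "y1")}))   # False
```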

[Figure: a bipartite graph with matchings; left (L) and right (R) vertex columns are shown.]

We now consider weighted bipartite graphs, i.e., graphs in which each edge (i, j) has a weight, or value, w(i, j). The weight of a matching M is the sum of the weights of its edges, w(M) = Σ_{e ∈ M} w(e).

Problem: Given a weighted bipartite graph G, find a maximum weight matching.

[Figure: a small weighted bipartite graph on vertex sets X and Y.]

Note that, without loss of generality, by adding edges of weight 0 we may assume that G is a complete weighted graph.
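To make the weight of a matching and the weight-0 completion concrete, here is a small sketch; the dict representation and names are my own assumptions, and the completion is harmless under the usual assumption of nonnegative weights.

```python
# Hypothetical sketch: w maps existing edges (x, y) to nonnegative weights.
def matching_weight(w, M):
    return sum(w[e] for e in M)

def complete_weights(X, Y, w):
    # add weight-0 edges for every missing pair, making the graph complete
    return {(x, y): w.get((x, y), 0) for x in X for y in Y}
```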

Alternating Paths:

[Figure: a bipartite graph with matching M (red edges); the graph is drawn twice, once highlighting an alternating path.]

Let M be a matching of G. A vertex v is matched if it is an endpoint of an edge in M; otherwise v is free. In the figure, the vertices covered by the red edges are matched; all other vertices are free.

A path is alternating if its edges alternate between M and E − M; the highlighted path in the figure is alternating. An alternating path is augmenting if both of its endpoints are free.

An augmenting path has one less edge in M than in E − M; replacing the M edges by the E − M ones increments the size of the matching.
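The "replace the M edges by the E − M ones" step is just a symmetric difference of edge sets; a tiny hypothetical sketch (not from the notes):

```python
# Edges are (x, y) tuples; the path's edges alternate non-matching / matching /
# ... / non-matching, so the symmetric difference with M has one more edge than M.
def augment(M, path_edges):
    return M ^ set(path_edges)

# e.g. M = {("x2","y1")}, augmenting path x1 - y1 - x2 - y2:
# augment(M, [("x1","y1"), ("x2","y1"), ("x2","y2")]) == {("x1","y1"), ("x2","y2")}
```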

Alternating Trees:

[Figure: an alternating tree inside a complete bipartite graph; the matching M is drawn in red.]

An alternating tree is a tree rooted at some free vertex v in which every path is an alternating path.

Note: The diagram assumes a complete bipartite graph; the matching M is the red edges. The root is y5.

The Assignment Problem:

Let G be a (complete) weighted bipartite graph. The Assignment problem is to find a max-weight matching in G.

A Perfect Matching is a matching M in which every vertex is incident to some edge in M. A max-weight matching is perfect.

The Max-Flow reduction doesn't work in the presence of weights. The algorithm we will see is called the Hungarian Algorithm.

Feasible Labelings & Equality Graphs

[Figure: a feasible labeling l and its Equality Graph G_l.]

A vertex labeling is a function l : V → R.

A feasible labeling is one such that l(x) + l(y) ≥ w(x, y) for all x ∈ X, y ∈ Y.

The Equality Graph (with respect to l) is G_l = (V, E_l) where E_l = {(x, y) : l(x) + l(y) = w(x, y)}.
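A small sketch of these two definitions, assuming labels l and weights w are stored in dicts keyed by vertex and by (x, y) pair; the helper names are mine.

```python
# Hypothetical helpers mirroring the definitions above.
def is_feasible(l, w, X, Y):
    # l is feasible iff l(x) + l(y) >= w(x, y) for every pair
    return all(l[x] + l[y] >= w[(x, y)] for x in X for y in Y)

def equality_graph_edges(l, w, X, Y):
    # E_l: the edges on which the labeling is tight
    return {(x, y) for x in X for y in Y if l[x] + l[y] == w[(x, y)]}
```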

Theorem: If l is feasible and M is a perfect matching in E_l, then M is a max-weight matching.

Proof: Denote an edge e ∈ E by e = (e_x, e_y). Let M' be any perfect matching in G (not necessarily in E_l). Since every v ∈ V is covered exactly once by M', we have

w(M') = Σ_{e ∈ M'} w(e) ≤ Σ_{e ∈ M'} (l(e_x) + l(e_y)) = Σ_{v ∈ V} l(v),

so Σ_{v ∈ V} l(v) is an upper bound on the cost of any perfect matching.

Now let M be a perfect matching in E_l. Then

w(M) = Σ_{e ∈ M} w(e) = Σ_{v ∈ V} l(v).

So w(M') ≤ w(M) and M is optimal.

Theorem [Kuhn-Munkres]: If l is feasible and M is a perfect matching in E_l, then M is a max-weight matching.

The KM theorem transforms the problem from an optimization problem of finding a max-weight matching into a combinatorial one of finding a perfect matching. It "combinatorializes" the weights. This is a classic technique in combinatorial optimization.

Notice that the proof of the KM theorem says that for any matching M and any feasible labeling l we have

w(M) ≤ Σ_{v ∈ V} l(v).

This has very strong echoes of the max-flow min-cut theorem.

Our algorithm will be to start with any feasible labeling l and some matching M ⊆ E_l, maintaining an alternating tree T ⊆ E_l. While M is not perfect, repeat the following:

1. Find an augmenting path for M in E_l; this increases the size of M. Reset T to be one free vertex.
2. If no augmenting path exists, improve l to l' such that M, T ⊆ E_l'. Add one edge of E_l' to T, keeping it an alternating tree. Go to 1.

Note that in each step of the loop we will either be increasing the size of M or of T, so this process must terminate.

Furthermore, when the process terminates, M will be a perfect matching in E_l for some feasible labeling l. So, by the Kuhn-Munkres theorem, M will be a max-weight matching.

Finding an Initial Feasible Labelling

[Figure: a small weighted bipartite graph and the equality graph induced by the initial labelling.]

Finding an initial feasible labelling is simple. Just use:

for all y ∈ Y, l(y) = 0;    for all x ∈ X, l(x) = max_{y ∈ Y} w(x, y).

With this labelling it is obvious that for all x ∈ X, y ∈ Y, w(x, y) ≤ l(x) + l(y).
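This initial labelling translates directly into code; a minimal sketch under the same dict representation as before (the function name is an assumption).

```python
# Hypothetical helper: l(y) = 0 for y in Y, l(x) = max_y w(x, y) for x in X.
def initial_labeling(X, Y, w):
    l = {y: 0 for y in Y}
    for x in X:
        l[x] = max(w[(x, y)] for y in Y)
    return l
```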

Improving Labellings

Let l be a feasible labeling. Define the neighbourhood of a vertex u ∈ V and of a set S ⊆ V (with respect to the equality graph) to be

N_l(u) = {v : (u, v) ∈ E_l},    N_l(S) = ∪_{u ∈ S} N_l(u).

Lemma: Let S ⊆ X and T = N_l(S) ≠ Y. Set

α_l = min_{x ∈ S, y ∉ T} {l(x) + l(y) − w(x, y)}

and

l'(v) = l(v) − α_l  if v ∈ S,
l'(v) = l(v) + α_l  if v ∈ T,
l'(v) = l(v)        otherwise.

Then l' is a feasible labeling and

(i) if (x, y) ∈ E_l for x ∈ S, y ∈ T, then (x, y) ∈ E_l';
(ii) if (x, y) ∈ E_l for x ∉ S, y ∉ T, then (x, y) ∈ E_l';
(iii) for some x ∈ S, y ∉ T we have (x, y) ∉ E_l but (x, y) ∈ E_l'.
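A minimal sketch of the label update in the lemma (dict representation as above; the name improve_labeling is mine): compute α_l over S × (Y − T), then shift the labels.

```python
# Hypothetical helper implementing l -> l' from the lemma.
def improve_labeling(l, w, S, T, Y):
    alpha = min(l[x] + l[y] - w[(x, y)] for x in S for y in Y if y not in T)
    return {v: l[v] - alpha if v in S else (l[v] + alpha if v in T else l[v])
            for v in l}
```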

The Hungarian Method

1. Generate an initial labelling l and a matching M in E_l.
2. If M is perfect, stop. Otherwise pick a free vertex u ∈ X. Set S = {u}, T = ∅.
   (Note: S ∪ T will be the vertices of the alternating tree.)
3. If N_l(S) = T, update the labels (forcing N_l(S) ≠ T):
   α_l = min_{x ∈ S, y ∉ T} {l(x) + l(y) − w(x, y)}
   l'(v) = l(v) − α_l if v ∈ S;  l(v) + α_l if v ∈ T;  l(v) otherwise.
4. If N_l(S) ≠ T, pick y ∈ N_l(S) − T.
   If y is free, u → y is an augmenting path. Augment M and go to 2.
   If y is matched, say to z, extend the alternating tree: S = S ∪ {z}, T = T ∪ {y}. Go to 3.
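Below is a compact sketch of the method just described, assuming the input is an n × n weight matrix w for a complete bipartite graph (rows are x1..xn, columns y1..yn, 0-indexed in the code). It is the simple variant without the slack bookkeeping discussed in the Complexity section, so it runs in roughly O(|V|⁴); all names are illustrative, not the notes' own code.

```python
# Hypothetical sketch of the Hungarian Method (max-weight perfect matching).
def hungarian(w):
    n = len(w)
    lx = [max(row) for row in w]      # step 1: l(x) = max_y w(x,y)
    ly = [0] * n                      #         l(y) = 0
    match_x = [None] * n              # match_x[i] = j  iff  (x_i, y_j) in M
    match_y = [None] * n              # match_y[j] = i  iff  (x_i, y_j) in M

    def tight(i, j):                  # is (x_i, y_j) an equality-graph edge?
        return lx[i] + ly[j] == w[i][j]

    for u in range(n):                # step 2: one phase per free vertex x_u
        S, T = {u}, set()
        parent = {}                   # parent[j] = i : tree edge (x_i, y_j)
        while True:
            # equality-graph neighbours of S that are not yet in T
            frontier = [(i, j) for i in S for j in range(n)
                        if j not in T and tight(i, j)]
            if not frontier:
                # step 3: N_l(S) = T, so improve the labels by alpha_l
                alpha = min(lx[i] + ly[j] - w[i][j]
                            for i in S for j in range(n) if j not in T)
                for i in S:
                    lx[i] -= alpha
                for j in T:
                    ly[j] += alpha
                continue
            # step 4: pick y in N_l(S) - T
            i, j = frontier[0]
            parent[j] = i
            if match_y[j] is None:
                # y_j is free: augment M along the alternating path back to x_u
                while j is not None:
                    i = parent[j]
                    nxt = match_x[i]              # old mate of x_i (None at the root)
                    match_x[i], match_y[j] = j, i
                    j = nxt
                break                             # phase done, |M| grew by 1
            # y_j is matched: grow the alternating tree
            S.add(match_y[j])
            T.add(j)
    return match_x, lx, ly
```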

[Figure: three panels showing the Original Graph, the Equality Graph with the current matching, and the Alternating Tree.]

The initial graph, the trivial labelling and the associated Equality Graph are shown.

Initial matching: (x3, y1), (x2, y2). S = {x1}, T = ∅.

Since N_l(S) ≠ T, do step 4: choose y2 ∈ N_l(S) − T.

y2 is matched, so grow the tree by adding (y2, x2), i.e., S = {x1, x2}, T = {y2}.

At this point N_l(S) = T, so go to step 3.

[Figure: three panels showing the Original Graph, the old E_l and M, and the new Equality Graph.]

S = {x1, x2}, T = {y2} and N_l(S) = T, so calculate α_l:

α_l = min_{x ∈ S, y ∉ T} { 6 + 0 − 1 (x1, y1),  6 + 0 − 0 (x1, y3),  8 + 0 − 0 (x2, y1),  8 + 0 − 6 (x2, y3) } = 2.

Reduce the labels of S by 2; increase the labels of T by 2.

Now N_l(S) = {y2, y3} ≠ {y2} = T.

[Figure: three panels showing the original E_l and M, the new alternating tree, and the new matching M.]

S = {x1, x2}, N_l(S) = {y2, y3}, T = {y2}.

Choose y3 ∈ N_l(S) − T and add it to T. y3 is not matched in M, so we have just found an alternating path x1, y2, x2, y3 with two free endpoints. We can therefore augment M to get a larger matching in the new equality graph. This matching is perfect, so it must be optimal.

Note that the matching (x1, y2), (x2, y3), (x3, y1) has cost 6 + 6 + 4 = 16, which is exactly the sum of the labels in our final feasible labelling.
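For reference, running the hungarian() sketch from the previous section on this example reproduces the same matching, final labels and label sum. The exact weight matrix below is my reconstruction from the α_l computation and the final cost of 16, so treat it as an assumption rather than the original figure.

```python
# Reconstructed (assumed) weights for the 3x3 example in these slides.
w = [[1, 6, 0],    # x1: w(x1,y1)=1, w(x1,y2)=6, w(x1,y3)=0
     [0, 8, 6],    # x2
     [4, 0, 1]]    # x3: only w(x3,y1)=4 matters for the final matching

match_x, lx, ly = hungarian(w)
print(match_x)                                        # [1, 2, 0]: x1-y2, x2-y3, x3-y1
print(sum(w[i][j] for i, j in enumerate(match_x)))    # 16
print(sum(lx) + sum(ly))                              # 16, equal to the matching weight
```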

Correctness:

We can always take the trivial labelling and the empty matching M = ∅ to start the algorithm.

If N_l(S) = T, we saw that we could always update the labels to create a new feasible labeling l'. The lemma on the Improving Labellings page guarantees that all E_l edges inside S × T and inside (X − S) × (Y − T) will also be in E_l'. In particular, this guarantees (why?) that the current M remains in E_l', as does the alternating tree built so far.

Note: The lemma requires that T ≠ Y, but this is trivially true since |T| = |S| − 1, so |T| < |Y|.

If N_l(S) ≠ T, we can, by definition, always grow the alternating tree by choosing some x ∈ S and y ∉ T such that (x, y) ∈ E_l. Note that at some point the y chosen must be free, in which case we augment M.

So the algorithm always terminates. Since M is a perfect matching in E_l when the algorithm terminates, it is optimal by the Kuhn-Munkres theorem.

Complexity

In each phase of the algorithm |M| increases by 1, so there are at most |V| phases.

How much work needs to be done in each phase? In an implementation, for each y ∉ T keep track of

slack_y = min_{x ∈ S} {l(x) + l(y) − w(x, y)}.

Initializing all slacks at the beginning of a phase takes O(|V|) time.

In step 4 we must update all slacks when a vertex moves into S. This takes O(|V|) time; only |V| vertices can be moved into S, giving O(|V|²) time per phase.

In step 3, α_l = min_{y ∉ T} slack_y and can therefore be calculated in O(|V|) time from the slacks. This is done at most |V| times per phase (why?), so it only takes O(|V|²) time per phase.

After calculating α_l we must update all slacks. This can be done in O(|V|) time by setting slack_y = slack_y − α_l for all y ∉ T. Since this is only done O(|V|) times, the total time per phase is O(|V|²).
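The slack bookkeeping can be sketched as two small update routines (not wired into the earlier hungarian() sketch; the names are mine): one run when a vertex enters S, one when the labels change by α_l.

```python
# Hypothetical sketch of the slack updates described above, for an n x n weight
# matrix w, label arrays lx, ly, and slack[y] = min_{x in S} lx[x] + ly[y] - w[x][y].

def on_vertex_added_to_S(x, T, slack, lx, ly, w, n):
    # O(|V|): the new x in S may lower the slack of every y not in T
    for y in range(n):
        if y not in T:
            slack[y] = min(slack[y], lx[x] + ly[y] - w[x][y])

def on_labels_improved(alpha, S, T, slack, lx, ly, n):
    # O(|V|): alpha = min over y not in T of slack[y]; apply the label shift
    # and decrease the remaining slacks accordingly
    for x in S:
        lx[x] -= alpha
    for y in T:
        ly[y] += alpha
    for y in range(n):
        if y not in T:
            slack[y] -= alpha
```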

There are |V| phases and O(|V|²) work per phase, so the total running time is O(|V|³).