Advanced Algorithms Class Notes for Monday, October 23, 2012 Min Ye, Mingfu Shao, and Bernard Moret


Greedy Algorithms (continued)

The best known application where the greedy algorithm is optimal is surely the construction of a minimum spanning tree (MST). Most of you are familiar with Prim's and Kruskal's algorithms, but they are just two of a rather large family of greedy algorithms for the MST problem. We look at two more algorithms in that family and then proceed to prove that any algorithm in that family indeed returns a minimum spanning tree.

Greedy algorithms for the MST problem

Greedy algorithms use only a fraction of the information available about the problem. Bottom-up greedy methods build solutions piece by piece (starting from the empty set) by selecting, among the remaining pieces, the one that optimizes the value of the partial solution, while ensuring that the subset selected so far can be extended into a feasible solution. Top-down greedy methods (much less common) build solutions from the full set of pieces by removing one piece at a time, selecting a piece whose removal optimizes the value of the remaining collection while ensuring that this collection continues to contain feasible solutions. Thus in both cases, and indeed for any greedy algorithm, the idea is to produce the largest immediate gain while maintaining feasibility.

In the case of the MST problem, we are given an undirected graph G = (V, E) and a length (distance/weight/etc.) for each edge, d : E → R. Our aim is to find a spanning tree (a tree connecting all vertices) of minimum total weight.

MSTs have two important properties, usually called the cycle property and the cut property. The cycle property says that, for any cycle X in the graph, X ⊆ E, if the weight of an edge e ∈ X is strictly larger than the weight of every other edge of X, then this edge e cannot belong to any MST of the graph. (Phrased slightly differently, for any cycle X in the graph, if the weight of an edge e ∈ X is larger than or equal to the weight of every other edge of X, then there exists an MST that does not contain e.) Recall that a cut in a graph is a partition of the vertices of the graph into two non-empty subsets, Y = {S, V \ S}, with S ⊂ V and S ≠ ∅; this partition induces a set of cut edges, the cut-set C_Y = {{u, v} ∈ E : u ∈ S, v ∈ V \ S}. In a connected graph (and we are always given a connected graph for the MST problem), there is a bijection between cuts and cut-sets, so that we can specify one or the other. The cut property says that, for any cut-set C in graph G, if the weight of an edge e ∈ C is strictly smaller than the weight of every other edge of C, then this edge belongs to all MSTs of this graph. (Phrased slightly differently, for any cut-set C_Y in graph G, if the weight of an edge e ∈ C_Y is smaller than or equal to the weight of every other edge of C_Y, then there exists an MST that contains e.)
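To make the cut property concrete, here is a minimal Python sketch (the representation and function name are ours, not anything standard): the graph is a list of (u, v, weight) triples and a cut is specified by the vertex subset S; the function returns a lightest edge crossing the cut, which, whenever it is strictly lighter than every other crossing edge, must belong to every MST.

# Sketch: undirected graph as a list of weighted edges (u, v, w);
# a cut is given by the vertex subset S.
def lightest_cut_edge(edges, S):
    """Return a minimum-weight edge crossing the cut (S, V - S), or None."""
    crossing = [(w, u, v) for (u, v, w) in edges
                if (u in S) != (v in S)]      # exactly one endpoint inside S
    if not crossing:
        return None
    w, u, v = min(crossing)
    return (u, v, w)

# Example: in the triangle a-b-c, the cut S = {a} is crossed by ab and ac;
# ab is strictly lighter, so the cut property puts it in every MST.
edges = [("a", "b", 1), ("b", "c", 2), ("a", "c", 3)]
print(lightest_cut_edge(edges, {"a"}))        # ('a', 'b', 1)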

Now a bottom-up greedy method for the MST starts from an empty set and adds one piece at a time (an edge or a vertex, although the distinction is somewhat artificial), subject to not creating cycles, until a tree is built (or until there remains no candidate piece; the two are equivalent). In contrast, a top-down greedy method for the MST starts from the entire graph and removes one edge at a time, subject to not disconnecting the graph, until no more edges can be removed. In each case, the choice is made on the basis of the contribution made by the chosen edge (or vertex) to the current collection, a purely local decision.

A top-down approach is the Reverse-Delete algorithm, first mentioned by Kruskal (but not to be confused with Kruskal's algorithm, which is, of course, a bottom-up approach). The Reverse-Delete algorithm starts with the original graph and deletes edges from it. The algorithm works as follows. Start with graph G, which contains a list of edges E. Sort the edges in E in decreasing order by weight, then go through the edges one by one, from largest weight down to smallest weight. For each edge in turn, check whether deleting the edge would disconnect the graph; if not, remove that edge. The proof of optimality for this algorithm is quite straightforward, using the cycle property. If the graph G has no cycles, then it is a tree and thus it is its own unique (and hence also optimal) spanning tree. As long as there remains a cycle in the graph, we do not have a tree and must remove at least one edge. Reverse-Delete removes the remaining edge of largest weight; this edge must lie on at least one cycle (since its removal does not disconnect the graph), and it is necessarily an edge of largest weight in any cycle in which it appears; thus the cycle property ensures that there exists at least one MST that does not include it.

We can view the bottom-up methods as proceeding by coalescing equivalence classes. An equivalence class consists of a set of vertices with an associated set of edges that form a minimum spanning tree for the vertices in the class. Initially, each vertex is the sole element of its equivalence class and its associated set of edges is empty. When the algorithm terminates, only one equivalence class remains and the associated set of edges defines a minimum spanning tree. At each step of the algorithm, we select an edge with an endpoint in each of two equivalence classes and coalesce these two classes, thereby combining two trees into one larger tree. (Edges with both endpoints in the same equivalence class are permanently excluded, since their selection would lead to a cycle.) In order to minimize the increase in the value of the objective function, greediness dictates that the allowable edge of least cost be chosen next. This choice can be made with or without additional constraints. At one extreme we can apply no additional constraint and always choose the shortest edge that combines two spanning trees into a larger spanning tree; at the other extreme we can designate a special equivalence class which must be involved in any merge operation. The first approach can be viewed as selecting edges; it is known as Kruskal's algorithm, after J. B. Kruskal, who first presented it in 1956. The second is best viewed (at least for programming purposes) as selecting vertices (adding them one by one to a single partial spanning tree) and is known as Prim's algorithm, after R. C. Prim, who presented it in 1957. A third, more general, approach is in fact the oldest algorithm proposed for the MST and among the oldest algorithms formally defined in Computer Science; it is due to O. Borůvka, who first published it in 1926 as a method of constructing an efficient electrical network to serve the region of Moravia.
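The Reverse-Delete procedure just described translates almost literally into code. The following is a minimal sketch (our own naming; the connectivity test is a plain depth-first search, so this is meant as an illustration rather than an efficient implementation), again with edges given as (u, v, weight) triples.

from collections import defaultdict

def is_connected(vertices, edges):
    """Check by depth-first search whether (vertices, edges) is connected."""
    adj = defaultdict(set)
    for u, v, _ in edges:
        adj[u].add(v)
        adj[v].add(u)
    vertices = set(vertices)
    if not vertices:
        return True
    start = next(iter(vertices))
    seen, stack = {start}, [start]
    while stack:
        for v in adj[stack.pop()]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return seen == vertices

def reverse_delete_mst(vertices, edges):
    """Scan edges from heaviest to lightest; delete an edge whenever the graph stays connected."""
    result = sorted(edges, key=lambda e: e[2], reverse=True)
    i = 0
    while i < len(result):
        candidate = result[:i] + result[i + 1:]
        if is_connected(vertices, candidate):
            result = candidate        # the edge lies on a cycle: safe to remove
        else:
            i += 1                    # removal would disconnect the graph: keep the edge
    return result                     # the edges of a minimum spanning tree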

Borůvka's algorithm considers all current equivalence classes (spanning trees for subsets of vertices) at once and joins each to its closest neighbor (where the distance is defined as the length of the shortest edge with one endpoint in each of the two equivalence classes), subject to not creating a cycle. We can prove the correctness of all three coalescence-based algorithms (Kruskal's, Prim's, and Borůvka's) at once by proving the correctness of the more general algorithm.

Theorem 1. If, at each step of the algorithm, an arbitrary equivalence class, T_i, is selected, but the edge selected for inclusion is the smallest that has exactly one endpoint in T_i, then the final tree that results is a minimum spanning tree.

Proof. As G is connected, there is always at least one allowable edge at each iteration. As each iteration can proceed, irrespective of our choice of T_i, and as an iteration decreases the number of equivalence classes by one, the algorithm terminates. The proof is by induction, though we present it somewhat informally. Let T_A be a spanning tree produced by the above algorithm and let T_M be a minimum spanning tree. We give a procedure for transforming T_M into T_A. We will form a sequence of trees, each slightly different from its predecessor, with the property that the sums of the lengths of the edges in successive trees are the same. Since T_A will be at the far end of the sequence of transformations, it must also be a minimum spanning tree.

Label the edges of T_A by the iteration on which they entered the tree. Let e_i be the edge of lowest index that is present in T_A but not in T_M. The addition of e_i to T_M forms a cycle. Note that the length of e_i is greater than or equal to that of every other edge in the cycle; otherwise, T_M would not be a minimum spanning tree, because breaking any edge in the cycle would produce a spanning tree, and breaking a longer edge, if one such existed, would reduce the total cost. Now, when e_i was added to the forest of trees that eventually became T_A, it connected an arbitrarily chosen tree, T_i, to some other tree. Traverse the cycle in T_M starting from the endpoint of e_i that was in T_i and going in the direction that does not lead to an immediate traversal of e_i. At some point in this traversal, we first encounter an edge with exactly one endpoint in T_i. It might be the first edge we encounter, or it might be many edges into the traversal, but such an edge must exist since the other endpoint of e_i is not in T_i. Furthermore, this edge, call it ê, cannot be e_i. Now the length of e_i cannot exceed that of ê, because ê was an allowable edge but was not selected; thus the two edges have equal length. We replace ê with e_i in T_M: the resulting tree has the same total length. Note that ê may or may not be an edge of T_A, but if it is, then its index is greater than i, so that our new tree now first differs from T_A at some index greater than i. Replacing T_M with this new minimum spanning tree, we continue this process until there are no differences, i.e., until T_M has been transformed into T_A.
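The coalescing scheme of Theorem 1 is easy to express with a union-find (disjoint-set) structure over the equivalence classes. The sketch below (our own naming) instantiates it in its unconstrained form, which is exactly Kruskal's algorithm: repeatedly take the cheapest edge whose endpoints lie in two different classes and merge those classes. Prim's and Borůvka's algorithms differ only in how the class and its cheapest outgoing edge are chosen, not in this coalescing step.

def kruskal_mst(vertices, edges):
    """Kruskal's algorithm as an instance of the coalescing scheme: each vertex
    starts as its own equivalence class; the cheapest edge joining two distinct
    classes is selected and the two classes are merged."""
    parent = {v: v for v in vertices}

    def find(v):                        # root of v's equivalence class (with path halving)
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    tree = []
    for u, v, w in sorted(edges, key=lambda e: e[2]):
        ru, rv = find(u), find(v)
        if ru != rv:                    # endpoints in different classes, so no cycle is created
            parent[ru] = rv             # coalesce the two classes
            tree.append((u, v, w))
    return tree

# Example: the triangle used earlier yields its two light edges.
print(kruskal_mst({"a", "b", "c"}, [("a", "b", 1), ("b", "c", 2), ("a", "c", 3)]))
# [('a', 'b', 1), ('b', 'c', 2)]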

Iterative improvement methods

Our next class of methods are those that start with a complete solution structure and proceed to refine it through successive iterations; at each iteration, an improvement is made on a local basis. This is a more powerful approach than the greedy approach: whereas a greedy approach never alters any choice it has already made, an iterative improvement approach has no problem doing so. Moreover, the number of iterations is not fixed; indeed, bounding the number of iterations is the major problem in analyzing an iterative improvement algorithm, whereas the number of steps taken by a greedy algorithm is simply the number of elements included in the solution.

Matching and flow

Matching and network flow are the two most important problems for which an iterative improvement method delivers optimal solutions. We first consider the maximum matching problem in bipartite graphs.

Maximum bipartite matching

A matching in a graph is a subset of edges that do not share any endpoint; a maximum matching is just a matching of maximum cardinality. A graph is said to be bipartite if its vertices can be partitioned into two sets in such a way that every edge of the graph has one endpoint in one set and the other endpoint in the other set; the vertices of each set form an independent set. Matching in bipartite graphs is one of the fundamental optimization problems, as it is used for assignment problems, that is, problems where one wants to find the optimal way to assign, say, work crews to jobs; such problems are solved every day on construction sites, in factories, in airlines and railways, as well as in job scheduling for computing systems. We describe an iterative improvement algorithm for this problem: an algorithm that refines the current solution through local changes, using an approach that can be repeated many times, each time improving the quality of the solution.

In the problem of maximum bipartite matching, in order to improve an existing matching, we must start by identifying unmatched vertices of degree at least 1; we need at least one such vertex on each side of the bipartite graph. Consider the trivial 4-vertex bipartite graph with vertex set {a, b, 1, 2} and edge set {{a,1}, {a,2}, {b,1}}, with current matching M = {{a,1}}. It is clear that there exists a larger matching, namely M' = {{a,2}, {b,1}}. Note that, in order to transform M into M', the set of matched vertices simply gains two new members, but the set of matched edges, while larger by one, may have nothing in common with the previous set. In this trivial example, we could have started our search at vertex b, which has just one neighbor, vertex 1; but vertex 1 was already matched, so we had to unmatch it (undoing a previous decision), making its former mate, vertex a, an unmatched vertex; following the matched edge back to vertex a, we then found that a had an unmatched neighbor, vertex 2. We thus identified a path of three edges, the first and the last unmatched, the middle one matched; by flipping the status of each edge, from matched to unmatched and vice versa, we replaced a path with one matched edge by the same path, but with two matched edges. Let us formally define what this type of path is.

Let G = (V, E) be a graph and M be a matching. An alternating path with respect to M is a path such that every other edge on the path is in M, while the others are in E \ M. If, in addition, the path is of odd length and the first and last vertices on the path are unmatched, then the alternating path is called an augmenting path. The reason it is called an augmenting path is that we can use it to augment the size of the matching: whereas an alternating path may have the same number of edges in M and in E \ M, or one more in M, or one more in E \ M, an augmenting path must have one more edge in E \ M than in M. Moreover, because of the definition of matchings, it is safe to flip the status of every edge in an augmenting path from matched to unmatched, and vice versa: none of the vertices on the augmenting path can have been the endpoint of a matched edge other than those already on the path.

Augmenting paths are thus the tool we needed to design an iterative improvement algorithm: in general terms, we start with an arbitrary matching (possibly an empty one), then we search for an augmenting path in the graph; if one is found, we augment the matching by flipping the status of all edges along the augmenting path; if none is found, we stop. The obvious question, at this point, is whether the absence of any augmenting path indicates just a local maximum or a global one. The answer is that it indicates a global maximum: if G has no augmenting path with respect to M, then M is a maximum matching; it is optimal. We phrase this result positively.

Theorem 2. Let G be a graph, M' a maximum matching for G, and M any matching for G such that |M| < |M'|. Then G has an augmenting path with respect to M.

This result is due to French mathematician Claude Berge and so is known as Berge's theorem. The proof is deceptively simple, but note that it is nonconstructive.

Proof. Let M ⊕ M' denote the symmetric difference of M and M', i.e., M ⊕ M' = (M ∪ M') − (M ∩ M'), and consider the subgraph G' = (V, M ⊕ M'). All vertices of G' have degree two or less, because they have at most one incident edge from each of M and M'; moreover, every connected component of G' is one of: (i) a single vertex; (ii) a cycle of even length, with edges drawn alternately from M and M'; or (iii) a path with edges drawn alternately from M and M'. As the cardinality of M' exceeds that of M, there exists at least one path component composed of alternating edges from M and M', with more edges from M' than from M. This path must begin and end with edges from M', and its endpoints are unmatched in M, because the path is a connected component of G'; hence this path is an augmenting path with respect to M.

Berge's theorem shows that the use of augmenting paths not only enables us to improve on the quality of an initial solution, it enables us to obtain an optimal solution. Note that the definitions of alternating paths and augmenting paths hold just as well for nonbipartite graphs as for bipartite ones; and Berge's theorem does too. The next step is to develop an algorithm for finding augmenting paths, and this is where the difference between bipartite and nonbipartite graphs will show.
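The augmentation step itself is a simple flip. Here is a minimal sketch, assuming the matching is stored as a set of frozenset edges and the augmenting path is given as a list of vertices (both representations are ours), checked on the 4-vertex example from the previous page.

def augment(matching, path):
    """Flip the status of every edge along an augmenting path (a vertex list):
    matched edges leave the matching, unmatched edges enter it."""
    result = set(matching)
    for u, v in zip(path, path[1:]):
        edge = frozenset((u, v))
        if edge in result:
            result.remove(edge)
        else:
            result.add(edge)
    return result

# Matching {{a,1}} with augmenting path 2, a, 1, b: the result has the two edges {a,2} and {1,b}.
M = {frozenset(("a", 1))}
print(augment(M, [2, "a", 1, "b"]))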

In a bipartite graph, any augmenting path begins on one side of the graph and ends on the other. Thus a search algorithm can simply start at any unmatched vertex on one side of the graph, say the left side, and traverse any edge to the other side. If the endpoint on the right side is also unmatched, then an augmenting path, consisting of a single unmatched edge, has been found. If the other endpoint is matched, then the algorithm traverses that matched edge back to the left side and follows any unmatched edge, if one exists, to an unvisited vertex on the right side. The process is repeated until either an augmenting path is found or a dead end on the left side is reached. Unmatched edges are always traversed from the left side to the right side and matched edges in the opposite direction. If a dead end is reached, we must explore other paths until we find an augmenting path or run out of possibilities.

In developing an augmenting path, choices arise in only two places: in selecting an initial unmatched vertex and in selecting an unmatched edge out of a vertex on the left side. In order to examine all possibilities for augmenting paths, we need to explore these choices in some systematic way; because all augmenting paths make exactly the same contribution of one additional matched edge, we should search for the shortest augmenting paths. Thus we use a breadth-first search of the graph, starting at each unmatched vertex on the left side. If any of the current active vertices (the frontier of the BFS, which always consists of vertices on the left side) has an unmatched neighbor on the right, we are done. Otherwise, from each neighbor on the right, we follow the matched edge of which it is an endpoint back to a vertex on the left and repeat the process. Thus the BFS increases path lengths by 2 at each iteration, because the move back to the left along matched edges is forced. The BFS takes O(|E|) time, as it cannot look at an edge more than twice (once from each end); as the number of augmenting paths we may find is in O(|V|), the running time of this BFS augmenting strategy is O(|V| |E|). Since the input size is Θ(|V| + |E|), the time taken is more than linear, but no more than quadratic, in the size of the input.
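A minimal sketch of this basic strategy follows (the adjacency dictionary maps each left vertex to its right neighbors; all names and the representation are ours). One breadth-first search returns a single shortest augmenting path as a vertex list; the driver flips it and repeats until no augmenting path remains, which by Berge's theorem yields a maximum matching in O(|V| |E|) time.

from collections import deque

def shortest_augmenting_path(adj, match_left, match_right):
    """BFS over alternating paths: unmatched edges are taken left to right,
    matched edges right to left.  Returns a vertex list from an unmatched left
    vertex to an unmatched right vertex, or None if no augmenting path exists."""
    pred = {}                                     # right vertex -> left vertex it was reached from
    roots = [u for u in adj if match_left.get(u) is None]
    queue = deque(roots)
    seen_left = set(roots)
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v in pred:
                continue
            pred[v] = u
            if match_right.get(v) is None:        # unmatched right vertex: rebuild the path backwards
                path = []
                while v is not None:
                    path.append(v)
                    u = pred[v]
                    path.append(u)
                    v = match_left.get(u)         # matched edge used to reach u (None at a root)
                return list(reversed(path))
            w = match_right[v]                    # forced move back along the matched edge
            if w not in seen_left:
                seen_left.add(w)
                queue.append(w)
    return None

def max_bipartite_matching(adj):
    """Repeatedly find and flip a shortest augmenting path."""
    match_left, match_right = {}, {}
    while True:
        path = shortest_augmenting_path(adj, match_left, match_right)
        if path is None:
            return match_left                     # no augmenting path: the matching is maximum
        for i in range(0, len(path) - 1, 2):      # flip: formerly unmatched edges become matched
            u, v = path[i], path[i + 1]
            match_left[u], match_right[v] = v, u

# On the 4-vertex example: a maximum matching of size 2, pairing a with 2 and b with 1.
print(max_bipartite_matching({"a": [1, 2], "b": [1]}))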
However, we are wasting a lot of time: each new BFS starts from scratch and, most likely, will follow many paths already followed in the previous BFS. And each BFS produces a single new augmenting path. Yet there typically will be a number of augmenting paths in a graph with respect to a matching, especially if that matching is small. Instead of stopping at the first unmatched neighbor on the right, we could finish that stage of the BFS, collecting all unmatched neighbors on the right. Doing so would not increase the worst-case running time of the BFS, yet might yield multiple augmenting paths of the same length. However, we can use multiple augmenting paths only if they are vertex-disjoint, since otherwise we could cause conflicting assignments of vertices or even edges. The BFS might discover that k_l of the current active vertices (on the left) have an unmatched neighbor on the right, but some of these neighbors might be shared; if there are k_r unmatched neighbors on the right, the maximum number of disjoint augmenting paths is min{k_l, k_r}. The number may be smaller, however, because this sharing of vertices can occur at any stage along the alternating paths.

Thus we must adjust our BFS to provide backpointers, so that we can retrace paths from right-side unmatched vertices reached in the search; and we must add a backtracing phase, which retraces at most one path for each unmatched vertex reached on the right-hand side. The backtracing is itself a graph search. Specifically, for each left-side vertex encountered during the breadth-first search, we record its distance from the closest unmatched left-side vertex, passing as before through matched right-side vertices.

We use this information to run a (backward) depth-first search from each unmatched right-side vertex discovered during the BFS: during a DFS we consider only edges that take us one level closer to unmatched left-side vertices. When we discover an augmenting path, we eliminate the vertices along this path from consideration by any remaining DFS, thereby ensuring that our augmenting paths will be vertex-disjoint. We can hope that the number of augmenting paths found during each search is more than a constant, so that the number of searches (iterations) to be run is significantly decreased, preferably to o(|V|). We characterize the gain to be realized through a series of small theorems; these theorems apply equally to general graphs and bipartite graphs. We begin with a more precise proof of Berge's theorem that allows us to refine its conclusion.

Theorem 3. Let M_1 and M_2 be two matchings in some graph, G = (V, E), with |M_1| > |M_2|. Then the subgraph G' = (V, M_1 ⊕ M_2) contains at least |M_1| − |M_2| vertex-disjoint augmenting paths with respect to M_2.

Proof. Recall that every connected component of G' is one of: (i) a single vertex; (ii) a cycle of even length, with edges alternately drawn from M_1 and M_2; or (iii) a path with edges alternately drawn from M_1 and M_2. Let C_i = (V_i, E_i) be the ith connected component and define δ(C_i) = |E_i ∩ M_1| − |E_i ∩ M_2|. From our previous observations, we know that δ(C_i) must be one of −1, 0, or 1 and that it equals 1 exactly when C_i is an augmenting path with respect to M_2. Now we have

    Σ_i δ(C_i) = |M_1 − M_2| − |M_2 − M_1| = |M_1| − |M_2|,

so that at least |M_1| − |M_2| components C_i are such that δ(C_i) equals 1, which proves the theorem.

This tells us that many disjoint augmenting paths exist, but says nothing about their lengths, nor about finding them. Indeed, if we take M_2 to be the empty set and M_1 to be a maximum matching, the theorem tells us that the original graph contains enough disjoint augmenting paths to go from no matching at all to a maximum matching in a single step! But these paths will normally be of various lengths, and finding such a set is actually a very hard problem. We will focus on finding a set of disjoint shortest augmenting paths (thus all of the same length) with respect to the current matching; such a set will normally not contain enough paths to obtain a maximum matching in one step.
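The decomposition used in the proof of Theorem 3 can also be carried out directly. The following sketch (matchings given as sets of frozenset edges; all names are ours) builds the symmetric difference of two matchings and returns the edge sets of the components that are augmenting with respect to the smaller matching; Theorem 3 guarantees at least |M_1| − |M_2| of them.

from collections import defaultdict

def augmenting_components(M1, M2):
    """Decompose the symmetric difference of M1 and M2 into connected components
    and return the edge sets of those components having one more M1 edge than
    M2 edge; each such component is an augmenting path with respect to M2."""
    diff = (M1 | M2) - (M1 & M2)                 # symmetric difference of the two matchings
    adj = defaultdict(list)
    for e in diff:
        u, v = tuple(e)
        adj[u].append(v)
        adj[v].append(u)
    seen, result = set(), []
    for start in adj:
        if start in seen:
            continue
        comp, stack = set(), [start]             # gather one connected component of the difference
        seen.add(start)
        while stack:
            u = stack.pop()
            comp.add(u)
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        edges = {e for e in diff if e <= comp}   # the component's edges
        if len(edges & M1) > len(edges & M2):    # delta = +1: augmenting with respect to M2
            result.append(edges)
    return result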

Our next result is intuitively obvious, but the theorem proves it and also makes it precise: successive shortest augmenting paths cannot become shorter.

Theorem 4. Let G = (V, E) be a graph, with M a nonmaximum matching, P a shortest augmenting path with respect to M, and P' any augmenting path with respect to the augmented matching M ⊕ P. Then we have |P'| ≥ |P| + 2|P ∩ P'|.

Proof. The matching M ⊕ P ⊕ P' contains two more edges than M, so that, by our previous theorem, M ⊕ (M ⊕ P ⊕ P') = P ⊕ P' contains (at least) two vertex-disjoint augmenting paths with respect to M; call them P_1 and P_2. Thus we have |P ⊕ P'| ≥ |P_1| + |P_2|. Since P is a shortest augmenting path with respect to M, we also have |P| ≤ |P_1| and |P| ≤ |P_2|, so that we get |P ⊕ P'| ≥ 2|P|. Since P ⊕ P' is (P ∪ P') − (P ∩ P'), we can write |P ⊕ P'| = |P| + |P'| − 2|P ∩ P'|. Substituting in our preceding inequality yields our conclusion.

An interesting corollary (especially given our BFS approach to finding shortest augmenting paths) is that two successive shortest augmenting paths have the same length only if they are disjoint. Our new algorithm uses all disjoint shortest augmenting paths it finds, as follows. Begin with an arbitrary (possibly empty) matching. Repeatedly find a maximal set of vertex-disjoint shortest augmenting paths, and use them all to augment the current matching, until no augmenting path can be found. Now we are ready to prove the crucial result on the worst-case number of searches required to obtain a maximum matching. The result itself is on the number of different lengths that can be found among the collection of shortest augmenting paths produced in successive searches.

Theorem 5. Let s be the cardinality of a maximum matching and let P_1, P_2, ..., P_s be a sequence of shortest augmenting paths that build on the empty matching. Then the number of distinct integers in the sequence |P_1|, |P_2|, ..., |P_s| cannot exceed 2⌈√s⌉.

The intuition here is that, as we start the first search with an empty matching (or a small one), there will be many disjoint shortest augmenting paths and so there will be many repeated values towards the beginning of the sequence of path lengths; toward the end, however, augmenting paths are more complex, longer, and rarer, so that most values toward the end of the sequence will be distinct. The proof formalizes this intuition by using a midpoint in the number of distinct values that is very far along the sequence of augmenting paths: not at s/2, but at roughly s − √s.

Proof. Let r = s − ⌈√s⌉ and consider M_r, the rth matching in the augmentation sequence. Since |M_r| = r and since the maximum matching has cardinality s > r, we conclude (using Theorem 3, our extended version of Berge's theorem) that there exist at least s − r vertex-disjoint augmenting paths with respect to M_r. (These need not be the remaining augmenting paths in our sequence, P_{r+1}, P_{r+2}, ..., P_s.) Altogether these paths contain at most all of the edges of M_r, so that the shortest of them contains at most ⌊r/(s − r)⌋ such edges (the worst case being when the edges of M_r are evenly distributed among the s − r vertex-disjoint paths) and thus at most 2⌊r/(s − r)⌋ + 1 edges in all. But a shortest augmenting path is precisely the next one picked, so that we get

    |P_{r+1}| ≤ 2⌊(s − ⌈√s⌉)/⌈√s⌉⌋ + 1 ≤ 2(s − √s)/√s + 1 = 2√s − 1 < 2⌈√s⌉ + 1.

Since |P_{r+1}| is an odd integer (all augmenting paths have odd length), we can conclude that |P_{r+1}| ≤ 2⌈√s⌉ − 1. Hence each of P_1, P_2, ..., P_r must have length no greater than 2⌈√s⌉ − 1. Therefore these r lengths, all odd, must be distributed among at most ⌈√s⌉ different values, and this bound can be reached only if |P_r| = |P_{r+1}|. Since P_{r+1}, P_{r+2}, ..., P_s cannot contribute more than s − r = ⌈√s⌉ distinct values, the total number of distinct integers in the sequence does not exceed 2⌈√s⌉.

Thus our improved algorithm runs Θ(√|V|) search phases in the worst case, as opposed to Θ(|V|) iterations for the original version, a substantial improvement since the worst-case cost of an iteration remains unchanged. For bipartite graphs, we can therefore construct a maximum matching in O(√|V| |E|) time.
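Assembled, the improved algorithm is essentially the Hopcroft-Karp algorithm. The sketch below (same adjacency-dictionary representation and our own naming as before) alternates a BFS, which layers the left vertices by their distance from the nearest unmatched left vertex, with a layer-restricted DFS that extracts a maximal set of vertex-disjoint shortest augmenting paths and flips them, one phase per BFS; by Theorem 5 there are O(√|V|) phases, for O(√|V| |E|) in total.

from collections import deque

INF = float("inf")

def hopcroft_karp(adj):
    """Maximum bipartite matching by phases of vertex-disjoint shortest
    augmenting paths; adj maps each left vertex to its right neighbors."""
    match_left = {u: None for u in adj}           # left vertex  -> matched right vertex (or None)
    match_right = {}                              # right vertex -> matched left vertex
    dist = {}

    def bfs():
        """Layer the left vertices by alternating-path distance from the
        unmatched ones; dist[None] ends up as the layer of a shortest
        augmenting path, if any (None plays the role of a sentinel)."""
        queue = deque()
        for u in adj:
            if match_left[u] is None:
                dist[u] = 0
                queue.append(u)
            else:
                dist[u] = INF
        dist[None] = INF
        while queue:
            u = queue.popleft()
            if dist[u] < dist[None]:
                for v in adj[u]:
                    w = match_right.get(v)        # left vertex matched to v, or None if v is free
                    if dist[w] == INF:
                        dist[w] = dist[u] + 1     # forced move back along the matched edge
                        queue.append(w)
        return dist[None] != INF

    def dfs(u):
        """Backward restricted search: only edges advancing exactly one BFS
        layer are followed; flips the augmenting path it finds."""
        if u is None:                             # reached a free right vertex: path complete
            return True
        for v in adj[u]:
            w = match_right.get(v)
            if dist.get(w, INF) == dist[u] + 1 and dfs(w):
                match_left[u], match_right[v] = v, u
                return True
        dist[u] = INF                             # dead end: exclude u for the rest of this phase
        return False

    while bfs():                                  # one phase per breadth-first search
        for u in adj:
            if match_left[u] is None:
                dfs(u)
    return {u: v for u, v in match_left.items() if v is not None}

# Example: a maximum matching of size 3, e.g. {'a': 2, 'b': 1, 'c': 3}.
print(hopcroft_karp({"a": [1, 2], "b": [1], "c": [2, 3]}))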
