Submodular Optimization
- Jerome Shelton
- 5 years ago
1 Submodular Optimization
Nathaniel Grammel
N. Grammel, Submodular Optimization, 1 / 28
2 Submodularity
Submodularity captures the notion of Diminishing Returns.

Definition: Suppose U is a set. A set function f : 2^U → R is submodular if for any S, T ⊆ U:

    f(S ∩ T) + f(S ∪ T) ≤ f(S) + f(T)

How does this represent diminishing returns? Equivalent definition:

Definition: A function f : 2^U → R is submodular if for any S, T ⊆ U such that S ⊆ T, and any x ∈ U \ T:

    f(T ∪ {x}) − f(T) ≤ f(S ∪ {x}) − f(S)
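The two definitions are equivalent, and both can be checked by brute force on a small instance. The sketch below is not from the slides; the coverage function and its set names are illustrative choices. Coverage functions (f(S) = number of elements covered by the subsets indexed by S) are a standard example of submodular functions.

```python
from itertools import chain, combinations

# Illustrative coverage function: f(S) = size of the union of the
# subsets indexed by S. Coverage functions are monotone submodular.
SETS = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c", "d"}}
U = set(SETS)

def f(S):
    return len(set().union(*(SETS[i] for i in S)))

def subsets(ground):
    items = list(ground)
    return [set(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))]

# Definition 1: f(S ∩ T) + f(S ∪ T) <= f(S) + f(T) for all S, T.
def1 = all(f(S & T) + f(S | T) <= f(S) + f(T)
           for S in subsets(U) for T in subsets(U))

# Definition 2 (diminishing returns): for S ⊆ T and x outside T,
# f(T ∪ {x}) − f(T) <= f(S ∪ {x}) − f(S).
def2 = all(f(T | {x}) - f(T) <= f(S | {x}) - f(S)
           for T in subsets(U) for S in subsets(T) for x in U - T)

print(def1, def2)  # True True
```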
3 Submodularity and Diminishing Returns
- Imagine choosing items from U one by one.
- At any point, let S be the set of items chosen so far.
- Choosing x as the next item gives an increase in utility of f(S ∪ {x}) − f(S).
- If we choose some other items first to get a set T (notice: S ⊆ T), and then choose x, the increase in utility is f(T ∪ {x}) − f(T) ≤ f(S ∪ {x}) − f(S).
- Adding x gives less utility if we start with more. This is the concept of diminishing returns.
4 Monotonicity: Another Useful Property
Definition: A set function f : 2^U → R is monotone if for every S, T ⊆ U with S ⊆ T:

    f(S) ≤ f(T)

We are generally interested in monotone submodular functions. Often, these are utility functions.
5 A Concrete Example
Recall this example utility function from early in the semester:

    f({apple, orange}) = 5
    f({apple}) = f({orange}) = 3
    f({}) = f(∅) = 0

Notice it is monotone submodular!
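Computed directly from this table: the marginal gain of adding orange is 3 starting from the empty set, but only 2 once apple is already chosen. A minimal sketch:

```python
# The example utility function, as a lookup table on frozensets.
f = {
    frozenset(): 0,
    frozenset({"apple"}): 3,
    frozenset({"orange"}): 3,
    frozenset({"apple", "orange"}): 5,
}

# Marginal gain of "orange" before vs. after choosing "apple":
gain_from_empty = f[frozenset({"orange"})] - f[frozenset()]
gain_after_apple = f[frozenset({"apple", "orange"})] - f[frozenset({"apple"})]
print(gain_from_empty, gain_after_apple)  # 3 2: diminishing returns

# Monotonicity: adding items never decreases utility (0 <= 3 <= 5).
assert f[frozenset()] <= f[frozenset({"apple"})] <= f[frozenset({"apple", "orange"})]
```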
6 Why Do We Care?
- Diminishing Returns
- Closely Related to Convexity and Concavity
- Optimization problems are very hard in general (often NP-hard), especially with set functions. But with submodularity, we can make strong guarantees: efficient algorithms for close approximations.
- Many natural functions have these properties:
  - Inferring Influence in a Network (Stay tuned!)
  - Determining representative sentences in a document
  - Many applications to image and signal processing
  - Sensor Placement
  - Graph Cuts
  - And many more! (Check out submodularity.org)
7 Submodular Optimization
Two main types of submodular optimization:
- Maximization
  - Want to find S to maximize f(S) subject to some constraints.
  - Most simply: want to find the S that maximizes f(S) subject to |S| = k for some k (a cardinality constraint).
  - More generally: other types of constraints (e.g. knapsack constraints, matroid constraints).
- Cover/Minimization
  - Want to minimize the cost of covering f.
  - Most simply: find the S with minimum size that achieves f(S) = f(U).
  - More generally: find S such that f(S) = f(U) while minimizing cost(S) for some cost function.
8 Submodular Maximization: The Simplest Case
Suppose we are given a utility function f : 2^U → R and a cardinality constraint k.

Goal: Find S = argmax_{S ⊆ U, |S| ≤ k} f(S)

This is NP-hard in general! What if f is submodular? Monotone? Nonnegative? Then we can find good near-optimal solutions!
9 Submodular Maximization: A Simple Greedy Algorithm
Suppose f : 2^U → R is monotone submodular. Assume f is nonnegative.
- Start with S = {}.
- At each step, pick the item x ∈ U \ S that maximizes f(S ∪ {x}) − f(S) (the item that maximizes the gain in utility) and let S = S ∪ {x}.
- Continue until |S| = k.

Theorem (Nemhauser et al. 1978): Let S be the k-element set constructed as above, and let S* be the set that maximizes f over all sets of size at most k. Then:

    f(S) ≥ (1 − 1/e) f(S*)

Thus, the set S provides a (1 − 1/e)-approximation, and the construction gives a polynomial-time approximation algorithm.
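The greedy procedure above can be sketched in a few lines. The coverage function used to exercise it is an illustrative choice, not from the slides; any monotone submodular f works.

```python
def greedy_max(f, U, k):
    """Greedy (1 - 1/e)-approximate maximization of a monotone
    submodular set function f subject to |S| <= k."""
    S = set()
    for _ in range(k):
        # Pick the element with the largest marginal gain f(S ∪ {x}) − f(S).
        x = max(U - S, key=lambda y: f(S | {y}) - f(S))
        S.add(x)
    return S

# Illustrative monotone submodular function: coverage of a ground set.
SETS = {1: {"a", "b", "c"}, 2: {"b", "c"}, 3: {"d", "e"}, 4: {"e"}}
f = lambda S: len(set().union(*(SETS[i] for i in S))) if S else 0
print(sorted(greedy_max(f, set(SETS), 2)))  # [1, 3], covering all 5 elements
```

Each iteration makes |U \ S| evaluations of f, so the whole run uses O(k·|U|) evaluations, which is what the lazy-evaluation speedup later in the deck improves on in practice.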
10 Greedy Algorithm: Proof
For convenience, let Δ_S(x) = f(S ∪ {x}) − f(S). Then submodularity states that for S ⊆ T and x ∈ U \ T, we have Δ_T(x) ≤ Δ_S(x).

Let S_i ⊆ U be the subset of i elements chosen greedily: S_i = S_{i−1} ∪ {argmax_{x ∈ U} Δ_{S_{i−1}}(x)}, with S_0 = {}.

Proof that f(S) ≥ (1 − 1/e) f(S*). By monotonicity, we may assume |S*| = k. Let S* = {e_1, e_2, ..., e_k}. Further, also due to monotonicity, for any i < k:

    f(S*) ≤ f(S* ∪ S_i)
We also have the following equality:

    f(S* ∪ S_i) = f(S_i) + Σ_{j=1..k} Δ_{S_i ∪ {e_1,...,e_{j−1}}}(e_j)

since the terms Δ_{S_i ∪ {e_1,...,e_{j−1}}}(e_j) = f(S_i ∪ {e_1,...,e_j}) − f(S_i ∪ {e_1,...,e_{j−1}}) are telescoping, so the sum is equal to f(S_i ∪ {e_1,...,e_k}) − f(S_i ∪ {}) = f(S_i ∪ S*) − f(S_i).
Due to submodularity, we have

    f(S_i) + Σ_{j=1..k} Δ_{S_i ∪ {e_1,...,e_{j−1}}}(e_j) ≤ f(S_i) + Σ_{j=1..k} Δ_{S_i}(e_j)

The greedy rule states that f(S_{i+1}) − f(S_i) ≥ Δ_{S_i}(x) for any x. Thus:

    f(S_i) + Σ_{j=1..k} Δ_{S_i}(e_j) ≤ f(S_i) + Σ_{j=1..k} (f(S_{i+1}) − f(S_i)) = f(S_i) + k (f(S_{i+1}) − f(S_i))

where the last step holds since |S*| = k, so the sum has exactly k identical terms.
Putting it all together:

    f(S*) − f(S_i) ≤ k (f(S_{i+1}) − f(S_i))

Let δ_i = f(S*) − f(S_i). Then we can rearrange to get δ_i ≤ k (δ_i − δ_{i+1}), or δ_{i+1} ≤ δ_i (1 − 1/k), which yields

    δ_k ≤ (1 − 1/k)^k δ_0
Since f is nonnegative, δ_0 = f(S*) − f({}) ≤ f(S*). A famous inequality states that 1 − x ≤ e^{−x} for all x ∈ R. This yields

    δ_k ≤ (1 − 1/k)^k δ_0 ≤ e^{−k/k} f(S*) = e^{−1} f(S*)

Since δ_k = f(S*) − f(S_k), we get

    f(S*) − f(S_k) ≤ e^{−1} f(S*)
    f(S*) − e^{−1} f(S*) ≤ f(S_k)
    f(S*) (1 − 1/e) ≤ f(S_k)
11 (1 − 1/e) Is the Best We Can Get
Theorem (Nemhauser and Wolsey 1978): Any algorithm that evaluates f on at most a polynomial number of inputs cannot do better than a (1 − 1/e)-approximation of the optimal solution.
12 Speedup with Lazy Evaluations (Minoux 1978)
- Evaluating Δ_S(x) for every x at each iteration may be costly.
- Store for each item x a value ρ(x), representing an upper bound on Δ_S(x). Store the items sorted in order of ρ.
- At each step, pick the element x at the front of the list (i.e. with maximum ρ(x)).
- Lazy evaluation: evaluate Δ_S only for element x, and update ρ(x) ← Δ_S(x). If after the update ρ(x) ≥ ρ(x') for all other x', then x is still the best choice for the greedy algorithm! We've avoided re-evaluating for all the other elements!
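Lazy evaluation is typically implemented with a max-heap of possibly stale upper bounds: by submodularity a marginal gain only shrinks as S grows, so a popped element whose refreshed gain still beats the next bound is safe to take. A sketch, assuming an illustrative coverage function (not from the slides):

```python
import heapq

def lazy_greedy_max(f, U, k):
    """Greedy maximization with lazy evaluations (Minoux 1978).
    Heap holds (-bound, x); each bound is a stale upper limit on the
    marginal gain, valid because gains only shrink as S grows."""
    S, fS = set(), f(set())
    heap = [(-(f({x}) - f(set())), x) for x in U]  # singleton gains
    heapq.heapify(heap)
    while len(S) < k and heap:
        _, x = heapq.heappop(heap)
        gain = f(S | {x}) - fS            # re-evaluate only this element
        if not heap or gain >= -heap[0][0]:
            S.add(x)                      # still the best: take it
            fS += gain
        else:
            heapq.heappush(heap, (-gain, x))  # stale bound: reinsert updated
    return S

SETS = {1: {"a", "b", "c"}, 2: {"b", "c"}, 3: {"d", "e"}, 4: {"e"}}
f = lambda S: len(set().union(*(SETS[i] for i in S))) if S else 0
print(sorted(lazy_greedy_max(f, set(SETS), 2)))  # [1, 3]
```

The output matches plain greedy; only the number of evaluations of f changes.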
13 Submodular Cover
Suppose instead we want to find S so that f(S) = f(U) while minimizing |S|. Again, suppose f is monotone and submodular. But: this time let f : 2^U → N.

Suppose we apply the same greedy rule until our constructed set S has f(S) = f(U). Do we have bounds on |S|? Yes!

Theorem (Wolsey 1982): |S| ≤ (1 + ln δ) |S*|, where S* is an optimal cover and δ = max_{x ∈ U} f({x}) is the maximum possible increase in utility.
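The cover variant runs the same greedy rule until the target value is reached. A minimal sketch; the coverage instance is illustrative, not from the slides:

```python
def greedy_cover(f, U):
    """Greedy submodular cover: repeatedly add the element with the
    largest marginal gain until f(S) reaches f(U). For monotone
    submodular f, Wolsey's bound gives |S| <= (1 + ln d)|S*|,
    where d is the largest singleton value f({x})."""
    S, target = set(), f(U)
    while f(S) < target:
        x = max(U - S, key=lambda y: f(S | {y}) - f(S))
        S.add(x)
    return S

# Illustrative instance: pick index sets until everything is covered.
SETS = {1: {"a", "b", "c"}, 2: {"c", "d"}, 3: {"d", "e"}, 4: {"a", "e"}}
f = lambda S: len(set().union(*(SETS[i] for i in S))) if S else 0
cover = greedy_cover(f, set(SETS))
print(sorted(cover), f(cover))  # [1, 3] 5: covers all five elements
```

The loop always terminates for monotone f, since some remaining element must have positive marginal gain while f(S) < f(U).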
14 What About Non-Uniform Costs?
- Up until now we have focused on cardinality: either constrained to |S| ≤ k (maximization), or trying to minimize |S| (cover).
- What if we instead have a cost function c(x) for all x ∈ U and want to maximize f(S) subject to

      Σ_{x ∈ S} c(x) ≤ B

  for some budget B? This is called a Knapsack Constraint.
- Or minimize Σ_{x ∈ S} c(x) such that S covers f (i.e. f(S) = f(U))?
15 Results for Non-Uniform Costs
- The standard greedy algorithm could be arbitrarily bad: it doesn't consider costs at all!
- Modification: at each step, choose the x that maximizes the best bang for the buck, Δ_S(x) / c(x).
- We can get similar results to the uniform-cost case, with some slight modifications:
  - Cover: Wolsey (1982) generalizes the result of the uniform-cost case.
  - Maximization: a bit trickier; this greedy rule alone doesn't suffice! But some simple modifications can yield similar approximations to the uniform-cost case.
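The cost-benefit rule for the knapsack setting can be sketched as below. The instance and costs are illustrative, and, as noted above, for maximization this rule alone carries no worst-case guarantee; standard fixes (e.g. also comparing against the best single affordable element) are omitted here.

```python
def cost_benefit_greedy(f, U, cost, B):
    """Budgeted greedy: repeatedly take the affordable element with the
    best marginal-gain-per-unit-cost ratio (bang for the buck)."""
    S, spent = set(), 0
    while True:
        affordable = [x for x in U - S if spent + cost[x] <= B]
        if not affordable:
            return S
        x = max(affordable, key=lambda y: (f(S | {y}) - f(S)) / cost[y])
        if f(S | {x}) == f(S):
            return S  # nothing affordable adds utility
        S.add(x)
        spent += cost[x]

SETS = {1: {"a", "b", "c"}, 2: {"b", "c"}, 3: {"d"}}
cost = {1: 2, 2: 2, 3: 1}
f = lambda S: len(set().union(*(SETS[i] for i in S))) if S else 0
print(sorted(cost_benefit_greedy(f, set(SETS), cost, B=4)))  # [1, 3]
```

With budget 4, element 1 wins the first round (ratio 3/2), after which element 2 adds nothing and element 3 adds one covered element per unit cost.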
MA 515: Introduction to Algorithms & MA353 : Design and Analysis of Algorithms [3-0-0-6] Lecture 39 http://www.iitg.ernet.in/psm/indexing_ma353/y09/index.html Partha Sarathi Mandal psm@iitg.ernet.in Dept.
More informationLecture 11: Maximum flow and minimum cut
Optimisation Part IB - Easter 2018 Lecture 11: Maximum flow and minimum cut Lecturer: Quentin Berthet 4.4. The maximum flow problem. We consider in this lecture a particular kind of flow problem, with
More informationLecture 9: Pipage Rounding Method
Recent Advances in Approximation Algorithms Spring 2015 Lecture 9: Pipage Rounding Method Lecturer: Shayan Oveis Gharan April 27th Disclaimer: These notes have not been subjected to the usual scrutiny
More informationGreedy Algorithms. Algorithms
Greedy Algorithms Algorithms Greedy Algorithms Many algorithms run from stage to stage At each stage, they make a decision based on the information available A Greedy algorithm makes decisions At each
More informationPolynomial-Time Approximation Algorithms
6.854 Advanced Algorithms Lecture 20: 10/27/2006 Lecturer: David Karger Scribes: Matt Doherty, John Nham, Sergiy Sidenko, David Schultz Polynomial-Time Approximation Algorithms NP-hard problems are a vast
More informationLast topic: Summary; Heuristics and Approximation Algorithms Topics we studied so far:
Last topic: Summary; Heuristics and Approximation Algorithms Topics we studied so far: I Strength of formulations; improving formulations by adding valid inequalities I Relaxations and dual problems; obtaining
More informationAlgorithm Design Methods. Some Methods Not Covered
Algorithm Design Methods Greedy method. Divide and conquer. Dynamic Programming. Backtracking. Branch and bound. Some Methods Not Covered Linear Programming. Integer Programming. Simulated Annealing. Neural
More informationChapter 16. Greedy Algorithms
Chapter 16. Greedy Algorithms Algorithms for optimization problems (minimization or maximization problems) typically go through a sequence of steps, with a set of choices at each step. A greedy algorithm
More informationInfluence Maximization in Location-Based Social Networks Ivan Suarez, Sudarshan Seshadri, Patrick Cho CS224W Final Project Report
Influence Maximization in Location-Based Social Networks Ivan Suarez, Sudarshan Seshadri, Patrick Cho CS224W Final Project Report Abstract The goal of influence maximization has led to research into different
More informationTowards more efficient infection and fire fighting
Towards more efficient infection and fire fighting Peter Floderus Andrzej Lingas Mia Persson The Centre for Mathematical Sciences, Lund University, 00 Lund, Sweden. Email: pflo@maths.lth.se Department
More informationAbsorbing Random walks Coverage
DATA MINING LECTURE 3 Absorbing Random walks Coverage Random Walks on Graphs Random walk: Start from a node chosen uniformly at random with probability. n Pick one of the outgoing edges uniformly at random
More information5 MST and Greedy Algorithms
5 MST and Greedy Algorithms One of the traditional and practically motivated problems of discrete optimization asks for a minimal interconnection of a given set of terminals (meaning that every pair will
More informationIntroduction to Mathematical Programming IE406. Lecture 20. Dr. Ted Ralphs
Introduction to Mathematical Programming IE406 Lecture 20 Dr. Ted Ralphs IE406 Lecture 20 1 Reading for This Lecture Bertsimas Sections 10.1, 11.4 IE406 Lecture 20 2 Integer Linear Programming An integer
More information35 Approximation Algorithms
35 Approximation Algorithms Many problems of practical significance are NP-complete, yet they are too important to abandon merely because we don t know how to find an optimal solution in polynomial time.
More information11.1 Facility Location
CS787: Advanced Algorithms Scribe: Amanda Burton, Leah Kluegel Lecturer: Shuchi Chawla Topic: Facility Location ctd., Linear Programming Date: October 8, 2007 Today we conclude the discussion of local
More informationCSE 417 Network Flows (pt 3) Modeling with Min Cuts
CSE 417 Network Flows (pt 3) Modeling with Min Cuts Reminders > HW6 is due on Friday start early bug fixed on line 33 of OptimalLineup.java: > change true to false Review of last two lectures > Defined
More informationGreat Theoretical Ideas in Computer Science
15-251 Great Theoretical Ideas in Computer Science Lecture 20: Randomized Algorithms November 5th, 2015 So far Formalization of computation/algorithm Computability / Uncomputability Computational complexity
More informationSubmodular Utility Maximization for Deadline Constrained Data Collection in Sensor Networks
Submodular Utility Maximization for Deadline Constrained Data Collection in Sensor Networks Zizhan Zheng, Member, IEEE and Ness B. Shroff, Fellow, IEEE Abstract We study the utility maximization problem
More informationChapter 8. NP-complete problems
Chapter 8. NP-complete problems Search problems E cient algorithms We have developed algorithms for I I I I I finding shortest paths in graphs, minimum spanning trees in graphs, matchings in bipartite
More informationWe augment RBTs to support operations on dynamic sets of intervals A closed interval is an ordered pair of real
14.3 Interval trees We augment RBTs to support operations on dynamic sets of intervals A closed interval is an ordered pair of real numbers ], with Interval ]represents the set Open and half-open intervals
More informationCS 473: Algorithms. Ruta Mehta. Spring University of Illinois, Urbana-Champaign. Ruta (UIUC) CS473 1 Spring / 36
CS 473: Algorithms Ruta Mehta University of Illinois, Urbana-Champaign Spring 2018 Ruta (UIUC) CS473 1 Spring 2018 1 / 36 CS 473: Algorithms, Spring 2018 LP Duality Lecture 20 April 3, 2018 Some of the
More informationShannon Switching Game
EECS 495: Combinatorial Optimization Lecture 1 Shannon s Switching Game Shannon Switching Game In the Shannon switching game, two players, Join and Cut, alternate choosing edges on a graph G. Join s objective
More informationLecture 7. s.t. e = (u,v) E x u + x v 1 (2) v V x v 0 (3)
COMPSCI 632: Approximation Algorithms September 18, 2017 Lecturer: Debmalya Panigrahi Lecture 7 Scribe: Xiang Wang 1 Overview In this lecture, we will use Primal-Dual method to design approximation algorithms
More informationCoping with NP-Completeness
Coping with NP-Completeness Siddhartha Sen Questions: sssix@cs.princeton.edu Some figures obtained from Introduction to Algorithms, nd ed., by CLRS Coping with intractability Many NPC problems are important
More informationInteger and Combinatorial Optimization
Integer and Combinatorial Optimization GEORGE NEMHAUSER School of Industrial and Systems Engineering Georgia Institute of Technology Atlanta, Georgia LAURENCE WOLSEY Center for Operations Research and
More informationCOLUMN GENERATION IN LINEAR PROGRAMMING
COLUMN GENERATION IN LINEAR PROGRAMMING EXAMPLE: THE CUTTING STOCK PROBLEM A certain material (e.g. lumber) is stocked in lengths of 9, 4, and 6 feet, with respective costs of $5, $9, and $. An order for
More information16 Greedy Algorithms
16 Greedy Algorithms Optimization algorithms typically go through a sequence of steps, with a set of choices at each For many optimization problems, using dynamic programming to determine the best choices
More information15-451/651: Design & Analysis of Algorithms November 4, 2015 Lecture #18 last changed: November 22, 2015
15-451/651: Design & Analysis of Algorithms November 4, 2015 Lecture #18 last changed: November 22, 2015 While we have good algorithms for many optimization problems, the previous lecture showed that many
More informationRecitation 4: Elimination algorithm, reconstituted graph, triangulation
Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science 6.438 Algorithms For Inference Fall 2014 Recitation 4: Elimination algorithm, reconstituted graph, triangulation
More informationApproximation Algorithms
Approximation Algorithms Frédéric Giroire FG Simplex 1/11 Motivation Goal: Find good solutions for difficult problems (NP-hard). Be able to quantify the goodness of the given solution. Presentation of
More informationApproximability Results for the p-center Problem
Approximability Results for the p-center Problem Stefan Buettcher Course Project Algorithm Design and Analysis Prof. Timothy Chan University of Waterloo, Spring 2004 The p-center
More information4 Integer Linear Programming (ILP)
TDA6/DIT37 DISCRETE OPTIMIZATION 17 PERIOD 3 WEEK III 4 Integer Linear Programg (ILP) 14 An integer linear program, ILP for short, has the same form as a linear program (LP). The only difference is that
More informationA Lottery Model for Center-type Problems With Outliers
A Lottery Model for Center-type Problems With Outliers David G. Harris Thomas Pensyl Aravind Srinivasan Khoa Trinh Abstract In this paper, we give tight approximation algorithms for the k-center and matroid
More information12 Introduction to LP-Duality
12 Introduction to LP-Duality A large fraction of the theory of approximation algorithms, as we know it today, is built around linear programming (LP). In Section 12.1 we will review some key concepts
More informationVocalizing Large Time Series Efficiently
Vocalizing Large Time Series Efficiently Immanuel Trummer, Mark Bryan, Ramya Narasimha {it224,mab539,rn343}@cornell.edu Cornell University ABSTRACT We vocalize query results for time series data. We describe
More informationThe k-center problem Approximation Algorithms 2009 Petros Potikas
Approximation Algorithms 2009 Petros Potikas 1 Definition: Let G=(V,E) be a complete undirected graph with edge costs satisfying the triangle inequality and k be an integer, 0 < k V. For any S V and vertex
More informationSubmodularity Cuts and Applications
Submodularity Cuts and Applications Yoshinobu Kawahara The Inst. of Scientific and Industrial Res. (ISIR), Osaka Univ., Japan kawahara@ar.sanken.osaka-u.ac.jp Koji Tsuda Comp. Bio. Research Center, AIST,
More informationProblem set 2. Problem 1. Problem 2. Problem 3. CS261, Winter Instructor: Ashish Goel.
CS261, Winter 2017. Instructor: Ashish Goel. Problem set 2 Electronic submission to Gradescope due 11:59pm Thursday 2/16. Form a group of 2-3 students that is, submit one homework with all of your names.
More informationCSC 373: Algorithm Design and Analysis Lecture 3
CSC 373: Algorithm Design and Analysis Lecture 3 Allan Borodin January 11, 2013 1 / 13 Lecture 3: Outline Write bigger and get better markers A little more on charging arguments Continue examples of greedy
More information6.856 Randomized Algorithms
6.856 Randomized Algorithms David Karger Handout #4, September 21, 2002 Homework 1 Solutions Problem 1 MR 1.8. (a) The min-cut algorithm given in class works because at each step it is very unlikely (probability
More informationIn this lecture, we ll look at applications of duality to three problems:
Lecture 7 Duality Applications (Part II) In this lecture, we ll look at applications of duality to three problems: 1. Finding maximum spanning trees (MST). We know that Kruskal s algorithm finds this,
More informationGreedy Algorithms. Previous Examples: Huffman coding, Minimum Spanning Tree Algorithms
Greedy Algorithms A greedy algorithm is one where you take the step that seems the best at the time while executing the algorithm. Previous Examples: Huffman coding, Minimum Spanning Tree Algorithms Coin
More information