Conflict Graphs for Combinatorial Optimization Problems


Conflict Graphs for Combinatorial Optimization Problems. Ulrich Pferschy, joint work with Andreas Darmann and Joachim Schauer, University of Graz, Austria.

Introduction. Combinatorial Optimization Problem (CO): consider any CO with decision variables x_j ∈ {0,1}, j ∈ V, and a feasible domain x ∈ S. Conflict Structure: add disjunctive constraints for some pairs of variables, x_i + x_j ≤ 1 for (i,j) ∈ E ⊆ V × V, i.e. at most one of the two variables i, j can be set to 1. Representation by a conflict graph G = (V,E): edges in G connect conflicting variables of CO.

Introduction. Relation to Independent Set (IS): the feasible domain of a CO problem with a conflict graph is the intersection of S with the independent (stable) sets of G. Complexity: IS alone is already strongly NP-hard and admits no constant approximation ratio ⇒ adding an IS condition makes CO (much) more difficult. One main direction of research: identify special graph classes for the conflict graph G such that the considered CO problem is polynomially solvable, permits a (fully) polynomial approximation scheme, or has a constant approximation ratio.

Conflict Graphs: Bin Packing. Conflicting items must be in different bins.
Jansen, Öhring 97: 5/2- resp. (2+ε)-approximation for special graph classes; improved by Epstein, Levin 06 (also on-line).
Epstein et al. 08: extension to two-dimensional packing of squares.
Jansen 99: A-FPTAS for special graph classes, e.g. perfect, bipartite, interval, d-inductive graphs.
Gendreau et al. 04: heuristics and lower bounds.
Malaguti et al. 07: hybrid tabu search.
Malaguti et al. 08: exact algorithm (branch-and-price).

Conflict Graphs: Scheduling. Mutual Exclusion Scheduling: schedule unit-length jobs on m machines, where conflicting jobs must not be executed in the same time interval. Baker, Coffman 96; Bodlaender, Jansen 93: polynomially solvable special cases, special graph classes. Scheduling with Incompatible Jobs: conflicting jobs must not be executed on the same machine. Bodlaender, Jansen, Woeginger 94: approximation algorithms for special graph classes.

Knapsack Problem with Conflict Graph (KCG). Vertices (= items) adjacent in G cannot be packed together in the knapsack! ILP formulation (KCG):
max Σ_{j=1..n} p_j x_j
s.t. Σ_{j=1..n} w_j x_j ≤ c
x_i + x_j ≤ 1 for (i,j) ∈ E
x_j ∈ {0,1}
Introduce an upper bound P on the optimal solution value, e.g. P := Σ_{j=1..n} p_j. Note: the classical Greedy algorithm can perform arbitrarily badly!
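As a sanity check of the model (and of the remark on Greedy), a tiny brute-force solver can be sketched in Python; the instance data below is hypothetical, not from the talk:

```python
from itertools import combinations

def kcg_brute_force(profits, weights, capacity, conflicts):
    """Exhaustive search for KCG: maximize total profit subject to the
    capacity constraint and to x_i + x_j <= 1 for every conflict (i, j)."""
    n = len(profits)
    conflict_set = {frozenset(e) for e in conflicts}
    best = 0
    for r in range(n + 1):
        for sub in combinations(range(n), r):
            if sum(weights[i] for i in sub) > capacity:
                continue
            if any(frozenset(p) in conflict_set for p in combinations(sub, 2)):
                continue
            best = max(best, sum(profits[i] for i in sub))
    return best

# Item 0 has the best profit/weight ratio but conflicts with items 1 and 2:
# ratio-greedy packs only item 0 (profit 10), while the optimum is {1, 2}.
print(kcg_brute_force([10, 9, 8], [5, 5, 5], 10, [(0, 1), (0, 2)]))  # 17
```

The small instance in the comment illustrates why Greedy by profit/weight ratio has no performance guarantee once conflicts are present.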

Literature on KCG: Exact Algorithms and Heuristics.
Yamada et al. 02: introduce the problem, present heuristic and exact algorithms (based on Lagrangean relaxation).
Hifi, Michrafy 06: reactive local search algorithm.
Hifi, Michrafy 07: exact algorithms (refined branch-and-bound strategy).
No special graph classes considered!

Knapsack Problem with Conflict Graph (KCG). Our Goal: identify special graph classes where KCG can be solved in pseudo-polynomial time and permits an FPTAS. Our Results: pseudo-polynomial time algorithms and FPTAS for KCG on trees, graphs with bounded treewidth, and chordal graphs.

Idea for KCG on Trees. Basic Observation: apply Dynamic Programming by Profits ⇒ scaling yields an FPTAS. We use Dynamic Programming by Reaching, moving bottom-up in the conflict tree. For every vertex i: determine the solutions of the subproblem defined by the subtree rooted in i. Notation:
z_i(d): solution with profit d and minimal weight found in the subtree T(i), with item i necessarily included.
y_i(d): solution with profit d and minimal weight found in the subtree T(i), with item i excluded.

Algorithmic Details. The tree is traversed in depth-first-search order (DFS).
Vertex r excluded, for every child j of r:
y_r(d) = min_k { y_r(d − k) + min{ z_j(k), y_j(k) } }
Vertex r included (a conflicting child j must then be excluded), for every child j of r:
z_r(d) = min_k { z_r(d − k) + y_j(k) }
Running time: O(nP²). Space: O(nP) (trivial version).
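The two recurrences translate almost literally into code. A minimal Python sketch, where profit-indexed arrays hold minimal weights and the tree is given as a child-list dictionary (a hypothetical encoding, not fixed in the talk):

```python
INF = float("inf")

def kcg_on_tree(profits, weights, capacity, children, root):
    """DP by reaching for KCG on a conflict tree, bottom-up via DFS.
    z[d] / y[d] = minimal weight of a solution with profit exactly d in
    the current subtree, with the subtree root included / excluded."""
    P = sum(profits)  # upper bound on the optimal profit

    def solve(i):
        z = [INF] * (P + 1)
        y = [INF] * (P + 1)
        z[profits[i]] = weights[i]   # pack item i alone
        y[0] = 0                     # pack nothing
        for j in children.get(i, []):
            zj, yj = solve(j)
            z_new = [INF] * (P + 1)
            y_new = [INF] * (P + 1)
            for d in range(P + 1):
                for k in range(d + 1):
                    # i excluded: child j may be included or excluded
                    y_new[d] = min(y_new[d], y[d - k] + min(zj[k], yj[k]))
                    # i included: conflicting child j must be excluded
                    z_new[d] = min(z_new[d], z[d - k] + yj[k])
            z, y = z_new, y_new
        return z, y

    z, y = solve(root)
    return max((d for d in range(P + 1)
                if min(z[d], y[d]) <= capacity), default=0)

# Star: root 0 conflicts with its children 1 and 2; optimum packs {1, 2}.
print(kcg_on_tree([10, 9, 8], [5, 5, 5], 10, {0: [1, 2]}, 0))  # 17
```

The double loop over d and k gives the O(nP²) running time stated on the slide; this sketch keeps one pair of arrays per active vertex, i.e. the trivial O(nP) space bound.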

Space Reduction Method. Worst case without reduction: O(nP). [figure: DFS processing of a tree, with storage arrays allocated for merged and processed subtrees] Choosing the left child in every step allocates O(n) storage arrays. By choosing the right child first, only two arrays would be necessary. In general, space can be reduced to O(log n) arrays; the worst case is a complete binary tree.

Space Reduction Method (general).
Property 1. Tree T is processed in DFS order with the following rule: take the child vertex j whose subtree contains the largest number of vertices.
Property 2. Each vertex of T requires O(k) space for processing.
Property 3. Merging child j into parent i requires 2·O(k) space.
Lemma 1. An algorithm A fulfilling Properties 1, 2 and 3 uses at most (log₂(n) + 1)·O(k) space.
Sketch of Proof. r has children i_1, …, i_k with |T(i_j)| ≤ n/2 and w.l.o.g. |T(i_1)| ≥ |T(i_j)| for all j ∈ {1,…,k}. Then the processing of T(i_1) uses at most (log₂(n/2) + 1)·O(k) = log₂(n)·O(k) space. After merging this subtree into r, its space can be deallocated, but O(k) space is used at vertex r, which has to be kept until A has finished.
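The descent rule of Property 1 can be sketched as follows; the child-list encoding is a hypothetical choice for illustration:

```python
def largest_first_dfs(children, root):
    """Return a DFS order of the tree that obeys Property 1: at every
    vertex, descend first into the child with the largest subtree."""
    size = {}

    def subtree_size(v):
        size[v] = 1 + sum(subtree_size(c) for c in children.get(v, []))
        return size[v]

    subtree_size(root)
    order = []

    def dfs(v):
        order.append(v)
        # sort is stable, so equal-sized subtrees keep their input order
        for c in sorted(children.get(v, []), key=lambda c: -size[c]):
            dfs(c)

    dfs(root)
    return order

# Vertex 2 has the larger subtree, so it is visited before vertex 1.
print(largest_first_dfs({0: [1, 2], 2: [3, 4]}, 0))  # [0, 2, 3, 4, 1]
```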

Bounded Treewidth. Tree-Decomposition of a graph G = (V,E): every graph can be represented by a tree whose vertices are subsets of V, and the original graph can be reproduced from the tree-decomposition; e.g. adjacent vertices of G must be jointly contained in at least one subset. Treewidth: the minimum, over all tree-decompositions, of the cardinality of the largest subset minus 1. Example: [figure: a graph on vertices a–h and a tree-decomposition with bags {a,b,c}, {a,f,c}, {c,d,e}, {a,g,f}, {g,h}]

Bounded Treewidth. Treewidth indicates how far the graph is from a tree: trees have treewidth 1, series-parallel graphs have treewidth 2, etc. Nice Tree-Decomposition: for algorithmic purposes, the structure of the decomposition is restricted to four simple configurations. A nice tree-decomposition with the same treewidth can be computed from a tree-decomposition in O(n) time [cf. Bodlaender, Koster 08]. Many NP-hard problems can be solved efficiently on graphs of bounded treewidth [Bodlaender 97].

Solving KCG with Bounded Treewidth. Algorithmic Idea: take a nice tree-decomposition T of G and process T in a DFS way. For every vertex of T (i.e. a subset of V): consider all independent sets (IS) of the subset explicitly. Note: the number of IS per subset is constant (at most 2^(tw+1) for treewidth tw)! Perform dynamic programming and consider inclusion or exclusion for every IS. Time and Space: KCG for conflict graphs of bounded treewidth can be solved in O(nP²) time and O(log n · P + n) space, given a tree-decomposition. The space reduction method of Lemma 1 can be applied again!
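The counting argument behind "number of IS is constant" can be illustrated with a small enumeration sketch; the bag and edge data are hypothetical:

```python
from itertools import combinations

def independent_sets(bag, edges):
    """Enumerate all independent sets of the vertices in one bag. A bag
    of a width-tw tree-decomposition has at most tw+1 vertices, hence at
    most 2^(tw+1) independent sets -- a constant for bounded treewidth."""
    E = {frozenset(e) for e in edges}
    return [set(s)
            for r in range(len(bag) + 1)
            for s in combinations(sorted(bag), r)
            if not any(frozenset(p) in E for p in combinations(s, 2))]

# Bag {a, b, c} with the single conflict edge (a, b): 6 of the 8 subsets
# are independent (only {a,b} and {a,b,c} are excluded).
print(len(independent_sets({"a", "b", "c"}, [("a", "b")])))  # 6
```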

Chordal Graphs. Definition: a chordal graph (a.k.a. triangulated graph) does not contain induced cycles other than triangles, i.e. every cycle of at least four vertices has a chord. [example figures not transcribed]

Chordal Graphs. Bounded Treewidth versus Chordal Graphs: no subset relation! [figure: trees lie in both classes; series-parallel graphs have bounded treewidth but are not chordal; complete graphs K_n are chordal but have unbounded treewidth]

Solving KCG on Chordal Graphs. Tree Representation: for every chordal graph G there is a clique tree T = (K, E): the maximal cliques K of G are the vertices of T, and for each vertex v of G the cliques containing v induce a subtree of T [cf. Blair, Peyton 93]. Basic Idea of the Algorithm: the vertices of the clique tree T are complete subgraphs ⇒ at most one vertex of each clique can be in the knapsack. Process T in a DFS way.

Solving KCG on Chordal Graphs. Dynamic Programming Definition: f_d^v(I) is the solution with profit d and minimal weight containing item v ∈ I, considering only the subtree of the clique tree rooted in I. The definition is extended to v = ∅ (no item of I packed).

Solving KCG on Chordal Graphs. Vertex R with one (or first) child J. [figure: cliques R = {a,b,c} and J = {b,c,d,e}]
For v ∈ R:
if v ∈ R ∩ J (here {b,c}): f_d^v(R) = f_d^v(J)
else (here {a}): f_d^v(R) = w(v) + min_i { f_{d−p(v)}^i(J) : i ∈ (J \ R) ∪ {∅} }
f_d^∅(R) = min_i { f_d^i(J) : i ∈ (J \ R) ∪ {∅} }

Solving KCG on Chordal Graphs. Vertex R with second child J. [figure: cliques R = {a,b,c} and J = {a,f,g,h}]
For v ∈ R:
if v ∈ R ∩ J (here {a}):
f'_d^v(R) = min_k { f_k^v(R) + f_{d−k+p(v)}^v(J) }
f_d^v(R) = f'_d^v(R) − w(v)   (profit and weight of v would otherwise be counted twice)
else (here {b,c}):
f_d^v(R) = min_{i,k} { f_k^v(R) + f_{d−k}^i(J) : i ∈ (J \ R) ∪ {∅} }
f_d^∅(R) = min_{i,k} { f_k^∅(R) + f_{d−k}^i(J) : i ∈ (J \ R) ∪ {∅} }

Solving KCG on Chordal Graphs. Running Time: straightforward, O(n⁴P²); taking a closer look, O(n²P²). Space: O(n log n · P) with the space reduction technique! Storing Solution Sets (for all three algorithms): storing not only solution values but solution sets increases time and space by a factor of n (or log n for bit-encoding). This can be avoided by applying a general recursive divide-and-conquer technique (see Pferschy 99).

Deriving an FPTAS from Dynamic Programming. Scaling, the classical FPTAS approach for the knapsack problem: scale the profits to
p̃_j = ⌊ (n / (ε p_max)) · p_j ⌋.
Running time: n·P becomes, after scaling, at most n · n · p̃_max = n² · (n / (ε p_max)) · p_max = n³/ε.
The induced relative error can be bounded by ε.
FPTAS for KCG: the same scaling approach can be applied to all three KCG algorithms; the technical details are rather straightforward.
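The scaling step itself is a one-liner; a minimal sketch with hypothetical example profits:

```python
import math

def scale_profits(profits, eps):
    """Classical FPTAS scaling: p~_j = floor(n * p_j / (eps * p_max)).
    Each scaled profit is at most n/eps, so the scaled profit sum is at
    most n^2/eps and DP by profits becomes polynomial in n and 1/eps."""
    n = len(profits)
    p_max = max(profits)
    return [math.floor(n * p / (eps * p_max)) for p in profits]

print(scale_profits([30, 20, 10], 0.5))  # [6, 4, 2]
```

Running any of the three KCG dynamic programs on the scaled profits and keeping the original weights yields the (1−ε)-approximate solution.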

Minimum Spanning Tree with Conflict Graph (MSTCG). Problem Description. Given: a weighted graph H = (V, E, w) and a conflict graph G = (E, Ē) whose m vertices correspond uniquely to the edges in E; an edge ē = (i,j) ∈ Ē means a conflict between the two edges i, j ∈ E. Problem (MSTCG): find a minimum spanning tree T ⊆ E of H without conflicts w.r.t. G.

Minimum Spanning Tree with Conflict Graph (MSTCG). Question: for which type of conflict graph does MSTCG become NP-hard? Definition: a 2-ladder is a graph whose components are paths of length one; a 3-ladder is a graph whose components are paths of length two (imagine a rope ladder). [figures: a 2-ladder and a 3-ladder]

Minimum Spanning Tree with Conflict Graph (MSTCG). Our Complexity Results: MSTCG is polynomially solvable if the conflict graph G is a 2-ladder; MSTCG is strongly NP-hard if the conflict graph G is a 3-ladder.

MSTCG with a 2-ladder. Conflict-free Matroid: the subsets of edges NOT containing any conflicting pair define the conflict-free matroid
I := { E′ ⊆ E | ∀ (e,f) ∈ Ē : {e,f} ⊄ E′ }.
Note that in a 2-ladder the conflicting pairs are independent of each other. ⇒ MSTCG is the intersection of the graphic matroid (MST) with the conflict-free matroid. Matroid Intersection: MSTCG with a 2-ladder can be solved by Edmonds' weighted matroid intersection algorithm in polynomial time.
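A sketch of the independence oracle for I, with a hypothetical integer encoding of edges. Since the conflicting pairs of a 2-ladder are disjoint, I is a partition matroid: at most one edge from each pair, no restriction elsewhere.

```python
def is_conflict_free(edge_subset, conflict_pairs):
    """Independence oracle for the conflict-free matroid I: a set of
    edges is independent iff it contains no conflicting pair. For a
    2-ladder the pairs are disjoint, which is exactly what makes I a
    matroid (capacity 1 on each pair of a partition)."""
    s = set(edge_subset)
    return all(not (e in s and f in s) for e, f in conflict_pairs)

print(is_conflict_free({0, 2}, [(0, 1)]))  # True
print(is_conflict_free({0, 1}, [(0, 1)]))  # False
```

Such an oracle, together with one for the graphic matroid, is all that a weighted matroid intersection algorithm needs.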

MSTCG with a 3-ladder. Complexity Result: MSTCG with a 3-ladder is strongly NP-hard. Construction: reduction from the special case of 3-SAT where each variable occurs in at most 5 clauses (still NP-complete). Construct an instance of MST and a 3-ladder conflict graph as follows:

MSTCG with a 3-ladder. [figure: gadget with clause vertex C_1, edges g_11, h_11, f_11, e_11, literal edges x_1 and x̄_1, and auxiliary edges w_10, …, w_14 and z_11, …, z_14]
For every clause, e.g. C_1, a fork represents the three literals.
For every variable, two gadgets represent x_1 and x̄_1; edges w_1j, z_1j for the 5 clauses variable 1 appears in.
Spanning tree: the connection from C_1 to vertex 1 goes either via x_1 or via w_10, …, w_14; edge x_1 in the tree ⇔ x_1 = TRUE.

MSTCG with a 3-ladder. Conflict Graph G:
(x_1, x̄_1), (·_11, g_11), (z_11, w_11, f_11), (z_1j, w_1j, f_1j) for j = 2,3,4, (w_10, f_10)
Connecting C_1 to vertex 1 in the MST: choosing g_11 ⇒ f_11, w_11 forbidden; every other w_1j is either forbidden or chosen ⇒ z_1j forbidden ⇒ the only connection is via x_1. MST: exactly one edge g_ij for every clause j.

MSTCG with a 3-ladder. It can be shown: there is a truth assignment for a 3-SAT instance with k clauses ⇔ there is a conflict-free spanning tree with weight k. Conclusion: conflicts make optimization (and life in general) much more difficult ⇒ try to avoid conflicts whenever possible! Thank you for your attention!