Ch9: Exact Inference: Variable Elimination. Shimi Salant, Barak Sternberg
1 Ch9: Exact Inference: Variable Elimination Shimi Salant Barak Sternberg
2 Part 1
3 Reminder, introduction (1/3): We saw two ways to represent (finite, discrete) distributions via graphical data structures. A Bayesian Network G is a DAG; "P factorizes over G" means P(X1, ..., Xn) = Π_i P(Xi | Pa_G(Xi)), with one CPD per variable, and each edge is directed. A Markov Network H is an undirected graph; "P factorizes over H" means P(X1, ..., Xn) = (1/Z) Π_j φ_j(D_j), where the scope D_j of each factor φ_j is a clique in H. Example (the student network over D, I, G, S, L): P(D, I, G, S, L) = P(D) P(I) P(G | D, I) P(S | I) P(L | G).
4 Reminder, introduction (2/3): We reviewed the concept of separation in graphs and saw that we can query the graph itself to obtain independence assertions, all of which hold in the represented distribution. Bayesian Network: I(G) = { (X ⊥ Y | Z) : d-sep_G(X; Y | Z) }. Markov Network: I(H) = { (X ⊥ Y | Z) : sep_H(X; Y | Z) }. * We also mentioned that these are sound representations in the sense that there are "infinitely many more" distributions P satisfying the graph's independencies than distributions for which the graph captures exactly the independencies of P.
5 Reminder, introduction (3/3): We will now use these graphical data structures in order to: answer probabilistic queries such as P(Y | E = e) = ?; show that properties of the graphs determine upper bounds on the computational cost of answering a query, i.e. these properties provide a way to gauge/reduce costs; show algorithms that take these properties into consideration.
6 Definition of the inference task (1/2): Some general context, types of tasks we may wish to carry out with PGMs: inference: given a graph and factors/CPDs, find probabilities, e.g. P(Y | E = e) = ?; learning: given a graph and data, find factors/CPDs (namely, their parameters); structure learning: given variables and data, find the graph structure and factors/CPDs. * Extra characteristic of learning tasks: data might be partially observed, i.e. we're not given values for all variables.
7 Definition of the inference task (2/2): The exact inference task is defined as: given a fully parameterized BN or MN over variables X = {X1, ..., Xn}, compute P(Y | E = e), where Y ⊆ X are the query variables, E ⊆ X are the evidence/observed variables, and e is the evidence itself. E can be empty, in which case we're after P(Y). * Note that we're after a distribution: the size of the answer is the number of joint valuations of Y, which, to begin with, is exponential in |Y|.
8-13 Hardness of inference task (1/12): We first consider only BNs: P(Y | E = e) = ? For any specific valuation we can compute the joint directly: on the student network, P(d, i, g, s, l) = P(d) P(i) P(g | d, i) P(s | i) P(l | g), reading one entry from each CPT and multiplying. [Slides 8-13 step through this product on the example CPTs, one factor at a time.]
14 Hardness of inference task (2/12): Task: compute P(Y = y | E = e). Naive solution: sum out the joint. Denote W = X − Y − E. For each y, compute P(y, e) = Σ_w P(y, w, e); then compute P(e) = Σ_y P(y, e); then for each y set P(y | e) = P(y, e) / P(e). This entails summing over all valuations of W, i.e. work exponential in |W| for each y.
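The naive summation above can be sketched directly. Below is a minimal illustration, assuming a toy representation in which each CPD is a function from a full assignment to a probability (all names here are hypothetical, and variables are binary):

```python
import itertools

def enumerate_joint(cpds, order, query_var, evidence):
    """Brute-force P(query_var | evidence) by summing the full joint.

    `cpds` maps each variable to a function assignment -> P(var | parents),
    so the joint is the product of all CPD entries.
    Cost is exponential in the number of variables.
    """
    dist = {}
    for values in itertools.product([0, 1], repeat=len(order)):
        assignment = dict(zip(order, values))
        if any(assignment[v] != val for v, val in evidence.items()):
            continue  # inconsistent with the evidence: skip
        p = 1.0
        for var in order:
            p *= cpds[var](assignment)
        dist[assignment[query_var]] = dist.get(assignment[query_var], 0.0) + p
    z = sum(dist.values())  # this is P(evidence)
    return {val: p / z for val, p in dist.items()}

# Tiny chain A -> B: P(A=1) = 0.3, P(B=1 | A=1) = 0.9, P(B=1 | A=0) = 0.2
cpds = {
    "A": lambda a: 0.3 if a["A"] else 0.7,
    "B": lambda a: (0.9 if a["A"] else 0.2) if a["B"] else (0.1 if a["A"] else 0.8),
}
print(enumerate_joint(cpds, ["A", "B"], "A", {"B": 1}))  # P(A | B=1)
```

Even on this two-variable chain, the loop visits every joint valuation; the rest of the chapter is about avoiding exactly that.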
15 Hardness of inference task (3/12): Task: compute P(Y = y | E = e). Any solution: we show the task is NP-hard. Reminder, the 3-SAT decision problem: given a formula φ = C1 ∧ ... ∧ Cm over binary variables q1, ..., qn, where every clause Ci is a disjunction of 3 literals [example: C1 = (q1 ∨ ¬q2 ∨ q3)], decide whether there is a satisfying assignment to φ. 3-SAT is in NP: there is a polynomial-time verifier algorithm that, for a given problem instance and an assignment, checks whether the assignment proves that φ is satisfiable. 3-SAT is NP-hard: every other problem in NP can be reduced to 3-SAT.
16 Hardness of inference task (4/12): Consider the following tasks, each a sub-case of the one before it: (1) P(Y | E = e) = ? (E can be empty); (2) P(Y = y | E = e) = ?; (3) P(X = x) = ? (E is empty and Y is a single variable X); (4) P(X = x) > 0? (a decision problem). If (4) is NP-hard, the original inference task (1) certainly is. The BN-Positive problem: given a BN B, a variable X, and a value x, decide whether P_B(X = x) > 0.
17 Hardness of inference task (5/12): The BN-Positive problem: given a BN B, a variable X, and a value x, decide whether P_B(X = x) > 0. BN-Positive is NP-hard: we next show a reduction from 3-SAT to BN-Positive. (Also, BN-Positive is in NP: given as a candidate proof a valuation ξ of all variables in which X = x, we can compute P(ξ) and check it for positivity; i.e. BN-Positive is NP-complete.)
18 Hardness of inference task (6/12): Given a 3-SAT formula φ = C1 ∧ ... ∧ Cm, we construct a BN (in polynomial time): for each formula variable qi, a binary variable Qi with P(Qi = 1) = 0.5; for each formula clause Ci, a deterministic binary variable Ci such that Ci = 1 iff Ci is satisfied by the values of its Q variables; and a layer of deterministic binary AND variables culminating in a variable X, such that X = 1 iff all clause variables are valued 1.
19 Hardness of inference task (7/12): If φ is not satisfiable, no assignment to {Qi} can yield X = 1, hence P(X = 1) = 0. If φ is satisfiable, some assignment to {Qi} yields X = 1, hence P(X = 1) > 0. I.e., if we can answer whether P(X = 1) > 0, we can solve 3-SAT: the BN-Positive problem (4) is NP-hard, and therefore our initial inference problem (1) is NP-hard for BNs.
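The reduction can be made concrete. The sketch below does not build the BN explicitly; it evaluates, by brute force, the probability P(X = 1) that the constructed BN would assign (uniform variable nodes, deterministic OR clause nodes, an AND chain), so it returns a positive value exactly when the formula is satisfiable. Encoding clauses as sign-carrying integers is an assumption of this sketch:

```python
from itertools import product

def bn_positive_reduction(clauses, n):
    """Evaluate P(X = 1) in the BN constructed from a 3-SAT formula.

    `clauses` is a list of 3-tuples of nonzero ints (negative = negated);
    `n` is the number of formula variables. Each variable node Qi has
    P(Qi = 1) = 0.5, so every assignment has probability 2^-n; the clause
    nodes are deterministic ORs and the AND chain outputs 1 iff all hold.
    """
    p_x1 = 0.0
    for vals in product((0, 1), repeat=n):
        sat = all(any((vals[abs(l) - 1] == 1) == (l > 0) for l in c)
                  for c in clauses)
        if sat:
            p_x1 += 0.5 ** n
    return p_x1

# (q1 v q2 v q3) & (~q1 v ~q2 v q3): satisfiable, so P(X = 1) > 0
print(bn_positive_reduction([(1, 2, 3), (-1, -2, 3)], 3) > 0)
```

This also previews the counting remark coming up: the value returned is (number of satisfying assignments) / 2^n, which is why exact inference lands in the counting class #P rather than merely NP.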
20 Hardness of inference task (8/12): For MNs: we can easily translate a BN into an MN modeling the same distribution (hence inference for MNs is NP-hard too). Given a BN where P = Π_i P(Xi | Pa(Xi)), we construct an MN by using the CPDs as factors, φ_i(Xi, Pa(Xi)) = P(Xi | Pa(Xi)), with Z = 1, and by translating directed edges to undirected ones (we also need to add edges between all parents of each node; the cost of the addition is polynomial).
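The graph part of this translation (moralization) can be sketched in a few lines, assuming the BN structure is given as a dict mapping each variable to its list of parents:

```python
from itertools import combinations

def moralize(parents):
    """Build the undirected (moral) graph of a BN: drop edge directions
    and connect ("marry") all co-parents of each node."""
    edges = set()
    for child, pa in parents.items():
        for p in pa:                      # parent-child edges, undirected
            edges.add(frozenset((p, child)))
        for u, v in combinations(pa, 2):  # marry co-parents
            edges.add(frozenset((u, v)))
    return edges

# Student-network structure: G has parents D and I
parents = {"D": [], "I": [], "G": ["D", "I"], "S": ["I"], "L": ["G"]}
moral = moralize(parents)
print(frozenset(("D", "I")) in moral)  # True: co-parents D and I get connected
```

The added D-I edge is needed because the factor P(G | D, I) has all three variables in its scope, and in an MN every factor scope must be a clique.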
21 Hardness of inference task (9/12): * Extra note: BN-Positive is a decision problem, i.e. of the form "are there any solutions" (in the BN above: P(X = 1) > 0?, i.e. is there an event in the event space, all possible valuations of all variables, in which X = 1 and which has positive probability). The general inference task is of the form "how many solutions are there" (in the BN above: knowing how many such events there are, we would divide this amount by the size of the event space to obtain the probability P(X = 1)). It turns out the inference task belongs to an even harder class of problems, #P-hard, which consists of such counting problems.
22 Hardness of inference task (10/12): Approximate inference: obtain an approximation ρ for P(y | e). def: ρ has absolute error ε if |P(y | e) − ρ| ≤ ε. def: ρ has relative error ε if P(y | e) is between ρ/(1 + ε) and ρ(1 + ε). Absolute error can be too coarse a criterion: when the true probability is tiny, an absolute-error guarantee may say nothing useful, whereas a relative-error guarantee bounds the answer to within a multiplicative factor.
23 Hardness of inference task (11/12): If we could obtain such a relative-error approximation for some ε > 0, then ρ > 0 iff P(y | e) > 0, so we could solve BN-Positive, which is NP-hard: relative-error inference is NP-hard.
24 Hardness of inference task (12/12): For absolute-error inference the situation is more specific. For the case where no evidence is present, there exists a randomized polynomial-time algorithm achieving any fixed absolute error ε < 1/2. For the case where evidence is present, achieving absolute error ε < 1/2 is NP-hard. Namely, any useful absolute-error inference is NP-hard when evidence is present.
25 End of part 1
27 VE algorithm complexity: Sum-Product-VE runs in O((n + m) · N), where n is the number of variables, m is the number of initial factors in the factor set Φ, and N is the size (number of entries) of the largest intermediate factor created throughout the run.
29 def 9.5: For a factor set Φ (over variables X) and an ordering ≺ (over the eliminated variables), we define the induced graph I_{Φ,≺} as an undirected graph over X with an edge between Xi and Xj whenever Xi and Xj appear together in the scope of some intermediate factor when Sum-Product-VE(Φ, ≺) is executed. theorem 9.6: For Φ, ≺ and their induced graph I_{Φ,≺}: (a) the scope of every intermediate factor is a clique in I_{Φ,≺}; (b) every maximal clique in I_{Φ,≺} is the scope of some intermediate factor. The 1st statement means: the size of the largest clique in I_{Φ,≺} bounds the size of the largest intermediate factor. The 2nd statement means: the bound is tight, it is actually encountered.
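A minimal Sum-Product-VE sketch for binary variables, with each factor represented as a (scope tuple, table dict) pair; this is an illustration of the algorithm, not the book's pseudocode:

```python
from itertools import product

def multiply(f, g):
    """Factor product; a factor is (scope_tuple, dict assignment->value)."""
    fs, ft = f
    gs, gt = g
    scope = fs + tuple(v for v in gs if v not in fs)
    table = {}
    for vals in product([0, 1], repeat=len(scope)):
        a = dict(zip(scope, vals))
        table[vals] = ft[tuple(a[v] for v in fs)] * gt[tuple(a[v] for v in gs)]
    return scope, table

def marginalize(f, var):
    """Sum `var` out of factor f."""
    fs, ft = f
    scope = tuple(v for v in fs if v != var)
    i = fs.index(var)
    table = {}
    for vals, p in ft.items():
        key = vals[:i] + vals[i + 1:]
        table[key] = table.get(key, 0.0) + p
    return scope, table

def sum_product_ve(factors, order):
    """Eliminate the variables in `order` one by one: multiply all factors
    mentioning the variable, sum it out, and put the result back."""
    factors = list(factors)
    for z in order:
        relevant = [f for f in factors if z in f[0]]
        factors = [f for f in factors if z not in f[0]]
        psi = relevant[0]
        for f in relevant[1:]:
            psi = multiply(psi, f)
        factors.append(marginalize(psi, z))
    result = factors[0]
    for f in factors[1:]:
        result = multiply(result, f)
    return result

# P(B) on a chain A -> B: multiply phi_A and phi_B, then eliminate A
phi_A = (("A",), {(0,): 0.7, (1,): 0.3})
phi_B = (("A", "B"), {(0, 0): 0.8, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.9})
scope, table = sum_product_ve([phi_A, phi_B], ["A"])
print(scope, table)  # ('B',) with P(B=1) = 0.3*0.9 + 0.7*0.2 = 0.41
```

Note how the scope computed in `multiply` is exactly what the induced-graph definition tracks: every pair of variables that lands in one scope together becomes an edge of I_{Φ,≺}.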
31 Part 3
32 Finding elimination orderings (1/12): We have reached the problem of finding an optimal ordering for VE. Considering a network with factor set Φ, we saw that each ordering ≺ yields an induced graph I_{Φ,≺} whose largest clique size appears as an exponent in the run-time bound. We now consider the ordering problem in purely graphical terms: we can work with the induced graph of the network's graph (instead of its factor set Φ), since the induced graph depends only on which variables appear together in factors, i.e. only on the structure of the network's graph.
33-38 Finding elimination orderings (worked example): Compute P(L) by running Sum-Product-VE on the student network's factors, eliminating the variables one at a time. Each elimination multiplies the factors mentioning the variable and sums it out, creating an intermediate factor; in the induced graph this connects all of the eliminated variable's remaining neighbors. Most steps here add edges that were already present in the original graph, but one step creates an intermediate factor over variables that were not connected, adding a fill-edge to the induced graph (an edge that wasn't present in the original graph). [Slides 33-38 step through the eliminations, showing each intermediate factor's size and the edges added.]
39 Finding elimination orderings (2/12): def: the induced width of an ordering ≺ is the size of the largest clique in the induced graph I_≺, minus 1 (the −1 is a normalization convention). def: the tree-width of a graph is its minimal induced width over all orderings, min_≺ (induced width of ≺), i.e. the size of the smallest largest-clique we can hope for in an induced graph of the graph, minus 1. The ordering problem is now: find ≺* = argmin_≺ (size of the largest clique in I_≺).
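The induced width of a given ordering can be measured by simulating elimination on the graph alone, with no factors involved. A sketch, run here on the moralized student network (an assumed example graph):

```python
def induced_width(neighbors, order):
    """Simulate elimination on an undirected graph: eliminating X connects
    all of X's current neighbors (fill edges). Returns the induced width
    (size of the largest clique formed, minus 1) for the given ordering."""
    adj = {v: set(ns) for v, ns in neighbors.items()}
    width = 0
    for x in order:
        ns = adj[x]
        width = max(width, len(ns))  # clique {X} ∪ neighbors has |ns|+1 nodes
        for u in ns:
            adj[u] |= ns - {u}       # add fill edges among the neighbors
            adj[u].discard(x)
        del adj[x]
    return width

# Moralized student graph
g = {"D": {"I", "G"}, "I": {"D", "G", "S"}, "G": {"D", "I", "L"},
     "S": {"I"}, "L": {"G"}}
print(induced_width(g, ["D", "I", "S", "L", "G"]))  # a good ordering
print(induced_width(g, ["G", "D", "I", "S", "L"]))  # eliminating G first is worse
```

The two calls return different widths (2 versus 3), which is the whole point: the ordering alone determines the exponent in the VE run-time bound.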
40 Finding elimination orderings (3/12): (1) ≺* = argmin_≺ (induced width of ≺) = ? (find an optimal ordering); (2) min_≺ (induced width of ≺) = ? (find the minimal induced width, i.e. the tree-width); (3) for a given k ∈ ℕ, is the tree-width ≤ k? (decide whether the tree-width is at most k). A theorem from graph theory: (3) is NP-complete. Hence (2) is NP-hard, (1) is NP-hard, and we have now shown that finding an optimal ordering is NP-hard. We will now translate the ordering problem into a different graphical problem, thereby yielding an ordering algorithm for a certain type of graphs.
41 Finding elimination orderings (4/12): def: A graph (directed or undirected) is chordal if every loop of length greater than 3 has a chord. Such a graph is also called triangulated: all polygons are divided into triangles.
42 Finding elimination orderings (5/12): We next show that the class of induced graphs is exactly the class of chordal graphs. * Remark: this is also the class of graphs for which there are perfect I-maps, i.e. for a graph in this class, I(graph) = I(P) for some distribution P that factorizes over the graph.
43 Finding elimination orderings (6/12): Induced ⇒ chordal: Assume the induced graph is not chordal: it has a chordless loop X1-X2-...-Xk-X1 with k ≥ 4. Assume WLOG that X1 was eliminated first among the loop's variables. After line 3 of Sum-Product-Eliminate-Var (the step that multiplies all factors containing X1 and sums X1 out), no more edges incident to X1 are added. But when line 3 runs, X1 appears in some factor with X2 and in some factor with Xk, so X2 and Xk appear together in the resulting factor of line 3: the induced graph has the edge X2-Xk, a chord. Contradiction.
44-45 Finding elimination orderings (7-8/12): Chordal ⇒ induced: def: For an undirected graph H over X, a tree T is a clique tree for H if: every node of the tree is a clique in the graph; every maximal clique in the graph is a node of the tree; and for each edge C1-C2 in the tree, the intersection C1 ∩ C2 separates, in H, the variables appearing on C1's side of the tree from those appearing on C2's side.
46 Finding elimination orderings (9/12): Not every graph has a clique tree, but every chordal graph has a clique tree. A property of clique trees: every leaf of the tree has a variable present only there.
47 Finding elimination orderings (10/12): Chordal ⇒ induced: We show that if H is chordal then H = I_≺ for some ordering ≺. Claim: all variables of a chordal graph over n variables can be eliminated without introducing fill-edges. Induction on n: H is chordal, so it has a clique tree; the clique tree has a leaf with a variable X present only in that leaf node. Such an X can be eliminated from H without introducing fill-edges: X is present only in that leaf clique, so all of X's neighbors are in it; it is a clique, so all of X's neighbors are already connected to each other. Removing X from H, we are left with a chordal graph over n − 1 variables.
48-55 Finding elimination orderings (example): For a chordal graph, repeatedly eliminating a variable that appears only in a leaf of the (current) clique tree reproduces the graph as its own induced graph, with no fill-edges added. [Slides 48-55 step through such an elimination on an example graph, its clique tree, and the resulting induced graph.]
56 Finding elimination orderings (11/12): For a chordal graph we have an algorithm that does not produce fill-edges: Max-Cardinality produces an ordering consistent with always choosing a variable which is present only in a leaf of the (current) graph's clique tree, and only in that leaf. Meaning: Max-Cardinality solves the ordering problem for chordal graphs.
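Max-Cardinality can be sketched in a few lines; on a chordal graph, the reverse of its visit order is a fill-free elimination ordering. A sketch, with the graph given as an adjacency dict:

```python
def max_cardinality_order(neighbors):
    """Maximum Cardinality Search: repeatedly pick the unnumbered node with
    the most already-numbered neighbors. On a chordal graph, eliminating in
    the *reverse* of the visit order introduces no fill edges."""
    unnumbered = set(neighbors)
    score = {v: 0 for v in neighbors}
    order = []
    while unnumbered:
        x = max(unnumbered, key=lambda v: score[v])
        order.append(x)
        unnumbered.remove(x)
        for u in neighbors[x]:
            if u in unnumbered:
                score[u] += 1
    return order[::-1]  # elimination order

# A chordal graph: the 4-cycle A-B-D-C with chord B-C
g_chordal = {"A": {"B", "C"}, "B": {"A", "C", "D"},
             "C": {"A", "B", "D"}, "D": {"B", "C"}}
print(max_cardinality_order(g_chordal))  # a fill-free elimination ordering
```

Ties in the `max` are broken arbitrarily; any tie-breaking still yields a fill-free ordering on a chordal graph, which is the property the slide relies on.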
57 Finding elimination orderings (12/12): The general ordering problem therefore reduces to a well-known problem from graph theory. Minimal-Triangulation: given a graph H, find a chordal graph H' containing H such that H''s largest clique is as small as possible. There are different graph-theoretic algorithms addressing Minimal-Triangulation, offering different levels of performance guarantees.
58 Greedy search for elimination ordering (1/3): We saw that the inference problem P(Y | E = e) = ? is NP-hard. Moreover, even finding an optimal elimination ordering for running Sum-Product-VE(Φ, ≺) is NP-hard. A practical approach is therefore to search the ordering space for a sub-optimal yet as-good-as-possible ordering. We can search this space greedily, each time choosing a variable whose elimination will result in the least cost (for some definition of cost) given the current state of the induced graph. We do not have formal complexity guarantees for this approach, yet it works surprisingly well in practice.
59 Greedy search for elimination ordering (2/3): * note: the metrics are evaluated on the current induced graph. Possible evaluation metrics for eliminating X:
- min-neigh: the number of neighbors of X
- min-neigh-weight: the product of the cardinalities of X's neighbors
- min-fill: the number of fill-edges added if X is eliminated
- min-fill-weight: the sum of the weights of those fill-edges, where the weight of a fill-edge is the product of the cardinalities of the edge's nodes
60 Greedy search for elimination ordering (3/3): None of these metrics is universally better than the others. A variant of the greedy approach introduces stochasticity by not always deterministically selecting the variable that minimizes the metric; for example, upon each elimination round, select from a random half of the remaining variables. This serves to introduce exploration into the ordering search (rather than having it be fully exploitative). Note that the greedy algorithm runs in polynomial time, and from its ordering it can compute the number of operations the VE run itself will execute. A suggested practice for large networks, where such pre-computation time is negligible, is to execute the greedy algorithm multiple times and use the best ordering obtained.
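A greedy min-fill search, including the stochastic half-sampling variant the slide describes, can be sketched as follows (cardinalities are ignored here, so only the edge-count metric is shown; names are this sketch's own):

```python
import random

def fill_count(adj, x):
    """Number of fill edges that eliminating x would add."""
    ns = list(adj[x])
    return sum(1 for i in range(len(ns)) for j in range(i + 1, len(ns))
               if ns[j] not in adj[ns[i]])

def greedy_order(neighbors, metric=fill_count, stochastic=False):
    """Greedy elimination ordering: repeatedly eliminate the variable that
    minimizes `metric` on the current (induced) graph. With `stochastic`,
    choose only among a random half of the candidates, for exploration."""
    adj = {v: set(ns) for v, ns in neighbors.items()}
    order = []
    while adj:
        pool = list(adj)
        if stochastic and len(pool) > 1:
            pool = random.sample(pool, (len(pool) + 1) // 2)
        x = min(pool, key=lambda v: metric(adj, v))
        order.append(x)
        ns = adj.pop(x)
        for u in ns:
            adj[u] |= ns - {u}   # fill edges from this elimination
            adj[u].discard(x)
    return order

# Moralized student graph
g_student = {"D": {"I", "G"}, "I": {"D", "G", "S"}, "G": {"D", "I", "L"},
             "S": {"I"}, "L": {"G"}}
print(greedy_order(g_student))  # zero-fill variables get eliminated first
```

Because the greedy loop itself maintains the induced graph, the sizes of the cliques it creates directly give the operation count of the eventual VE run, which is what makes the "run it several times, keep the best" practice cheap.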
62 *extra material
63 Conditioning (1/6): Consider MNs: P_Φ(X) = (1/Z) Π_j φ_j. Sum-Product-VE(Φ, ≺) returned a factor ψ over the query variables Y such that ψ(Y) = Σ_w Π_j φ_j, the unnormalized marginal; if P_Φ(Y) is needed, renormalize: P_Φ(Y) = ψ(Y) / Σ_y ψ(y).
64 Conditioning (2/6): We used the ability to perform a sum-product calculation to obtain conditional probabilities. Cond-Prob-VE calculates P_Φ(Y | e): it reduces every factor by the evidence (replacing each φ by φ[E = e]), runs Sum-Product-VE on the reduced factor set, and returns a factor ψ over Y together with a scalar α = Σ_y ψ(y), such that P_Φ(Y | e) = ψ(Y) / α, and α itself is proportional to P_Φ(e).
65 Conditioning (3/6): Calculating conditional probabilities P(Y | E = e) can reduce the sizes of the encountered intermediate factors, since evidence variables can render other variables independent.
66 Conditioning (4/6): Example: the same network as before, with evidence I = i (using Cond-Prob-VE). Compute P(L | I = i): reduce each factor containing I to its I = i slice, then run Sum-Product-VE on the reduced factor set.
67-70 Conditioning (worked example): [Slides 67-70 step through the reduced elimination: the intermediate factors are smaller than in the unconditioned run, and one elimination produces a scalar dependent only on the evidence value i.]
71 Conditioning (5/6): So computing P_Φ(Y | e) can introduce smaller intermediate factors compared to the computation of P_Φ(Y). We can exploit this even without evidence, building upon Cond-Prob-VE, our ability to calculate P_Φ(Y | u) and (an unnormalized) P_Φ(u): choose some variable U; for every value u, compute P_Φ(Y | u) together with its scalar proportional to P_Φ(u); then combine, P_Φ(Y) = Σ_u P_Φ(Y | u) P_Φ(u), renormalizing the per-value scalars. Computing P_Φ(Y) as such a sum over VE runs is called conditioning (on U).
72 Conditioning (6/6): Conditioning does no less work than the ordinary VE we've seen so far; it generally does more, by running VE many times. But the maximal intermediate factor ever encountered over the VE runs can be smaller, which is a necessity if we cannot hold very large intermediate factors in memory.
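The conditioning scheme can be sketched end-to-end. For brevity the inner "VE run" here is brute-force enumeration, which keeps the sketch self-contained; the factor representation and names are this sketch's own assumptions:

```python
from itertools import product

def joint_value(factors, assignment):
    """Unnormalized measure: product of all factor entries at `assignment`.
    A factor is a (scope_tuple, table dict) pair over binary variables."""
    p = 1.0
    for scope, table in factors:
        p *= table[tuple(assignment[v] for v in scope)]
    return p

def conditioning(factors, variables, u):
    """Conditioning on variable `u`: for each value of u, sum out the
    remaining variables with u fixed, then combine the per-value totals.
    Each inner sum stands in for a VE run on a simpler, reduced network;
    here it is brute force purely for illustration."""
    rest = [v for v in variables if v != u]
    totals = {}
    for val in (0, 1):
        s = 0.0
        for vals in product((0, 1), repeat=len(rest)):
            a = dict(zip(rest, vals))
            a[u] = val
            s += joint_value(factors, a)
        totals[val] = s           # unnormalized P(u = val)
    z = sum(totals.values())
    return {val: t / z for val, t in totals.items()}  # P(u)
```

In a real implementation each inner sum would be a VE run over the factor set reduced by U = u; fixing U can break the network apart, so each run's largest intermediate factor can be much smaller than in a single unconditioned VE run, at the cost of repeating the run once per value of U.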
NP Completeness Andreas Klappenecker [partially based on slides by Jennifer Welch] Overview We already know the following examples of NPC problems: SAT 3SAT We are going to show that the following are
More informationApproximation Algorithms
Approximation Algorithms Given an NP-hard problem, what should be done? Theory says you're unlikely to find a poly-time algorithm. Must sacrifice one of three desired features. Solve problem to optimality.
More information6 : Factor Graphs, Message Passing and Junction Trees
10-708: Probabilistic Graphical Models 10-708, Spring 2018 6 : Factor Graphs, Message Passing and Junction Trees Lecturer: Kayhan Batmanghelich Scribes: Sarthak Garg 1 Factor Graphs Factor Graphs are graphical
More informationAn IPS for TQBF Intro to Approximability
An IPS for TQBF Intro to Approximability Outline Proof that TQBF (the language of true (valid) quantified Boolean formulas) is in the class IP Introduction to approximation algorithms for NP optimization
More informationFixed-Parameter Algorithms, IA166
Fixed-Parameter Algorithms, IA166 Sebastian Ordyniak Faculty of Informatics Masaryk University Brno Spring Semester 2013 Introduction Outline 1 Introduction Algorithms on Locally Bounded Treewidth Layer
More informationFramework for Design of Dynamic Programming Algorithms
CSE 441T/541T Advanced Algorithms September 22, 2010 Framework for Design of Dynamic Programming Algorithms Dynamic programming algorithms for combinatorial optimization generalize the strategy we studied
More informationApproximation Algorithms
Approximation Algorithms Prof. Tapio Elomaa tapio.elomaa@tut.fi Course Basics A 4 credit unit course Part of Theoretical Computer Science courses at the Laboratory of Mathematics There will be 4 hours
More informationP = NP; P NP. Intuition of the reduction idea:
1 Polynomial Time Reducibility The question of whether P = NP is one of the greatest unsolved problems in the theoretical computer science. Two possibilities of relationship between P and N P P = NP; P
More informationConflict Graphs for Combinatorial Optimization Problems
Conflict Graphs for Combinatorial Optimization Problems Ulrich Pferschy joint work with Andreas Darmann and Joachim Schauer University of Graz, Austria Introduction Combinatorial Optimization Problem CO
More informationColoring 3-Colorable Graphs
Coloring -Colorable Graphs Charles Jin April, 015 1 Introduction Graph coloring in general is an etremely easy-to-understand yet powerful tool. It has wide-ranging applications from register allocation
More informationarxiv: v2 [cs.cc] 29 Mar 2010
On a variant of Monotone NAE-3SAT and the Triangle-Free Cut problem. arxiv:1003.3704v2 [cs.cc] 29 Mar 2010 Peiyush Jain, Microsoft Corporation. June 28, 2018 Abstract In this paper we define a restricted
More informationSemi-Independent Partitioning: A Method for Bounding the Solution to COP s
Semi-Independent Partitioning: A Method for Bounding the Solution to COP s David Larkin University of California, Irvine Abstract. In this paper we introduce a new method for bounding the solution to constraint
More informationSolution for Homework set 3
TTIC 300 and CMSC 37000 Algorithms Winter 07 Solution for Homework set 3 Question (0 points) We are given a directed graph G = (V, E), with two special vertices s and t, and non-negative integral capacities
More informationPart II. C. M. Bishop PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 8: GRAPHICAL MODELS
Part II C. M. Bishop PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 8: GRAPHICAL MODELS Converting Directed to Undirected Graphs (1) Converting Directed to Undirected Graphs (2) Add extra links between
More informationIntroduction to Algorithms / Algorithms I Lecturer: Michael Dinitz Topic: Approximation algorithms Date: 11/18/14
600.363 Introduction to Algorithms / 600.463 Algorithms I Lecturer: Michael Dinitz Topic: Approximation algorithms Date: 11/18/14 23.1 Introduction We spent last week proving that for certain problems,
More informationNecessary edges in k-chordalizations of graphs
Necessary edges in k-chordalizations of graphs Hans L. Bodlaender Abstract In this note, we look at which edges must always be added to a given graph G = (V, E), when we want to make it a chordal graph
More informationMa/CS 6b Class 26: Art Galleries and Politicians
Ma/CS 6b Class 26: Art Galleries and Politicians By Adam Sheffer The Art Gallery Problem Problem. We wish to place security cameras at a gallery, such that they cover it completely. Every camera can cover
More informationOn Structural Parameterizations of the Matching Cut Problem
On Structural Parameterizations of the Matching Cut Problem N. R. Aravind, Subrahmanyam Kalyanasundaram, and Anjeneya Swami Kare Department of Computer Science and Engineering, IIT Hyderabad, Hyderabad,
More informationCS242: Probabilistic Graphical Models Lecture 3: Factor Graphs & Variable Elimination
CS242: Probabilistic Graphical Models Lecture 3: Factor Graphs & Variable Elimination Instructor: Erik Sudderth Brown University Computer Science September 11, 2014 Some figures and materials courtesy
More informationBayesian Networks Inference
Bayesian Networks Inference Machine Learning 10701/15781 Carlos Guestrin Carnegie Mellon University November 5 th, 2007 2005-2007 Carlos Guestrin 1 General probabilistic inference Flu Allergy Query: Sinus
More informationHW Graph Theory SOLUTIONS (hbovik) - Q
1, Diestel 9.3: An arithmetic progression is an increasing sequence of numbers of the form a, a+d, a+ d, a + 3d.... Van der Waerden s theorem says that no matter how we partition the natural numbers into
More informationNOTE ON MINIMALLY k-connected GRAPHS
NOTE ON MINIMALLY k-connected GRAPHS R. Rama a, Suresh Badarla a a Department of Mathematics, Indian Institute of Technology, Chennai, India ABSTRACT A k-tree is either a complete graph on (k+1) vertices
More informationMC 302 GRAPH THEORY 10/1/13 Solutions to HW #2 50 points + 6 XC points
MC 0 GRAPH THEORY 0// Solutions to HW # 0 points + XC points ) [CH] p.,..7. This problem introduces an important class of graphs called the hypercubes or k-cubes, Q, Q, Q, etc. I suggest that before you
More informationTheorem 2.9: nearest addition algorithm
There are severe limits on our ability to compute near-optimal tours It is NP-complete to decide whether a given undirected =(,)has a Hamiltonian cycle An approximation algorithm for the TSP can be used
More information31.6 Powers of an element
31.6 Powers of an element Just as we often consider the multiples of a given element, modulo, we consider the sequence of powers of, modulo, where :,,,,. modulo Indexing from 0, the 0th value in this sequence
More informationFMA901F: Machine Learning Lecture 6: Graphical Models. Cristian Sminchisescu
FMA901F: Machine Learning Lecture 6: Graphical Models Cristian Sminchisescu Graphical Models Provide a simple way to visualize the structure of a probabilistic model and can be used to design and motivate
More informationComplexity. Congestion Games. Algorithmic Game Theory. Alexander Skopalik Algorithmic Game Theory 2013 Congestion Games
Algorithmic Game Theory Complexity of pure Nash equilibria We investigate the complexity of finding Nash equilibria in different kinds of congestion games. Our study is restricted to congestion games with
More informationChordal graphs MPRI
Chordal graphs MPRI 2017 2018 Michel Habib habib@irif.fr http://www.irif.fr/~habib Sophie Germain, septembre 2017 Schedule Chordal graphs Representation of chordal graphs LBFS and chordal graphs More structural
More informationFINAL EXAM SOLUTIONS
COMP/MATH 3804 Design and Analysis of Algorithms I Fall 2015 FINAL EXAM SOLUTIONS Question 1 (12%). Modify Euclid s algorithm as follows. function Newclid(a,b) if a
More informationDr. Amotz Bar-Noy s Compendium of Algorithms Problems. Problems, Hints, and Solutions
Dr. Amotz Bar-Noy s Compendium of Algorithms Problems Problems, Hints, and Solutions Chapter 1 Searching and Sorting Problems 1 1.1 Array with One Missing 1.1.1 Problem Let A = A[1],..., A[n] be an array
More informationUnit 8: Coping with NP-Completeness. Complexity classes Reducibility and NP-completeness proofs Coping with NP-complete problems. Y.-W.
: Coping with NP-Completeness Course contents: Complexity classes Reducibility and NP-completeness proofs Coping with NP-complete problems Reading: Chapter 34 Chapter 35.1, 35.2 Y.-W. Chang 1 Complexity
More informationApproximation Algorithms
Chapter 8 Approximation Algorithms Algorithm Theory WS 2016/17 Fabian Kuhn Approximation Algorithms Optimization appears everywhere in computer science We have seen many examples, e.g.: scheduling jobs
More informationGraphs and Discrete Structures
Graphs and Discrete Structures Nicolas Bousquet Louis Esperet Fall 2018 Abstract Brief summary of the first and second course. É 1 Chromatic number, independence number and clique number The chromatic
More informationSTA 4273H: Statistical Machine Learning
STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Statistics! rsalakhu@utstat.toronto.edu! http://www.utstat.utoronto.ca/~rsalakhu/ Sidney Smith Hall, Room 6002 Lecture 5 Inference
More informationPATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 8: GRAPHICAL MODELS
PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 8: GRAPHICAL MODELS Bayesian Networks Directed Acyclic Graph (DAG) Bayesian Networks General Factorization Bayesian Curve Fitting (1) Polynomial Bayesian
More informationFinite Model Generation for Isabelle/HOL Using a SAT Solver
Finite Model Generation for / Using a SAT Solver Tjark Weber webertj@in.tum.de Technische Universität München Winterhütte, März 2004 Finite Model Generation for / p.1/21 is a generic proof assistant: Highly
More informationV,T C3: S,L,B T C4: A,L,T A,L C5: A,L,B A,B C6: C2: X,A A
Inference II Daphne Koller Stanford University CS228 Handout #13 In the previous chapter, we showed how efficient inference can be done in a BN using an algorithm called Variable Elimination, that sums
More informationDecision Problems. Observation: Many polynomial algorithms. Questions: Can we solve all problems in polynomial time? Answer: No, absolutely not.
Decision Problems Observation: Many polynomial algorithms. Questions: Can we solve all problems in polynomial time? Answer: No, absolutely not. Definition: The class of problems that can be solved by polynomial-time
More informationMini-Buckets: A General Scheme for Generating Approximations in Automated Reasoning
Mini-Buckets: A General Scheme for Generating Approximations in Automated Reasoning Rina Dechter* Department of Information and Computer Science University of California, Irvine dechter@ics. uci. edu Abstract
More information