CS242: Probabilistic Graphical Models Lecture 2B: Loopy Belief Propagation & Junction Trees
Slide 1: CS242: Probabilistic Graphical Models, Lecture 2B: Loopy Belief Propagation & Junction Trees. Professor Erik Sudderth, Brown University Computer Science, September 22, 2016. Some figures and materials courtesy of David Barber, Bayesian Reasoning and Machine Learning.
Slide 2: CS242 Lecture 2B Outline
- Elimination Algorithms & Clique Trees
- Junction Tree Algorithm
- Loopy Belief Propagation
[Figure: elimination clique tree for an example graph over variables $x_1$ through $x_6$.]
Slide 3: Undirected Graphical Models. Notation: $V$ is a set of $N$ nodes or vertices, $\{1, 2, \ldots, N\}$; $x_s$ is the random variable associated with node $s$; $E$ is the set of undirected edges $(s, t)$ linking pairs of nodes; $C$ is the set of cliques (fully connected subsets) of nodes. An undirected Markov random field factorizes as
$$p(x) = \frac{1}{Z} \prod_{c \in C} \psi_c(x_c), \qquad \psi_c(x_c) \geq 0, \qquad Z = \sum_x \prod_{c \in C} \psi_c(x_c),$$
where each $\psi_c(x_c)$ is an arbitrary non-negative potential function, and the normalization constant (or partition function) $Z > 0$ is defined as a sum over all possible joint states.
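The factorization and partition function above can be computed directly on a tiny model. A minimal sketch, assuming a made-up binary MRF with two pairwise cliques (the potential tables `psi_12` and `psi_23` are illustrative, not from the lecture):

```python
from itertools import product

# Example potential tables for cliques {x1, x2} and {x2, x3}.
psi_12 = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 2.0}
psi_23 = {(0, 0): 3.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}

def unnormalized(x1, x2, x3):
    # Product of clique potentials for one joint configuration.
    return psi_12[(x1, x2)] * psi_23[(x2, x3)]

# Z sums the product of potentials over all 2^3 joint states.
Z = sum(unnormalized(*x) for x in product([0, 1], repeat=3))

def p(x1, x2, x3):
    # Normalized joint probability of one configuration.
    return unnormalized(x1, x2, x3) / Z
```

Brute-force enumeration like this is exponential in the number of variables, which is exactly what the elimination and junction tree algorithms below avoid.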
Slide 4: Clique-Based Inference Algorithms. Starting from $p(x) = \frac{1}{Z} \prod_{c \in C} \psi_c(x_c)$, for each clique $c$ define a variable $z_c = \{x_s \mid s \in c\}$ which enumerates joint configurations of that clique's variables, and set $p(z) \propto \prod_{c \in C} \psi_c(z_c)$. Does this define an equivalent joint distribution? PROBLEM: We have defined multiple copies of the variables in the true model, but not enforced any relationships among them.
Slide 5: Clique-Based Inference Algorithms. Add potentials enforcing consistency between all pairs of clique variables which share one of the original variables:
$$p(z) \propto \prod_{c \in C} \psi_c(z_c) \prod_{d \neq c} \psi_{cd}(z_c, z_d), \qquad \psi_{cd}(z_c, z_d) = \begin{cases} 1 & \text{if } z_c(x_s) = z_d(x_s) \text{ for all } s \in c \cap d, \\ 0 & \text{otherwise.} \end{cases}$$
PROBLEM: The graph may have a large number of pairwise consistency constraints, and inference will be difficult.
Slide 6: Clique-Based Inference Algorithms. Instead, add potentials enforcing consistency between only some subset of pairs of cliques, taking advantage of the transitivity of equality: $x_a = x_b,\ x_b = x_c \Rightarrow x_a = x_c$. This gives
$$p(z) \propto \prod_{c \in C} \psi_c(z_c) \prod_{(c,d) \in E(C)} \psi_{cd}(z_c, z_d).$$
Questions: How many edges are needed for global consistency? When can we build a tree-structured clique graph?
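The separator indicator $\psi_{cd}$ above can be written directly. A minimal sketch, under the hypothetical representation that each clique configuration $z_c$ is a dict mapping variable names to values:

```python
def consistency(z_c, z_d):
    """Pairwise consistency potential psi_cd: 1 if the two clique
    configurations agree on every shared variable, 0 otherwise."""
    shared = set(z_c) & set(z_d)
    return 1.0 if all(z_c[s] == z_d[s] for s in shared) else 0.0
```

For example, `consistency({'x1': 0, 'x2': 1}, {'x2': 1, 'x3': 0})` is 1.0 because the cliques agree on the shared variable `x2`.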
Slide 7: An Undirected View of Graph Elimination. Elimination order: (6, 5, 4, 3, 2, 1). For the node being eliminated, marginalize over the values of the associated variable. Compute a new potential function involving all other variables which are neighbors of the just-marginalized variable, and are thus related to it by some potential. If necessary, add edges to create a clique for the newly created potential.
Slide 8: An Undirected View of Graph Elimination (continued). [Figure: the elimination order (6, 5, 4, 3, 2, 1) applied to the example graph; eliminating nodes creates new potentials over their remaining neighbors.]
Slide 9: An Undirected View of Graph Elimination (continued). [Figure: the remaining elimination steps, reducing the example graph to cliques over $x_1, x_3, x_5$ and finally $x_1$ alone.]
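The elimination step described on slide 7 can be sketched as generic variable elimination over factor tables. This is a toy implementation under an assumed representation (each factor is a `(scope, table)` pair with binary variables), not the lecture's own code:

```python
from itertools import product

def prod_eval(factors, env):
    """Evaluate the product of factor tables at assignment `env`."""
    p = 1.0
    for scope, table in factors:
        p *= table[tuple(env[v] for v in sorted(scope))]
    return p

def eliminate(factors, var, states=(0, 1)):
    """Sum `var` out of the product of all factors mentioning it,
    producing one new factor over the remaining neighbor variables."""
    touching = [f for f in factors if var in f[0]]
    rest = [f for f in factors if var not in f[0]]
    scope = sorted(set(v for sc, _ in touching for v in sc) - {var})
    table = {}
    for assign in product(states, repeat=len(scope)):
        env = dict(zip(scope, assign))
        total = 0.0
        for val in states:
            env[var] = val
            total += prod_eval(touching, env)
        table[assign] = total
    return rest + [(scope, table)]

# demo: marginalize x3 then x2 out of a 3-node chain x1 - x2 - x3
factors = [(['x1', 'x2'], {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 2.0}),
           (['x2', 'x3'], {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 2.0})]
for v in ('x3', 'x2'):
    factors = eliminate(factors, v)
(scope, marg), = factors  # unnormalized marginal over x1
```

The new factor's scope is exactly the set of neighbors of the eliminated variable, which is why elimination may implicitly add edges (fill-in) to the graph.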
Slide 10: Elimination Clique Tree. The clique tree contains the cliques (fully connected subsets) which are generated as elimination executes. The separator sets contain the variables which are shared among each linked pair of cliques. [Figure: the elimination clique tree for the example graph, with separator sets labeling the edges.]
Slide 11: CS242 Lecture 2B Outline
- Elimination Algorithms & Clique Trees
- Junction Tree Algorithm
- Loopy Belief Propagation
[Figure: a junction tree with cliques {a, b, c}, {b, c, e}, {b, e, f}, {b, d} and separators {b, c}, {b, e}, {b}, illustrating COLLECT messages toward the root and DISTRIBUTE messages away from it. Some figures from Paskin (2003).]
Slide 12: Marginal Inference Algorithms.
- Tree, one marginal: elimination applied to the leaves of the tree.
- Tree, all marginals: belief propagation, i.e., the sum-product algorithm.
- Graph, one marginal: the graph elimination algorithm.
- Graph, all marginals: the junction tree algorithm, i.e., belief propagation on a junction tree (a special clique tree).
Slide 13: Clique Trees and Junction Trees. This clique tree has the junction tree property: the clique nodes containing any variable from the original model form a connected subtree. We can then exactly represent the distribution while ignoring redundant constraints. Note that not all clique trees are junction trees. [Figure: a clique tree for the example graph that satisfies the junction tree property, and a second clique tree over the same cliques that violates it.]
Slide 14: Finding a Junction Tree. The junction tree property: a clique tree is a junction tree if, for every pair of cliques $V, W$, all cliques on the (unique) path between $V$ and $W$ contain any shared variables. [Figure: for the 4-cycle on A, B, C, D, the clique tree over {A, B}, {A, C}, {B, D}, {C, D} is not a junction tree.] Given a set of cliques, how can we efficiently find a clique tree with the junction tree (running intersection) property? How can we be sure that at least one junction tree exists? Strategy: augment the graph with additional edges.
- Cliques of the original graph are always subsets of cliques of the augmented graph, so the original distribution still factorizes appropriately.
- As cliques grow, we will eventually be able to construct a junction tree.
Question: Which undirected graphs have junction trees?
Slide 15: Junction Trees and Triangulation. A chord is an edge connecting two non-adjacent nodes in some cycle. A cycle is chordless if it contains no chords. A graph is triangulated (chordal) if it contains no chordless cycles of length 4 or more.
Theorem: The maximal cliques of a graph have a corresponding junction tree if and only if that undirected graph is triangulated.
Lemma: For a non-complete triangulated graph with at least 3 nodes, there is a decomposition of the nodes into disjoint sets A, B, S such that S separates A from B, and S is complete.
- This lemma is the key induction argument in constructing a junction tree from a triangulation.
- It implies the existence of an elimination ordering which introduces no new edges.
[Figure, after Barber: (a) a Markov network with cliques (a, b, c) and (b, c, d); (b) the corresponding clique graph, with separator {b, c}; clique potentials are assigned to the cliques, and the separator potential may be set to the normalization constant.]
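The chordality condition above can be checked algorithmically. A common test (an assumption here, not stated in the lecture) is maximum cardinality search (MCS): the reverse of an MCS visit order is a perfect elimination ordering if and only if the graph is chordal. A minimal sketch:

```python
def mcs_order(adj):
    """Maximum cardinality search: repeatedly visit the unvisited vertex
    with the most already-visited neighbors; returns the visit order."""
    visited, order = set(), []
    while len(order) < len(adj):
        v = max((u for u in adj if u not in visited),
                key=lambda u: len(adj[u] & visited))
        visited.add(v)
        order.append(v)
    return order

def is_chordal(adj):
    """Chordality test via a perfect elimination ordering (PEO):
    in a PEO, each vertex's later neighbors must form a clique."""
    peo = list(reversed(mcs_order(adj)))
    pos = {v: i for i, v in enumerate(peo)}
    for v in peo:
        later = [u for u in adj[v] if pos[u] > pos[v]]
        for i in range(len(later)):
            for j in range(i + 1, len(later)):
                if later[j] not in adj[later[i]]:
                    return False
    return True
```

On the 4-cycle A-B-C-D the test fails (the cycle is chordless), and adding the chord A-C makes the graph chordal, matching the theorem on this slide.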
Slide 16: Constructing a Junction Tree.
Theorem: A clique tree is a junction tree if and only if it is a maximal spanning tree of the weighted clique graph defined by:
- Graph nodes: one for each maximal clique in the triangulation.
- Graph edges: link all pairs of cliques that share at least one variable.
- Edge weights: the cardinality of the separator set (intersection) of the two cliques.
The computational complexity is at worst quadratic in the number of maximal cliques.
Junction tree algorithms for general-purpose inference:
1. Triangulate the target undirected graphical model. Any elimination ordering generates a valid triangulation (linear in graph size), but finding the optimal triangulation is NP-hard.
2. Arrange the triangulated cliques into a junction tree (at worst quadratic in graph size).
3. Execute the sum-product algorithm on the junction tree (exponential in clique size).
[Figure: a triangulated example with cliques {A, B, D}, {B, C, D}, {C, D, E} and separators {B, D}, {C, D}.]
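The maximal spanning tree construction above can be sketched as Kruskal's algorithm on the clique graph, with edges weighted by separator size. A minimal sketch (assumed representation: cliques as frozensets of variable names), not the lecture's own implementation:

```python
from itertools import combinations

def junction_tree(cliques):
    """Build a junction tree as a maximum-weight spanning tree of the
    clique graph, using Kruskal's algorithm with a union-find.
    Returns a list of (i, j, separator) edges."""
    # candidate edges: all clique pairs sharing at least one variable,
    # weighted by separator cardinality, heaviest first
    edges = sorted(
        ((len(a & b), i, j)
         for (i, a), (j, b) in combinations(enumerate(cliques), 2)
         if a & b),
        reverse=True)
    parent = list(range(len(cliques)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    tree = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:  # greedily keep heavy edges that do not form a cycle
            parent[ri] = rj
            tree.append((i, j, cliques[i] & cliques[j]))
    return tree
```

On the Paskin example cliques {a, b, c}, {b, c, e}, {b, e, f}, {b, d}, this picks the two weight-2 separators {b, c} and {b, e} before attaching {b, d} through the weight-1 separator {b}, yielding a valid junction tree.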
Slide 17: Example: Junction Tree Formation.
1. Compute the elimination cliques (the order here is f, d, e, c, b, a).
2. Form the complete cluster graph over the maximal elimination cliques and find a maximum-weight spanning tree.
NOTE: For a given set of cliques, there may be multiple valid junction trees. [Figure from Paskin (2003): elimination on a graph over a through f yields maximal cliques {a, b, c}, {b, c, e}, {b, e, f}, {b, d}, arranged into a junction tree with separators {b, c}, {b, e}, {b}.]
Slide 18: Example: Junction Tree Formation. [Figure: a larger example graph over the twelve variables a through l.]
Slide 19: Example: Junction Tree Formation. [Figure: a triangulation of the same twelve-variable graph.]
Slide 20: Example: Junction Tree Formation. [Figure: the resulting junction tree, with cliques such as {a, b, f}, {b, c, f, g}, {c, d, h, i}, {d, e, i}, {c, f, g, j}, {c, h, i, k}, and {j, k, l}, and separator sets such as {b, f}, {c, f, g}, {c, h, i}, {d, i}, and {j, k} labeling the edges.]
Slide 21: Reminder: Pairwise Sum-Product BP. Let $\Gamma(t) = \{s \in V \mid (s, t) \in E\}$ denote the set of neighbors of node $t$, and let $K_t$ be the number of discrete states of random variable $x_t$. The marginal and message equations are
$$p_t(x_t) \propto \prod_{s \in \Gamma(t)} m_{st}(x_t), \qquad m_{ts}(x_s) = \sum_{x_t} \psi_{st}(x_s, x_t) \prod_{u \in \Gamma(t) \setminus s} m_{ut}(x_t).$$
Here $p_t(x_t)$ is the marginal distribution over the $K_t$ discrete states of $x_t$; $m_{st}(x_t)$ is the message from node $s$ to node $t$, a vector of $K_t$ non-negative numbers; and $m_{ts}(x_s)$ is the message from node $t$ to node $s$, a vector of $K_s$ non-negative numbers.
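These pairwise updates can be sketched on a tiny tree. A minimal example on the 3-node chain x1 - x2 - x3 with binary states; the potential tables `psi` are made-up illustrations, not from the lecture:

```python
K = 2  # binary states
psi = {(1, 2): [[2.0, 1.0], [1.0, 1.0]],
       (2, 3): [[3.0, 1.0], [1.0, 3.0]]}
nbrs = {1: [2], 2: [1, 3], 3: [2]}

def prod_of(vals):
    out = 1.0
    for v in vals:
        out *= v
    return out

def edge_pot(s, t, xs, xt):
    # each undirected edge is stored once; look it up in either direction
    return psi[(s, t)][xs][xt] if (s, t) in psi else psi[(t, s)][xt][xs]

def run_bp(sweeps=3):
    # m[(t, s)] is the message from t to s, a vector over the K states of x_s
    m = {(t, s): [1.0] * K for t in nbrs for s in nbrs[t]}
    for _ in range(sweeps):  # parallel updates; a tree converges in few sweeps
        m = {(t, s): [sum(edge_pot(s, t, xs, xt)
                          * prod_of(m[(u, t)][xt] for u in nbrs[t] if u != s)
                          for xt in range(K))
                      for xs in range(K)]
             for t in nbrs for s in nbrs[t]}
    return m

def marginal(t, m):
    # p_t(x_t) is proportional to the product of all incoming messages
    b = [prod_of(m[(s, t)][xt] for s in nbrs[t]) for xt in range(K)]
    z = sum(b)
    return [v / z for v in b]
```

Because the graph is a tree, these beliefs match the exact marginals obtained by brute-force enumeration of the joint distribution.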
Slide 22: Sum-Product for Junction Trees. Recall the clique-based model, with $z_c = \{x_s \mid s \in c\}$ for each $c \in C$:
$$p(z) \propto \prod_{c \in C} \psi_c(z_c) \prod_{(c,d) \in E(C)} \psi_{cd}(z_c, z_d), \qquad \psi_{cd}(z_c, z_d) = \begin{cases} 1 & \text{if } z_c(x_s) = z_d(x_s) \text{ for all } s \in c \cap d, \\ 0 & \text{otherwise.} \end{cases}$$
Original undirected graphical model:
- Nodes: discrete random variables $x_s$.
- Cliques: groups of variables over which the joint distribution factorizes into potentials.
Junction tree:
- Nodes: maximal cliques in some triangulation.
- Node variables: all joint configurations $z_c$ of the variables in the corresponding clique.
- Node potentials: potentials from the original model.
- Edge potentials: indicator functions forcing consistency of shared variables (separator sets).
[Figure: the running example with cliques {a, b, c}, {b, c, e}, {b, e, f}, {b, d}.]
Slide 23: Sum-Product for Junction Trees (Shafer-Shenoy).
- Express the algorithm via the original variables $x_s$.
- Messages depend on the clique intersections (separators): $S_{ij} = S_{ji} = C_i \cap C_j$.
- Efficient schedules (COLLECT messages toward a root, then DISTRIBUTE messages away from it) compute each message exactly once.
$$p_j(x_{C_j}) \propto \psi_{C_j}(x_{C_j}) \prod_{i \in \Gamma(j)} \mu_{ij}(x_{S_{ij}}), \qquad \mu_{ji}(x_{S_{ji}}) \propto \sum_{x_{R_{ji}}} \psi_{C_j}(x_{C_j}) \prod_{k \in \Gamma(j) \setminus i} \mu_{kj}(x_{S_{kj}}), \qquad R_{ji} = C_j \setminus S_{ji}.$$
Slide 24: Sum-Product for Junction Trees (continued). The same Shafer-Shenoy updates have storage and computational cost
$$O\Big(\sum_j \prod_{s \in C_j} K_s\Big), \qquad x_s \in \{1, \ldots, K_s\},$$
which is exponential in the sizes of the maximal cliques.
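The cost formula above is easy to evaluate for a concrete clique set. A back-of-the-envelope sketch, assuming (hypothetically) that every variable in the running example is binary:

```python
from math import prod

# Storage and computation scale with the sum over cliques of the product
# of the member variables' state-space sizes K_s (here all K_s = 2).
K = {v: 2 for v in 'abcdef'}
cliques = [{'a', 'b', 'c'}, {'b', 'c', 'e'}, {'b', 'e', 'f'}, {'b', 'd'}]
cost = sum(prod(K[v] for v in c) for c in cliques)  # 8 + 8 + 8 + 4
```

This is why triangulations with small maximal cliques matter: one large clique of size $k$ alone contributes $2^k$ to the total.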
Slide 25: Summary: Junction Tree Algorithm. Junction tree algorithms for general-purpose inference:
1. If necessary, convert the graphical model to undirected form (linear in graph size).
2. Triangulate the target undirected graph. Any elimination ordering generates a valid triangulation (linear in graph size), but finding an optimal triangulation, with minimal cliques, is NP-hard.
3. Arrange the triangulated cliques into a junction tree (at worst quadratic in graph size).
4. Execute the sum-product algorithm on the junction tree (exponential in clique size).
[Figure: the running example junction tree with COLLECT and DISTRIBUTE message schedules.]
Slide 26: CS242 Lecture 2B Outline
- Elimination Algorithms & Clique Trees
- Junction Tree Algorithm
- Loopy Belief Propagation
Slide 27: Inference for Graphs with Cycles. For graphs with cycles, the dynamic programming derivation of BP breaks down.
- Junction tree algorithm: cluster nodes to break cycles, then run BP on the tree of clusters. Exact, but may be intractable.
- Loopy belief propagation: iterate local BP message updates on the cyclic graph and hope the beliefs converge. Empirically, often very effective. Coming later: justification as a variational method.
Slide 28: Sum-Product Belief Propagation. On a factor graph, let $\Gamma(s) = \{f \in F \mid s \in f\}$ be the set of factors neighboring variable node $s$, where each factor $f$ links a clique of variables $x_f$. The updates are
$$m_{sf}(x_s) = \prod_{g \in \Gamma(s) \setminus f} m_{gs}(x_s) \propto \frac{p_s(x_s)}{m_{fs}(x_s)}, \qquad m_{fs}(x_s) = \sum_{x_{f \setminus s}} \psi_f(x_f) \prod_{t \in f \setminus s} m_{tf}(x_t).$$
Marginal distribution of each variable: $p_s(x_s) \propto \prod_{f \in \Gamma(s)} m_{fs}(x_s)$. Marginal distribution of each factor: $p_f(x_f) \propto \psi_f(x_f) \prod_{s \in f} m_{sf}(x_s)$.
Loopy BP: the message updates can be iteratively computed on graphs with cycles, but the resulting marginals are in general incorrect.
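Loopy BP can be sketched for the pairwise special case of these updates. The example below (potentials and the 3-cycle are illustrative assumptions, not from the lecture) normalizes messages each iteration and damps the updates, a common practical guard against oscillation on cyclic graphs:

```python
K = 2
# symmetric "attractive" potentials on a 3-cycle over nodes 1, 2, 3
psi = {(1, 2): [[4.0, 1.0], [1.0, 4.0]],
       (2, 3): [[4.0, 1.0], [1.0, 4.0]],
       (1, 3): [[4.0, 1.0], [1.0, 4.0]]}
nbrs = {1: [2, 3], 2: [1, 3], 3: [1, 2]}

def prod_of(vals):
    out = 1.0
    for v in vals:
        out *= v
    return out

def edge_pot(s, t, xs, xt):
    return psi[(s, t)][xs][xt] if (s, t) in psi else psi[(t, s)][xt][xs]

def loopy_bp(iters=50, damp=0.5):
    m = {(t, s): [1.0 / K] * K for t in nbrs for s in nbrs[t]}
    for _ in range(iters):
        new = {}
        for t in nbrs:
            for s in nbrs[t]:
                vec = [sum(edge_pot(s, t, xs, xt)
                           * prod_of(m[(u, t)][xt] for u in nbrs[t] if u != s)
                           for xt in range(K))
                       for xs in range(K)]
                z = sum(vec)
                # normalized, damped update: mix old and new messages
                new[(t, s)] = [damp * old + (1 - damp) * v / z
                               for old, v in zip(m[(t, s)], vec)]
        m = new
    return m

def belief(t, m):
    b = [prod_of(m[(s, t)][xt] for s in nbrs[t]) for xt in range(K)]
    z = sum(b)
    return [v / z for v in b]
```

In this symmetric example both the exact marginals and the BP beliefs are uniform; on general loopy graphs the fixed-point beliefs are only approximations of the true marginals, as the slide notes.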
Slide 29: Low Density Parity Check (LDPC) Codes. The factor graph contains parity check factors and evidence (observation) factors, linking data bits and parity bits.
- Each variable node is binary, so $x_s \in \{0, 1\}$.
- Parity check factors equal 1 if the sum of the connected bits is even, and 0 if the sum is odd (invalid codewords are excluded).
- Unary evidence factors equal the probability that each bit is a 0 or 1 given the data, assuming independent noise on each bit.
Decoding is performed by loopy BP on this factor graph.
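The parity check factor described above is simple to state in code. A hypothetical illustration (not the lecture's code): the factor equals 1 when the connected bits sum to an even number and 0 otherwise, so only even-parity configurations survive as valid codewords.

```python
from itertools import product

def parity_check(bits):
    """LDPC-style parity factor: 1 for even parity, 0 for odd parity."""
    return 1 if sum(bits) % 2 == 0 else 0

# under a single check on 3 bits, half of the 8 configurations are valid
valid = [b for b in product([0, 1], repeat=3) if parity_check(b)]
```

Multiplying several such factors into the joint distribution zeroes out every configuration violating any check, which is exactly how invalid codewords are excluded.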
Slide 30: Loopy Belief Propagation. In NIPS 1997: "A Revolution: Belief Propagation in Graphs With Cycles," by Brendan J. Frey (Department of Computer Science, University of Toronto) and David J. C. MacKay (Department of Physics, Cavendish Laboratory, Cambridge University). [The slide also references follow-up work from UAI 1999.]
More informationECE521: Week 11, Lecture March 2017: HMM learning/inference. With thanks to Russ Salakhutdinov
ECE521: Week 11, Lecture 20 27 March 2017: HMM learning/inference With thanks to Russ Salakhutdinov Examples of other perspectives Murphy 17.4 End of Russell & Norvig 15.2 (Artificial Intelligence: A Modern
More informationRegularization and Markov Random Fields (MRF) CS 664 Spring 2008
Regularization and Markov Random Fields (MRF) CS 664 Spring 2008 Regularization in Low Level Vision Low level vision problems concerned with estimating some quantity at each pixel Visual motion (u(x,y),v(x,y))
More informationGraphical Models & HMMs
Graphical Models & HMMs Henrik I. Christensen Robotics & Intelligent Machines @ GT Georgia Institute of Technology, Atlanta, GA 30332-0280 hic@cc.gatech.edu Henrik I. Christensen (RIM@GT) Graphical Models
More informationThe Basics of Graphical Models
The Basics of Graphical Models David M. Blei Columbia University September 30, 2016 1 Introduction (These notes follow Chapter 2 of An Introduction to Probabilistic Graphical Models by Michael Jordan.
More informationChordal Graphs: Theory and Algorithms
Chordal Graphs: Theory and Algorithms 1 Chordal graphs Chordal graph : Every cycle of four or more vertices has a chord in it, i.e. there is an edge between two non consecutive vertices of the cycle. Also
More informationProbabilistic Graphical Models
Probabilistic Graphical Models Raquel Urtasun and Tamir Hazan TTI Chicago April 8, 2011 Raquel Urtasun and Tamir Hazan (TTI-C) Graphical Models April 8, 2011 1 / 19 Factor Graphs H does not reveal the
More informationConvergent message passing algorithms - a unifying view
Convergent message passing algorithms - a unifying view Talya Meltzer, Amir Globerson and Yair Weiss School of Computer Science and Engineering The Hebrew University of Jerusalem, Jerusalem, Israel {talyam,gamir,yweiss}@cs.huji.ac.il
More informationBipartite Roots of Graphs
Bipartite Roots of Graphs Lap Chi Lau Department of Computer Science University of Toronto Graph H is a root of graph G if there exists a positive integer k such that x and y are adjacent in G if and only
More informationError correction guarantees
Error correction guarantees Drawback of asymptotic analyses Valid only as long as the incoming messages are independent. (independence assumption) The messages are independent for l iterations only if
More informationChordal graphs MPRI
Chordal graphs MPRI 2017 2018 Michel Habib habib@irif.fr http://www.irif.fr/~habib Sophie Germain, septembre 2017 Schedule Chordal graphs Representation of chordal graphs LBFS and chordal graphs More structural
More informationA Tutorial Introduction to Belief Propagation
A Tutorial Introduction to Belief Propagation James Coughlan August 2009 Table of Contents Introduction p. 3 MRFs, graphical models, factor graphs 5 BP 11 messages 16 belief 22 sum-product vs. max-product
More informationECE521 Lecture 21 HMM cont. Message Passing Algorithms
ECE521 Lecture 21 HMM cont Message Passing Algorithms Outline Hidden Markov models Numerical example of figuring out marginal of the observed sequence Numerical example of figuring out the most probable
More informationLecture 3: Conditional Independence - Undirected
CS598: Graphical Models, Fall 2016 Lecture 3: Conditional Independence - Undirected Lecturer: Sanmi Koyejo Scribe: Nate Bowman and Erin Carrier, Aug. 30, 2016 1 Review for the Bayes-Ball Algorithm Recall
More informationDirected Graphical Models
Copyright c 2008 2010 John Lafferty, Han Liu, and Larry Wasserman Do Not Distribute Chapter 18 Directed Graphical Models Graphs give a powerful way of representing independence relations and computing
More informationProbabilistic Graphical Models
Probabilistic Graphical Models Lecture 17 EM CS/CNS/EE 155 Andreas Krause Announcements Project poster session on Thursday Dec 3, 4-6pm in Annenberg 2 nd floor atrium! Easels, poster boards and cookies
More informationLecture 6. Register Allocation. I. Introduction. II. Abstraction and the Problem III. Algorithm
I. Introduction Lecture 6 Register Allocation II. Abstraction and the Problem III. Algorithm Reading: Chapter 8.8.4 Before next class: Chapter 10.1-10.2 CS243: Register Allocation 1 I. Motivation Problem
More informationSmall Survey on Perfect Graphs
Small Survey on Perfect Graphs Michele Alberti ENS Lyon December 8, 2010 Abstract This is a small survey on the exciting world of Perfect Graphs. We will see when a graph is perfect and which are families
More informationJunction Trees and Chordal Graphs
Graphical Models, Lecture 6, Michaelmas Term 2009 October 30, 2009 Decomposability Factorization of Markov distributions Explicit formula for MLE Consider an undirected graph G = (V, E). A partitioning
More informationECE521 W17 Tutorial 10
ECE521 W17 Tutorial 10 Shenlong Wang and Renjie Liao *Some of materials are credited to Jimmy Ba, Eric Sudderth, Chris Bishop Introduction to A4 1, Graphical Models 2, Message Passing 3, HMM Introduction
More informationMixture Models and the EM Algorithm
Mixture Models and the EM Algorithm Padhraic Smyth, Department of Computer Science University of California, Irvine c 2017 1 Finite Mixture Models Say we have a data set D = {x 1,..., x N } where x i is
More informationNotes for Lecture 24
U.C. Berkeley CS170: Intro to CS Theory Handout N24 Professor Luca Trevisan December 4, 2001 Notes for Lecture 24 1 Some NP-complete Numerical Problems 1.1 Subset Sum The Subset Sum problem is defined
More informationFaster parameterized algorithms for Minimum Fill-In
Faster parameterized algorithms for Minimum Fill-In Hans L. Bodlaender Pinar Heggernes Yngve Villanger Abstract We present two parameterized algorithms for the Minimum Fill-In problem, also known as Chordal
More informationConditional Random Fields and beyond D A N I E L K H A S H A B I C S U I U C,
Conditional Random Fields and beyond D A N I E L K H A S H A B I C S 5 4 6 U I U C, 2 0 1 3 Outline Modeling Inference Training Applications Outline Modeling Problem definition Discriminative vs. Generative
More informationSpeeding Up Computation in Probabilistic Graphical Models using GPGPUs
Speeding Up Computation in Probabilistic Graphical Models using GPGPUs Lu Zheng & Ole J. Mengshoel {lu.zheng ole.mengshoel}@sv.cmu.edu GPU Technology Conference San Jose, CA, March 20, 2013 CMU in Silicon
More informationCS649 Sensor Networks IP Track Lecture 6: Graphical Models
CS649 Sensor Networks IP Track Lecture 6: Grahical Models I-Jeng Wang htt://hinrg.cs.jhu.edu/wsn06/ Sring 2006 CS 649 1 Sring 2006 CS 649 2 Grahical Models Grahical Model: grahical reresentation of joint
More information