Loopy Belief Propagation


1 Loopy Belief Propagation Research Exam Kristin Branson September 29, 2003 Loopy Belief Propagation p.1/73

2 Problem Formalization Reasoning about any real-world problem requires assumptions about the structure of the problem: the relevant variables and the interrelationships of these variables. A graphical model is a formal representation of these assumptions. Loopy Belief Propagation p.2/73

4 Probabilistic Model These assumptions are a simplification of the problem's true structure. The world appears stochastic in terms of the model. Graphical models are interpreted as describing the probability distribution of random variables. Loopy Belief Propagation p.4/73

5 Probabilistic Inference Reasoning about real-world problems can be modeled as probabilistic inference on a distribution described by a graph. Probabilistic inference involves computing desired properties of a distribution: What is the most probable state of the variables, given some evidence? What is the marginal distribution of a subset of the variables, given some evidence? Loopy Belief Propagation p.5/73

6 Inference Example Estimate the intensity value of each pixel of an image given a corrupted version of the image. Loopy Belief Propagation p.6/73

7 Inference Example Estimate the intensity value of each pixel of an image given a corrupted version of the image. Assume each observed pixel depends only on the corresponding uncorrupted pixel. Assume the relationship between uncorrupted pixel variables can be described by a local smoothness constraint. Loopy Belief Propagation p.7/73
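
For concreteness, one common choice of potentials for such a denoising model is sketched below. This is an illustrative assumption, not a form fixed by the exam; y_i denotes the observed corrupted pixel, x_i the unknown pixel, and sigma, lambda, and rho are hypothetical parameters.

```latex
% Illustrative (assumed) denoising potentials:
\psi_i(x_i)         = \exp\!\Big(-\tfrac{(x_i - y_i)^2}{2\sigma^2}\Big)   % data term
\qquad
\psi_{ij}(x_i, x_j) = \exp\!\big(-\lambda\,\rho(x_i - x_j)\big)           % smoothness term
```

Here rho is a penalty on neighboring-pixel differences (quadratic or truncated, for example), so the pairwise potentials encode the local smoothness constraint and the single-node potentials encode the corrupted observations.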

8 Inference is Intractable Assuming a sparse graphical model greatly simplifies the problem. Still, probabilistic inference is in general intractable. Exact inference algorithms are exponential in the graph size. Pearl's Belief Propagation (BP) algorithm performs approximate inference on an arbitrary graphical model. Loopy Belief Propagation p.8/73

9 Loopy BP The assumptions made by BP only hold for acyclic graphs. For graphs containing cycles, loopy BP is not guaranteed to converge or be correct. However, it has been applied with experimental success. Loopy Belief Propagation p.9/73

10 Experimental Results Impressive experimental results were first observed in error-correcting codes defined on graphs. The Turbo code error-correcting scheme was described as "the most exciting and potentially important development in coding theory in many years" (McEliece et al., 1995). Murphy et al. experimented with loopy BP on graphical models that appear in machine learning. They concluded that loopy BP did converge to good approximations in many cases. Since then, loopy BP has been applied successfully to many applications in machine learning and computer vision. Loopy Belief Propagation p.10/73

11 Theoretical Analysis When considering applying loopy BP, one would like to know whether it will converge to good approximations. In this exam, I present recent analyses of loopy BP. Loopy Belief Propagation p.11/73

12 Outline Background. Undirected graphical models. Belief Propagation algorithm. Three techniques for analyzing loopy BP. Algebraic analysis. Unwrapped tree. Reparameterization. Future work. Loopy Belief Propagation p.12/73

14 Markov Random Fields An undirected graphical model represents a distribution by an undirected graph. Nodes represent random variables. Edges represent dependencies. Each clique C is associated with a potential function ψ_C(x_C). Loopy Belief Propagation p.14/73

15 Markov Properties Paths in the graph represent dependencies. If two nodes are not connected by any path, the variables are independent. If the nodes in a set C separate the nodes in A from the nodes in B, then x_A and x_B are conditionally independent given x_C. The Hammersley-Clifford theorem: the conditional independence assumptions hold if and only if the distribution factorizes as the product of potential functions over cliques: p(x) ∝ ∏_C ψ_C(x_C). Loopy Belief Propagation p.15/73

16 Pairwise MRFs To simplify notation, we focus on pairwise MRFs. The largest clique size in a pairwise MRF is two. The distribution can therefore be represented as p(x) ∝ ∏_i ψ_i(x_i) ∏_(i,j) ψ_ij(x_i, x_j), with a single-node potential for each node and a pairwise potential for each edge. An MRF with larger cliques can be converted into a pairwise MRF. Loopy Belief Propagation p.16/73
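
As a concrete data structure for such a pairwise MRF, here is a minimal Python sketch. The three-node chain, state count, and potential values are assumptions made up for the example, not taken from the exam.

```python
# Minimal sketch of a pairwise MRF over discrete variables (illustrative values only).
import numpy as np

nodes = [0, 1, 2]
edges = [(0, 1), (1, 2)]

# Single-node potentials psi_i(x_i) and pairwise potentials psi_ij(x_i, x_j).
node_pot = {i: np.array([1.0, 2.0]) for i in nodes}
edge_pot = {e: np.array([[2.0, 1.0],
                         [1.0, 2.0]]) for e in edges}

def unnormalized_prob(x):
    """Product of all potentials for a full assignment x (tuple of states, one per node)."""
    p = 1.0
    for i in nodes:
        p *= node_pot[i][x[i]]
    for (i, j) in edges:
        p *= edge_pot[(i, j)][x[i], x[j]]
    return p

print(unnormalized_prob((0, 1, 0)))   # 1 * 2 * 1 (nodes) * 1 * 1 (edges) = 2.0
```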

17 Probabilistic Inference We discuss two inference problems: Marginalization: for each node i, compute the marginal p(x_i), the sum of p(x) over all assignments consistent with x_i. MAP assignment: find the maximum-probability assignment given the evidence: x* = argmax_x p(x | evidence). Loopy Belief Propagation p.17/73

18 Max-Marginals To find the MAP assignment, we will compute the max-marginals: P_i(x_i) = max of p(x) over all assignments consistent with x_i. The MAP assignment maximizes each max-marginal. Loopy Belief Propagation p.18/73

19 Notation To simplify notation, we assume that the effect of the observed data is encapsulated in the single-node potentials ψ_i. Loopy Belief Propagation p.19/73

20 Variable Elimination Exact inference can be performed by repeatedly eliminating variables: sum (or maximize) each variable out of the product of the potentials that involve it, producing an intermediate function over the remaining variables. Loopy Belief Propagation p.20/73
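
A small sketch of this idea on a three-node chain, comparing brute-force enumeration with eliminating one variable at a time. It is self-contained; the chain, potentials, and variable names are illustrative assumptions.

```python
# Brute-force marginalization vs. variable elimination on the chain 0 - 1 - 2.
import itertools
import numpy as np

n_states = 2
node_pot = [np.array([1.0, 2.0]) for _ in range(3)]
edge_pot = np.array([[2.0, 1.0],
                     [1.0, 2.0]])            # used for both edges (0,1) and (1,2)

# Exhaustive enumeration: cost grows as n_states ** n_nodes.
joint = np.zeros((n_states,) * 3)
for x in itertools.product(range(n_states), repeat=3):
    joint[x] = (node_pot[0][x[0]] * node_pot[1][x[1]] * node_pot[2][x[2]]
                * edge_pot[x[0], x[1]] * edge_pot[x[1], x[2]])
brute = joint.sum(axis=(0, 1))               # unnormalized marginal of x_2

# Variable elimination: sum out x_0, then x_1, pushing each sum inside the product.
m_01 = edge_pot.T @ node_pot[0]              # m(x_1) = sum_{x_0} psi_0(x_0) psi_01(x_0, x_1)
m_12 = edge_pot.T @ (node_pot[1] * m_01)     # m(x_2) = sum_{x_1} psi_1(x_1) psi_12(x_1, x_2) m(x_1)
elim = node_pot[2] * m_12                    # unnormalized marginal of x_2

print(np.allclose(brute, elim))              # True
```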

21 Outline Background. Undirected graphical models. Belief Propagation algorithm. Three techniques for analyzing loopy BP. Algebraic analysis. Unwrapped tree. Reparameterization. Future work. Loopy Belief Propagation p.21/73

22 BP for Trees BP breaks the [max-]marginalization for a node i into independent subproblems corresponding to the subtrees rooted at the neighbors of i. In each subproblem, BP eliminates all the variables except x_i. The result is a message m_ji(x_i) from neighbor j to node i. Loopy Belief Propagation p.22/73

23 BP for Trees BP is a dynamic programming form of variable elimination. The creation of a message is equivalent to repeatedly removing leaf nodes of the subtree: Loopy Belief Propagation p.23/73

24 BP for Trees In terms of these messages, the [max-]marginals are proportional to ψ_i(x_i) times the product of all incoming messages m_ji(x_i), j in N(i). Loopy Belief Propagation p.24/73

25 Parallel Message Passing Instead of waiting for smaller problems to be solved before solving larger problems, we can iteratively pass messages in parallel. Initialize the messages m_ij(x_j) to uniform for all edges (i,j). Until convergence, for all edges (i,j), update m_ij(x_j) ← ∑_{x_i} ψ_ij(x_i, x_j) ψ_i(x_i) ∏_{k in N(i), k ≠ j} m_ki(x_i), using the messages from the previous iteration (replace the sum with a max for max-product). Loopy Belief Propagation p.25/73
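
The following Python sketch implements this parallel sum-product loop; applied to a loopy graph (here a single 4-cycle) it is exactly loopy BP. The graph, potentials, and fixed iteration count are illustrative assumptions.

```python
# Parallel sum-product message passing on a pairwise MRF (loopy BP on a 4-cycle).
import numpy as np

n_states = 2
nodes = [0, 1, 2, 3]
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
node_pot = {i: np.array([1.0, 2.0]) for i in nodes}
edge_pot = {e: np.array([[2.0, 1.0],
                         [1.0, 2.0]]) for e in edges}

neighbors = {i: [] for i in nodes}
for i, j in edges:
    neighbors[i].append(j)
    neighbors[j].append(i)

def psi(i, j):
    """Pairwise potential as a matrix indexed [x_i, x_j]."""
    return edge_pot[(i, j)] if (i, j) in edge_pot else edge_pot[(j, i)].T

# Initialize all messages m_{i->j}(x_j) to uniform.
msgs = {(i, j): np.ones(n_states) / n_states for i in nodes for j in neighbors[i]}

for _ in range(50):                              # in practice: until the messages stop changing
    new_msgs = {}
    for (i, j) in msgs:
        # Product of the previous iteration's messages into i from everyone except j.
        incoming = np.ones(n_states)
        for k in neighbors[i]:
            if k != j:
                incoming = incoming * msgs[(k, i)]
        m = psi(i, j).T @ (node_pot[i] * incoming)   # sum over x_i; use a max for max-product
        new_msgs[(i, j)] = m / m.sum()               # normalize for numerical stability
    msgs = new_msgs

# Beliefs: single-node potential times all incoming steady-state messages.
for i in nodes:
    b = node_pot[i] * np.prod([msgs[(k, i)] for k in neighbors[i]], axis=0)
    print(i, b / b.sum())
```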

26 Loopy BP The parallel BP algorithm can be applied to arbitrary graphs. However, the assumptions made by BP do not hold for a loopy graph. Loopy BP is not guaranteed to converge. If it does converge, it is not guaranteed to converge to the correct [max-]marginals. We will call the approximate [max-]marginals beliefs. Loopy Belief Propagation p.26/73

27 Theoretical Analysis When will loopy BP converge? How good an approximation are the [max-]marginals and max-product assignment? I present three techniques for analyzing BP. The first two analyze the message-passing dynamics, while the third analyzes the steady-state beliefs directly. Loopy Belief Propagation p.27/73

28 Outline Background. Undirected graphical models. Belief Propagation algorithm. Three techniques for analyzing loopy BP. Algebraic analysis. Unwrapped tree. Reparameterization. Future work. Loopy Belief Propagation p.28/73

29 Algebraic Analysis Overview We first discuss an algebraic analysis of the sum-product algorithm for a single-cycle graph. We represent each message update of the sum-product algorithm as a matrix multiplication. We use linear algebra results to show the relationship between the steady-state beliefs and the true marginals, as well as convergence properties. The sum-product algorithm converges for a single-cycle graph. The convergence rate and the accuracy of the beliefs are related. Loopy Belief Propagation p.29/73

30 Matrix Representation We represent each message m_ij and each belief b_i as a vector with one entry per state. Similarly, we represent each pairwise potential ψ_ij as a matrix and each single-node potential ψ_i as a diagonal matrix. Loopy Belief Propagation p.30/73

32 Matrix Message Updates For a graph consisting of a single cycle, each node has exactly two neighbors, so a message update is a matrix multiplication: m_ij = A_ij m_ki, where k is the other neighbor of i and A_ij(x_j, x_i) = ψ_ij(x_i, x_j) ψ_i(x_i). Loopy Belief Propagation p.32/73

34 Matrix Belief Updates For a graph consisting of a single cycle, a belief update is the componentwise ("diag") product of the single-node potential and the two incoming messages: b_i(x_i) ∝ ψ_i(x_i) m_ji(x_i) m_ki(x_i). Loopy Belief Propagation p.34/73
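
A small numerical sketch of this matrix view on a homogeneous, symmetric cycle; all numbers are toy assumptions, and the matrix A plays the role of the message-update matrix described above.

```python
# Matrix view of sum-product on a homogeneous single cycle: each message update is
# multiplication by a fixed matrix A; the belief is the componentwise "diag" product.
import numpy as np

psi_node = np.array([1.0, 2.0])              # same single-node potential at every node
psi_edge = np.array([[2.0, 1.0],
                     [1.0, 2.0]])            # same symmetric pairwise potential on every edge

# A(x_j, x_i) = psi_edge(x_i, x_j) * psi_node(x_i): message update is m_{i->j} <- A m_{k->i}.
A = psi_edge.T * psi_node                    # broadcasting scales column x_i by psi_node(x_i)

m = np.ones(2)
for _ in range(100):                         # iterate the matrix update to steady state
    m = A @ m
    m /= m.sum()                             # normalization does not change the fixed point

# In this symmetric toy example the clockwise and counter-clockwise steady-state
# messages are identical, so the belief is psi_node times the two incoming messages.
belief = psi_node * m * m
print(belief / belief.sum())
```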

35 Matrix Message Updates A series of message updates is a series of matrix multiplications. Loopy Belief Propagation p.35/73

36 Power Method Lemma Repeated multiplication of a vector by a matrix A, with normalization, converges to the principal eigenvector v_1 of A. The convergence rate is governed by the ratio of the first two eigenvalues, |λ_2| / |λ_1|. This applies if the eigenvalues satisfy |λ_1| > |λ_2| (e.g. if the distribution is positive) and the initial vector is not orthogonal to v_1. Loopy Belief Propagation p.36/73
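
A quick numerical check of the lemma; the positive matrix below is an arbitrary example chosen for the demo (an assumption), not anything from the exam.

```python
# Power-method check: A^t v (normalized) approaches the principal eigenvector of A,
# with error shrinking roughly like (|lambda_2| / |lambda_1|)^t.
import numpy as np

rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(4, 4))       # positive entries, so Perron-Frobenius applies

evals, evecs = np.linalg.eig(A)
top, second = np.argsort(-np.abs(evals))[:2]
v1 = np.abs(np.real(evecs[:, top]))          # principal eigenvector (single-signed up to sign)
v1 /= np.linalg.norm(v1)
ratio = np.abs(evals[second]) / np.abs(evals[top])

v = np.ones(4)                               # positive, hence not orthogonal to v1
for t in range(1, 31):
    v = A @ v
    v /= np.linalg.norm(v)
    if t in (5, 10, 20, 30):
        print(f"iter {t:2d}: error {np.linalg.norm(v - v1):.2e}   ratio**t = {ratio**t:.2e}")
```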

37 True Marginals The sums and multiplications performed when computing the marginals are a distributed version of the sums and multiplications performed when computing the diagonal elements of C, the product of the message-update matrices around the cycle: the true marginal at a node is proportional to diag(C). Loopy Belief Propagation p.37/73

38 Beliefs The steady-state message in one direction around the cycle is the principal (right) eigenvector of C; the steady-state message in the other direction is the principal left eigenvector of C. The steady-state beliefs are therefore the diagonal elements of the outer product of the right and left principal eigenvectors of C. Loopy Belief Propagation p.38/73

39 Beliefs vs True Marginals The diagonal elements of the outer product of the right and left principal eigenvectors are an approximation of the diagonal elements of C. The goodness of the approximation depends on how much the principal eigenvalue λ_1 dominates the remaining eigenvalues. Recall that the convergence rate depends on a similar ratio, |λ_2| / |λ_1|. The faster the convergence, the better the approximation. Loopy Belief Propagation p.39/73
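
A numerical illustration of the comparison in slides 37-39; the random positive matrices standing in for the cycle's message-update matrices are assumptions made for the demo.

```python
# Single cycle: true marginal ~ diag(C) vs. belief ~ diag of the outer product of
# C's principal right and left eigenvectors.
import numpy as np

rng = np.random.default_rng(1)
mats = [rng.uniform(0.5, 1.5, size=(3, 3)) for _ in range(5)]   # one matrix per edge of a 5-cycle
C = np.linalg.multi_dot(mats)

true_marginal = np.diag(C) / np.diag(C).sum()

evals_r, right = np.linalg.eig(C)
evals_l, left = np.linalg.eig(C.T)                              # left eigenvectors of C
r = np.abs(np.real(right[:, np.argmax(np.abs(evals_r))]))
l = np.abs(np.real(left[:, np.argmax(np.abs(evals_l))]))
belief = r * l                                                   # diag of the outer product
belief /= belief.sum()

print(true_marginal)
print(belief)    # close to the true marginal when lambda_1 dominates the other eigenvalues
```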

40 Algebraic Analysis Recap By representing the sum-product algorithm on a single-cycle graph as a series of matrix multiplications, we showed the following results: The sum-product algorithm converges for positive distributions. Both the convergence rate and the accuracy of the steady-state beliefs depend on the relative size of the first eigenvalue of the same matrix. Loopy Belief Propagation p.40/73

41 Outline Background. Undirected graphical models. Belief Propagation algorithm. Three techniques for analyzing loopy BP. Algebraic analysis. Unwrapped tree. Reparameterization. Future work. Loopy Belief Propagation p.41/73

42 Unwrapped Tree To analyze the BP algorithm, we construct the unwrapped tree T, an acyclic graph that is locally equivalent to the original graph G. Loopy Belief Propagation p.42/73

43-45 Unwrapped Tree Analysis [diagram slides: what we want to show about loopy BP on G, what we know about BP on the acyclic unwrapped tree T, and the two implications that must be shown to connect them] Loopy Belief Propagation p.43-45/73

46 Unwrapped Tree Overview The unwrapped tree was used to prove: The max-product assignment is exact for a graph containing a single cycle. The max-product assignment has a higher probability than any other assignment in a large neighborhood. Loopy Belief Propagation p.46/73

47 Unwrapped Tree Construction [diagram slide relating the original graph G to its unwrapped tree T] Loopy Belief Propagation p.47/73

48 Unwrapped Tree Construction The unwrapped tree T is constructed from G as follows: Choose an arbitrary root node. Initialize T to contain only the root. Repeat: for each leaf of T, find the neighbors of the corresponding node in G, other than the parent of the leaf in T. Add copies of these nodes to the tree. Loopy Belief Propagation p.48/73

53 Unwrapped Tree Construction Copy the potentials from the corresponding nodes in G. Modify the leaf single-node potentials to simulate the steady-state messages in G. Because the graphs are locally the same, the steady-state BP messages and beliefs in T will be replicas of those in G. Loopy Belief Propagation p.53/73
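
A Python sketch of this construction: breadth-first unrolling of G from a root, never stepping straight back to the parent. The small loopy graph is an illustrative assumption, and the leaf-potential modification described above is omitted here.

```python
# Unwrapped-tree (computation-tree) construction by breadth-first unrolling.
from collections import deque

edges = [(0, 1), (1, 2), (2, 0), (2, 3)]          # a triangle with one pendant node
neighbors = {}
for i, j in edges:
    neighbors.setdefault(i, []).append(j)
    neighbors.setdefault(j, []).append(i)

def unwrap(root, depth):
    """Return the unwrapped tree as a list of (copy_id, original_node, parent_copy_id)."""
    tree = [(0, root, None)]
    frontier = deque([(0, root, None)])           # (copy_id, original node, parent's original node)
    next_id = 1
    for _ in range(depth):
        new_frontier = deque()
        for copy_id, node, parent in frontier:
            for nb in neighbors[node]:
                if nb == parent:                  # do not step straight back to the parent
                    continue
                tree.append((next_id, nb, copy_id))
                new_frontier.append((next_id, nb, node))
                next_id += 1
        frontier = new_frontier
    return tree

for copy_id, node, parent_copy in unwrap(root=0, depth=3):
    print(copy_id, "is a copy of node", node, "with parent copy", parent_copy)
```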

54 Graphs Containing a Single Cycle For a graph containing a single cycle, the unwrapped tree is an infinite chain. We can construct T so that each node of G is replicated the same number of times, say n, in the interior. Loopy Belief Propagation p.54/73

55 Graphs Containing a Single Cycle [diagram slide: what we want to show about the max-product assignment on G and what we know about max-product on the chain T] Loopy Belief Propagation p.55/73

56 Graphs Containing a Single Cycle Let L(x) be the log-likelihood of assignment x for G. Since the interior of T consists of n replicas of G, the log-likelihood of the corresponding assignment for T is n L(x) + L_leaf, where L_leaf is the log-likelihood contribution of the two leaf nodes. As L_leaf does not depend on n, in the limit as n goes to infinity the interior term dominates, so the assignment that is optimal for T is also optimal for G. Loopy Belief Propagation p.56/73

57 Graphs Containing a Single Cycle [diagram slide: both implications have now been shown] Loopy Belief Propagation p.57/73

58 Optimality for Arbitrary Graphs Let S be a set of nodes whose induced subgraph contains at most one cycle per connected component. We can show that the max-product assignment has a higher probability than any assignment that differs from it only on S. Loopy Belief Propagation p.58/73

59 Outline Background. Undirected graphical models. Belief Propagation algorithm. Three techniques for analyzing loopy BP. Algebraic analysis. Unwrapped tree. Reparameterization. Future work. Loopy Belief Propagation p.59/73

60 Reparameterization Analysis The past two analysis techniques analyzed the message-passing dynamics of BP. The reparameterization technique analyzes the steady-state beliefs. Loopy Belief Propagation p.60/73

61 Reparameterization Overview We show that the beliefs define another parameterization of the distribution p(x). In this parameterization, we show that the steady-state beliefs are consistent w.r.t. every subtree, and that the max-product assignment satisfies an optimality condition w.r.t. every subgraph with at most one cycle per connected component. Loopy Belief Propagation p.61/73

62 Steady-State Beliefs We analyze the steady-state single- and pair-node beliefs: b_i(x_i) ∝ ψ_i(x_i) ∏_{k in N(i)} m_ki(x_i) and b_ij(x_i, x_j) ∝ ψ_ij(x_i, x_j) ψ_i(x_i) ψ_j(x_j) ∏_{k in N(i), k ≠ j} m_ki(x_i) ∏_{k in N(j), k ≠ i} m_kj(x_j). Loopy Belief Propagation p.62/73

63-64 Belief Parameterization The beliefs define another parameterization of the distribution: p(x) ∝ ∏_i b_i(x_i) ∏_(i,j) b_ij(x_i, x_j) / (b_i(x_i) b_j(x_j)), as can be shown by substituting in the definitions of the beliefs. Loopy Belief Propagation p.63-64/73

65 Consistency Definition: Let G' be a subgraph of G, with distribution p_G' defined by the belief parameterization restricted to G'. The beliefs are consistent w.r.t. G' if the corresponding beliefs are the true max-marginals of p_G'. Loopy Belief Propagation p.65/73

66 Edge Consistency The edge beliefs are consistent: max_{x_j} b_ij(x_i, x_j) ∝ b_i(x_i), as can be seen by substituting in the message definitions of b_ij and b_i. Loopy Belief Propagation p.66/73

67 Tree Consistency The steady-state beliefs are consistent w.r.t. every subtree of G. This is shown by exploiting the edge consistency described above to remove leaf nodes one at a time. In the end, we are left with only two nodes, a trivial base case. Loopy Belief Propagation p.67/73

68 Tree Plus Cycle Optimality Let G' be a subgraph of G with at most one cycle per connected component, and let p_G' be the corresponding distribution under the belief parameterization. The max-product assignment maximizes p_G'. Loopy Belief Propagation p.68/73

69 Tree Plus Cycle Optimality Using the edge consistency described above, we can show that p_G'(x*) ≥ p_G'(x) for any other assignment x, where x* is the max-product assignment. Loopy Belief Propagation p.69/73

70 Tree Plus Cycle Optimality If G' is a connected subgraph containing one cycle, then the edges of G' can be directed so that each node has exactly one parent: p_G' can then be written as a product over nodes of terms involving only each node and its parent, where π(i) is the parent of node i. Loopy Belief Propagation p.70/73

71 Corollaries of TPS Optimality The two results proved using the unwrapped tree are corollaries of the Tree-Plus-Cycle optimality. The Tree-Plus-Cycle optimality can also be used to show an error bound on the max-product assignment for an arbitrary graph. Loopy Belief Propagation p.71/73

72 Future Work I have presented three techniques for analyzing loopy BP. Experimental results are stronger than the results proved so far. Future work includes extending each technique to be more general and to prove stronger results: Prove convergence properties of the max-product algorithm on a single-cycle graph using algebraic analysis. Prove the optimality of the max-product algorithm for specific multiple-loop graphs using the unwrapped tree technique. Show more powerful optimality results for arbitrary graph structures with specific potential properties. Loopy Belief Propagation p.72/73

73 References
Aji, S., Horn, G., and McEliece, R. (1998). On the convergence of iterative decoding on graphs with a single cycle. In IEEE International Symposium on Information Theory.
McEliece, R., Rodemich, E., and Cheng, J. (1995). The Turbo decision algorithm. In 33rd Allerton Conference on Communications, Control and Computing, Monticello, IL.
Murphy, K., Weiss, Y., and Jordan, M. (1999). Loopy belief propagation for approximate inference: An empirical study. In Uncertainty in Artificial Intelligence.
Pearl, J. (1988). Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann Publishers, Inc., San Mateo, CA.
Wainwright, M. (January, 2002). Stochastic Processes on Graphs with Cycles: Geometric and Variational Approaches. PhD thesis, MIT, Cambridge, MA.
Wainwright, M., Jaakkola, T., and Willsky, A. (2003). Tree-based reparameterization framework for analysis of sum-product and related algorithms. IEEE Transactions on Information Theory, 49(5).
Wainwright, M., Jaakkola, T., and Willsky, A. (October 28, 2002). Tree consistency and bounds on the performance of the max-product algorithm and its generalizations. Technical Report P-2554, Laboratory for Information and Decision Systems, MIT.
Weiss, Y. (November, 1997). Belief propagation and revision in networks with loops. Technical Report AI Memo No. 1616, C.B.C.L. Paper No. 155, AI Lab, MIT.
Weiss, Y. and Freeman, W. (2001a). Correctness of belief propagation in Gaussian graphical models of arbitrary topology. Neural Computation, 13.
Weiss, Y. and Freeman, W. (2001b). On the optimality of solutions of the max-product belief propagation algorithm in arbitrary graphs. IEEE Transactions on Information Theory, 47(2).
Loopy Belief Propagation p.73/73
