2. Graphical Models. Undirected graphical models. Factor graphs. Bayesian networks. Conversion between graphical models. Graphical Models 2-1
1. Graphical Models: undirected graphical models, factor graphs, Bayesian networks, conversion between graphical models.
2. Graphical models. There are three families of graphical models that are closely related, but suitable for different applications and different probability distributions:
- undirected graphical models (also known as Markov random fields),
- factor graphs,
- Bayesian networks.
We will learn what they are, how they differ, and how to switch between them. Consider a probability distribution µ(x_1, x_2, ..., x_n) over x = (x_1, x_2, ..., x_n). A graphical model is a graph together with a set of functions, each over a subset of the random variables, which define the probability distribution of interest.
3. A graphical model is a marriage between probability theory and graph theory that allows compact representation and efficient inference when the probability distribution of interest has special independence and conditional-independence structure. For example, consider a random vector x = (x_1, x_2, x_3) and a given distribution µ(x_1, x_2, x_3). With a slight abuse of notation, we write

    µ(x_1) = Σ_{x_2, x_3} µ(x_1, x_2, x_3)   and   µ(x_1, x_2) = Σ_{x_3} µ(x_1, x_2, x_3)

to denote the first-order and second-order marginals, respectively. We can list all possible independence structures:

    x_1 ⊥ (x_2, x_3):   µ(x_1, x_2, x_3) = µ(x_1) µ(x_2, x_3)            (1)
    x_1 ⊥ x_2:          µ(x_1, x_2) = µ(x_1) µ(x_2)                      (2)
    x_1 ⊥ x_2 | x_3:    µ(x_1, x_2 | x_3) = µ(x_1 | x_3) µ(x_2 | x_3)    (3)

and various permutations and combinations of these.
4. Warm-up exercise.
(1) ⇒ (2). Proof: µ(x_1, x_2) = Σ_{x_3} µ(x_1, x_2, x_3) = Σ_{x_3} µ(x_1) µ(x_2, x_3) = µ(x_1) µ(x_2), where the second equality uses (1).
(2) ⇏ (3). Counterexample: X_1 ⊥ X_2 and X_3 = X_1 + X_2; then X_1 and X_2 are independent but not conditionally independent given X_3.
(3) ⇏ (2). Counterexample: Z_1, Z_2, X_3 independent and X_1 = X_3 + Z_1, X_2 = X_3 + Z_2; then X_1 ⊥ X_2 | X_3, but X_1 and X_2 are dependent through X_3.
This hints that different graphical models are required to represent different notions of independence. In fact, there are exponentially many possible independencies, resulting in doubly exponentially many possible independence structures in a distribution; however, there are only 2^(n(n-1)/2) undirected graphs on n vertices (4^(n(n-1)/2) directed). Hence, graphical models capture only (important) subsets of the possible independence structures.
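The first counterexample can be checked by enumeration. The sketch below uses a binary variant (XOR in place of addition, so all variables stay in {0, 1}); all names are illustrative, not from the lecture:

```python
from itertools import product

# X1, X2 fair independent bits, X3 = X1 XOR X2.
# Then (2) holds (X1 and X2 are independent) but (3) fails
# (given X3, knowing X1 pins down X2).
joint = {}
for x1, x2 in product((0, 1), repeat=2):
    joint[(x1, x2, x1 ^ x2)] = 0.25

def marg(dist, idx):
    """Marginal over the coordinates listed in idx."""
    out = {}
    for x, p in dist.items():
        key = tuple(x[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

m12, m1, m2 = marg(joint, (0, 1)), marg(joint, (0,)), marg(joint, (1,))

# (2): mu(x1, x2) = mu(x1) mu(x2) for all values
assert all(abs(m12[(a, b)] - m1[(a,)] * m2[(b,)]) < 1e-12
           for a, b in product((0, 1), repeat=2))

# (3) fails: conditioning on x3 = 0 (probability 0.5) couples x1 and x2
p_x3_0 = 0.5
lhs = joint.get((0, 0, 0), 0.0) / p_x3_0   # mu(x1=0, x2=0 | x3=0) = 0.5
rhs = 0.5 * 0.5                            # mu(x1=0 | x3=0) mu(x2=0 | x3=0)
assert abs(lhs - rhs) > 0.2
```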
5. A probabilistic graphical model is a graph G(V, E) representing a family of probability distributions
1. that share the same factorization of the probability distribution; and
2. that share the same independence structure.

Undirected graphical model = Markov random field (MRF):
    µ(x) = (1/Z) ∏_{c ∈ C(G)} ψ_c(x_c)
where C(G) is the set of all maximal cliques in the undirected graph G(V, E).

Factor graph model (FG):
    µ(x) = (1/Z) ∏_{a ∈ F} ψ_a(x_∂a)
where F is the set of factor nodes in the undirected bipartite graph G(V, F, E), and ∂a is the set of neighbors of node a.

Directed graphical model = Bayesian network (BN):
    µ(x) = ∏_{i ∈ V} µ(x_i | x_π(i))
where π(i) is the set of parent nodes of i in the directed graph G(V, E).
6. Warm-up example: Markov random fields (MRF) and factor graphs (FG) on three variables. (Graph figures omitted.)

    factorization                   | independence
    µ(x_1, x_2, x_3)                | none
    ψ(x_1, x_2) ψ(x_1, x_3)         | x_2 ⊥ x_3 | x_1  [HW1.1]
    ψ(x_1, x_2) ψ(x_3)              | x_3 ⊥ (x_1, x_2)
    ψ(x_1) ψ(x_2) ψ(x_3)            | all independent
7. Warm-up example: Bayesian network (BN) with ordering (x_1, x_2, x_3). (Graph figures omitted.)

    factorization                              | independence
    µ(x_1) µ(x_2 | x_1) µ(x_3 | x_1, x_2)      | none
    µ(x_1) µ(x_2 | x_1) µ(x_3 | x_1)           | x_2 ⊥ x_3 | x_1
    µ(x_1) µ(x_2) µ(x_3 | x_1, x_2)            | x_1 ⊥ x_2
    µ(x_1) µ(x_2 | x_1) µ(x_3 | x_2)           | x_1 ⊥ x_3 | x_2
    µ(x_1) µ(x_2 | x_1) µ(x_3)                 | x_3 ⊥ (x_1, x_2)
    µ(x_1) µ(x_2) µ(x_3)                       | all independent
8. Family #1: Undirected Pairwise Graphical Models
9. Family #1: undirected pairwise graphical models (a.k.a. pairwise MRF). (Figure: an undirected graph on twelve variables x_1, ..., x_12.) G = (V, E), V = [n] ≜ {1, ..., n}, x = (x_1, ..., x_n), x_i ∈ X.
10. Undirected pairwise graphical models are specified by:
- a graph G = (V, E),
- an alphabet X,
- compatibility functions ψ_ij : X × X → R_+ for all (i, j) ∈ E,
with
    µ(x) = (1/Z) ∏_{(i,j) ∈ E} ψ_ij(x_i, x_j).
Pairwise MRFs only allow compatibility functions over two variables.
11. Undirected pairwise graphical models.
Alphabet X: typically |X| < ∞; occasionally X = R and
    µ(dx) = (1/Z) ∏_{(i,j) ∈ E} ψ_ij(x_i, x_j) dx
(all formulae interpreted as densities).
Compatibility functions ψ_ij:
    µ(x) = (1/Z) ∏_{(i,j) ∈ E} ψ_ij(x_i, x_j).
The partition function Z plays a crucial role:
    Z = Σ_{x ∈ X^V} ∏_{(i,j) ∈ E} ψ_ij(x_i, x_j).
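As a minimal sketch of how Z can be computed by brute force (the sum has |X|^n terms, which is why Z is the hard part), consider an Ising-style pairwise model on a triangle; the function names and potentials below are illustrative, not from the lecture:

```python
import math
from itertools import product

def partition_function(n, psi, alphabet):
    """Brute-force Z = sum_x prod_{(i,j) in E} psi_ij(x_i, x_j).
    psi maps an edge (i, j) to a function of (x_i, x_j)."""
    Z = 0.0
    for x in product(alphabet, repeat=n):
        w = 1.0
        for (i, j), f in psi.items():
            w *= f(x[i], x[j])
        Z += w
    return Z

# Ising-style potentials psi_ij(xi, xj) = exp(xi * xj) on a triangle,
# with spins x_i in {-1, +1}.
psi = {(0, 1): lambda a, b: math.exp(a * b),
       (1, 2): lambda a, b: math.exp(a * b),
       (0, 2): lambda a, b: math.exp(a * b)}
Z = partition_function(3, psi, (-1, +1))

# Sanity check: 2 aligned configurations of weight e^3,
# 6 configurations with one flipped spin of weight e^{-1}.
assert abs(Z - (2 * math.exp(3) + 6 * math.exp(-1))) < 1e-9
```

The same loop yields any marginal by summing weights over the relevant configurations and dividing by Z; exact inference algorithms later in the course exist precisely to avoid this exponential enumeration.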
12. Graph notation.
- ∂i ≜ {neighbors of vertex i}; e.g., ∂9 = {5, 6, 7}.
- deg(i) = |∂i|; e.g., deg(9) = 3.
- x_U ≜ (x_i)_{i ∈ U}; e.g., x_{1,5} = (x_1, x_5) and x_∂9 = (x_5, x_6, x_7).
- Complete graph; clique.
13. Example: coloring (e.g., ring tones). Given a graph G = (V, E) and a set of colors X = {R, G, B}, find a coloring of the vertices such that no two adjacent vertices have the same color. Fundamental question: the chromatic number.
14. A (joint) probability of interest is the uniform measure over all possible colorings:
    µ(x) = (1/Z) ∏_{(i,j) ∈ E} I(x_i ≠ x_j),
where Z = the total number of proper colorings. Sampling from µ = finding a coloring. The independent set problem works similarly [HW1.3, 1.4].
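The coloring measure above can be checked by brute-force enumeration: Z is just the number of assignments where every edge indicator is satisfied. A sketch with illustrative names:

```python
from itertools import product

def num_colorings(n, edges, q):
    """Partition function of the uniform-coloring MRF:
    Z = sum_x prod_{(i,j) in E} 1[x_i != x_j],
    i.e. the number of proper q-colorings."""
    return sum(
        all(x[i] != x[j] for i, j in edges)
        for x in product(range(q), repeat=n)
    )

# Triangle with 3 colors: 3! = 6 proper colorings.
assert num_colorings(3, [(0, 1), (1, 2), (0, 2)], 3) == 6

# 4-cycle with 3 colors: chromatic polynomial gives
# (q-1)^4 + (q-1) = 16 + 2 = 18.
assert num_colorings(4, [(0, 1), (1, 2), (2, 3), (3, 0)], 3) == 18
```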
15. (General) undirected graphical model. Undirected graphical models are specified by:
- a graph G = (V, E),
- an alphabet X,
- compatibility functions ψ_c : X^{|c|} → R_+ for all maximal cliques c ∈ C,
with
    µ(x) = (1/Z) ∏_{c ∈ C} ψ_c(x_c).
16. Family #2: Factor Graph Models
17. Family #2: factor graph models. (Figure: a bipartite graph with variable nodes x_1, ..., x_6 and factor nodes a, b, c, d; e.g., factor ψ_a(x_1, x_5, x_6) with ∂a = {1, 5, 6}.)
A factor graph G = (V, F, E) is an undirected bipartite graph with:
- variable nodes i, j, k, ... ∈ V, each carrying a variable x_i ∈ X,
- function (factor) nodes a, b, c, ... ∈ F, each carrying a function ψ_a : X^{|∂a|} → R_+, where ∂a is the set of variable nodes adjacent to a,
with
    µ(x) = (1/Z) ∏_{a ∈ F} ψ_a(x_∂a).
18. A factor graph model is specified by:
- a factor graph G = (V, F, E),
- an alphabet X,
- compatibility functions ψ_a : X^{|∂a|} → R_+ for all a ∈ F,
with
    µ(x) = (1/Z) ∏_{a ∈ F} ψ_a(x_∂a)
and partition function
    Z = Σ_{x ∈ X^V} ∏_{a ∈ F} ψ_a(x_∂a).
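A factor graph model can be evaluated directly from this definition by representing each factor as a (scope, function) pair, where the scope plays the role of ∂a. The scopes and factor functions below are made up for illustration:

```python
from itertools import product

def joint(factors, n, alphabet):
    """Brute-force mu(x) = (1/Z) prod_a psi_a(x_{scope(a)}) and Z.
    factors: list of (scope, function) pairs; scope = tuple of
    variable indices (the neighbors of that factor node)."""
    weights = {}
    for x in product(alphabet, repeat=n):
        w = 1.0
        for scope, f in factors:
            w *= f(*(x[i] for i in scope))
        weights[x] = w
    Z = sum(weights.values())
    return {x: w / Z for x, w in weights.items()}, Z

# One ternary factor and one pairwise factor over binary x_1..x_4
# (0-indexed); unlike a pairwise MRF, the 3-ary factor is allowed.
factors = [((0, 1, 2), lambda a, b, c: 1.0 + a * b * c),
           ((2, 3), lambda c, d: 2.0 if c == d else 1.0)]
mu, Z = joint(factors, 4, (0, 1))
assert abs(sum(mu.values()) - 1.0) < 1e-12
```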
19. Conversion between factor graphs and pairwise models.
From pairwise model to factor graph: a pairwise model on G(V, E) with alphabet X can be represented by a factor graph G'(V', F', E') with V' = V, one factor node per edge (so |F'| = |E| and |E'| = 2|E|), and X' = X. Put a factor node on each edge.
From factor graph to a general undirected graphical model: a factor model on G(V, F, E) with alphabet X can be represented by an undirected graphical model on G'(V', E') with V' = V, E' = ∪_{a ∈ F} {(i, j) : i, j ∈ ∂a, i ≠ j}, and X' = X. Each factor node is turned into a clique.
From factor graph to a pairwise model: a factor model on G(V, F, E) can be represented by a pairwise model on G'(V', E') with V' = V ∪ F, E' = E, and X' = X^k, where k = max_{a ∈ F} deg(a). A factor node is represented by a large variable node.
20. Family #3: Bayesian Networks
21. Family #3: Bayesian networks. (Figure: a DAG on variables x_1, ..., x_8.)
- DAG: directed acyclic graph G = (V, D), with set of directed edges D.
- Variable nodes V = [n], with x_i ∈ X for all i ∈ V.
- Define π(i) ≜ {parents of i}.
    µ(x) = ∏_{i ∈ V} µ_i(x_i | x_π(i)).
22. A Bayesian network is specified by:
- a directed acyclic graph G = (V, D),
- an alphabet X,
- conditional probabilities µ_i(· | ·) : X × X^{|π(i)|} → R_+ for i ∈ V,
with
    µ(x) = ∏_{i ∈ V} µ_i(x_i | x_π(i)).
We do not need a normalization constant (1/Z): since Σ_{x_i ∈ X} µ_i(x_i | x_π(i)) = 1 for every i, it follows that Σ_{x ∈ X^V} µ(x) = 1.
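The no-normalization claim is easy to verify by enumeration on a small chain BN x_1 → x_2 → x_3; the CPD tables below are hypothetical, chosen only so that each conditional sums to one:

```python
from itertools import product

# Chain BN x1 -> x2 -> x3 over binary variables.
# CPDs as tables; for each fixed parent value, entries sum to 1,
# so the product of conditionals needs no 1/Z.
p1 = {0: 0.6, 1: 0.4}                                        # mu_1(x1)
p2 = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.2, (1, 1): 0.8}    # mu_2(x2 | x1), keyed (x2, x1)
p3 = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.5, (1, 1): 0.5}    # mu_3(x3 | x2), keyed (x3, x2)

def mu(x1, x2, x3):
    """Joint as the product of the conditionals along the chain."""
    return p1[x1] * p2[(x2, x1)] * p3[(x3, x2)]

total = sum(mu(*x) for x in product((0, 1), repeat=3))
assert abs(total - 1.0) < 1e-12   # already normalized
```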
23. Conversion between Bayesian networks and factor graphs.
From Bayesian network to factor graph: a Bayesian network G = (V, D) with alphabet X can be represented by a factor graph model on G' = (V', F', E') with V' = V, |F'| = |V|, |E'| = |D| + |V|, and X' = X. Represent each conditional probability by a factor node. (Moralization gives the conversion from BN to MRF.)
From factor graph to Bayesian network: a factor model on G = (V, F, E) with alphabet X can be represented by a Bayesian network G' = (V', D') with V' = V and X' = X. Take a topological ordering, e.g. x_1, ..., x_n; for each node i, starting from the first node, find a minimal set U ⊆ {1, ..., i-1} such that x_i is conditionally independent of x_{{1,...,i-1}\U} given x_U. (We will learn how to do this.) In general the resulting Bayesian network is dense.
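The moralization step mentioned above (marry the parents of each child, then drop edge directions) can be sketched as follows; the `parents`-dict encoding of the DAG is an assumption of this sketch:

```python
def moralize(parents):
    """Moralize a DAG given as {node: tuple of parents}.
    Returns the undirected edge set: each directed edge kept as an
    undirected edge, plus an edge 'marrying' every pair of parents
    of a common child."""
    edges = set()
    for child, pa in parents.items():
        for p in pa:
            edges.add(frozenset((p, child)))      # drop direction
        for i, p in enumerate(pa):
            for q in pa[i + 1:]:
                edges.add(frozenset((p, q)))      # marry co-parents
    return edges

# v-structure x1 -> x3 <- x2: moralization adds the edge (x1, x2),
# which is exactly where the marginal independence x1 _|_ x2 is lost.
edges = moralize({3: (1, 2), 1: (), 2: ()})
assert frozenset((1, 2)) in edges
assert len(edges) == 3
```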
24. Because MRFs and BNs are incomparable, some independence structure is lost in conversion.
MRF example (four-cycle):
    µ(x) = ψ(x_1, x_2) ψ(x_1, x_3) ψ(x_2, x_4) ψ(x_3, x_4)
encodes x_1 ⊥ x_4 | (x_2, x_3) and x_2 ⊥ x_3 | (x_1, x_4); no Bayesian network on these four variables encodes exactly this pair of independencies.
BN example (v-structure):
    µ(x) = µ(x_2) µ(x_3) µ(x_1 | x_2, x_3)
encodes x_2 ⊥ x_3; converting it to an undirected model yields a graph with no independence.
25. Factor graphs are more fine-grained than undirected graphical models. Compare three models on the triangle over (x_1, x_2, x_3):
    ψ(x_1, x_2, x_3),    ψ_12(x_1, x_2) ψ_23(x_2, x_3) ψ_31(x_3, x_1),    ψ_123(x_1, x_2, x_3).
All three encode the same independencies, but different factorizations (in particular, the degrees of freedom in the compatibility functions are 3|X|^2 vs. |X|^3). The set of independencies represented by an MRF is the same as that of the corresponding FG, but FGs can represent a larger set of factorizations.
26. Undirected graphical models can be represented by factor graphs: we can go from MRF to FG without losing any information about the independencies implied by the model. Bayesian networks, however, are not compatible with undirected graphical models or factor graphs: if we convert a model from one family to the other and then back, we will not in general recover the model we started with; we lose information about the independencies implied by the model when switching between families.
27. Bayesian networks with observed variables. Let V = H ∪ O, with hidden variables x = (x_i)_{i ∈ H} and observed variables y = (y_i)_{i ∈ O}:
    µ(x, y) = ∏_{i ∈ H} µ(x_i | x_{π(i) ∩ H}, y_{π(i) ∩ O}) ∏_{i ∈ O} µ(y_i | x_{π(i) ∩ H}, y_{π(i) ∩ O}).
Typically we are interested in µ_y(x) ≜ µ(x | y) and arg max_x µ_y(x).
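Computing µ_y(x) by brute force amounts to renormalizing the joint over the hidden configurations consistent with y. A sketch with hypothetical tables for a single hidden variable x and a single observed child y:

```python
# Hidden binary x with observed binary child y (tables are made up).
px = {0: 0.5, 1: 0.5}                                          # mu(x)
py_given_x = {(0, 0): 0.8, (1, 0): 0.2,                        # mu(y | x),
              (0, 1): 0.3, (1, 1): 0.7}                        # keyed (y, x)

def posterior(y):
    """mu_y(x) = mu(x, y) / sum_x' mu(x', y): renormalize the joint
    over hidden values, holding the observation fixed."""
    unnorm = {x: px[x] * py_given_x[(y, x)] for x in (0, 1)}
    s = sum(unnorm.values())
    return {x: w / s for x, w in unnorm.items()}

post = posterior(1)
assert abs(sum(post.values()) - 1.0) < 1e-12
# arg max_x mu_y(x): observing y = 1 favors x = 1
assert max(post, key=post.get) == 1
```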
28. Example: forensic science. [Kadane, Schum, A Probabilistic Analysis of the Sacco and Vanzetti Evidence, 1996] [Taroni et al., Bayesian Networks and Probabilistic Inference in Forensic Science, 2006]
29. Example: medical diagnosis (diseases → soft ORs → symptoms). [M. Shwe et al., Methods of Information in Medicine, 1991]
30. Roadmap.

    Cond. indep. of µ(x)              | Factorization of µ(x)       | Model | Graph      | Cond. indep. implied by G
    e.g. x_1 ⊥ {x_2, x_3} | x_4; ...  | (1/Z) ∏_a ψ_a(x_∂a)         | FG    | factor     | Markov
    e.g. x_4 ⊥ x_7; ...               | (1/Z) ∏_c ψ_c(x_c)          | MRF   | undirected | Markov
                                      | ∏_i µ_i(x_i | x_π(i))       | BN    | directed   | Markov

- Any µ(x) can be represented by {FG, MRF, BN}.
- A given µ(x) can be represented by multiple {FG, MRF, BN} with multiple graphs (but the same µ(x)).
- We want a simple graph representation (sparse, small alphabet size), both for the memory to store the graphical model and for the computation required for inference.
- A µ(x) with some conditional-independence structure can be represented by a simple {FG, MRF, BN}.
AMS 550.47/67: Graph Theory Homework Problems - Week V Problems to be handed in on Wednesday, March : 6, 8, 9,,.. Assignment Problem. Suppose we have a set {J, J,..., J r } of r jobs to be filled by a
More informationStatistical and Learning Techniques in Computer Vision Lecture 1: Markov Random Fields Jens Rittscher and Chuck Stewart
Statistical and Learning Techniques in Computer Vision Lecture 1: Markov Random Fields Jens Rittscher and Chuck Stewart 1 Motivation Up to now we have considered distributions of a single random variable
More informationIntroduction to Graphs
Graphs Introduction to Graphs Graph Terminology Directed Graphs Special Graphs Graph Coloring Representing Graphs Connected Graphs Connected Component Reading (Epp s textbook) 10.1-10.3 1 Introduction
More informationCh9: Exact Inference: Variable Elimination. Shimi Salant, Barak Sternberg
Ch9: Exact Inference: Variable Elimination Shimi Salant Barak Sternberg Part 1 Reminder introduction (1/3) We saw two ways to represent (finite discrete) distributions via graphical data structures: Bayesian
More informationAssignment 4 Solutions of graph problems
Assignment 4 Solutions of graph problems 1. Let us assume that G is not a cycle. Consider the maximal path in the graph. Let the end points of the path be denoted as v 1, v k respectively. If either of
More informationCheng Soon Ong & Christian Walder. Canberra February June 2018
Cheng Soon Ong & Christian Walder Research Group and College of Engineering and Computer Science Canberra February June 2018 Outlines Overview Introduction Linear Algebra Probability Linear Regression
More informationPart II. Graph Theory. Year
Part II Year 2017 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2017 53 Paper 3, Section II 15H Define the Ramsey numbers R(s, t) for integers s, t 2. Show that R(s, t) exists for all s,
More informationBelief propagation in a bucket-tree. Handouts, 275B Fall Rina Dechter. November 1, 2000
Belief propagation in a bucket-tree Handouts, 275B Fall-2000 Rina Dechter November 1, 2000 1 From bucket-elimination to tree-propagation The bucket-elimination algorithm, elim-bel, for belief updating
More informationBasic Graph Theory with Applications to Economics
Basic Graph Theory with Applications to Economics Debasis Mishra February 6, What is a Graph? Let N = {,..., n} be a finite set. Let E be a collection of ordered or unordered pairs of distinct elements
More informationDirected Graphical Models
Copyright c 2008 2010 John Lafferty, Han Liu, and Larry Wasserman Do Not Distribute Chapter 18 Directed Graphical Models Graphs give a powerful way of representing independence relations and computing
More informationMarkov Random Fields
3750 Machine earning ecture 4 Markov Random ields Milos auskrecht milos@cs.pitt.edu 5329 ennott quare 3750 dvanced Machine earning Markov random fields Probabilistic models with symmetric dependences.
More informationUndirected Graphical Models. Raul Queiroz Feitosa
Undirected Graphical Models Raul Queiroz Feitosa Pros and Cons Advantages of UGMs over DGMs UGMs are more natural for some domains (e.g. context-dependent entities) Discriminative UGMs (CRF) are better
More informationParallel Exact Inference on the Cell Broadband Engine Processor
Parallel Exact Inference on the Cell Broadband Engine Processor Yinglong Xia and Viktor K. Prasanna {yinglonx, prasanna}@usc.edu University of Southern California http://ceng.usc.edu/~prasanna/ SC 08 Overview
More informationLearning decomposable models with a bounded clique size
Learning decomposable models with a bounded clique size Achievements 2014-2016 Aritz Pérez Basque Center for Applied Mathematics Bilbao, March, 2016 Outline 1 Motivation and background 2 The problem 3
More informationAbout the Tutorial. Audience. Prerequisites. Disclaimer & Copyright. Graph Theory
About the Tutorial This tutorial offers a brief introduction to the fundamentals of graph theory. Written in a reader-friendly style, it covers the types of graphs, their properties, trees, graph traversability,
More information5. Lecture notes on matroid intersection
Massachusetts Institute of Technology Handout 14 18.433: Combinatorial Optimization April 1st, 2009 Michel X. Goemans 5. Lecture notes on matroid intersection One nice feature about matroids is that a
More informationProblem 1. Which of the following is true of functions =100 +log and = + log? Problem 2. Which of the following is true of functions = 2 and =3?
Multiple-choice Problems: Problem 1. Which of the following is true of functions =100+log and =+log? a) = b) =Ω c) =Θ d) All of the above e) None of the above Problem 2. Which of the following is true
More informationBuilding Classifiers using Bayesian Networks
Building Classifiers using Bayesian Networks Nir Friedman and Moises Goldszmidt 1997 Presented by Brian Collins and Lukas Seitlinger Paper Summary The Naive Bayes classifier has reasonable performance
More informationHomework #5 Algorithms I Spring 2017
Homework #5 Algorithms I 600.463 Spring 2017 Due on: Saturday, March 18th, 11:59pm Late submissions: will NOT be accepted Format: Please start each problem on a new page. Where to submit: On Gradescope,
More informationGraph Theory S 1 I 2 I 1 S 2 I 1 I 2
Graph Theory S I I S S I I S Graphs Definition A graph G is a pair consisting of a vertex set V (G), and an edge set E(G) ( ) V (G). x and y are the endpoints of edge e = {x, y}. They are called adjacent
More information