Graphical Models Reconstruction


Graphical Models Reconstruction
Graph Theory Course Project
Firoozeh Sepehr, April 27th, 2016

Outline
1. Overview
2. History and Background
3. Graphical Models
4. Reconstruction
5. Open Issues

Overview: What are graphical models?
Graphical models [1, 2]:
- A combination of probability theory and graph theory
- Tackle problems of uncertainty and complexity
- Use modularity to handle complex systems
- Give a graphical representation of the dependencies embedded in probabilistic models
[Figure: two example graphs on nodes a-g, one drawn as a Bayesian/belief network and one as a Markov network]

Overview: Markov vs. Bayesian networks
Markov networks:
- Undirected graphical models
- Encode correlations between variables
- Mostly used in the physics and vision communities
Bayesian/belief networks:
- Directed graphical models: directed acyclic graphs (DAGs)
- Encode causal relationships between variables
- Mostly used in the AI and machine learning communities
- Use Bayes' rule for inference

Overview: Applications
Many different applications:
- Pattern recognition
- Diagnosis of diseases
- Decision-theoretic systems [4]
- Statistical physics
- Signal and image processing
- Inferring cellular networks in biological systems [3]

History and Background: Probability theory
The foundations of probability theory [2] go back to the 16th century, when Gerolamo Cardano began a formal analysis of games of chance, followed by key developments by Pierre de Fermat and Blaise Pascal in the 17th century. The initial development involved only discrete probability spaces, and the analysis methods were purely combinatorial.
[Portraits: Gerolamo Cardano (Italian; science, maths, and literature) [9]; Pierre de Fermat (French; mathematics and law) [10]; Blaise Pascal (French; theology, mathematics, philosophy, and physics) [11]]

History and Background: Probability theory (cont'd)
The foundations of modern probability theory were laid by Andrey Kolmogorov in the 1930s.
[Portrait: Andrey Kolmogorov (Russian; mathematics), known for topology, intuitionistic logic, turbulence studies, classical mechanics, mathematical analysis, and Kolmogorov complexity [12]]

History and Background: Bayes' rule
Bayes' theorem [2] was shown in the 18th century by Reverend Thomas Bayes. It allows us to use a model that gives the conditional probability of event a given event b in order to compute the reverse: the conditional probability of event b given event a. This type of reasoning is central to the use of graphical models, in particular Bayesian networks.
[Portrait: Thomas Bayes (English; statistician, philosopher, and Presbyterian minister) [13]]

History and Background: Origins of graphical models
Representing the interactions between variables in a multidimensional distribution using a graph structure originated in several communities [2]:
- Statistical physics: Gibbs used an undirected graph to represent the distribution over a system of interacting particles.
- Genetics: the path analysis of Sewall Wright proposed the use of a directed graph to study inheritance in natural species.
- Statistics: Bartlett analyzed interactions between variables in the study of contingency tables, also known as log-linear models.
- Computer science: artificial intelligence (AI), to perform difficult tasks such as oil-well location or medical diagnosis at an expert level.

History and Background: Origins of graphical models
Expert systems [2]:
- There was a need for methods that allow the integration of multiple pieces of evidence and provide support for making decisions under uncertainty.
- Expert systems had huge success in predicting diseases from evidence such as symptoms and test results in the 1970s.
- They then fell into disfavor in the AI community, because:
1. AI, it was argued, should be based on methods similar to human intelligence.
2. The strong independence assumptions made in the existing expert systems were not a flexible, scalable mechanism.

History and Background: Origins of graphical models
Expert systems (cont'd): widespread acceptance of probabilistic methods began in the late 1980s [1], driven by:
- A series of seminal theoretical developments: the Bayesian network framework by Judea Pearl and his colleagues in 1988, and the foundations for efficient reasoning using probabilistic graphical models by S. L. Lauritzen and D. J. Spiegelhalter.
- The construction of large-scale, highly successful expert systems based on this framework that avoided the unrealistically strong assumptions made by early probabilistic expert systems, e.g., the Pathfinder expert system (which assists community pathologists with the diagnosis of lymph-node pathology), constructed by Heckerman and colleagues [14].

Graphical Models: Definitions
Directed and undirected graphs: G = (N, E) denotes a graph with vertex set N and edge set E; the edges are unordered pairs in an undirected graph and ordered pairs in a directed graph.
Degree, indegree, and outdegree: for a vertex y ∈ N, the degree is deg(y); in a directed graph, the indegree is deg−(y) and the outdegree is deg+(y).
Root and leaf: if deg−(y) = 0, y is a root, and if deg+(y) = 0, y is a leaf.

Graphical Models: Definitions
Chains and paths:
- A chain from y_i to y_j is an ordered sequence of distinct nodes (y_π1, y_π2, ..., y_πL) where y_i = y_π1, y_j = y_πL, and (y_πk, y_πk+1) ∈ E for every consecutive pair.
- A path from y_i to y_j is an ordered sequence of distinct nodes (y_π1, y_π2, ..., y_πL) where y_i = y_π1, y_j = y_πL, and either (y_πk, y_πk+1) ∈ E or (y_πk+1, y_πk) ∈ E for every consecutive pair.
Note: chains are a special case of paths!

Graphical Models: Definitions
Parents, children, ancestors, descendants: consider a directed graph G = (N, E) and a set X ⊆ N.
- y_i is a parent of y_j if there is a directed edge from y_i to y_j; pa(X) := {y_i ∈ N | ∃ y_j ∈ X such that y_i is a parent of y_j}.
- y_j is a child of y_i if there is a directed edge from y_i to y_j; ch(X) := {y_j ∈ N | ∃ y_i ∈ X such that y_j is a child of y_i}.
- y_i is an ancestor of y_j if there is a chain from y_i to y_j; an(X) := {y_i ∈ N | ∃ y_j ∈ X such that y_i is an ancestor of y_j}.
- y_j is a descendant of y_i if there is a chain from y_i to y_j; de(X) := {y_j ∈ N | ∃ y_i ∈ X such that y_j is a descendant of y_i}.
The neighbors ngb(y_i) are the union of the parent and child sets of y_i.
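The ancestor set above can be computed by walking parent links backwards from a node. A minimal Python sketch (the edge-list representation and function name are my own, not from the slides):

```python
from collections import defaultdict

def ancestors(edges, node):
    """Return an({node}): every vertex with a directed chain into `node`."""
    parents = defaultdict(set)
    for u, v in edges:              # a directed edge u -> v makes u a parent of v
        parents[v].add(u)
    found, stack = set(), [node]
    while stack:
        for p in parents[stack.pop()]:
            if p not in found:      # visit each ancestor once
                found.add(p)
                stack.append(p)
    return found

# a -> b -> c and a -> d: the ancestors of c are {a, b}
edges = [("a", "b"), ("b", "c"), ("a", "d")]
print(ancestors(edges, "c"))
```

Descendants are computed symmetrically by walking child links forward.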

Graphical Models: Definitions
[Figure: an example directed graph on nodes a-g used to visualize roots, leaves, paths, chains, parents, children, ancestors, descendants, and neighbors]

Graphical Models: Definitions
Forks, inverted forks, and chain links [6]: consider a path (y_π1, y_π2, ..., y_πL) in a directed graph G = (N, E). Vertex y_πi is
- a fork if (y_πi, y_πi−1) and (y_πi, y_πi+1) are in E,
- an inverted fork (or collider) if (y_πi−1, y_πi) and (y_πi+1, y_πi) are in E,
- a chain link in all other cases.
[Figure: the example directed graph on nodes a-g]
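The three cases can be read directly off the edge set. A small Python sketch (the function and variable names are my own illustration):

```python
def classify(E, prev, v, nxt):
    """Classify the middle vertex v on a path segment prev - v - nxt
    of a digraph with directed edge set E (a set of (tail, head) pairs)."""
    if (v, prev) in E and (v, nxt) in E:
        return "fork"                # both edges point away from v
    if (prev, v) in E and (nxt, v) in E:
        return "inverted fork"       # both edges point into v (a collider)
    return "chain link"              # one edge in, one edge out

E = {("a", "b"), ("b", "c")}         # a -> b -> c
print(classify(E, "a", "b", "c"))    # b is a chain link here
```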

Graphical Models: What is factorization?
Joint probability distribution: using the chain rule with an arbitrary ordering of the variables [2],
p(x_1, x_2, ..., x_n) = ∏_{i=1}^n p(x_i | x_1, x_2, ..., x_{i−1})    (1)
Using graphical models leads to a compact representation [8]:
- Undirected GM: p(x_1, x_2, ..., x_n) = (1/Z) ∏_{(i,j)∈E} φ(x_i, x_j)    (2)
- Undirected tree GM (using junction tree theory): p(x_1, x_2, ..., x_n) = ∏_{i=1}^n p(x_i) ∏_{(i,j)∈E} p(x_i, x_j) / (p(x_i) p(x_j))    (3)
- Directed GM: p(x_1, x_2, ..., x_n) = ∏_{i=1}^n p(x_i | pa(x_i))    (4)
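Factorization (4) can be evaluated directly: the probability of a full assignment is the product of each variable's conditional given its parents. A minimal sketch for a three-node chain x1 -> x2 -> x3 (the CPT encoding and the numbers are my own illustration):

```python
def joint(assignment, parents, cpts):
    """Directed GM factorization (4): p(x) = prod_i p(x_i | pa(x_i))."""
    p = 1.0
    for var in cpts:
        pa_vals = tuple(assignment[pa] for pa in parents[var])
        p *= cpts[var][pa_vals][assignment[var]]
    return p

parents = {"x1": (), "x2": ("x1",), "x3": ("x2",)}
cpts = {  # cpts[var][parent values][value] = conditional probability
    "x1": {(): {0: 0.6, 1: 0.4}},
    "x2": {(0,): {0: 0.7, 1: 0.3}, (1,): {0: 0.2, 1: 0.8}},
    "x3": {(0,): {0: 0.9, 1: 0.1}, (1,): {0: 0.5, 1: 0.5}},
}

# p(x1=0, x2=1, x3=1) = 0.6 * 0.3 * 0.5 = 0.09
print(joint({"x1": 0, "x2": 1, "x3": 1}, parents, cpts))
```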

Graphical Models: What is factorization?
Example 1: with N binary random variables, representing the joint probability distribution via the chain rule requires O(2^N) parameters. A GM requires O(2^{|pa|}) parameters per variable, which can reduce the number of parameters exponentially, depending on which conditional independence assumptions we make; this helps in both inference and learning.
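The saving is easy to tabulate, under the usual convention that a binary variable with m binary parents needs 2^m free parameters (the function names are mine):

```python
def chain_rule_params(n):
    """Free parameters for a full joint over n binary variables."""
    return 2 ** n - 1

def gm_params(parents):
    """Free parameters for a directed GM over binary variables: each
    node with m parents needs 2**m rows of one free parameter each."""
    return sum(2 ** len(pa) for pa in parents.values())

# the DAG of Example 2: x1 -> x2 -> x3 <- x5, x1 -> x4 -> x5 -> x6
parents = {"x1": (), "x2": ("x1",), "x3": ("x2", "x5"),
           "x4": ("x1",), "x5": ("x4",), "x6": ("x5",)}
print(chain_rule_params(6), gm_params(parents))   # 63 vs 13
```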

Graphical Models: What is factorization?
Example 2: using the chain rule [1],
p(x_1, ..., x_6) = p(x_1) p(x_2 | x_1) p(x_3 | x_1, x_2) p(x_4 | x_1, x_2, x_3) p(x_5 | x_1, ..., x_4) p(x_6 | x_1, ..., x_5)    (5)
Using the graphical model,
p(x_1, ..., x_6) = p(x_1) p(x_2 | x_1) p(x_3 | x_2, x_5) p(x_4 | x_1) p(x_5 | x_4) p(x_6 | x_5)    (6)
[Figure: the DAG on x_1, ..., x_6 that encodes factorization (6)]

Graphical Models: Fun application of joint distribution factorization
In rooted trees, the joint probability distribution is the same whichever node is chosen as the root! Use Bayes' rule:
[Figure: a chain on nodes a, b, c shown undirected and rooted at a, at b, and at c]
p(a, b, c) = p(a) p(b | a) p(c | b) = p(a) (p(b) p(a | b) / p(a)) p(c | b) = p(b) p(a | b) p(c | b) = p(c) p(b | c) p(a | b)    (7)

Graphical Models: Undirected Graphical Models
Undirected graphical models: a family of multivariate probability distributions that factorize according to a graph G = (N, E). The vertex set N represents the random variables; the edge set E encodes the set of conditional independencies between the variables.
Definition: a random vector X is said to be Markov on G if, for every i, the random variable x_i is conditionally independent of all other variables given its neighbors:
p(x_i | x_{\i}) = p(x_i | ngb(x_i))    (8)
where p is the joint probability distribution and x_{\i} denotes all variables other than x_i.

Graphical Models: Undirected Graphical Models
Tree-structured graphical models: the family of multivariate probability distributions that are Markov on a tree T = (N, E).

Graphical Models: Definitions
d-separation [6]:
- A subset of variables S is said to separate x_i from x_j if all paths between x_i and x_j are separated by S.
- A path P is separated by a subset S of variables if at least one pair of successive edges along P is blocked by S.
Blocking [6]:
- Two edges meeting head-to-tail or tail-to-tail at node x (x is a chain link or a fork) are blocked by S if x is in S.
- Two edges meeting head-to-head at node x (x is an inverted fork) are blocked by S if neither x nor any of its descendants is in S.
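These rules can be checked mechanically on small graphs by enumerating every path and testing whether some triple along it is blocked. A brute-force Python sketch (exponential in graph size, so only for toy examples; all names are mine):

```python
def d_separated(edges, i, j, S):
    """True iff S d-separates i from j in the DAG given by `edges`."""
    E = set(edges)
    nbrs, children = {}, {}
    for u, v in edges:
        nbrs.setdefault(u, set()).add(v)
        nbrs.setdefault(v, set()).add(u)
        children.setdefault(u, set()).add(v)

    def descendants(x):
        seen, stack = set(), [x]
        while stack:
            for c in children.get(stack.pop(), ()):
                if c not in seen:
                    seen.add(c)
                    stack.append(c)
        return seen

    def blocked(path):
        # a path is blocked if at least one of its middle nodes blocks it
        for a, b, c in zip(path, path[1:], path[2:]):
            if (a, b) in E and (c, b) in E:        # inverted fork at b
                if b not in S and not (descendants(b) & S):
                    return True
            elif b in S:                           # chain link or fork at b
                return True
        return False

    def paths(cur, path):
        if cur == j:
            yield path
            return
        for n in nbrs.get(cur, ()):
            if n not in path:
                yield from paths(n, path + [n])

    return all(blocked(p) for p in paths(i, [i]))

edges = [("a", "c"), ("b", "c"), ("c", "d")]   # collider a -> c <- b, then c -> d
print(d_separated(edges, "a", "b", set()))     # True: the unobserved collider blocks
print(d_separated(edges, "a", "b", {"d"}))     # False: conditioning on a descendant opens it
```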

Graphical Models: Definitions
d-separation example [6]:
- d-sep(x_2, x_3 | {x_1})?
- d-sep(x_2, x_3 | {x_1, x_4})?
- d-sep(x_2, x_3 | {x_1, x_6})?
[Figure: a DAG on x_1, ..., x_6]

Graphical Models: Interesting application
The Lumiere Project [5] centers on harnessing probability and utility to provide assistance to computer software users. Lumiere prototypes served as the basis for components of the Office Assistant in the Microsoft Office 97 suite of productivity applications. The system infers a user's needs by considering the user's background, actions, and queries. The challenges are model construction for the time-varying goals of computer users and the need for a large database: over 25,000 hours of usability studies were invested in Office 97.

Reconstruction: What is reconstruction?
The problem is that samples are available only from a subset of the variables. The goal is to learn the minimal latent tree, i.e., a tree without any redundant hidden nodes.
Latent and minimal latent trees: a latent tree is a tree with node set N = V ∪ H, where V is the set of observed nodes and H is the set of latent (hidden) nodes. The set of minimal latent trees, T≥3, is the set of latent trees in which each hidden node has at least three neighbors (hidden or observed). All leaves are observed, although not all observed nodes need to be leaves.

Graphical Models: Interesting application
The Vista system [4] is a decision-theoretic system that has been used at NASA Mission Control Center in Houston for several years. It uses Bayesian networks to interpret live telemetry and provides advice on the likelihood of alternative failures of the space shuttle's propulsion systems. It considers time criticality and recommends actions of the highest expected utility, and it employs decision-theoretic methods for controlling the display of information, dynamically identifying the most important information to highlight.

Reconstruction: Additive metric
Define a measurement [8]: information distances, defined from pairwise distributions. For Gaussian graphical models, the correlation coefficient of two random variables x_i and x_j is
ρ_ij = cov(x_i, x_j) / sqrt(var(x_i) var(x_j))    (9)
and the information distance is
d_ij = −log |ρ_ij|    (10)
There is an inverse relation between information distance and correlation, and the definition extends to discrete random variables.
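In the Gaussian case, (9) and (10) amount to two lines of code (the function name and the covariance-matrix input are my own):

```python
import math

def information_distance(cov, i, j):
    """d_ij = -log |rho_ij| from a covariance matrix given as nested lists."""
    rho = cov[i][j] / math.sqrt(cov[i][i] * cov[j][j])  # correlation (9)
    return -math.log(abs(rho))                          # distance (10)

cov = [[1.0, 0.5],
       [0.5, 1.0]]
print(information_distance(cov, 0, 1))   # -log 0.5 = log 2 ≈ 0.693
```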

Reconstruction: Additive metric
Proposition [8]: the information distances d_ij are additive tree metrics. In other words, if the joint probability distribution p(x) is a tree-structured graphical model Markov on the tree T_p = (N, E_p), then the information distances are additive on T_p:
∀ k, l ∈ N : d_kl = Σ_{(i,j) ∈ Path(k,l)} d_ij    (11)
Proof: homework!
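To see why additivity is plausible: on a Markov chain, the correlation between the endpoints is the product of the correlations along the edges, so the negative logs add. A numerical sanity check (illustrative numbers, assuming this multiplicative property on a chain x1 - x2 - x3):

```python
import math

d = lambda rho: -math.log(abs(rho))      # information distance (10)

rho12, rho23 = 0.8, 0.5
rho13 = rho12 * rho23                    # correlations multiply along the chain
print(d(rho13), d(rho12) + d(rho23))     # the two values agree: d_13 = d_12 + d_23
```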

Reconstruction: Sibling grouping
Lemma [8]: for distances d_ij for all i, j ∈ V on a tree T ∈ T≥3, the following two properties of Φ_ijk = d_ik − d_jk hold.
1. Φ_ijk = d_ij for all k ∈ V \ {i, j} iff i is a leaf and j is its parent; similarly, Φ_ijk = −d_ij for all k ∈ V \ {i, j} iff j is a leaf and i is its parent.
2. −d_ij < Φ_ijk = Φ_ijk′ < d_ij for all k, k′ ∈ V \ {i, j} iff both i and j are leaves and they have the same parent (they belong to the same sibling group).
Proof of 2: homework!

Reconstruction: Sibling grouping
Proof of 1:
(⇒) Using the additive property of information distances, if i is a leaf and j is its parent, then d_ik = d_ij + d_jk, and therefore Φ_ijk = d_ij for all k ≠ i, j.
(⇐) By contradiction, suppose i and j are not connected by an edge. Then there exists a node u ≠ i, j on the path connecting i and j. If u ∈ V, let k = u; otherwise, let k be an observed node in the subtree of u away from i and j, which exists since T ∈ T≥3. Therefore, d_ij = d_iu + d_uj > d_iu − d_uj = d_ik − d_kj = Φ_ijk, which contradicts Φ_ijk = d_ij.
[Figure: i and j joined through u, with witness k in the subtree of u]

Reconstruction: Sibling grouping
Proof of 1 (cont'd):
(⇐) By contradiction, if i is not a leaf, then there exists a node u ≠ i, j such that (i, u) ∈ E. Let k = u if u ∈ V; otherwise, let k be an observed node in the subtree of u away from i and j. Therefore, Φ_ijk = d_ik − d_jk = −d_ij < d_ij, which is again a contradiction; therefore, i is a leaf.
[Figure: the path j − i − u, with witness k in the subtree of u]

Reconstruction: Sibling grouping
Using the previous lemma to determine node relationships [8]: for every pair i, j ∈ V, consider the following.
1. If Φ_ijk = d_ij for all k ∈ V \ {i, j}, then i is a leaf node and j is its parent. Similarly, if Φ_ijk = −d_ij for all k ∈ V \ {i, j}, then j is a leaf and i is its parent.
2. If Φ_ijk is constant for all k ∈ V \ {i, j} but equal to neither d_ij nor −d_ij, then i and j are leaves and they are siblings.
3. If Φ_ijk is not constant over k ∈ V \ {i, j}, then there are three cases: (a) nodes i and j are neither siblings nor in a parent-child relationship; (b) nodes i and j are siblings but at least one of them is not a leaf; (c) nodes i and j are in a parent-child relationship but the child is not a leaf.
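The test in cases 1 and 2 can be coded directly: compute Φ_ijk over all witnesses k and check whether the values are constant and how they compare with ±d_ij. A sketch with exact, noise-free distances (the dictionary layout and node labels are mine):

```python
def classify_pair(d, i, j, V):
    """Sibling-grouping test from Phi_ijk = d_ik - d_jk over witnesses k."""
    vals = {d[i][k] - d[j][k] for k in V if k not in (i, j)}
    if len(vals) != 1:
        return "inconclusive"        # case 3: resolved in later iterations
    v = vals.pop()
    if v == d[i][j]:
        return "i is a leaf, j its parent"
    if v == -d[i][j]:
        return "j is a leaf, i its parent"
    return "leaf siblings"           # constant but not +/- d_ij

# observed node j with leaf child i (edge length 1) and branches j-k (2), j-l (3)
d1 = {"i": {"j": 1, "k": 3, "l": 4},
      "j": {"i": 1, "k": 2, "l": 3},
      "k": {"i": 3, "j": 2, "l": 5},
      "l": {"i": 4, "j": 3, "k": 5}}
print(classify_pair(d1, "i", "j", {"i", "j", "k", "l"}))   # i is a leaf, j its parent
```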

Reconstruction: Sibling grouping
[Figure: two example trees with edge lengths d_1, ..., d_8. Case 1: i is a leaf with parent j, so Φ_ijk = d_8 = d_ij for all k ∈ V \ {i, j}. Case 2: i and j are leaf siblings, so Φ_ijk = d_6 − d_7 is constant over k but differs from ±d_ij = ±(d_6 + d_7).]

Reconstruction: Sibling grouping
[Figure: three example trees with edge lengths d_1, ..., d_8 illustrating cases 3a, 3b, and 3c, in which Φ_ijk takes different values for different witnesses k, k′ ∈ V \ {i, j}]

Reconstruction: Recursive Grouping (RG) Algorithm
1. Initialize Y = V.
2. Compute Φ_ijk = d_ik − d_jk for all i, j, k ∈ Y.
3. Using sibling grouping, define {Π_l}, l = 1, ..., L, to be a partition of Y such that, within every subset Π_l with |Π_l| ≥ 2, any two nodes are either siblings that are leaves or in a parent-child relationship in which the child is a leaf.
4. Add the singleton sets to Y_new.
5. For each Π_l with |Π_l| ≥ 2: if Π_l contains a parent node, add it to Y_new; otherwise, create a new hidden node, connect it to all the nodes in Π_l, and add it to Y_new.
6. Update Y_old to be Y, and Y to be Y_new.
7. Compute the distances of the new hidden nodes.
8. If |Y| ≥ 3, go to step 2. Otherwise, if |Y| = 2, connect the two remaining nodes in Y and stop; if |Y| = 1, stop.

Reconstruction: Recursive Grouping (RG) Algorithm
[Figure: an example run of RG, showing the original latent tree with hidden nodes h_1, h_2, h_3 and the reconstructed portions after the first, second, and third iterations]

Reconstruction: Recursive Grouping (RG) Algorithm
Proof of step 7 (computing the distances of new hidden nodes): let i, j ∈ ch(h) and k ∈ Y_old \ {i, j}. We know that d_ih − d_jh = d_ik − d_jk = Φ_ijk and d_ih + d_jh = d_ij. Therefore, we can recover the distance between a previously active node i ∈ Y_old and its new hidden parent h ∈ Y using
d_ih = (1/2)(d_ij + Φ_ijk)    (12)
For any other active node l ∈ Y, we can compute d_hl using a child node i ∈ ch(h):
d_hl = d_il − d_ih if l ∈ Y_old; otherwise, d_hl = d_ik − d_ih − d_lk, where k ∈ ch(l)    (13)
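Formula (12) is a three-point computation on the distance matrix. A one-line sketch plus a check on exact tree distances (the node labels and edge lengths are my own example):

```python
def dist_to_parent(d, i, j, k):
    """d_ih = (d_ij + Phi_ijk) / 2, formula (12): the distance from child i
    to the new hidden parent h of siblings i, j, using any witness k."""
    return 0.5 * (d[i][j] + d[i][k] - d[j][k])

# hidden h with children i (edge length 1) and j (edge length 2), and h-k of length 3
d = {"i": {"j": 3, "k": 4},
     "j": {"i": 3, "k": 5},
     "k": {"i": 4, "j": 5}}
print(dist_to_parent(d, "i", "j", "k"))   # recovers d_ih = 1.0
```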

Reconstruction: Recap
Steps to learn a latent tree:
1. Define an additive metric.
2. Perform the sibling grouping test to determine node relationships.
3. Run the RG algorithm.

Open Issues: What next?
Improvement! Probabilistic models are used as a key component in some challenging applications, and they remain to be applied in other fields:
- Learning other types of GMs: polytrees, general graphs
- Applying the theorems to random processes
- Defining interrelations

Homework
Question 1: Prove that information distances are additive tree metrics.
Question 2: Prove that for distances d_ij for all i, j ∈ V on a tree T ∈ T≥3, the following property of Φ_ijk = d_ik − d_jk holds: −d_ij < Φ_ijk = Φ_ijk′ < d_ij for all k, k′ ∈ V \ {i, j} iff both i and j are leaves and they have the same parent (they belong to the same sibling group).

Homework
Question 3: Draw the digraph associated with the following matrix M and answer the following:
- d-sep(x_1, x_2 | {x_6, x_7})?
- d-sep(x_4, x_5 | {x_1, x_2, x_3, x_6})?
- d-sep(x_1, x_7 | {x_3, x_4, x_5})?
M = [matrix not recovered in the transcription]    (14)

Questions?

References I
[1] J. Pearl. Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. 1988.
[2] D. Koller and N. Friedman. Probabilistic Graphical Models: Principles and Techniques. MIT Press, 2009.
[3] N. Friedman. Inferring Cellular Networks Using Probabilistic Graphical Models. Vol. 303, Issue 5659, 2004.
[4] M. Barry, E. Horvitz, C. Ruokangas, and S. Srinivas. Vista Goes Online: Decision-Analytic Systems for Real-Time Decision-Making in Mission Control. 1994.
[5] E. Horvitz, J. Breese, D. Heckerman, D. Hovel, and K. Rommelse. The Lumiere Project: Bayesian User Modeling for Inferring the Goals and Needs of Software Users. 1998.
[6] J. Pearl. Fusion, Propagation, and Structuring in Belief Networks. Artificial Intelligence 29, 1986.

References II
[7] G. Rebane and J. Pearl. The Recovery of Causal Polytrees from Statistical Data. Proceedings of the Third Conference on Uncertainty in Artificial Intelligence, 1987.
[8] M. J. Choi, V. Y. F. Tan, and A. S. Willsky. Learning Latent Tree Graphical Models. Journal of Machine Learning Research, Volume 12, 2011.
[9] Gerolamo Cardano.
[10] Pierre de Fermat.
[11] Blaise Pascal.
[12] Andrey Kolmogorov.
[13] Thomas Bayes.
[14] D. E. Heckerman. An Evaluation of the Diagnostic Accuracy of Pathfinder. 1991.


More information

Directed Graphical Models (Bayes Nets) (9/4/13)

Directed Graphical Models (Bayes Nets) (9/4/13) STA561: Probabilistic machine learning Directed Graphical Models (Bayes Nets) (9/4/13) Lecturer: Barbara Engelhardt Scribes: Richard (Fangjian) Guo, Yan Chen, Siyang Wang, Huayang Cui 1 Introduction For

More information

2. Graphical Models. Undirected pairwise graphical models. Factor graphs. Bayesian networks. Conversion between graphical models. Graphical Models 2-1

2. Graphical Models. Undirected pairwise graphical models. Factor graphs. Bayesian networks. Conversion between graphical models. Graphical Models 2-1 Graphical Models 2-1 2. Graphical Models Undirected pairwise graphical models Factor graphs Bayesian networks Conversion between graphical models Graphical Models 2-2 Graphical models Families of graphical

More information

Modeling and Reasoning with Bayesian Networks. Adnan Darwiche University of California Los Angeles, CA

Modeling and Reasoning with Bayesian Networks. Adnan Darwiche University of California Los Angeles, CA Modeling and Reasoning with Bayesian Networks Adnan Darwiche University of California Los Angeles, CA darwiche@cs.ucla.edu June 24, 2008 Contents Preface 1 1 Introduction 1 1.1 Automated Reasoning........................

More information

Graphical Models and Markov Blankets

Graphical Models and Markov Blankets Stephan Stahlschmidt Ladislaus von Bortkiewicz Chair of Statistics C.A.S.E. Center for Applied Statistics and Economics Humboldt-Universität zu Berlin Motivation 1-1 Why Graphical Models? Illustration

More information

Node Aggregation for Distributed Inference in Bayesian Networks

Node Aggregation for Distributed Inference in Bayesian Networks Node Aggregation for Distributed Inference in Bayesian Networks Kuo-Chu Chang and Robert Fung Advanced Decision Systmes 1500 Plymouth Street Mountain View, California 94043-1230 Abstract This study describes

More information

Lecture 3: Graphs and flows

Lecture 3: Graphs and flows Chapter 3 Lecture 3: Graphs and flows Graphs: a useful combinatorial structure. Definitions: graph, directed and undirected graph, edge as ordered pair, path, cycle, connected graph, strongly connected

More information

Foundations of Discrete Mathematics

Foundations of Discrete Mathematics Foundations of Discrete Mathematics Chapter 12 By Dr. Dalia M. Gil, Ph.D. Trees Tree are useful in computer science, where they are employed in a wide range of algorithms. They are used to construct efficient

More information

Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science Algorithms for Inference Fall 2014

Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science Algorithms for Inference Fall 2014 Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science 6.438 Algorithms for Inference Fall 2014 1 Course Overview This course is about performing inference in complex

More information

Lecture 22 Tuesday, April 10

Lecture 22 Tuesday, April 10 CIS 160 - Spring 2018 (instructor Val Tannen) Lecture 22 Tuesday, April 10 GRAPH THEORY Directed Graphs Directed graphs (a.k.a. digraphs) are an important mathematical modeling tool in Computer Science,

More information

CS 343: Artificial Intelligence

CS 343: Artificial Intelligence CS 343: Artificial Intelligence Bayes Nets: Inference Prof. Scott Niekum The University of Texas at Austin [These slides based on those of Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley.

More information

Machine Learning

Machine Learning Machine Learning 10-701 Tom M. Mitchell Machine Learning Department Carnegie Mellon University February 15, 2011 Today: Graphical models Inference Conditional independence and D-separation Learning from

More information

Graph Algorithms Using Depth First Search

Graph Algorithms Using Depth First Search Graph Algorithms Using Depth First Search Analysis of Algorithms Week 8, Lecture 1 Prepared by John Reif, Ph.D. Distinguished Professor of Computer Science Duke University Graph Algorithms Using Depth

More information

Machine Learning Lecture 16

Machine Learning Lecture 16 ourse Outline Machine Learning Lecture 16 undamentals (2 weeks) ayes ecision Theory Probability ensity stimation Undirected raphical Models & Inference 28.06.2016 iscriminative pproaches (5 weeks) Linear

More information

Consistent and Efficient Reconstruction of Latent Tree Models

Consistent and Efficient Reconstruction of Latent Tree Models Stochastic Systems Group Consistent and Efficient Reconstruction of Latent Tree Models Myung Jin Choi Joint work with Vincent Tan, Anima Anandkumar, and Alan S. Willsky Laboratory for Information and Decision

More information

Graphical Models Part 1-2 (Reading Notes)

Graphical Models Part 1-2 (Reading Notes) Graphical Models Part 1-2 (Reading Notes) Wednesday, August 3 2011, 2:35 PM Notes for the Reading of Chapter 8 Graphical Models of the book Pattern Recognition and Machine Learning (PRML) by Chris Bishop

More information

Machine Learning

Machine Learning Machine Learning 10-601 Tom M. Mitchell Machine Learning Department Carnegie Mellon University October 2, 2012 Today: Graphical models Bayes Nets: Representing distributions Conditional independencies

More information

Learning Latent Tree Graphical Models

Learning Latent Tree Graphical Models Journal of Machine Learning Research 12 (2011) 1771-1812 Submitted 9/10; Revised 2/11; Published 5/11 Learning Latent Tree Graphical Models Myung Jin Choi Stochastic Systems Group Laboratory for Information

More information

Probabilistic Graphical Models

Probabilistic Graphical Models Overview of Part Two Probabilistic Graphical Models Part Two: Inference and Learning Christopher M. Bishop Exact inference and the junction tree MCMC Variational methods and EM Example General variational

More information

Machine Learning

Machine Learning Machine Learning 10-601 Tom M. Mitchell Machine Learning Department Carnegie Mellon University February 18, 2015 Today: Graphical models Bayes Nets: Representing distributions Conditional independencies

More information

ECE 6504: Advanced Topics in Machine Learning Probabilistic Graphical Models and Large-Scale Learning

ECE 6504: Advanced Topics in Machine Learning Probabilistic Graphical Models and Large-Scale Learning ECE 6504: Advanced Topics in Machine Learning Probabilistic Graphical Models and Large-Scale Learning Topics Bayes Nets: Inference (Finish) Variable Elimination Graph-view of VE: Fill-edges, induced width

More information

3 : Representation of Undirected GMs

3 : Representation of Undirected GMs 0-708: Probabilistic Graphical Models 0-708, Spring 202 3 : Representation of Undirected GMs Lecturer: Eric P. Xing Scribes: Nicole Rafidi, Kirstin Early Last Time In the last lecture, we discussed directed

More information

Machine Learning. Lecture Slides for. ETHEM ALPAYDIN The MIT Press, h1p://

Machine Learning. Lecture Slides for. ETHEM ALPAYDIN The MIT Press, h1p:// Lecture Slides for INTRODUCTION TO Machine Learning ETHEM ALPAYDIN The MIT Press, 2010 alpaydin@boun.edu.tr h1p://www.cmpe.boun.edu.tr/~ethem/i2ml2e CHAPTER 16: Graphical Models Graphical Models Aka Bayesian

More information

Probabilistic Partial Evaluation: Exploiting rule structure in probabilistic inference

Probabilistic Partial Evaluation: Exploiting rule structure in probabilistic inference Probabilistic Partial Evaluation: Exploiting rule structure in probabilistic inference David Poole Department of Computer Science University of British Columbia 2366 Main Mall, Vancouver, B.C., Canada

More information

A note on the pairwise Markov condition in directed Markov fields

A note on the pairwise Markov condition in directed Markov fields TECHNICAL REPORT R-392 April 2012 A note on the pairwise Markov condition in directed Markov fields Judea Pearl University of California, Los Angeles Computer Science Department Los Angeles, CA, 90095-1596,

More information

Approximate Discrete Probability Distribution Representation using a Multi-Resolution Binary Tree

Approximate Discrete Probability Distribution Representation using a Multi-Resolution Binary Tree Approximate Discrete Probability Distribution Representation using a Multi-Resolution Binary Tree David Bellot and Pierre Bessière GravirIMAG CNRS and INRIA Rhône-Alpes Zirst - 6 avenue de l Europe - Montbonnot

More information

CS 343: Artificial Intelligence

CS 343: Artificial Intelligence CS 343: Artificial Intelligence Bayes Nets: Independence Prof. Scott Niekum The University of Texas at Austin [These slides based on those of Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley.

More information

Introduction to Computers and Programming. Concept Question

Introduction to Computers and Programming. Concept Question Introduction to Computers and Programming Prof. I. K. Lundqvist Lecture 7 April 2 2004 Concept Question G1(V1,E1) A graph G(V, where E) is V1 a finite = {}, nonempty E1 = {} set of G2(V2,E2) vertices and

More information

A Brief Introduction to Bayesian Networks AIMA CIS 391 Intro to Artificial Intelligence

A Brief Introduction to Bayesian Networks AIMA CIS 391 Intro to Artificial Intelligence A Brief Introduction to Bayesian Networks AIMA 14.1-14.3 CIS 391 Intro to Artificial Intelligence (LDA slides from Lyle Ungar from slides by Jonathan Huang (jch1@cs.cmu.edu)) Bayesian networks A simple,

More information

6c Lecture 3 & 4: April 8 & 10, 2014

6c Lecture 3 & 4: April 8 & 10, 2014 6c Lecture 3 & 4: April 8 & 10, 2014 3.1 Graphs and trees We begin by recalling some basic definitions from graph theory. Definition 3.1. A (undirected, simple) graph consists of a set of vertices V and

More information

Testing Independencies in Bayesian Networks with i-separation

Testing Independencies in Bayesian Networks with i-separation Proceedings of the Twenty-Ninth International Florida Artificial Intelligence Research Society Conference Testing Independencies in Bayesian Networks with i-separation Cory J. Butz butz@cs.uregina.ca University

More information

Abstract. 2 Background 2.1 Belief Networks. 1 Introduction

Abstract. 2 Background 2.1 Belief Networks. 1 Introduction Probabilistic Partial Evaluation: Exploiting rule structure in probabilistic inference* David Poole Department of Computer Science University of British Columbia 2366 Main Mall, Vancouver, B.C., Canada

More information

Evolution Module. 6.1 Phylogenetic Trees. Bob Gardner and Lev Yampolski. Integrated Biology and Discrete Math (IBMS 1300)

Evolution Module. 6.1 Phylogenetic Trees. Bob Gardner and Lev Yampolski. Integrated Biology and Discrete Math (IBMS 1300) Evolution Module 6.1 Phylogenetic Trees Bob Gardner and Lev Yampolski Integrated Biology and Discrete Math (IBMS 1300) Fall 2008 1 INDUCTION Note. The natural numbers N is the familiar set N = {1, 2, 3,...}.

More information

An undirected graph is a tree if and only of there is a unique simple path between any 2 of its vertices.

An undirected graph is a tree if and only of there is a unique simple path between any 2 of its vertices. Trees Trees form the most widely used subclasses of graphs. In CS, we make extensive use of trees. Trees are useful in organizing and relating data in databases, file systems and other applications. Formal

More information

10708 Graphical Models: Homework 2

10708 Graphical Models: Homework 2 10708 Graphical Models: Homework 2 Due October 15th, beginning of class October 1, 2008 Instructions: There are six questions on this assignment. Each question has the name of one of the TAs beside it,

More information

Graphical Models & HMMs

Graphical Models & HMMs Graphical Models & HMMs Henrik I. Christensen Robotics & Intelligent Machines @ GT Georgia Institute of Technology, Atlanta, GA 30332-0280 hic@cc.gatech.edu Henrik I. Christensen (RIM@GT) Graphical Models

More information

BACKGROUND: A BRIEF INTRODUCTION TO GRAPH THEORY

BACKGROUND: A BRIEF INTRODUCTION TO GRAPH THEORY BACKGROUND: A BRIEF INTRODUCTION TO GRAPH THEORY General definitions; Representations; Graph Traversals; Topological sort; Graphs definitions & representations Graph theory is a fundamental tool in sparse

More information

Building Classifiers using Bayesian Networks

Building Classifiers using Bayesian Networks Building Classifiers using Bayesian Networks Nir Friedman and Moises Goldszmidt 1997 Presented by Brian Collins and Lukas Seitlinger Paper Summary The Naive Bayes classifier has reasonable performance

More information

STAT 598L Probabilistic Graphical Models. Instructor: Sergey Kirshner. Exact Inference

STAT 598L Probabilistic Graphical Models. Instructor: Sergey Kirshner. Exact Inference STAT 598L Probabilistic Graphical Models Instructor: Sergey Kirshner Exact Inference What To Do With Bayesian/Markov Network? Compact representation of a complex model, but Goal: efficient extraction of

More information

Bayesian Classification Using Probabilistic Graphical Models

Bayesian Classification Using Probabilistic Graphical Models San Jose State University SJSU ScholarWorks Master's Projects Master's Theses and Graduate Research Spring 2014 Bayesian Classification Using Probabilistic Graphical Models Mehal Patel San Jose State University

More information

Graphs. Part I: Basic algorithms. Laura Toma Algorithms (csci2200), Bowdoin College

Graphs. Part I: Basic algorithms. Laura Toma Algorithms (csci2200), Bowdoin College Laura Toma Algorithms (csci2200), Bowdoin College Undirected graphs Concepts: connectivity, connected components paths (undirected) cycles Basic problems, given undirected graph G: is G connected how many

More information

Fully dynamic algorithm for recognition and modular decomposition of permutation graphs

Fully dynamic algorithm for recognition and modular decomposition of permutation graphs Fully dynamic algorithm for recognition and modular decomposition of permutation graphs Christophe Crespelle Christophe Paul CNRS - Département Informatique, LIRMM, Montpellier {crespell,paul}@lirmm.fr

More information

Crossing bridges. Crossing bridges Great Ideas in Theoretical Computer Science. Lecture 12: Graphs I: The Basics. Königsberg (Prussia)

Crossing bridges. Crossing bridges Great Ideas in Theoretical Computer Science. Lecture 12: Graphs I: The Basics. Königsberg (Prussia) 15-251 Great Ideas in Theoretical Computer Science Lecture 12: Graphs I: The Basics February 22nd, 2018 Crossing bridges Königsberg (Prussia) Now Kaliningrad (Russia) Is there a way to walk through the

More information

Exact Inference: Elimination and Sum Product (and hidden Markov models)

Exact Inference: Elimination and Sum Product (and hidden Markov models) Exact Inference: Elimination and Sum Product (and hidden Markov models) David M. Blei Columbia University October 13, 2015 The first sections of these lecture notes follow the ideas in Chapters 3 and 4

More information

Statistical and Learning Techniques in Computer Vision Lecture 1: Markov Random Fields Jens Rittscher and Chuck Stewart

Statistical and Learning Techniques in Computer Vision Lecture 1: Markov Random Fields Jens Rittscher and Chuck Stewart Statistical and Learning Techniques in Computer Vision Lecture 1: Markov Random Fields Jens Rittscher and Chuck Stewart 1 Motivation Up to now we have considered distributions of a single random variable

More information

I I I I I I I I I I I I I I I I I I I

I I I I I I I I I I I I I I I I I I I UPDATNG PROBABLTES N MULTPLY-CONNECTED BELEF NETWORKS H.J. Suennondt and G.F. Cooper Medical Computer Science Group Stanford University Stanford, California 94305-5479 This paper focuses on probability

More information

Sum-Product Networks. STAT946 Deep Learning Guest Lecture by Pascal Poupart University of Waterloo October 15, 2015

Sum-Product Networks. STAT946 Deep Learning Guest Lecture by Pascal Poupart University of Waterloo October 15, 2015 Sum-Product Networks STAT946 Deep Learning Guest Lecture by Pascal Poupart University of Waterloo October 15, 2015 Introduction Outline What is a Sum-Product Network? Inference Applications In more depth

More information

Graph Algorithms. Chapter 22. CPTR 430 Algorithms Graph Algorithms 1

Graph Algorithms. Chapter 22. CPTR 430 Algorithms Graph Algorithms 1 Graph Algorithms Chapter 22 CPTR 430 Algorithms Graph Algorithms Why Study Graph Algorithms? Mathematical graphs seem to be relatively specialized and abstract Why spend so much time and effort on algorithms

More information

A Discovery Algorithm for Directed Cyclic Graphs

A Discovery Algorithm for Directed Cyclic Graphs A Discovery Algorithm for Directed Cyclic Graphs Thomas Richardson 1 Logic and Computation Programme CMU, Pittsburgh PA 15213 1. Introduction Directed acyclic graphs have been used fruitfully to represent

More information

Recognizability Equals Definability for Graphs of Bounded Treewidth and Bounded Chordality

Recognizability Equals Definability for Graphs of Bounded Treewidth and Bounded Chordality Recognizability Equals Definability for Graphs of Bounded Treewidth and Bounded Chordality Hans L. Bodlaender, Utrecht University and Eindhoven University of Technology Pinar Heggernes, University of Bergen

More information

Reasoning About Uncertainty

Reasoning About Uncertainty Reasoning About Uncertainty Graphical representation of causal relations (examples) Graphical models Inference in graphical models (introduction) 1 Jensen, 1996 Example 1: Icy Roads 2 1 Jensen, 1996 Example

More information

Lecture 5: Graphs & their Representation

Lecture 5: Graphs & their Representation Lecture 5: Graphs & their Representation Why Do We Need Graphs Graph Algorithms: Many problems can be formulated as problems on graphs and can be solved with graph algorithms. To learn those graph algorithms,

More information

A Well-Behaved Algorithm for Simulating Dependence Structures of Bayesian Networks

A Well-Behaved Algorithm for Simulating Dependence Structures of Bayesian Networks A Well-Behaved Algorithm for Simulating Dependence Structures of Bayesian Networks Yang Xiang and Tristan Miller Department of Computer Science University of Regina Regina, Saskatchewan, Canada S4S 0A2

More information

Bayesian Machine Learning - Lecture 6

Bayesian Machine Learning - Lecture 6 Bayesian Machine Learning - Lecture 6 Guido Sanguinetti Institute for Adaptive and Neural Computation School of Informatics University of Edinburgh gsanguin@inf.ed.ac.uk March 2, 2015 Today s lecture 1

More information

BAYESIAN NETWORKS STRUCTURE LEARNING

BAYESIAN NETWORKS STRUCTURE LEARNING BAYESIAN NETWORKS STRUCTURE LEARNING Xiannian Fan Uncertainty Reasoning Lab (URL) Department of Computer Science Queens College/City University of New York http://url.cs.qc.cuny.edu 1/52 Overview : Bayesian

More information

Evolutionary tree reconstruction (Chapter 10)

Evolutionary tree reconstruction (Chapter 10) Evolutionary tree reconstruction (Chapter 10) Early Evolutionary Studies Anatomical features were the dominant criteria used to derive evolutionary relationships between species since Darwin till early

More information

Evaluating the Effect of Perturbations in Reconstructing Network Topologies

Evaluating the Effect of Perturbations in Reconstructing Network Topologies DSC 2 Working Papers (Draft Versions) http://www.ci.tuwien.ac.at/conferences/dsc-2/ Evaluating the Effect of Perturbations in Reconstructing Network Topologies Florian Markowetz and Rainer Spang Max-Planck-Institute

More information

Lecture 20: Clustering and Evolution

Lecture 20: Clustering and Evolution Lecture 20: Clustering and Evolution Study Chapter 10.4 10.8 11/11/2014 Comp 555 Bioalgorithms (Fall 2014) 1 Clique Graphs A clique is a graph where every vertex is connected via an edge to every other

More information

AN ANALYSIS ON MARKOV RANDOM FIELDS (MRFs) USING CYCLE GRAPHS

AN ANALYSIS ON MARKOV RANDOM FIELDS (MRFs) USING CYCLE GRAPHS Volume 8 No. 0 208, -20 ISSN: 3-8080 (printed version); ISSN: 34-3395 (on-line version) url: http://www.ijpam.eu doi: 0.2732/ijpam.v8i0.54 ijpam.eu AN ANALYSIS ON MARKOV RANDOM FIELDS (MRFs) USING CYCLE

More information

A GRAPH FROM THE VIEWPOINT OF ALGEBRAIC TOPOLOGY

A GRAPH FROM THE VIEWPOINT OF ALGEBRAIC TOPOLOGY A GRAPH FROM THE VIEWPOINT OF ALGEBRAIC TOPOLOGY KARL L. STRATOS Abstract. The conventional method of describing a graph as a pair (V, E), where V and E repectively denote the sets of vertices and edges,

More information

Inference for loglinear models (contd):

Inference for loglinear models (contd): Stat 504, Lecture 25 1 Inference for loglinear models (contd): Loglinear/Logit connection Intro to Graphical Models Stat 504, Lecture 25 2 Loglinear Models no distinction between response and explanatory

More information

v V Question: How many edges are there in a graph with 10 vertices each of degree 6?

v V Question: How many edges are there in a graph with 10 vertices each of degree 6? ECS20 Handout Graphs and Trees March 4, 2015 (updated 3/9) Notion of a graph 1. A graph G = (V,E) consists of V, a nonempty set of vertices (or nodes) and E, a set of pairs of elements of V called edges.

More information

Variational Methods for Graphical Models

Variational Methods for Graphical Models Chapter 2 Variational Methods for Graphical Models 2.1 Introduction The problem of probabb1istic inference in graphical models is the problem of computing a conditional probability distribution over the

More information

BMI/STAT 768: Lecture 06 Trees in Graphs

BMI/STAT 768: Lecture 06 Trees in Graphs BMI/STAT 768: Lecture 06 Trees in Graphs Moo K. Chung mkchung@wisc.edu February 11, 2018 Parts of this lecture is based on [3, 5]. Many objects and data can be represented as networks. Unfortunately networks

More information

4 Basics of Trees. Petr Hliněný, FI MU Brno 1 FI: MA010: Trees and Forests

4 Basics of Trees. Petr Hliněný, FI MU Brno 1 FI: MA010: Trees and Forests 4 Basics of Trees Trees, actually acyclic connected simple graphs, are among the simplest graph classes. Despite their simplicity, they still have rich structure and many useful application, such as in

More information

Introduction to Graphical Models

Introduction to Graphical Models Robert Collins CSE586 Introduction to Graphical Models Readings in Prince textbook: Chapters 10 and 11 but mainly only on directed graphs at this time Credits: Several slides are from: Review: Probability

More information

Av. Prof. Mello Moraes, 2231, , São Paulo, SP - Brazil

Av. Prof. Mello Moraes, 2231, , São Paulo, SP - Brazil " Generalizing Variable Elimination in Bayesian Networks FABIO GAGLIARDI COZMAN Escola Politécnica, University of São Paulo Av Prof Mello Moraes, 31, 05508-900, São Paulo, SP - Brazil fgcozman@uspbr Abstract

More information

Chapter 8 of Bishop's Book: Graphical Models

Chapter 8 of Bishop's Book: Graphical Models Chapter 8 of Bishop's Book: Graphical Models Review of Probability Probability density over possible values of x Used to find probability of x falling in some range For continuous variables, the probability

More information

Conditional PSDDs: Modeling and Learning with Modular Knowledge

Conditional PSDDs: Modeling and Learning with Modular Knowledge Conditional PSDDs: Modeling and Learning with Modular Knowledge Yujia Shen and Arthur Choi and Adnan Darwiche Computer Science Department University of California, Los Angeles {yujias,aychoi,darwiche}@csuclaedu

More information