Junction tree propagation - BNDG 4-4.6


1 Junction tree propagation - BNDG 4-4.6. Finn V. Jensen and Thomas D. Nielsen.

2 Exact Inference: Message Passing in Join Trees. A more sophisticated inference technique, used in most implemented Bayesian network systems (e.g. Hugin). Overview: given a Bayesian network for P(V). Preprocessing: construct a new internal representation for P(V) called a junction tree. Inference/updating: retrieve P(A | E = e) for each single A ∈ V.

3 Calculate a marginal
P(A4) = Σ_{A1} Σ_{A2} Σ_{A3} Σ_{A5} Σ_{A6} φ1(A1) φ2(A2,A1) φ3(A3,A1) φ4(A4,A2) φ5(A5,A2,A3) φ6(A6,A3)
= Σ_{A1} φ1(A1) Σ_{A2} φ2(A2,A1) φ4(A4,A2) Σ_{A3} φ3(A3,A1) Σ_{A5} φ5(A5,A2,A3) Σ_{A6} φ6(A6,A3)
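The rearranged sum above can be sketched in code. The following is a minimal variable-elimination sketch in Python; the factor encoding, the helper names and the concrete CPT numbers are all invented for illustration (the slides give no numeric tables), and every variable is binary.

```python
from itertools import product

# Factors are dicts with an ordered variable list and a table mapping
# assignment tuples to numbers. phi_i = P(A_i | pa(A_i)).

def multiply(f, g):
    """Pointwise product of two factors."""
    vs = f["vars"] + [v for v in g["vars"] if v not in f["vars"]]
    table = {}
    for asg in product((0, 1), repeat=len(vs)):
        a = dict(zip(vs, asg))
        table[asg] = (f["table"][tuple(a[v] for v in f["vars"])]
                      * g["table"][tuple(a[v] for v in g["vars"])])
    return {"vars": vs, "table": table}

def sum_out(f, var):
    """Marginalize var out of f."""
    i = f["vars"].index(var)
    table = {}
    for asg, p in f["table"].items():
        key = asg[:i] + asg[i + 1:]
        table[key] = table.get(key, 0.0) + p
    return {"vars": [v for v in f["vars"] if v != var], "table": table}

def eliminate(factors, order):
    """Sum out the variables in `order`, multiplying in only the factors
    that mention each variable -- the rearrangement shown on the slide."""
    factors = list(factors)
    for var in order:
        rel = [f for f in factors if var in f["vars"]]
        factors = [f for f in factors if var not in f["vars"]]
        prod = rel[0]
        for g in rel[1:]:
            prod = multiply(prod, g)
        factors.append(sum_out(prod, var))
    result = factors[0]
    for g in factors[1:]:
        result = multiply(result, g)
    return result

phi1 = {"vars": ["A1"], "table": {(0,): 0.6, (1,): 0.4}}
phi2 = {"vars": ["A2", "A1"],
        "table": {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}}
phi3 = {"vars": ["A3", "A1"],
        "table": {(0, 0): 0.5, (1, 0): 0.5, (0, 1): 0.1, (1, 1): 0.9}}
phi4 = {"vars": ["A4", "A2"],
        "table": {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.3, (1, 1): 0.7}}
phi5 = {"vars": ["A5", "A2", "A3"],
        "table": {(a5, a2, a3): 0.4 if a5 == 0 else 0.6
                  for a5 in (0, 1) for a2 in (0, 1) for a3 in (0, 1)}}
phi6 = {"vars": ["A6", "A3"],
        "table": {(0, 0): 0.8, (1, 0): 0.2, (0, 1): 0.35, (1, 1): 0.65}}

# Eliminate everything except A4, innermost sums first.
p_A4 = eliminate([phi1, phi2, phi3, phi4, phi5, phi6],
                 ["A6", "A5", "A3", "A2", "A1"])
```

Note how `eliminate` only multiplies the factors that mention the variable being summed out; this locality is exactly what the junction tree will organize systematically.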

5 Calculate a marginal
Some of the calculations can be simplified, e.g. Σ_{A6} P(A6 | A3) = 1:
P(A4) = Σ_{A1} φ1(A1) Σ_{A2} φ2(A2,A1) φ4(A4,A2) Σ_{A3} φ3(A3,A1) Σ_{A5} φ5(A5,A2,A3) Σ_{A6} φ6(A6,A3)

7 Calculate a marginal
Assume evidence e = (e5, e6), represented as 0/1 potentials. Then:
P(A4, e) = Σ_{A1} φ1(A1) Σ_{A2} φ2(A2,A1) φ4(A4,A2) Σ_{A3} φ3(A3,A1) Σ_{A5} φ5(A5,A2,A3) e5 Σ_{A6} φ6(A6,A3) e6
and
P(A4 | e) = P(A4, e) / Σ_{A4} P(A4, e)
The process is sufficiently general to handle all evidence scenarios! We look for a general structure in which these calculations can be performed for all variables.
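The evidence step can be illustrated on a tiny invented joint distribution: a 0/1 potential is multiplied in before summing, and dividing by Σ_A P(A, e) renormalizes. All numbers below are made up.

```python
# Joint P(A = a, B = b) for binary A, B (invented numbers).
P = {(0, 0): 0.3, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.4}

e_B = {0: 0.0, 1: 1.0}          # 0/1 potential encoding the evidence B = 1

# P(A, e) = sum_B P(A, B) * e_B(B)
P_A_e = {a: sum(P[(a, b)] * e_B[b] for b in (0, 1)) for a in (0, 1)}

# P(A | e) = P(A, e) / sum_A P(A, e)
norm = sum(P_A_e.values())
P_A_given_e = {a: P_A_e[a] / norm for a in (0, 1)}
```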

9 Domain graphs
(Figures: the Bayesian network and its domain graph.) The link (A2, A3) is called a moral link.

10 Domain graphs
Moralization in general. For all nodes X: connect pairwise all parents of X with undirected links (step 1); replace all original directed links by undirected ones (step 2).
(Figures: the Bayesian network; step 1; step 2.)
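The two moralization steps can be sketched directly. The `parents` encoding and the helper name are my own; the example DAG is the slides' running network, where the co-parents A2 and A3 of A5 get the moral link.

```python
from itertools import combinations

def moralize(parents):
    """Marry all parents of each node, then drop edge directions."""
    nodes = set(parents)
    for ps in parents.values():
        nodes.update(ps)
    adj = {v: set() for v in nodes}
    for child, ps in parents.items():
        for p in ps:                      # original links, made undirected
            adj[child].add(p)
            adj[p].add(child)
        for p, q in combinations(ps, 2):  # moral links between co-parents
            adj[p].add(q)
            adj[q].add(p)
    return adj

dag = {"A1": [], "A2": ["A1"], "A3": ["A1"],
       "A4": ["A2"], "A5": ["A2", "A3"], "A6": ["A3"]}
moral = moralize(dag)
```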

13 Eliminating a variable
Suppose that when calculating P(A4) we start off by eliminating A3:
φ'(A1, A2, A5, A6) = Σ_{A3} φ3(A3,A1) φ5(A5,A2,A3) φ6(A6,A3).
Graphically: (figures: the Bayesian network, the domain graph, and the new domain graph after eliminating A3).
Perfect elimination sequences: if all variables can be eliminated without introducing fill-in links, then the elimination sequence is called perfect. An example could be A5, A6, A3, A1, A2, A4.

15 Perfect elimination sequences
All perfect elimination sequences produce the same domain set, namely the set of cliques of the domain graph.
Clique: a set of nodes is complete if all nodes are pairwise linked. A complete set is a clique if it is not a subset of another complete set.
Property: any perfect elimination sequence ending with A is optimal w.r.t. calculating P(A).

16 Triangulated graphs
Definition: an undirected graph with a perfect elimination sequence is called a triangulated graph. (Example figures: a graph that is not triangulated and one that is.)
Corollary: a graph is triangulated iff all nodes can successively be eliminated without introducing fill-in edges.
Corollary: a graph is triangulated iff every cycle of length greater than three has a chord.
Corollary: in a triangulated graph, each variable A has a perfect elimination sequence ending with A.
Assumption: for now we shall assume that the domain graph is triangulated!
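The first corollary gives an immediate chordality test: keep eliminating simplicial nodes (nodes whose remaining neighbours are pairwise linked, so no fill-ins are needed); the graph is triangulated iff the process never gets stuck. A sketch, with invented names:

```python
def is_triangulated(adj):
    """Chordality test by repeated simplicial elimination."""
    adj = {v: set(ns) for v, ns in adj.items()}   # work on a copy
    while adj:
        simplicial = next(
            (v for v, ns in adj.items()
             if all(q in adj[p] for p in ns for q in ns if p != q)),
            None)
        if simplicial is None:
            return False          # every remaining node needs fill-ins
        for n in adj[simplicial]:
            adj[n].discard(simplicial)
        del adj[simplicial]
    return True

square = {1: {2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {1, 3}}          # C4
chorded = {1: {2, 3, 4}, 2: {1, 3}, 3: {1, 2, 4}, 4: {1, 3}}   # C4 + chord
```

The chordless 4-cycle gets stuck immediately, while adding one chord makes every elimination step possible.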

21 Exact Inference: Join Tree (Junction Tree)
Let C be the set of cliques of an undirected graph, and let the cliques be organized in a tree T. Then T is a join tree if any variable V contained in two nodes C1 and C2 is also contained in every node on the (unique) path connecting C1 and C2.
Example: (a) a join tree; (b) not a join tree. (Figures.)
Theorem: an undirected graph G is triangulated if and only if the cliques of G can be organized in a join tree.
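The join tree condition can be checked mechanically: for each variable, the cliques containing it must induce a connected subtree. A sketch (the function name is mine; the cliques are those of this document's running example):

```python
from collections import deque

def is_join_tree(cliques, edges):
    """Check the running intersection property on a candidate tree."""
    for v in set().union(*cliques):
        holding = [c for c in cliques if v in c]
        sub = [(a, b) for a, b in edges if v in a and v in b]
        # BFS over the induced subgraph must reach every clique holding v.
        seen = {holding[0]}
        queue = deque(seen)
        while queue:
            c = queue.popleft()
            for a, b in sub:
                for nxt in ((b,) if a == c else (a,) if b == c else ()):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
        if seen != set(holding):
            return False
    return True

C123 = frozenset({"A1", "A2", "A3"})
C235 = frozenset({"A2", "A3", "A5"})
C24 = frozenset({"A2", "A4"})
C36 = frozenset({"A3", "A6"})
cliques = [C123, C235, C24, C36]
good = [(C24, C123), (C123, C235), (C235, C36)]
bad = [(C24, C36), (C36, C123), (C123, C235)]
```

In the `bad` tree, the path between the two cliques holding A2 passes through {A3, A6}, which does not contain A2, so the property fails.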

24 Join tree construction
From undirected graph to join graph: define the undirected graph (C, E'), where
C is the set of cliques in the triangulated moral graph, and
E' = {(C1, C2) | C1 ∩ C2 ≠ ∅}.
(Figures: the example graph and its cliques organized in a join graph; the clique labels were lost in transcription.)

29 Join tree construction
From join graph to join tree: find a maximal spanning tree. For each edge e = (C1, C2) define the weight w(e) := |C1 ∩ C2|, and let (C, E) be a maximal spanning tree of (C, E') with respect to the edge weights.
(Figures: the join graph with edge weights and a resulting join tree; the clique labels were lost in transcription.)
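Both construction steps fit in a few lines: build the weighted join graph over the cliques, then take a maximal spanning tree (here Kruskal's algorithm on descending weights with a tiny union-find). The cliques below are those of the running example; the function name is mine.

```python
from itertools import combinations

def max_spanning_join_tree(cliques):
    """Join graph + maximal spanning tree, returned as (w, C1, C2) links."""
    links = [(len(a & b), a, b)
             for a, b in combinations(cliques, 2) if a & b]
    comp = {c: c for c in cliques}          # minimal union-find

    def find(c):
        while comp[c] != c:
            c = comp[c]
        return c

    tree = []
    for w, a, b in sorted(links, key=lambda l: -l[0]):
        ra, rb = find(a), find(b)
        if ra != rb:                        # heaviest link joining two parts
            comp[ra] = rb
            tree.append((w, a, b))
    return tree

cliques = [frozenset({"A1", "A2", "A3"}), frozenset({"A2", "A4"}),
           frozenset({"A2", "A3", "A5"}), frozenset({"A3", "A6"})]
tree = max_spanning_join_tree(cliques)
```

By the correctness result quoted on the next slides, any maximal spanning tree of the join graph is a join tree, so greedy choice among equal-weight links is harmless.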

34 Join tree construction
Another example: the network over A1, ..., A6. The cliques of its triangulated moral graph are {A1, A2, A3}, {A2, A4}, {A2, A3, A5} and {A3, A6}. In the join graph, the edge between {A1, A2, A3} and {A2, A3, A5} has weight 2 and the remaining edges have weight 1. One maximal spanning tree links {A2, A4}, {A2, A3, A5} and {A3, A6} to {A1, A2, A3}, with total weight 4.

38 Junction tree construction
Correctness: let (V, E) be a triangulated graph with join graph (C, E'). Every maximal spanning tree (C, E'') of (C, E') is a join tree for (V, E) (Jensen and Jensen, 1994).
If (V, E) is the triangulated moral graph of a Bayesian network, then for every variable V there exists a clique C ∈ C with {V} ∪ pa(V) ⊆ C.

39 Propagation in junction trees
Initialization: let T be a join tree for the domain graph G with potentials Φ. A junction tree for G consists of T with the following additions: each potential φ from Φ is assigned to a clique containing dom(φ); each link has a separator attached; each separator contains two mailboxes, one for each direction. (Example figure.)

40 Propagation in junction trees
Propagation: to calculate P(A4) we find a clique containing A4 and send messages toward that clique.
Example: the junction tree has cliques V4: A1, A2, A3 (holding φ1, φ2, φ3), V2: A2, A3, A5 (holding φ5), V1: A3, A6 (holding φ6) and the clique A2, A4 (holding φ4). The separators are S2: A2, A3 (between V4 and V2), S1: A3 (between V4 and V1) and S4: A2 (between V4 and the clique A2, A4).
Step 1: V2 and V1 place ψ2 = Σ_{A5} φ5 and ψ1 = Σ_{A6} φ6 in the mailboxes of S2 and S1.
Step 2: V4 places ψ4 = Σ_{A1} φ1 φ2 Σ_{A3} φ3 ψ1 ψ2 in the mailbox of S4.
Now evidence has been collected to the clique containing A4, and we get P(A4) = Σ_{A2} φ4 ψ4.

43 Propagation in junction trees
Propagation: to calculate P(Ai) for any Ai we then send messages away from the clique that collected.
Example continued:
Step 3: the clique A2, A4 places ψ'4 = Σ_{A4} φ4 in the opposite mailbox of S4.
Step 4: V4 places ψ'2 = Σ_{A1} ψ1 φ1 φ2 φ3 ψ'4 in S2 (equivalently, as a set of potentials, {ψ'4, Σ_{A1} φ1 φ2 φ3, ψ1}) and ψ'1 = Σ_{A2} ψ2 ψ'4 Σ_{A1} φ1 φ2 φ3 in S1.

46 Propagation in junction trees
In general: sending a message from C_i to C_j. With incoming messages m_ki from C_i's neighbours C_k other than C_j,
m_ij := Σ_{C_i \ C_j} ∏_{φ ∈ Φ_{C_i}} φ · ∏_{C_k ∈ nb(C_i) \ {C_j}} m_ki
(Figure: C_i with neighbours C_l and C_m sending m_li and m_mi, and C_i sending m_ij to C_j.)
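The general message can be sketched with small finite factor tables (the encoding, helper names and all numbers here are invented): multiply the sender's potentials with the incoming messages from its other neighbours, then sum out everything outside the receiving separator.

```python
from itertools import product

def multiply(f, g):
    """Pointwise product of two factors over binary variables."""
    vs = f["vars"] + [v for v in g["vars"] if v not in f["vars"]]
    t = {}
    for asg in product((0, 1), repeat=len(vs)):
        a = dict(zip(vs, asg))
        t[asg] = (f["table"][tuple(a[v] for v in f["vars"])]
                  * g["table"][tuple(a[v] for v in g["vars"])])
    return {"vars": vs, "table": t}

def marginalize_onto(f, keep):
    """Sum out every variable of f not in `keep`."""
    vs = [v for v in f["vars"] if v in keep]
    idx = [f["vars"].index(v) for v in vs]
    t = {}
    for asg, p in f["table"].items():
        key = tuple(asg[i] for i in idx)
        t[key] = t.get(key, 0.0) + p
    return {"vars": vs, "table": t}

def message(sender_potentials, incoming, receiver_domain):
    """m_ij: product of own potentials and other incoming messages,
    marginalized onto the receiving separator."""
    fs = sender_potentials + incoming
    prod = fs[0]
    for g in fs[1:]:
        prod = multiply(prod, g)
    return marginalize_onto(prod, receiver_domain)

# Clique {A, B} with one potential, one incoming message over {A},
# sending through the separator {B}.
phi = {"vars": ["A", "B"],
       "table": {(0, 0): 0.2, (0, 1): 0.4, (1, 0): 0.3, (1, 1): 0.1}}
m_in = {"vars": ["A"], "table": {(0,): 1.0, (1,): 0.5}}
m_out = message([phi], [m_in], {"B"})
```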

47 Exact Inference
Message passing in general: collect and distribute messages.
i) Collect messages to a preselected root R.
ii) Distribute messages away from the root R.
(Figures: the messages numbered in the order they are sent; a clique sends to a neighbour once it has received the messages from all its other neighbours.)
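The collect/distribute scheme reduces to a two-pass schedule over the tree, and a single DFS from the root yields both passes. The tree below is the star-shaped junction tree of the running example (node names are mine), rooted at the clique holding A4.

```python
def two_pass_schedule(tree_adj, root):
    """Return the full message schedule as (sender, receiver) pairs:
    collect pass (leaves toward root) then distribute pass (root outward)."""
    collect, order = [], []

    def down(node, parent):
        if parent is not None:
            order.append((parent, node))     # distribute order: root first
        for child in tree_adj[node]:
            if child != parent:
                down(child, node)
        if parent is not None:
            collect.append((node, parent))   # collect order: leaves first

    down(root, None)
    return collect + order

# V4 = {A1,A2,A3} is linked to {A2,A4}, V2 = {A2,A3,A5}, V1 = {A3,A6}.
adj = {"V4": ["A2A4", "V2", "V1"], "A2A4": ["V4"],
       "V2": ["V4"], "V1": ["V4"]}
schedule = two_pass_schedule(adj, "A2A4")
```

Every clique then has received a message from each neighbour, so any marginal can be read off locally.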

52 Propagation in junction trees
Theorem: let the junction tree T represent the Bayesian network BN over the universe U with evidence e, and assume that T is full (all mailboxes are filled).
1. Let V be a clique with set of potentials Φ_V, and let S1, ..., Sk be V's neighboring separators with V-directed messages Ψ1, ..., Ψk. Then P(V, e) = ∏Φ_V · ∏Ψ1 ⋯ ∏Ψk.
2. Let S be a separator with the sets Ψ_S and Ψ'_S in its two mailboxes. Then P(S, e) = ∏Ψ_S · ∏Ψ'_S.
Evidence: evidence is inserted by adding the corresponding 0/1 potentials to the appropriate cliques.

54 Triangulation of graphs
So far we have assumed that the domain graph is triangulated. If this is not the case, then we embed it in a triangulated graph.
Example: (figures: a non-triangulated graph (b) and a triangulated graph extending (b).)

56 Triangulation of graphs
A heuristic: repeatedly eliminate a simplicial node (one that can be eliminated without fill-ins), and if this is not possible, eliminate a node X minimizing one of the following criteria, where fa(X) denotes the non-eliminated neighbours of X together with X, and sp(·) is the size of a set's joint state space:
|fa(X)| (fill-in-size);
sp(fa(X)) (clique-size);
Σ_{{Y,Z} ⊆ nb(X), Z ∉ nb(Y)} sp({Y, Z}) (fill-in-weight).
(Figures: the heuristic applied step by step to the example graph; the clique labels were lost in transcription.)
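The heuristic with the first criterion can be sketched as a greedy triangulation that also records the fill-in links it adds; on a chordless 4-cycle it adds exactly one chord. Function and variable names are mine, and `len(adj[v]) + 1` stands in for |fa(X)|.

```python
from itertools import combinations

def triangulate(adj):
    """Greedy elimination: simplicial nodes first, else smallest family.
    Returns the fill-in links that were added."""
    adj = {v: set(ns) for v, ns in adj.items()}   # work on a copy
    fill_ins = []
    while adj:
        def missing(v):              # fill-ins needed to complete nb(v)
            return [(p, q) for p, q in combinations(adj[v], 2)
                    if q not in adj[p]]

        simplicial = [v for v in adj if not missing(v)]
        v = (simplicial[0] if simplicial
             else min(adj, key=lambda u: len(adj[u]) + 1))  # |fa(u)|
        for p, q in missing(v):      # add the fill-ins, remember them
            adj[p].add(q)
            adj[q].add(p)
            fill_ins.append((p, q))
        for n in adj[v]:
            adj[n].discard(v)
        del adj[v]
    return fill_ins

square = {1: {2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {1, 3}}  # chordless 4-cycle
fill = triangulate(square)
```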

67 Triangulation of graphs
Usually there are many possible triangulations, and finding the best one is NP-hard!

68 Summary
Bayesian network; moral graph; triangulated graph; join graph; join tree; junction tree.
(Figures: the example at each stage; the clique labels in the join graph, join tree and junction tree figures were lost in transcription.)

74 Exact Inference: Summary
Given: Bayesian network BN for P(V).
Moralize: domain graph G for BN (also called the moral graph).
Triangulate: triangulated graph for G.
Find cliques: join graph.
Find maximal spanning tree: join tree.
Perform belief updating by message passing.


More information

Machine Learning

Machine Learning Machine Learning 10-601 Tom M. Mitchell Machine Learning Department Carnegie Mellon University October 2, 2012 Today: Graphical models Bayes Nets: Representing distributions Conditional independencies

More information

FMA901F: Machine Learning Lecture 6: Graphical Models. Cristian Sminchisescu

FMA901F: Machine Learning Lecture 6: Graphical Models. Cristian Sminchisescu FMA901F: Machine Learning Lecture 6: Graphical Models Cristian Sminchisescu Graphical Models Provide a simple way to visualize the structure of a probabilistic model and can be used to design and motivate

More information

Machine Learning

Machine Learning Machine Learning 10-601 Tom M. Mitchell Machine Learning Department Carnegie Mellon University February 18, 2015 Today: Graphical models Bayes Nets: Representing distributions Conditional independencies

More information

Chapter 8 of Bishop's Book: Graphical Models

Chapter 8 of Bishop's Book: Graphical Models Chapter 8 of Bishop's Book: Graphical Models Review of Probability Probability density over possible values of x Used to find probability of x falling in some range For continuous variables, the probability

More information

Mean Field and Variational Methods finishing off

Mean Field and Variational Methods finishing off Readings: K&F: 10.1, 10.5 Mean Field and Variational Methods finishing off Graphical Models 10708 Carlos Guestrin Carnegie Mellon University November 5 th, 2008 10-708 Carlos Guestrin 2006-2008 1 10-708

More information

Lecture 4: Undirected Graphical Models

Lecture 4: Undirected Graphical Models Lecture 4: Undirected Graphical Models Department of Biostatistics University of Michigan zhenkewu@umich.edu http://zhenkewu.com/teaching/graphical_model 15 September, 2016 Zhenke Wu BIOSTAT830 Graphical

More information

Probabilistic Graphical Models

Probabilistic Graphical Models Overview of Part Two Probabilistic Graphical Models Part Two: Inference and Learning Christopher M. Bishop Exact inference and the junction tree MCMC Variational methods and EM Example General variational

More information

A Factor Tree Inference Algorithm for Bayesian Networks and its Applications

A Factor Tree Inference Algorithm for Bayesian Networks and its Applications A Factor Tree Infere Algorithm for Bayesian Nets and its Applications Wenhui Liao, Weihong Zhang and Qiang Ji Department of Electrical, Computer and System Engineering Rensselaer Polytechnic Institute,

More information

3 : Representation of Undirected GMs

3 : Representation of Undirected GMs 0-708: Probabilistic Graphical Models 0-708, Spring 202 3 : Representation of Undirected GMs Lecturer: Eric P. Xing Scribes: Nicole Rafidi, Kirstin Early Last Time In the last lecture, we discussed directed

More information

Decomposition of log-linear models

Decomposition of log-linear models Graphical Models, Lecture 5, Michaelmas Term 2009 October 27, 2009 Generating class Dependence graph of log-linear model Conformal graphical models Factor graphs A density f factorizes w.r.t. A if there

More information

Information Processing Letters

Information Processing Letters Information Processing Letters 112 (2012) 449 456 Contents lists available at SciVerse ScienceDirect Information Processing Letters www.elsevier.com/locate/ipl Recursive sum product algorithm for generalized

More information

Bayesian Networks Inference (continued) Learning

Bayesian Networks Inference (continued) Learning Learning BN tutorial: ftp://ftp.research.microsoft.com/pub/tr/tr-95-06.pdf TAN paper: http://www.cs.huji.ac.il/~nir/abstracts/frgg1.html Bayesian Networks Inference (continued) Learning Machine Learning

More information

Finding Non-overlapping Clusters for Generalized Inference Over Graphical Models

Finding Non-overlapping Clusters for Generalized Inference Over Graphical Models 1 Finding Non-overlapping Clusters for Generalized Inference Over Graphical Models Divyanshu Vats and José M. F. Moura arxiv:1107.4067v2 [stat.ml] 18 Mar 2012 Abstract Graphical models use graphs to compactly

More information

Machine Learning!!!!! Srihari. Chain Graph Models. Sargur Srihari

Machine Learning!!!!! Srihari. Chain Graph Models. Sargur Srihari Chain Graph Models Sargur Srihari srihari@cedar.buffalo.edu 1 Topics PDAGs or Chain Graphs Generalization of CRF to Chain Graphs Independencies in Chain Graphs Summary BN versus MN 2 Partially Directed

More information

Dynamic J ointrees. Figure 1: Belief networks and respective jointrees.

Dynamic J ointrees. Figure 1: Belief networks and respective jointrees. 97 Dynamic J ointrees Adnan Darwiche Department of Mathematics American University of Beirut PO Box 11-236 Beirut, Lebanon darwiche@aub. edu.lb Abstract It is well known that one can ignore parts of a

More information

More details on Loopy BP

More details on Loopy BP Readings: K&F: 11.3, 11.5 Yedidia et al. paper from the class website Chapter 9 - Jordan Loopy Belief Propagation Generalized Belief Propagation Unifying Variational and GBP Learning Parameters of MNs

More information

5 Minimal I-Maps, Chordal Graphs, Trees, and Markov Chains

5 Minimal I-Maps, Chordal Graphs, Trees, and Markov Chains Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science 6.438 Algorithms for Inference Fall 2014 5 Minimal I-Maps, Chordal Graphs, Trees, and Markov Chains Recall

More information

Chordal Graphs: Theory and Algorithms

Chordal Graphs: Theory and Algorithms Chordal Graphs: Theory and Algorithms 1 Chordal graphs Chordal graph : Every cycle of four or more vertices has a chord in it, i.e. there is an edge between two non consecutive vertices of the cycle. Also

More information

Machine Learning

Machine Learning Machine Learning 10-701 Tom M. Mitchell Machine Learning Department Carnegie Mellon University February 15, 2011 Today: Graphical models Inference Conditional independence and D-separation Learning from

More information

A Comparison of Lauritzen-Spiegelhalter, Hugin, and Shenoy-Shafer Architectures for Computing Marginals of Probability Distributions

A Comparison of Lauritzen-Spiegelhalter, Hugin, and Shenoy-Shafer Architectures for Computing Marginals of Probability Distributions Appeared in: G. F. Cooper & S. Moral (eds.), Uncertainty in Artificial Intelligence, Vol. 14, 1999, pp. 328--337, Morgan Kaufmann, San Francisco, CA. A Comparison of Lauritzen-Spiegelhalter, Hugin, and

More information

Computability Theory

Computability Theory CS:4330 Theory of Computation Spring 2018 Computability Theory Other NP-Complete Problems Haniel Barbosa Readings for this lecture Chapter 7 of [Sipser 1996], 3rd edition. Sections 7.4 and 7.5. The 3SAT

More information

Max-Sum Inference Algorithm

Max-Sum Inference Algorithm Ma-Sum Inference Algorithm Sargur Srihari srihari@cedar.buffalo.edu 1 The ma-sum algorithm Sum-product algorithm Takes joint distribution epressed as a factor graph Efficiently finds marginals over component

More information

4 Factor graphs and Comparing Graphical Model Types

4 Factor graphs and Comparing Graphical Model Types Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science 6.438 Algorithms for Inference Fall 2014 4 Factor graphs and Comparing Graphical Model Types We now introduce

More information

Survey of contemporary Bayesian Network Structure Learning methods

Survey of contemporary Bayesian Network Structure Learning methods Survey of contemporary Bayesian Network Structure Learning methods Ligon Liu September 2015 Ligon Liu (CUNY) Survey on Bayesian Network Structure Learning (slide 1) September 2015 1 / 38 Bayesian Network

More information

A New Approach For Convert Multiply-Connected Trees in Bayesian networks

A New Approach For Convert Multiply-Connected Trees in Bayesian networks A New Approach For Convert Multiply-Connected Trees in Bayesian networks 1 Hussein Baloochian, Alireza khantimoory, 2 Saeed Balochian 1 Islamic Azad university branch of zanjan 2 Islamic Azad university

More information

Expectation Propagation

Expectation Propagation Expectation Propagation Erik Sudderth 6.975 Week 11 Presentation November 20, 2002 Introduction Goal: Efficiently approximate intractable distributions Features of Expectation Propagation (EP): Deterministic,

More information

STA 4273H: Statistical Machine Learning

STA 4273H: Statistical Machine Learning STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Statistics! rsalakhu@utstat.toronto.edu! http://www.utstat.utoronto.ca/~rsalakhu/ Sidney Smith Hall, Room 6002 Lecture 5 Inference

More information

Forensic Statistics and Graphical Models (2) Richard Gill Spring Semester

Forensic Statistics and Graphical Models (2) Richard Gill Spring Semester Forensic Statistics and Graphical Models (2) Richard Gill Spring Semester 2015 http://www.math.leidenuniv.nl/~gill/teaching/graphical Graph Theory An undirected graph consists of a set of vertices and

More information

Iterative Algorithms for Graphical Models 1

Iterative Algorithms for Graphical Models 1 Iterative Algorithms for Graphical Models Robert Mateescu School of Information and Computer Science University of California, Irvine mateescu@ics.uci.edu http://www.ics.uci.edu/ mateescu June 30, 2003

More information

5/3/2010Z:\ jeh\self\notes.doc\7 Chapter 7 Graphical models and belief propagation Graphical models and belief propagation

5/3/2010Z:\ jeh\self\notes.doc\7 Chapter 7 Graphical models and belief propagation Graphical models and belief propagation //00Z:\ jeh\self\notes.doc\7 Chapter 7 Graphical models and belief propagation 7. Graphical models and belief propagation Outline graphical models Bayesian networks pair wise Markov random fields factor

More information

Ch9: Exact Inference: Variable Elimination. Shimi Salant, Barak Sternberg

Ch9: Exact Inference: Variable Elimination. Shimi Salant, Barak Sternberg Ch9: Exact Inference: Variable Elimination Shimi Salant Barak Sternberg Part 1 Reminder introduction (1/3) We saw two ways to represent (finite discrete) distributions via graphical data structures: Bayesian

More information

Enumeration of Perfect Sequences in Chordal Graphs

Enumeration of Perfect Sequences in Chordal Graphs Enumeration of Perfect Sequences in Chordal Graphs Yasuko Matsui, Ryuhei Uehara, Takeaki Uno Center for Computational Epidemiology and Response Analysis University of North Texas May 4, 20 (CeCERA) GSCM

More information

Speeding Up Computation in Probabilistic Graphical Models using GPGPUs

Speeding Up Computation in Probabilistic Graphical Models using GPGPUs Speeding Up Computation in Probabilistic Graphical Models using GPGPUs Lu Zheng & Ole J. Mengshoel {lu.zheng ole.mengshoel}@sv.cmu.edu GPU Technology Conference San Jose, CA, March 20, 2013 CMU in Silicon

More information

Recall from last time. Lecture 4: Wrap-up of Bayes net representation. Markov networks. Markov blanket. Isolating a node

Recall from last time. Lecture 4: Wrap-up of Bayes net representation. Markov networks. Markov blanket. Isolating a node Recall from last time Lecture 4: Wrap-up of Bayes net representation. Markov networks Markov blanket, moral graph Independence maps and perfect maps Undirected graphical models (Markov networks) A Bayes

More information

EE512 Graphical Models Fall 2009

EE512 Graphical Models Fall 2009 EE512 Graphical Models Fall 2009 Prof. Jeff Bilmes University of Washington, Seattle Department of Electrical Engineering Fall Quarter, 2009 http://ssli.ee.washington.edu/~bilmes/ee512fa09 Lecture 13 -

More information

10708 Graphical Models: Homework 4

10708 Graphical Models: Homework 4 10708 Graphical Models: Homework 4 Due November 12th, beginning of class October 29, 2008 Instructions: There are six questions on this assignment. Each question has the name of one of the TAs beside it,

More information

Triangulation Heuristics for BN2O Networks

Triangulation Heuristics for BN2O Networks Triangulation Heuristics for BN2O Networks Petr Savicky 1 and JiříVomlel 2 1 Institute of Computer Science Academy of Sciences of the Czech Republic Pod vodárenskou věží 2, 182 07 Praha 8, Czech Republic

More information

CS 664 Flexible Templates. Daniel Huttenlocher

CS 664 Flexible Templates. Daniel Huttenlocher CS 664 Flexible Templates Daniel Huttenlocher Flexible Template Matching Pictorial structures Parts connected by springs and appearance models for each part Used for human bodies, faces Fischler&Elschlager,

More information

Dynamic Planar-Cuts: Efficient Computation of Min-Marginals for Outer-Planar Models

Dynamic Planar-Cuts: Efficient Computation of Min-Marginals for Outer-Planar Models Dynamic Planar-Cuts: Efficient Computation of Min-Marginals for Outer-Planar Models Dhruv Batra 1 1 Carnegie Mellon University www.ece.cmu.edu/ dbatra Tsuhan Chen 2,1 2 Cornell University tsuhan@ece.cornell.edu

More information

Collective classification in network data

Collective classification in network data 1 / 50 Collective classification in network data Seminar on graphs, UCSB 2009 Outline 2 / 50 1 Problem 2 Methods Local methods Global methods 3 Experiments Outline 3 / 50 1 Problem 2 Methods Local methods

More information

Graphical Models as Block-Tree Graphs

Graphical Models as Block-Tree Graphs Graphical Models as Block-Tree Graphs 1 Divyanshu Vats and José M. F. Moura arxiv:1007.0563v2 [stat.ml] 13 Nov 2010 Abstract We introduce block-tree graphs as a framework for deriving efficient algorithms

More information

Recitation 4: Elimination algorithm, reconstituted graph, triangulation

Recitation 4: Elimination algorithm, reconstituted graph, triangulation Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science 6.438 Algorithms For Inference Fall 2014 Recitation 4: Elimination algorithm, reconstituted graph, triangulation

More information

Solving Influence Diagrams using HUGIN, Shafer-Shenoy and Lazy Propagation Λ

Solving Influence Diagrams using HUGIN, Shafer-Shenoy and Lazy Propagation Λ Solving Influence Diagrams using HUGIN, Shafer-Shenoy and Lazy Propagation Λ Anders L. Madsen Hugin Expert A/S Niels Jernes Vej 10 Box 8201 DK-9220 Aalborg Ø, Denmark Anders.L.Madsen@hugin.com Abstract

More information

Bayesian Networks Inference

Bayesian Networks Inference Bayesian Networks Inference Machine Learning 10701/15781 Carlos Guestrin Carnegie Mellon University November 5 th, 2007 2005-2007 Carlos Guestrin 1 General probabilistic inference Flu Allergy Query: Sinus

More information

Search Algorithms for Solving Queries on Graphical Models & the Importance of Pseudo-trees in their Complexity.

Search Algorithms for Solving Queries on Graphical Models & the Importance of Pseudo-trees in their Complexity. Search Algorithms for Solving Queries on Graphical Models & the Importance of Pseudo-trees in their Complexity. University of California, Irvine CS199: Individual Study with Rina Dechter Héctor Otero Mediero

More information

NP-Completeness. Algorithms

NP-Completeness. Algorithms NP-Completeness Algorithms The NP-Completeness Theory Objective: Identify a class of problems that are hard to solve. Exponential time is hard. Polynomial time is easy. Why: Do not try to find efficient

More information

Probabilistic Graphical Models

Probabilistic Graphical Models Overview of Part One Probabilistic Graphical Models Part One: Graphs and Markov Properties Christopher M. Bishop Graphs and probabilities Directed graphs Markov properties Undirected graphs Examples Microsoft

More information

AND/OR Search Spaces for Graphical Models

AND/OR Search Spaces for Graphical Models AND/OR Search Spaces for Graphical Models Rina Dechter Department of Information and Computer Science University of California, Irvine, CA 9697-45 {dechter }@ics.uci.edu Abstract The paper introduces an

More information

Machine Learning

Machine Learning Machine Learning 10-601 Tom M. Mitchell Machine Learning Department Carnegie Mellon University April 1, 2019 Today: Inference in graphical models Learning graphical models Readings: Bishop chapter 8 Bayesian

More information

Dirac-type characterizations of graphs without long chordless cycles

Dirac-type characterizations of graphs without long chordless cycles Dirac-type characterizations of graphs without long chordless cycles Vašek Chvátal Department of Computer Science Rutgers University chvatal@cs.rutgers.edu Irena Rusu LIFO Université de Orléans irusu@lifo.univ-orleans.fr

More information

Workshop report 1. Daniels report is on website 2. Don t expect to write it based on listening to one project (we had 6 only 2 was sufficient

Workshop report 1. Daniels report is on website 2. Don t expect to write it based on listening to one project (we had 6 only 2 was sufficient Workshop report 1. Daniels report is on website 2. Don t expect to write it based on listening to one project (we had 6 only 2 was sufficient quality) 3. I suggest writing it on one presentation. 4. Include

More information

10708 Graphical Models: Homework 2

10708 Graphical Models: Homework 2 10708 Graphical Models: Homework 2 Due October 15th, beginning of class October 1, 2008 Instructions: There are six questions on this assignment. Each question has the name of one of the TAs beside it,

More information

Learning Bounded Treewidth Bayesian Networks

Learning Bounded Treewidth Bayesian Networks Journal of Machine Learning Research 9 (2008) 2287-2319 Submitted 5/08; Published 10/08 Learning Bounded Treewidth Bayesian Networks Gal Elidan Department of Statistics Hebrew University Jerusalem, 91905,

More information

Efficient Probabilistic Inference Algorithms for Cooperative Multiagent Systems

Efficient Probabilistic Inference Algorithms for Cooperative Multiagent Systems University of Windsor Scholarship at UWindsor Electronic Theses and Dissertations 2010 Efficient Probabilistic Inference Algorithms for Cooperative Multiagent Systems Hongxuan Jin University of Windsor

More information

Markov Network Structure Learning

Markov Network Structure Learning Markov Network Structure Learning Sargur Srihari srihari@cedar.buffalo.edu 1 Topics Structure Learning as model selection Structure Learning using Independence Tests Score based Learning: Hypothesis Spaces

More information

Junction Trees and Chordal Graphs

Junction Trees and Chordal Graphs Graphical Models, Lecture 6, Michaelmas Term 2009 October 30, 2009 Decomposability Factorization of Markov distributions Explicit formula for MLE Consider an undirected graph G = (V, E). A partitioning

More information

Computing Largest Correcting Codes and Their Estimates Using Optimization on Specially Constructed Graphs p.1/30

Computing Largest Correcting Codes and Their Estimates Using Optimization on Specially Constructed Graphs p.1/30 Computing Largest Correcting Codes and Their Estimates Using Optimization on Specially Constructed Graphs Sergiy Butenko Department of Industrial Engineering Texas A&M University College Station, TX 77843

More information

Exact learning and inference for planar graphs

Exact learning and inference for planar graphs Supervisor: Nic Schraudolph September 2007 Ising model Particles modify their behaviour to conform with neighbours behaviour What is the mean energy? Used in chemistry, physics, biology... More than 12,000

More information

Probabilistic inference in graphical models

Probabilistic inference in graphical models Probabilistic inference in graphical models MichaelI.Jordan jordan@cs.berkeley.edu Division of Computer Science and Department of Statistics University of California, Berkeley Yair Weiss yweiss@cs.huji.ac.il

More information

Graph Theory S 1 I 2 I 1 S 2 I 1 I 2

Graph Theory S 1 I 2 I 1 S 2 I 1 I 2 Graph Theory S I I S S I I S Graphs Definition A graph G is a pair consisting of a vertex set V (G), and an edge set E(G) ( ) V (G). x and y are the endpoints of edge e = {x, y}. They are called adjacent

More information

Distributed Algorithms 6.046J, Spring, 2015 Part 2. Nancy Lynch

Distributed Algorithms 6.046J, Spring, 2015 Part 2. Nancy Lynch Distributed Algorithms 6.046J, Spring, 2015 Part 2 Nancy Lynch 1 This Week Synchronous distributed algorithms: Leader Election Maximal Independent Set Breadth-First Spanning Trees Shortest Paths Trees

More information

8 Colouring Planar Graphs

8 Colouring Planar Graphs 8 Colouring Planar Graphs The Four Colour Theorem Lemma 8.1 If G is a simple planar graph, then (i) 12 v V (G)(6 deg(v)) with equality for triangulations. (ii) G has a vertex of degree 5. Proof: For (i),

More information