AN ANALYSIS ON MARKOV RANDOM FIELDS (MRFs) USING CYCLE GRAPHS


International Journal of Pure and Applied Mathematics
Volume 118, No. 10 (2018), 11-20
ISSN: 1311-8080 (printed version); ISSN: 1314-3395 (on-line version)
url: http://www.ijpam.eu
doi: 10.12732/ijpam.v118i10.54

F. Anitha Florence Vinola 1 and G. Padma 2
1,2 Department of Mathematics
Sathyabama Institute of Science and Technology
Chennai-119, India.
1 E-mail: anithaflorence98@gmail.com
2 E-mail: govindanpadma970@gmail.com

Abstract

An undirected graphical structure whose vertices are a set of random variables having a Markov property is called a Markov Random Field (MRF). Some of the Markov properties are discussed for cycle graphs and complete graphs using undirected graphical representations. Belief propagation (the message passing algorithm) over a Markov random field has many useful applications and has been successfully applied to several important computer vision problems. In coding theory, error correcting codes such as Low Density Parity Check (LDPC) codes and Turbo codes are applied to minimize errors when messages are passed from one medium to another using belief propagation. This paper describes the relation between belief propagation and maximal cliques in terms of the undirected graphical structure.

AMS Subject Classification: 60J20, 60J05, 60J10, 60J25.

Key Words and Phrases: Markov random field, belief propagation, maximal clique, undirected graph, error correcting codes.

1 Introduction

An undirected graphical structure with Markov properties is called a Markov Random Field (MRF). MRFs have many useful applications in Bayesian networks, error correcting codes and wireless networks through the Belief Propagation Algorithm (BPA). The purpose of the belief propagation algorithm is to infer the marginal densities at every node of the graphical structure [1, 5, 7]; hence belief propagation is also known as the message passing algorithm. The BPA estimates the marginal density of each unobserved node, conditional on any observed nodes. Belief propagation is mainly used in artificial intelligence and information theory [8, 9], and it has numerous applications in different fields such as low-density parity check codes, turbo codes, etc.

Belief propagation was first introduced by Judea Pearl in 1982, who described the algorithm on trees; it was later extended to polytrees, and it has proved powerful on many undirected graphical structures. The BPA is generally presented as message update equations on a factor graph, involving messages between variable nodes and their neighbouring factor nodes and vice versa [2, 4, 6]. Passing messages from one medium to another over an undirected graphical structure is the way the belief propagation algorithm is generalised. A variant of the belief propagation algorithm is Gaussian belief propagation, used when the underlying distributions are Gaussian [3].

The paper is arranged in the following way. Definitions, graphical representations of factor graphs and theorems are discussed in Section 2; the graphical representation of a Markov random field in Section 3; a discussion of Markovian properties using cycle and complete graphs in Section 4; the belief propagation algorithm for coding and decoding messages using maximal cliques on undirected graphs in Section 5; and conclusions in Section 6.
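For reference, the message update equations mentioned above take the following standard sum-product form on a factor graph (as in [2, 4]); here N(.) denotes the set of neighbours of a node:

```latex
% Variable-to-factor message: product of the other incoming factor messages.
\mu_{x \to f}(x) = \prod_{g \in N(x) \setminus \{f\}} \mu_{g \to x}(x)

% Factor-to-variable message: sum out all arguments of f except x.
\mu_{f \to x}(x) = \sum_{\mathbf{x}_f \setminus x} f(\mathbf{x}_f)
    \prod_{y \in N(f) \setminus \{x\}} \mu_{y \to f}(y)

% Belief (estimated marginal) at a variable node.
b(x) \propto \prod_{g \in N(x)} \mu_{g \to x}(x)
```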

2 Definitions and Theorems

Markov Random Field. A Markov random field (also called a Markov network or undirected graphical model) is a model in which the vertices of a graph are a set of random variables having a Markov property described by an undirected graph. Markov random fields are undirected graphical structures and may be cyclic.

Clique. A clique is a subgraph of an undirected graph such that every two distinct vertices in the clique are adjacent; that is, its induced subgraph is complete.

Maximal Clique. A maximal clique is a clique that cannot be extended by including one more adjacent vertex in its vertex set; it does not exist exclusively within the vertex set of a larger clique.

Factor Graph. A factor graph is a bipartite graph that represents the factorization of a function. In probability theory and its applications, factor graphs are used to represent the joint probability mass function of the variables, and the factorization of the probability distribution function that describes the system. A factor graph groups the nodes into variable nodes and factor nodes, and conveys important information about the statistical dependencies among these variables in probabilistic modelling of systems. The decoding of capacity-approaching error correcting codes, such as LDPC and turbo codes, is the most prominent success of factor graphs and the sum-product algorithm. In a factor graph, circles represent variable nodes, square boxes represent factors, and straight lines represent the edges between the variables and the factors.
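As a small illustration of these definitions, the sketch below (a hypothetical example, not from the paper) enumerates the maximal cliques of an undirected graph with networkx:

```python
# A minimal sketch: maximal cliques of a small undirected graph.
# The example graph is hypothetical, chosen only to illustrate the
# definitions above.
import networkx as nx

G = nx.Graph()
G.add_edges_from([(1, 2), (1, 3), (2, 3),   # triangle {1, 2, 3}
                  (3, 4), (4, 5)])          # a path hanging off vertex 3

# nx.find_cliques yields exactly the maximal cliques of G.
for clique in nx.find_cliques(G):
    print(sorted(clique))
# {1, 2} is a clique but not maximal: it sits inside {1, 2, 3}.
# Output: [1, 2, 3], [3, 4], [4, 5]
```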

Theorem 1. An irreducible Markov chain on an undirected graphical structure is transient if and only if for some state $i$ there exists a non-zero vector $(y_j)$ such that $y_j = \sum_{k \neq i} p_{jk}\, y_k$ for all $j \neq i$ and $|y_j| < 1$ for all $j$; or, the graph contains a maximal clique whose number of vertices is less than or equal to half the number of vertices of the given graph.

3 Graphical Representation of a Markov Random Field

The following undirected graphical structure is a representation of a Markov random field.

Figure 1: (Undirected graphical structure)

The transition probability matrix (tpm) corresponding to the undirected graphical structure is a 10 x 10 stochastic matrix $P$. [The entries of $P$ are not recoverable from this transcription.]

For the above tpm, the zero sub-square matrix has order $4 < 5 = \frac{10}{2} = \frac{n}{2}$; hence the chain is irreducible [10] and, using Theorem 1, the chain is transient.

4 Discussion of Markovian Properties for Cycle and Complete Graphs

A graph in which every pair of distinct vertices is adjacent is called a complete graph. If all the vertices of a graph have equal degree $n$, then the graph is called an $n$-regular graph. A graph in which the starting and ending vertices are the same is called a cycle graph. Consider the following complete graph on five vertices, which is 4-regular.

Figure 2: (Complete graph)

The tpm corresponding to the above undirected complete graph is

$$P = \begin{pmatrix} 0 & 1/4 & 1/4 & 1/4 & 1/4 \\ 1/4 & 0 & 1/4 & 1/4 & 1/4 \\ 1/4 & 1/4 & 0 & 1/4 & 1/4 \\ 1/4 & 1/4 & 1/4 & 0 & 1/4 \\ 1/4 & 1/4 & 1/4 & 1/4 & 0 \end{pmatrix}$$

For the 4-regular complete graph (Figure 2), only the diagonal elements of the tpm are zero. The complete graph always yields an irreducible Markov chain [10], and the states are aperiodic. Since the states are finite and the chain is irreducible, the states are non-null persistent, which gives the result that the complete graph is always ergodic.
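This ergodicity can be checked numerically. The sketch below (the helper function is an assumption of this illustration, not from the paper) builds the tpm of the random walk on the complete graph $K_n$ and shows that its powers converge to the uniform stationary distribution:

```python
# A minimal sketch: the random walk on the complete graph K_n, as in
# Figure 2. Each row of the tpm puts mass 1/(n-1) on every other vertex
# and 0 on the diagonal.
import numpy as np

def complete_graph_tpm(n: int) -> np.ndarray:
    P = np.full((n, n), 1.0 / (n - 1))
    np.fill_diagonal(P, 0.0)
    return P

P = complete_graph_tpm(5)

# For an irreducible, aperiodic finite chain, P^k converges to a matrix
# with identical rows: the stationary distribution (here uniform).
Pk = np.linalg.matrix_power(P, 50)
print(np.round(Pk[0], 4))   # -> [0.2 0.2 0.2 0.2 0.2]
```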

Consider the following cycle graph.

Figure 3: (Cycle graph)

The tpm corresponding to the above undirected cycle graph is

$$P = \begin{pmatrix} 0 & 1/2 & 0 & 0 & 1/2 \\ 1/2 & 0 & 1/2 & 0 & 0 \\ 0 & 1/2 & 0 & 1/2 & 0 \\ 0 & 0 & 1/2 & 0 & 1/2 \\ 1/2 & 0 & 0 & 1/2 & 0 \end{pmatrix}$$

For a cycle graph the tpm can be constructed as above, and it is found that for every pair of states the transition probability in some number of steps becomes non-zero, i.e. $p^{(n)}_{ij} > 0$. Therefore the chain is irreducible and, the states being finite, the chain is non-null persistent. If the given undirected graphical structure is a cycle on an odd number of vertices, the chain is aperiodic; if it is a cycle on an even number of vertices, the chain is periodic with period 2. A cycle graph also possesses the nature of a random walk, which is a mathematical formalization of a path consisting of a succession of random steps, in a Markov random field.
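The odd/even dichotomy can also be verified numerically. In the sketch below (hypothetical helper functions, not from the paper), the period of a state is computed as the gcd of the step counts at which return has positive probability:

```python
# A minimal sketch: the period of the random walk on a cycle graph C_n.
from math import gcd
import numpy as np

def cycle_graph_tpm(n: int) -> np.ndarray:
    P = np.zeros((n, n))
    for i in range(n):
        P[i, (i - 1) % n] = 0.5   # step to the previous vertex
        P[i, (i + 1) % n] = 0.5   # step to the next vertex
    return P

def period(P: np.ndarray, state: int = 0, horizon: int = 50) -> int:
    # gcd of all step counts m with p_{ii}^(m) > 0
    d, Pm = 0, np.eye(len(P))
    for m in range(1, horizon + 1):
        Pm = Pm @ P
        if Pm[state, state] > 1e-12:
            d = gcd(d, m)
    return d

print(period(cycle_graph_tpm(5)))   # odd cycle  -> 1 (aperiodic)
print(period(cycle_graph_tpm(6)))   # even cycle -> 2 (period 2)
```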

5 Belief Propagation Algorithm in Terms of Maximal Cliques

There are different ways of defining and tracking the set of regions in a graph that can exchange messages. One method uses ideas introduced by Kikuchi in the physics literature and is known as the cluster variation method. Two different improvements to belief propagation are the cluster variation method and the survey propagation algorithm. The belief propagation algorithm is used for tracking, partitioning and many image representation tasks. Belief propagation operates on a probabilistic graphical model, which gives detailed knowledge of the probability distributions that share a common structure.

The Hammersley-Clifford theorem helps to identify the exact structure of the distribution on the graph, which is nothing but a product over all the maximal cliques of the graph. The probability of an image $x$, whose regions are considered to be unobserved nodes of a Markov random field, can be written as a product over all the maximal cliques:

$$p(x) = \frac{1}{Z} \prod_{C} \psi_C(x_C),$$

where $x_C$ is the image region corresponding to the clique $C$, $\psi_C$ is the potential function of the clique, and $Z$ is a normalization constant which makes the total probability equal to one.

In belief propagation, for finding the marginal probability at every node, a message is a re-usable partial sum for the marginalization calculations.

Figure 4: The marginal probability of the image node $x_1$ with respect to the observed nodes $y_i$ gives

$$p(x_1) = \frac{1}{p(y)} \sum_{x_2} \sum_{x_3} \sum_{x_4} \sum_{x_5} \phi_{12}(x_1, x_2)\, \phi_{13}(x_1, x_3)\, \phi_{14}(x_1, x_4)\, \phi_{15}(x_1, x_5)\, \psi_4(y_4, x_4)\, \psi_5(y_5, x_5) = \frac{1}{p(y)}\, m_4(x_1)\, m_5(x_1),$$

where $m_4(x_1) = \sum_{x_4} \phi_{14}(x_1, x_4)\, \psi_4(y_4, x_4)$ and $m_5(x_1) = \sum_{x_5} \phi_{15}(x_1, x_5)\, \psi_5(y_5, x_5)$ are the messages passed to $x_1$, the potentials on the unobserved branches being assumed normalized so that $\sum_{x_j} \phi_{1j}(x_1, x_j) = 1$ for $j = 2, 3$.
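The message-as-partial-sum idea can be made concrete. In the sketch below (the potentials are arbitrary random tables, a hypothetical example rather than the paper's model), the marginal of $x_1$ in a star-shaped MRF like Figure 4 is computed once by brute-force summation and once as a product of messages; the two agree because belief propagation is exact on trees:

```python
# A minimal sketch: marginal of x1 in a star-shaped MRF, brute force
# versus message passing. Potentials are hypothetical random tables;
# all variables are binary.
import numpy as np

rng = np.random.default_rng(0)
n_branches = 4                                           # x2..x5 around x1
phi = [rng.random((2, 2)) for _ in range(n_branches)]    # phi_1j(x1, xj)
psi = [rng.random(2) for _ in range(n_branches)]         # psi_j(y_j fixed, xj)

# Brute force: sum the full joint over x2..x5 for each value of x1.
brute = np.zeros(2)
for x1 in range(2):
    total = 0.0
    for assignment in np.ndindex(*(2,) * n_branches):
        p = 1.0
        for j, xj in enumerate(assignment):
            p *= phi[j][x1, xj] * psi[j][xj]
        total += p
    brute[x1] = total
brute /= brute.sum()

# Messages: m_j(x1) = sum_xj phi_1j(x1, xj) * psi_j(xj). The marginal is
# the normalized product of the incoming messages (re-usable partial sums).
messages = [phi[j] @ psi[j] for j in range(n_branches)]
bp = np.prod(messages, axis=0)
bp /= bp.sum()

print(np.allclose(brute, bp))   # -> True: BP is exact on this tree
```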

Similarly, the marginal probabilities of the remaining nodes give the same kind of result; Figure 4 shows the interrelations among the observed nodes ($y_i$'s) and the image nodes ($x_i$'s). Since the nodes of a maximal clique are adjacent, when the message is passed on to the observed nodes, belief propagation applied on the maximal cliques minimizes the run-time error.

6 Conclusion

Graphs are an interesting and exciting way of representing and picturing the relationships between many variables. A graph helps us to identify the conditional independence relationships between the variables. In this paper, the belief propagation algorithm has been discussed graphically in terms of maximal cliques. The use of the belief propagation algorithm in LDPC and Turbo codes to minimize errors has been discussed in detail in terms of the undirected graphical structure. The given graphical circuit contains a maximal clique of size greater than or equal to half the size of the graph; therefore, a Markov random field together with its maximal cliques improves the error correction of LDPC codes and Turbo codes.

References

[1] S. Benedetto, G. Montorsi, D. Divsalar, F. Pollara, Soft-output decoding algorithms in iterative decoding of Turbo codes, TDA Progress Report 42-124, JPL (1996).

[2] F.R. Kschischang, B.J. Frey, H.A. Loeliger, Factor graphs and the sum-product algorithm, IEEE Transactions on Information Theory (1998).

[3] M.I. Jordan, Z. Ghahramani, T.S. Jaakkola, L.K. Saul, An introduction to variational methods for graphical models, Machine Learning, 37 (1999), 183-233.

[4] F.R. Kschischang, B.J. Frey, H.A. Loeliger, Factor graphs and the sum-product algorithm, IEEE Transactions on Information Theory, 47(2) (2001), 498-519.

[5] M.F. Tappen, W.T. Freeman, Comparison of graph cuts with belief propagation for stereo, using identical MRF parameters, ICCV, 2 (2003), 900-907.

[6] M. Toussaint, Lecture Notes: Factor graphs and belief propagation, Machine Learning & Robotics Group, TU Berlin, Franklinstr. 28/29, FR 6-9, 10587 Berlin, Germany, March 4, 2008.

[7] J.S. Yedidia, W.T. Freeman, Y. Weiss, Bethe free energies, Kikuchi approximations, and belief propagation algorithms, MERL Technical Report TR2001-16 (2001).

[8] G. Padma, C. Vijayalakshmi, A comparison on soft-error correcting codes of memory cells in a Markov random field, Proceedings of the International Conference on Cloud Computing and eGovernance (2012), 59-64, ISBN: 978-81-920575-0-7.

[9] G. Padma, C. Vijayalakshmi, Implementation of belief propagation iterative method on Markov chains by designing Bayesian networks, CiiT International Journal of Artificial Intelligent Systems and Machine Learning, 3(6) (2011).

[10] F. Anitha Florence Vinola, G. Padma, An analysis on the Markov chain properties using pictorial representation, International Conference on Innovations in Information Embedded and Communication Systems (ICIIECS), III (2017), 560-564, ISBN: 978-1-5090-3293-8.
