Image Restoration using Markov Random Fields


Image Restoration using Markov Random Fields. Based on the paper "Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images," IEEE PAMI, 1984, by Geman and Geman, and on the book Markov Random Field Modeling in Computer Vision by Stan Z. Li. Ajit Rajwade, CAP 6930.

Overview
What are MRFs? (the locality principle)
The image restoration approach using MRFs
Gibbs random fields; equivalence between MRFs and GRFs
The Metropolis sampler and the Gibbs sampler
Discontinuities: line processes
Results
Limitations

Markov Random Fields: Intro. A parameterized image model in which the parameters have much lower dimensionality than the image itself. Consider an image X with pixels (x1, x2, ..., xn). The basic idea is to factorize the global joint distribution over all pixel values into a product of local factors; within each local model, each entity depends only on a set of neighboring entities.

MRF Intro. These relationships are represented with a graphical model. Each pixel in the image is a NODE of the graph, and the nodes are arranged on a 2D lattice. Neighborhood relations are the EDGES of the graph. Each node bears a label (an intensity value, a gradient, a segment label, etc.).

Neighborhood Systems. A neighborhood system defines which nodes are treated as neighbors. Example: neighborhood systems of order one (4 neighbors) and order two (8 neighbors).
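Concretely, the two systems can be written as offset sets on the lattice. This is a small hypothetical helper (the names and function are mine, not from the slides):

```python
# First- and second-order neighborhood systems on a 2D lattice.
OFFSETS_ORDER1 = [(-1, 0), (1, 0), (0, -1), (0, 1)]                      # 4 neighbors
OFFSETS_ORDER2 = OFFSETS_ORDER1 + [(-1, -1), (-1, 1), (1, -1), (1, 1)]   # 8 neighbors

def neighbors(i, j, shape, offsets):
    """Return the in-bounds neighbor coordinates of site (i, j)."""
    h, w = shape
    return [(i + di, j + dj) for di, dj in offsets
            if 0 <= i + di < h and 0 <= j + dj < w]

print(neighbors(0, 0, (3, 3), OFFSETS_ORDER1))  # → [(1, 0), (0, 1)]
```

Note that boundary sites have fewer neighbors than interior sites, as the corner-pixel example shows.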

Cliques. Each neighborhood system has an associated set of cliques. A clique is a subset of nodes in which every pair of nodes are neighbors. The slide shows the cliques for the neighborhood systems of order one and two.

MRF Definition. Markov model: the conditional probability of a signal's value at time t depends only on the k previous values; k is the order of the Markov model. A Markov random field extends this concept to 2D: the conditional probability of a pixel's label depends only on the labels of its neighbors.

MRF Definition. Let S be a set of sites (the nodes of a graph) and N a neighborhood system on S. Let X = {X_s : s in S} be a set of random variables indexed by S, taking values in a label set L, and let Ω be the set of all possible configurations of X. Then X is an MRF with respect to N if and only if: (1) P(X = ω) > 0 for every ω in Ω (positivity), and (2) P(x_s | x_{S \ {s}}) = P(x_s | x_{N_s}) for every site s (Markovianity).

Image Restoration Approach. Let G be a degraded image. Given G, restore the original image X. Bayesian approach: find the mode of the posterior distribution, i.e. ω* = arg max_ω P(X = ω | G = g). Problem: a direct search is computationally prohibitive even for small images with very few intensity levels, since the number of configurations grows exponentially with the number of pixels.

Gibbs Random Fields. A set of random variables X is called a GRF if its realizations follow a Gibbs distribution, i.e. P(X = ω) = (1/Z) exp(−U(ω)/T). Here U(ω) is called the energy function, given by U(ω) = Σ_{c in C} V_c(ω), the sum over all cliques c of the clique potentials V_c. The clique potential encodes the relationship between the nodes in a clique (for example, the difference between their intensity values).

Gibbs Random Fields. Z is a normalizing constant (the partition function), given as Z = Σ_{ω in Ω} exp(−U(ω)/T).
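The energy and the unnormalized Gibbs probability are easy to sketch in code. The squared-difference clique potential below is an illustrative choice of mine (the slides keep V_c generic), over the pairwise cliques of a first-order neighborhood:

```python
import numpy as np

def clique_potential(a, b, beta=1.0):
    # Illustrative pairwise clique potential: squared label difference.
    return beta * (a - b) ** 2

def energy(img, beta=1.0):
    """Total energy U(w): sum of clique potentials over all horizontal
    and vertical pixel pairs (first-order neighborhood system)."""
    u = clique_potential(img[:, :-1], img[:, 1:], beta).sum()   # horizontal cliques
    u += clique_potential(img[:-1, :], img[1:, :], beta).sum()  # vertical cliques
    return float(u)

def unnormalized_gibbs(img, T=1.0, beta=1.0):
    # exp(-U(w)/T), proportional to P(X = w); Z is never computed.
    return float(np.exp(-energy(img, beta) / T))

flat = np.zeros((4, 4))
step = np.zeros((4, 4)); step[:, 2:] = 1
print(energy(flat), energy(step))  # → 0.0 4.0
```

A constant image has zero energy (maximal unnormalized probability); the step image pays one unit of potential for each of the four pairs straddling the edge.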

MRF = GRF. The Hammersley-Clifford theorem states that if X is an MRF on a set of nodes S with a neighborhood system N, then X is also a GRF; the converse is also true. Practical use: it allows us to specify the joint probability of the MRF for the original (denoised) image through local clique potentials.

MRF = GRF. But look at the formula for the joint probability: P(X = ω) = (1/Z) exp(−U(ω)/T). Calculating Z is intractable, since the sum runs over every possible configuration. The trick is that we need not compute Z to obtain the MAP estimate.

MRF = GRF. Recall P(X = ω | G = g) = P(G = g | X = ω) P(X = ω) / P(G = g). Assuming i.i.d. Gaussian noise with variance σ², the likelihood is P(G = g | X = ω) ∝ exp(−Σ_s (g_s − ω_s)² / (2σ²)). The prior is P(X = ω) = (1/Z) exp(−U(ω)/T). Note that the likelihood does not contain a term involving Z.

MRF = GRF. Taking logarithms, we get log P(X = ω | G = g) = −Σ_s (g_s − ω_s)²/(2σ²) − U(ω)/T + const, where the constant absorbs log Z and log P(G = g), neither of which depends on ω.

Image Restoration Framework. Formulate the task as a Bayesian estimation problem. The denoised image is ω* = arg max_ω P(X = ω | G = g). Expand the posterior as P(X = ω | G = g) = P(G = g | X = ω) P(X = ω) / P(G = g). The prior is given by the GRF with the prior energy.

Image Restoration Framework. The likelihood is given by knowledge of the noise statistics. The total posterior energy is U(X = ω | G = g) = U(G = g | X = ω) + U(X = ω). Minimizing the posterior energy = maximizing the posterior probability (MAP). The energy minimization is done using a sampling scheme combined with simulated annealing.
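The two energy terms can be written down directly. A sketch, assuming i.i.d. Gaussian noise and a quadratic smoothness prior (both are my assumptions; the slides keep the potentials generic):

```python
import numpy as np

def prior_energy(w, beta=1.0):
    """U(w): quadratic smoothness prior summed over the pairwise
    cliques of a first-order neighborhood (illustrative choice of V_c)."""
    return beta * float(((w[:, :-1] - w[:, 1:]) ** 2).sum()
                        + ((w[:-1, :] - w[1:, :]) ** 2).sum())

def likelihood_energy(w, g, sigma=1.0):
    """U(g | w), assuming i.i.d. Gaussian noise with variance sigma^2."""
    return float(((g - w) ** 2).sum()) / (2.0 * sigma ** 2)

def posterior_energy(w, g, beta=1.0, sigma=1.0):
    # U(w | g) = U(g | w) + U(w); the MAP estimate minimizes this.
    return likelihood_energy(w, g, sigma) + prior_energy(w, beta)
```

The parameter beta trades smoothness against data fidelity: a larger beta favors smoother restorations at the cost of a worse fit to the observed image.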

Computing the MAP Estimate. Solution: use the Metropolis or Gibbs sampler. Both samplers work by generating a sequence of configurations starting from an initial configuration. It can be proved that this sampling scheme yields the required Gibbs distribution, and that the final outcome does not depend on the initial configuration.

Metropolis Sampler. (1) Start from an initial (possibly unlikely) configuration ω(0). (2) Generate a new configuration from the previous one through some stochastic process. (3) Determine the energy change ΔU. (4) If the energy decreases, accept the new configuration and go to step (2); otherwise accept it with probability exp(−ΔU/T) and return to step (2). This is repeated for a large number of iterations.

Metropolis Sampler. After many iterations of the Metropolis sampler, the generated samples obey a Gibbs distribution. The algorithm is stochastic: it sometimes accepts changes that increase the energy, and hence avoids getting trapped in local minima. Disadvantage: it can be very slow if there are many rejections.
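The steps above can be sketched as a minimal Metropolis restorer for a binary image. This assumes a Potts-style smoothness prior and i.i.d. Gaussian noise; the function names, parameter defaults, and cooling schedule are mine, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

def local_energy_change(w, g, i, j, new, beta, sigma):
    """Change in posterior energy if site (i, j) takes label `new`:
    Potts-style prior term plus Gaussian likelihood term."""
    old = w[i, j]
    h, wd = w.shape
    d = 0.0
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < h and 0 <= nj < wd:
            d += beta * (int(new != w[ni, nj]) - int(old != w[ni, nj]))
    d += ((g[i, j] - new) ** 2 - (g[i, j] - old) ** 2) / (2.0 * sigma ** 2)
    return d

def metropolis_denoise(g, labels=(0, 1), beta=2.0, sigma=0.5,
                       T0=4.0, cooling=0.98, sweeps=50):
    """Metropolis sampler with simulated annealing: propose a random
    label at a random site, accept if the energy drops, otherwise
    accept with probability exp(-dU / T); lower T after each sweep."""
    w = g.copy()
    h, wd = w.shape
    T = T0
    for _ in range(sweeps):
        for _ in range(h * wd):
            i, j = rng.integers(h), rng.integers(wd)
            new = rng.choice(labels)
            d_u = local_energy_change(w, g, i, j, new, beta, sigma)
            if d_u <= 0 or rng.random() < np.exp(-d_u / T):
                w[i, j] = new
        T *= cooling  # annealing: lower the temperature each sweep
    return w
```

A geometric cooling schedule is used here for simplicity; Geman and Geman analyze a logarithmic schedule of the form T_k ∝ 1/log(1 + k), for which convergence to the MAP solution is guaranteed.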

Gibbs Sampler. The Gibbs sampler also starts from some random configuration. A new sample is generated by updating the image site by site (in raster-scan order), drawing each pixel value from its conditional probability given its neighbors, i.e. from P(X_s = x_s | X_{N_s}) ∝ exp(−Σ_{c: s in c} V_c(ω)/T). At equilibrium (after many iterations), the Gibbs sampler also yields samples from the Gibbs distribution P(X = ω) = (1/Z) exp(−U(ω)/T).

Gibbs Sampler. There is no question of rejections, so it is faster. The implementation can be parallelized, since non-neighboring nodes can be updated independently. The temperature parameter T is decreased during successive iterations (annealing).
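A single raster-scan sweep of the sampler can be sketched as follows; a sketch assuming binary labels, an Ising/Potts-style prior, and Gaussian noise (names and defaults are mine):

```python
import numpy as np

rng = np.random.default_rng(1)

def gibbs_sweep(w, g, labels=(0, 1), beta=2.0, sigma=0.5, T=1.0):
    """One raster-scan sweep: resample every site from its conditional
    distribution given its neighbors and the observed pixel g[i, j]."""
    h, wd = w.shape
    for i in range(h):
        for j in range(wd):
            energies = []
            for lab in labels:
                u = ((g[i, j] - lab) ** 2) / (2.0 * sigma ** 2)  # likelihood term
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < wd:
                        u += beta * (lab != w[ni, nj])            # prior term
                energies.append(u)
            e = np.array(energies)
            p = np.exp(-(e - e.min()) / T)  # subtract the min for stability
            p = p / p.sum()
            w[i, j] = rng.choice(labels, p=p)
    return w
```

To anneal, call gibbs_sweep repeatedly while lowering T between sweeps; as the slide notes, non-neighboring sites could also be updated in parallel (e.g. in checkerboard order).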

Preserving Discontinuities. In image smoothing, one also wants to preserve the discontinuities in an image. That means we want to prevent interaction between two nodes in a clique if an edge is present between them (or if the difference in their intensities exceeds some threshold). To incorporate such constraints, we introduce a line process: a variable between every pair of neighboring pixels indicating whether an edge separates them.
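A common way to get this behavior without an explicit line-process variable is a truncated (robust) clique potential; this is a stand-in I am adding for illustration, not the formulation used on the slides:

```python
def robust_potential(a, b, beta=1.0, tau=1.0):
    """Discontinuity-preserving pairwise potential: quadratic for small
    intensity differences, but clipped at tau so that a large jump (an
    edge) pays a bounded penalty instead of being smoothed away."""
    d2 = (a - b) ** 2
    return beta * min(d2, tau)
```

Analytically eliminating a binary line process from the joint energy yields exactly this kind of truncated potential (the "weak membrane" of Blake and Zisserman).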

Preserving Discontinuities (illustrative figure on the original slide).

MRF Example: Ising Model. Consider a model in which L = 2 (a binary image X) with an Ising-type energy function. The corresponding conditional distribution at site (i, j) is then given as

P(X_{i,j} = x_{i,j} | X_{N_{i,j}}) = exp( x_{i,j} (α + β v_{i,j}) ) / ( 1 + exp(α + β v_{i,j}) ),

where v_{i,j} = x_{i−1,j} + x_{i+1,j} + x_{i,j−1} + x_{i,j+1} is the sum of the labels of the neighboring sites.
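In code, this logistic conditional (with labels in {0, 1} and v_i the sum of the neighboring labels) looks like the following; a direct transcription, with the parameters renamed α → alpha, β → beta:

```python
import math

def ising_conditional(x_i, v_i, alpha=0.0, beta=1.0):
    """P(X_i = x_i | neighbors) for a binary label x_i in {0, 1},
    where v_i is the sum of the neighboring labels."""
    z = alpha + beta * v_i
    p_one = math.exp(z) / (1.0 + math.exp(z))  # P(X_i = 1 | neighbors)
    return p_one if x_i == 1 else 1.0 - p_one

print(ising_conditional(1, 0))  # → 0.5 (no external field, no neighbors "on")
```

Note the two cases sum to one, as a conditional distribution must, and that a larger beta makes a site agree more strongly with its neighbors.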

Application: Texture Synthesis. Generate a random binary image. Run several Gibbs sampler iterations on it, using the previously mentioned conditional probability. Different values of the parameters α and β produce different textures.
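This recipe can be sketched as follows. The parameter defaults are my own; in particular I center alpha at −2β so that all-zero and all-one neighborhoods are treated symmetrically, which the slides do not specify:

```python
import numpy as np

rng = np.random.default_rng(2)

def synthesize_texture(shape=(32, 32), alpha=-3.0, beta=1.5, iters=20):
    """Start from random binary noise and run Gibbs-sampler sweeps with
    the Ising conditional; different (alpha, beta) yield different
    textures. With alpha = -2 * beta the model is symmetric in 0/1."""
    h, w = shape
    x = rng.integers(0, 2, size=shape)  # random binary initialization
    for _ in range(iters):
        for i in range(h):
            for j in range(w):
                v = 0  # sum of the in-bounds neighboring labels
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w:
                        v += x[ni, nj]
                p_one = 1.0 / (1.0 + np.exp(-(alpha + beta * v)))
                x[i, j] = int(rng.random() < p_one)
    return x
```

With beta > 0 the samples develop clustered blobs; beta near 0 keeps them close to white noise.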

Incorporation of Line Processes. The energy for an MRF can be written as U(ω) = Σ_{c in C} V_c(ω). With a line process l, the energy is written as U(ω, l) = Σ_{c in C} V_c(ω | l) + Σ_{c in C_l} V_c(l), where V_c(l) is the line process field potential: pixel interactions are switched off across cliques where the line process signals an edge, and the second term penalizes unlikely edge configurations.

Line Processes. In the paper by Geman and Geman, different configurations of line processes are assigned different energy values.

Results (figure): original image and degraded image.

Results (figure): image restored without line processes and image restored with line processes.

Limitations. The parameters of the MRF (the weighting factors in the clique potential functions, such as α and β) were chosen on an ad hoc basis; parameter estimation is a non-trivial task.